2019-02-07
19:45 <sbisson@deploy1001> Synchronized php-1.33.0-wmf.16/extensions/GrowthExperiments/: SWAT: [[gerrit:488988|Help Panel: Fix iOS scroll bug]] (duration: 03m 02s) [production]
19:28 <sbisson@deploy1001> sync-file aborted: SWAT: [[gerrit:488675|GrowthExperiments: Enable search for help panel on testwiki]] (duration: 02m 22s) [production]
19:25 <sbisson@deploy1001> Synchronized wmf-config/InitialiseSettings.php: SWAT: [[gerrit:488675|GrowthExperiments: Enable search for help panel on testwiki]] (duration: 03m 04s) [production]
18:32 <mutante> LDAP - adding raz-shuty to group nda (T214488) [production]
17:06 <jynus@deploy1001> Synchronized wmf-config/db-eqiad.php: Repool db1085 (duration: 03m 03s) [production]
16:03 <jynus> restart db1085, temporary s6 lag on wikireplicas [production]
15:55 <gehel> starting reimage of maps2004 - T198622 [production]
15:51 <jynus@deploy1001> Synchronized wmf-config/db-eqiad.php: Depool db1085 (duration: 00m 58s) [production]
15:16 <anomie@mwmaint1002> Fixing log_search after migrateActors.php on wikitech for T215464. This may cause lag in codfw. [production]
15:16 <anomie@mwmaint1002> Fixing log_search after migrateActors.php on section 8 wikis for T215464. This may cause lag in codfw. [production]
15:16 <anomie@mwmaint1002> Fixing log_search after migrateActors.php on section 7 wikis for T215464. This may cause lag in codfw. [production]
15:16 <anomie@mwmaint1002> Fixing log_search after migrateActors.php on section 6 wikis for T215464. This may cause lag in codfw. [production]
15:16 <anomie@mwmaint1002> Fixing log_search after migrateActors.php on section 5 wikis for T215464. This may cause lag in codfw. [production]
15:16 <anomie@mwmaint1002> Fixing log_search after migrateActors.php on section 4 wikis for T215464. This may cause lag in codfw. [production]
15:16 <anomie@mwmaint1002> Fixing log_search after migrateActors.php on remaining section 3 wikis for T215464. This may cause lag in codfw. [production]
15:16 <anomie@mwmaint1002> Fixing log_search after migrateActors.php on section 2 wikis for T215464. This may cause lag in codfw. [production]
15:16 <anomie@mwmaint1002> Fixing log_search after migrateActors.php on section 1 wikis for T215464. This may cause lag in codfw. [production]
15:07 <anomie@mwmaint1002> Fixing log_search after migrateActors.php on test wikis and mediawikiwiki for T215464. This may cause lag in codfw. [production]
15:01 <marostegui@deploy1001> Synchronized wmf-config/db-eqiad.php: Fully repool db1101 (duration: 00m 55s) [production]
14:39 <marostegui@deploy1001> Synchronized wmf-config/db-eqiad.php: Slowly repool db1101 after alter and mysql upgrade (duration: 00m 55s) [production]
14:34 <jbond42> deploying security updates for libgd3 [production]
12:42 <Amir1> EU SWAT is done [production]
12:42 <ladsgroup@deploy1001> Synchronized wmf-config/Wikibase.php: SWAT: [[gerrit:488907|Set EntityUsageTable addUsage batch size to 300]], Part II (duration: 00m 54s) [production]
12:42 <marostegui> Set dbstore1002 as IDEMPOTENT - T213670 [production]
12:39 <ladsgroup@deploy1001> Synchronized wmf-config/InitialiseSettings.php: SWAT: [[gerrit:488907|Set EntityUsageTable addUsage batch size to 300 (T215146)]], Part I (duration: 00m 55s) [production]
12:34 <marostegui> Powercycle mw1299 as it is down and not responding [production]
12:31 <marostegui@deploy1001> Synchronized wmf-config/db-eqiad.php: Slowly repool db1101 after alter and mysql upgrade (duration: 03m 02s) [production]
12:26 <ladsgroup@deploy1001> Synchronized wmf-config/interwiki.php: SWAT: [[gerrit:488903|Update interwiki cache to have yuewiktionary instead of zh-yue (T214400)]] (duration: 03m 04s) [production]
12:06 <vgutierrez@puppetmaster1001> conftool action : set/pooled=yes; selector: name=cp4026.ulsfo.wmnet [production]
12:03 <arturo> T214448 reimaging again cloudvirt200[1-3]-dev.codfw.wmnet [production]
11:55 <marostegui> Stop MySQL on db1101:3317 and db1101:3318 for mysql upgrade [production]
11:37 <jynus@deploy1001> Synchronized wmf-config/db-codfw.php: Repool db2055 (duration: 03m 02s) [production]
11:17 <fsero> upgrade helm to 2.12.2 on deploy{1001,2001} and contint{1001,2001} T215244 [production]
11:16 <fsero> upgrade helm to 2.12.2 on deploy{1001,2001} and contint{1001,2001} [production]
10:58 <marostegui@deploy1001> Synchronized wmf-config/db-eqiad.php: Depool db1101 for alter and mysql upgrade (duration: 00m 56s) [production]
10:43 <marostegui> Run mysqldump from dbstore1003 to dump dbstore1002:staging.mep_word_persistence - T215450 [production]
09:49 <marostegui> Deploy schema change on db1116 - T210713 [production]
09:41 <akosiaris> reboot mwdebug1001, mwdebug1002, mwdebug2001, mwdebug2002 for VCPU upgrade. T212955 [production]
09:23 <jynus> running alter table on db2055 for performance testing T212092 [production]
09:15 <fsero> uploading helm and tiller 2.12.2 deb packages to stretch and jessie [production]
08:53 <marostegui> Deploy schema change on s7 codfw master (db2047), this will generate lag on s7 codfw - T210713 [production]
08:34 <godog> swift codfw-prod: more weight to ms-be2047 - T209395 T209921 [production]
08:14 <marostegui> Deploy schema change on s4 primary master (db1068) - T210713 [production]
08:13 <marostegui@deploy1001> Synchronized wmf-config/db-eqiad.php: Repool db1081 (duration: 00m 54s) [production]
07:50 <marostegui> Deploy schema change on db1081 [production]
07:49 <marostegui@deploy1001> Synchronized wmf-config/db-eqiad.php: Depool db1081 (duration: 00m 53s) [production]
07:48 <reedy@deploy1001> Synchronized wmf-config/interwiki.php: Update interwiki cache (duration: 02m 20s) [production]
07:43 <marostegui@deploy1001> Synchronized wmf-config/db-eqiad.php: Repool db1084 (duration: 00m 53s) [production]
07:42 <reedy@deploy1001> Synchronized dblists/: Wikimania T215486 (duration: 00m 54s) [production]
07:03 <marostegui> Deploy schema change on db1084 - T210713 [production]