2020-03-04
16:02 <addshore> addshore@mwmaint1002:~$ mwscript extensions/Wikibase/repo/maintenance/rebuildPropertyTerms.php --wiki=wikidatawiki --sleep 1 --batch-size=50 # T244115 [production]
15:47 <liw@deploy1001> Synchronized php: group1 wikis to 1.35.0-wmf.22 (duration: 01m 03s) [production]
15:46 <liw@deploy1001> rebuilt and synchronized wikiversions files: group1 wikis to 1.35.0-wmf.22 [production]
15:29 <vgutierrez> upgrading ATS to version 8.0.6 on eqiad [production]
15:11 <thcipriani> restarting zuul [production]
14:55 <akosiaris@cumin1001> conftool action : set/pooled=true; selector: dnsdisc=eventgate-analytics-external [production]
14:44 <filippo@cumin1001> END (PASS) - Cookbook sre.hosts.downtime (exit_code=0) [production]
14:41 <filippo@cumin1001> START - Cookbook sre.hosts.downtime [production]
14:25 <vgutierrez> upgrading ATS to version 8.0.6 on codfw [production]
14:19 <liw@deploy1001> rebuilt and synchronized wikiversions files: Revert "group1 wikis to 1.35.0-wmf.21 [production]
14:18 <akosiaris> cleanup old LVS eventgate services. T245203 [production]
14:13 <addshore> cache warming stopped on db1126 and db1111 [production]
14:08 <liw@deploy1001> Synchronized php: group1 wikis to 1.35.0-wmf.22 (duration: 01m 04s) [production]
14:07 <liw@deploy1001> rebuilt and synchronized wikiversions files: group1 wikis to 1.35.0-wmf.22 [production]
13:47 <addshore> START warm cache for db1111 & db1126 for Q25-30 million T219123 (pass 3) [production]
13:33 <godog> disable puppet on install1002 to test partman on theemin [production]
13:19 <vgutierrez> upgrading ATS to version 8.0.6 on esams [production]
13:14 <marostegui> Drop fixcopyrightwiki from sanitarium hosts (db1112, db2074) to avoid getting the data alert - T246055 [production]
12:55 <urbanecm@deploy1001> Synchronized wmf-config/throttle.php: 37db2a1: Add new throttle rule for WikiGap Göteborg 2020-03-06 (T246888) (duration: 01m 04s) [production]
12:23 <XioNoX> add flowspec rule on cr3-knams - T243482 [production]
12:20 <Urbanecm> EU SWAT done [production]
12:19 <moritzm> installing 4.9.210-1~deb8u1 kernel on jessie hosts (no reboots, just the upgrade) [production]
12:19 <urbanecm@deploy1001> Synchronized php-1.35.0-wmf.22/extensions/GrowthExperiments/includes/HelpPanel/QuestionStore.php: SWAT: d495f4c: Replace loadRevisionFromId which has been removed in I0c8fe834da79c (duration: 01m 06s) [production]
12:14 <urbanecm@deploy1001> Synchronized wmf-config/throttle.php: SWAT: 1fa9dda: IP Cap Lift for University of Mannheim Wikimedia Event (2020-04-01) (T246832) (duration: 01m 06s) [production]
12:11 <moritzm> imported linux-meta 1.23 to apt.wikimedia.org for jessie-wikimedia [production]
12:04 <urbanecm@deploy1001> Synchronized wmf-config/throttle.php: SWAT: 85a5c05: Add throttle exempt for 2020-03-07 GenderGap Event (T246813) (duration: 01m 05s) [production]
11:51 <addshore> START warm cache for db1111 & db1126 for Q25-30 million T219123 (pass 2) [production]
11:19 <vgutierrez> upgrading ATS to version 8.0.6 on eqsin [production]
11:01 <addshore@deploy1001> Synchronized wmf-config/InitialiseSettings.php: Write to new term store up to Q86 million, was 84 (T219123) cache bust (duration: 01m 03s) [production]
11:00 <addshore@deploy1001> Synchronized wmf-config/InitialiseSettings.php: Write to new term store up to Q86 million, was 84 (T219123) (duration: 01m 04s) [production]
10:52 <vgutierrez> upgrading ATS to version 8.0.6 on ulsfo [production]
10:41 <addshore> START warm cache for db1111 & db1126 for Q25-30 million T219123 (pass 1) [production]
10:38 <vgutierrez> upload trafficserver 8.0.6-1wm1 to apt.wm.o (buster) [production]
10:38 <addshore@deploy1001> Synchronized wmf-config/InitialiseSettings.php: Reading up to Q25M for the new term store everywhere (was Q20M) + warm db1126 & db1111 caches (T219123) cache bust (duration: 01m 04s) [production]
10:36 <addshore@deploy1001> Synchronized wmf-config/InitialiseSettings.php: Reading up to Q25M for the new term store everywhere (was Q20M) + warm db1126 & db1111 caches (T219123) (duration: 01m 05s) [production]
10:20 <marostegui> Remove es2 eqiad and codfw from zarcillo.masters table - T246072 [production]
10:10 <marostegui> Update shards table to set es2 display=0 - T246072 [production]
10:05 <marostegui> es2 maintenance window over T246072 [production]
09:59 <marostegui@cumin1001> dbctl commit (dc=all): 'Give some weight to es2 master es1015 and es2016, now standalone - T246072', diff saved to https://phabricator.wikimedia.org/P10609 and previous config saved to /var/cache/conftool/dbconfig/20200304-095919-marostegui.json [production]
09:55 <marostegui> Reset replication on es2 hosts - T246072 [production]
09:44 <moritzm> installing python-bleach security updates [production]
09:43 <marostegui> Set es1015 (es2 master) on read_only - T246072 [production]
09:38 <addshore> START warm cache for db1111 & db1126 for Q20-25 million T219123 (pass 3 today) [production]
09:21 <marostegui@deploy1001> Synchronized wmf-config/db-eqiad.php: Set es2 as RO - T246072 (duration: 01m 04s) [production]
09:13 <_joe_> removing nginx from servers where it was just used for service proxying. [production]
09:09 <marostegui@deploy1001> Synchronized wmf-config/db-codfw.php: Set es2 as RO - T246072 (duration: 01m 14s) [production]
08:58 <akosiaris> release Giant Puppet Lock across the fleet. https://gerrit.wikimedia.org/r/#/c/operations/puppet/+/464601/ has made its way to all PoPs and most of codfw without issues, will make it to the rest of the fleet in the next 30mins [production]
08:54 <addshore> START warm cache for db1111 & db1126 for Q20-25 million T219123 (pass 2 today) [production]
08:45 <akosiaris> running puppet on first mw host after merge of https://gerrit.wikimedia.org/r/#/c/operations/puppet/+/464601/, mw2269, rescheduling icinga checks as well [production]
08:41 <akosiaris> running puppet on first es host after merge of https://gerrit.wikimedia.org/r/#/c/operations/puppet/+/464601/, es2019, rescheduling icinga checks as well (correction) [production]