2020-09-21

07:53 <hashar> Upgrading all CI Jenkins jobs to Quibble 0.0.45 [production]
07:05 <XioNoX> upgrade FNM to 1.1.7 in ulsfo - T257035 [production]
06:00 <marostegui@cumin1001> dbctl commit (dc=all): 'Fully pool es2029 and es2030 T261717', diff saved to https://phabricator.wikimedia.org/P12677 and previous config saved to /var/cache/conftool/dbconfig/20200921-060053-marostegui.json [production]
05:48 <marostegui> Set innodb_change_buffering = inserts; on db2129 (s6 master) for performance testing [production]
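The `innodb_change_buffering` tweak logged above is a dynamic MySQL/MariaDB global, so it can be applied and checked without a restart. A minimal sketch of how such a change might look (an assumption for illustration — the SAL entry does not record the exact commands, and `sudo mysql` access is assumed):

```shell
# Restrict the InnoDB change buffer to inserts only (dynamic; no restart needed).
# On the database host this would be run against the live server, e.g.:
#   sudo mysql -e "SET GLOBAL innodb_change_buffering = 'inserts';"
#   sudo mysql -e "SELECT @@GLOBAL.innodb_change_buffering;"
# Here the statements are only emitted, so the sketch is self-contained:
printf '%s\n' \
  "SET GLOBAL innodb_change_buffering = 'inserts';" \
  "SELECT @@GLOBAL.innodb_change_buffering;"
```

Because the setting is not persisted to the config file by `SET GLOBAL`, a matching my.cnf change would be needed for it to survive a restart.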
05:47 <marostegui@cumin1001> dbctl commit (dc=all): 'Pool es2029 and es2030 with more weight T261717', diff saved to https://phabricator.wikimedia.org/P12676 and previous config saved to /var/cache/conftool/dbconfig/20200921-054730-marostegui.json [production]
05:27 <marostegui@cumin1001> dbctl commit (dc=all): 'Pool es2029 and es2030 with more weight T261717', diff saved to https://phabricator.wikimedia.org/P12675 and previous config saved to /var/cache/conftool/dbconfig/20200921-052704-marostegui.json [production]
05:18 <marostegui> Stop mysql on: es2013 es2016 es2019 to clone es2032 es2033 es2034 - T261717 [production]
05:06 <marostegui@cumin1001> dbctl commit (dc=all): 'Pool es2029 and es2030 with more weight T261717', diff saved to https://phabricator.wikimedia.org/P12674 and previous config saved to /var/cache/conftool/dbconfig/20200921-050632-marostegui.json [production]
05:06 <marostegui> Deploy MCR schema change on s8 eqiad master, lag will appear on s8 (wikidata) on labsdb hosts T238966 [production]
05:03 <marostegui@cumin1001> dbctl commit (dc=all): 'Depool es2013,es2016 and es2019 to clone new hosts T261717', diff saved to https://phabricator.wikimedia.org/P12673 and previous config saved to /var/cache/conftool/dbconfig/20200921-050305-marostegui.json [production]
05:02 <marostegui@cumin1001> dbctl commit (dc=all): 'Set es2015 as es2 codfw master T261717', diff saved to https://phabricator.wikimedia.org/P12672 and previous config saved to /var/cache/conftool/dbconfig/20200921-050228-marostegui.json [production]
04:59 <marostegui@cumin1001> dbctl commit (dc=all): 'Pool es2029 and es2030 with more weight T261717', diff saved to https://phabricator.wikimedia.org/P12671 and previous config saved to /var/cache/conftool/dbconfig/20200921-045919-marostegui.json [production]
04:37 <marostegui> Set innodb_change_buffering = inserts; on db2116 for performance testing [production]
04:31 <marostegui@cumin1001> dbctl commit (dc=all): 'Pool es2029 and es2030 for the first time with minimal weight T261717', diff saved to https://phabricator.wikimedia.org/P12670 and previous config saved to /var/cache/conftool/dbconfig/20200921-043154-marostegui.json [production]
2020-09-18

21:48 <tzatziki> changed password for Millennium bug@ptwiki [production]
19:28 <eileen> process-control config revision is 739ea754ca [production]
18:52 <pt1979@cumin2001> END (PASS) - Cookbook sre.dns.netbox (exit_code=0) [production]
18:46 <pt1979@cumin2001> START - Cookbook sre.dns.netbox [production]
18:44 <ryankemper> `sudo kill 254017 254018 254028 254029` to kill some dangling serdi / gzip processes, all the wikidata cleanup should be complete [production]
18:38 <ryankemper> `sudo kill 126121 126122 126124 126128 249520 249521 254016 254027` on `snapshot1008` to terminate wikidata dump jobs that are in a bad state [production]
18:10 <ryankemper> Removed stale `wikidatardf-dumps` crontab entry from `dumpsgen@snapshot1008`, stored backup of previous state of crontab in the (admittedly verbose) `/tmp/dumpsgen_crontab_before_removing_stale_wikidata_dump_entry_see_gerrit_puppet_patch_622342` [production]
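The crontab cleanup above follows a back-up-then-filter pattern: dump the current crontab to a file, remove only the stale entry, and reinstall the rest. A sketch of that pattern, with a temp file standing in for the live `dumpsgen` crontab (the sample cron lines and filenames here are invented for illustration; on the real host the read and write steps would be `crontab -u dumpsgen -l` and `crontab -u dumpsgen <file>`):

```shell
# Simulate the dumpsgen crontab with invented sample entries.
backup="$(mktemp /tmp/dumpsgen_crontab_backup.XXXXXX)"
printf '%s\n' \
  '0 3 * * 1 /usr/local/bin/wikidatardf-dumps all' \
  '0 4 * * * /usr/local/bin/some-other-dump-job' > "$backup"

# Keep the backup, drop only the stale wikidatardf-dumps entry.
grep -v 'wikidatardf-dumps' "$backup" > "${backup}.cleaned"

# On the real host the cleaned file would be reinstalled with:
#   crontab -u dumpsgen "${backup}.cleaned"
cat "${backup}.cleaned"
```

Keeping the pre-removal backup (as the log entry notes) makes the change trivially reversible if the puppet patch lands late or is reverted.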
17:15 <mutante> lists1001 - apt-get install pwgen to generate passwords (this was installed on previous list server but apparently not puppetized, puppet patch coming up) [production]
16:23 <pt1979@cumin2001> END (FAIL) - Cookbook sre.hosts.downtime (exit_code=99) [production]
16:21 <pt1979@cumin2001> START - Cookbook sre.hosts.downtime [production]
15:09 <mutante> restarting gerrit service to apply gerrit::628338 to make it dump heap if out of memory (T263008) [production]
14:15 <ladsgroup@deploy1001> Synchronized wmf-config/Wikibase.php: labs: Turn on termbox v2 on desktop for wikidatawiki -- noop for production, sanity sync (T261488) (duration: 00m 56s) [production]
14:13 <ladsgroup@deploy1001> Synchronized wmf-config/InitialiseSettings.php: labs: Turn on termbox v2 on desktop for wikidatawiki -- noop for production, sanity sync (T261488) (duration: 01m 00s) [production]
13:02 <kormat@cumin2001> END (PASS) - Cookbook sre.hosts.downtime (exit_code=0) [production]
13:00 <kormat@cumin2001> START - Cookbook sre.hosts.downtime [production]
12:48 <cdanis@cumin1001> conftool action : set/pooled=true; selector: dnsdisc=swift,name=eqiad [production]
12:41 <kormat> reimaging db2125 T263244 [production]
12:39 <kormat@cumin1001> dbctl commit (dc=all): 'db2089:3316 (re)pooling @ 100%: schema change T259831', diff saved to https://phabricator.wikimedia.org/P12665 and previous config saved to /var/cache/conftool/dbconfig/20200918-123947-kormat.json [production]
12:24 <kormat@cumin1001> dbctl commit (dc=all): 'db2089:3316 (re)pooling @ 75%: schema change T259831', diff saved to https://phabricator.wikimedia.org/P12664 and previous config saved to /var/cache/conftool/dbconfig/20200918-122444-kormat.json [production]
12:09 <kormat@cumin1001> dbctl commit (dc=all): 'db2089:3316 (re)pooling @ 50%: schema change T259831', diff saved to https://phabricator.wikimedia.org/P12663 and previous config saved to /var/cache/conftool/dbconfig/20200918-120940-kormat.json [production]
11:54 <kormat@cumin1001> dbctl commit (dc=all): 'db2089:3316 (re)pooling @ 25%: schema change T259831', diff saved to https://phabricator.wikimedia.org/P12662 and previous config saved to /var/cache/conftool/dbconfig/20200918-115437-kormat.json [production]
11:35 <marostegui@cumin1001> dbctl commit (dc=all): 'Depool db2125', diff saved to https://phabricator.wikimedia.org/P12661 and previous config saved to /var/cache/conftool/dbconfig/20200918-113509-marostegui.json [production]
11:15 <kormat@cumin1001> dbctl commit (dc=all): 'db2089:3316 depooling: schema change T259831', diff saved to https://phabricator.wikimedia.org/P12660 and previous config saved to /var/cache/conftool/dbconfig/20200918-111529-kormat.json [production]
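The `db2089:3316` commits above show the standard staged-repool pattern after maintenance: depool, do the schema change, then return traffic in 25% steps with a pause between each to watch for lag or errors. A sketch of the command sequence (the `dbctl` subcommands follow Wikitech's documented usage but are an assumption here; they are only echoed so the sketch is self-contained, and the step sizes and pauses are the operator's choice, not a tool feature):

```shell
instance='db2089:3316'
task='T259831'
for pct in 25 50 75 100; do
  # On a cumin host these would actually be executed, typically with
  # ~15 minutes between steps while watching replication lag:
  echo "dbctl instance $instance pool -p $pct"
  echo "dbctl config commit -m '$instance (re)pooling @ ${pct}%: schema change $task'"
done
```

Each `dbctl config commit` is what produces the Phabricator diff paste and the `/var/cache/conftool/dbconfig/` rollback snapshot seen in every log line above.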
10:56 <kormat@cumin1001> dbctl commit (dc=all): 'db2087:3316 (re)pooling @ 100%: schema change T259831', diff saved to https://phabricator.wikimedia.org/P12659 and previous config saved to /var/cache/conftool/dbconfig/20200918-105645-kormat.json [production]
10:45 <jiji@deploy1001> helmfile [codfw] Ran 'sync' command on namespace 'push-notifications' for release 'main' . [production]
10:41 <kormat@cumin1001> dbctl commit (dc=all): 'db2087:3316 (re)pooling @ 75%: schema change T259831', diff saved to https://phabricator.wikimedia.org/P12658 and previous config saved to /var/cache/conftool/dbconfig/20200918-104141-kormat.json [production]
10:35 <jiji@deploy1001> helmfile [eqiad] Ran 'sync' command on namespace 'push-notifications' for release 'main' . [production]
10:34 <hnowlan@deploy1001> helmfile [eqiad] Ran 'sync' command on namespace 'kube-system' for release 'calico-policy-controller' . [production]
10:31 <hnowlan@deploy1001> helmfile [staging] Ran 'sync' command on namespace 'kube-system' for release 'calico-policy-controller' . [production]