2019-11-18
12:48 <Urbanecm> Run mwscript recountCategories.php --wiki=dewiki --mode=pages (T238500) [production]
12:47 <Urbanecm> Run mwscript recountCategories.php --wiki=dewiki --mode=subcats (T238500) [production]
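(recountCategories.php is the MediaWiki maintenance script that recomputes the cached member counts in the category table; the lines below are a minimal sketch of such a run from a maintenance host via the standard mwscript wrapper, with the third mode added on the assumption that the file-count column needs the same treatment.)

    # recount page and subcategory totals for dewiki, as logged above
    mwscript recountCategories.php --wiki=dewiki --mode=pages
    mwscript recountCategories.php --wiki=dewiki --mode=subcats
    # the script also supports a files mode for the cat_files column (not part of the logged run)
    mwscript recountCategories.php --wiki=dewiki --mode=files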
11:32 <awight> EU SWAT complete [production]
11:28 <awight@deploy1001> Synchronized php-1.35.0-wmf.5/extensions/Cite: SWAT: [[gerrit:551389|Track pageviews only on content page views, not edits (T214493)]] (duration: 00m 51s) [production]
11:26 <awight@deploy1001> Synchronized php-1.35.0-wmf.5/extensions/Popups: SWAT: [[gerrit:551397|Don't record Popups actions on non-content pages (T214493)]] (duration: 00m 51s) [production]
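(The "Synchronized …" entries are written automatically by scap when a deployer pushes code from the deployment host; the commands below are a sketch of the corresponding SWAT steps on deploy1001, with the sync messages simplified rather than copied verbatim from the log.)

    # after cherry-picking the Gerrit changes onto the wmf.5 branch, sync each extension directory
    scap sync-dir php-1.35.0-wmf.5/extensions/Cite "SWAT: Track pageviews only on content page views, not edits (T214493)"
    scap sync-dir php-1.35.0-wmf.5/extensions/Popups "SWAT: Don't record Popups actions on non-content pages (T214493)"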
11:04 <moritzm> installing postgresql-common security updates [production]
10:56 <moritzm> installing python-werkzeug security updates [production]
10:56 <marostegui> Deploy schema change on db2078 (codfw master for wikidatawiki), this will create lag on s8 codfw - T237120 [production]
10:53 <moritzm> installing gdb updates from buster point release [production]
10:49 <moritzm> installing python-cryptography bugfix updates from buster point release [production]
10:45 <moritzm> updated buster netinst image for 10.2 T238519 [production]
10:16 <marostegui> Upgrade MySQL on labsdb1012 [production]
09:33 <godog> remove wezen from service, pending reimage [production]
09:11 <marostegui> Remove ar_comment from triggers on db2094:3318 - T234704 [production]
09:11 <marostegui> Deploy schema change on s8 codfw, this will generate lag on s8 codfw - T233135 T234066 [production]
09:03 <marostegui> Restart MySQL on db1124 and db1125 to apply new replication filters T238370 [production]
07:17 <marostegui> Upgrade and restart mysql on sanitarium hosts on codfw to pick up new replication filters: db2094 and db2095 - T238370 [production]
07:09 <marostegui> Stop MySQL on db2070 to clone db2135 - T238183 [production]
06:52 <vgutierrez> Move cp1083 from nginx to ats-tls - T231627 [production]
06:32 <vgutierrez> Move cp1081 from nginx to ats-tls - T231627 [production]
06:30 <marostegui> Restart tendril mysql - T231769 [production]
06:12 <vgutierrez> Move cp2012 from nginx to ats-tls - T231627 [production]
06:05 <marostegui@cumin1001> dbctl commit (dc=all): 'Depool db1096:3316 for compression', diff saved to https://phabricator.wikimedia.org/P9652 and previous config saved to /var/cache/conftool/dbconfig/20191118-060508-marostegui.json [production]
06:02 <marostegui@cumin1001> dbctl commit (dc=all): 'Depool db1105:3312 for compression', diff saved to https://phabricator.wikimedia.org/P9651 and previous config saved to /var/cache/conftool/dbconfig/20191118-060207-marostegui.json [production]
06:01 <marostegui@cumin1001> dbctl commit (dc=all): 'Repool db2072, db2088:3311, db2087:3316, db2086:3317 after maintenances and schema changes', diff saved to https://phabricator.wikimedia.org/P9650 and previous config saved to /var/cache/conftool/dbconfig/20191118-060114-marostegui.json [production]
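(The dbctl entries above are the messages dbctl logs on commit; a sketch of the usual depool-then-commit sequence on a cumin host, using the first instance name from the log.)

    # mark the replica as depooled in the staging configuration
    dbctl instance db1096:3316 depool
    # review the pending change, then apply it; dbctl records the diff and previous config itself
    dbctl config diff
    dbctl config commit -m 'Depool db1096:3316 for compression'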
05:53 <marostegui> Deploy schema change on s5 primary master db1100 - T233135 T234066 [production]
03:40 <vgutierrez> Move cp2007 from nginx to ats-tls - T231627 [production]
00:44 <tstarling@deploy1001> Synchronized php-1.35.0-wmf.5/includes/Rest/Handler/PageHistoryCountHandler.php: fix extremely slow query T238378 (duration: 00m 59s) [production]
2019-11-15
22:14 <jeh@cumin1001> END (PASS) - Cookbook sre.hosts.downtime (exit_code=0) [production]
22:12 <jeh@cumin1001> START - Cookbook sre.hosts.downtime [production]
21:54 <jeh@cumin1001> END (PASS) - Cookbook sre.hosts.downtime (exit_code=0) [production]
21:52 <jeh@cumin1001> START - Cookbook sre.hosts.downtime [production]
21:31 <jeh@cumin1001> END (PASS) - Cookbook sre.hosts.downtime (exit_code=0) [production]
21:29 <jeh@cumin1001> START - Cookbook sre.hosts.downtime [production]
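(The paired START/END lines are emitted automatically by the cookbook runner; a run of the downtime cookbook from a cumin host looks roughly like the sketch below. The host, duration and reason are placeholders, not taken from the log, and the exact flags are an assumption based on the documented usage.)

    # schedule Icinga downtime for a host before maintenance
    sudo cookbook sre.hosts.downtime --hours 2 -r 'maintenance' 'example1001.eqiad.wmnet'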
21:21 <_joe_> disabling proxying to ws on phabricator1003 [production]
20:04 <XioNoX> push pfw policies to pfw3-eqiad - T238368 [production]
20:02 <XioNoX> push pfw policies to pfw3-codfw - T238368 [production]
19:07 <XioNoX> remove vlan 1 trunking between msw1-codfw and mr1-codfw, will cause a brief connectivity interruption - T228112 [production]
18:07 <XioNoX> homer push on management switches [production]
17:30 <mutante> phabricator - started phd service [production]
17:11 <XioNoX> homer push to management routers (https://gerrit.wikimedia.org/r/550576) [production]
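(Homer is the network configuration tool used to generate and push device configuration from the source of truth; a push like the ones logged above roughly follows the diff-then-commit pattern sketched here. The device selector is a placeholder, not taken from the log.)

    # preview the generated configuration change, then commit it to the matching devices
    homer 'mr1*' diff
    homer 'mr1*' commit "Update management router config (https://gerrit.wikimedia.org/r/550576)"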
16:43 <hashar> Restored zuul-merger / CI for operations/puppet.git [production]
16:29 <hashar> CI slowed down due to a huge spike of internal jobs; the backlog is being flushed now # T140297 [production]
16:25 <bblack> repool cp2001 [production]
16:08 <bblack> depool cp2001 for experiments [production]
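(Pooling state for cache hosts is managed through conftool; the depool/repool of cp2001 above corresponds roughly to the confctl calls sketched below, assuming the usual selector syntax. On the host itself the pool/depool wrapper scripts are the more common shortcut.)

    # take cp2001 out of the pooled backends, then return it after the experiments
    confctl select 'name=cp2001.codfw.wmnet' set/pooled=no
    confctl select 'name=cp2001.codfw.wmnet' set/pooled=yes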
16:02 <moritzm> rebooting rpki1001 to rectify microcode loading [production]