2020-06-02
11:10 <kart_> Finished EU Mid-day SWAT. [production]
11:08 <mutante> contint1001 - common issue after reinstalls again - a2dismod mpm_event ; systemctl restart apache2 ; puppet agent -tv (T196968) https://gerrit.wikimedia.org/r/c/operations/puppet/+/451206 [production]
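For reference, the recovery sequence from the entry above, expanded as it would be run on the host; the rationale in the first comment is an assumption based on the linked puppet change, not stated in the log:
  a2dismod mpm_event            # fresh installs enable mpm_event, which conflicts with the MPM the puppetized config expects
  systemctl restart apache2
  puppet agent -tv              # let puppet converge the rest of the apache configuration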
11:07 <kartik@deploy1001> Synchronized wmf-config/InitialiseSettings.php: SWAT: [[gerrit:601174|Create URL campaign for African languages for COVID-19 translation project (T253305)]] (duration: 01m 00s) [production]
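A "Synchronized wmf-config/..." line like this is scap pushing a single config file from the deployment host; a minimal sketch of the usual invocation (the exact command form used here is an assumption):
  scap sync-file wmf-config/InitialiseSettings.php 'SWAT: Create URL campaign for African languages for COVID-19 translation project (T253305)'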
11:01 <hnowlan@deploy1001> helmfile [STAGING] Ran 'sync' command on namespace 'changeprop-jobqueue' for release 'staging'. [production]
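The helmfile line corresponds roughly to running the sync from the service's helmfile directory on the deployment host; a sketch, with the checkout path and environment name assumed from common practice:
  cd /srv/deployment-charts/helmfile.d/services/changeprop-jobqueue
  helmfile -e staging sync      # apply the staging release for this namespace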
10:48 <mutante> LDAP - added uid=lulu to group nda (T254121) [production]
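Adding a uid to the nda group is an ldapmodify-style change; an illustrative sketch only, since the directory layout, bind options, and whether a wrapper script was used are all assumptions:
  ldapmodify -H 'ldaps://<ldap-rw-server>' -x -W -D '<admin DN>' <<'EOF'
  dn: cn=nda,ou=groups,dc=wikimedia,dc=org
  changetype: modify
  add: member
  member: uid=lulu,ou=people,dc=wikimedia,dc=org
  EOF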
10:29 <akosiaris> switch over ores1XXX hosts to redis::misc from oresrdb hosts. T254226 [production]
10:12 <jynus> disable non-global root login to gerrit2001 T254162 [production]
10:12 <dzahn@cumin1001> END (PASS) - Cookbook sre.hosts.downtime (exit_code=0) [production]
10:11 <marostegui@cumin1001> dbctl commit (dc=all): 'Fully repool db1121, db1148 T252512', diff saved to https://phabricator.wikimedia.org/P11361 and previous config saved to /var/cache/conftool/dbconfig/20200602-101150-marostegui.json [production]
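The recurring dbctl entries in this log (depool, slowly repool in steps, fully repool) map to a common pattern on the cumin host; a sketch of the underlying commands, where the percentages and exact flag spellings are assumptions to illustrate the flow:
  dbctl instance db1121 depool                          # take the replica out of service
  dbctl config commit -m 'Depool db1121 T252512'
  dbctl instance db1121 pool -p 25                      # repool gradually, raising the percentage in steps
  dbctl config commit -m 'Slowly repool db1121 T252512'
  dbctl instance db1121 pool -p 100
  dbctl config commit -m 'Fully repool db1121 T252512'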
10:09 <dzahn@cumin1001> START - Cookbook sre.hosts.downtime [production]
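This START line and the matching END above are the Icinga downtime cookbook run from a cumin host; a sketch of a typical invocation, where the target host, duration, reason, and exact flags are all assumptions:
  sudo cookbook sre.hosts.downtime --hours 2 -r 'maintenance' somehost1001.eqiad.wmnet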
10:09 <akosiaris> switch over ores2XXX hosts to redis::misc from oresrdb hosts. T254226 [production]
10:02 <marostegui@cumin1001> dbctl commit (dc=all): 'Slowly repool db1121, db1148 T252512', diff saved to https://phabricator.wikimedia.org/P11360 and previous config saved to /var/cache/conftool/dbconfig/20200602-100246-marostegui.json [production]
09:53 <marostegui@cumin1001> dbctl commit (dc=all): 'Slowly repool db1121, db1148 T252512', diff saved to https://phabricator.wikimedia.org/P11359 and previous config saved to /var/cache/conftool/dbconfig/20200602-095321-marostegui.json [production]
09:49 <marostegui@cumin1001> dbctl commit (dc=all): 'Depool db1138', diff saved to https://phabricator.wikimedia.org/P11358 and previous config saved to /var/cache/conftool/dbconfig/20200602-094914-marostegui.json [production]
09:44 <marostegui@cumin1001> dbctl commit (dc=all): 'Slowly repool db1121, db1148 T252512', diff saved to https://phabricator.wikimedia.org/P11357 and previous config saved to /var/cache/conftool/dbconfig/20200602-094441-marostegui.json [production]
09:38 <marostegui@cumin1001> dbctl commit (dc=all): 'Add db1148 to dbctl depooled T252512', diff saved to https://phabricator.wikimedia.org/P11356 and previous config saved to /var/cache/conftool/dbconfig/20200602-093841-marostegui.json [production]
08:59 <ema> upload purged 0.15 to buster-wikimedia [production]
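Publishing a package such as purged 0.15 to the buster-wikimedia suite goes through reprepro on the apt host; a sketch, where the changes filename and any component selection are assumptions:
  reprepro include buster-wikimedia purged_0.15_amd64.changes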
08:09 <mutante> re-imaging contint1001 with buster [production]
07:43 <marostegui> Stop MySQL on db1121 [production]
07:40 <marostegui@cumin1001> dbctl commit (dc=all): 'Depool db1121 to clone db1148', diff saved to https://phabricator.wikimedia.org/P11353 and previous config saved to /var/cache/conftool/dbconfig/20200602-074027-marostegui.json [production]
07:32 <marostegui@cumin1001> dbctl commit (dc=all): 'Repool db1079 after data check', diff saved to https://phabricator.wikimedia.org/P11351 and previous config saved to /var/cache/conftool/dbconfig/20200602-073245-marostegui.json [production]
07:22 <marostegui> Stop slave on db1079 for data check [production]
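A "stop slave for data check" pauses replication so the dataset stays static while tables are compared; the database side reduces to the sketch below (the comparison tooling itself is not shown in the log, and running it via the mysql client is an assumption):
  mysql -e 'STOP SLAVE;'    # pause replication during the comparison
  # ... run the data comparison ...
  mysql -e 'START SLAVE;'   # resume replication before repooling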
07:22 <marostegui@cumin1001> dbctl commit (dc=all): 'Depool db1079 for data check', diff saved to https://phabricator.wikimedia.org/P11350 and previous config saved to /var/cache/conftool/dbconfig/20200602-072214-marostegui.json [production]
07:06 <marostegui> Stop MySQL and poweroff on db1138 for on-site maintenance - T253808 [production]
05:01 <marostegui> Stop mysql on db1141 to save a binary backup - T249188 [production]
01:03 <krinkle@deploy1001> Synchronized wmf-config/mc.php: I06897bcc92c5 (duration: 00m 59s) [production]
2020-06-01
20:14 <shdubsh> downgrade mtail to rc5 in ulsfo -- T254192 [production]
20:12 <XioNoX> enable IX4/6 on cr4-ulsfo - T237575 [production]
19:57 <XioNoX> disable IX4/6 on cr4-ulsfo - T237575 [production]
19:55 <XioNoX> fail vrrp over cr3-ulsfo - T237575 [production]
19:44 <shdubsh> restart atsmtail in eqsin [production]
18:21 <ppchelko@deploy1001> Synchronized wmf-config/InitialiseSettings.php: SWAT: [[gerrit:570395|Enable kask-transition for all wikis]] (duration: 01m 00s) [production]
17:59 <XioNoX> offline cr1-codfw:fpc0 - T254110 [production]
17:47 <XioNoX> turn online cr1-codfw:fpc0 - T254110 [production]
17:46 <shdubsh> update mtail in ulsfo caching hosts. restarting atsmtail and varnishmtail [production]
17:31 <mutante> backup1001 - queued job 42 - gerrit backup after renaming of the file set and addition of LFS data (T254155, T254162); it is incremental, the full one already ran [production]
16:49 <otto@deploy1001> Synchronized wmf-config/InitialiseSettings.php: EventLogging - fix searchsatisfaction schema URI - testwiki only - T249261 (duration: 00m 59s) [production]
16:48 <otto@deploy1001> sync-file aborted: EventLogging - fix searchsatisfaction schema URI - testwiki only - T249261 (duration: 00m 02s) [production]
16:39 <bstorm_> running view updates on db1141 T252219 [production]
14:53 <elukey> ganeti: increase memory available for an-launcher1001 from 8g to 12g - T254125 [production]
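The Ganeti memory bump maps to a backend-parameter change on the cluster master; a sketch, where the instance FQDN, the unit suffix handling, and the need for a reboot are assumptions:
  sudo gnt-instance modify -B memory=12g an-launcher1001.eqiad.wmnet
  sudo gnt-instance reboot an-launcher1001.eqiad.wmnet   # memory changes normally take effect on the next (re)start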
14:44 <volans> deploying ulsfo mgmt DNS records automatically generated by Netbox (operations/dns/+/585545/) - T233183 [production]
12:00 <marostegui@cumin1001> dbctl commit (dc=all): 'Fully repool db1142, db1147 T252512', diff saved to https://phabricator.wikimedia.org/P11345 and previous config saved to /var/cache/conftool/dbconfig/20200601-120000-marostegui.json [production]
11:44 <marostegui@cumin1001> dbctl commit (dc=all): 'Slowly repool db1142, db1147 T252512', diff saved to https://phabricator.wikimedia.org/P11344 and previous config saved to /var/cache/conftool/dbconfig/20200601-114440-marostegui.json [production]
11:30 <marostegui@cumin1001> dbctl commit (dc=all): 'Slowly repool db1142, db1147 T252512', diff saved to https://phabricator.wikimedia.org/P11343 and previous config saved to /var/cache/conftool/dbconfig/20200601-113032-marostegui.json [production]
10:49 <jdrewniak@deploy1001> Synchronized portals: Wikimedia Portals Update: [[gerrit:601328|Bumping portals to master (601328)]] (duration: 00m 59s) [production]
10:48 <jdrewniak@deploy1001> Synchronized portals/wikipedia.org/assets: Wikimedia Portals Update: [[gerrit:601328|Bumping portals to master (601328)]] (duration: 01m 03s) [production]
09:37 <volans@cumin1001> END (PASS) - Cookbook sre.dns.netbox (exit_code=0) [production]
09:30 <volans@cumin1001> START - Cookbook sre.dns.netbox [production]
09:22 <jynus> reenabling puppet on all db/es/pc hosts after deploy of gerrit:599596 [production]
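Re-enabling puppet across a host group like this is usually done from a cumin host with the enable-puppet wrapper, which expects the same reason string used when puppet was disabled; a sketch, where the host alias and the reason text are assumptions:
  sudo cumin 'A:db-all' 'enable-puppet "deploy of gerrit:599596"'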
09:22 <marostegui@cumin1001> dbctl commit (dc=all): 'Slowly repool db1142, db1147 T252512', diff saved to https://phabricator.wikimedia.org/P11342 and previous config saved to /var/cache/conftool/dbconfig/20200601-092220-marostegui.json [production]