2025-04-30
ยง
|
09:55 <jmm@cumin2002> END (PASS) - Cookbook sre.ganeti.drain-node (exit_code=0) for draining ganeti node ganeti2021.codfw.wmnet [production]
09:55 <jmm@cumin2002> END (PASS) - Cookbook sre.hosts.reboot-single (exit_code=0) for host ganeti2021.codfw.wmnet [production]
09:48 <jmm@cumin2002> START - Cookbook sre.hosts.reboot-single for host ganeti2021.codfw.wmnet [production]
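The START/END pairs for sre.hosts.reboot-single and sre.ganeti.drain-node are emitted automatically by Spicerack cookbook runs on the cumin hosts. A minimal sketch of the underlying invocations, assuming the target is passed positionally and any reason/task-ID flags are omitted:

    # hypothetical session on cumin2002; flags beyond the target host are omitted
    sudo cookbook sre.ganeti.drain-node ganeti2021.codfw.wmnet
    sudo cookbook sre.hosts.reboot-single ganeti2021.codfw.wmnet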
09:41 <klausman@cumin1002> END (PASS) - Cookbook sre.hosts.reboot-single (exit_code=0) for host ml-lab1002.eqiad.wmnet [production]
09:38 <elukey@deploy1003> helmfile [ml-serve-codfw] Ran 'sync' command on namespace 'revision-models' for release 'main' . [production]
09:38 <elukey@deploy1003> helmfile [ml-serve-codfw] Ran 'sync' command on namespace 'revertrisk' for release 'main' . [production]
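The helmfile lines are logged by the deployment tooling on deploy1003 after each Kubernetes service sync. A hedged sketch of the equivalent manual run, assuming the usual deployment-charts layout (the directory path and release selector are assumptions, not taken from the log):

    # hypothetical: sync the 'main' release of one ml-serve-codfw service from deploy1003
    cd /srv/deployment-charts/helmfile.d/services/revertrisk
    helmfile -e ml-serve-codfw -l name=main sync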
09:38 <jmm@cumin2002> START - Cookbook sre.ganeti.drain-node for draining ganeti node ganeti2021.codfw.wmnet [production]
09:36 <elukey@deploy1003> helmfile [ml-serve-codfw] Ran 'sync' command on namespace 'readability' for release 'main' . [production]
09:35 <klausman@cumin1002> START - Cookbook sre.hosts.reboot-single for host ml-lab1002.eqiad.wmnet [production]
09:34 <elukey@deploy1003> helmfile [ml-serve-codfw] Ran 'sync' command on namespace 'logo-detection' for release 'main' . [production]
09:32 <elukey@deploy1003> helmfile [ml-serve-codfw] Ran 'sync' command on namespace 'llm' for release 'main' . [production]
09:31 <elukey@deploy1003> helmfile [ml-serve-codfw] Ran 'sync' command on namespace 'article-models' for release 'main' . [production]
09:31 <jmm@cumin2002> END (PASS) - Cookbook sre.ganeti.drain-node (exit_code=0) for draining ganeti node ganeti2020.codfw.wmnet [production]
09:31 <jmm@cumin2002> END (PASS) - Cookbook sre.hosts.reboot-single (exit_code=0) for host ganeti2020.codfw.wmnet [production]
09:30 <marostegui@cumin1002> dbctl commit (dc=all): 'es2031 (re)pooling @ 100%: Repooling', diff saved to https://phabricator.wikimedia.org/P75695 and previous config saved to /var/cache/conftool/dbconfig/20250430-093053-root.json [production]
09:30 <marostegui@cumin1002> dbctl commit (dc=all): 'es1026 (re)pooling @ 100%: Repooling', diff saved to https://phabricator.wikimedia.org/P75694 and previous config saved to /var/cache/conftool/dbconfig/20250430-093040-root.json [production]
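Read bottom-up, the dbctl entries record a staged repool of es1026 and es2031 (40% at 08:30, then 50%, 60%, 75%, and finally 100% here at 09:30). A minimal sketch of one step, assuming the standard dbctl workflow on a cumin host (the exact flags are an assumption, not copied from the log):

    # hypothetical: raise the pooling percentage, then commit the config change
    sudo dbctl instance es2031 pool -p 100
    sudo dbctl config commit -m 'es2031 (re)pooling @ 100%: Repooling'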
09:30 <elukey@deploy1003> helmfile [ml-serve-codfw] Ran 'sync' command on namespace 'article-descriptions' for release 'main' . [production]
09:29 <elukey@deploy1003> helmfile [ml-serve-codfw] Ran 'sync' command on namespace 'revscoring-drafttopic' for release 'main' . [production]
09:28 <elukey@deploy1003> helmfile [ml-serve-codfw] Ran 'sync' command on namespace 'revscoring-draftquality' for release 'main' . [production]
09:28 <godog> bounce prometheus-statsd-exporter on stat1011 - T389344 [production]
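The "bounce" above is a restart of the statsd exporter on stat1011. A hedged sketch of one way to do it from a cumin host, assuming the systemd unit is named after the package (the unit name and host query are assumptions):

    # hypothetical: restart the exporter via cumin
    sudo cumin 'stat1011*' 'systemctl restart prometheus-statsd-exporter'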
09:27 <elukey@deploy1003> helmfile [ml-serve-codfw] Ran 'sync' command on namespace 'revscoring-articletopic' for release 'main' . [production]
09:26 <elukey@deploy1003> helmfile [ml-serve-codfw] Ran 'sync' command on namespace 'revscoring-articlequality' for release 'main' . [production]
09:24 <jmm@cumin2002> START - Cookbook sre.hosts.reboot-single for host ganeti2020.codfw.wmnet [production]
09:22 <elukey@deploy1003> helmfile [ml-serve-codfw] Ran 'sync' command on namespace 'revscoring-editquality-goodfaith' for release 'main' . [production]
09:18 <elukey@deploy1003> helmfile [ml-serve-codfw] Ran 'sync' command on namespace 'revscoring-editquality-damaging' for release 'main' . [production]
09:17 <elukey> manual restart of the waterline service on maps1009 [production]
09:17 <jmm@cumin2002> START - Cookbook sre.ganeti.drain-node for draining ganeti node ganeti2020.codfw.wmnet [production]
09:16 <jmm@cumin2002> END (PASS) - Cookbook sre.hosts.reboot-single (exit_code=0) for host bast7001.wikimedia.org [production]
09:15 <marostegui@cumin1002> dbctl commit (dc=all): 'es2031 (re)pooling @ 75%: Repooling', diff saved to https://phabricator.wikimedia.org/P75693 and previous config saved to /var/cache/conftool/dbconfig/20250430-091547-root.json [production]
09:15 <marostegui@cumin1002> dbctl commit (dc=all): 'es1026 (re)pooling @ 75%: Repooling', diff saved to https://phabricator.wikimedia.org/P75692 and previous config saved to /var/cache/conftool/dbconfig/20250430-091534-root.json [production]
09:12 <vgutierrez@cumin1002> END (PASS) - Cookbook sre.hosts.reboot-single (exit_code=0) for host lvs1013.eqiad.wmnet [production]
09:10 <jmm@cumin2002> START - Cookbook sre.hosts.reboot-single for host bast7001.wikimedia.org [production]
09:10 <jmm@cumin2002> END (PASS) - Cookbook sre.ganeti.drain-node (exit_code=0) for draining ganeti node ganeti2019.codfw.wmnet [production]
09:10 <vgutierrez@cumin1002> START - Cookbook sre.hosts.reboot-single for host lvs1013.eqiad.wmnet [production]
09:09 <jmm@cumin2002> END (PASS) - Cookbook sre.hosts.reboot-single (exit_code=0) for host ganeti2019.codfw.wmnet [production]
09:04 <vgutierrez@cumin1002> END (PASS) - Cookbook sre.loadbalancer.admin (exit_code=0) depooling P{lvs1013.eqiad.wmnet} and A:liberica [production]
09:03 <vgutierrez@cumin1002> START - Cookbook sre.loadbalancer.admin depooling P{lvs1013.eqiad.wmnet} and A:liberica [production]
09:02 <jmm@cumin2002> START - Cookbook sre.hosts.reboot-single for host ganeti2019.codfw.wmnet [production]
09:00 <marostegui@cumin1002> dbctl commit (dc=all): 'es2031 (re)pooling @ 60%: Repooling', diff saved to https://phabricator.wikimedia.org/P75691 and previous config saved to /var/cache/conftool/dbconfig/20250430-090041-root.json [production]
09:00 <marostegui@cumin1002> dbctl commit (dc=all): 'es1026 (re)pooling @ 60%: Repooling', diff saved to https://phabricator.wikimedia.org/P75690 and previous config saved to /var/cache/conftool/dbconfig/20250430-090028-root.json [production]
09:00 <jmm@cumin2002> END (PASS) - Cookbook sre.hosts.reboot-single (exit_code=0) for host bast5004.wikimedia.org [production]
08:54 <jmm@cumin2002> START - Cookbook sre.hosts.reboot-single for host bast5004.wikimedia.org [production]
08:53 <jmm@cumin2002> START - Cookbook sre.ganeti.drain-node for draining ganeti node ganeti2019.codfw.wmnet [production]
08:45 <marostegui@cumin1002> dbctl commit (dc=all): 'es2031 (re)pooling @ 50%: Repooling', diff saved to https://phabricator.wikimedia.org/P75689 and previous config saved to /var/cache/conftool/dbconfig/20250430-084537-root.json [production]
08:45 <marostegui@cumin1002> dbctl commit (dc=all): 'es1026 (re)pooling @ 50%: Repooling', diff saved to https://phabricator.wikimedia.org/P75688 and previous config saved to /var/cache/conftool/dbconfig/20250430-084523-root.json [production]
08:35 <hashar@deploy1003> rebuilt and synchronized wikiversions files: group0 to 1.44.0-wmf.27 refs T386222 [production]
08:33 <Emperor> ms-be1060 T392796 /usr/local/bin/swift_ring_manager -o /var/cache/swift_rings --doit --skip-dispersion-check --skip-replication-check --immediate-only -v [production]
08:30 <marostegui@cumin1002> dbctl commit (dc=all): 'es2031 (re)pooling @ 40%: Repooling', diff saved to https://phabricator.wikimedia.org/P75687 and previous config saved to /var/cache/conftool/dbconfig/20250430-083032-root.json [production]
08:30 <marostegui@cumin1002> dbctl commit (dc=all): 'es1026 (re)pooling @ 40%: Repooling', diff saved to https://phabricator.wikimedia.org/P75686 and previous config saved to /var/cache/conftool/dbconfig/20250430-083017-root.json [production]
08:29 <hashar> Rolled back MediaWiki train from group 1 to group 0 due to T392988 # T386222 [production]