2021-07-07
12:05 <jmm@cumin2002> END (PASS) - Cookbook sre.ganeti.makevm (exit_code=0) for new host mx1002.wikimedia.org [production]
11:49 <jmm@cumin2002> START - Cookbook sre.ganeti.makevm for new host mx1002.wikimedia.org [production]
11:43 <jmm@cumin2002> END (PASS) - Cookbook sre.ganeti.makevm (exit_code=0) for new host mx2002.wikimedia.org [production]
11:29 <jmm@cumin2002> START - Cookbook sre.ganeti.makevm for new host mx2002.wikimedia.org [production]
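[Note: the mx1002/mx2002 entries above come from Spicerack cookbook runs on a cumin host; the runner writes the START and END lines (with exit code) to this log automatically. A minimal sketch of such an invocation, assuming the standard cookbook runner and that the makevm cookbook's own arguments (the new VM's name, sizing, etc.) are taken from its --help output rather than known here:

    # On a cumin host: inspect the cookbook's arguments, then run it.
    # The exact sre.ganeti.makevm options are assumptions; check --help first.
    sudo cookbook sre.ganeti.makevm --help
    sudo cookbook sre.ganeti.makevm <arguments for the new VM, e.g. its FQDN>
]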
11:21 <marostegui@cumin1001> dbctl commit (dc=all): 'db2087:3316 (re)pooling @ 100%: Repool after index change', diff saved to https://phabricator.wikimedia.org/P16782 and previous config saved to /var/cache/conftool/dbconfig/20210707-112149-root.json [production]
11:06 <marostegui@cumin1001> dbctl commit (dc=all): 'db2087:3316 (re)pooling @ 75%: Repool after index change', diff saved to https://phabricator.wikimedia.org/P16781 and previous config saved to /var/cache/conftool/dbconfig/20210707-110645-root.json [production]
10:51 <marostegui@cumin1001> dbctl commit (dc=all): 'db2087:3316 (re)pooling @ 50%: Repool after index change', diff saved to https://phabricator.wikimedia.org/P16780 and previous config saved to /var/cache/conftool/dbconfig/20210707-105142-root.json [production]
10:36 <marostegui@cumin1001> dbctl commit (dc=all): 'db2087:3316 (re)pooling @ 25%: Repool after index change', diff saved to https://phabricator.wikimedia.org/P16779 and previous config saved to /var/cache/conftool/dbconfig/20210707-103638-root.json [production]
10:35 <marostegui@cumin1001> dbctl commit (dc=all): 'Depool db2087:3316', diff saved to https://phabricator.wikimedia.org/P16778 and previous config saved to /var/cache/conftool/dbconfig/20210707-103553-marostegui.json [production]
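[Note: the five db2087:3316 entries above show the usual staged repool after maintenance: depool, then pool at 25/50/75/100% roughly fifteen minutes apart, committing each step so dbctl saves the diff to Phabricator and the previous config under /var/cache/conftool/dbconfig/. A minimal sketch of that pattern, assuming the dbctl subcommands instance ... depool, instance ... pool -p N, and config commit -m behave as documented on Wikitech:

    # Staged repool of one MariaDB instance (sketch; host:port section syntax assumed).
    host=db2087:3316
    dbctl instance "$host" depool
    dbctl config commit -m "Depool $host"
    for pct in 25 50 75 100; do
        dbctl instance "$host" pool -p "$pct"
        dbctl config commit -m "$host (re)pooling @ ${pct}%: Repool after index change"
        sleep 900   # let traffic settle before the next step
    done
]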
07:56 <moritzm> bounced elasticsearch_5@production-logstash-eqiad on logstash1009 [production]
07:03 <oblivian@deploy1002> helmfile [staging] Ran 'sync' command on namespace 'mwdebug' for release 'pinkunicorn' . [production]
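[Note: the helmfile lines in this log are produced by the deployment-charts tooling on the deploy host, which runs helmfile per service directory and per release. A rough equivalent of the mwdebug/pinkunicorn entry above, assuming the charts live under /srv/deployment-charts/helmfile.d/services/ and that the release is addressed with helmfile's --selector flag:

    # On the deploy host (sketch; directory layout and selector name are assumptions).
    cd /srv/deployment-charts/helmfile.d/services/mwdebug
    helmfile -e staging --selector name=pinkunicorn sync
]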
2021-07-06
18:34 <otto@deploy1002> helmfile [staging] Ran 'sync' command on namespace 'eventgate-analytics' for release 'production' . [production]
18:34 <otto@deploy1002> helmfile [staging] Ran 'sync' command on namespace 'eventgate-analytics' for release 'canary' . [production]
18:03 <otto@deploy1002> helmfile [staging] Ran 'sync' command on namespace 'eventgate-analytics' for release 'canary' . [production]
18:03 <otto@deploy1002> helmfile [staging] Ran 'sync' command on namespace 'eventgate-analytics' for release 'production' . [production]
17:25 <joal@deploy1002> Finished deploy [analytics/refinery@419d1f0] (hadoop-test): Analytics deploy for Gobblin replacing Camus - HADOOP-TEST [analytics/refinery@419d1f0] (duration: 05m 31s) [production]
17:20 <joal@deploy1002> Started deploy [analytics/refinery@419d1f0] (hadoop-test): Analytics deploy for Gobblin replacing Camus - HADOOP-TEST [analytics/refinery@419d1f0] [production]
17:19 <joal@deploy1002> Finished deploy [analytics/refinery@419d1f0] (thin): Analytics deploy for Gobblin replacing Camus - THIN [analytics/refinery@419d1f0] (duration: 00m 07s) [production]
17:19 <joal@deploy1002> Started deploy [analytics/refinery@419d1f0] (thin): Analytics deploy for Gobblin replacing Camus - THIN [analytics/refinery@419d1f0] [production]
17:19 <joal@deploy1002> Finished deploy [analytics/refinery@419d1f0]: Analytics deploy for Gobblin replacing Camus [analytics/refinery@419d1f0] (duration: 36m 59s) [production]
16:42 <joal@deploy1002> Started deploy [analytics/refinery@419d1f0]: Analytics deploy for Gobblin replacing Camus [analytics/refinery@419d1f0] [production]
15:54 <otto@deploy1002> Finished deploy [analytics/refinery@a8e79f3] (hadoop-test): analytics test cluster deploy for webrequest_test gobblin job migration (duration: 05m 24s) [production]
15:48 <otto@deploy1002> Started deploy [analytics/refinery@a8e79f3] (hadoop-test): analytics test cluster deploy for webrequest_test gobblin job migration [production]
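[Note: the Started/Finished deploy pairs above are emitted by scap when deploying the analytics/refinery repository from the deploy host; the suffix in parentheses (thin, hadoop-test) names the deployment environment. A minimal sketch, assuming scap's -e flag selects that environment and the quoted string becomes the log message:

    # From the analytics/refinery checkout on the deploy host (sketch).
    scap deploy -e hadoop-test 'analytics test cluster deploy for webrequest_test gobblin job migration'
]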
14:00 <marostegui@cumin1001> dbctl commit (dc=all): 'db2072 (re)pooling @ 100%: Repool after index change', diff saved to https://phabricator.wikimedia.org/P16777 and previous config saved to /var/cache/conftool/dbconfig/20210706-140049-root.json [production]
13:53 <otto@cumin1001> END (PASS) - Cookbook sre.aqs.roll-restart (exit_code=0) [production]
13:49 <otto@cumin1001> START - Cookbook sre.aqs.roll-restart [production]
13:49 <otto@cumin1001> END (FAIL) - Cookbook sre.aqs.roll-restart (exit_code=99) [production]
13:49 <otto@cumin1001> START - Cookbook sre.aqs.roll-restart [production]
13:45 <marostegui@cumin1001> dbctl commit (dc=all): 'db2072 (re)pooling @ 75%: Repool after index change', diff saved to https://phabricator.wikimedia.org/P16776 and previous config saved to /var/cache/conftool/dbconfig/20210706-134545-root.json [production]
13:30 <marostegui@cumin1001> dbctl commit (dc=all): 'db2072 (re)pooling @ 50%: Repool after index change', diff saved to https://phabricator.wikimedia.org/P16775 and previous config saved to /var/cache/conftool/dbconfig/20210706-133041-root.json [production]
13:15 <marostegui@cumin1001> dbctl commit (dc=all): 'db2072 (re)pooling @ 25%: Repool after index change', diff saved to https://phabricator.wikimedia.org/P16774 and previous config saved to /var/cache/conftool/dbconfig/20210706-131537-root.json [production]
12:02 <marostegui@cumin1001> dbctl commit (dc=all): 'db2071 (re)pooling @ 100%: Repool after index change', diff saved to https://phabricator.wikimedia.org/P16773 and previous config saved to /var/cache/conftool/dbconfig/20210706-120242-root.json [production]
11:58 <marostegui@cumin1001> dbctl commit (dc=all): 'Depool db2072', diff saved to https://phabricator.wikimedia.org/P16772 and previous config saved to /var/cache/conftool/dbconfig/20210706-115820-marostegui.json [production]
11:57 <marostegui@cumin1001> dbctl commit (dc=all): 'Depool db1118', diff saved to https://phabricator.wikimedia.org/P16771 and previous config saved to /var/cache/conftool/dbconfig/20210706-115732-marostegui.json [production]
11:47 <marostegui@cumin1001> dbctl commit (dc=all): 'db2071 (re)pooling @ 75%: Repool after index change', diff saved to https://phabricator.wikimedia.org/P16770 and previous config saved to /var/cache/conftool/dbconfig/20210706-114739-root.json [production]
11:32 <marostegui@cumin1001> dbctl commit (dc=all): 'db2071 (re)pooling @ 50%: Repool after index change', diff saved to https://phabricator.wikimedia.org/P16769 and previous config saved to /var/cache/conftool/dbconfig/20210706-113235-root.json [production]
11:17 <marostegui@cumin1001> dbctl commit (dc=all): 'db2071 (re)pooling @ 25%: Repool after index change', diff saved to https://phabricator.wikimedia.org/P16768 and previous config saved to /var/cache/conftool/dbconfig/20210706-111731-root.json [production]
11:16 <marostegui@cumin1001> dbctl commit (dc=all): 'Depool db2071', diff saved to https://phabricator.wikimedia.org/P16767 and previous config saved to /var/cache/conftool/dbconfig/20210706-111635-marostegui.json [production]
10:19 <moritzm> installing jackson-databind security updates on buster [production]
09:01 <_joe_> repooling wdqs1007 now that lag has caught up [production]
08:43 <moritzm> installing libuv1 security updates on buster [production]
07:06 <marostegui> Upgrade db1104 kernel [production]
06:54 <moritzm> installing PHP 7.3 security updates on buster [production]
06:50 <marostegui> Upgrade db1122 kernel [production]
06:35 <marostegui> Upgrade db1138 kernel [production]
06:31 <marostegui> Upgrade db1160 kernel [production]
00:56 <eileen> process-control config revision is 8d46b52ed4 [production]
2021-07-05
17:40 <legoktm> published fixed docker-registry.discovery.wmnet/nodejs10-devel:0.0.4 image (T286212) [production]
15:24 <_joe_> leaving wdqs1007 depooled so that the updater can recover faster, now at 16.5 hours of lag [production]
14:01 <moritzm> uploaded nginx 1.13.9-1+wmf3 for stretch-wikimedia [production]