2010-11-05
23:08 <tfinc> synchronized php-1.5/wmf-config/CommonSettings.php 'Turning off cn on all but testing wikis before scap' [production]
23:03 <tfinc> synchronized php-1.5/wmf-config/CommonSettings.php 'Setting wgCentralDBname to meta' [production]
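The setting changed in the sync above is CentralNotice's central database pointer. A sketch of what the CommonSettings.php change likely looked like (the exact database name is not in the log and is illustrative here):

```php
// In wmf-config/CommonSettings.php (sketch): point CentralNotice's
// central wiki database at meta.
$wgCentralDBname = 'metawiki';
```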
20:34 <tfinc> synchronized php-1.5/wmf-config/CommonSettings.php 'Picking up url fix so that udp2log doesn't double count on meta' [production]
20:33 <tfinc> synchronized php-1.5/extensions/CentralNotice/SpecialBannerController.php 'Picking up url fix so that udp2log doesn't double count on meta' [production]
20:15 <RobH> There are no longer any memcached servers in the decommissioned server range. If there are any issues from the changes, the original mc.php is named mc.php.old and will be removed in 72 hours if there are no mishaps [production]
20:14 <robh> synchronized php-1.5/wmf-config/mc.php 'of course one dies AS i sync it' [production]
20:12 <robh> synchronized php-1.5/wmf-config/mc.php 'on secondary review, missed two old servers, removed and updated' [production]
20:08 <RobH> tested new memcached config, all servers working [production]
20:08 <robh> synchronized php-1.5/wmf-config/mc.php 'removed the older servers below srv150 and replaced with tested good new memcached servers' [production]
19:55 <catrope> synchronized images/wikimedia-button.png 'Let's try that with an actual PNG file rather than HTML' [production]
19:50 <catrope> synchronized images/wikimedia-button.png 'New Powered by Wikimedia button' [production]
19:11 <catrope> synchronized php-1.5/skins/common/images/poweredby_mediawiki_88x31.png 'r76126' [production]
19:00 <rfaulk> Added "httpagentparser" Python package on grosley.wikimedia.org from publicly available distutils distribution - this package assists in parsing user-agent header strings found in the 2010/11 fundraiser squid logs [production]
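As a rough, stdlib-only illustration of the kind of user-agent parsing httpagentparser is used for here (the patterns and function name below are hypothetical, not httpagentparser's actual API):

```python
import re

# Simplified sketch: map a User-Agent header from a squid log line to a
# (browser family, version) pair. Real parsers carry far more patterns.
UA_PATTERNS = [
    ("Firefox", re.compile(r"Firefox/([\d.]+)")),
    ("Chrome", re.compile(r"Chrome/([\d.]+)")),
    ("MSIE", re.compile(r"MSIE ([\d.]+)")),
]

def detect_browser(ua):
    """Return (family, version) for the first matching pattern, else None."""
    for family, pattern in UA_PATTERNS:
        m = pattern.search(ua)
        if m:
            return family, m.group(1)
    return None

# e.g. detect_browser("Mozilla/5.0 (...) Firefox/3.6.12") -> ("Firefox", "3.6.12")
```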
18:58 <rfaulk> Added "setuptools" Python package on grosley.wikimedia.org with apt-get for Wikimedia 2010/11 fundraiser work - This package enables installation of python packages distributed with Python distutils [production]
18:22 <RobH> srv284 is having some booting issues, seems to be harddisk related, but since drac output is slightly garbled, unable to confirm. new rt# 376 [production]
17:54 <RobH> srv284 unresponsive to console, rebooting and fixing it to bring it back into service [production]
17:53 <RobH> srv266 back online and in service [production]
17:52 <mark> Shutdown browne and srv2 for decommissioning - thereby removing the last traces of Fedora from the cluster. Goodbye! [production]
17:43 <RobH> srv266 unresponsive to remote console, rebooting and updating [production]
17:42 <RobH> srv206 fixed, pushed back into lvs [production]
17:25 <RobH> working on srv206, disregard any errors it throws [production]
16:40 <RobH> issue with the new api servers is fixed and they are now back in service [production]
16:04 <RobH> some new api servers are not working right, depooled until they are fixed [production]
15:58 <mark> Removed ibis IPs from Squid ACLs; invalid requests issue has been resolved [production]
15:57 <mark> Fixed NFS mounts on apaches that had them missing since the wikimedia-task-appserver upgrade [production]
15:26 <RobH> working on sq57, disregard flapping [production]
15:24 <RobH> new api apaches srv290-srv301 are online, except srv298 which needs drac correction before installation [production]
15:22 <RobH> dropping old entry for tenwiki in apache config and resyncing/restarting apaches to eliminate error message [production]
15:18 <RobH> pushing srv291-srv301 into lvs [production]
15:11 <RobH> doing puppet runs on srv292-srv301 before pushing them into service [production]
14:57 <mark> Hacked out the 'remotemount' lines in /var/lib/dpkg/info/wikimedia-task-appserver.postrm files to prevent apaches from being without NFS mounts during/between puppet runs and package upgrades [production]
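The postrm hack above amounts to disabling every 'remotemount' line in the maintainer script so package removal no longer tears down NFS mounts. A hedged sketch of that edit as a text transformation (the function name and comment marker are illustrative; the actual change was presumably done by hand or with sed):

```python
def neutralize_remotemount(text):
    """Comment out any active line mentioning 'remotemount' in a
    dpkg postrm script, leaving already-commented lines untouched."""
    out = []
    for line in text.splitlines():
        if "remotemount" in line and not line.lstrip().startswith("#"):
            out.append("# " + line)  # disabled so NFS stays mounted
        else:
            out.append(line)
    return "\n".join(out)
```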
14:23 <mark> Deploying new package wikimedia-task-appserver 1.46 across the cluster, which removes configuration files (now handled by Puppet) [production]
11:59 <catrope> synchronized php-1.5/includes/api/ApiLogin.php 'Revert r76078' [production]
11:49 <catrope> synchronized php-1.5/includes/api/ApiLogin.php 'r76078' [production]
05:57 <apergos> failure booting into be3 on ms4, had to back out. so, no progress, we are back to where we were before the reboots. [production]
05:40 <apergos> cleared up luactivate error, shutdown ms4 again, trying to boot into alt boot environment [production]
05:16 <apergos> used shutdown on ms4, be3 showed as "active on reboot" but it booted into be0 (old boot environment) nonetheless. *grumble* [production]
05:06 <apergos> rebooted ms4 into alt boot environment with current patches applied [production]
00:18 <RobH> new api servers are not copying down the data correctly and not reflecting config changes in puppet, so they fail, srv290+ not online yet [production]
2010-11-04
23:06 <RobH> running puppet across the new api servers srv290-srv301 then will push them into service later when i figure out why they are not doing what I want ;P [production]
20:13 <RobH> sq51 hates me [production]
20:11 <RobH> new api servers srv290-301 are installed and showing in ganglia, having issues getting the first couple to pool into lvs before i push the rest into service [production]
20:09 <RobH> fixed sq51 [production]
19:29 <RoanKattouw> Strike that, have backed out changes [production]
19:06 <RoanKattouw> Until Mark's made sure they're good, that is [production]
19:06 <RoanKattouw> Changing some files in wmf-deployment/includes/media . DO NOT RUN SCAP or otherwise deploy these changes! [production]
18:36 <RobH> added dns entries for payments [production]
17:59 <RobH> doing puppet runs and final setup for srv290-srv301 [production]
16:56 <rfaulk> Added numpy Python package to grosley.wikimedia.org with apt-get ... For use in the 2010/11 fundraiser to facilitate stats gathering by providing scientific computing functionality in Python [production]
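To illustrate the kind of summary statistics the fundraiser analysis needs, here is a minimal sketch using the stdlib `statistics` module in place of numpy (the function name and donation amounts are hypothetical):

```python
import statistics

def summarize(amounts):
    """Mean and median of a list of donation amounts."""
    return statistics.mean(amounts), statistics.median(amounts)

# Hypothetical amounts parsed from a fundraiser log:
print(summarize([5.0, 10.0, 20.0, 100.0, 25.0]))  # (32.0, 20.0)
```

numpy would be preferred over this at scale for vectorized operations over large log extracts, which is presumably why it was installed.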
16:43 <rfaulk> Added MySQLdb Python package on grosley.wikimedia.org with apt-get ... This package will be used to access fundraising databases to facilitate the gathering and synthesis of relevant statistics for the 2010/11 Wikimedia fundraiser [production]