Re: [Wikitech-l] Problem with CSS Janus License
On Fri, May 11, 2012 at 9:59 AM, Rob Lanphier ro...@wikimedia.org wrote:
> I think this is a solvable problem without changing our current licensing. Let's not have a big legal discussion on-list about this (or if we must hash this out publicly, let's do it on a list that deals with legal issues rather than tech issues).

Hi folks, I commented here: https://bugzilla.wikimedia.org/36747 ...but haven't followed up on list, so I'll do that now.

MediaWiki is "GPLv2 or later", allowing licensees to comply with either GPL v2 or v3. So someone who wants to distribute the code can comply by conforming to the terms of the Apache-compatible GPL v3. This means we don't have to explicitly move to GPL v3 only. Alternatively, anyone who wants to comply only with GPL v2 can retain the option of excising the ASL code.

We may still wish to explore the possibility of explicitly moving to the GPL v3 (with a similar "or later" clause), but there's nothing about including Apache-licensed code that forces that conversation.

Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Guidelines for db schema changes
On Tue, Apr 24, 2012 at 5:52 PM, Rob Lanphier ro...@wikimedia.org wrote:
> Assuming this seems sensible to everyone, I can update this page with this: http://www.mediawiki.org/wiki/Development_policy

And this is done now. In case you aren't using a threaded mail client, here's the original discussion: http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/60967

Rob
Re: [Wikitech-l] Problem with CSS Janus License
I think this is a solvable problem without changing our current licensing. Let's not have a big legal discussion on-list about this (or if we must hash this out publicly, let's do it on a list that deals with legal issues rather than tech issues). I've assigned the bug to myself, and I'll be consulting with our legal staff. Before I say much more, I'd like to talk to them.

Rob

On Fri, May 11, 2012 at 8:03 AM, Mark A. Hershberger m...@wikimedia.org wrote:
> Stephen Smoogen has opened a bug about the license in CSS Janus. This needs wider discussion, though, so I'm copying it here. From https://bugzilla.wikimedia.org/36747:
>
> "I am the maintainer of the mediawiki package for the Fedora EPEL project. While putting together the package for 1.19, it was found that the license of maintenance/cssjanus is ASL 2.0 and the license of MediaWiki is GPLv2+. GPL 2.0 and ASL 2.0 are not compatible according to the FSF, so I am trying to work out my options and to find out the MediaWiki project's rationale for bundling the two items together. 1) cssjanus has a GPL exception to its ASL license that mediawiki knows of. 2) we need to look at mediawiki being used as GPL 3.0 even though it is not explicitly licensed that way."
>
> I'll point the bug to the on-list discussion, so please follow up here.
>
> -- Mark A. Hershberger Bugmeister Wikimedia Foundation m...@wikimedia.org
Re: [Wikitech-l] BugZilla Portal (NEW)
On Wed, May 2, 2012 at 8:28 PM, Krinkle krinklem...@gmail.com wrote: https://toolserver.org/~krinkle/wmfBugZillaPortal/ Oooo, nice, thanks for this Krinkle! Rob
Re: [Wikitech-l] OAuth
On Fri, Apr 27, 2012 at 4:05 PM, Chris Steipp cste...@wikimedia.org wrote:
> If you have an OAuth use case / user story, please update: http://www.mediawiki.org/wiki/OAuth/User_stories

Hi everyone, I'd like to second this ^^^ Support for OAuth is not a binary thing, and so we'll need these user stories to start in earnest on this. Here's how I see the process of building this feature working. Chris is available to lead all of this, but if someone wants to volunteer to do any of this, we don't want to get in the way:

Phase 1: User story gathering. User stories seem to be the most sensible way of doing requirements gathering in a collaborative way, since it's easier to prioritize a big list of user stories than it is to prioritize abstract requirements.

Phase 2: User story prioritization and requirements distillation. Once the user stories stop trickling in, we can do rough prioritization on the stories (high, medium, low). This is going to be guided by a combination of community input and technical feasibility. We'll need to distill the requirements from the user stories to find common functionality between various stories.

Phase 3: Making the backlog. We'll then construct the initial backlog based on some number of the higher-priority user stories.

Phase 4: Implementation. Start chipping away at the backlog.

We have some smaller security projects that we'd like Chris to code on before starting in on this, so it'll be a while before Chris starts in on phase 4. However, he's available to start this process now, so if there's an experienced volunteer developer who can pair with Chris now, we could make this happen a lot faster. Any takers?

Rob
[Wikitech-l] Guidelines for db schema changes
Hi everyone,

As we do more frequent deploys, it's going to become critical that we get database schema changes correct, and that we do so in a way that gives us time to prepare for said changes and roll back to old versions of the software should a deploy go poorly. This applies both to MediaWiki core and to WMF-deployed extensions. I'd like to propose that we make the following standard practice:

1. All schema changes must go through a period of being optional. For example, instead of changing the format of a column, create a new column, make all writes happen to the old and new column (if it exists), and deprecate use of the old column. Check whether the new column exists before blindly assuming that it does. Only eliminate support for the old column after it's clear the schema migration has happened and there's no chance that we'll need to roll back to the old version of the software.

2. There might be cases where rule #1 is prohibitive from a performance perspective. However, schema changes like that should be rare to begin with, and should have prominent discussion on this list. In cases where it's impossible to follow rule #1, it is still critical to write scripts to roll back to the pre-change state.

3. For anything that involves a schema change to the production dbs, make sure Asher Feldman (afeld...@wikimedia.org) is on the reviewer list. He's already keeping an eye on this stuff the best he can, but it's going to be easy for him to miss changes in extensions should they happen.

I don't have a strong opinion about whether we need to follow rule #1 above through an iteration of our six-month tarball release cycle, but we at least need to follow it through the two-week deployment cycle.
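To make rule #1 concrete, here's a minimal sketch of the dual-write pattern it describes. This is Python with SQLite purely for illustration (MediaWiki itself is PHP), and the table and column names are made up, not real MediaWiki schema:

```python
import sqlite3

# Hypothetical illustration of rule #1: during an "optional" schema change,
# code keeps writing the deprecated column and writes the new column only
# if the migration has already created it. Table/column names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE page (page_id INTEGER PRIMARY KEY, title_old TEXT)")

def column_exists(conn, table, column):
    """Inspect the live schema rather than assuming the migration ran."""
    cols = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
    return column in cols

def save_title(conn, page_id, title):
    # Always write the old column, so rolling back to the previous
    # software version (which only reads title_old) remains safe.
    conn.execute(
        "INSERT OR REPLACE INTO page (page_id, title_old) VALUES (?, ?)",
        (page_id, title))
    # Write the new column only if it exists yet.
    if column_exists(conn, "page", "title_new"):
        conn.execute("UPDATE page SET title_new = ? WHERE page_id = ?",
                     (title, page_id))

save_title(conn, 1, "Main Page")   # before the migration: old column only
conn.execute("ALTER TABLE page ADD COLUMN title_new TEXT")
save_title(conn, 2, "Sandbox")     # after the migration: both columns written
rows = conn.execute(
    "SELECT title_old, title_new FROM page ORDER BY page_id").fetchall()
print(rows)  # → [('Main Page', None), ('Sandbox', 'Sandbox')]
```

Once the migration has clearly succeeded everywhere and rollback is no longer a concern, the old-column write and the existence check can both be dropped.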
Assuming this seems sensible to everyone, I can update this page with this: http://www.mediawiki.org/wiki/Development_policy (/me desperately tries to avoid yak shaving and updating the policy above for Git) Rob
[Wikitech-l] New Engineering Community Group, headed by Sumana Harihareswara
Hi everyone,

I'm happy to announce that we have promoted Sumana Harihareswara to manager of the Engineering Community group.

Sumana started with us as a contractor back in February 2011, initially in a targeted engagement to help out with Google Summer of Code and with the Berlin Hackathon last year. Later that year, as we interviewed people to bring in as Volunteer Development Coordinator, not only did Sumana put in a strong application herself, but she also recruited very worthy competition for the role. After winning the role, she worked tirelessly to straighten out many kinks in our processes around volunteer development, and systematically ensured that new volunteer developers get the recognition and (if needed) help they deserve. She has also applied focus and organization in many areas outside of her immediate purview, for example, recently stepping in as project manager for Git, and occasionally filling in for me when I've been unavailable for the larger Platform Engineering organization.

The promotion to Engineering Community Manager isn't so much a change in the way things are done here as an official recognition of a vital role that she has already played for the past year. Sumana has been working with Guillaume Paumier and Mark Hershberger under the somewhat ad hoc group title of "Technical Liaison; Developer Relations" (tl;dr), serving as lead of that group since last year. Under the new Engineering Community name, this group will continue to serve many roles: facilitating collaboration and communication between the Wikimedia Foundation and its employees and the larger Wikimedia developer community, as well as facilitating collaboration and communication between the Wikimedia developer community and other Wikimedia communities.

Thank you, Sumana, for your hard work over the past year. I'm looking forward to seeing what you and the group accomplish moving forward. Congratulations!
Rob
[Wikitech-l] Welcome, Chris Steipp
Hi everyone,

I’d like to introduce Chris Steipp, who starts today as our Senior Security Engineer. Chris comes to us from Global Media Outreach, where he served as their CTO. Before that, he worked at Novacoast, doing security consulting. He went to school at Royal Holloway in Egham, UK (just outside London).

The Software Security Engineer role was originally conceived as a means of increasing our code review capacity. One bottleneck in our code review process was when a specific security review was needed, because most people weren’t confident performing a security review themselves, nor confident in each other’s security reviews. That led to a situation where we had one person (Tim Starling) who would be the bottleneck for complicated security reviews. Now with Chris on board, we have someone (else) whose job it is to do security reviews.

However, that won't be Chris's sole responsibility. A lot of Chris's time will be spent designing and developing new features and enhancing existing features of Wikimedia systems, with a particular focus on features requiring expertise in security (such as improved HTTPS support, better/different authentication features, and other handling of sensitive data). Chris is a friendly and enthusiastic teacher, and is planning to lead secure development training for the organization.

Chris lives here in the SF Bay Area with his (soon to be growing) family. Please join me in welcoming Chris!

Rob
Re: [Wikitech-l] Lua: return versus print
On Fri, Apr 13, 2012 at 7:19 AM, Gabriel Wicke wi...@wikidev.net wrote:
> From a language perspective, I would much prefer return values instead of side effects, even if those side effects could be converted into a return value with a special print implementation.

I think I agree with Gabriel here (and this looks like the quickly forming consensus). More reasons why this seems like the right choice:

1. We should be conservative in what we initially support, and only add more if we need it. Return values are the most general solution, which we're almost certainly going to need no matter what, whereas output via print is an optimization.

2. We should make this environment one that is fun for good programmers to write clear code in, so as to attract good programmers and encourage collaboration, and make everyone feel like learning how our system works has applicability in other parts of their lives. Side effects are a paving stone toward tangled single-programmer write-only code.

3. Premature optimization is the root of all evil. I am no Lua expert, but would guess that the usual collect-in-list-and-finally-join method can avoid the performance penalty in Lua too. If this type of technique works and becomes important, we can probably introduce patterns and possibly helper functions to make it the easy default choice.

I imagine there's going to be a lot of cut-and-paste going on, so if we can establish best practices early (keeping a close eye on how it's being used), we can introduce some good genetic stock into future Lua scripts.

Rob
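The collect-in-list-and-finally-join pattern mentioned in point 3 can be sketched like this (Python for illustration; in Lua the analogues would be `table.insert()` and `table.concat()`):

```python
# Two ways to build output from a sequence of fragments. Repeated string
# concatenation copies the whole accumulated buffer on each step, while
# appending to a list and joining once at the end stays roughly linear.
def render_naive(items):
    out = ""
    for item in items:          # each += re-copies everything built so far
        out += "<li>" + item + "</li>"
    return out

def render_collected(items):
    parts = []
    for item in items:          # append is amortized constant time
        parts.append("<li>" + item + "</li>")
    return "".join(parts)       # single final join

items = ["a", "b", "c"]
assert render_naive(items) == render_collected(items)
print(render_collected(items))  # → <li>a</li><li>b</li><li>c</li>
```

The two produce identical output; only the cost profile differs, which is why the pattern makes a good "easy default choice" rather than a semantic change.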
Re: [Wikitech-l] Namespace translations in 1.20wmf1 all kosher?
Hi Niklas,

Thanks for the detailed response. I think you addressed a lot of my concerns. Comments inline:

On Fri, Apr 13, 2012 at 2:02 AM, Niklas Laxström niklas.laxst...@gmail.com wrote:
>> Two concerns: 1. Do the new namespace names have community consensus? What I've heard through the grapevine is that yes, there has been an effort to get consensus, but confirmation of that would be nice.
> Since when is consensus required for software or localisation changes?

As was pointed out by Strainu, namespace changes are a bigger deal than most. That said, I'm satisfied that the appropriate effort was made, based on the links you gave me, so thank you!

>> 2. For each namespace that was changed, was there a corresponding alias set up? Rumor has it that there might be a few cases where that isn't the case.
> Here are some facts instead:
> * The only reported issue I am aware of is at [1], which doesn't look problematic to me
> * Tim, who originally reverted the commit, approved it again [2]
> * I and Siebrand have double-checked the commit
>
> [1] http://www.mediawiki.org/wiki/Special:Code/MediaWiki/107309
> [2] https://gerrit.wikimedia.org/r/3318
> [3] https://gerrit.wikimedia.org/r/4390
> [4] https://gerrit.wikimedia.org/r/4648

Perfect, thanks! ^ These links were really helpful.

I'm checking in on this in an abundance of caution, since the last time around, there was disagreement about the level of diligence needed. Sorry if this feels like I'm jerking you around; I'm just double-checking for known problems from the last deployment.

Rob
[Wikitech-l] A balanced approach to 20% time
Hi everyone,

I'm forking off from the "Development process doesn't work" thread to highlight a message that Sumana sent the other day. Last year, Erik asked me to manage the 20% time policy that has frequently been referenced here. Prior to the Git migration, I made a point of emphasizing code review, because we had a pretty large backlog, and the consequences of falling behind there were more severe than letting other things slide. The goal of the 20% policy has always been to have a bounded but appropriately large amount of time set aside for supporting volunteer development and high-priority bugfixing.

Now that we're on Git, and more specifically, now that we're doing pre-commit review, there are a number of activities that we can afford to bump up the priority on. They include:
* Reviewing extensions for deployment
* Deploying extensions
* Patches in Bugzilla
* Bugfixing problems outside of developers' focus areas
* Shell requests
* (possibly in the glorious future) pull requests issued in mirrors (e.g. GitHub, Gitorious, etc.)

That's a simplified version. A more complete view is here: https://www.mediawiki.org/wiki/Wikimedia_engineering_20%25_policy#Scope

I've handed the baton off to Sumana to manage 20% time on behalf of the community, a point that was included in her message below, but may have been missed. We need to look at the totality of volunteer development activity, plus take a look at high-priority bug fixing on behalf of the reader and editor community, and coordinate our approach in addressing those issues. That's what I've asked Sumana to do, which is reflected in her message below. If you haven't already read it, please do.
Thanks

Rob

-- Forwarded message --
From: Sumana Harihareswara suma...@wikimedia.org
Date: Tue, Apr 10, 2012 at 5:02 AM
Subject: Re: [Wikitech-l] Development process doesn't work (yes this is another complaint from another community member)
To: Wikimedia developers wikitech-l@lists.wikimedia.org

On 04/04/2012 09:14 AM, Sumana Harihareswara wrote:
> Petr: My sympathies on the frustration. First I'm going to talk about the problem in general, then about your issue.

I can't tell whether anyone read my message on the 4th. I know it was long, but that's because I was addressing pretty much all the open questions at the time. :-) If you have concerns about this issue, please do read it.

Petr replied to me off-list to straighten out his particular situation. It sounds like for one of his extensions the ball is in his court, and for the other (OnlineStatusBar), it's in the WMF's.

I've updated three pages to clarify and document our process:
* https://www.mediawiki.org/wiki/Writing_an_extension_for_deployment now explains that I'm the point of contact to get extension authors their initial technical design reviews, Howie Fung is their contact to get initial user experience design reviews, and the release manager is their contact to get reviewed code deployed.
* https://www.mediawiki.org/wiki/Review_queue now has the status of each extension; about 8 are waiting for more WMF work, and 9 are waiting for responses from extension authors.
* https://www.mediawiki.org/wiki/Deployment_queue

So I need to get some user experience reviews and some technical/code reviews going for the 8 extensions that are awaiting more WMF work. Tim suggested that it might be more efficient and pleasant if WMF engineers could concentrate on one project each for their 20% community service time, and RobLa has now decided that I should be the one prioritizing and allocating 20%-time responsibilities.
So I'm going to be asking some WMF engineers if they could switch from doing patch review (in Gerrit) to reviewing particular extensions, for their 20% days. I have a few people in mind.

Another snag, in at least one case, was that WMF engineers are unclear on who qualifies for deployment privileges and how to get them. That's something we started talking about in December in http://lists.wikimedia.org/pipermail/wikitech-l/2011-December/thread.html#56982 and that still needs followup - I believe Ian is going to put some preliminary notes on mediawiki.org soon, and Platform Engineering (specifically RobLa and I) will follow up on that.

Hope this helps.

-- Sumana Harihareswara Volunteer Development Coordinator Wikimedia Foundation
[Wikitech-l] Namespace translations in 1.20wmf1 all kosher?
Hi everyone,

During the 1.19 cycle, there were some namespace names that were changed and then reverted shortly before deployment. We're about to deploy 1.20wmf1 to a lot of international wikis on Wednesday of next week, and I'm a little worried that we may need to do a speedy revert again. Two concerns:

1. Do the new namespace names have community consensus? What I've heard through the grapevine is that yes, there has been an effort to get consensus, but confirmation of that would be nice.

2. For each namespace that was changed, was there a corresponding alias set up? Rumor has it that there might be a few cases where that isn't the case.

I did a quick grep for recent changes to NS_* messages in hopes of finding a few examples of recent changes. Below is a list of changes in 2012 to NS_ messages. We should get this sorted out before we push on Wednesday.

Thanks

Rob

23ea50c3 (Antoine Musso 2012-02-15 15:29:22 +0000 47) NS_USER => array( 'male' => 'Përdoruesi', 'female' => 'Përdoruesja' ),
23ea50c3 (Antoine Musso 2012-02-15 15:29:22 +0000 48) NS_USER_TALK => array( 'male' => 'Përdoruesi_diskutim', 'female' => 'Përdoruesja_diskutim' ),
dde3821a (Niklas Laxström 2012-03-28 13:41:19 +0000 56) NS_USER => array( 'male' => 'Suradnik', 'female' => 'Suradnica' ),
dde3821a (Niklas Laxström 2012-03-28 13:41:19 +0000 57) NS_USER_TALK => array( 'male' => 'Razgovor_sa_suradnikom', 'female' => 'Razgovor_sa_suradnicom' ),
b3664209 (Translation updater bot 2012-04-06 15:34:29 +0000 26) NS_MEDIA => 'Медиум',
b3664209 (Translation updater bot 2012-04-06 15:34:29 +0000 45) 'Медија' => NS_MEDIA,
b3664209 (Translation updater bot 2012-04-06 15:34:29 +0000 46) 'Специјални' => NS_SPECIAL,
b3664209 (Translation updater bot 2012-04-06 15:34:29 +0000 47) 'Слика' => NS_FILE,
80f29d01 ( Reedy 2012-04-07 21:03:44 +0100 47) 'Imagem' => NS_FILE,
3bc64a9f (Translation updater bot 2012-04-03 21:11:13 +0000 440) 'articlepage' => 'Content page' is used for NS_MAIN and any other non-standard namespace and this message is only used in skins Nostalgia, Cologneblue and Standard in the bottomLinks part.
659f18cc (Tim Starling 2012-01-02 22:54:57 +0000 108) NS_USER => array( 'male' => 'Участник', 'female' => 'Участница' ),
659f18cc (Tim Starling 2012-01-02 22:54:57 +0000 109) NS_USER_TALK => array( 'male' => 'Обсуждение_участника', 'female' => 'Обсуждение_участницы' ),
764b3492 (Niklas Laxström 2012-02-13 19:36:49 +0000 61) NS_USER => array( 'male' => 'Përdoruesi', 'female' => 'Përdoruesja' ),
764b3492 (Niklas Laxström 2012-02-13 19:36:49 +0000 62) NS_USER_TALK => array( 'male' => 'Përdoruesi_diskutim', 'female' => 'Përdoruesja_diskutim' ),
f0ecaf0a (Niklas Laxström 2012-04-10 16:36:41 +0000 38) NS_FILE => 'Датотека',
f0ecaf0a (Niklas Laxström 2012-04-10 16:36:41 +0000 39) NS_FILE_TALK => 'Разговор_о_датотеци',
f0ecaf0a (Niklas Laxström 2012-04-10 16:36:41 +0000 40) NS_MEDIAWIKI => 'Медијавики',
f0ecaf0a (Niklas Laxström 2012-04-10 16:36:41 +0000 41) NS_MEDIAWIKI_TALK => 'Разговор_о_Медијавикију',
Re: [Wikitech-l] Code review backlog size in Gerrit
On Mon, Apr 9, 2012 at 10:28 PM, Daniel Friesen li...@nadir-seen-fire.com wrote:
> It would be interesting to recalculate this on a person-to-person basis. Instead of counting by totals, look at the frequency of a user's contributions and see if it looks like the switch caused any of our contributors to commit less frequently.

You should do that! ;-)

Rob
[Wikitech-l] MediaWiki 1.20wmf1 deployed to test2 mediawiki.org
Hi everyone, Sam has created the wmf/1.20wmf1 branch and deployed it to test2.wikipedia.org and mediawiki.org. So, please, go forth and test! Oh, btw, enjoy the pretty new diff colors! :) http://www.mediawiki.org/w/index.php?title=MediaWiki_1.19%2FRoadmap%2Fstatus&diff=516437&oldid=508794 Rob
[Wikitech-l] Code review backlog size in Gerrit
Hi everyone,

As you know, the review backlog didn't go away when we moved to Gerrit, but our visibility and pretty graphs did. I haven't had the time to sort out a new way of generating the graphs, but I at least have a way of querying the number of revisions, so long as it stays below 500. Here's the technique, which works if you've got a valid Gerrit account:

ssh -p 29418 gerrit.wikimedia.org gerrit query 'status:open project:^mediawiki/.*' | grep rowCount

On Friday, we had 101 unreviewed revisions. As of a little bit ago, it's 88, so we seem to be holding steady, at least. To see what those 88 revisions are, go here: https://gerrit.wikimedia.org/r/#q,status:open+project:%255Emediawiki/.*,n,z

Note that the review queue is different in a number of ways from the old way of doing things. The biggest and most obvious is that none of these have actually been merged into master (trunk), so nothing in this queue really blocks a deployment. Also, that query is a bit simplistic, as it doesn't filter out the stuff that was rejected, or reviewed and effectively "fixme"d but not abandoned by the author. To do that, you need:

ssh -p 29418 gerrit.wikimedia.org gerrit query 'status:open project:^mediawiki/.* AND NOT label:CodeReview=-2 AND NOT label:CodeReview=-1' | grep rowCount

...which knocks the count back to 51. What I think this all means is that we're doing pretty well so far at keeping up. We don't have enough experience to really feel confident we're on top of this, but so far, so good.

Rob
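If anyone wants to track this number over time without eyeballing grep output, the rowCount line the commands above match can be pulled out with a tiny script. This is a hypothetical sketch: it only assumes the plain-text stats block (`rowCount: N`) that gerrit query prints at the end of its output, and the sample text below is made up:

```python
import re

def parse_row_count(output):
    """Extract the 'rowCount: N' statistic from gerrit query's text output.

    Returns the count as an int, or None if no stats block was found.
    """
    match = re.search(r"rowCount:\s*(\d+)", output)
    return int(match.group(1)) if match else None

# Made-up sample of the trailing stats block, for illustration only.
sample = "type: stats\nrowCount: 88\nrunTimeMilliseconds: 120\n"
print(parse_row_count(sample))  # → 88
```

Feeding it the real output (e.g. via subprocess over the ssh command above) would let us log a timestamped count daily and eventually get the pretty graphs back.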
Re: [Wikitech-l] Code review backlog size in Gerrit
On Mon, Apr 9, 2012 at 12:04 PM, Jeroen De Dauw jeroended...@gmail.com wrote:
> I'd be curious to see some stats on commit activity. I suspect it took quite a dive when switching to git and has still to fully recover. So does not seem very surprising that we have no backlog building up.

I don't have full stats, but I did a little ad hoc investigation of core (not extensions). It appears as though we did take a hit right after the migration, but not a huge one. Last week was a return to normal:

Week of 2012-01-01: 203
Week of 2012-01-08: 179
Week of 2012-01-15: 118
Week of 2012-01-22: 108
Week of 2012-01-29: 100
Week of 2012-02-05: 168
Week of 2012-02-12: 131
Week of 2012-02-19: 125
Week of 2012-02-26: 99
Week of 2012-03-04: 107
Week of 2012-03-11: 88
Week of 2012-03-18: 87
Week of 2012-03-25: 53
Week of 2012-04-01: 118

It's a bit of an apples-to-oranges comparison, because code that was reverted shows up in the pre-Git numbers, but doesn't show up post-Git. However, on the flip side, Roan's mass revert shows up as post-review merging. Here's the command used to generate last week's number:

echo Week of 2012-04-01: `git log --format=%s --abbrev-commit --since='2012-04-01 00:00 UTC' --until='2012-04-08 00:00 UTC' | grep -v ^Merge | wc -l`

Note that I'm trying to filter out merge commits, of which there are a lot. Maybe extensions tell a different story, and maybe there's something I'm missing, but it looks like we're getting reasonably healthy activity as measured by commits. You can see a longer history on the SVN side (though once again not entirely apples-to-apples) by going here: http://toolserver.org/~robla/crstats/?report=trunkphase3 ...and graphing "newly new", which is new commits for a given week. 100-150 a week is fairly typical for the past couple of years.

Rob
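The per-week bucketing behind those "Week of" labels can be sketched in a few lines. This is an illustrative Python sketch, not the command actually used above: it takes a list of commit dates (a made-up sample here, rather than real git log output) and groups them by the Sunday that starts each week, matching labels like "Week of 2012-04-01":

```python
from collections import Counter
from datetime import date, timedelta

def week_of(d):
    """Map a date to the Sunday starting its week (matching the labels above)."""
    # Python's weekday() has Monday=0..Sunday=6; shift so Sunday rolls to itself.
    days_since_sunday = (d.weekday() + 1) % 7
    return d - timedelta(days=days_since_sunday)

# Made-up sample commit dates, standing in for parsed `git log` output.
commit_dates = [date(2012, 4, 2), date(2012, 4, 4), date(2012, 4, 9)]
counts = Counter(week_of(d) for d in commit_dates)

for week, n in sorted(counts.items()):
    print(f"Week of {week}: {n}")
# → Week of 2012-04-01: 2
# → Week of 2012-04-08: 1
```

Running all fourteen git log invocations in a loop over week boundaries computed this way would regenerate the whole table in one go.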
[Wikitech-l] Updated 1.20wmf1 deployment schedule - starting April 10
Hi everyone,

In order to avoid stomping on the Easter holiday in Europe, we're postponing the first day of the deployment of MediaWiki 1.20wmf01 to Tuesday, April 10:

Tuesday, April 10, 18:00-20:00 UTC (11am-1pm PDT): MediaWiki 1.20wmf01 deployment - test2, potentially mediawiki.org
Wednesday, April 11, 18:00-20:00 UTC (11am-1pm PDT): Potential window for mediawiki.org 1.20wmf01 deployment if needed
Monday, April 16, 18:00-20:00 UTC (11am-1pm PDT): MediaWiki 1.20wmf01 deployment - deploy to all non-Wikipedia sites (Commons, Wiktionary, Wikisource, Wikinews, Wikibooks, Wikiquote, Wikiversity) -- potentially split into two deployments if needed
Monday, April 23, 18:00-20:00 UTC (11am-1pm PDT): MediaWiki 1.20wmf01 deployment - deploy to English Wikipedia
Wednesday, April 25, 18:00-20:00 UTC (11am-1pm PDT): MediaWiki 1.20wmf01 deployment - deploy to other Wikipedias

Also, some bikeshedding on the branch name: I vote that the branch name be wmf/1.20wmf01, with subsequent deployments following that scheme. That way, we can still say "1.20wmf01" as shorthand, and people can figure out that "oh, you prefix wmf/ on the front for permissions reasons".

Other work that's underway or needs to get underway:
* Chad has started the work of converting make-wmf-branch to Git: https://svn.wikimedia.org/viewvc/mediawiki/trunk/tools/make-wmf-branch/default.conf?view=markup
* Aaron is also working on the scripts for pulling our WMF-specific hacks and reapplying them in 1.20.
* We still need to audit the scap/sync scripts to make sure that the .git folder is being excluded on scap. Roan has started looking at that now.

During the next few weeks while both 1.19wmf1 and 1.20wmf01 are out there, we're going to need to use both Subversion and Git for deployment-oriented code.

Rob
Re: [Wikitech-l] selenium browser testing proposal and prototype
On Thu, Apr 5, 2012 at 5:25 PM, Ryan Lane rlan...@gmail.com wrote:
> How many languages can we reasonably support? We're currently using PHP, Python, Java, OCaml and Javascript (and probably more). Should we also throw Ruby in here as well? What level of support are the Selenium tests really going to get if they require developers to use Ruby?

I was initially pretty skeptical of the Ruby choice, and I'm not going to say that I'm sold yet. However, the part of Chris's proposal that's most persuasive to me is the fact that Ruby/Selenium seems to be the most built out, with the largest community. In particular, RSpec seems to be a rather important piece of all of this. I decided to see if there was a Python equivalent of RSpec, and as is often the case these days, the exact question was asked on Stack Overflow: http://stackoverflow.com/questions/7079855/are-there-technical-reasons-a-ruby-dsl-like-rspec-couldnt-be-rewritten-in-pytho

In short, Ruby lends itself to that kind of thing a lot more. In reading the Stack Overflow thread, I was reminded of a talk I saw a couple of years ago titled "Python vs. Ruby: A Battle to The Death": http://blog.extracheese.org/2010/02/python-vs-ruby-a-battle-to-the-death.html ...which, as it turns out, specifically talks about RSpec.

Anyway, the point that I'm making here is that it's quite possible that Ruby really is the right tool for the job. Since I'm much more comfortable with Python than Ruby, I'd be much more comfortable if we stuck with Python. But sticking with Python may mean a lot of wheel reinvention to accommodate language orthodoxy, which really kinda sucks. I'd like us to test the assertion that the Ruby/Selenium combination is much more mature than the Python/Selenium combination, but if that's true, then I think we may want to suppress the anti-Ruby bias and figure out how we can work with Ruby/Selenium.

Rob
Re: [Wikitech-l] MediaWiki core deployments starting in April, and how that might work
On Thu, Mar 22, 2012 at 12:44 PM, Rob Lanphier ro...@wikimedia.org wrote: There's a few of us that plan to meet in a couple of weeks to formalize something here, but perhaps we can get this all hammered out on-list prior to that.

...and we've now done that. While it's not truly final, this is a last call before we start to implement this strategy.

Our plan, starting next week, is to have an initial 3-week deployment window, followed immediately by repeating 2-week windows. The plan for the first window:

* Week of April 9 - deploy to test2, and then to mediawiki.org
* Week of April 16 - deploy to all non-Wikipedia sites (commons, Wiktionary, Wikisource, Wikinews, Wikibooks, Wikiquote, Wikiversity)
* Week of April 23 - Monday: deploy to English Wikipedia; Wednesday: deploy to other Wikipedias

The deployment calendar we plan to use for subsequent deployments:

* Week 1, Monday: deploy to test2 and mediawiki.org
* Week 1, Wednesday: non-Wikipedia projects
* Week 2, Monday: English Wikipedia
* Week 2, Wednesday: all other Wikipedias

We plan to branch once for each deployment cycle (soon every 2 weeks). We will branch off master, make a snapshot, and immediately deploy to test2 and mediawiki.org. We'll use that branch for hot fixes for as long as that branch is in production. A branch will typically be in production for 4 weeks: the first 2 weeks as the new branch, and then another 2 weeks as the old branch.

As Roan pointed out earlier, the branch name needs to start with wmf/, and he suggested a naming scheme wmf/1.xx/yy, where yy is the deployment number. The only thing that sucks about that is that it's easily confused with 1.xx.yy release tarball versions (e.g. 1.18.1). Perhaps we can still do something like wmf/1.20wmf01.

Deployments will be core plus extensions, barring any exceptions. There are two reasons for exceptions:

* Extensions that need to be deployed on a slower schedule (if any). We're going to figure out how to migrate these extensions off of master and onto a development branch, so that master is available for deployment every two weeks.
* Extensions that need to be deployed on a faster schedule (e.g. MobileFrontend). The teams responsible for these deployments will also be responsible for making sure they deploy their code to both of the appropriate production branches, since there will almost always be two in play.

Does this all make sense to everyone?

Rob

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
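The repeating calendar and the overlapping branch lifecycle described above can be sketched roughly as follows. This is a hypothetical illustration only: the cycle start date and the wmf/1.20wmfNN branch naming are assumptions drawn from the proposal in this message, not an official tool.

```python
from datetime import date

# Hypothetical sketch of the repeating 2-week deployment cycle proposed
# above. CYCLE_START and the branch-name format are assumptions from the
# proposal (wmf/1.20wmf01, wmf/1.20wmf02, ...), not actual WMF tooling.
CYCLE_START = date(2012, 4, 30)  # assumed start of the first 2-week cycle

def branch_name(n):
    """Name of the n-th deployment branch (1-based)."""
    return "wmf/1.20wmf%02d" % n

def branches_in_production(today):
    """A branch lives ~4 weeks (2 as 'new', 2 as 'old'), so two overlap."""
    cycle = (today - CYCLE_START).days // 14 + 1
    return [branch_name(n) for n in (cycle - 1, cycle) if n >= 1]

print(branches_in_production(date(2012, 5, 15)))
# → ['wmf/1.20wmf01', 'wmf/1.20wmf02']
```

In the first two weeks only one branch exists; after that there are always two in production at once, which is why fast-moving extensions have to land their fixes in both.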
[Wikitech-l] Mark Hershberger departing Wikimedia Foundation in May
Hi everyone, I regret to inform you all that Mark Hershberger will be leaving the Wikimedia Foundation at the end of May. Mark started as a contractor at the beginning of 2010, helping the Wikimedia Foundation with code review and the release process. He developed an interest in our testing infrastructure, and was responsible for some of the early work in phpUnit and Selenium integration in our codebase. In early 2011, we reconceived his role here at the Foundation, asking him to step into the role of Bugmeister. During the past year and a half, Mark has undertaken the daunting task of going through the mountainous backlog of bug reports and patches, as well as the incoming stream of new bugs. Additionally, he has pushed us to review our backlog of code review through three deployments now. Mark has been a passionate advocate for community developers, and hasn't shied away from asking for what they need. He has been a friendly and approachable representative for WMF; enthusiastic and cheerful with everyone he deals with. Thank you, Mark, for setting this example! We plan to start recruiting for a new Bugmeister soon. I'll post more details when I have them. In the meantime, please join me in thanking Mark for his service to the Wikimedia movement. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Re-introducing UNCONFIRMED state to bugzilla
Mark and I just discussed this, and he's going to look into having a CONFIRMED state. The flow would be:

NEW -> CONFIRMED

...rather than:

UNCONFIRMED -> NEW

That would also mean that all of the currently NEW bugs are presumed to be unconfirmed rather than confirmed, and that we can have a relatively clean list of bugs that someone explicitly declared to be confirmed.

Rob

On Fri, Mar 9, 2012 at 5:16 PM, Mark A. Hershberger m...@wikimedia.org wrote:

I set up Bugzilla today so that, by default, bugs would be in the UNCONFIRMED state. Next week, I plan to begin recruiting volunteers for the Bug Squad, who will help me to verify bugs by testing them against the Beta cluster. The plan is that the Bug Squad will be able to verify these bugs and change them to the NEW state. The Bug Squad idea comes from KDE's Bug Squad (http://techbase.kde.org/Contribute/Bugsquad) and I've begun talking with them. If you have an interest in helping out with or participating in the Bug Squad, please contact me.

-- Mark A. Hershberger Bugmeister Wikimedia Foundation m...@wikimedia.org

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] diff colors - more tweaking
Hi Erwin, The change to diff blocks is an interesting idea. Could you submit a patch for this? (either on list or in the bug) The different text highlighting style seems more controversial, so it might be better to keep that separate. Rob On Sun, Mar 4, 2012 at 5:32 AM, Erwin Dokter er...@darcoury.nl wrote: Never happy with the first result, I did some more tweaking. There were too many lines and too thick borders. Also, background was not needed. So I removed those and only used a bottom border for .diffchange and gave the cells consistent styling. http://windekind.demon.nl/images/Diffs-2.png -- Erwin Dokter ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Code review falling behind (Re: Postponing Git migration until March 21)
On Mon, Feb 27, 2012 at 8:44 PM, Rob Lanphier ro...@wikimedia.org wrote: * Code review is falling back behind. As of right now, we have 38 unreviewed revisions in core (phase3), and another 189 unreviewed revisions in extensions. That's up from the 4 core + 28 extensions on February 4. We basically let code review get back out of hand as we've turned our focus toward bugfixing in deployment. That was last week. This week, it's even more grim: * New (unreviewed): ** core: 49 ** extensions: 242 * Fixme: ** core: 8 ** extensions: 19 The graph: http://toolserver.org/~robla/crstats/ The trendline doesn't look good. What can we do to get back on the right track? Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Database dump of Bugzilla
Hi John, I'm happy that you're looking at this, because I can't say that I'm thrilled with Bugzilla, and I'd be happy to see a better alternative to it. However, I also kinda agree with Chad: it's not that I love Bugzilla, but rather that we haven't yet found a system that's better enough to invest in a migration. That's been the state of things for so long that it's very easy to get cynical about it. That said, I'm keeping an open mind, and I'm intrigued by one of your suggestions (Bug Genie). More on that in a bit.

One requirement I would like to place on a new system, should we go there, is that it support Wikimedia SUL in some form. I realize that Bugzilla doesn't support this, but the point behind this requirement would be that if we're going to go through a painful migration, integration with our existing login system needs to be one of the benefits. It might mean that this project comes behind some form of OpenID provider support on the Wikimedia cluster.

More inline...

On Fri, Mar 2, 2012 at 5:11 PM, John Du Hart compwhi...@gmail.com wrote:

I'm currently investigating alternative bug tracker and project management software for MediaWiki. To do that I'll be installing some different software on the Labs and importing existing bugs for evaluation by the development team and users. The following software is planned for test:

- JIRA http://www.atlassian.com/software/jira/overview + Greenhopper + Bonfire

I've got a lot of experience with JIRA that I'm happy to share offlist. Since this has partially devolved into a discussion about open vs proprietary, I think you should read this: http://www.atlassian.com/licensing/license ...and not install it until you have. There will be a test ;-) Generally, it's going to take a lot to convince me that a proprietary solution is going to work for us.
- YouTrack http://www.jetbrains.com/youtrack/
- The Bug Genie http://www.thebuggenie.com/index.php
- Redmine http://www.redmine.org/
- ChiliProject https://www.chiliproject.org/

Redmine is the one that was the frontrunner in the last evaluation we did in 2010: http://www.mediawiki.org/wiki/Tracker/PM_tool There's possibly some other stuff to look at on the 2010 list. As I alluded to, I'm intrigued by The Bug Genie. Looks like it's PHP-based open source, under active development for 10 years, and that they've put some thought into the UI. I don't think that one came up in our last eval, but it probably should have. I'd encourage you to nudge that one higher on your list.

Of course, this goes back to the original request.

To do this I need a dump of the current Bugzilla install. Is it possible for me to get this and under what conditions? Thank you.

There's some confidential information in there, so possibly an NDA. I'll talk to Sumana on Monday about this.

One thing that I would ask of you, though, is to be mindful of our community's actual capacity for disruptive change, and development/operations capacity in general. The reason why we dropped this project back in 2010 was that it became clear that we were trying to do too many things in parallel, and not doing any of those things well. While WMF has more capacity, and you're helping out by volunteering this work, we still might not have capacity for this project. Even if you do a lot of the work, it will be both a disruption for others, as well as a fair amount of work for others. Given we're smack dab in the middle of an incredibly disruptive change (migrating from SVN to Git), it seems like this might be the wrong time to do anything more than experiments. So, that's not to say don't evaluate other systems, but just be mindful that success might still mean waiting until late 2012 or even 2013 for the project to really begin in earnest. If you're ok with that, then I say great!
If that sounds like way too long to wait after doing a lot of work, then I'd caution against starting. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Reminder: deployment of 1.19 to most Wikipedias today
Hi everyone, Just a reminder that we're deploying MediaWiki 1.19 to most Wikipedias in a few hours (including enwiki), starting at 23:00 UTC (3pm PST).

In January, Wikipedia represented 94.7% of our page traffic: http://stats.wikimedia.org/EN/TablesPageViewsMonthlyAllProjects.htm

Since we've already deployed to Polish (pl), Dutch (nl), and Esperanto (eo), we need to subtract that out. Those represent 2.85%, 1.20%, and 0.05% of our Wikipedia traffic respectively: http://stats.wikimedia.org/EN/TablesPageViewsMonthlyOriginalCombined.htm ...for a total of 4.1% of Wikipedia traffic, and 3.88% of our overall traffic. That means that today's deployment to the other Wikipedia sites (roughly 90.8% of traffic) is clearly the big one.

Our plan is to deploy to enwiki first, then if things go well, march down the list in traffic order. Some things will inevitably break during this time, so please refer to our maintenance notice for how to get in touch with us: https://meta.wikimedia.org/wiki/Wikimedia_maintenance_notice In short, you should be able to follow what's going on on the #wikimedia-tech IRC channel.

We still have a few bugs that could use everyone's attention: https://bugzilla.wikimedia.org/buglist.cgi?resolution=---&target_milestone=1.19wmf%20deployment&list_id=95452 Please take a look, and if you can help, please do.

Thanks! Rob

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
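For what it's worth, the percentages above check out. A quick sanity check of the arithmetic, using the figures quoted from the stats.wikimedia.org pages linked in this message:

```python
# Sanity check of the traffic arithmetic quoted above (input figures are
# the ones cited from the stats.wikimedia.org pages in this message).
wikipedia_share = 94.7                 # Wikipedia's share of all page traffic, %
already_on_119 = [2.85, 1.20, 0.05]    # pl, nl, eo as % of Wikipedia traffic

done = sum(already_on_119)                    # % of Wikipedia traffic on 1.19
done_overall = done * wikipedia_share / 100   # % of overall traffic on 1.19
remaining = wikipedia_share - done_overall    # % of overall traffic to go

print(round(done, 1), round(done_overall, 2), round(remaining, 1))
# → 4.1 3.88 90.8
```

That matches the 4.1%, 3.88%, and roughly 90.8% figures in the message.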
Re: [Wikitech-l] 1.19 deployment date
On Tue, Feb 28, 2012 at 8:03 AM, Platonides platoni...@gmail.com wrote:

On 28/02/12 09:32, Bináris wrote:

Hi, You write on https://www.mediawiki.org/wiki/MediaWiki_1.19/Roadmap: Wednesday, March 1 (-2), 23:00-03:00 UTC (3pm-7pm PST): Stage 5 deployment to: - All Wikipedia sites

Do you mean Wednesday, Feb 29, or Thursday, March 1? :-)

Seems RobLa doesn't implement leap-year logic :) Given that such date comes from shifting one week February 22, I think that's 29th of February. https://www.mediawiki.org/w/index.php?title=MediaWiki_1.19/Roadmap&diff=prev&oldid=490559

Oops, fixed now: https://www.mediawiki.org/wiki/MediaWiki_1.19/Roadmap

I had fixed it here: http://wikitech.wikimedia.org/view/Software_deployments ...and thought I had fixed it on the Roadmap as well, but I guess not. The Software deployments page on wikitech gets a weekly editing pass by the tech management team, and is considered the source of truth in the event there are multiple conflicting statements on dates.

Rob

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Postponing Git migration until March 21
Hi everyone, We'd like to postpone the Git migration 2.5 weeks, with a new final migration date of Wednesday, March 21. Here's the convergence of factors that led us to the new date: * The 1.19 deployment has kept us busy enough that none of the rest of us in Platform Engineering have had spare cycles to help Chad out * It's quite likely we'll have many fixes after we deploy to the bigger wikis (enwiki and friends) on Wednesday * We have a number of unresolved issues in our Git+Gerrit deployment [1][2] * Code review is falling back behind. As of right now, we have 38 unreviewed revisions in core (phase3), and another 189 unreviewed revisions in extensions. That's up from the 4 core + 28 extensions on February 4. We basically let code review get back out of hand as we've turned our focus toward bugfixing in deployment. Chad and Ryan also discovered today that the machine we're using for Git and Gerrit (formey, which is also SVN) just isn't up to hosting the whole mess. So, there's a machine deployment we need to do as well. The good thing about this Here's the new plan: * Week of 3/5 - MediaWiki 1.19RC1 release. Code review stats by end-of-week: 20 new on phase3, 100 new on phase3+extensions * Week of 3/12 - MediaWiki 1.19.0 release. Code review stats by end-of-week: as close to zero as possible in phase3+extensions. Possibly even 1.20wmf1 (first mini deployment untethered to release schedule, first of many...1.20wmf2, 1.20wmf3, etc) * Week of 3/19 - Git migration week. Migration day: Wednesday, 3/21 A MediaWiki tarball release *should* be a relatively minor endeavor. A deployment during this time is a stretch goal. We should be able to make a deployment from a more recent point on trunk if we're disciplined about actually getting through the code review backlog and we do a good job in review. Doing a good job means reverting when we need to. 
The top priority for Platform Engineering will be the Git migration, so anything that distracts from that (like, for instance, a 1.20wmf1 deploy) may get postponed while we finish this off once and for all.

Thank you everyone for your patience on this transition.

Rob

[1] http://www.mediawiki.org/wiki/Git/Conversion#Unscheduled_items
[2] https://bugzilla.wikimedia.org/showdependencytree.cgi?id=22596&hide_resolved=1

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Postponing Git migration until March 21
On Mon, Feb 27, 2012 at 9:39 PM, Krinkle krinklem...@gmail.com wrote: On Feb 28, 2012, at 5:44 AM, Rob Lanphier wrote: So, there's a machine deployment we need to do as well. The good thing about this The good thing about this… is yes ? Oops...heh. I believe I was going to say The good thing about this delay is that it gives us time to figure out a proper hardware strategy. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] 1.19 deploy to commons rescheduled for Wednesday, Feb 22 18:00-22:00 UTC
Hi everyone, This deployment of 1.19 to commons finally happened about an hour or so ago. We're keeping an eye on possible issues. We haven't seen anything that would cause us to roll back, but we are seeing a few issues. Please report bugs in Bugzilla, or if you're feeling too lazy for that, at least drop a note on this talk page: https://meta.wikimedia.org/wiki/Talk:Wikimedia_maintenance_notice Thanks! Rob On Tue, Feb 21, 2012 at 8:49 PM, Rob Lanphier ro...@wikimedia.org wrote: Hi all, The 1.19 deploy to commons didn't go the way we hoped. We're planning to try again tomorrow, after we have some time to debug some of the problems we hit. Roan and Aaron are discussing some thumb generation issues now, and there are also some Javascript issues that we'll need to resolve tomorrow. A new deployment window is planned for Wednesday, Feb 22 18:00-22:00 UTC Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] State of 1.18
Hi everyone, Here's where we are. We've got a lot of blockers listed below, though it's probably not as bad as it looks from the bug count.

== Blocking stage 2 deployment (commons) ==

Javascript/Resource Loader

Tim did a bunch of work in this general area today. I'm hopeful Roan will have some time to review this work and we can all verify that this cleans up a bunch of issues.

http://bugzilla.wikimedia.org/34450 -- Probable Javascript loading issues (navigation and tabbing issues, multiple browsers)
http://bugzilla.wikimedia.org/34409 -- 'mw.user.options' and 'mw.user.tokens' are sometimes empty on 1.19, breaking watchlist, gadgets, etc.
http://bugzilla.wikimedia.org/34517 -- Sometimes scripts imported by mw.loader.load on common.js are not executed
http://bugzilla.wikimedia.org/34504 -- [Regression] There should not be a mismatch between html and stylesheet version

Logging and permissions

Mark and Antoine made a fix to one logging issue, and I puttered around on the larger IRC problem.

http://bugzilla.wikimedia.org/34508 -- [Regression] IRC string output for log messages no longer compatible
http://bugzilla.wikimedia.org/34503 -- Users without the autopatrol right are being autopatrolled on simple.wikipedia

== Blocking stage 3 deployment (all projects except Wikipedia) ==

No additional bugs.

== Blocking stage 4 deployment (all Wikipedia except enwiki) ==

http://bugzilla.wikimedia.org/34104 -- Schema changes for MediaWiki 1.19 deployment to Wikimedia wikis

There are some master switches that need to happen on s1, s5 and s6 prior to deploying to enwiki, dewiki, ruwiki, frwiki.

== Important to figure out, but not blocking ==

http://bugzilla.wikimedia.org/34458 -- Editing fails in IE7
http://bugzilla.wikimedia.org/34478 -- Special:MergeAccount&action=submit generates error
http://bugzilla.wikimedia.org/34469 -- (un)watch button broken (only in debug mode now)

== Administrivia ==

http://bugzilla.wikimedia.org/33994 -- MediaWiki 1.19 pre WMF deployment actions (tracking)

== Query for 1.19 deploy blockers ==

https://bugzilla.wikimedia.org/buglist.cgi?query_format=advanced&list_id=91536&resolution=---&target_milestone=1.19wmf%20deployment&known_name=1.19%20deploy%20blockers&query_based_on=1.19%20deploy%20blockers

Etherpad: http://etherpad.wikimedia.org/119triage

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] State of 1.19 deploy
...and just for grins, we'll make that 1.19

-- Forwarded message --
From: Rob Lanphier ro...@wikimedia.org
Date: Tue, Feb 21, 2012 at 12:02 AM
Subject: State of 1.18
To: Wikimedia developers wikitech-l@lists.wikimedia.org

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] ExtensionDistributor on mediawiki.org is down
We were having problems with the 1.19 deployment, caused in part by the load the distributor put on fenari. I believe Tim can get it back up and running later.

Rob

On Tue, Feb 21, 2012 at 8:00 PM, Ryan Kaldari rkald...@wikimedia.org wrote:

If I try to generate a snapshot of any extension on mediawiki.org, I get the error Unable to contact remote subversion client. For example: http://www.mediawiki.org/wiki/Special:ExtensionDistributor?extdist_extension=CentralNotice&extdist_version=branches%2FREL1_18&extdist_submit=Continue

Ryan Kaldari

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] 1.19 deploy to commons rescheduled for Wednesday, Feb 22 18:00-22:00 UTC
Hi all, The 1.19 deploy to commons didn't go the way we hoped. We're planning to try again tomorrow, after we have some time to debug some of the problems we hit. Roan and Aaron are discussing some thumb generation issues now, and there are also some Javascript issues that we'll need to resolve tomorrow. A new deployment window is planned for Wednesday, Feb 22 18:00-22:00 UTC Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Latest look at the 1.19 deployment blockers
Hi everyone, We have a lot of issues to get through before pushing MediaWiki 1.19 out to commons on Tuesday afternoon (late Tuesday/early Wednesday UTC). The general situation is that there still seem to be many Javascript issues cropping up. Additionally, there's one logging bug that's really causing problems for vandalism patrol. More below:

Newly added as blockers:

* Bug 34508: [Regression] IRC string output for log messages no longer compatible https://bugzilla.wikimedia.org/34508
Krinkle started looking into this one, and Nikerabbit chimed in. This one can't be fixed until a bot author helps narrow down what the important change was. Right now, there's more heat than light on this issue.
* Bug 34510: monobook toolbar or vector with old toolbar not setup randomly https://bugzilla.wikimedia.org/34510
Recently reported. RobLa has some irresponsible speculation on this bug, but this hasn't had a hard look yet.
* Bug 34517: Sometimes scripts imported by mw.loader.load on common.js are not executed https://bugzilla.wikimedia.org/34517
* Bug 34495: patrol log credit the user patrolled, not the user patrolling https://bugzilla.wikimedia.org/34495
* Bug 34478: Special:MergeAccount&action=submit generates error https://bugzilla.wikimedia.org/34478
We've seen a couple of cases of this now. This may be related to trying to merge an account on a 1.19 wiki with one on a 1.18 wiki.
* Bug 34504: [Regression] There should not be a mismatch between html and stylesheet version https://bugzilla.wikimedia.org/34504
Krinkle says: This is what was causing the Jump to navigation links to appear on cached pages after the 1.19 roll-out.
* Bug 34373: populateRevisionSha1.php misses archive rows, whose text is stored directly in the archive table (not text table) https://bugzilla.wikimedia.org/34373
This one was recently reopened.

Old issues:

* Bug 34450: Probable Javascript loading issues (navigation and tabbing issues, multiple browsers) https://bugzilla.wikimedia.org/34450
See RobLa's note to wikitech-l: http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/58858
Andrew fixed the issue with the collapsible sidebar not working as expected. There are other issues in this bug that still need to be resolved.
* Bug 34469: (un)watch button broken https://bugzilla.wikimedia.org/34469
This appears to be fixed outside of debug mode, but is still reliably busted in debug mode. The latest speculation is that this is a cross-domain cookie issue, since the javascript is served off of bits.wikimedia.org in debug mode.
* Bug 34421: mutiple headers https://bugzilla.wikimedia.org/34421
Hashar has adapted a patch from Krenair, and checked this into trunk. Still needs to be merged and deployed.
* Bug 34458: Editing fails in IE7 https://bugzilla.wikimedia.org/34458
We believe this only happens on meta.wikimedia.org, and is caused by common.js there. This may have also been a problem on 1.18.
* Bug 34409: 'mw.user.options' and 'mw.user.tokens' are sometimes empty on 1.19, breaking watchlist, gadgets, etc. https://bugzilla.wikimedia.org/34409
We think this one is dead.

For the very latest in BZ, run this query: https://bugzilla.wikimedia.org/buglist.cgi?query_format=advanced&list_id=91536&resolution=---&target_milestone=1.19wmf%20deployment&known_name=1.19%20deploy%20blockers&query_based_on=1.19%20deploy%20blockers

This status pad is being updated here: http://etherpad.wikimedia.org/119triage

Rob

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Please test Javascript on meta.wikimedia.org
Hi everyone, We have some general Javascript errors that we've been discussing here: https://bugzilla.wikimedia.org/show_bug.cgi?id=34450

Yesterday, Roan and others made some fixes in this area in an attempt to address the problems we were seeing. Here's a list of Roan's commits: https://www.mediawiki.org/wiki/Special:Code/MediaWiki/author/catrope

In an effort to fix things, he enabled $wgResourceLoaderExperimentalAsyncLoading for logged-in users on our 1.19 wikis. meta.wikimedia.org is where we've seen the most problems, so that would be the best place to test. Could anyone who has time please spend some time hammering meta.wikimedia.org, looking out for the problems described in 34450 in particular?

Please report issues either there, or if that's getting too spammy, let's overflow here: https://meta.wikimedia.org/wiki/Talk:Wikimedia_maintenance_notice

Thanks Rob

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] State of the 1.19 deployment; one (known) blocking bug
Hi everyone, I just wanted to give an update on where we are with the 1.19 rollout. We started the rollout yesterday, as planned, and during our deployment window yesterday, got through about half of the wikis we intended to. You can see some of the fixes we needed to make last night and today here: https://www.mediawiki.org/w/index.php?path=%2Fbranches%2Fwmf%2F1.19wmf1&title=Special%3ACode%2FMediaWiki

A few key revs:

Rename 'blocking' to 'async', and invert the logic in mediawiki.js: https://www.mediawiki.org/wiki/Special:Code/MediaWiki/111597
Live hack for history blobs: https://www.mediawiki.org/wiki/Special:Code/MediaWiki/111602
Remove everything except english from WikimediaLicenseTexts.i18n.php, to avoid out-of-memory condition: https://www.mediawiki.org/wiki/Special:Code/MediaWiki/111606
Live hack: stop loading a bunch of languages on log view: https://www.mediawiki.org/wiki/Special:Code/MediaWiki/111614

Sam Reed and Antoine Musso finished off the rest of the planned wikis this morning.

We have a pretty big Javascript loading problem that may stop our deployment in its tracks. It's described here: https://bugzilla.wikimedia.org/show_bug.cgi?id=34450 This is just a hunch, but the Javascript problems seem to be more pronounced on meta.wikimedia.org than on other places, so if you're looking to repro, that might be the first place to start. We're still early in breaking in this release.

We're keeping a close eye on Bugzilla and on this page: https://meta.wikimedia.org/wiki/Talk:Wikimedia_maintenance_notice We point people to this page in our maintenance notices, but people will likely still go to their respective village pumps. Any help shepherding bug reports our way would be greatly appreciated, so that we don't deploy this further with major defects we should have known about.

Thanks Rob

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Please test on our deploy candidate server (test2)
Hi everyone, Stage 0 of our deployment of MediaWiki 1.19 is complete: our deployment to test2. This wiki is actually on the live cluster like any other wiki, which is about as close as it gets to how things will be when we deploy to production wikis. Here's the URL: http://test2.wikipedia.org/wiki/Main_Page

In fact, all 800 or so wikis run on the same set of Apache servers, whether they are 1.18 or 1.19, which means this isn't a truly isolated test. There's a small chance that we could accidentally break 1.18 wikis in the process of doing this testing. In particular, if you see something to the effect of Internal 500 error on any wiki (1.18 or 1.19), please let us know using the instructions here: https://meta.wikimedia.org/wiki/Wikimedia_maintenance_notice (basically #wikimedia-tech or the talk page for the notice above)

If all goes well, we should be making our first wave of deployments to 1.19 wikis in about 27 hours. Go forth and test!

Thanks! Rob

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Swift thumbnails temporarily disabled; fix underway
Hi everyone Short version: Some people have reported issues with corrupted thumbnail images in which they appear truncated. We’ve identified the cause of the problem and believe we have a fix in place. It may take a few days to fully propagate, during which you may continue to see corrupted thumbnails. Manually purging the image page should repair any broken thumbnail. Longer version: Last week, we enabled Swift as a replacement for NFS for storing images (see blog post [1]). Our goal with this project is to replace a single point of failure, and increase fault tolerance and capacity. Recently, we discovered that images were sometimes getting corrupted. After further investigation, Tim Starling and Ralf Schmitt both independently figured out that whenever a client disconnected early while fetching a thumbnail, the server would write out a partial thumbnail to our Swift cluster and to the cache. Ben ran the numbers, and estimated that roughly 1.6% of the thumbnails were corrupted, and that roughly 4.5% of images had at least one corrupt thumbnail. We've disabled Swift for the time being, going back to our old way of serving thumbnails. Unfortunately, even though thumbnails are no longer coming from our Swift cluster, there will still be images in our Squid cache which we can't easily purge (without creating a large performance problem), so that step only stops the problem from getting worse rather than fixing it. Thankfully, we're reasonably confident there won't be any new broken images. Aaron Schulz came up with a pretty simple fix. We were writing thumbnails to Swift while streaming them to the client, so when the client disconnected, so did the process writing the file into Swift. We have added an MD5 checksum of the generated image to the ETag header when pushing it to Swift. Swift accepts the file for writing, and if the MD5 checksum doesn’t match after the connection to Swift closes, the partial thumbnail is deleted. 
The changes were two small fixes: one in the thumbnail generation: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/111517 ...and one in the process that writes images to Swift: https://gerrit.wikimedia.org/r/#change,2598

We aren't planning to deploy this right away. What we want to do instead is repair the damage first. Ben will write a script that crawls through Swift, searches for corrupt images, nukes them from Swift, and purges them from our Squid cache via HTCP. After that's complete, we'll then re-enable the new improved thumbnail pipeline with Swift, which *should* no longer keep partial images.

Sorry for any problems this might have caused.

Rob

[1] Ben's announcement of our Swift deployment: http://blog.wikimedia.org/2012/02/09/scaling-media-storage-at-wikimedia-with-swift/ ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Welcome David Schoonover - Systems Engineer - Data Analytics
Hi everyone,

I'm pleased to welcome a new member of our analytics engineering team. David Schoonover starts today in the San Francisco office[1] as our new Systems Engineer for Data Analytics.

David previously worked remotely for a company in McLean, VA in Virginia's Silicon...thing (not valley, forest or alley...what do they call it? Silicon road to Dulles?). Anyway, David's work was at Clearspring, the company responsible for the AddThis sharing widget that's installed across 10+ million domains servicing 60k+ requests per second. He and a co-worker built a scalable real-time analytics system to power the Share Count feature of their tools, which is part of a real-time dashboard you get as a user of their service.

In addition to his talent in database technology and software engineering, David adds to our growing team of capable photographers in Platform Engineering. Because, after all, every good engineering organization should have plenty of great photographers. I've also found that David will immediately draw a dinosaur[2] on a whiteboard in whatever room he is in. So, those of you in our SF office: if you see a plesiosaur on the whiteboard near you, that probably means he's been there recently. Or maybe it's been there a while but now won't erase. Either way, he probably drew it.

David will be joining Andrew Otto and Erik Zachte in the analytics group in Platform Engineering, working very closely with Diederik van Liere from Howie's new Product Management team. David, Andrew and Diederik (along with contractor Fabian Kaelin) are working on fixing up our reporting pipeline and report card.

Welcome David!

Rob

[1] Yes, that's right, I've actually hired another person in San Francisco, and this time, I didn't have to convince him to move here. When I have team meetings, I'll have to get a conference room with THREE chairs. And a good phone. 
[2] Dinosaur samples: http://art.less.ly/dinosaurs/ ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] PHP 5.3 policies
On Sun, Feb 12, 2012 at 5:34 AM, Jeroen De Dauw jeroended...@gmail.com wrote: I'm wondering about two things: * When will we finally drop 5.2.x support?

I just did a check of what Dreamhost is running, which seems to be PHP 5.2.17. Maybe they're trailing edge (after all, they still only have MediaWiki 1.16.4), but I'd bet a lot of the other hosters haven't moved to 5.3 yet. I'd say that when the latest versions of the stable distros move over (Ubuntu LTS, Debian stable, RHEL/CentOS), that's a good benchmark, since that's what's likely to be common in hosting environments. Ubuntu Lucid is already at 5.3. I haven't checked Debian or RHEL. Alternatively, it'd be better to actually survey the major general-purpose hosters, though it seems that market is a little too fragmented and fluid to get a good sense of "typical".

* Can we have isolated components in core that require 5.3?

I'm not sure how that would work. Once it's in core, I imagine that dependencies would slip in.

Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] backcompat keyword
Hi everyone,

After Niklas mentioned in another thread that the i18n team was effectively ignoring the code slush, I decided to do some spot checking to see what couldn't wait. What I saw was a lot of backwards-compatibility breakage that's going to make releasing 1.19 and the eventual 1.20 release a lot harder, and make life generally miserable for third-party users of MediaWiki who just want to stay on top of security releases and maybe install a new extension or two, but can't afford the dev resources to update the extensions they already rely on. I've been there myself, and it sucks.

At a minimum, we need to start marking backwards-compatibility-breaking changes. So, I've created the backcompat keyword, and started the task of marking these. I'd like help in marking these things, since I suspect we've got a lot of work to do before we can release a 1.19 tarball. Here's the list so far: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/tag/backcompat

It's hard enough to make things work. Expending effort trying to make things *not* work is bound to be more successful than anyone is banking on. Please stop this.

Thanks

Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Git migration - updated schedule workflow descriptions
Hi Niklas,

Comments inline...

On Sat, Feb 11, 2012 at 1:20 AM, Niklas Laxström niklas.laxst...@gmail.com wrote: Since you are already suggesting that people start git repositories, we should prepare the i18n support right now. Translatewiki.net is not ready to handle random git repositories all over the world (nor does it necessarily want to). MediaWiki has had very good i18n and l10n, including for extensions - I don't want to lose that. So we need to plan it now, not when we have hundreds of extensions migrated to git, with everyone doing things differently.

Fair enough. I think we should handle new extension directories on a case-by-case basis.

If our commit access is restricted (committing updated translations, but also fixing i18n issues) we cannot do our job properly, which means that we will start dropping support for stuff that doesn't handle i18n adequately. Which reminds me, does LocalisationUpdate support git?

Not yet: https://bugzilla.wikimedia.org/show_bug.cgi?id=34137

Or Extension Distributor?

Not yet: https://bugzilla.wikimedia.org/show_bug.cgi?id=27812

* As far as Chad knows, none of the gerrit bugs listed in https://labsconsole.wikimedia.org/wiki/Gerrit_bugs_that_matter are bad enough to block the migration. What, *specifically*, do people hate about gerrit? He needs to know specifics to make decisions that might delay the migration.

I read through almost all of the commit mails. If I need to spend any more time on it or do more clicking around, my head will probably explode. This means that I want commit emails with links to the review tool; the review tool should load fast, display all diffs inline (not hidden behind links or anything), and I should be able to filter commits by path or author. In addition, I want to be able to tag commits (for the l10n team to review, or for keeping a list of commits I want to deploy to the live site). Any failure to meet these requirements is highly likely to be a blocker for me. 
Oh, and some way of indicating that a commit has follow-ups that fix problems in it.

I copied the issues you raised to the "Gerrit bugs that matter" page. I also added a tracking bug to make sure we triage that list: https://bugzilla.wikimedia.org/show_bug.cgi?id=34349

* Chad is trying to get the real git repo up and running ASAP so people can start doing their real work there. He and RobLa believe we should encourage people to consider SVN indefinitely slushed. If you are going to do large things, they prefer you start doing them in git.

We have been in a slush most of this year already and it has affected the work of our team. We have already started to ignore the slush and will probably do it again.

This comment really bears scrutiny. You realize that Chad isn't going to be able to fix the Gerrit/Git issues above if we're mired in a sea of code review, right? Or fixing bugs introduced because some deprecated feature had to be completely removed *right now*, despite the fact that it wasn't harming anyone to keep it around? It's not like your code is actually going to get deployed if you slow down the process this way. I think rather than ignore Brion's email here: http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/58622/focus=58630 ...you should respond to it.

On the other hand, we probably will migrate our extensions to git among the first ones. Are extensions used by WMF forced to have the gated trunk model?

This is a good question. I don't see how we can avoid it. Gerrit doesn't work with post-commit review, and we make a point of reviewing the extensions that are going onto the cluster.

How much space do the git repos require? Right now we spend under 500M to support the i18n of three branches of MediaWiki and all extensions.

My understanding is that the core repo is about 100M with the full revision history. I don't know what it looks like with all extensions, but I have to believe it'll be manageable. 
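On the disk-space question, anyone who wants to measure a local clone themselves can do so with a small generic directory walk (a sketch, equivalent to running `du` over the repository directory; nothing MediaWiki-specific about it):

```python
import os

def dir_size_bytes(path: str) -> int:
    """Total bytes of regular files under path (e.g. a clone's .git directory)."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.isfile(fp):  # skip dangling symlinks
                total += os.path.getsize(fp)
    return total
```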
Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Phabricator
On Fri, Feb 10, 2012 at 12:01 PM, John Du Hart compwhi...@gmail.com wrote: Just want to check in to see how everyone's doing. I see a lot of account creations, however no posts to Differential. Is there a problem I'm not aware of? Documentation problems? Software issues? Do we see something we don't like? What's up?

Hi John,

Just mulling it over, mainly. I love having a test instance to play with, and agree with you that on the surface it looks way easier to use than Gerrit. I don't want that to be taken as an endorsement of the system just yet, because I don't know enough about either system (Gerrit or Phabricator) to have a valid opinion about which is better for us.

What Chad, Sumana and I spoke about earlier today was the tough reality we're in. In order to get moved to Git in the very short term, we're going to have to stay the course on Gerrit. That said, one beautiful thing about Git is that Git repos are far more portable than SVN repos. We could decide we're going to use Gerrit on a probationary basis, and have a serious reevaluation in three months. Sumana and I would like to go in that direction, and we twisted Chad's arm hard enough on this point that he might just say he agrees with us, even if he's secretly blowing us off ;-)

There's at least one feature we would need before we could use this system: LDAP integration. I don't think we can seriously consider a system that doesn't have some sort of sensible plan for how it's going to work with our LDAP server. I do, however, love the idea of using OAuth for filling in credentials from other services like Github. It would be ridiculously cool if people could use their Wikimedia SUL logins to access this system.

It's also nice that this is written in PHP. Since we have an abundance of PHP developers, that means it's far more likely that customization we need would actually happen. In general, it's good to have options. 
Like I said, I don't want to put the brakes on our immediate plans, but I think it's worth consideration after we've given Gerrit a try. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] We're still in a code slush (Re: How to avoid a post-branch code slush)
Hi everyone,

I probably mistitled my last message in the "How to avoid a post-branch code slush" thread. There seems to be a misconception that the code slush is over. Please go back and read the thread. The status quo is: *there is still a code slush*. What this means:

1. No big architectural changes without discussion. Not merely "hey, I sent a message, and no one responded," but as in: you send a message, and two or three people say "yeah, you're right, that needs to happen, make it so" without serious dissent.
2. No big omnibus whitespace cleanups. These are of debatable use at any time, but right now they make backporting a big pain-in-the-butt, and we'll need to do a lot of backporting to 1.19.
3. Have a reviewer lined up and ready to ok your revision *before you commit it*. If you can't get a tentative commitment from a reviewer, don't commit it.

We have a couple of different options in this interim period between now and when we start using Git:

a. We let people commit into trunk as was previously normal, but we only migrate the code that's been formally reviewed.
b. We insist on pre-commit review from now until we go live on Git.

So far, there hasn't been a lot of discussion on either option. Brion and Roan both pointed out problems with option a, while no one has raised a serious objection to option b.

Rob

On Wed, Feb 8, 2012 at 10:23 PM, Rob Lanphier ro...@wikimedia.org wrote: On Wed, Feb 8, 2012 at 3:43 PM, Brion Vibber br...@pobox.com wrote: This is one of the reasons I've been hoping we'd move to a more pre-commit review model. Especially for big refactorings and cleanups that have limited immediate value, we tend to get a lot of breakages and not a lot of interest in fully reviewing them (e.g. actually checking all the code paths to make sure they really work). Here's the thinking that led to where we are: the cutover to Git is the point at which we want to fully move to precommit. 
It seems like an enormous pain-in-the-butt to move to full precommit with our current toolset (SVN + CodeReview tool). However, we're much closer than we've ever been to having Git, and it may be worth dealing with some short-term pain. To a certain degree, I'd actually consider it desirable to have a permanent 'slush' to the extent that destabilizing work should *always* be talked out a bit and tested before it lands on trunk/head/master. Yup, agree 100%. Let's all just pretend this has always been the status quo starting right now. I think we've already established that there used to be more liberal reversion, and that when that went away, so too went our ability to stay on top of the review queue. If we're not ready to go fully git the instant we branch 1.19, Given that the branch just happened, and we're not ready yet, that's the case. Chad can give you more of an update, but my understanding is: * A (hopefully) final test migration of core is slated for this week. Chad believes he's got all of the blocking problems sorted out. * Extensions migration isn't going so smoothly. The same tools that work splendidly with core seem to crash with the very small subset of extensions that he's tried it out with. Could be a minor problem that's easy to fix, or could be gawd awful. TBD * We'd like a two-week window of warning/testing/playing around before making the cutover. All told, the current plan is beginning of March for core, middle of March for extensions. More details here: https://www.mediawiki.org/wiki/Git/Conversion ...and in the email that I hear Chad is writing :-) we may wish to consider applying more formal review to things proposed to go into trunk on SVN. This may be simpler than attempting to synchronize SVN and git via post-SVN-pre-git reviews... I'd be perfectly fine with either outcome (more formal pre-commit review, or picking our SVN-Git cutover point based on what's reviewed). 
Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] How to avoid a post-branch code slush (Re: Branch 1.19 on Tuesday, February 6?)
On Wed, Feb 8, 2012 at 4:40 AM, Petr Bena benap...@gmail.com wrote: Hi, is there any update on branching? Thank you

Sam will be doing it soon. After that it means, in theory, that trunk will be open for post-deploy commits. However, we *cannot* let the same backlog build up that we did before, and there's no way we can keep up with everything we need to do for deployment (last-minute bugfixes, addressing fixmes regardless of committer) while at the same time dealing with a flood of new commits.

A big problem with our current post-commit review regime is that it is exactly at these times that really regrettable changes can, and probably do, get made. Many refactoring exercises happen without much discussion on the mailing list. The code doesn't get reviewed, and then it gets entangled with a lot of other important code to the point that we're forced to forge ahead with a suboptimal refactoring decision. In addition to building up a large review backlog, we also find ourselves chasing pockets of breakage due in part to incomplete refactoring and backwards-incompatible changes.

We're migrating to Git very soon after this release. It would really suck to have a huge pile of unreviewed commits going into trunk. So, I'm going to suggest a Git migration strategy that will avoid a monstrous backlog. Instead of cutting over trunk at the very latest revision, we cut over at the latest revision that is fully reviewed and ok. Everything before the 1.19 branch point would be grandfathered in, but everything after would need to be reviewed and ok. So, for example, if r111000 is the branch point, and r111000-r111020 are reviewed, but r111021 is a huge omnibus change that sits unreviewed or fixme'd, then r111020 would be the cutover point, even if later revisions are fully reviewed. That's hopefully an extreme example, but the goal is to make sure that trunk is always fully reviewed. What would happen to everything after the cutover point is TBD. 
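To make the selection criterion concrete, here's a hypothetical sketch of that rule (revision numbers are plain integers; `reviewed_ok` stands in for whatever record of reviewed-and-ok revisions we'd actually consult):

```python
def cutover_revision(branch_rev, reviewed_ok):
    """Latest revision r such that every revision after branch_rev, up to and
    including r, has been reviewed and marked ok. The branch point itself is
    grandfathered in, so it always qualifies."""
    r = branch_rev
    while r + 1 in reviewed_ok:
        r += 1
    return r
```

With r111000 as the branch point, r111000-r111020 reviewed, and r111021 unreviewed, this picks r111020 regardless of how many later revisions happen to be reviewed.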
I haven't talked to Chad about this, but I think it's conceivable that we can import the remainder of commits into a branch that we can cherrypick. The dynamic that this will create is that it will motivate more peer-to-peer scrutiny of code, rather than waiting for one of the reviewers to play bad cop. Until we agree on this strategy or some other strategy that everyone agrees is workable, we'll need to keep the code slush in place. Thanks Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] How to avoid a post-branch code slush (Re: Branch 1.19 on Tuesday, February 6?)
On Wed, Feb 8, 2012 at 3:43 PM, Brion Vibber br...@pobox.com wrote: This is one of the reasons I've been hoping we'd move to a more pre-commit review model. Especially for big refactorings and cleanups that have limited immediate value, we tend to get a lot of breakages and not a lot of interest in fully reviewing them (e.g. actually checking all the code paths to make sure they really work).

Here's the thinking that led to where we are: the cutover to Git is the point at which we want to fully move to precommit. It seems like an enormous pain-in-the-butt to move to full precommit with our current toolset (SVN + the CodeReview tool). However, we're much closer than we've ever been to having Git, and it may be worth dealing with some short-term pain.

To a certain degree, I'd actually consider it desirable to have a permanent 'slush' to the extent that destabilizing work should *always* be talked out a bit and tested before it lands on trunk/head/master.

Yup, agree 100%. Let's all just pretend this has always been the status quo starting right now. I think we've already established that there used to be more liberal reversion, and that when that went away, so too went our ability to stay on top of the review queue.

If we're not ready to go fully git the instant we branch 1.19,

Given that the branch just happened, and we're not ready yet, that's the case. Chad can give you more of an update, but my understanding is:

* A (hopefully) final test migration of core is slated for this week. Chad believes he's got all of the blocking problems sorted out.
* Extensions migration isn't going so smoothly. The same tools that work splendidly with core seem to crash with the very small subset of extensions that he's tried it out with. Could be a minor problem that's easy to fix, or could be gawd awful. TBD.
* We'd like a two-week window of warning/testing/playing around before making the cutover.

All told, the current plan is beginning of March for core, middle of March for extensions. 
More details here: https://www.mediawiki.org/wiki/Git/Conversion ...and in the email that I hear Chad is writing :-) we may wish to consider applying more formal review to things proposed to go into trunk on SVN. This may be simpler than attempting to synchronize SVN and git via post-SVN-pre-git reviews... I'd be perfectly fine with either outcome (more formal pre-commit review, or picking our SVN-Git cutover point based on what's reviewed). Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] New committer: Chris McMahon (cmcmahon)
Hi everyone,

Chris McMahon (cmcmahon) is our latest addition to the committer list.

Rob

On Tue, Jan 31, 2012 at 10:02 AM, Rob Lanphier ro...@wikimedia.org wrote: Hi everyone, I'd like to welcome Chris McMahon to the Platform Engineering team as our new QA Lead. Chris has a long history working in software testing, coming to us most recently from Sentry Data Systems, where he was responsible for test automation. One particularly relevant bit of experience from Chris's past was his work at Socialtext on their wiki product, expanding the Selenium-based automated test suite from 400 individual assertions to 10,000 over the span of two years. Chris is also active in the outside community. He leads the Writing About Testing group and annual conference, which he founded in 2009. He also helped design and build the SeleNesse testing framework, which is a wiki-based tool for building acceptance tests that get executed by Selenium. In his role as QA Lead here, Chris will be responsible for figuring out what sorts of testing process we can bring to MediaWiki development. His first task will be to join in on the tail end of the 1.19 deployment process, helping us with whatever last-minute testing makes sense at this stage, but then he'll have the much larger task of looking at our release and deployment process generally, and figuring out which parts would most benefit from the injection of testing rigor. He'll also be responsible for establishing a more coordinated volunteer effort around testing. Chris will be working remotely from his home in Durango, Colorado. Welcome, Chris! ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] 20 percent time standups prior to 1.19
Hi everyone, This mail is primarily directed toward Wikimedia Foundation employees, but isn't private, so I'm directing it here. As many of you know, we have a policy for Wikimedia Foundation engineers that states that they need to work 20% of their time on maintenance and communication tasks which immediately serve the Wikimedia developer and user community, beyond working on assigned features or longer-term work.[1] Managing how that manifests itself is the responsibility of Platform Engineering, which is why I'm the one frequently prattling on about it. :-) Up until last week or so, this wasn't too tough an activity to coordinate. Code review was in pretty rough shape, so my frequent advice was grab a bucket and bail (as in, start reviewing whatever is in the queue). As the 1.19 deployment starts up next week, coordination is both more important and harder. We're shifting our focus away from code review, and onto fixing blockers and fixmes. Even before recent times, there have been some folks at WMF who have felt a little adrift during their 20% time. So, starting this week, we're instituting explicit 15 minute meeting times for the beginning of everyone's review 20% period, on the #wikimedia-dev Freenode channel. The schedule for these is listed below: https://www.mediawiki.org/wiki/20_percent#IRC_checkins Those of you who noticed me drop this on your calendar last week, and wondered what this was...well, now you know. I've tried to schedule these such that they happen as close to the beginning of everyone's shifts as possible, without having them at times that are crazy for anyone. That's going to let us coordinate on the particulars of what everyone will be working on reasonably close to the time that they'll be doing the work. Right now, the focus of these is going to be on the 1.19 deployment, mainly around ensuring we get all of the fixmes and blockers figured out before the rollout begins. 
However, even after the dust settles on 1.19, I think we'll still need this level of coordination to start managing the flow of Git pull requests, making real progress on our backlog of patches in Bugzilla, and making progress on resolving the 7000 or so open bugs in Bugzilla. We have a lot of work to get done, and this 20% is going to be a key part of making it happen. Rob [1] https://www.mediawiki.org/wiki/20_percent ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Branch 1.19 on Tuesday, February 6?
Hi everyone, We're hitting the home stretch. Check this out: http://www.mediawiki.org/wiki/MediaWiki_1.19/Revision_report Summary: 27 unreviewed (new) revisions, 14 fixmes It looks like we're getting close enough to the bottom of the review queue that we could conceivably even hit zero unreviewed early next week. Should we target Tuesday for the branch point? The biggest risk for a February 13 deployment is the number of fixmes left. 14 is quite a lot to get through in the short period of time we have left. Please do not be bashful about fixing other people's old fixmes at this point, or even the newer ones provided you coordinate with the author. Thanks Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Branch 1.19 on Tuesday, February 6?
On Fri, Feb 3, 2012 at 8:23 PM, Jeremy Baron jer...@tuxmachine.com wrote: On Fri, Feb 3, 2012 at 22:46, Risker risker...@gmail.com wrote: On 3 February 2012 19:07, Rob Lanphier ro...@wikimedia.org wrote: The biggest risk for a February 13 deployment [...] The thread indicates the deployment will be on Feb 6, but the body indicates it will be on Feb 13. Perhaps a clarification? I'm not sure but my first guess is that deployment and branching are not the same day. Maybe that allows for a week of extra QA/labs beta tests? or some last minute fixes after branching? That's exactly it. Branching is the first step along the way. We *could* conceivably branch really close to the 1.19 push to test2, but branching beforehand gives us a chance to have things a little closer to frozen for the final week before deploy. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Welcome Chris McMahon, QA Lead at Wikimedia Foundation
Hi everyone,

I'd like to welcome Chris McMahon to the Platform Engineering team as our new QA Lead. Chris has a long history working in software testing, coming to us most recently from Sentry Data Systems, where he was responsible for test automation. One particularly relevant bit of experience from Chris's past was his work at Socialtext on their wiki product, expanding the Selenium-based automated test suite from 400 individual assertions to 10,000 over the span of two years.

Chris is also active in the outside community. He leads the Writing About Testing group and annual conference, which he founded in 2009. He also helped design and build the SeleNesse testing framework, a wiki-based tool for building acceptance tests that get executed by Selenium.

In his role as QA Lead here, Chris will be responsible for figuring out what sorts of testing process we can bring to MediaWiki development. His first task will be to join in on the tail end of the 1.19 deployment process, helping us with whatever last-minute testing makes sense at this stage, but then he'll have the much larger task of looking at our release and deployment process generally, and figuring out which parts would most benefit from the injection of testing rigor. He'll also be responsible for establishing a more coordinated volunteer effort around testing. Chris will be working remotely from his home in Durango, Colorado.

Welcome, Chris! ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Reviewers: great work today!
Hi folks, With everything that's been going on in WMF land (SOPA + SF Hackathon), our code review work has been falling behind. Thankfully, we've had another great Friday in CR, which has resulted in things getting much better. As of this writing, we only have 60 new revisions without the nodeploy tag: http://www.mediawiki.org/wiki/MediaWiki_1.19/Revision_report ...and 12 fixmes. Friday seems to be a really good day for code review; it was also a banner day two Fridays ago. We're still behind our original projection, but we're making great progress! We should be able to get down to zero-ish by the February 13 deployment to test2. We'll be cutting it close, but we should be able to do it. Onward! Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Moving forward with Lua
Hi folks,

As you may know, WMF's Platform Engineering group plans to embark on a major performance initiative this year, and has chosen inline scripting as the area with the biggest potential impact given what's practical now. Tim Starling built a Lua prototype last year which showed a lot of promise for making things much faster. One major decision before embarking on this effort was whether we'd stick with Lua or try another language such as JavaScript or Victor's WikiScript implementation. I wanted to make a decision by the end of the month[1], and I think we've done it. We've decided to build a deployable version of Lua as a new alternative to wiki markup for templates, barring some scandalous revelation about Lua's lurid past or other unforeseen barrier.

Tim will be leading this effort, and will start on the implementation some time after the dust settles on the 1.19 deployment and the Git migration. The project page for this is located here: http://www.mediawiki.org/wiki/Lua_scripting

Rough notes from our meeting yesterday are also available [2]

Rob

[1] http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/57769/focus=57813
[2] http://www.mediawiki.org/wiki/Lua_scripting/Meeting_2012-01-25 ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] MediaWiki 1.19, Resource Loader, Gadgets, and User Scripts
On Fri, Jan 20, 2012 at 7:25 PM, Mark A. Hershberger mhershber...@wikimedia.org wrote: Let's ensure that the Wiki* experience is consistent. We can avoid some of the mistakes like those that happened when we introduced ResourceLoader. Backwards compatibility is important. If we upgrade MediaWiki and we know that people are going to complain because a widespread dependency (like mw.util) disappeared, let's eliminate that experience gap. Thanks for posting this to the list. I've highlighted above the most important part of what you wrote -- we will not introduce known incompatibilities and count on our ability to clean these things up after the fact unless that's the only way to do it. My understanding is we have far better options in this case, and we'll have more than enough work on our hands cleaning up after unknown incompatibilities. A couple of relevant bugs: Bug 33711 - mw.util.$content is undefined in MediaWiki:Common.js https://bugzilla.wikimedia.org/show_bug.cgi?id=33711 Bug 33746 - Resource Loader should pre-depend on mw.util https://bugzilla.wikimedia.org/show_bug.cgi?id=33746 Roan should be weighing in on these soon. My understanding is that he plans to add the necessary dependency to make this work. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] New committer: Thibaut Horel (zaran)
Hi everyone, Please welcome new committer Thibaut Horel (username: zaran), who plans to help maintain the Proofread Page extension. Thibaut is an active contributor on fr.wikisource.org and is here at the San Francisco Hackathon this weekend. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Welcome, Andrew Otto - Software Developer for Analytics
Hi everyone, I'm pleased to announce Andrew Otto will be coming to Wikimedia Foundation as a software developer in Platform Engineering, focused on analytics. We've been hiring for this spot for quite some time, and I'm happy we held out for Andrew. Andrew comes to us from CouchSurfing, where he worked for the past four years as one of the very early technical staff there, working in various places throughout the world (Thailand, Alaska, and New York are the ones I recall). His team scaled their systems from a few web servers and one monolithic database, to a cluster of over 30 machines handling almost 100 million page views per month. He was responsible for introducing Puppet for system configuration at his last job, and much of his work at CouchSurfing has been in reviewing code and maintaining a consistent architecture for CouchSurfing. We're really excited to have Andrew on board to help bring some systems rigor to our data gathering process. Our current data mining regime involves a few pieces of lightweight data gathering infrastructure (e.g. udp2log), a combination of one-off special purpose log crunching scripts, along with other scripts that started their lives as one-off special purpose scripts, but have gradually become core infrastructure. Most of these scripts have single maintainers, and there is a lot of duplication of effort. In addition, the systems have a nasty tendency to break at the least opportune times. Andrew's background bringing sanity to insane environments will be enormously helpful here. Andrew has an email address and is technically starting the onboarding process, but is still wrapping up at CouchSurfing. He'll be with us part-time starting January 17, and ramping up to full-time starting in April. Andrew is based out of Virginia, but is still traveling the world. Right now, you'll find him in New York City. Please join me in welcoming Andrew to the team! 
Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Home stretch of 1.19 code review
Thanks for sending out this reminder, Mark! Comments inline: On Wed, Jan 4, 2012 at 5:23 PM, Mark A. Hershberger mhershber...@wikimedia.org wrote: CRStats (http://toolserver.org/~robla/crstats/) shows that on December 24th we had around 500 revisions left for review. This wasn't too bad, except that CRStats shows we were supposed to be closer to 300 revisions left. Today is worse. We're up to over 600 revisions for review, while Robla's projections show that we're supposed to be closer to 200 revisions for review. I've been going over what we have, and it may not be all *that* dire, though I'd like to get everything tagged before making that pronouncement. Here's the tagging we've got so far: http://www.mediawiki.org/wiki/MediaWiki_1.19/Revision_report There's a couple of big areas that stand out: * educationprogram: can we defer these until after we deploy? * socialprofile: doesn't block 1.19 deployment. My understanding is that ashley would still like review on these, though. We can mark these as nodeploy, though that won't take them out of the graph (only the revision report above). Also, as an aside, I've added a couple of pseudo-tags to the graph: http://toolserver.org/~robla/crstats/ The "newly new" pseudo-state is a plot of new revisions added in a given week. The "reviewed" pseudo-state is all revisions that transition from new to some other state in a given week. This shows that the review load has been steadily rising over the past year, which means that even though we've been adding capacity, that's not enough to keep up without either slowing down a little bit to catch up, or really stepping up. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Code review thoughts: module reviewers
On Tue, Dec 20, 2011 at 3:01 PM, Brion Vibber bvib...@wikimedia.org wrote: I'd like us to seriously consider having primary reviewers for various code modules, so things like this get handled asap and don't end up falling through the cracks -- big changes, and small confusing changes ;) -- should get pretty consistently treated. Mark and I have started the process of tagging code areas, which you can see reflected on the revision report: http://www.mediawiki.org/wiki/MediaWiki_roadmap/1.19/Revision_report There are still a lot of untagged revisions, but not nearly as many as there were a day ago. The tag names are mainly based on extensions right now, but there is a small amount of grouping going on. If someone wants to come up with a more sensible tagging structure, by all means go for it. I hope this is a helpful first step. After the holidays, we can get serious about dividing up the work based on tag. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] The role of designers in MediaWiki (Re: Diff colors - a disaster waiting to happen)
On Thu, Dec 22, 2011 at 11:04 AM, Chad innocentkil...@gmail.com wrote: I highly doubt we'll get guaranteed backlash over such a minor shift in diff colors. Hehha ha ha ha(*uncontrollable laughter*) Sorry, we will, but not because we should. Pretty much any design change anyone makes runs a substantial risk of endless bikeshedding. While there are only a handful of people that care about the difference between database query strategies or parameter ordering in functions, pretty much everyone who interacts with the product will have an opinion about what it will look like. The colors, the ordering of the layout, the whole bit. Calling it a bikeshed argument probably diminishes the importance of it a bit too much. The colors are important. What the user interface looks like matters a great deal. The question, though, is what is the best way of dealing with user interface design issues? Many of us know that as non-designers, while we may have opinions on these things, they probably don't matter.[1] To make matters worse, there are going to be a lot of things that are truly a matter of taste. So, someone is going to need to make the call, because there's no way to arrive at The Truth. We can do what most open source projects have done all along, which is to leave it up to the last developer that touches the code. That's a fine way of guaranteeing that what we end up with is going to be a hodge-podge of a lot of really bad stuff, combined with a few things that would be good if they were applied consistently and we committed to that particular direction. However, those good things won't be so good since they won't be applied consistently, since it'll be the taste of whoever happens to be around that will be applied. No one has declared Brandon the dictator of these things, nor has anyone declared that we're throwing consensus out the window. 
However, it's hard to imagine Brandon being able to do anything other than have this type of conversation if every decision he makes needs to be justified and discussed to the degree that this diff color discussion was. Furthermore, Brandon ends up dealing with the consequences of design decisions that he isn't consulted on, which is why we need to make sure that we consult him, and be prepared to go along even if we disagree. The right answer is often the most consistent answer, and our odds of achieving consistency go up if we defer to one person for issues that are matters of taste. There are a few other issues tagged design in code review, and there are likely to be more: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/tag/design Brandon is going to be weighing in on those as well, possibly making changes in the code based on his opinions on those. Can we please, oh please, not spin up a monster debate about each of those others? Please? Thanks, Rob [1] http://www.codinghorror.com/blog/2006/11/this-is-what-happens-when-you-let-developers-create-ui.html ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Code review thoughts: module reviewers
On Tue, Dec 20, 2011 at 3:01 PM, Brion Vibber bvib...@wikimedia.org wrote: I'd like us to seriously consider having primary reviewers for various code modules, so things like this get handled asap and don't end up falling through the cracks -- big changes, and small confusing changes ;) -- should get pretty consistently treated. I think this is a grand idea. A few code areas: Parser (3 revs) https://www.mediawiki.org/w/index.php?path=%2Ftrunk%2Fphase3%2Fincludes%2Fparser&title=Special%3ACode%2FMediaWiki%2Fstatus%2Fnew FileRepo (12 revs) https://www.mediawiki.org/w/index.php?path=%2Ftrunk%2Fphase3%2Fincludes%2Ffilerepo&title=Special%3ACode%2FMediaWiki%2Fstatus%2Fnew API (4 revs) https://www.mediawiki.org/w/index.php?path=%2Ftrunk%2Fphase3%2Fincludes%2Fapi&title=Special%3ACode%2FMediaWiki%2Fstatus%2Fnew Resource Loader (4 revs): https://www.mediawiki.org/w/index.php?path=%2Ftrunk%2Fphase3%2Fincludes%2Fresourceloader&title=Special%3ACode%2FMediaWiki%2Fstatus%2Fnew Someone will need to do some more looking at different ways of slicing things up. So far, I'm not coming up with many easy ways of dividing core with this style of query. Extensions should be a lot easier to divvy up. Anyone got good ideas for how to make this work? Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
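As a side note, queue links like the ones above are easy to generate per module. Here's a small sketch (the helper function is hypothetical, not part of CodeReview; it just reproduces the path/title query parameters used in this mail):

```python
from urllib.parse import urlencode

BASE = "https://www.mediawiki.org/w/index.php"

def review_queue_url(path):
    """Build a CodeReview 'new revisions' query URL for one source path."""
    params = {
        "path": path,
        "title": "Special:Code/MediaWiki/status/new",
    }
    return BASE + "?" + urlencode(params)

print(review_queue_url("/trunk/phase3/includes/parser"))
```

Dividing up extensions would then just be a matter of mapping each `/trunk/extensions/Foo` path to a reviewer and generating one such link per person.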
[Wikitech-l] Making the Lua/Javascript decision (Re: Performance roadmap update)
On Wed, Dec 14, 2011 at 9:00 PM, Tim Starling tstarl...@wikimedia.org wrote: We still want to do something about parser performance in the first half of 2012, so we're going to bring forward our other performance project, i.e. server-side scripting embedded in wikitext. That's a project which is still at an early stage of planning. We will need to define its scope, and to bite the bullet and make some tough design choices (such as Lua versus JavaScript), if it's going to progress from pipe dream to reality. Let's resolve to have all of the big pieces nailed down by January 31. In particular, it should be possible to make the WikiScript vs Lua vs Javascript decision well before that time. There already seems to be a credible starting point for Lua and WikiScript, but not yet one for Javascript or any other language. Is there anyone willing to put together a Javascript proof-of-concept? Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Deployment process clarification?
Hi Ian, Thanks for sending this mail! (I prodded him to do it in a private thread run amok) More inline... On Wed, Dec 14, 2011 at 5:05 PM, Ian Baker i...@wikimedia.org wrote: [Process A] 1. Discussion happens, a need arises, etc. 2. Someone writes some code, probably in the form of an extension 3. That extension is checked in and makes its way through code review. All revisions must be ok'd to deploy. 4. On-wiki consensus is reached regarding deployment, and the discussion closes. 5. A ticket is created to request deployment on a specific wiki 6. Someone Who's Been Around A While (Tim, Roan, maybe others) looks over the code and decides it's okay 7. (maybe) A discussion regarding the extension happens here on wikitech-l. 8. The ticket gets picked up, extension gets deployed, ticket gets closed. [] [Process B] 1. Someone decides we need the code. 2. The code gets written 3. Code is checked in, makes its way through CR, must be ok'd before deployment 4. A deployer schedules a window and deploys it. [] My specific questions: Process B is what I'm used to, but it seems that for this extension, it's process A. When do we pick one or the other? Is process A for community-contributed code, whereas B is for stuff from WMF? Do things change when we're deploying a whole new extension? Not all community-contributed code has been subjected to process A, and not all WMF-developed code has skated by on process B. However, it's true that it generally looks a lot harder for community-contributed stuff to make it through. It basically boils down to this: does the person deploying it feel comfortable deploying it? Well, here's what they're afraid of: a. Is this something that is going to melt down the servers on my watch? b. Is this something that is going to open up a big security hole in the site? c. Is this something that is going to cause a lot of editor/reader angst and drama? d. 
Is this something that runs against some WMF policy, stated direction, or unstated preference by one of the powers that be? Process B is generally the process that occurs when the person deploying it isn't terribly afraid of any of the above. If you find yourself in process A, chances are, it's because the person deploying doesn't feel like it's been vetted to the degree necessary that they're going to put themselves on the line by deploying it. I've got more thoughts, but rather than trying to get everything in one email, I'll stop and send now. Rob p.s. I'd suggest a separate thread about InteractiveBlockMessage, rather than lumping it into a meta conversation about deployment process. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Reverts on the road to 1.19
Hi everyone, Thank you to everyone who has been reviewing code. Since December 3, there's been a decline from 1183 unreviewed new revisions, to 644 [1] That's the good news. The bad news is that we hadn't done so well in the week prior to that, and we haven't yet made up for lost ground. By the goals set for a 2012-01-31 review completion date[2], we should have been down to 644 new sometime during December 5, and 360 is where we should be now. That would seem to put us roughly 8-9 days behind where we wanted to be. Some of this is keeping the pace on code review, and everyone seems to be doing a more-than-credible job there. However, that effort is going to be wasted if the pace of new code outstrips our capacity to review it. A bunch of us in Platform have been discussing ways to make sure we don't backslide, and came up with a few ideas. We're going to try to get to some important reviews first, as documented on the 1.19 roadmap page[3]. If we can make any ok/revert decisions on these revisions soon enough, and we have to revert, we'll leave ourselves enough time to deal with the consequences. Speaking about reverting, one pattern the reviewers are going to be especially sensitive to: refactoring without mailing list discussion or any clear consensus. If you're doing core work that's not in service to getting 1.19 finished, and you commit it without discussing it much, don't be surprised if your work is unceremoniously reverted. No one likes being trigger happy about reverting, but we really mean it about needing to finish up 1.19. We need to get caught up for a whole raft of reasons (getting to faster release cycles, getting moved over to Git, getting FileBackend done and deployed for Swift). Even if we weren't in this particular crunch, there's a lot of fatigue about unilateral refactoring decisions. Let's at least talk about them before lobbing them out. 
One thing that's been tough about the 1.18 deployment has been that we're just now stumbling into issues that were introduced as early as January of this year. With the 1.19 deployment, we'll not only close that gap quite a bit (only stretching back to July now), but also potentially make it possible to deploy code not long after it's written, even if it's committed early in the deploy cycle. Please work with us to get there. Thanks! Rob [1] http://toolserver.org/~robla/crstats/ [2] https://docs.google.com/a/wikimedia.org/spreadsheet/ccc?key=0Agte_lJNpi-OdDJvZVBmYUtZbm1zYVRCYTFRZkxuVFE&hl=en_US#gid=5 [3] http://www.mediawiki.org/wiki/MediaWiki_1.19/Roadmap ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Updated revision report for 1.19
Hi everyone, In trying to figure out where the bottlenecks are, it seemed it would be helpful to have a breakdown of the revisions. So, here's the first of many revision reports for 1.19: https://www.mediawiki.org/wiki/MediaWiki_roadmap/1.19/Revision_report This updated report now contains three sections: * New revisions by tag * New revisions by author * Fixmes There are several helpful things that jump out now: * Many, many untagged revisions. I think it would be very helpful for those doing review to have revisions tagged by product area (e.g. parser, frontend, etc.). * The new revisions by author is a new view, and it's pretty helpful to see the big picture. For example, it looks like Aaron, Patrick, Roan, and Sam have a ton of stuff to review. It may be helpful for those people to pair up with (other) reviewers for some real-time reviews of their code. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
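For anyone curious how a breakdown like this can be computed, here's a sketch. It assumes revisions come out of CodeReview as (author, tag, status) tuples; the sample data below is invented for illustration:

```python
from collections import Counter

# Invented sample data in (author, tag, status) form.
revisions = [
    ("aaron", "filerepo", "new"),
    ("roan", "resourceloader", "new"),
    ("aaron", "filerepo", "fixme"),
    ("sam", None, "new"),  # an untagged revision
]

# The three sections of the report: new revs by tag, new revs by author, fixmes.
new_by_tag = Counter(tag or "(untagged)" for _, tag, status in revisions if status == "new")
new_by_author = Counter(author for author, _, status in revisions if status == "new")
fixmes = [r for r in revisions if r[2] == "fixme"]

print(new_by_tag)
print(new_by_author)
print(len(fixmes))
```

The "(untagged)" bucket is what makes the many-untagged-revisions problem visible at a glance.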
[Wikitech-l] New code review stats page
Hi all, I spent some time this weekend spiffing up the code review stats page: http://toolserver.org/~robla/crstats/ There may still be some bugs, especially since my cross-browser testing involved loading in Chrome and Firefox on a single Linux box, but it should generally work a lot better than the old version. I've changed graphing libraries from flot to jqplot. The latter lets you zoom along the x- and y-axes (the latter being a frequently requested feature). I've also untangled some of the spaghetti in the code, which means it should be easier for someone who wants to contribute to do so. Here's the source: https://gitorious.org/mwcrstats (docs are still sorely lacking though...but feel free to bug me if you're interested). The numbers for trunk don't look great at all. I'll probably dig more into the numbers tomorrow, but we're really lagging behind our projections there (in fact, we've lost ground this past week). The numbers for trunk/phase3 look better, though I think we may still be going slower than what would be needed for a release. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] FileBackend branch
On Wed, Nov 30, 2011 at 5:51 PM, Aaron Schulz aschulz4...@gmail.com wrote: I'm starting to finish the initial coding for the FileBackend branch (see https://svn.wikimedia.org/viewvc/mediawiki/branches/FileBackend). I still have to get to thumb.php and img_auth.php though. Simple testing of uploads, re-uploads, moves, deletes, restores, and such are working on my local testwiki. Hi folks, A few more details on this. Aaron is trying to get some important refactoring work done in service to this project: http://www.mediawiki.org/wiki/SwiftMedia We'd like to land this code in trunk as soon as we can, shake out the inevitable bugs, and get this rolled out to the cluster as part of 1.19. That will then make it possible for us to complete the migration from NFS as our file storage tech to Swift. That in turn will enable us to greatly expand our file capacity on commons, as well as making it so that we no longer have a scary single point of failure for file storage on the Wikimedia cluster. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
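The point of the FileBackend refactoring described above is that upload and thumbnail code talks to an abstract storage layer, so the concrete backend (an NFS-style filesystem today, Swift later) can be swapped without touching callers. A rough sketch of that shape, in illustrative Python rather than the branch's actual PHP API:

```python
import os
import shutil
import tempfile
from abc import ABC, abstractmethod

class FileBackend(ABC):
    """Abstract storage layer; callers never touch paths or HTTP directly."""
    @abstractmethod
    def store(self, src_path, dest_name): ...
    @abstractmethod
    def exists(self, dest_name): ...

class FSFileBackend(FileBackend):
    """Filesystem (local or NFS-mounted) storage."""
    def __init__(self, root):
        self.root = root
        os.makedirs(root, exist_ok=True)
    def store(self, src_path, dest_name):
        shutil.copy(src_path, os.path.join(self.root, dest_name))
    def exists(self, dest_name):
        return os.path.exists(os.path.join(self.root, dest_name))

class SwiftFileBackend(FileBackend):
    """Would talk to Swift over HTTP; stubbed out in this sketch."""
    def store(self, src_path, dest_name):
        raise NotImplementedError("stubbed in this sketch")
    def exists(self, dest_name):
        raise NotImplementedError("stubbed in this sketch")

# Quick demo against a temporary directory.
tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "upload.txt")
with open(src, "w") as f:
    f.write("example bytes")
backend = FSFileBackend(os.path.join(tmp, "store"))
backend.store(src, "Example.txt")
```

Because callers only hold a `FileBackend`, migrating Commons from NFS to Swift becomes a configuration change plus a data copy, rather than a rewrite of every code path that touches files.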
[Wikitech-l] Code review goals for 1.19 deployment
Hi everyone, Here's an attempt at putting together a more sophisticated (though not necessarily correct) goal for 1.19: https://docs.google.com/a/wikimedia.org/spreadsheet/ccc?key=0Agte_lJNpi-OdDJvZVBmYUtZbm1zYVRCYTFRZkxuVFE&hl=en_US#gid=5 The goal was not to make something 100% accurate based on our past record, but to come up with a goal that's informed by our history, but also nudges us a little to get better than before. During the 1.18 cycle, our goals were based on a linear review rate (same number of revisions every day). That may have been too easy on us in the early going. What was clear looking at the numbers from last time around was that, when focused on code review, the curve looked a lot more like an exponential decay curve (more reviews in the early going, tapering off toward the end). That's assuming that our backlog increase in July was an anomaly that we won't repeat. So, the 1.19 cycle has new review goals that are based on an exponential decay curve, with a linear fudge factor built in so that we don't just approach zero, but make it there. Fixmes, however, did seem to have a linear rate during the 1.18 cycle, so I left our goals linear for this release. I've accounted for a plateau in the last two weeks of December when many WMF staff will likely be taking holiday vacation. The goal is to get done with review by January 31, 2012. While this is much later than we were previously hoping for, it's still pretty aggressive given the backlog we have. This assumes that the huge swath of revisions that Hashar marked deferred on Friday all stay deferred, for example, and that we'll still find other pockets of revisions that we can use to knock off 50+ revs a day in the early going. Does this look like a workable goal to everyone? Assuming so, we'll report our progress against this goal, and plan the 1.19 deploy for early February. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
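For the curious, the shape of a decay-plus-linear-fudge goal curve can be sketched like this. The constants below (starting backlog, cycle length, half-life) are made up for illustration; the real numbers live in the spreadsheet:

```python
def goal(day, start=1300, total_days=60, half_life=20):
    """Target backlog on a given day: exponential decay, plus a linear
    correction so the curve reaches exactly zero on the last day instead
    of merely approaching it."""
    decay = start * 0.5 ** (day / half_life)
    # What pure decay would still leave unreviewed on the final day...
    residual = start * 0.5 ** (total_days / half_life)
    # ...bled off linearly over the whole cycle (the "fudge factor").
    return max(0.0, decay - residual * day / total_days)
```

Early in the cycle the decay term dominates (big daily targets), and the linear term quietly guarantees the backlog target hits zero on day `total_days`.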
Re: [Wikitech-l] Bugzilla vandalism
On Tue, Nov 22, 2011 at 4:12 AM, Tim Starling tstarl...@wikimedia.org wrote: I looked into the possibility of writing an automated revert tool as a command-line perl script integrated with Bugzilla, but it looked like it would be fairly complicated: Good news: there's a bug filed for this in Bugzilla's tracker: https://bugzilla.mozilla.org/show_bug.cgi?id=363346 Bad news: it's 5 years old, without a lot of visible interest. Also On Tue, Nov 22, 2011 at 5:25 AM, Chad innocentkil...@gmail.com wrote: On Tue, Nov 22, 2011 at 8:16 AM, Nikola Smolenski smole...@eunet.rs wrote: Perhaps some throttling could be helpful here? I don't see a legitimate user filing more than a bug per minute or a change per, say, 15s. I can't seem to find any built-in BZ support for such a thing--nor can I even find a bug (other than this tangentially-related WONTFIX[0]) requesting it. New feature request filed: https://bugzilla.mozilla.org/show_bug.cgi?id=704753 Anyone with some Perl chops want to take on a project? Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
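For what it's worth, the throttle Nikola describes is simple in principle: reject a filing if the same user submitted one less than some minimum interval ago. A hypothetical sketch (this is the idea in code, not anything Bugzilla actually supports):

```python
import time

class SubmissionThrottle:
    """Per-user minimum-interval throttle, e.g. at most one bug per minute."""
    def __init__(self, min_interval=60.0):
        self.min_interval = min_interval
        self.last_seen = {}  # user -> timestamp of last *accepted* filing

    def allow(self, user, now=None):
        now = time.monotonic() if now is None else now
        last = self.last_seen.get(user)
        if last is not None and now - last < self.min_interval:
            return False  # too soon; rejected attempts don't reset the clock
        self.last_seen[user] = now
        return True
```

A real implementation would also need persistence across web requests and an exemption for trusted users, but the core check is just this timestamp comparison.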
Re: [Wikitech-l] Code review graphs working again, and....
On Mon, Nov 21, 2011 at 3:52 PM, Brion Vibber br...@pobox.com wrote: 1.18 hasn't been released yet, so presumably we're still concentrating on solidifying the 1.18 release in CR. Per previous discussions about the 1.17 release delays I would have expected this to have gone *really fast* and been out the door around mid-October, within a few days of going live on the main sites. /me reminds Brion IRL just how nutty mid-October was for WMF tech staff. :) It's also Sam's first release, so it's taking a bit longer than if Tim were 100% focused on it. But the good news is that we're nearly there. It appears we have gotten to a release candidate as of last Friday, which is a good sign; is a final .0 release slated yet? We would probably push it out this week, but there's a security bug we're taking a look at. In the meantime we've got no upcoming 1.19 deployment pressure and nobody assigned to ongoing code review, so there's nothing to compel further action -- it's not surprising to me at all that it's falling behind. Brion and I spoke in real life right after this, which is highly unfair to the rest of you, but it was really efficient for us. Here's the gist: * I've already started a campaign to remind people of the 20% policy (http://www.mediawiki.org/wiki/20_percent ) * I'll also be sending out a revised target date for 1.19, based on looking at the updated numbers and how quickly we reviewed in some of our more determined review sprints, figuring in some time for a little regression during the holidays. * More people from Platform Engineering will be reinforcing that we want the next release to happen soon, even if that means other things (e.g. Git migration) lag as a result. Not a perfect answer, but I think we're improving with each release. This time around, one key difference will be clearing the review backlog before branching, which, with any luck, means less backporting hell. 
Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Code review graphs working again, and....
Hi everyone, The JS-generated code review graphs are working again. The phase3 graphs now line up: http://toolserver.org/~bryan/stats/codereview-status-diff.png http://toolserver.org/~robla/crstats/crstats.trunkphase3.html The minor discrepancy between those two graphs can be explained by a minor bug in mine. In my version, those revisions that span multiple trees (e.g. r103519 [1]) aren't counted. That seems to be a relatively small number of revs (mostly localization updates). As for the and..., it looks like code review in /trunk is quickly getting out of control: http://toolserver.org/~robla/crstats/crstats.trunkall.html ...at a pace far faster than I had realized. I'm hoping that a lot of the revisions in there are easily deferred, but if not, things are looking pretty grim for 1.19. Rob [1] https://www.mediawiki.org/wiki/Special:Code/MediaWiki/103519 ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Fwd: [Gendergap] Congratulations to WMF's Sumana Harihareswara!
Cool! This is well-deserved recognition. Rob On Wed, Nov 9, 2011 at 8:23 AM, Erik Moeller e...@wikimedia.org wrote: Congratulations to Sumana :-). May she bring many more women to (wiki-)tech. Erik -- Forwarded message -- From: Sarah Stierch sarah.stie...@gmail.com Date: Wed, Nov 9, 2011 at 6:06 AM Subject: [Gendergap] Congratulations to WMF's Sumana Harihareswara! To: Increasing female participation in Wikimedia projects gender...@lists.wikimedia.org Cc: Sumana Harihareswara suma...@wikimedia.org Hi everyone, I want to congratulate Sumana Harihareswara, Volunteer Development Coordinator for the Wikimedia Foundation, on being chosen as Femmeonomics 50 Women to Watch in Tech. She's listed alongside some really amazing women - including Valerie Aurora who is a friend of this mailing list and co-founder of the Ada Initiative.[1] She's quite a voice within the community - encouraging developers and programmers to get involved in Wikimedia and participating in conferences like Open Source Bridge. On a personal note, Sumana has been a beam of encouragement for me, and has helped me gain confidence in regards to my open source skills and helping me learn more about feminist and women's roles and organizations in the open source community. She's also great to share a bottle of wine with. Congratulations Sumana! We'll be watching (pressures on!) -Sarah [1] Currently only the first ten are listed, but the rest are on the way! http://femme-o-nomics.com/2011/10/the-50-women-to-watch-in-tech-the-first-10/ -- Sarah Stierch Consulting Historical, cultural artistic research advising. 
-- http://www.sarahstierch.com/ ___ Gendergap mailing list gender...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/gendergap -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Front-end devs needed to fix MediaWiki 1.18 release blockers
Hi folks, Here's the list of release blockers for 1.18: https://bugzilla.wikimedia.org/buglist.cgi?resolution=---&target_milestone=1.18.0%20release ...with summaries below. Basically, we're down to two bugs. Could we get some front end development love here? == WikiEditor scrolls the browser and does not insert on IE8 on Windows 7 https://bugzilla.wikimedia.org/32241 Reported November 7, but repro'd on three different machines. This seems like a problem we should also be fixing in deployment. == Unable to add/remove buttons from (classic) toolbar from a gadget after MW update https://bugzilla.wikimedia.org/31511 It's actually not clear to me that this should be a 1.18 release blocker. It's a problem that seems to have occurred in some form on 1.17, and as Roan says: It seems the old toolbar's interface for adding buttons depends on loading order too much. The WikiEditor toolbar's interface is a bit better in this regard, but not much. These interfaces should be redesigned such that the loading order either doesn't matter or must be toolbar-first-gadget-later (so the gadget can depend on the toolbar and force the order to be that way), and such that adding, removing and modifying buttons at any location in the toolbar is supported. This seems like an architectural issue that isn't going to get solved in the time that we'd like to get 1.18 finished off, so if others agree, the simple fix may be to put it in the release notes. Let's either agree on this, or agree on a plan for fixing this. Thanks, Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] License exceptions in Wikimedia's repo (was Re: SVN Extension Access)
Hi everyone, There are a ton of issues conflated in the SVN Extension access thread. I'm sure there are things we can improve about that process, and I'll talk to the people involved next week about it. I've asked Sumana not to rush out a response on this thread today, and I ask that we table non-license-related issues until after Monday. Now, on to the license issues: On Fri, Nov 4, 2011 at 10:32 PM, Olivier Beaton olivier.bea...@gmail.com wrote: There seems to have been concern with my original license, a BSD-2-Clause with copyright assignment so I don't lose the ability to distribute my code, this isn't the GPL, you need a contributor agreement with BSD (as I understand it). Olivier, I'm sorry your access request didn't go the way you hoped, but this issue alone is enough to torpedo your access request. I'm generally sympathetic to the desire of commercial entities to have a copyright assignment clause (as I've written about before[1]). However, to the best of my knowledge, Wikimedia Foundation is not about to dive into that thicket, and I'd personally be vehemently opposed to us doing so. As best I know, we don't have a stated policy for the license conditions for code that is contributed to our source repository, but we probably should. I'll take a stab at the previously unstated policy now:
1. We have a strong preference for GPL version 2 or later (more on this in a bit).
2. We generally accept licenses that are compatible with GPL v2 or greater. BSD 2- and 3-clause, MIT, LGPL, and many other licenses fall into this category.
3. Don't mess with the license headers of code other people wrote without their consent, because: a) even in cases where it's legal (e.g. while it's perfectly legal to slap a GPLv2 header on code that was previously under BSD), it's rude. Don't do it. b) it's frequently not legal. Don't "fix" someone's license if you don't believe it is the proper license under this policy. Revert the code instead.
4. GPLv3 usage is still largely an undecided matter, and we ask that committers not use this license in any GPLv2+ licensed extension or in core. I believe that some extensions may be checked in under this license, but we're avoiding it for WMF-created work, simply because of the one-way nature of the decision to move to it.
5. Anyone can contribute to anyone else's code under the stated license for that code, with no other strings attached.
Don't take the above as gospel...that's just pretty much the set of assumptions I've been operating under, and I suspect it's more or less what others are operating under as well. We can have a debate right here in this thread about whether this is the right policy to have, but I'm not faulting anyone for making these assumptions. There may also be extensions that don't adhere to this policy. If there are, we should probably have a discussion about why that is, and what (if anything) we should do about them. On "GPL version 2 or later", what is meant by that is that the license header explicitly says the file is licensed as GPL "either version 2 of the License, or (at your option) any later version." It does not mean "choose whatever version of the GPL you want." Olivier, I'm sorry this wasn't clearly communicated to you until now. If having copyright assignment is a requirement for your extension, I suggest you host your extension elsewhere. Rob [1] http://blog.robla.net/2010/thoughts-on-dual-licensing-and-contrib-agreements/ ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Upcoming VipsScaler deployment
Hi everyone, Bryan Tong Minh has been doing some great work on the VipsScaler extension, but there hasn't been a lot of recent chatter on the mailing list about it. Since we're somewhat close to deploying something on the cluster, it probably warrants at least an email. First, some background. Currently, we use ImageMagick for scaling images on Wikimedia websites. VIPS (http://www.vips.ecs.soton.ac.uk/ ) is an alternative image processing system that has a couple of advantages over ImageMagick for our purposes: 1) it's faster, and 2) it's way more efficient with memory. The latter means that we can allow much larger images with VIPS than we currently can with ImageMagick; for example, we currently disallow PNG images greater than 12.5 MP. Changing image scalers may introduce bugs, so we're proceeding cautiously. We have a three-step plan:
Step 1: Deploy the VipsScaler extension, but only use it for PNGs over 12.5 MP and/or TIFFs (which currently generate errors). Let this sit in production a while, fixing any bugs we find with this configuration.
Step 2: Deploy a comparison tool (to be written) that allows anyone to rescale an image with VIPS and compare it to the ImageMagick version. Allow more time for testing and fixing.
Step 3: Switch to VIPS for everything.
If you're interested in following progress, there are a couple of places you can look: * VipsScaler tracking bug: https://bugzilla.wikimedia.org/28135 * Follow changes in SVN: https://www.mediawiki.org/wiki/Special:Code/MediaWiki?path=%2Ftrunk%2Fextensions%2FVipsScaler Step 1 could go out as early as next week, pending final review of the extension. If anyone is interested in helping accelerate this, the comparison tool in step 2 is only just started (it's the SpecialVipsTest special page code), and Bryan is only working on it when he has time (which isn't right now).
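The step-1 routing rule amounts to a simple dispatch on file type and pixel count. A minimal sketch in Python (the function and constant names here are hypothetical illustrations, not the real VipsScaler configuration):

```python
# Hypothetical sketch of the step-1 routing rule: hand only very large PNGs
# and TIFFs to VIPS, and leave everything else on ImageMagick for now.
PNG_PIXEL_LIMIT = 12.5e6  # the current 12.5 megapixel PNG cutoff

def pick_scaler(mime_type, width, height):
    """Choose a scaling backend for one thumbnail request."""
    if mime_type == "image/tiff":
        return "vips"  # TIFFs currently generate errors in ImageMagick
    if mime_type == "image/png" and width * height > PNG_PIXEL_LIMIT:
        return "vips"  # PNGs over the limit are currently disallowed outright
    return "imagemagick"
```

Under this sketch, a 15 MP PNG or any TIFF goes to VIPS, while everything else keeps its current behavior, which is what makes step 1 a low-risk first deployment.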
When we're ready to schedule this, we'll update the calendar here: https://wikitech.wikimedia.org/view/Software_deployments Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Please review: MediaWiki architecture document
On Wed, Oct 26, 2011 at 7:18 AM, Sumana Harihareswara suma...@wikimedia.org wrote: I encourage anybody who's spent any time developing for MediaWiki to go and read https://www.mediawiki.org/wiki/MediaWiki_architecture_document/text -- it's thorough! There are some FIXMEs in there where Guillaume requests more details about performance, caching, and skins, so if you can add some more info on the talk page this week, that would be great. This is a great document to read even if you have no intent to make any edits. It's great background material. Suggestions on what to cut and optimize would also be appreciated, since we have a length quota for purposes of the AOSA book. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Welcome, Antoine Musso! (hashar)
Hi everyone, A belated (re-)welcome to Antoine Musso (a.k.a. hashar on IRC) as a contractor working on continuous integration (e.g. beefing up our unit tests, improving our Jenkins configuration, figuring out how to integrate TestSwarm, etc.). Antoine has been working in our community for quite some time, going by the name Ashar Voultoiz on mailing lists and by hashar on IRC. Now that he's working as a contractor for WMF, he's decided to let everyone know about the Superman behind the Clark Kent alter ego :) You've probably already seen him be even more active (if that were possible) in SVN and elsewhere since Monday, which is when he officially started. Welcome, Antoine! Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Bug 31424 - Anecdotal evidence of IE 8 problems
Hi everyone, We need your help. We have a number of reports on the various village pumps, helpdesks, Twitter, and such that IE8 users are experiencing crashes merely by visiting our site. Here's the bug report: https://bugzilla.wikimedia.org/show_bug.cgi?id=31424 Given the frequency and diversity of reports, there is almost certainly something to this, even though we don't yet have a solid repro case that a developer can actually use to fix this. Here's what we need. If you are actually seeing crashes, we would love to know exact browser version (e.g. IE 8.0.7601.17514), exact operating system version, all plugins installed and their exact versions, and how much RAM your machine has. Please report your findings either in the bug report above, or if you're more comfortable on-wiki, then here: http://en.wikipedia.org/wiki/Wikipedia:Village_pump_%28technical%29#If_you_can_make_IE_crash_pretty_reliably_we_need_your_help. Thanks! Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Git migration planning
On Tue, Oct 4, 2011 at 9:21 AM, Merlijn van Deen valhall...@arctus.nl wrote: Currently, svn.wikimedia.org also hosts other repositories, most notably the pywikipedia repository. Is the plan to keep svn.wikimedia.org and svn-based code review online after the switch, or will the pywikipedia repository also have to switch? We're not in a big hurry to shut down our svn services, and pywikipedia is not really tangled up in the MediaWiki codebase, so I don't imagine we'll be rushing to convert it to git. I imagine there will come a day when we want to get out of the svn business, but we have a few months (at least) to figure out our strategy there. In the short term, pywikipedia in svn can peacefully coexist with MediaWiki + extensions in git. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Making developer access easier to get
On Mon, Oct 3, 2011 at 6:30 AM, Yaron Koren ya...@wikiworks.com wrote: I don't know whether this is WMF policy now, or a personal decision from Sumana, or a decision made by someone else, but in any case I don't understand it. It seems to me that there are two valid reasons for not simply allowing everyone to get a developer account: the first, and major, reason is to prevent malicious users from vandalizing or deleting code. The second is to prevent well-intentioned but incompetent developers from checking in buggy and/or badly-written code that requires lots of fixes and review time by the reviewers. In both cases, the person's presence in SVN would cause more harm than good. I'm going to leave it to Sumana for the more complete reply here (she and I just spoke). Sumana inherited our existing process, and she's been running it about as well as it can be run. I think she's probably been uncomfortable breaking with tradition while she's relatively new and there are a lot of stakeholders, but I think we've got some room to break with tradition. As it stands, we kinda backed into our current process. When I came on board, Tim had just moved over to OTRS. I spent some time with him understanding the process and figuring out how to speed things up, eventually pulling in other developers. Sumana took over the process at this point, which basically involves a handful of folks (Tim, Chad, and Aaron, I believe) quickly reviewing applications, and Sumana helping to dispatch responses. Since a lot of people are applying, there are quite a few to get through. I think the process could benefit from some transparency on both sides of the equation. I know from talking to her that Sumana would love to have a wider pool of developers vetting the list, and a more transparent process would make that easier. Those requesting access can help things by being more transparent, too.
Developers who post to the mailing list and participate in IRC will have a much easier time getting access (assuming they say sensible things in those venues). It's a lot less scary giving access to someone when you have enough information to speculate on what they're likely to do with it, and they've proven that they play well with others. Since we don't have a process for reversing our decision (has anyone ever had their svn access revoked?), we're naturally going to be more conservative about who we let in. Anyway, as Brion says, we're likely to change things around pretty substantially when we migrate to Git, so we shouldn't spend too much time worrying about what svn access looks like for everyone. I don't want to use that as an excuse to shut down any improvement, but merely want to remind people of why we aren't likely to do anything really radical in the short term. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Stage 1 of the 1.18 deployment done
Hi everyone, Stage 1 of the MediaWiki 1.18 deployment is complete. We've deployed to the following wikis:
* he.wikisource.org
* mediawiki.org
* simple.wikipedia.org
* simple.wiktionary.org
* strategy.wikimedia.org
* usability.wikimedia.org
The deployment had a few bumps (mainly problems with LiquidThreads, which we reverted to 1.17), but mostly went well. We started late due to getting a last-minute adjustment to the Resource Loader/Het Deploy compatibility fix in. You can see some of the last-minute changes we made here: http://www.mediawiki.org/w/index.php?path=branches%2Fwmf%2F1.18wmf1&title=Special%3ACode%2FMediaWiki ...as well as logs of major changes to the cluster last night here: http://wikitech.wikimedia.org/view/Server_admin_log Thanks to all involved! Next step is Stage 2 on Monday. More details about the 1.18 deployment generally are here: http://blog.wikimedia.org/2011/09/16/mediawiki118iscomin/ Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Git migration planning
Hi everyone, For a long time, we've been talking about migrating from Subversion to Git. It's time to start getting more serious about it. First: the need to do this. There is pretty broad acceptance that we should move to a distributed version control system (DVCS). Our current Subversion-based system has served us well, but we need a version control system better suited to our development effort. Our community is very distributed, with many parallel efforts, and we need to integrate many different feature efforts. While we've developed lots of coping mechanisms, we sure could use a system that's well suited to more fluid branching and merging. There has been resistance to this in the past, and there still may be some resistance. However, I think we've worn everyone down. :) Next: the selection of Git over other DVCSs. Over the past couple of years, other systems have been mentioned (Bzr, Hg), but there hasn't been any evidence (at least on this mailing list) of anything other than mild support for the alternatives. As you might have seen, our Ops folks have already moved to Git[1], and while they're right in the middle of the tough part of the learning curve, they seem to be adjusting just fine. The complaints seem to be of the "I really need to get used to that" variety rather than the "why are we doing this again?" variety. So, given the momentum that Git has, the ample discussion we've had on the subject, and the fact that Ops is already planning to support Git, this seems to be a settled question. So now, the questions shift from "if?" to "when?" and "how?". When? After some amount of arm twisting by Erik and Brion (*hugz*), I've agreed to float a plan that has us making the migration by the end of the year. This is completely contingent on our ability to get 1.19 deployed in a rapid fashion (which, if we can get through the code review queue at our current rate, could be done in November).
Until we have a more fleshed out plan, though, end of the year is purely a guess, and subject to change (partly based on any ensuing conversation after this mail). However, assuming we can clear the technical hurdles, there aren't any procedural issues I can see getting in the way of a rapid transition. How? Lots of unsorted pieces. There are still decisions we need to make:
* Code review tool: barring unforeseen complications, we're planning to use Gerrit. We need to make sure it'll be a suitable replacement for our existing tool.
* How do we break up the repository? One big repo? Extensions each get their own? We need to sort all of that out.
A draft plan is available here: http://www.mediawiki.org/wiki/Git_conversion Rob [1] ...or so I've read on Slashdot, so it must be true: http://hardware.slashdot.org/story/11/09/21/0531246/wikimedia-foundation-releases-their-server-config ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Please merge deployment changes into 1.17wmf1 and 1.18wmf1 now
Hi all, If you are someone who currently merges code into the 1.17wmf1 branch, this mail is for you: We need you to make sure that you merge your code into the 1.18wmf1 branch as well. That also means if you're deploying code and using syncfile to push individual files, you need to push from both the 1.17wmf1 branch and the 1.18wmf1 branch. If you don't, your code may not make it into the 1.18wmf1 branch. Thanks Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Updates for EXIF rotated photos in trunk 1.18
Fantastic...thanks for jumping on this Brion! And Reedy, thanks for pushing this out! So, this code should be live on http://test2.wikipedia.org plstestkthxbai :) Rob On Tue, Sep 20, 2011 at 3:54 PM, Brion Vibber br...@pobox.com wrote: One of the long-awaited minor features that has finally come in during the 1.18 development cycle is resolving old bug 6672, natively supporting photos where EXIF metadata specifies a non-default orientation. This is very common in photos taken with digital cameras; a portrait-mode image may be saved at the low level as a 4x3 framebuffer with an orientation tag stating it must be rotated 90 or 270 degrees (or occasionally an upside-down photo needing to be rotated 180 :) While most photo-editing applications understand this metadata natively and will simply show the image at its natural size, web browsers don't -- and neither did the server-side processing that MediaWiki was doing. Bryan Tong-Minh did most of the earlier work on this a few months ago, which is much to be commended! In response to a couple of issues noticed on test2.wikipedia.org (bug 31024, bug 31048) I've made some tweaks to make it work more consistently. From now on, the image's width and height properties will be set to the logical size, taking the orientation into account. This makes thumbnail resizing more consistent (sometimes we got wrong sizes because the requested bounding box would end up applied to the original physical size) and also makes the orientation change transparent to API clients, such as other wikis using ForeignAPIRepo (aka 'InstantCommons'). Clients won't need to know or care if an image uses EXIF rotation; it'll simply always be presented and reported at its natural logical size. I've applied these fixes to trunk, REL1_18, and 1.18wmf1 -- phpunit test cases have been updated, but it's still conceivable something could be wrong so please do test.
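The logical-size rule Brion describes can be sketched as follows (a simplified illustration, not the actual MediaWiki code; in EXIF, orientation values 5 through 8 all involve a 90- or 270-degree rotation):

```python
# Simplified sketch of the logical-size rule: EXIF orientation values 5-8
# involve a 90/270-degree rotation, so the displayed width and height swap.
def logical_size(width, height, orientation):
    """Map a physical framebuffer size to the size the image is shown at."""
    if orientation in (5, 6, 7, 8):  # rotated 90 or 270 degrees
        return height, width
    return width, height  # normal, mirrored, or rotated 180 (values 1-4)
```

For example, a camera that stores a portrait shot as a 4000x3000 framebuffer with orientation 6 would report a logical size of 3000x4000, which is what thumbnail bounding boxes and API clients should see.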
Note that any exif-rotated photos you've uploaded under the old code will need to be purged to update their file metadata and clear out any old thumbs, or they may continue to render wrong. https://bugzilla.wikimedia.org/show_bug.cgi?id=6672 https://bugzilla.wikimedia.org/show_bug.cgi?id=31024 https://bugzilla.wikimedia.org/show_bug.cgi?id=31048 -- brion ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Taking suggestions for a template language syntax for our skin system
On Mon, Sep 12, 2011 at 2:48 AM, Daniel Friesen li...@nadir-seen-fire.com wrote: On 11-09-11 04:56 PM, Rob Lanphier wrote: I'm curious what problem you're trying to solve. It sounds like you're trying to get people who are currently working on Wordpress skins or Drupal skins to work on MediaWiki skins instead, but to what end? The going back to x part tacked on the end was more of a joke. I'm trying to solve the flaws in our skin system especially the ones where our skin system is deficient compared to other theme engines like WordPress' and Drupal's. But also trying to avoid repeating some of the same flaws in those skin systems. And taking our differences into consideration. I guess I'm asking for the most important user stories, since trying to be all things to all people is almost certainly going to fail. Is the most important user (to you) an individual English Wikipedia account holder who wants to switch away from the default theme, or a MediaWiki administrator who wants to align the look-and-feel of their Wordpress blog with their MediaWiki install? Do you imagine that the person who creates the next great MediaWiki theme is someone who is a CSS expert, or more of a programmer? This thread spurred a small conversation in the office over lunch today. I won't speak for the group, but the conclusion I reached from that conversation was that a more achievable and immediately solvable problem to tackle would be to improve the ability to customize skins purely from CSS. That is, create a new skin that makes it really easy to move blocks of content (navigation elements, etc) around and customize their appearance. I would love to see some alignment of other theming systems with our own, so that we can potentially attract theme experts from other projects to port their existing themes to MediaWiki. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Taking suggestions for a template language syntax for our skin system
On Thu, Sep 8, 2011 at 11:04 AM, Daniel Friesen li...@nadir-seen-fire.com wrote: Aye, there's absolutely no way HAML is going to fly with the majority of people building skins. Instead of the "Why the hell do I have to insert all this junk just to make my skin work right? I'm going back to WordPress." kind of issue we had before I cleaned up the skin system, we'll end up with a "What the hell is this syntax? I'm going back to Drupal." kind of issue. I'm curious what problem you're trying to solve. It sounds like you're trying to get people who are currently working on Wordpress skins or Drupal skins to work on MediaWiki skins instead, but to what end? Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Can we make the life of extension developers easier?
On Tue, Sep 6, 2011 at 2:05 PM, Brion Vibber br...@pobox.com wrote: Generally speaking, we should only throw warnings or remove old interfaces that are actively broken (do not work correctly) or can no longer be sanely maintained -- removing a deprecated interface is a fairly extreme step and should never be done just to make things look cleaner. There may be little or even *negative* benefit to going around and changing all the calling code to use the new interface. I've seen *lots* of regressions in commits that swap something to a new interface without taking into account how the interface actually changed, and they're harder to track down because the changes are often buried in generic code clean-up. +1000. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Adding MD5 / SHA1 column to revision table (discussing r94289)
On Fri, Sep 2, 2011 at 5:47 PM, Daniel Friesen li...@nadir-seen-fire.com wrote: On 11-09-02 05:20 PM, Asher Feldman wrote: When using for analysis, will we wish the new columns had partial indexes (first 6 characters?) Bug 2939 is one relevant bug to this, it could probably use an index. [1] https://bugzilla.wikimedia.org/show_bug.cgi?id=2939 I generally suspect that a standard index is going to be a waste for the most urgent uses of this. It will rarely be interesting to search for common hashes between articles. The far more common case will be to search for duplicate hashes within the history of a single article. My understanding is that having a normal index on a table the size of our revision table will be far too expensive for db writes. This is the first I've heard of partial indexes (/me researches). I don't know if a partial index is going to be cheap enough that we can use it, and useful enough that we'd want to. Would this be a faster query in a world with a partial index on the first six characters?
SELECT rev_id FROM revision WHERE rev_page=12345 AND rev_sha1='4cdbd80be15fcfff139fb8a95f2ca359520939ee'
...or would we have to run a query like this to get the benefit of the index?
SELECT rev_id FROM revision WHERE rev_page=12345 AND rev_sha1 LIKE '4cdbd8%'
...and would either of these queries be considered too expensive to run without a partial index? How about with a partial index? Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
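For the curious, the prefix-index idea in the mail above can be tinkered with using SQLite's expression indexes (a toy sketch with a made-up three-column schema; MySQL would instead declare a prefix index such as rev_sha1(6) on the real revision table):

```python
import hashlib
import sqlite3

# Toy sketch of the prefix-index idea using SQLite expression indexes.
# The schema here is a simplified stand-in for the real revision table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revision "
             "(rev_id INTEGER PRIMARY KEY, rev_page INTEGER, rev_sha1 TEXT)")
# Index only the first six characters of the hash, scoped to the page.
conn.execute("CREATE INDEX rev_page_sha1 "
             "ON revision (rev_page, substr(rev_sha1, 1, 6))")

def sha1(text):
    return hashlib.sha1(text.encode("utf-8")).hexdigest()

rows = [(1, 12345, sha1("first draft")),
        (2, 12345, sha1("vandalism")),
        (3, 12345, sha1("first draft"))]  # a revert: same hash as rev 1
conn.executemany("INSERT INTO revision VALUES (?, ?, ?)", rows)

# To use the index, the query filters on the same prefix expression, then
# re-checks the full hash to weed out prefix collisions.
target = sha1("first draft")
dupes = conn.execute(
    "SELECT rev_id FROM revision "
    "WHERE rev_page = ? AND substr(rev_sha1, 1, 6) = ? AND rev_sha1 = ? "
    "ORDER BY rev_id",
    (12345, target[:6], target)).fetchall()
print([r[0] for r in dupes])  # -> [1, 3]
```

This mirrors the second query in the mail: the prefix comparison is what the index can serve, and the full-hash check afterward makes the result exact.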
[Wikitech-l] Aaron Schulz now full-time at Wikimedia Foundation
Hi everyone, I’m extremely pleased to welcome Aaron Schulz to Wikimedia Foundation as a full-time developer in Platform Engineering. Aaron is a long-time MediaWiki developer, starting as a volunteer in 2007. He quickly proved adept at working with the FlaggedRevs extension, so WMF hired him as a student contractor to take on its development, and he is still the primary developer of the FlaggedRevs extension in use on many of our larger wikis today. More recently, he volunteered a lot of the initial work on IPv6 compliance, and has made many small fixes and improvements in the MediaWiki core. Aaron has been so active in MediaWiki in the past few years that 5,527 of the 95,732 revisions in our main source code repository have his name on them.[1] This summer, Aaron continued his student contractor work on our software deployment infrastructure, working on the Heterogeneous Deployment project, which we plan to use to roll out MediaWiki 1.18 in a controlled manner in September. He's one of many developers slogging away at reviewing code commits in anticipation of 1.18, and he will generally be working on operational and performance-oriented projects. Welcome, Aaron! Rob [1] See http://www.mediawiki.org/wiki/Special:Code/MediaWiki/author/aaron ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] divergence in commit review backlog statistics for trunk
On Wed, Aug 31, 2011 at 1:41 PM, Ashar Voultoiz hashar+...@free.fr wrote: On 31/08/11 21:48, Sumana Harihareswara wrote: The 1.18 revision report https://secure.wikimedia.org/wikipedia/mediawiki/wiki/MediaWiki_roadmap/1.18/Revision_report We should really integrate that revision report into the CR extension. Where is the code used to generate the report? Latest version is here: https://gitorious.org/~hexmode/mwcrstats/hexmodes-mwcrstats/commits/master Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] 1.18 code review status
Hi everyone, Thank you to everyone involved for getting the review queue down as low as it is. As it stands, we have 82 new revisions to review, and 57 fixmes: http://www.mediawiki.org/wiki/MediaWiki_roadmap/1.18/Revision_report Back on August 18, we had 171 new revisions to review and 59 fixmes. If you start there and plot linearly to release, that means reviewing 7-8 new revisions per day, and fixing an average of 2 fixmes per day, to just barely be done on September 16. Needless to say, we're doing great on the new revisions, but the fixmes are a bit of a problem. Even accounting for the fact that fixmes are being added as a result of the rapid rate of new revision fixing, we're still not fixing the fixmes fast enough. Between August 18 and now, we've netted 2 revisions, by fixing 9 fixmes and adding 7 more. That's only fixing the fixmes at a rate of 1.5 a day, which won't get us there even if we don't add any more fixmes. Given the fact that reviewing 89 revisions yielded 7 new fixmes, it's reasonable to expect that 5-10% of the remaining new revisions will become new fixmes, which will add 4-8 new fixmes before we're done. So, please take a look at the fixme list (especially if you are one of the committers involved) and do what you can to reduce our load. Thanks! Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] State of page view stats
On Fri, Aug 12, 2011 at 12:21 PM, MZMcBride z...@mzmcbride.com wrote: Domas Mituzas wrote: Building a service where data would be shown on every article is relatively different task from just analytical workload support. For now, building query-able service has been on my todo list, but there were too many initiatives around that suggested that someone else will do that ;-) Yes, beyond Henrik's site, there really isn't much. It would probably help if Wikimedia stopped engaging in so much cookie-licking. That was part of the purpose of this thread: to clarify what Wikimedia is actually planning to invest in this endeavor. If the question is "am I wasting my time if I work on this?", the answer is almost certainly not, so please embark. It will almost certainly be valuable no matter what you do. Now, the caveat on that is this: if you ask "will I feel like I've wasted my time?", the answer is more ambiguous, because I don't know what you expect. Even a proof of concept is valuable, but you probably don't want to write a mere proof of concept. So, if you want to increase the odds that your work will be more than a proof of concept, then there's more overhead. Here's an extremely likely version of the future should you decide to do something here and you're successful in building something: you'll do something that gets a following. WMF hires a couple of engineers, and starts working on the system. The two systems are complementary, and both end up having their own followings for different reasons. While it's likely that some future WMF system will eventually be capable of this, getting granular per-page statistics is something that hasn't been at the top of the priority list. In one "wasted time" scenario, we figure out that it wouldn't be *that* hard to do the same thing with the data we have, and we figure out how to provide an alternative. However, I suspect that day probably gets postponed because there would be some other system providing that function.
With any luck, if you build something, it will be in a state that we can actually work together on it at some point after we get the people we plan to hire hired. The more review you get from other people who understand the Wikimedia cluster, the more likely that case is. Here's an extremely likely version of the future should you decide not to do something here: we won't build something like what you have in mind. So, the best way to guarantee what you want will exist is to build it. Re: cookie licking. That's a side-effect of planning in the open. If we wait until we're sure a project is going to be successfully completed before we talk about it, we either won't be as open as we should be, or not taking the risks we should be, or both. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] State of page view stats
On Thu, Aug 11, 2011 at 3:12 PM, MZMcBride z...@mzmcbride.com wrote: I've been asked a few times recently about doing reports of the most-viewed pages per month/per day/per year/etc. A few years after Domas first started publishing this information in raw form, the current situation seems rather bleak. Henrik has a visualization tool with a very simple JSON API behind it (http://stats.grok.se), but other than that, I don't know of any efforts to put this data into a database. [...] A lot of people were waiting on Wikimedia's Open Web Analytics work to come to fruition, but it seems that has been indefinitely put on hold. (Is that right?) That's correct. I owe everyone a longer writeup of our change in direction on that project which I have the raw notes for. The short answer is that we've been having a tough time hiring the people we'd have do this work. Here are the two job descriptions: http://wikimediafoundation.org/wiki/Job_openings/Software_Developer_Backend http://wikimediafoundation.org/wiki/Job_openings/Systems_Engineer_-_Data_Analytics Please help us recruit for these roles (and apply if you believe you are a fit)! Thanks! Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] State of 1.18 code review/deployment
On Tue, Aug 2, 2011 at 11:55 PM, Roan Kattouw roan.katt...@gmail.com wrote: On Wed, Aug 3, 2011 at 8:47 AM, Brion Vibber br...@wikimedia.org wrote: Sounds good -- can we get a firm commitment that this is our schedule? I think RobLa wants the reverse: a firm commitment from *us* that this is our schedule. It looks good to me too. Yup, that's basically the idea. What I outlined is conservative in some respects (I suspect many people probably still think we could buckle down and get something out by end of August), but I haven't exactly exhaustively planned for all contingencies. We've got a lot of unaccounted-for fixmes, and there are probably some blocking bugs lurking in the database. Also, there's not really a plan for formal testing, and not a lot of room in that timeline to make it happen. In short, there is much that could go wrong with this release, but I'm guessing (rather blindly) there's just enough time to make it work if we're committed to making this happen. I'm willing to do my part. Is everyone else willing to do theirs? Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] State of 1.18 code review/deployment
Hi everyone, Here's where we are with the 1.18 code review and deployment.

Short version:
* Het Deploy almost done
* 482 revs for code review. September 16 completion?
* Question: do we have to wait until code review is done before deploying to test2?
* Question: could Roan, Tim, Brion and others in Haifa get together and decide a reasonable target date for deployment? (probably late September sometime)

Long version: With code review, many WMF employees have been traveling to conferences (first OSCON, then Wikimania), so that's put a big hole in our capacity, which shows on the graph: http://toolserver.org/~robla/crstats/crstats.118all.html I've run some very rough numbers over the past month to get a sense of how fast we're reviewing. Here are some different ways of looking at it:
* Worst 7-day performance over the past month: 30 revs/week (4/day)
* Best 7-day performance: 320 revs/week (46/day)
* Median 7-day performance from 6/29 to today: 114 revs/week (15/day)
* Past 30 days: 367 revs (12/day)

As of midnight UTC on August 2, we had 482 revisions to review. Assuming we stay on a slow pace through Wikimania (4/day), then pick up to 12/day starting August 8, that means we'll get through the queue around September 16. If we do a little better (15/day), that's September 8. If we stay at 4/day, it gets grim (December 1). And, just for completeness, if we miraculously pick up to 46/day, that's August 19. Friday, September 16 seems reasonably aggressive while being realistic for a finish date on that activity. Note that doesn't include revisions marked fixme. For that list, you can look here: http://www.mediawiki.org/wiki/MediaWiki_roadmap/1.18/Revision_report#Fixmes

On the deployment side, Aaron Schulz has taken over work on Heterogeneous Deploy: http://www.mediawiki.org/wiki/Heterogeneous_deployment The big development tasks (that we know of) are done.
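[Editor's note: the queue projections earlier in this message are a simple day-by-day drain of the backlog at two rates. A minimal sketch of that arithmetic (the function and its name are ours, but the figures are the ones quoted above):]

```python
from datetime import date, timedelta

def projected_finish(queue, start, slow_rate, slow_until, fast_rate):
    """Walk forward one day at a time, draining the review queue at
    slow_rate before slow_until and at fast_rate from then on."""
    current, remaining = start, queue
    while remaining > 0:
        rate = slow_rate if current < slow_until else fast_rate
        remaining -= rate
        current += timedelta(days=1)
    return current

# 482 revisions on Aug 2; 4/day through Wikimania, then 12/day from Aug 8.
finish = projected_finish(482, date(2011, 8, 2), 4, date(2011, 8, 8), 12)
# -> 2011-09-16, matching the September 16 estimate above.

# At the median pace of 15/day after Wikimania instead:
finish_median = projected_finish(482, date(2011, 8, 2), 4, date(2011, 8, 8), 15)
# -> 2011-09-08, matching the September 8 estimate.
```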
The work Aaron has been doing is deploying bits and pieces at a time, then fixing the bits that break, then deploying more. We had hoped we could get a test instance of 1.18 deployed before Wikimania, but we hit a few snags in finishing up. There are a few bits of deployment lore that Aaron still needs to learn about in order to finish things off, but assuming he's able to do that and get knowledgeable review of some of the riskier code, he should be able to have the Het Deploy infrastructure in place by the end of next week. The cool thing about Het Deploy is that, once completed, we should be able to deploy MediaWiki 1.18 to a much more production-like test wiki than we've been able to use in the past (e.g. prototype.wikimedia.org), and have it in test for much longer than our current test.wikimedia.org site. We'll be able to deploy to both test.wikimedia.org and/or test2.wikimedia.org prior to releasing to the rest of the cluster. Even better, we'll be able to work with a few pioneering wikis out there to deploy 1.18 to those sites early, then gradually roll it out to a wider audience after we've worked out the kinks with the initial deploys. There is still some reasonable hesitation in deploying 1.18 even to test or test2 prior to fully reviewing the code. I haven't talked to all of the major production cluster guardians about this, but there is a school of thought that suggests that perhaps we can get away with a lighter skimming prior to deploying everywhere. If it is safe enough to get something out to test2 prior to full review being done, I'd love to do it. Question for those with deployment access: what are your thoughts? I spoke with Roan before he left San Francisco, and (I think) he agreed to wrangle folks in Haifa to decide on a date, in part based on the data that I just presented. So, I'm hoping he'll get a chance to do that while he's there. 
We should plan to have 1.18 running reasonably well on test2 for at least 3-4 days before we start deploying to actual production wikis, and we should hold off on deploying to the big wikis (enwiki, dewiki, etc.) for another 3-4 days. So, we should probably build at least a week and probably two into the schedule for that. At the risk of biasing everyone, here's my suggestion:
* Monday, September 19: deploy to test2
* Thursday, September 22: deploy to a few lower-traffic pilot wikis
* Monday, September 26: deploy everywhere

That's pretty aggressive given that we're almost certainly going to find bugs during the test2 deployment. I'd feel a lot more confident about a full September 26 rollout if we can get something pushed to test2 sooner than September 19. Thoughts? Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Software version control 101 (Re: Please don't commit broken code)
On Fri, Jul 29, 2011 at 12:30 PM, Jay Ashworth j...@baylink.com wrote: More pointedly, is there a really *good* Software Development Version Management HOWTO anywhere at all? I've looked at the git book, and it just knocked me on my ass from the first chapter. I *almost* understand BZR... but I haven't seen a good 30,000ft explanation of any of them, written for people who are good programmers, but have never worked with one before. I think this might be what you're looking for: http://producingoss.com/en/vc.html ...though the information is a little dated. Then again, our version control system is a little dated, so maybe this is just right (though ignore pretty much everything said about CVS). Regarding MediaWiki-specific information about branching and so on, that's largely covered here: http://www.mediawiki.org/wiki/Commit_access ...though that documentation could stand some improvement. (volunteers?) Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] On refactorings and huge changes
On Tue, Jul 26, 2011 at 4:49 PM, Chad innocentkil...@gmail.com wrote: On Tue, Jul 26, 2011 at 4:42 PM, Platonides platoni...@gmail.com wrote: If the author considers it ready to merge and there has been no opposition in a week, then merge it. That would solve the rotting problem. I don't think it would be hard to keep a branch up to date for a week (at least without other merges). I disagree here. Silence should not be taken for assent. Perhaps we should formalize the approval process for RfCs. I agree with the rest of your message in that we absolutely need to get more eyes on RfCs...and I'm open to suggestions on how to do so. Well, I think checking into trunk after a week could work once we get caught up with code review on trunk, though email to wikitech-l needs to be a required part of the process. I'd ask that everyone hold off on being this aggressive before we get caught up, since this will only exacerbate the review backlog. As of today, we're closer than we've been for a very long time, with 728 new revisions, and we're trending in the right direction. The RfC process can work if people make an earnest effort to get their RfCs reviewed. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] changing cc lists default assignees, reducing useless bugmail
On Wed, Jul 20, 2011 at 12:09 PM, Chad innocentkil...@gmail.com wrote: Right. Most people who are on CC lists are there because they asked to be added. Same thing with being the default assignee. I'm not too thrilled that this is happening en masse without having the discussion first... Fair point. I'm less concerned about cc lists than default assignees. In fact, I think that if someone has been removed as the default assignee, they should (by default) be put on the cc line if they aren't already. As for extensions: they absolutely should always have a default assignee, and as a matter of policy I refuse to ever add new extensions to BZ unless someone is willing to take the assignee role. I'm not so sure this is the right position. For WMF-maintained extensions, we may have multiple people who would be logical choices. For other extensions, we're not going to be able to stop people from filing bugs once it's in the repository. If we're going to have a gatekeeper function, I'd prefer that we gate things from getting into the repository if there's no one to fix bugs. Once the extension is in our repository, the bugs will get filed, and the result will be that we'll just lose metadata by not letting people enter it. As Mark and Sumana have both expressed, the assignee should mean something. Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Rebranching 1.18
Hi everyone, We plan to rebranch 1.18 from trunk soon (as in, almost immediately). What we've found is that many things in the current branch are harder to get working than they should be, and that trunk seems to be in better shape (in particular, all tests are passing now). That will mean taking on more code review, but it's work we'll have to do one day or another. Apologies to hashar for not making this decision sooner, since he spent a good chunk of the weekend trying to get unit tests fully operational in the 1.18 branch (but thank you for making the effort!) That's not to say that everything that has been checked into trunk over the past couple of months is automatically going to be a feature of 1.18. Anything that's too complicated or risky that's been checked in will be backed out of the branch and saved for 1.19. Chad Horohoe (^demon on IRC) is going to be the one rebranching. Let us know if you have any questions. Thanks! Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] The final stretch for 1.18 code review
Hi everyone, Thanks to the heroic efforts of many folks here, we're down to a manageable number of revisions. See the newly repaired chart here: http://toolserver.org/~robla/crstats/crstats.118all.html Now that we've got a small enough list, I'm reviving the revision report for 1.18, which is a list of all of the revisions remaining for review, broken down by preferred reviewer. That's available here: http://www.mediawiki.org/wiki/MediaWiki_roadmap/1.18/Revision_report This also has fixmes, broken down by the committer. Revisions with the following tags will be excluded from this report:
* nodeploy - won't ever affect deployment (e.g. installer)
* 1.18revert - reverted from the 1.18 branch
* 1.18ok - still needs a full review, but we'll let it slide for this deployment

These tags roughly map onto what we used for 1.17, though not exactly. During that deploy cycle, it was far, far more important to choose our battles to have any hope of getting done soon, whereas now, we know we want to get through everything anyway. So, you should feel a tinge of guilt using these tags. :) These are manually generated (currently by me, but Hexmode took over last time, and I wouldn't be surprised if he does it again). Let me know if there's anything obviously wrong with the list. Otherwise, we're all looking forward to seeing this list shrink to zero! Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Code review training meeting in San Francisco
Hi everyone, Per our earlier list conversations, Wikimedia Foundation employees are going to be spending more time (20% for most engineers) on things like code review, shell requests, and such. As we've discussed what that actually means, it became clear to everyone that many engineers will need training (or at least refreshers) in order to make the most effective use of that time. Additionally, we would benefit from group discussion about what code review is. Since we have a few remote developers visiting us in San Francisco, we're setting up the first training session for this. This will be optimized for local participation in San Francisco, but we're going to at least try to get video for purposes of expanding documentation, and perhaps as educational videos for developers who can't be here but prefer to learn that way. This meeting is scheduled for July 19, at 2:30pm PDT. Here are some of the areas we're planning to have short talks on:
* The basics
* Stability and performance
* UI considerations
* Security
* Unit testing
* General code review philosophy

This won't be the last time we'll be presenting these things. I imagine we'll be refining these talks for Hackathons and other events. If you aren't an employee of Wikimedia Foundation, but you'll be in SF and you'd like to come, let me know and I'll see if we can accommodate you. If you have ideas for areas that we should cover that aren't listed above, please let us know. Thanks! Rob ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l