[Wikitech-l] ImageTweaks demo now on display at Wikimedia Labs
Hi Multimedia enthusiasts, Commonists, and Wikitechers,

The Multimedia team has been hard at work building a new extension for editing images on-wiki, and we believe we now have a workable demo running on Labs! You can find it on our Multimedia Alpha Wiki [0], where there are also instructions for testing. Note that we have a list of known bugs and failings on that wiki, and we are working on getting those fixed before we push the extension into any kind of deployment - our next steps will likely be to put it on test2wiki, then to push a BetaFeature to Commons if all goes well. We will keep you updated with the status of the project as we progress.

If you find more bugs, or have concerns about this extension, you can share them on the Village Pump [1]. You can also file a Phabricator task against ImageTweaks [2] if you prefer to be in more direct contact with the team about a technical issue.

Thanks for helping us test new stuff, and I look forward to getting this great tool out to you soon!

[0] http://multimedia-alpha.wmflabs.org/wiki/index.php/Main_Page
[1] https://commons.wikimedia.org/wiki/Commons:Village_pump#ImageTweaks_extension_now_on_display_at_Labs
[2] https://phabricator.wikimedia.org/maniphest/task/create/?projects=ImageTweaks

-- Mark Holmquist Lead Engineer Multimedia Team Wikimedia Foundation http://marktraceur.info

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Image editor prototype
Hi, all!

I'm writing to share a prototype image editor that our very own Prateek (prtksxna) has been working on. We're hoping to write an extension around this and provide it on-wiki as a replacement for several bots and off-wiki tools. It's only an experiment, but honestly, our upload pipeline lacks an editing tool, which is *so* 2003. I'm looking towards getting this released on Commons, as a BetaFeature™, within the next few months.

Feedback can be shared here, on GitHub in the form of issues, on IRC in #wikimedia-multimedia, or via private e-mail to myself or Prateek, if you prefer.

The code is here: https://github.com/prtksxna/ImageEditor
You can try a demo here: http://prtksxna.github.io/ImageEditor/
And finally, there's a sneak peek at the API documentation here: http://prtksxna.github.io/ImageEditor/docs/index.html

Thanks for your eyeballs and time!

-- Mark Holmquist Lead Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info
Re: [Wikitech-l] UploadWizard long term problems
On Tue, Sep 15, 2015 at 04:49:49PM +0200, Andre Klapper wrote:
> On Tue, 2015-09-15 at 08:10 -0500, Mark Holmquist wrote:
> Are these questions listed on some UploadWizard wikipage/section?
> https://commons.wikimedia.org/wiki/Commons:Upload_Wizard_feedback says
> "To resolve issues, it helps us to have exact steps to reproduce" so
> I'd love that sentence to link to such "basic" debugging questions.

I don't know if there's a "common questions" link anywhere. I could write it. I guess I just did. I honestly don't think that page is very useful; it gets a lot of noise and not much signal, which is why I don't usually look at it. We could change the config to point at Phabricator, maybe?

> Does also asking users to add "?debug=true" to the URL and to try again
> make sense in the context of debugging UploadWizard issues, or not?

It usually does, if only to get meaningful error messages and line numbers for a bug report.

-- Mark Holmquist Lead Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] UploadWizard long term problems
On Tue, Sep 15, 2015 at 02:40:17PM +0200, Purodha Blissenbach wrote:
> jst a few minues ago, I tried to use the upload wizard of commons.
> It stalled in the midst of everything and it did not upload anything.

Purodha,

This is the sort of bug report that we, as people who try to maintain Upload Wizard, can do exactly nothing with. What file(s) were you using? How long did you wait between opening the page and uploading the file(s)? What license(s) are you uploading them under? What file format are they? What browser were you using? What OS? Do you get any error messages in the browser console? Does any other process on your system seem slow or halted?

These are all basic debugging questions that, frankly, on a technical mailing list I feel I shouldn't need to ask. Please either reply in private or file a Phabricator task describing your issue in detail.

-- Mark Holmquist Lead Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] build.wikimedia.org
On Mon, Aug 25, 2014 at 01:54:28PM -0700, Legoktm wrote: This is now at [[dev.wikimedia.org]]? That sounds like a much better name than build, which I thought was going to be some CI automated builds server from your email title :)

But also sounds like it's the absolute end of the line for development work, cf. mediawiki.org. I'll say from personal experience that when I find a website that links to a developers resource, and that resource is all about APIs and data, I ragequit the browser tab because the software is clearly not modifiable, they just want people to play with the toys they've allowed. This is pretty well illustrated by the list of external examples.

Is there any discussion of the name of the portal?

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] Bikeshedding a good name for the api.php API
On Thu, Aug 07, 2014 at 11:23:26AM +0100, Brion Vibber wrote:
> Well if we kill off XML and other funky formats we can call it the JSON API :)

Actually, we should call it the YAML API, to be more complete. https://en.wikipedia.org/w/api.php?format=yamlfm

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
[Wikitech-l] Process change: New contributors getting editbugs on Bugzilla
Hi,

For the nth time, in #mediawiki, I've had someone ask how to be able to mark a bug as resolved, or claim it, or mark it as a duplicate of another bug. I conceptually know that this means getting editbugs permissions but it happens so infrequently that I never know where to go. Usually what happens is this:

1. I wrack my brain trying to remember the process for about 30 seconds
2. Failing, I try to ping Andre, Quim, and Sumana (none of whom are in the channel, sadly)
3. I search with duckduckgo and pull up nothing of any use
4. I search MediaWiki.org and find outdated status reports about greasemonkey scripts but nothing useful
5. I go to the developer hub pages and look at the welcome-to-the-community process but again find nothing describing this process

Solution: We've made every editbugs user able to add editbugs to an account. I've documented the process here: https://www.mediawiki.org/wiki/Bugzilla#Why_can.27t_I_claim_a_bug_or_mark_it_resolved.3F

Thanks to Chad for the quick resolution on this; hopefully this will be a positive change overall.

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] Deployments of new, radically different default features
On Sun, May 18, 2014 at 08:22:13PM +0200, Rainer Rillke wrote:
> Yeah, that would be cool: I am tool x, I do y and you can disable me
> pressing button z. Let button z be a prominent element of the UI for
> the time of testing at large scale.

For the record, we did the first part of this - there was a link to the preferences page that would let you disable the feature at the bottom of the metadata panel from the day we pushed to the 3rd pilot sites, I think.

As for prominent, I don't think that's a good idea because it would disrupt the realism of the feature. The purpose of such a feature would be temporary, and when it got removed from the UI, it would cause a (small) jolt to users. We already have a bunch of weird one-time things that are cluttering the toolbar, we didn't need another one disrupting the flow of our product. If you're a power user, you know where Special:Preferences is, and we made sure to help you out as much as possible if you don't. I don't think we needed to do anything to make the preference more discoverable, it would have been a waste of our time to do so given the other things we have on our plates.

I can't speak to the community side of things - I was under the impression that very few people had actually had that much trouble with the docs, but perhaps the issues with them should be brought up *on their talk page* rather than launching some weird campaign over a feature that, frankly, could not *possibly* have been launched in a more cautious way.

Cheers,

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] Abandoning -1 code reviews automatically?
On Wed, Apr 09, 2014 at 08:08:01AM -0700, Quim Gil wrote:
> I just learned that OpenStack has a policy to abandon automatically
> reviews sitting in -1 during more than one week:
> https://bugzilla.wikimedia.org/show_bug.cgi?id=63533#c1
> Maybe one week is too tight for the reality of our project, but what
> about 2-4 weeks?

As probably one of the worst offenders, I'm not a fan. I've written and subsequently left patchsets to sit for many months - possibly over a year in some cases - but I don't think any of them should be abandoned. I see an abandoned patchset as an admission by the person who wrote the patch that they never should have written it, and that the feature or bugfix was so wrong it could not even be fixed up. Most of the patches I've left sitting for a while aren't that bad.

Even if you don't subscribe to that view, leaving -1'd patches around isn't so terrible because they're a start along the right path. Rather than having someone else start on the same project in a totally new patch, because they didn't see the open change for the same purpose, I'd like them to take up the patch I started that is slightly flawed and finish it up. Abandoning patches will discourage this sort of behaviour. Even without auto-abandoning patches I have seen people work on things that I have already started on...

If we do wind up doing this, I foresee a lot more sleepless nights in my near future trying to catch up with all of the code review that has accumulated over the past 2 years, during which I often got a few days or weeks to hack on a project and then couldn't come back to it for a while. I really like sleep, guys. Let's leave things as they are.

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] Maintaining or replacing Gerrit
On Mon, Mar 17, 2014 at 12:52:52PM -0400, MZMcBride wrote:
> The Wikimedia Foundation currently has staff or contractors dedicated
> to maintaining human resources software

...which?

(I ask because I suspect MZ is referring to the orgchart project I have maybe one day every other month to maintain, and nobody else has touched, so I'm curious)

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] Adventures in creating new repos / jenkins jobs
On Fri, Feb 28, 2014 at 04:58:51PM +0100, Antoine Musso wrote:
> The proper URL is https://integration.wikimedia.org/ci/ ; the
> integration.mediawiki.org one redirects to / (though it does not
> discard the query string, which is a bug). I have updated the wiki
> page; the jenkins_jobs.ini file should have:
>
> [jenkins]
> url=https://integration.wikimedia.org/ci/
> user=...
> password=...  # actually a user API token

I have had this the entire time we were trying to create the jobs - it did not help, I still saw the issue.

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] Adventures in creating new repos / jenkins jobs
On Fri, Feb 28, 2014 at 06:43:04PM +0100, Antoine Musso wrote:
> Got any trace to share?

marktraceur@midvalley-the-hornfreak:~/projects/wikimedia/integration/jenkins-job-builder$ jenkins-jobs --conf etc/jenkins_jobs.ini update config/ 'mwext-MultimediaViewer-do-something'
INFO:root:Updating jobs in config/ (['mwext-MultimediaViewer-do-something'])
INFO:jenkins_jobs.builder:Creating jenkins job mwext-MultimediaViewer-do-something
https://integration.wikimedia.org/ci/createItem?name=mwext-MultimediaViewer-do-something
Traceback (most recent call last):
  File "/usr/local/bin/jenkins-jobs", line 9, in <module>
    load_entry_point('jenkins-job-builder==0.0.584.07fa712', 'console_scripts', 'jenkins-jobs')()
  File "/home/marktraceur/projects/wikimedia/integration/jenkins-job-builder/jenkins_jobs/cmd.py", line 127, in main
    jobs = builder.update_job(options.path, options.names)
  File "/home/marktraceur/projects/wikimedia/integration/jenkins-job-builder/jenkins_jobs/builder.py", line 581, in update_job
    self.jenkins.update_job(job.name, job.output())
  File "/home/marktraceur/projects/wikimedia/integration/jenkins-job-builder/jenkins_jobs/builder.py", line 476, in update_job
    self.jenkins.create_job(job_name, xml)
  File "/usr/local/lib/python2.7/dist-packages/python_jenkins-0.2.1-py2.7.egg/jenkins/__init__.py", line 400, in create_job
    raise JenkinsException('create[%s] failed' % (name))
jenkins.JenkinsException: create[mwext-MultimediaViewer-do-something] failed
marktraceur@midvalley-the-hornfreak:~/projects/wikimedia/integration/jenkins-job-builder$

I just made a dummy job - commit here: https://gerrit.wikimedia.org/r/116123 - obviously nothing special, but the issue is in the HTTP request code anyway.
Cheers, -- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] Adventures in creating new repos / jenkins jobs
On Thu, Feb 27, 2014 at 04:28:52PM -0800, Matthew Walker wrote:
> Hashar told me I then needed to follow the instructions on [1] to push
> the jobs to jenkins. Running the script myself was only pain; it kept
> erroring out while trying to create the job. Marktraceur managed to
> create the jobs after much kicking down the door aka running the script
> multiple times.

Hilariously, the jobs still don't run. I don't see the code getting checked out on Gallium, and the jobs are all marked LOST with no logs. I'm hopeful that this is an issue related to the repository still being empty, but this may be wishful thinking. https://gerrit.wikimedia.org/r/116008

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] February '14 appreciation thread
On Wed, Feb 12, 2014 at 08:45:49PM +0100, Tomasz W. Kozlowski wrote:
> So, if you'd like to thank someone, now is a good time and opportunity
> to do so! Following Sumana's example, the rules are: be kind, thank
> someone, and say why you're thanking them.

I'd like to thank Theopolisme, who was/is working with us on the Multimedia team (and across other engineering projects) and has been a tremendous help on a lot of bugs. Rock on, Theo, rock on.

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] How to retrieve the page, execute some time expensive operation and edit the page ONLY if it wasn't changed meanwhile
On Wed, Jan 22, 2014 at 12:41:30PM +0100, Petr Bena wrote:
> 1) Retrieve the page - page must not be changed starts NOW
> 2) Do something what requires user input, possibly may last few minutes
> 3) Save the page ONLY if it wasn't changed, if it was, go back to step 1

The way we should *cough* do this is If-Unmodified-Since. http://restpatterns.org/HTTP_Headers/If-Unmodified-Since It should be passed a value that matches a Last-Modified value that we get from the previous "fetch page contents" API call.

But I'm not convinced we could do this with our current API setup... I think there are too many overlaps between things. Last-Modified may apply, in this case, to the text of the page you're fetching, or, since you can batch requests, it may apply to the image information you're fetching, or the post count of the Flow board you're fetching, which may cock up your estimate. And how the API is supposed to know what part of the API call is constrained by the If-Unmodified-Since header is beyond me.

Then again, you could apply it to all of them, and just pass the date of your last request as the value. I'd be interested to see this happen, but I suspect it would take a lot of digging. :)

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
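To make Petr's retrieve/modify/conditional-save loop concrete, here is a toy sketch of If-Unmodified-Since semantics against an in-memory store. This is emphatically not MediaWiki's API (which, as discussed above, has no clean support for this); the `PageStore` class, the `PreconditionFailed` exception (the analogue of HTTP 412), and the integer logical clock standing in for Last-Modified timestamps are all invented for illustration:

```python
class PreconditionFailed(Exception):
    """Analogue of HTTP 412: the page changed after the caller's timestamp."""


class PageStore:
    """Toy in-memory page store showing If-Unmodified-Since semantics.

    A monotonically increasing integer clock stands in for wall-clock
    Last-Modified values, to keep the example deterministic.
    """

    def __init__(self):
        self._pages = {}   # title -> (text, last_modified)
        self._clock = 0

    def fetch(self, title):
        """Return (text, last_modified); the client keeps last_modified."""
        return self._pages[title]

    def save(self, title, text, unmodified_since=None):
        """Save text, but refuse if the page changed after unmodified_since."""
        if unmodified_since is not None and title in self._pages:
            _, last_modified = self._pages[title]
            if last_modified > unmodified_since:
                raise PreconditionFailed(title)
        self._clock += 1
        self._pages[title] = (text, self._clock)
        return self._clock
```

The client-side loop is then exactly Petr's steps: fetch, let the user edit for as long as they like, attempt a conditional save, and on `PreconditionFailed` go back to step 1 with a fresh fetch.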
Re: [Wikitech-l] Estensione per l'embedding di video
On Mon, Dec 09, 2013 at 06:09:53PM +0100, Cristian Consonni wrote:
> Ciao gvf,

Hi, Cristian.

First, as this is a primarily English list, many of the people who might be able to help you may be unable to this time, because of your language choice. Luckily we have the Internet [1].

It's unclear, still, what you are trying to do. Are you trying to *upload* a video to a wiki, and then use that video in a page? Are you trying to *embed* a video from another site, like YouTube, in a wiki page? Are you doing something else entirely? Also, where are you doing this? On a Wikimedia wiki? On Wikia? On your own third-party site?

I hope we can help you soon.

[1] http://translate.google.com/translate?sl=it&tl=en&js=n&prev=_t&hl=en&ie=UTF-8&u=http%3A%2F%2Flists.wikimedia.org%2Fpipermail%2Fwikitech-l%2F2013-December%2F073461.html

(don't worry, I will privately message Cristian with a translation of my message into Italian shortly after sending)

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] Facebook Open Academy
On Wed, Nov 13, 2013 at 07:21:01PM -0500, Tyler Romeo wrote:
> MediaWiki participates in a number of student competitions and programs
> as an open source mentor (such as GSoC, Code-In, etc.). Today I ran into
> another one: Facebook's Open Academy Program.
> https://www.facebook.com/OpenAcademyProgram

Where is this actually being organized? The only thing I see is a page and a half at that URL, a blog post from Facebook engineering, and one email address. The lack of buzz around other official fora makes me worry that the organizing software will be Facebook itself.

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] Code-in: Lua templates
On Sun, Oct 20, 2013 at 10:53:48AM -0700, Quim Gil wrote:
> Liangent and Micru have proposed to use Code-in to get a bunch of
> wikitext templates rewritten in Lua. Let's do it!

I also believe that this is a very good type of task for Google Code-in.

A little warning: where a task can be split into multiple modules, try to make the modules reusable by other modules easily. See my UrlEncoding module [0] for how I've been doing it - AFAIK there are no code style guidelines for this sort of thing, but returning the basic functionality as the function name preceded by an underscore (_) seems pretty simple, and then passing back the in-wikitext functionality as the normal name will make it easy to call from pages. This lets us do things like URL-encode inside of templates, while also letting us URL-encode in Lua modules with relative ease.

I guess what I'm saying is, we should probably come up with some sane guidelines about what we want these modules to look like before sending twenty different people to write twenty different solutions for the same thing (URLEncode, URLEncoding, UrLeNcOdInG, URIEncode, and so on), and on top of that, in twenty different code styles.

That said, yay Lua! :)

[0] https://www.mediawiki.org/wiki/Module:UrlEncoding

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
[Wikitech-l] Gerrit custom dashboards
FYI: On our Gerrit instance, there is a feature called dashboards which has been extremely useful for me in the recent past - you can define per-project lists of patchsets based on various search constraints. You can also, as it turns out, define custom dashboards [0] constructed as URLs.

I've also created Module:Gerrit dashboard [1] which should be helpful for creating your own, and linking to it from wiki pages. See e.g. the Multimedia CR dashboard [2] [3].

Gerrit doesn't support configuring dashboards on the server without tying them to a project, but I (or one of you...) may wind up diving into Gerrit's codebase to augment an already amazing feature very soon...

[0] https://gerrit-review.googlesource.com/Documentation/user-dashboards.html
[1] https://www.mediawiki.org/wiki/Module:Gerrit_dashboard
[2] https://www.mediawiki.org/wiki/Multimedia/CR_Dashboard
[3] https://www.mediawiki.org/wiki/Module:Gerrit_dashboard/Multimedia

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] IRC office hour for Extension:BetaFeatures
On Mon, Sep 30, 2013 at 12:29:15PM -0700, Mark Holmquist wrote:
> Due partly to the recent sparseness of traffic in #wikimedia-office,
> I've scheduled an office hour [0] for the new BetaFeatures extension,
> which the WMF is hoping to use for launching new features for beta
> testing.
>
> [0] https://meta.wikimedia.org/wiki/IRC_office_hours#Upcoming_office_hours

Hey all,

Reminder that this is happening in 15 minutes in #wikimedia-office.

Thanks,

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
[Wikitech-l] IRC office hour for Extension:BetaFeatures
Hi all,

Due partly to the recent sparseness of traffic in #wikimedia-office, I've scheduled an office hour [0] for the new BetaFeatures extension, which the WMF is hoping to use for launching new features for beta testing.

This will primarily be a technical discussion focused on people who may want to use the framework, or people who may want to help work on the extension. Possible topics include:

* How do I use it?
* What features are under development in this framework right now?
* How does it work?
* Can I have feature X?

If you have any questions, or want to hear me wax on about what I've done with the extension thus far, drop on by and have a chat. I'll be in #wikimedia-office on Thursday at 21:00 UTC (14:00 PDT).

[0] https://meta.wikimedia.org/wiki/IRC_office_hours#Upcoming_office_hours

Happy hacking,

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] Conditional resource loading
On Mon, Sep 30, 2013 at 09:27:29PM +, Amelia Ireland wrote:
> I've already used this method with one of the extensions. Is there
> anything on the backend / PHP side so that I can take advantage of
> ResourceLoader script compression? It seems like it should be possible,
> but I don't have a thorough-enough knowledge of the innards of
> MediaWiki, and I can't find anything in the docs on this topic.

As long as you know that the extension is enabled before the page loads or the user takes any actions, you can simply find out where the modules are added to the OutputPage (look/grep for addModules) and wrap it in an appropriate if statement/block.

I don't think ResourceLoader itself has any nice methods to do this for you, but I also don't think it's a particularly useful feature, given the ease with which developers can do it themselves...

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] [Engineering] Etherpad Lite labs instance going down in two weeks - backup time
On Fri, Aug 23, 2013 at 01:02:13PM -0700, Mark Holmquist wrote:
> The day we have all equally hoped for and dreaded is come to pass:
> Etherpad Lite has now replaced Etherpad Classic in production, and the
> labs instance is on its way out. This is my as-wide-as-possible email
> warning to say that everything on the labs instance, as really should
> have been expected, is going to be gone soon. Not immediately - we
> intend to give you two weeks to get your important data off the
> instance and onto the new one at https://etherpad.wikimedia.org/ - but
> you should _absolutely_ be moving things as soon as possible. We will
> also keep a data dump around, in case anything else needs to get pulled
> out of the pads, but I would suggest not relying on that if you don't
> have to. And in the future: If a URL has wmflabs.org in it...don't put
> anything, ANYTHING, important there. The purpose of labs is to let us
> experiment with new technology without having to worry about
> reliability. Thanks so much for your help and understanding in the
> course of this migration. tl;dr: http://etherpad.wmflabs.org is going
> down in 2 weeks, get yer stuff off it.

Good California morning, everyone!

I will be taking down this instance at 18:00 UTC, or 11:00 PST, today. That's in TWO HOURS! If you have any remaining etherpad documents on the labs instance, now is most definitely the time to put them on our shiny new production server at https://etherpad.wikimedia.org. If you don't, they will no longer be accessible through the web interface - you'll need to contact myself or someone else on the project to pull them out of the database.

It's been quite a ride, thanks for coming along, and thanks so much to the ops team for their hard work getting the new instance up.
Ta, -- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] etherpad.wmflabs.org is now DOWN. (was: Etherpad Lite labs instance going down in two weeks - backup time)
On Fri, Sep 06, 2013 at 09:02:42AM -0700, Mark Holmquist wrote:
> Good California morning, everyone! I will be taking down this instance
> at 18:00 UTC, or 11:00 PST, today. That's in TWO HOURS! If you have any
> remaining etherpad documents on the labs instance, now is most
> definitely the time to put them on our shiny new production server at
> https://etherpad.wikimedia.org. If you don't, they will no longer be
> accessible through the web interface - you'll need to contact myself or
> someone else on the project to pull them out of the database. It's been
> quite a ride, thanks for coming along, and thanks so much to the ops
> team for their hard work getting the new instance up.

Despite my hatred for timezones and daylight savings time, I've taken this instance down right on time. If you need anything from the database backup that I'm about to run, feel free to ping me.

Also I've left a pretty barebones 404 message on the instance for the stubborn or careless - follow the link to the new instance, please :)

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
Re: [Wikitech-l] BetaFeatures framework, and a minor call for technical input
On Wed, Sep 04, 2013 at 09:34:49AM -0400, Nikolas Everett wrote:
> I worked on an accounting system with similar requirements and we had
> an even more complicated system but one you might want to consider:
> 1. When something happens record the event and how much it changed the
>    value along with a timestamp. In our case we'd just have enable and
>    disable events.
> 2. We ran a job that summarized those events into hourly changes.
> 3. Every day we kept a log of the actual value (at midnight or whatever).
> This let us quickly make all kinds of crazy graphs with super deep
> granularity over short periods of time and less granularity over long
> periods of time. Essentially it was an accountant's version of RRDtool.
> It didn't have problems with getting out of sync because we never had
> more than one process update more than one field. It is probably
> overkill but might serve as a dramatic foil to the simpler ideas.

Thanks a lot, Nik - I'm sure it is overkill, but it sounds like one of the more seamless solutions without being too heavy on performance. Probably our solution will look like "fire events to update the counts, then sync them periodically when some number of events have fired *or* when some amount of time has passed without a sync". I'll set about working on that now; should be a fun day, since I've never worked with our job queue before :)

In other news, the BetaFeatures framework, as well as the start of our MultimediaViewer, is up at http://multimedia-alpha.wmflabs.org - go ahead and experiment!

-- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist
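The hybrid plan described above - bump the count on each toggle event, then resync from the source of truth after N events or T seconds without a sync - can be sketched as follows. This is purely illustrative, not BetaFeatures code: the job queue and SQL recount are replaced by a plain `recount` callable, and the class name and parameters are made up:

```python
import time


class BetaFeatureCounter:
    """Sketch of an incremented count with periodic reconciliation.

    `recount` is a callable returning the true count (standing in for the
    expensive SQL COUNT run by a queued job). The cached count is bumped on
    every toggle and re-read from `recount` after `max_events` toggles, or
    whenever `max_age` seconds have passed since the last sync.
    """

    def __init__(self, recount, max_events=100, max_age=3600.0, clock=None):
        self._recount = recount
        self._max_events = max_events
        self._max_age = max_age
        self._clock = clock or time.time  # injectable for testing
        self.count = recount()
        self._events_since_sync = 0
        self._last_sync = self._clock()

    def toggle(self, enabled):
        """Called when a user enables (True) or disables (False) the feature."""
        self.count += 1 if enabled else -1
        self._events_since_sync += 1
        self._maybe_sync()

    def _maybe_sync(self):
        stale = self._clock() - self._last_sync >= self._max_age
        if self._events_since_sync >= self._max_events or stale:
            self.count = self._recount()  # reconcile any drift
            self._events_since_sync = 0
            self._last_sync = self._clock()
```

The design mirrors the trade-off discussed in the thread: increments keep the displayed number live and cheap, while the periodic recount bounds how far out of sync it can drift.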
Re: [Wikitech-l] Welcoming Sucheta Ghoshal
On Tue, Sep 03, 2013 at 09:59:20AM -0700, Alolita Sharma wrote: Welcome Sucheta! I am excited to have you on the language engineering team! Congratulations, Sucheta! It's been a blast working with you in the past, and I'm ecstatic to have another brilliant person to help deal with the tangled mess that is writing international software :) Plus, having more Free Software advocates around is never a bad thing! -- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] BetaFeatures framework, and a minor call for technical input
Timezone-appropriate greeting, wikitech! I've been working on a new extension, BetaFeatures[0]. A lot of you have heard about it through the grapevine, and for the rest of you, consider this an announcement for the developers. :) The basic idea of the extension is to let features be enabled experimentally on a wiki, on an opt-in basis, instead of just launching them immediately, sometimes hidden behind a checkbox that has no special meaning in the interface. It also has a lot of cool design work on top of it, courtesy of Jared and May of the WMF design team, so thanks very much to them. There are still a few[1] things[2] we have to build out, but overall the extension is looking pretty nice so far. I am of course always soliciting advice about the extension in general, but in particular, we have a request for a feature for the fields that has been giving me a bit of trouble. We want to put a count of users that have each preference enabled on the page, but we don't want to, say, crash the site with long SQL queries. Our theories thus far have been:

* Count all rows (grouped) in user_properties that correspond to properties registered through the BetaFeatures hook. Potentially a lot of rows, but we have at least decided to use an IN query, as opposed to LIKE, which would have been an outright disaster. Obviously: Caching. Caching more would lead to more of the below issues, though.

* Fire off a job, every once in a while, to update the counts in a table that the extension registers. Downsides: Less granular, sort of fakey (since one of the subfeatures will be incrementing the count, live, when a user enables a preference). Upside: Faster.

* Update counts with simple increment/decrement queries. Upside: Blazingly faster. Potential downside: Might get out of sync. Maybe fire off jobs even less frequently, to ensure it's not always out of date in weird ways?

So my question is, which of these is best, and are there even better ways out there? 
I love doing things right the first time, hence my asking. [0] https://www.mediawiki.org/wiki/Extension:BetaFeatures [1] https://mingle.corp.wikimedia.org/projects/multimedia/cards/2 [2] https://mingle.corp.wikimedia.org/projects/multimedia/cards/21 P.S. One of the first features that we'll launch with this framework is the MultimediaViewer extension which is also under[3] development[4] as we speak. Exciting times for the Multimedia team! [3] https://mingle.corp.wikimedia.org/projects/multimedia/cards/8 [4] https://mingle.corp.wikimedia.org/projects/multimedia/cards/12 -- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist signature.asc Description: Digital signature ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
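The first option in that message boils down to a grouped count restricted with IN, roughly `SELECT up_property, COUNT(*) FROM user_properties WHERE up_property IN (...) GROUP BY up_property`. Here is that query modeled in memory; the `{ property, value }` row shape and the '1'-means-enabled convention are assumptions for this sketch, since the real data lives in the user_properties table.

```javascript
// In-memory model of the grouped IN-query count over user_properties rows.
// Hypothetical row shape: { property: 'pref-name', value: '1' | '0' }.
function countEnabled(rows, registeredPrefs) {
  const wanted = new Set(registeredPrefs); // plays the role of IN (...)
  const counts = {};
  for (const pref of registeredPrefs) {
    counts[pref] = 0; // show zero for prefs nobody has enabled yet
  }
  for (const row of rows) {
    // Only count rows for registered properties whose value marks the
    // preference as enabled - the GROUP BY + COUNT(*) of the SQL version.
    if (wanted.has(row.property) && row.value === '1') {
      counts[row.property] += 1;
    }
  }
  return counts;
}
```

The IN list keeps the scan bounded to the handful of registered BetaFeatures properties, which is why it beats a LIKE pattern that would have to examine every property row.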
Re: [Wikitech-l] Javascript to find other articles having the same external link
On Mon, Aug 26, 2013 at 10:07:16AM +0200, Ole Palnatoke Andersen wrote: I'd love to see a similar thing for articles linking to the same book via ISBN. You can do that in the JavaScript just by adding to the selector at the beginning, and you can also get other magic links at the same time.

jQuery( 'a.external, a.mw-magiclink-isbn, a.mw-magiclink-pmid, a.mw-magiclink-rfc' ).after( function () {
    return jQuery( '<a>' )
        .text( '⎆' )
        // Shorter, relative link (could also use mw.Title here maybe)
        .attr( 'href', '/wiki/Special:Linksearch/' + this.href )
        .before( );
} );

But it looks like Special:Linksearch doesn't support searching for magic links, at least not yet. So I'm afraid this is all for nought. I'm going to hope that CirrusSearch will fix this in some capacity, since it looks pretty simple to fix, and if Chad would like some help with that, he knows where to find me... -- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Etherpad Lite labs instance going down in two weeks - backup time
The day we have all equally hoped for and dreaded is come to pass: Etherpad Lite has now replaced Etherpad Classic in production, and the labs instance is on its way out. This is my as-wide-as-possible email warning to say that everything on the labs instance, as really should have been expected, is going to be gone soon. Not immediately - we intend to give you two weeks to get your important data off the instance and onto the new one at https://etherpad.wikimedia.org/ - but you should _absolutely_ be moving things as soon as possible. We will also keep a data dump around, in case anything else needs to get pulled out of the pads, but I would suggest not relying on that if you don't have to. And in the future: If a URL has wmflabs.org in it...don't put anything, ANYTHING, important there. The purpose of labs is to let us experiment with new technology without having to worry about reliability. Thanks so much for your help and understanding in the course of this migration. tl;dr: http://etherpad.wmflabs.org is going down in 2 weeks, get yer stuff off it. -- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] gwtoolset : architecture design help
On Fri, Aug 02, 2013 at 11:57:36AM +0200, dan entous wrote: the mappings will serve a specific purpose. they will map potentially unique XML metadata formats and standard XML metadata formats to mediawiki template parameters. would the namespace Metadata_mappings, i prefer plural because there will be many mappings, be too generic or would that suffice for everyone? There are many articles. We use Article:. There are many users. We use User:. It makes little sense to depart from established practice. i still believe that the use of the user name is important. two or more people could come up with their own version of how to map Rijksmuseum metadata with mediawiki template parameters, so if we continue with this namespacing concept the potential title would be : Metadata_mappings:Dan-nl/Rijksmuseum. But the Rijksmuseum isn't a subresource of you. If anything I would suggest having a base and enabling subpages so users could add their own mappings, hopefully with more informative titles than just their usernames, like Metadata_mapping:Rijksmuseum/No_publication_date or something. (admittedly I made something up but you get the idea) one thing i forgot to mention was the addition of an extension to the title to help identify the format of the content of the title. we were planning to use .json, so the end title would be : Metadata_mappings:Dan-nl/Rijksmuseum.json. would that make sense to everyone? There's no need for this. Everything in this namespace would be JSON, so putting that information in the title twice would be silly. i took a look at the current implementation of TemplateData on commons and have not seen it used for the templates we're currently looking at. for now i will look into coding the use of the TemplateData if present and if not, fallback to our current db look-up implementation. 
http://commons.wikimedia.org/w/api.php?action=templatedata&titles=Template:Artwork http://commons.wikimedia.org/w/api.php?action=templatedata&titles=Template:Book http://commons.wikimedia.org/w/api.php?action=templatedata&titles=Template:Musical%20work http://commons.wikimedia.org/w/api.php?action=templatedata&titles=Template:Photograph http://commons.wikimedia.org/w/api.php?action=templatedata&titles=Template:Specimen It wouldn't be hard to add to these templates, and I've already done it for the Information template, so this would be a good idea to do now-or-soon. Interface with Nazmul, who is Rasel160, who's been working on auto-generating forms for Commons templates in UploadWizard, and see if you can't work together on this :) Ta, -- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist signature.asc Description: Digital signature ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
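The templatedata responses linked above come back as JSON keyed by page ID, with each page carrying a `params` map. A small helper to pull the parameter names out of such a response might look like this; the sample object below is an abbreviated invention, not the real Artwork template data.

```javascript
// Extract template parameter names from a TemplateData API response.
// Response shape (pages keyed by page ID, each with a `params` map)
// follows action=templatedata; the sample data below is made up.
function templateParams(response) {
  const result = {};
  for (const pageId of Object.keys(response.pages || {})) {
    const page = response.pages[pageId];
    // Object.keys over the params map gives the declared parameter names.
    result[page.title] = Object.keys(page.params || {});
  }
  return result;
}

// Abbreviated, invented sample of what the API might return:
const sample = {
  pages: {
    '123': {
      title: 'Template:Artwork',
      params: {
        artist: { required: false },
        title: { required: false }
      }
    }
  }
};
```

With a helper like this, GWToolset could discover parameters at runtime instead of hard-coding them in a db table.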
Re: [Wikitech-l] Flow with a Tow Truck
On Sun, Jul 28, 2013 at 10:10:25PM +0200, Federico Leva (Nemo) wrote: Dumb question: so, would this be an alternative to Etherpad lite integration with MediaWiki via the EtherEditor extension? Yes and no. There are definitely multiple options here, but here's the situation as I see it:

* EtherEditor is in a broken state because it was written for an earlier version of Etherpad Lite, so we'd need to rewrite some of the EPL plugins and API access methods that I wrote initially if we wanted to bring it back.

* Etherpad Lite doesn't really play well with wikitext, and its formatting options are only a small subset of what we'd need to support in MediaWiki, which means we either need to use the old method of editing wikitext source or build support for tables, definition lists, and more into EPL core.

* It's been almost a year since I started working on EtherEditor and put the etherpad.wmflabs.org instance up, and EPL still isn't puppetized.

* TowTruck is built with the idea of helping people in mind - so for things like helping new users or working together on an existing HTML form, it makes a lot of sense.

* TowTruck also has better support for non-standard editing systems (like VisualEditor, for example), so we'd probably be better off using it in the long run and not having to worry about writing up-to-date plugins for supporting all possible wikitext constructs and their visual representations - VisualEditor already handles that, so we can just offload it.

That being said, maybe the VE team already has different plans, e.g. to use something like ShareJS, which is a little lighter-weight and would give them the ability to customize the interface a lot more. 
Hope that helps :) -- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist signature.asc Description: Digital signature ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] gwtoolset : architecture design help
On Wed, Jul 24, 2013 at 08:59:25PM +0200, dan entous wrote: Mapping --- a mapping is a json that maps a metadata set to a mediawiki template. we’re currently storing those as Content in the namespace GWToolset. an entry might be in GWToolset:Metadata_Mappings/Dan-nl/Rijksmuseum. 1. does that namespace make sense? a. if not, what namespace would you recommend? I'd say that the example you gave should give a better hint about what the namespace should be called: Metadata mapping. /wiki/Metadata_mapping:Rijksmuseum makes a lot more sense from a resource/subresource perspective, since Metadata mappings wouldn't be a resource on its own, just a parent directory for other resources. And per-user directories probably wouldn't make much sense, IMO. mediawiki template parameters - the application needs to know what mediawiki template parameters exist and are available to use for mapping media file metadata to the mediawiki templates. for the moment we are hard-coding these parameters in a db table and sometimes in the code. this is not ideal. i have briefly seen TemplateData, but haven’t had enough time to see if it would address our needs. 1. is there a way to programmatically discover the available parameters for a mediawiki template? TemplateData is, in fact, exactly what you need for that. -- Mark Holmquist Software Engineer, Multimedia Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist signature.asc Description: Digital signature ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Object.keys called on non-object
On Sat, May 11, 2013 at 04:46:37PM +0200, Marco De Nadai wrote: TypeError: Object.keys called on non-object at Function.keys (native) at Object.WikiConfig (/srv/www.startupperday.com/Parsoid/js/lib/mediawiki.WikiConfig.js:41:20) at null.anonymous (/srv/www.startupperday.com/Parsoid/js/lib/mediawiki.parser.environment.js:300:22) at processSome (/srv/www.startupperday.com/Parsoid/js/lib/mediawiki.ApiRequest.js:98:17) at process._tickCallback (node.js:415:13) Can you help me? :((( Is this wiki readable by anonymous users? If not, the Parsoid service won't be able to get the necessary configuration from your wiki via the API. There's a bug for this - https://bugzilla.wikimedia.org/show_bug.cgi?id=44483 You could try to fix the bug, but it's a tough one. You could also work around the issue by creating a cached JSON response in js/lib/baseconfig, where we have a bunch of default configs lying around. After doing that you may need to change some configuration or even code to get Parsoid's API to use that file instead of an API response, but it's possible. If your wiki *is* readable by anonymous users after all, then I'm sorry I rambled at you here :) We can help much more easily on IRC at #mediawiki-parsoid on Freenode, since IRC is much faster for this kind of back-and-forth. Thanks for trying out the system! Hopefully we can help you get it working on your wiki. Ta, -- Mark Holmquist Software Engineer Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist signature.asc Description: Digital signature ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
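For the cached-config workaround mentioned there, the configuration Parsoid fetches is essentially a `meta=siteinfo` API query whose JSON output can be saved under js/lib/baseconfig. A sketch of building that request URL follows; the exact `siprop` list Parsoid needs is an assumption here, so check what its WikiConfig actually reads before relying on it.

```javascript
// Build a siteinfo query URL whose JSON response could be cached as a
// Parsoid baseconfig. The siprop list below is illustrative only.
function siteInfoUrl(apiBase) {
  const params = new URLSearchParams({
    action: 'query',
    meta: 'siteinfo',
    siprop: 'general|namespaces|namespacealiases|magicwords',
    format: 'json'
  });
  return apiBase + '?' + params.toString();
}
```

Fetching that URL as a logged-in user (e.g. with curl and a session cookie) and saving the body is one way to give Parsoid a config it cannot retrieve anonymously.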
Re: [Wikitech-l] Not much usage of the Visual Editor?
On Sat, May 11, 2013 at 01:01:05PM -0700, Steven Walling wrote: One thing we could experiment with is turning on VisualEditor as the default for new people who've just signed up. We'd need to do this carefully with guidance from James/Trevor et al., and with a close eye on the conversion rate. At the very least we could run a few remote usability tests to see if it made a positive difference or just totally confused people. My team could perhaps tackle this as part of our work on the Getting Started workflow we're presenting to newly-registered folks on English Wikipedia. Actually, if you wanted to really streamline the copyediting tasks, you could just link to the veaction=edit version of the edit form rather than the page itself, maybe with a special token that VE could use to notify the user that they were editing the page, instead of the default notification that just says "hey, you're using VisualEditor", which is probably not the most helpful text for a totally-brand-new user. -- Mark Holmquist Software Engineer Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist signature.asc Description: Digital signature ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Bugzilla Weekly Report
On Thu, Apr 04, 2013 at 01:51:50PM +0200, Željko Filipin wrote: A few people commented last week that they liked the charts so I have updated them to include this week's data (attached). In slight contrast, would you mind putting them on a wiki somewhere and linking to them? A) I have an email inbox quota and attachments are the first thing to go, B) Attachments of images (even small ones) on a mailing list have always seemed like a bad plan to me, C) It would be more permanent, indexable, and searchable. Thanks! -- Mark Holmquist Software Engineer Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist signature.asc Description: Digital signature ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Seemingly proprietary Javascript
On Tue, Mar 05, 2013 at 12:56:23PM +0100, Alexander Berntsen wrote: GNU LibreJS blocks several Javascript sources around Wikipedia. I was sent to this list by Kirk Billund. My issue as well as Kirk's replies follows. I hope you are okay reading it in this form. https://bugzilla.wikimedia.org/show_bug.cgi?id=36866 We have this issue reported, it's on our radar, and I, at least, intend to fix it in the future. The user JavaScript and CSS might be an issue. I'm not sure how to handle that. I guess we could indicate in the license headers that some parts of the code are under the CC-BY-SA license, or whatever is set to the default license for the wiki. That should be possible, if not trivial. The minification process, however, does *not* cause a problem. We can simply add the comments to the file(s) after the minification. It does mean we'll need to include, potentially, multiple license headers in one HTTP response, but that shouldn't cause much issue. Alternatively we could use a mixed license header, and link to the texts of multiple licenses, or link to multiple files' source code. See the linked bug (above) for more discussion of the technical problems presented, and a few proposed solutions. It looks like the best way to do it would be the bang comment syntax, suggested by Timo (Krinkle), which would allow each script to be tagged on its own, and that way each script author would be responsible for their own licensing. I hope that helps, and that the bug discussion is a little more kind than wikitech has seemed :) -- Mark Holmquist Software Engineer Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Etherpad Lite labs server going down today at 22:00 UTC (14:00 Pacific) for upgrade
The Etherpad Lite server at http://etherpad.wmflabs.org/pad has been surprisingly popular of late, which has revealed an issue in our version. I think the issue has been fixed in the latest version of Etherpad Lite, so I'm going to hastily schedule a deployment in two hours' time. Please be aware that if you're editing a pad at that time, you may lose work. In all the upgrade should take no longer than an hour, and I'll ping the list when it's done with. As absurd as this is for me to be sending out a warning about taking down a labs service, this seems appropriately courteous especially given the amount of use this instance has been getting. Thanks, -- Mark Holmquist Software Engineer Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Etherpad Lite labs server going down today at 22:00 UTC (14:00 Pacific) for upgrade
On Thu, Feb 28, 2013 at 11:53:45AM -0800, Mark Holmquist wrote: The Etherpad Lite server at http://etherpad.wmflabs.org/pad has been surprisingly popular of late, which has revealed an issue in our version. I think the issue has been fixed in the latest version of Etherpad Lite, so I'm going to hastily schedule a deployment in two hours' time. Please be aware that if you're editing a pad at that time, you may lose work. In all the upgrade should take no longer than an hour, and I'll ping the list when it's done with. As absurd as this is for me to be sending out a warning about taking down a labs service, this seems appropriately courteous especially given the amount of use this instance has been getting. All right, the server is going down now and should be back up within the hour. Stay tuned for more. Thanks all for your patience. -- Mark Holmquist Software Engineer Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Etherpad Lite labs server going down today at 22:00 UTC (14:00 Pacific) for upgrade
On Thu, Feb 28, 2013 at 01:53:29PM -0800, Mark Holmquist wrote: All right, the server is going down now and should be back up within the hour. Stay tuned for more. Thanks all for your patience. The server is back up, and appears to be working well now. If you see the server down (usually 503'ing or similar) please contact me on IRC or via email so I can investigate, and please note the time and date you noticed the server going down. This problem sometimes happens when a large number of changes are sent to a single client at once. Cheers, -- Mark Holmquist Software Engineer Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Etherpad Lite labs server going down today at 22:00 UTC (14:00 Pacific) for upgrade
On Thu, Feb 28, 2013 at 02:35:12PM -0800, Faidon Liambotis wrote: I've been in quite important meetings with 15+ attendants where Etherpad Lite has been used exclusively -- so, clearly not for purposes that are testing or staging. So, Labs is the wrong place to have this. Can you coordinate with us (operations team) to move the service into a production environment? This has been on my list for some time now, but I never found a good point to do it. There was some work on Debianizing the package, and maybe some splashing around in the Puppet manifests, but I'd be happy to explore the possibility of production deployment in the near future, whenever there's a sufficient contingent of Ops engineers available to assist. Thanks, Faidon! -- Mark Holmquist Software Engineer Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] MediaWiki BoF at the Southern California Linux Expo this weekend
Hello all, I just wanted to announce that SCaLE[0] has announced their Birds of a Feather schedule[1], and there's going to be a MediaWiki-related one! I intended mostly to have it be about extensions, gadgets, and modules, but if you want to come and talk about core development you're very welcome too. The conference is being held at the LAX Hilton in Los Angeles, CA, and it's relatively affordable to come into the conference. Even more so for students, who get half off the base price. I hope you can make it! The BoF session is on Saturday at 19:00, in the Century CD room. (and if you're interested, I'm also hosting an Etherpad Lite BoF at 21:00 in the Los Angeles B room) [0] https://www.socallinuxexpo.org/scale11x [1] https://docs.google.com/spreadsheet/pub?key=0AkLumNSkddf_dHRMVnhjZmxJTWdFT0NPckl4RzRjNlE -- Mark Holmquist Software Engineer Wikimedia Foundation mtrac...@member.fsf.org https://wikimediafoundation.org/wiki/User:MHolmquist ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Deploying to test2 before other wikis
Is there any reason why this shouldn't be a stated policy? If not, where should we state the policy so that people are aware of it? Considering that testwiki has a nice documentation page [0] and test2wiki doesn't [1], I think we should probably clear that up before announcing any policy. [0] http://wikitech.wikimedia.org/view/Test.wikipedia.org [1] http://wikitech.wikimedia.org/view/Test2.wikipedia.org -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info * Sent from Ubuntu GNU/Linux * ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] What other FLOSS projects does the WMF contribute to/collaborate with?
On 12-11-20 01:28 PM, Arthur Richards wrote: I'm helping to organize an 'intro to FLOSS' orientation module for new hires at the WMF with Tomasz Finc, and we're curious to know what all the projects are that we collaborate with/contribute to. Just from my own experience working at the WMF, I know of a handful like Drupal, CiviCRM, Phonegap, Varnish, jQuery, Squid, Open Stack... but I'm very curious to know the others. We just helped the Etherpad Foundation set up a TWN system for Etherpad Lite, so there's that :) We also collaborate with OpenHatch a bunch. Is this a question about on-the-clock contributions, or are we looking for volunteer stuff, too? -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info * Sent from Ubuntu GNU/Linux * ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Looking for Bugs In All the RIGHT Places
I thought that this tool might be useful in mediawiki development. She was amenable to helping get it working if there was interest. Where, pray tell, is the software? :) -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info * Sent from Ubuntu GNU/Linux * ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Collaborative code-editing tools (long-distance pair programming)
Do any of you use Cloud9, Brackets, emacs + xhost, or some other tool/service? Do you recommend them? http://etherpad.wmflabs.org/pad/ is all very well and good but it doesn't support syntax highlighting. FWIW there is a way to add in syntax highlighting, and I could probably create a new instance for that. There was also chatter on the Etherpad channel yesterday about writing plugins for compiling and running programs on the backend of the server. Let me know if there's interest. -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Collaborative code-editing tools (long-distance pair programming)
FWIW there is a way to add in syntax highlighting, and I could probably create a new instance for that. There was also chatter on the Etherpad channel yesterday about writing plugins for compiling and running programs on the backend of the server. Additionally, I suppose, we could write a plugin for enabling a grouping of pads into projects, which would make it easier to have multiple files open at once. I think the major problem is that any files on the etherpad server will need to be downloaded or copy/pasted before you can actually run them, which may or may not be ideal. But again, there may be a solution in the plugin API. (backend of the server - sorry, I just woke up) -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Collaborative code-editing tools (long-distance pair programming)
On 12-11-08 08:32 AM, Dan Andreescu wrote: +1 on getting collide up and running. It's open source and looks like it already does project management and syntax highlighting From https://code.google.com/p/collide/: Requires Client: recent Chrome or Safari Server: Java 7 JRE Build: Java 7 JDK, Ant 1.8.4+; all other build dependencies are bundled in. It's possible that the first requirement is really more of an official recommendation from Google, but it's nasty that they recommend two non-free ones. And the server requirements are also (at least partly) non-free, IIRC. I'm very willing to be proven wrong on that front. We may be able to use openjdk-7-* as drop-in replacements, but I don't know how nicely they'll play together. *This has been a message from your friendly neighborhood FSF member* Then again, it does seem like a lot less work to run Collide, if we can do it with Chromium and OpenJDK. -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Work flow for bugfixing
And I really don't like the idea of B. I can easily see people mentioning bugs that are related to a commit in the commit message but not directly fixed by it. Maybe we should start a new branch per-bug instead, and merge the branch when the bug is fixed? That might help with this issue. -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info * Sent from Ubuntu GNU/Linux * ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Work flow for bugfixing
What about changes like https://gerrit.wikimedia.org/r/#/c/29422/, which mentions bug 1, but obviously doesn't entirely fix it? It wouldn't be put into the branch, or maybe it would be put into the branch but the branch would never get closed. But I like Chad's idea better, currently. -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info * Sent from Ubuntu GNU/Linux * ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Collaborative code-editing tools (long-distance pair programming)
Then again, it does seem like a lot less work to run Collide, if we can do it with Chromium and OpenJDK. Update: I tried running Collide on my machine. It took some hacking to get through the Ant build process, and finally I came to a point where the README said run ./bin/deploy/collide and I said there's no bin/deploy directory and the README stood silent. I have an email out to the list, but I'm not sure they'll be super-responsive. Cloud 9 may be a better option... :) -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info * Sent from Ubuntu GNU/Linux * ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Contributing as Wikimedia employees
When contributing to upstream projects (in this case non-MediaWiki projects), using the mediawiki.org address also helps to make clearer the contributions of the Wikimedia orgs as such. You have probably found yourselves in the situation of finding that some redhat.com, hp.com or whatever.com/org dude is committing a patch in some upstream project, and how good that feels. It can't be bad to generate this type of impression to developers and contributors of open source projects out there. For my part, though I consistently use my @member.fsf.org address, I generally include some mention in either the commit message(s) or some communication with the community that I'm working as an employee of the WMF. I've made inroads with communities that way, and it appears to be sufficient. That too, and we have now this situation with the metrics reports. We get many times the question of WMF contributions compared to independents / 3rd parties and as it is now this is very difficult to calculate. Can I suggest asking HR for a CSV of employee names, to which you should be able to easily add their email addresses (probably from the commit logs)? It might take some time, but it's also relatively simple to watch wikitech-l for new employee announcements and add them to the list. Checking the CSV as part of your metrics script would then be pretty simple. Like Quim, I don't intend to force policy here, I'm just trying to solve the problems raised :) (can we get some community insight, here, maybe?) -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
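The CSV check suggested there could be sketched as follows. The "name,email" column layout and the bucket names are assumptions for this sketch; the real HR export and metrics script would dictate the actual shapes.

```javascript
// Classify commit author emails as staff vs. volunteer using an
// HR-provided CSV. Assumed (hypothetical) CSV layout: "name,email".
function parseStaffEmails(csvText) {
  const emails = new Set();
  for (const line of csvText.trim().split('\n').slice(1)) { // skip header row
    const cols = line.split(',');
    if (cols[1]) {
      emails.add(cols[1].trim().toLowerCase()); // normalize for comparison
    }
  }
  return emails;
}

// Bucket each commit author's email against the staff list.
function bucketAuthors(commitEmails, staffEmails) {
  const buckets = { staff: 0, volunteer: 0 };
  for (const email of commitEmails) {
    if (staffEmails.has(email.toLowerCase())) {
      buckets.staff += 1;
    } else {
      buckets.volunteer += 1;
    }
  }
  return buckets;
}
```

Emails are lowercased on both sides because commit logs rarely agree on casing; anything fancier (aliases, multiple addresses per person) would need extra columns in the CSV.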
Re: [Wikitech-l] whether to do Google Code-In
On 10/22/2012 06:31 PM, Sumana Harihareswara wrote: Last year we decided not to participate in Google Code-In https://www.google-melange.com/gci/document/show/gci_program/google/gci2012/help_page , an outreach program to help us get more 13-to-17-year-old contributors. I outlined the reasons here: http://lists.wikimedia.org/pipermail/wikitech-l/2011-October/055937.html This year, we are again eligible to apply to participate. I estimate that we'd need about 2 organizational administrators and 21 mentors to do it well. So I've opened signups at https://www.mediawiki.org/wiki/Google_Code-In and explained what those people would be committing to. If you want us to apply to participate in GCI this winter, please comment on that page by November 1st. Thanks. I wanted to ping the list again about this. Our project and others will really benefit from having new contributors, and the tech community at large will benefit from the education we can help provide these students. And of course, it means new community members and a generally more informed public. Please consider signing the list before the deadline, I'd really love to see this program happen. I'm willing to help a lot, but I can't do it all! Thanks, -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] [Wmfall] Announcement: Luke Welling joins Wikimedia as Senior Features Engineer
Please join me in a very belated welcome of Luke Welling to the Wikimedia Foundation. :-) Always awesome to have another strong ally in fighting the good fight :) Luke, a hearty welcome to you! -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Welcome Michelle Grover as QA to the Mobile Team!
I am pleased to announce that Michelle Grover joins WMF this week as a Mobile QA contractor. Cool! It's always great to have more QA people around! Welcome, and we hope to see you around the list and IRC! :) -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Github replication
On 12-10-03 09:27 AM, Chad wrote: Hi everyone, Just letting everyone know: mediawiki/core is now replicating from gerrit to github. https://github.com/mediawiki/core Next step: extensions! Hi Chad, Will all extensions be replicated? Are we also looking to replicate to, e.g., Gitorious? I'm sure there are docs for this decision, but I haven't seen them--do you have them handy? Thanks, -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Welcome Željko Filipin, QA Engineer
I am pleased to announce that Željko Filipin joins WMF this week as QA Engineer. All right! Welcome, Željko! It will be great to have some more power behind our QA. I look forward to seeing you around the mailing list and IRC! -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Proposal Extension:Browserid (Mozilla Persona) - open source
Is this the beginning of a clash of civilisations between WMF and MediaWiki? http://www.mediawiki.org/wiki/Requests_for_comment/MediaWiki_Foundation is the proper place to discuss that more, if you'd like to. -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Proposal Extension:Browserid (Mozilla Persona) - open source
What use does this have to WMF? I think we make a lot of cool features that require JavaScript. And yes, it's not universally usable, but it's helpful to spend time on those things, because while they may not make Lynx [0] users happier, they'll probably make things easier for, e.g., people who upload a lot of files [1] or who want to switch between languages more easily [2]. People who want to log in with a central identity and don't like using Google servers seems like a pretty OK group of people to encourage. This isn't to say that it's not worth discussing--it definitely is, and we should keep our minds open to every possible answer. Daniel, if you disagree with the above, more information would be very helpful :) Also, implementing BrowserID would pave the way for OpenBadges [3] integration, which would be _really_ cool. [0] https://en.wikipedia.org/wiki/Lynx_%28web_browser%29 [1] http://www.mediawiki.org/wiki/Extension:UploadWizard [2] https://www.mediawiki.org/wiki/Universal_Language_Selector [3] http://www.openbadges.org/ -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Proposal Extension:Browserid (Mozilla Persona) - open source
I'm fine with features using JS. It's just important that the feature is either an additive feature that is not necessary for use or it has a way to work without JS. In this case we're talking about login. If you disable JS... you can't even log into your own user account. And disabling JS is supposed to be an improvement to the security of logging in. OK, here is the misunderstanding--if we implemented BrowserID, it wouldn't be the *only* way to log in. It would be like OpenID, in that you could log in with either a MW login or with OpenID if you have it. If you don't have JS enabled, we'll put in <noscript> tags that explain what you're missing. Sound OK? Admittedly this is all hypothetical, but in case the Foundation wants to pursue this, I don't want there to be huge misunderstandings :) -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Announcement: Mark Holmquist joins Wikimedia as Features Engineer
On 12-09-26 10:23 AM, Akshay Agarwal wrote: Welcome Mark! Thanks, Akshay! And thanks to Terry for the intro (I thanked him on the staff list, not here yet), and to wikitech-l for endlessly entertaining conversations :) -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Proposal to add an API/Developer/Developer Hub link to the footer of Wikimedia wikis
1) The mediawiki homepage puts ME off. This is mainly because I'm more interested in doing things with the data on wikipedia rather than the software that runs Wikipedia. I think this is the problem we are trying to solve - there are many different types of developers out there and we need something generic to appeal to as many of them as possible. I want to add something a little more general to this point. I think we can avoid appealing to _people_ in this page, and instead appeal to _actions_. I wrote about it on the RFC [0], but I'll repeat myself briefly here. I don't think that throwing people into buckets (i.e., appealing to types of developers) is a useful way to think about this. If you want to *make MediaWiki better*, we send you to a more specific page about how to develop, translate, code review, and document. If you want to *set up or change a MediaWiki site*, we send you to a page with documentation on configuration, performance tweaks, and extensions. If you want to *use MediaWiki sites*, we send you to a page with editing help, information about user permissions, how to use the interface, and so on. We could split this up further, or differently. We could include the same link in multiple pages (e.g., pages about extensions could go in many categories). My point is, I think we could accomplish much the same, maybe more, without lumping people together, when we actually mean to group actions. [0] https://www.mediawiki.org/wiki/Talk:Requests_for_comment/MediaWiki.org_Main_Page_tweaks#Lessons_from_Manual:Skins -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Code review meeting notes
* Changes and rebases are combined. It should be said, this is part of the recent five tips to get your code reviewed faster. You should never combine a rebase with an actual substantial change, because it makes it very hard to compare between patchsets. (I didn't see this in the quick tips list, so I thought I should mention it) And if you can, try to use the rebase button as much as possible! -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Code review meeting notes
- Gerrit won't perform the rebase if it's not necessary Cool! I think the only reason this will be better is the reduced number of patchsets, but that's a good thing nevertheless. - Changes as a result of a rebase aren't shown in the changes list when comparing to an old patchset Hm. Will this be file-level whitelisting (i.e., this file changed from the master branch in this patchset, so we'll show the changes) or is it line-level? If the latter, how? Because I'm not sure it's trivial. Well, unless this is only applicable to Gerrit-performed rebases, and won't be helpful for conflict-induced manual ones, which wouldn't be nearly as useful. -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Proposal to add an API/Developer/Developer Hublink to the footer of Wikimedia wikis
Just wanted people to know so they don't reinvent the wheel. :) Along that vein, there are the ever popular skin manual [0] and extension manual [1]. [0] http://www.mediawiki.org/wiki/Manual:Skins [1] http://www.mediawiki.org/wiki/Manual:Extensions -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Work offer inside
As true as all this is, some of us would prefer to keep an advertising giant like Google out of our business as much as possible. Or alternatively, a non-free software giant like Google. But it comes to roughly the same conclusion. -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Work offer inside
That's nice, but as a business decision the wikimedia foundation has decided to host our corporate email with Google. For personal mail we all have the choice of whatever system we would like, but this business decision has been made for us, and if someone wants a wikimedia.org address, I don't think it's an onerous burden to require that they use our current infrastructure to access it, instead of requiring us to do a lot of difficult workarounds. Unless I've missed something, using wikimedia.org for non-employees is an option, not a necessity. It was my understanding that part of this discussion was to require volunteers to use a specific mail server to post to the list, but now I can't find the message that gave me that impression, so maybe I've misunderstood the nature of the thread? (I'm leaving out arguments about the decision to host with Google, but it seems like a relevant thing, perhaps there are archived discussions that I could read?) -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Work offer inside
Andrew suggested giving them Google apps accounts. I think it's a great solution--it allows people to use gmail or pretty much any mail client they want. However, it forces them to use the google mail servers, which may be less-than-desirable for some of us (me included) for various reasons. Better to handle blacklisting, and let people use their own mail servers or whatever else they'd like. -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] [testing] TitleBlacklist now tested under Jenkins
Ultimately, all extensions hosted on the WMF git server will be integrated in Jenkins. If you get PHPUnit tests, I will add a job for them as soon as the current jobs are stable enough. At the risk of sounding unoriginal, _zomg this is amazing_. Thanks a bunch, Antoine! -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] AJAX sharing of Commons data/metadata
Has there been any discussion of CORS support in Mediawiki / WMF sites anywhere? There was some talk of it in bug 32890 [0] in UploadWizard, and I tried to throw together code for it in a patchset [1], but I didn't spend much time on it (the effort was mostly to put the code into a workable patchset rather than just a snippet in a bug). bawolff also commented on the patchset and linked to bug 20814 [2], which is further discussion of the topic in broader strokes. [0] https://bugzilla.wikimedia.org/show_bug.cgi?id=32890 [1] https://gerrit.wikimedia.org/r/#/c/9718/ [2] https://bugzilla.wikimedia.org/show_bug.cgi?id=20814 -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
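For context on what CORS support would entail: the server inspects the Origin header of each incoming request and decides whether to emit an Access-Control-Allow-Origin header for it. A toy sketch of that server-side decision, with a made-up whitelist (this is not MediaWiki's actual implementation, and real servers also handle OPTIONS preflight requests, omitted here):

```javascript
'use strict';

// Toy model of the server-side CORS decision: given the Origin header
// of an incoming request and a whitelist of trusted origins, return the
// response headers to add. Real implementations also handle preflight
// (OPTIONS) requests and Access-Control-Allow-Methods, omitted here.
function corsHeadersFor(requestOrigin, trustedOrigins) {
  if (!requestOrigin || !trustedOrigins.includes(requestOrigin)) {
    return {}; // No header: the browser will block cross-origin reads.
  }
  return {
    'Access-Control-Allow-Origin': requestOrigin,
    // Lets the browser cache the CORS decision for an hour.
    'Access-Control-Max-Age': '3600',
  };
}
```

Echoing back the specific requesting origin, rather than `*`, is what allows credentialed (logged-in) requests to work cross-origin.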
Re: [Wikitech-l] Appreciation thread
Even further thanks to Siebrand and Niklas for their stellar help in fixing a huge l10n regression today. The l10n folks do so much work to support so many people, and they're amazing at it. Bravo. -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] $( '<div>' ) vs. $( '<div />' ) in coding conventions (and methods for building DOMs with jQuery)
On 12-08-28 10:40 AM, Trevor Parscal wrote: Rob is correct that using addClass is the preferable way to go for classes, and attr is the preferable way to go for other attributes. They are both safe since they use setAttribute internally, which escapes the values. In creating elements, maybe, but after creation, $.prop() is the preferred way to go, because the DOM properties are more reliably synced with the actual state of the UI--apparently jQuery doesn't always properly sync the HTML attributes to the browser state. I'm sure Timo can explain more fully (and maybe more accurately). -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
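The attribute/property divergence mentioned above can be illustrated without a browser using a toy stand-in for a DOM checkbox. This is a simplified model written for this note, not real DOM or jQuery code:

```javascript
'use strict';

// Simplified model of a DOM checkbox: the "checked" attribute records
// what was in the HTML markup, while the "checked" property is the live
// state the user actually sees. Real DOM nodes behave similarly, which
// is why $.prop() is preferred for state and $.attr() for markup values.
class ToyCheckbox {
  constructor() {
    this.attributes = { checked: null }; // serialized HTML attribute
    this.checked = false;                // live DOM property
  }
  setAttribute(name, value) {
    this.attributes[name] = String(value);
    if (name === 'checked') {
      this.checked = true; // the attribute initializes the property
    }
  }
  click() {
    this.checked = !this.checked; // interaction updates the property...
    // ...but deliberately leaves the attribute untouched, as in a real DOM.
  }
}
```

After a user click, reading the attribute still reports the original markup value while the property reflects reality; that is the sync gap described above.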
Re: [Wikitech-l] GSoC Project Update (ConventionExtension)
On 12-08-27 04:10 PM, John Du Hart wrote: Thanks for the in-depth explanation of why storing configuration in articles is a good thing. Keep up the good work. See, this is also unnecessary. Your original message might have been better stated as "Hey, I love this idea, but is there a reason you decided to use articles instead of a database structure to store the data? Thanks in advance for the no doubt interesting answer." Instead, you antagonized Akshay and didn't get an answer. And now here we are. Wasn't there a thread about conduct? Where did that end up? :) Akshay, incidentally, I would also love to hear more about why you decided this, if you have a minute to answer! Thanks all, -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] $( '<div>' ) vs. $( '<div />' ) in coding conventions (and methods for building DOMs with jQuery)
Hence, I think we should change our coding conventions to always use `$( '<div />' )`. +1 for valid XHTML. Considering that bytes are cheap and validity is good, this seems like a good idea. I also tried to get an answer about which is better between $( '<div class="a-class" />' ) and $( '<div />' ).addClass( 'a-class' ), but apparently there's little difference. At least when creating dynamic interfaces, I'd like to have some guidance and consistency if anyone is interested in chatting about it. My preference is the latter, because it avoids extensive HTML inside of JavaScript. But I'd be interested to hear other thoughts. -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
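One more argument for the .addClass() style beyond readability: values passed through method calls are set via DOM APIs and never re-parsed as HTML, so nothing needs hand-escaping. A toy demonstration of the hazard with string-built markup (escapeAttr is a hypothetical helper written for this example, not a jQuery API):

```javascript
'use strict';

// Building markup by string concatenation re-parses untrusted values as
// HTML, so they must be escaped by hand; forgetting to do so lets a
// value break out of its attribute. escapeAttr is a toy helper written
// for this example -- jQuery's addClass/attr avoid the problem entirely
// by never round-tripping values through an HTML parser.
function escapeAttr(value) {
  return String(value)
    .replace(/&/g, '&amp;')
    .replace(/"/g, '&quot;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
}

function unsafeDiv(className) {
  return '<div class="' + className + '"></div>'; // injection hazard
}

function saferDiv(className) {
  return '<div class="' + escapeAttr(className) + '"></div>';
}
```

A hostile class name like `"><script>...` escapes its attribute in the unsafe version but stays inert in the escaped one.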
Re: [Wikitech-l] Appreciation thread
I'd like to amend my thanks to include Derk-Jan Hartman (thedj), who has done some awesome work today helping with bug triage, code review, and patches for UploadWizard. Great stuff! It should make our review/bug-closing sprint a lot quicker. -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Lua deployed to www.mediawiki.org
Hmm. I can understand why scribunto is targeted at template editors - although the argument that the end user is *not* the person using the template is kind of like saying the end-user of a car isn't the driver. I'd say it's more like saying that the end-user of a crank pin [1] isn't the driver of the car it's in. Which is pretty much true, because the driver will neither think about, nor probably ever see, the crank pin. Yes, they're technically using it, but it's so far below the analytical level they're using to drive that it's not worth considering. In a similar manner, while Lua (and PHP, and JavaScript, and so on) is being used to generate the page they're viewing, the user has no interest in that, most of the time. Of course, if we can convince people to learn more about Lua through this very simple editing interface, then that's an awesome win and will lead to a more computer-literate user base. Then maybe we can have this chat again in a different context :) [1] http://en.wikipedia.org/wiki/Crank_pin -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Appreciation thread
How about a little email thread for us to say nice things about each other? Rules: be kind, thank someone, and say why you're thanking them. In the style of Drew Carey (the points don't matter): Two thousand and twenty points to Timo (Krinkle) for his consistently insightful JavaScript review. Seven hundred and nineteen points to Reedy for his omnipresence. Pi to the fifth points to Chris McMahon for being awesome in many ways, most notably the recent EtherEditor testing he helped with. I'm sure there are many other great people, but others will surely help to thank them all :) -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] GSoC wrapup report: TranslateSvg
Yay! Happy times, wrapping up projects and hearing great things. This is quite an exciting project, and I'm interested to see how it all works out! You promised us a video, though! I guess you reused '[1]' later in the email and forgot :) Thanks for coming out to work on this brilliant project, and I really hope we can get it out there very soon. -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Invitation for Localisation team development demonstration 2012-08-21 15:00 UTC
This meeting will be held using WebEx. Please ensure that you log in a few minutes before the meeting starts, so that you have time to install any required plug-ins or software. Connection details and a quick link to add this meeting to your calendar can be found below the signature. Funny question, is there any chance of holding this (and/or future) meetings without relying on non-free software? Especially 'round these parts, there are bound to be people who prefer not to use Flash/Quicktime/WMP/WebEx. bias disclosure: I am one of them. -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Can we make an acceptable behavior policy? (was: Re: Mailman archives broken?)
- I warned about broken links myself before, there is a trail for this on RT All other opinions aside, this isn't good enough for a public list--RT tickets aren't public. I don't even have an account there. Some public posting (to the list, on a wiki somewhere) would be much better. That said, I'm not overly irritated. There are a few links I need to update, but that's doable. Thanks for handling this issue, and for being responsive to the concerns raised. -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Meta: Inproper Line Breaks
Do my emails look terrible to anyone else? Yes! They do. Does anyone know the solution? My first instinct is "don't use Outlook", but I'm guessing you have some reason you need to. In that case, I'm sure there's some manual on Outlook, or some other user group, that has the solution. Maybe you have HTML emails turned on by default, and mailman is trashing your line breaks because they're HTML <br> tags or something crazy like that. Try turning off HTML composition, and see if that's helpful. Cheers, -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Mailman archives broken?
On 12-08-16 02:00 AM, Guillaume Paumier wrote: I was told yesterday that the mailman/pipermail archives were broken, in that permalinks were no longer linking to the messages they used to link to (therefore not being permalinks at all). Is the current state of the archives related to these events? It appears to be only text files, with no indices, and improper sorting. What's going on!? Maybe someone is rebuilding the archives? Could we have gotten notice about that? -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] EtherEditor stress test about to start
A brief note, we're about to start a stress test of EtherEditor, an extension for real-time collaborative editing, at 20:00 UTC. If you'd like to help out, or if you'd like to watch, do please join us in #ethereditor on Freenode. If you don't have IRC for whatever reason, but still would like to help out with the project, feel free to email me (either on-list or off) or just check out the mediawiki.org page about the extension [0]. [0] http://www.mediawiki.org/wiki/Extension:EtherEditor Thanks! -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] EtherEditor testing, round 3: Minor stress test
So I'm here to ask for some volunteers to help me test EtherEditor tomorrow, between 20:00 and 22:00 UTC. I'll be in #ethereditor on Freenode with a list of things to try out, and really, some number of people helping me stress test the connections to the backend would be immensely useful. All right! A bunch of awesome people showed up to help, and I got a bunch of interesting results. I'm sure I'll be back soon to announce a more stable release. Thanks especially to Chris McMahon, Thomas Gries (Wikinaut), S Page, and all the others who hopped on to help break my software :) -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] EtherEditor testing, round 3: Minor stress test
Good evening, wikitech-l! I've been developing an extension recently called EtherEditor [0]. I emailed this list about it twice [1] before [2], mostly to ask for help testing it at various times. To sum up, the extension, with proper configuration, enables multi-user, real-time collaborative editing of the wikitext of any page. Well, this project is coming to the point where *I* think it's pretty stable. There are still a few details to iron out, and I'm excited to fix those things, but there are also a few things I haven't ever tested. Most notably, multiple users for an extended period of time, and all the fun bugs that are sure to appear in that case. So I'm here to ask for some volunteers to help me test EtherEditor tomorrow, between 20:00 and 22:00 UTC. I'll be in #ethereditor on Freenode with a list of things to try out, and really, some number of people helping me stress test the connections to the backend would be immensely useful. If you don't have an account on Freenode, or prefer to contact me in some other way, the best way is probably to use this email address. If you have any questions, you can always ask them on-list. If you want to chat about the extension at any time, you can always join the IRC channel (#ethereditor on Freenode), and I'm always online. Remember that IRC isn't necessarily a synchronous communication system, and be patient when asking questions. In case you want to try the extension, it is and has been running on my TestWiki instance [3], so feel free to attack it! If you could report any problems on the Bugzilla [4], I will gladly fix them with great quickness! If you want to try installing it on your own MediaWiki instance, the mw.org entry [0] has some instructions for that. 
As before, please report problems to the Bugzilla [4] :) Links: [0] http://www.mediawiki.org/wiki/Extension:EtherEditor [1] http://lists.wikimedia.org/pipermail/wikitech-l/2012-June/061188.html [2] http://lists.wikimedia.org/pipermail/wikitech-l/2012-July/061517.html [3] http://etherpad.wmflabs.org/wiki/index.php/Main_Page [4] https://bugzilla.wikimedia.org/enter_bug.cgi?product=MediaWiki%20extensionscomponent=EtherEditor Thanks, everyone! -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] EtherEditor testing, round 3: Minor stress test
+ installed the OpenID extension for simple account creation Registration is open, so account creation is simple now! Is there that much increased ease with OpenID? + set collaborate as default for any page This is a question of the extension's defaults--I assume that most people will not want to have that default, but there is a way to enable it on page load--set collaborate=true in the URL, or use the link in the text field above the edit box. + set collaborate as default in all newly created accounts That preference is actually pretty much deprecated, but I'll consider fixing it up and using it again. I want the switch behavior to be part of the testing, though, so I'll keep it as-is for now. Thanks for the feedback! Oh, and bonus: If you go to http://etherpad.wmflabs.org/pad/ you can play with the Etherpad Lite interface without integration into MW. -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] EtherEditor testing, round 3: Minor stress test
Saves much time and efforts You don't use it, otherwise you wouldn't have asked that question. Clearly--but could you elaborate more as to why it's helpful? (Thomas is currently in #ethereditor having this conversation) Uhh, I *am* developer for Etherpad Lite and have that running in many of my servers. I have seen your patches, yes :) It was more for the benefit of the others on the list who might be tired or frustrated with the old etherpad.wikimedia.org interface. -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Design comments
On 12-08-14 10:44 AM, Martijn Hoekstra wrote: I found this not at all bad looking. whatever your take, it's always nice to have an outside view: http://www.wikipediaredefined.com/ Also, the conversation on the Design list may be interesting: http://lists.wikimedia.org/pipermail/design/2012-August/81.html For my part, I did like the design, but I can see that there were a bunch of places where the designer didn't consider some pretty huge parts of the Wiki ecosystem. L10n, for instance, was tossed away pretty casually at the start. -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Design comments (and note about no-www)
For example, to get to Czech Wikipedia from www.wikipedia.org, I have to roll over the top right corner? That's absolutely unusable, I would never think of that. Even better, the smaller wikis have very little screen space, so it might be impossible to get to them! Great design. And I was going to say something about no-www [0] compliance, but that's not from their design, apparently there are rewrite (or redirect) rules in place for it. Ah well. Represent all the arguments: [0] http://no-www.org [1] http://www.yes-www.org (apparently defunct) [2] http://www.www.extra-www.org/ -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Phabricator trial
All right, I think we need to seriously consider this problem right now: There's seemingly no way to disable required-login in the conf files. We need people to be allowed into our code review system, and right now, they need to either have a Google account (nasty) or ask someone on the machine for a login (slow). That's pretty much the state of things for *committing* right now, in Gerrit, but people should be able to see changesets and the discussions thereof without logging in. I tried setting policy.allow-public to true, but that did nothing obvious. The docs in default.conf say something about setting objects to be publicly accessible, but there are no extra configuration settings in the policy group, and no mention of public in either the docs or the default.conf example. Can anyone help me figure this one out? -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Phabricator trial
This isn't currently supported. You can see progress toward it here: https://secure.phabricator.com/T603 Except, of course, that I can't, because I (don't have a (Facebook|Disqus|phabricator.org)|prefer not to use my (GitHub|Google)) account. I'm sure I'm not the only one who hits this wall. policy.allow-public affects future settings which will be available in the UIs built by T603. Could you perhaps summarize what that means, time-wise? Is there anything we (I) can do to push things forward? -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] gerrit tab width
"Also, it should be 4 spaces" is a matter of opinion--everyone likes different tab widths depending on their preferences and monitor size. http://www.mediawiki.org/wiki/CC#Tab_size "you should make no assumptions" appears to support Chad's statement. However, I'm pretty sure that in reality, many people assume a width of 4. I've definitely seen funky tab-plus-space indentations that support that theory. -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] wfMsg and associated functions
Siebrand, one of our i18n overlords, is stating that some wfMsg function has been deprecated here: https://gerrit.wikimedia.org/r/#/c/17055/4/SRF_Utils.php However, the functions definitely do not have any deprecation notice. Do we want to deprecate them at this point? If so, @deprecated and wfDeprecated should be added. And if not, it's probably still good to mention using the newer methods if your code can depend on 1.18 or later in the function docs. The deprecation notes come from the mw.org page on wfMessage [0], so unless that page is incorrect, those functions have been deprecated for some time. In fact, several patches have gone by my screen about making this specific change (including one you merged [1]). This seems like it's already been decided, unless I'm mistaken. [0] http://www.mediawiki.org/wiki/WfMessage#Comparison_with_the_deprecated_wfMsg.2A_functions [1] https://gerrit.wikimedia.org/r/#/c/16043/ (oh, Krenair beat me to the 0th link, sending message anyway) -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Doxygen Not Working?
I'm not sure whether anybody was aware of this, but the Doxygen documentation at http://svn.wikimedia.org/doc has suddenly broken down. It's still online, but all the actual class and function definitions have disappeared.

On IRC:

10:20 hashar: Amgine_: I broke it
10:20 hashar: will be fixed next time the doc is regenerated

So it's known, and apparently fixed. -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Is the current system too difficult for volunteer developers?
I believe, although I may not be correct, Sumana requested these a while back and has been looking at them... I'm not sure if it covers (new) volunteer contributors or not. I had provided some numbers, though I'm not sure how helpful they are/have been/will be: http://lists.wikimedia.org/pipermail/wikitech-l/2012-July/061649.html Also relevant, the reports for which Sumana had requested help: https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2012/May https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2012/June https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2012/July (this month, draft) My message to the list also included a script that could be easily hacked up to look at just about any possible permutation of gerrit searches. I mean really, it's pretty hackish as-is. As a follow-up, I had made a Python script to do something similar on the GitHub side, but it wasn't as successful or useful in my opinion, since there were only a few additional contributors: http://lists.wikimedia.org/pipermail/wikitech-l/2012-July/061653.html I hope this can be helpful in some way, at least for substantiating the statements elsewhere that we're still growing :) Cheers, -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] wfMsg* [was: developing skin]
1) In the extension I can make a file ext.i18n.php and then use: $wgExtensionMessagesFiles['ext'] = dirname( __FILE__ ) . '/ext.i18n.php'; wfMsg('something'); In the interest of keeping up with the times, I'll leave this here: http://www.mediawiki.org/wiki/WfMessage#Comparison_with_the_deprecated_wfMsg.2A_functions Not to say that you should necessarily rush to fix your extensions, but be aware that it's preferable to use the wfMessage function now. Credit to Krenair, by the way, who managed to merge a fix for my bug report about this in less than three hours (the patch is also a good example of how to update your extensions): https://bugzilla.wikimedia.org/show_bug.cgi?id=38501 Cheers, -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Serious alternatives to Gerrit
* It's a Ruby on Rails codebase with lots of gem dependencies and a reputation for being hard to install (haven't tried it). I can vouch for this in a limited fashion: I spent around an hour one day trying to get it working, and gave up. This has been my experience with 90% of Ruby projects, however, so I wouldn't judge Gitorious for this. I had similar trouble with Barkeep. I was only barely able to get Diaspora running. * Like GitHub/Gerrit, it's not just a code review tool, but also manages repositories. The advantage over Gerrit is that Gitorious also has project wikis (however useful that is over mw.org), and the disadvantage compared to GitHub is that it doesn't have an issue tracker (though bz.wm.org should suffice as always). * Unlike GitHub/Gerrit, it has no easy way to integrate merge requests -- developers have to do so manually using git commands (see the "how to apply this merge request" box in the page above). The process seems particularly cumbersome for small patchsets, of which we get a lot. This is possibly the most troublesome thing about it all--it would require a manual rebase/merge before being able to accept a patch. That would certainly be a blocker, I'd say, but it might be possible to port the auto-merge and the revered rebase button from Gerrit over to Gitorious. Maybe a bit of effort, but it could be worth it. "[We could d]rink the kool-aid and join the wonderful semi-open world of GitHub, with all the risks and benefits it entails." This is quite a nasty flavor of Kool-Aid, I'd say--but having more contributors, accomplished by whatever means, might be worth it. If there is a free alternative, and we are able to live with it (be it Gerrit, Barkeep, Phabricator, or whatever else), that seems like the better option to me right now. 
Just a couple of pennies, -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] getting Jenkins to run basic automated tests on commits to extensions
Would it be fairly easy for you to get Jenkins to automatically PHP lint-check new commits to extensions? Code reviewers would thank you! Also having unit tests get run automatically, and creating whatever necessary documentation for enabling developers to create consistent unit tests that *can* be run automatically, would be extremely helpful. -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] request for Git statistics (or, don't stand back, I don't know regular expressions)
So, 92 unique committers in June. Which is way better than Ohloh has been saying, yay. I'd like to confirm that number: 92 unique contributors in June is absolutely correct. I've also scriptified my method, so now I can do multiple months.

Month       | Unique contributors
July so far | 60
June        | 92
May         | 77
April       | 67
March       | 34
February    | 2

A note or two: July is high because we have a lot of regular committers, I suppose. You could confirm that by graphing how many contributors are added by adding on one day at a time. My guess is you'll get a nice steep line at first that tapers out to nearly 0 at the end of 30 days. Also, I'm sure the earlier months have inadequate sample sizes to be relevant, since the extensions had to take some time to transfer over, and apparently February was just for testing. Of course, the coolest thing is that each month so far has seen at least 10 additional contributors! :)

The script I used to generate it is attached (since it's only 1.1 kb). If you have a sane SSH setup already, you should be able to make it executable and do

$ ./gerunique "ssh -p 29418 gerrit.wikimedia.org" 0d 30d

and get the number of contributors for the past 30 days. It will also give you some friendly notifications, though they're largely for debugging. The first option can be described as "how you would ssh into gerrit if you had to", and it's provided for the convenience of those people (like me) whose local username doesn't match their remote username. Cheers, -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info

#!/bin/bash
# Script that will count the number of unique contributors to mediawiki
# projects in a specified month.
# Pass in the command you use to ssh into gerrit, the first day you want to
# include in the calculation, and the last day you want to include.
SSH=$1
EOM=$2
SOM=$3
FILTERS="age:$EOM -age:$SOM status:merged project:^mediawiki.* -owner:L10n-bot"
REMOTECOMM="gerrit query"
TMPFILE=`mktemp`
TTMPFILE=`mktemp`
CURLEN=500
RSK=""
N=0

while [ $CURLEN -eq 500 ]; do
    let N=$N+1
    LFILT=$FILTERS
    if [ "$RSK" != "" ]; then
        LFILT="$FILTERS resume_sortkey:$RSK"
    fi
    FRCOMM="$REMOTECOMM '$LFILT'"
    echo "Getting $N"
    $SSH $REMOTECOMM "'$LFILT'" > $TTMPFILE
    echo "Got it."
    CURLEN=`grep 'name:' $TTMPFILE | wc -l`
    cat $TMPFILE >> $TTMPFILE
    grep 'name:' $TTMPFILE | sort -u > $TMPFILE
    echo "Currently have `wc -l < $TMPFILE`"
    echo "$CURLEN results in that fetch."
    RSK=`grep sortKey $TTMPFILE | tail -c 17`
    echo "Next sort starts at $RSK."
done

TOTAL=`wc -l < $TMPFILE`
echo "$TOTAL total unique contributors."

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
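[Editor's note] For readers who would rather follow the counting logic in Python than in bash, here is a small sketch of the same grep/sort -u idea; SAMPLE_OUTPUT and the names in it are invented for illustration, not real query results (a real run would capture the plain-text output of `gerrit query` over SSH):

```python
# Count unique contributor names in gerrit query plain-text output,
# mirroring what the bash script's `grep 'name:' | sort -u` does.
# SAMPLE_OUTPUT is a made-up stand-in for real query output.
SAMPLE_OUTPUT = """\
change I1234
  name: Alice Example
change I5678
  name: Bob Example
change I9abc
  name: Alice Example
"""

def unique_contributors(output):
    """Return the sorted, de-duplicated values of all 'name:' lines."""
    names = set()
    for line in output.splitlines():
        stripped = line.strip()
        if stripped.startswith('name:'):
            names.add(stripped[len('name:'):].strip())
    return sorted(names)
```

The bash version has to page through results 500 at a time with resume_sortkey; this sketch only shows the de-duplication step, which is the part that produces the monthly numbers.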
Re: [Wikitech-l] request for Git statistics (or, don't stand back, I don't know regular expressions)
(Also, this isn't counting people who contribute to the mobile projects on GitHub, and really the final monthly report stat ought to. I don't quickly see a way to ask "how many unique contributors submitted unique pull requests to a https://github.com/wikimedia/ repo in June?" on GitHub, though, so I'll put that off till next month.) I was bored, so I made you a Python script this time :) It's attached; it takes a year and month as its arguments, fetches all the repos at github/wikimedia, then fetches their pull requests, and finally checks which pull requests match the month you specified. Something like $ ./githubunique 2012 06 # should give 3 unique contributors And yes, most months only have very few contributors, but anything we can do to increase the count :) "It also doesn't count people who are making operations changes, or Wikimedia site configuration changes, or are packaging debs, etc, etc. It would be awesome to see stats for those as well. I have a feeling that we have more contributors than the record ;)" This should be as simple as removing the project:^mediawiki.* bit from the previous bash script. I'm not sure if there are other bots to exclude in that case, though, so I'll leave it up to someone more versed with the rest of Gerrit (Ryan?) If there are other github repositories *not* in the wikimedia github account, it shouldn't be hard to add those to the consideration in this script. P.S., a word to the wise: don't try to parse GitHub's API responses with bash; it's just not worth it. P.P.S., for those who like unified counts, adding this Python script to the end of the previous bash script should be easy enough, so you could get all of the contributors (95!) in one command if you wanted. 
-- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info

#!/usr/bin/env python
import urllib2
import json
import sys

reposurl = 'https://api.github.com/users/wikimedia/repos'
pullbaseurl = 'https://api.github.com/repos/wikimedia/'
pullafterurl = '/pulls?state='

year = sys.argv[1]
month = sys.argv[2]

text = urllib2.urlopen( reposurl ).read()
repos = json.loads( text )

uniqs = {}

for repo in repos:
    for state in ['open', 'closed']:
        pullurl = pullbaseurl + repo['name'] + pullafterurl + state
        text = urllib2.urlopen( pullurl ).read()
        pulls = json.loads( text )
        for pull in pulls:
            pulldate = pull['created_at'][0:7].split( '-' )
            if pulldate[0] != year or pulldate[1] != month:
                continue
            uniqs[pull['head']['repo']['owner']['login']] = True

count = 0
for uniq in uniqs.keys():
    print uniq
    count += 1

print str( count ) + ' unique contributors in that month.'

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
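[Editor's note] The month-filtering step is the heart of that script; as a standalone sketch, with sample pull-request data invented here to mimic the shape of GitHub's /pulls responses:

```python
# Count unique pull-request authors for a given year and month, using
# the same created_at prefix check as the attached script.
# SAMPLE_PULLS is made-up data, not a real API response.
SAMPLE_PULLS = [
    {'created_at': '2012-06-14T10:00:00Z',
     'head': {'repo': {'owner': {'login': 'alice'}}}},
    {'created_at': '2012-06-20T12:30:00Z',
     'head': {'repo': {'owner': {'login': 'bob'}}}},
    {'created_at': '2012-07-01T08:00:00Z',
     'head': {'repo': {'owner': {'login': 'alice'}}}},
]

def unique_pull_authors(pulls, year, month):
    """Return sorted logins of authors whose pulls were created in year-month."""
    uniqs = {}
    for pull in pulls:
        pulldate = pull['created_at'][0:7].split('-')
        if pulldate[0] != year or pulldate[1] != month:
            continue
        uniqs[pull['head']['repo']['owner']['login']] = True
    return sorted(uniqs)
```

Feeding every repo's open and closed pulls through this function, as the script does, yields the per-month unique-contributor count.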
Re: [Wikitech-l] request for Git statistics (or, don't stand back, I don't know regular expressions)
I'll accept either help in running this query correctly so I get the giant table on the command line so I can gin up the status myself, or I will simply accept a number if you want to do my homework for me. :-) It's because you've passed in that string (which was good) as one argument to the SSH command, which is then read as multiple arguments on the remote server. This works for me, adding double quotes around the remote command: ssh -p 29418 gerrit.wikimedia.org "gerrit query 'age:4d -age:34d status:merged project:^mediawiki.* -owner:L10n-bot'" -- Mark Holmquist Contractor, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
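[Editor's note] A quick Python illustration of the underlying problem: ssh joins its command arguments with spaces into one string, and the remote shell re-parses that string, so a filter passed as a single local argument splinters into several remote arguments unless an inner layer of quotes survives. Here `remote_argv` is an illustrative stand-in for the remote shell's word-splitting, not a real ssh API:

```python
import shlex

def remote_argv(local_args):
    """Approximate how the remote shell re-splits the single command
    string that ssh builds by joining its arguments with spaces."""
    return shlex.split(' '.join(local_args))

# One locally-quoted argument: the local shell has already removed the
# quotes, so the query splinters into separate words on the remote side.
broken = remote_argv(['gerrit', 'query',
                      'age:4d -age:34d status:merged'])

# An inner layer of quotes survives the join and keeps the query
# together as a single argument for the remote gerrit command.
fixed = remote_argv(['gerrit', 'query',
                     "'age:4d -age:34d status:merged'"])
```

This is why wrapping the whole remote command in double quotes (or the query in single quotes inside them) makes the gerrit query work.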