Re: [Wikitech-l] Investigating building an apps content service using RESTBase and Node.js
One strategy employed by Netflix is to introduce a second API layer on top of the general content API to handle device-specific needs: http://techblog.netflix.com/2012/07/embracing-differences-inside-netflix.html

I think this is a sound strategy, as it contains the volatility in a separate layer while ensuring that everything is ultimately consuming the general-purpose API. This design appears often enough that it can likely be called a design pattern. The Selenium/WebDriver project did exactly the same thing [1]: the API for Selenium v2 has about 1/3 as many functions as the API for Selenium v1, and people who use Selenium v2 build their own high-level APIs on top of the basic core set of functions available.

Defining the scope of the general content API can be challenging.

[1] http://w3c.github.io/webdriver/webdriver-spec.html

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
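To illustrate the pattern in miniature (a hypothetical sketch -- these function names and response shapes are made up, not Netflix's or Wikimedia's actual APIs), the device-specific layer is a thin adapter that consumes the stable general-purpose API and reshapes the result for one client's needs:

```python
# Hypothetical sketch of the "second API layer" pattern: a device-specific
# adapter on top of a general-purpose content API. All names are illustrative.

def general_content_api(title):
    """Stand-in for the broad, stable content API that everything consumes."""
    return {
        "title": title,
        "html": "<p>full article html</p>",
        "sections": ["Lead", "History", "References"],
        "images": ["a.jpg", "b.jpg"],
        "metadata": {"revision": 12345, "lang": "en"},
    }

def mobile_app_api(title):
    """Volatile, device-specific layer: selects and reshapes only what the
    mobile app needs. Churn here never touches the general API."""
    page = general_content_api(title)
    return {
        "title": page["title"],
        "lead_section": page["sections"][0],
        "thumbnail": page["images"][0] if page["images"] else None,
    }

print(mobile_app_api("Duck"))
```

The point of the design is visible in the dependency direction: the adapter can change as often as the device requires, while every adapter ultimately consumes the same general-purpose API.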
Re: [Wikitech-l] extensions tested along core
On Mon, Jan 12, 2015 at 9:31 AM, Antoine Musso hashar+...@free.fr wrote:

> Hello, I have crafted and enabled two new jobs:
> * mediawiki-phpunit-hhvm
> * mediawiki-phpunit-zend
> ...
> Side effect: if one deprecates a function/method in mediawiki/core and it is used by one of the extensions above, the job will fail until the extensions above have been adjusted.

This should mean fewer problems for QA to find in beta labs. I am all for that.
[Wikitech-l] recent security breach for browserstack.com
The Wikimedia Foundation has an account with browserstack.com for cross-browser testing. I do not think it sees much use, but in case anyone is using the BrowserStack service, they had a security breach yesterday: https://twitter.com/browserstack

BrowserStack says it will publish details of the breach directly to users.

-Chris
Re: [Wikitech-l] State of the DumpHTML extension
I may be mistaken, but isn't this done by Kiwix now? There was some discussion of that at http://www.kiwix.org/wiki/Mediawiki_DumpHTML_extension_improvement, and recent discussion here: https://blog.wikimedia.org/2014/09/12/emmanuel-engelhart-inventor-of-kiwix/

I could be mistaken.

On Wed, Oct 1, 2014 at 10:45 PM, Quim Gil q...@wikimedia.org wrote:

> Thank you Daniel for this email.
>
> On Mon, Sep 29, 2014 at 10:16 AM, Daniel Friesen dan...@nadir-seen-fire.com wrote:
>
>> The DumpHTML extension looks like it's in a pretty bad state; it doesn't work at all in the current version of MediaWiki. This seems to be an unfortunate symptom of how it's used and how it's treated by core developers.
>
> What you are describing is clearly a MediaWiki problem, not just a DumpHTML problem. The Wiki Release Team and the MediaWiki Cooperation group should be helpful in situations like this, and this is why I'm proposing this task: Take DumpHTML as a use case of a 3rd-party extension that MediaWiki maintainers cannot ignore https://phabricator.wikimedia.org/T536
>
>> DumpHTML is most useful when someone is shutting down and archiving their wiki, so it doesn't get tested regularly. The act of creating a version of wiki pages suitable for offline use from static files is something which inherently requires different behaviour from things deep within core. Because DumpHTML has been segregated into an extension and core doesn't support an offline/dump mode internally, DumpHTML has to use a bunch of hacks to make core behave properly during the dump. Then, because they are completely unaware of DumpHTML's needs, core developers make improvements to core that break DumpHTML without providing it an alternative interface to get what it needs out of core.
>>
>> For one, DumpHTML needs to proxy and mess with file repo behaviours. To do that it messed with properties like thumbScriptUrl, but then those properties were made protected, leaving DumpHTML unsupported. This was reported as a bug a month ago, which has gone relatively unnoticed: https://bugzilla.wikimedia.org/show_bug.cgi?id=69824
>>
>> It also subclassed RepoGroup, and since it proxied existing repo instances instead of working with repo info, it had to bypass the constructor. But then a $repoGroup->cache was added to the constructor in core, breaking DumpHTML in another way. There are probably more issues that I haven't found yet while trying to work around the other issues.
>>
>> -- ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
>
> -- Quim Gil Engineering Community Manager @ Wikimedia Foundation http://www.mediawiki.org/wiki/User:Qgil
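The constructor-bypass failure mode Daniel describes is easy to reproduce in miniature (Python used here for illustration; the real code is MediaWiki PHP, and these class internals are made up): a subclass that skips its parent's constructor works fine until the parent starts initializing new state there.

```python
# Illustrative sketch (not MediaWiki code) of why bypassing a parent
# constructor is fragile: the subclass silently breaks the moment the
# parent adds state in its constructor.

class RepoGroup:
    def __init__(self):
        self.repos = []
        # Imagine core later adds a cache here, analogous to the
        # $repoGroup->cache change described in the report above:
        self.cache = {}

    def lookup(self, name):
        # Relies on state set up in __init__.
        if name not in self.cache:
            self.cache[name] = name.upper()  # stand-in for real work
        return self.cache[name]

class DumpRepoGroup(RepoGroup):
    def __init__(self, existing_repos):
        # Deliberately does NOT call super().__init__(), because it wants
        # to proxy already-built repo instances rather than rebuild them.
        self.repos = existing_repos

broken = DumpRepoGroup(["local"])
try:
    broken.lookup("file")  # fails: self.cache was never initialized
except AttributeError as e:
    print("broke:", e)
```

An explicit interface in core for the offline/dump use case -- rather than subclasses poking at constructors and protected properties -- is exactly the kind of alternative Daniel is asking for.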
Re: [Wikitech-l] Meet-up at WMF: Exploratory Testing for Complex Software, Oct 22 2014
On Tue, Sep 23, 2014 at 11:20 AM, Pine W wiki.p...@gmail.com wrote:

> Sounds interesting. Will there be a video of this event, similar to the monthly metrics meetings?

Yes, Elisabeth is agreeable; we intend to record the talk and post it if all goes well.

-Chris

> Pine
>
> On Tue, Sep 23, 2014 at 10:31 AM, Arthur Richards aricha...@wikimedia.org wrote:
>
>> On Wednesday, October 22, 2014 the Quality Assurance Group and Team Practices Group hope you will join us for a meet-up at the WMF entitled 'Exploratory Testing for Complex Software: Lessons from Cloud Foundry' with special guest speaker Elisabeth Hendrickson [1]. We will be discussing testing in agile, iterative software development, and in particular exploratory testing [0]. This will be a lively and enlightening conversation, aimed at everyone concerned about the overall quality of software - even those who do not necessarily contribute code.
>>
>> *When*: Wednesday, October 22, 2014, 6:00pm - 8:30pm (for WMF folks there is a calendar event on the Engineering calendar)
>> *Where*: Wikimedia Foundation, 6th Floor collab space, 149 New Montgomery St., San Francisco, CA (accessible for remote participation via Hangouts on Air; link TBA)
>>
>> *From the meet-up invite* [2]:
>>
>> In modern software development organizations, the days are gone when separate, independent Quality Assurance departments test software only after it is finished. Iterative development and agile methods mean that software is constantly being created, tested, released, marketed, and used in short, tight cycles. An important testing approach in such an environment is called Exploratory Testing, and the Wikimedia Foundation has made significant investments to support Exploratory Testing for its software development projects.
>>
>> Elisabeth Hendrickson is test obsessed. She was an early adopter and vocal proponent of all aspects of agile software testing. She has been particularly instrumental in encouraging and defining the practice of Exploratory Testing. Elisabeth's 2013 book Explore It!: Reduce Risk and Increase Confidence with Exploratory Testing is the standard reference on the subject.
>>
>> Join us in the Wikimedia Foundation collaboration space to hear Elisabeth discuss her experience doing software testing for complex projects, with particular examples of Exploratory Testing from her current work as Director of Quality Engineering for Cloud Foundry. This talk is for everyone involved in the overall quality of software, and it will be of particular interest to Project Managers, Product Managers, and those working with software development projects who do not necessarily contribute code directly to the projects.
>>
>> [0] https://en.wikipedia.org/wiki/Exploratory_testing
>> [1] Elisabeth Hendrickson is a tester, developer, and Agile enabler. She wrote her first line of code in 1980, and almost immediately found her first bug. In 2010 she won the prestigious Gordon Pask Award from the Agile Alliance. She is best known for her Google Tech Talk on Agile Testing as well as her wildly popular Test Heuristics Cheatsheet. In 2003, she learned how to do Agile for real from Pivotal Labs while working as a tester on one of their projects. In 2012 she decided it was time to take up permanent residence in the Pivotal offices, where she is the Director of Quality Engineering for Cloud Foundry, Pivotal's open-source Platform as a Service (PaaS).
>> [2] http://www.meetup.com/wikimedia-tech/events/207856222/
>>
>> -- Arthur Richards Team Practices Manager [[User:Awjrichards]] IRC: awjr +1-415-839-6885 x6687
Re: [Wikitech-l] Engineers in residence
On Sat, Aug 9, 2014 at 5:22 PM, dan-nl dan.entous.wikime...@gmail.com wrote:

> this is an excellent idea and i don't think it needs to be focused only on large corporations or only on corporate individuals who want to volunteer. i would suggest opening the idea to any developer with a skill set that's needed or who wants to learn.

Actually, I think we should consider serious limits to any such proposal, such as (as Gilles suggested) working only with reputable employees (or ex-employees, like Aaron Arcos) of reputable companies. Otherwise I suspect some developers will game the system for their own benefit, for example to get free training or to pad their resumes.
Re: [Wikitech-l] Engineers in residence
On Sun, Aug 10, 2014 at 2:47 PM, Gilles Dubuc gil...@wikimedia.org wrote:

>> personally i really liked your comparison, when we were chatting the other day, to an artist in residence -- imo, programmers are the artists of our time and this matches well.
>
> To me the point is to have our engineering more open and collaborative, which in my opinion would also increase pure volunteer contributions as a side effect. This is very hard to do in anything other than a non-profit open source project; I think we're in a position to make very interesting things happen through such efforts.

Such good timing. As a direct result of meeting at Wikimania (thanks Nik Everett), here is an expert from Expedia who would like to contribute to our security efforts. He has posted to the QA mailing list: http://lists.wikimedia.org/pipermail/qa/2014-August/001847.html
Re: [Wikitech-l] Release Engineering team (new! improved!)
On Tue, Jul 29, 2014 at 12:25 PM, Chris Steipp cste...@wikimedia.org wrote:

>> On Tue, Jul 29, 2014 at 11:58 AM, Pine W wiki.p...@gmail.com wrote:
>>
>>> To clarify, is the QA team now under Release Engineering as Chris' comment seems to imply, and how does this org change affect security engineering?
>
> For now, I (the only security engineer) am staying in core, although much of my role spans both groups. I'll continue working with Chris, Greg, and other engineers across the WMF and developer community...

I think it is not accurate to say that the QA team is under Release Engineering, or that Release Engineering is somehow separate from core, security, and the feature development groups. Our QA practice reaches into many aspects of software development at WMF, and RelEng serves everyone who needs to get software to Wikipedia. We have a minimum of formal gates, handoffs, and the like; instead we try to put in place general processes (build, test, deploy) between your local development environment and production in order to get new features to users as quickly and, again, as *safely* as possible.

-Chris
Re: [Wikitech-l] Release Engineering team (new! improved!)
On Tue, Jul 29, 2014 at 2:06 PM, Pine W wiki.p...@gmail.com wrote:

> Hi Chris M.,
>
> By the way, Wikimedians are a vocal group when there are problems, and I take the general quiet of Wikimedia content editors about security and core stability to mean that security and core QA are in good hands.

Thank you, that is good to hear!

-Chris
Re: [Wikitech-l] Moving Mathoid to production cluster
On Tue, Jul 15, 2014 at 1:10 AM, Moritz Schubotz phy...@physikerwelt.de wrote:

> Hi Chris,
>
> me too. How can I implement it in beta labs?

I'd say to start by filing a bugzilla ticket for Wikimedia Labs/deployment-prep. Then it is a matter of registering the proper extensions and config in puppet.

> Would mathoid need a dedicated host?
>
> Best
> Moritz
>
> On Mon, Jul 14, 2014 at 6:35 PM, Chris McMahon cmcma...@wikimedia.org wrote:
>
>> I would really like to see this follow the standard deploy scheme: implement it in beta labs; then enable it for mediawiki.org and test2wiki; then enable it on production cluster nodes.
>>
>> -Chris
>>
>> On Mon, Jul 7, 2014 at 3:07 AM, Moritz Schubotz phy...@physikerwelt.de wrote:
>>
>>> Hi,
>>>
>>> during the last year the Math extension achieved a goal defined back in 2003: support of MathML. In addition there is SVG support for browsers without MathML support. (See http://arxiv.org/abs/1404.6179 for the details.) I would like to give Wikipedia users a chance to test this long-awaited feature. Therefore we would need a mathoid instance that is accessible from the production cluster. Greg Grossmeier already created the required table in the database. (Sorry for the friction connected with this process.)
>>>
>>> Currently the MathJax team is working on a method to render texvc to MathML and SVG without phantom.js. I tested it a few days ago, and it works quite well. I would appreciate a discussion with ops to figure out how this can go to production. The original idea was to use Jenkins to build the mathoid Debian package. Even though the Debian package builds without any issues in the Launchpad PPA repo, Jenkins cannot build the package. If there is a reference project that uses Jenkins to build Debian packages that go to production, this would really help to figure out what is different for mathoid and why the package building does not work even though it works on Launchpad.
>>>
>>> Best
>>> Physikerwelt
>>>
>>> PS: I was informed that there is a related RT that I can not access: https://rt.wikimedia.org/Ticket/Display.html?id=6077
>>>
>>> -- Mit freundlichen Grüßen Moritz Schubotz
>
> -- Mit freundlichen Grüßen
> Moritz Schubotz
> Telefon (Büro): +49 30 314 22784
> Telefon (Privat): +49 30 488 27330
> E-Mail: schub...@itp.physik.tu-berlin.de
> Web: http://www.physikerwelt.de
> Skype: Schubi87
> ICQ: 200302764
> Msn: mor...@schubotz.de
Re: [Wikitech-l] Moving Mathoid to production cluster
I would really like to see this follow the standard deploy scheme: implement it in beta labs; then enable it for mediawiki.org and test2wiki; then enable it on production cluster nodes.

-Chris

On Mon, Jul 7, 2014 at 3:07 AM, Moritz Schubotz phy...@physikerwelt.de wrote:

> Hi,
>
> during the last year the Math extension achieved a goal defined back in 2003: support of MathML. In addition there is SVG support for browsers without MathML support. (See http://arxiv.org/abs/1404.6179 for the details.) I would like to give Wikipedia users a chance to test this long-awaited feature. Therefore we would need a mathoid instance that is accessible from the production cluster. Greg Grossmeier already created the required table in the database. (Sorry for the friction connected with this process.)
>
> Currently the MathJax team is working on a method to render texvc to MathML and SVG without phantom.js. I tested it a few days ago, and it works quite well. I would appreciate a discussion with ops to figure out how this can go to production. The original idea was to use Jenkins to build the mathoid Debian package. Even though the Debian package builds without any issues in the Launchpad PPA repo, Jenkins cannot build the package. If there is a reference project that uses Jenkins to build Debian packages that go to production, this would really help to figure out what is different for mathoid and why the package building does not work even though it works on Launchpad.
>
> Best
> Physikerwelt
>
> PS: I was informed that there is a related RT that I can not access: https://rt.wikimedia.org/Ticket/Display.html?id=6077
>
> -- Mit freundlichen Grüßen Moritz Schubotz
Re: [Wikitech-l] Logging edit lifecycle events
Any logging you could add for this issue would be welcome: https://bugzilla.wikimedia.org/show_bug.cgi?id=65709

We've been seeing it for some time: the user is sometimes logged out unexpectedly when using VE in beta labs. It happens a lot in automated browser tests, it happens to both Rummana and me occasionally when editing manually, and we have reason to suspect it might happen in production sometimes also.

-Chris

On Fri, Jun 27, 2014 at 1:54 PM, Adam Wight awi...@wikimedia.org wrote:

> Hi VE team,
>
> I heard a rumor that there is a new focus on logging edit events to track the overall session trajectory and other stuff. If it's helpful, I've been working on that as well; here are my notes and attempts to implement. Hopefully it is complementary to whatever you've done so far:
>
> https://meta.wikimedia.org/wiki/Schema:EditLifecycle
> https://wikitech.wikimedia.org/wiki/User:Awight/Edit_logging
> https://gerrit.wikimedia.org/r/#/c/141097/
> https://gerrit.wikimedia.org/r/#/c/141113/
> https://gerrit.wikimedia.org/r/#/c/141114/
>
> Please loop me in on the conversation!
>
> Thanks,
> -Adam
Re: [Wikitech-l] MediaWiki Bug Bounty Program
On Wed, Jun 25, 2014 at 4:28 PM, Tyler Romeo tylerro...@gmail.com wrote:

> Therefore, I thought it may be beneficial to take that over to Wikipedia and start our own bug bounty program. Most likely, it would be strictly a hall-of-fame-like structure where people would be recognized for submitting bug reports (maybe we could even use the OpenBadges extension *wink* *wink*). It would help by increasing the number of bugs (both security and non-security) that are found and reported to us. Any thoughts?

Some time ago I ran a number of public exercises testing various aspects of Wikipedia. I ran into a number of issues:

1) It takes a lot of preparation and time to do well.
2) Essentially 100% of bugs reported by naive reporters are DUPLICATE, WONTFIX, or are already in the backlog of some feature.
3) Reporting bugs directly in Bugzilla creates a lot of noise and annoys people who monitor traffic there. (Mozilla runs things like this from time to time; from them I learned to have people report in a separate system, e.g. etherpad or email, and have someone triage and sort the reports before creating Bugzilla tickets -- see point 1 above.)

Google, which spends a lot of money doing stuff like this for security exploits, narrows the circumstances radically: http://www.chromium.org/Home/chromium-security/pwnium-4
Re: [Wikitech-l] Browser tests for core
On Tue, Jun 24, 2014 at 3:56 PM, Ryan Kaldari rkald...@wikimedia.org wrote:

> Nice work guys! One slight issue though. The user Selenium_user (as set in environment_variables) was just indefinitely blocked on en.wiki :(

Yes, and I think we should keep it that way. See my msg. to the mobile-l and qa lists earlier today: http://lists.wikimedia.org/pipermail/mobile-l/2014-June/007435.html

These tests are intended as acceptance tests for a new MediaWiki install and smoke tests for a test environment like beta labs. Let's not run them against production.

-Chris

> Ryan Kaldari
>
> On Tue, Jun 24, 2014 at 3:46 PM, Chris Steipp cste...@wikimedia.org wrote:
>
>> I just +2'ed a change to add a few basic Selenium tests to core [1]. I think it will benefit us all to have a set of automated tests to quickly make sure MediaWiki is working correctly. From a security perspective, this also takes a step towards more efficient security testing, which I'm also a fan of (if you've tried blindly scanning MediaWiki, you know what I'm talking about...).
>>
>> I think the QA group is working on vagrant-izing these, but if you have Ruby 1.9.3 and Firefox, then setting up and running these tests on your local dev system is 4-6 commands:
>>
>> $ cd tests/browser
>> $ gem update --system
>> $ gem install bundler
>> $ bundle install
>>
>> You can either set your environment variables yourself, or edit environment_variables and run `source environment_variables` to set them. Then it's just
>>
>> $ bundle exec cucumber features/
>>
>> to run the tests. They currently complete in 36 seconds on my laptop.
>>
>> I'd like to see more tests added and backported to REL1_23 to make sure we have an ongoing suite to check releases against for the next few years that we support that LTS. If anyone is interested in both MediaWiki core and browser tests, I'm sure the QA team would like to get you involved.
>>
>> Big thanks to hashar, Chris McMahon, and Dan Duvall for indulging me and getting this done. I'll let them jump in with all the details I've missed.
>>
>> [1] https://gerrit.wikimedia.org/r/#/c/133507/
Re: [Wikitech-l] Getting phpunit working with Vagrant
On Fri, Jun 13, 2014 at 11:45 AM, Dan Duvall dduv...@wikimedia.org wrote:

> On a related note, I'll be working on improving the mediawiki-vagrant browser tests setup for MobileFrontend in the coming weeks. It'd be great to have you, or someone else on the mobile team, vet the improvements.

Dan and I (mostly Dan!) have all the browser tests for VisualEditor now running green for Firefox on a Vagrant instance under the visualeditor role. The mobilefrontend role is next for browser test support. Other suggestions for improvements to vagrant are welcome...

-Chris
Re: [Wikitech-l] Status of the new PDF Renderer
On Thu, May 29, 2014 at 6:06 PM, Matthew Walker mwal...@wikimedia.org wrote:

> I should have also noted -- there is something strange going on with the frontend to Special:Collection. You have to manually refresh to see status updates...

Reported 10 days ago in test envs: https://bugzilla.wikimedia.org/show_bug.cgi?id=65562

> ~Matt Walker Wikimedia Foundation Fundraising Technology Team
>
> On Thu, May 29, 2014 at 5:56 PM, Matthew Walker mwal...@wikimedia.org wrote:
>
>> I'm happy to report that after a LONG time fighting with deployment, the test instance is available in beta labs (en.wikipedia.beta.wmflabs.org and all others) via the WMF PDF option in Special:Collection and on the side panel. It is very rough still in terms of reliable rendering (it doesn't like to clean up after itself) -- but now that I have deployment sorted and it running stably, that's my next task. Play away :D
>>
>> ~Matt Walker Wikimedia Foundation Fundraising Technology Team
>>
>> On Thu, May 29, 2014 at 7:02 AM, Andre Klapper aklap...@wikimedia.org wrote:
>>
>>> Hi,
>>>
>>> On Mon, 2014-05-19 at 11:57 -0700, C. Scott Ananian wrote:
>>>
>>>> That's a good question! I'm in SFO this week, so it's probably worth setting aside a day to resync and figure out what the next steps for the new PDF renderer are.
>>>
>>> Any news (or a public test instance available)? As I wrote, I'd be interested in having a bugday on testing the new PDF renderer by going through / retesting https://bugzilla.wikimedia.org/buglist.cgi?resolution=---&component=Collection
>>>
>>> Thanks,
>>> andre
>>> -- Andre Klapper | Wikimedia Bugwrangler http://blogs.gnome.org/aklapper/
Re: [Wikitech-l] Tech Talk: Unit testing for MediaWiki projects
Very timely: the good people at Atomic Object have posted two articles about what makes code untestable, and I think they're good:

This Code Is Untestable! (Part 1, for Managers) http://spin.atomicobject.com/2014/04/28/untestable-code-unit-tests/
This Code Is Untestable! (Part 2, for Developers) http://spin.atomicobject.com/2014/04/29/code-untestable-part-2-developers/

On Tue, Apr 29, 2014 at 12:47 AM, Erik Moeller e...@wikimedia.org wrote:

> As a reminder, this is happening at 12 PM PDT / 19:00 UTC tomorrow (Tuesday): https://plus.google.com/events/cae6ng1m9o4mhdbpo10u5v05bvg
>
> We're going to talk about various strategies for automated testing and improvements to our continuous integration infrastructure. Antoine 'hashar' Musso has offered to give an overview, roughly along these lines:
>
> - quick overview of the infrastructure (Zuul/Jenkins, the slaves, the myriad of jobs and how they are maintained)
> - MediaWiki testing frameworks and tools (phpunit, qunit, browser tests, beta cluster)
> - current concerns in what we test, which should provide enough material for the open discussion part:
>   - lack of cross-repository tests and how to handle dependencies
>   - repositories that are barely tested yet critical
>   - mw/core tests mixing unit and integration tests
>   - lack of mocking
>   - very thin code coverage
>
> This will be followed by an open conversation about improvement strategies. The session is scheduled to take about an hour total. Hope to see you there :)
>
> Erik
> -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] REST and SOA within MediaWiki - is my understanding right?
On Wed, Apr 16, 2014 at 11:43 AM, Terry Chay tc...@wikimedia.org wrote:

> Again, I feel a more important aspect of REST is that the interface is extremely narrow: basically a representation of a resource (URL) and a set of 4 CRUD commands (create, read, update, delete = POST, GET, PUT, DELETE). The fact that each resource is independent and each action is stateless allows it to be highly scalable. But you are correct that REST has the advantage over non-RESTful APIs in that the access language is defined naturally in the protocol, rather than by convention.

I'd like to point out that the REST API for the Socialtext wiki is very, very well designed: https://www.socialtext.net/st-rest-docs/

Every action possible in the Socialtext wiki UI is also possible via a call to a REST endpoint, and the REST endpoints are simple to manipulate. The entire application is a thin presentation layer on top of a killer wiki engine, all driven via the REST API. (Or it was when I was last involved with it, and given that Audrey Tang was the last to update the API docs, I'd guess that is still true.) To the best of my knowledge this API is still in use, although there is no longer an open source version of the Socialtext wiki available. (Although if anyone really wants a Socialtext wiki to experiment with, I think I could get you one.)

-Chris
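Terry's point about the narrowness of the interface can be sketched concretely (a toy in-memory example with made-up paths; not the Socialtext or MediaWiki API): every operation reduces to one of four verbs applied to a resource URL, and nothing else.

```python
# Minimal in-memory sketch of the narrow REST interface described above:
# one resource namespace, four CRUD verbs, no other surface area.
# Hypothetical example; not any real wiki API.

pages = {}  # resource store: URL path -> representation

def rest(method, path, body=None):
    """Dispatch one stateless request; returns (status_code, body)."""
    if method == "POST":              # create
        pages[path] = body
        return 201, body
    if method == "GET":               # read
        return (200, pages[path]) if path in pages else (404, None)
    if method == "PUT":               # update (idempotent)
        pages[path] = body
        return 200, body
    if method == "DELETE":            # delete
        return 204, pages.pop(path, None)
    return 405, None

rest("POST", "/pages/Sandbox", "hello")
print(rest("GET", "/pages/Sandbox"))   # (200, 'hello')
rest("DELETE", "/pages/Sandbox")
print(rest("GET", "/pages/Sandbox"))   # (404, None)
```

Because every action is a stateless verb on an independent resource, any request can be routed to any server -- which is the scalability property Terry mentions.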
Re: [Wikitech-l] Gerrit Commit Wars
On Sat, Mar 8, 2014 at 8:05 PM, Tyler Romeo tylerro...@gmail.com wrote:

> On Sat, Mar 8, 2014 at 9:48 PM, Ryan Lane rlan...@gmail.com wrote:
>
>> OK, then how did this change get deployed if it broke tests?

The problem is not that the change broke tests. The problem is that the change broke account creation for Mobile view on a production wiki. Let me repeat: the change broke account creation on a production wiki.

It was the tests that let us know that account creation was broken. That is what good tests are supposed to do.

-Chris
Re: [Wikitech-l] Gerrit Commit Wars
On Mon, Mar 10, 2014 at 10:59 AM, Tyler Romeo tylerro...@gmail.com wrote:

> It's been repeated multiple times, but I'll say it again: it is disputed as to whether account creation was broken.

It is not disputed. When you get to the end of the account creation process and you do not have an account, that is broken. If you manage to create an account anyway but the critical informative messages for new users do not appear, that is also broken. Again, this is on a production wiki for all mobile users and all non-JavaScript clients. That is broken.
Re: [Wikitech-l] CSS Regressions
http://sauceio.com/index.php/2014/03/shotsonsauce-by-jim-eisenhauer/

Now you can grab the screenshots from all the browsers and OS platforms you want using Sauce and compare them using the little-known image diff feature on GitHub. Not sure I'm going to have time to poke at this, but it seems reasonable.

-Chris

On Mon, Mar 10, 2014 at 2:04 PM, Jon Robson jdlrob...@gmail.com wrote:

> I just wondered if anyone doing MediaWiki development had any experience in catching CSS regressions? We have had a few issues recently in mobile land where we've made big CSS changes and broken buttons on hidden-away special pages - particularly now that we have been involved in the development of mediawiki ui and are moving mobile towards using it.
>
> My vision of how this might work is that we have an automated tool that visits a list of given pages on various browsers, takes screenshots of how they look, and then compares the images with the last known state. The tool checks how similar the images are and complains if they are not the same - this might be a comment on the Gerrit patch or an e-mail saying something user-friendly like "The Special:Nearby page on Vector looks different from how it used to. Please check everything is okay."
>
> This would catch a host of issues and prevent a lot of CSS regression bugs. Any experience in catching this sort of thing? Any ideas on how we could make this happen?
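The comparison step Jon describes can be sketched without any imaging library (a toy example under stated assumptions: screenshots modeled as 2D lists of pixel values, an arbitrary 1% threshold; a real tool would decode PNG screenshots with something like Pillow or ImageMagick): flag any page whose fraction of differing pixels exceeds a threshold.

```python
# Toy sketch of screenshot-based CSS regression detection. Images are
# modeled as same-sized 2D lists of pixel values; threshold and page
# names are made up for illustration.

def diff_ratio(baseline, current):
    """Fraction of pixels that differ between two same-sized images."""
    total = sum(len(row) for row in baseline)
    differing = sum(
        1
        for row_a, row_b in zip(baseline, current)
        for a, b in zip(row_a, row_b)
        if a != b
    )
    return differing / total

def check_page(name, baseline, current, threshold=0.01):
    """Return a user-friendly warning when the page changed visibly."""
    ratio = diff_ratio(baseline, current)
    if ratio > threshold:
        return (f"The {name} page looks different from how it used to "
                f"({ratio:.1%} of pixels changed). Please check everything is okay.")
    return None

same = [[0, 0], [0, 0]]
moved = [[0, 1], [0, 0]]
print(check_page("Special:Nearby on Vector", same, same))    # None
print(check_page("Special:Nearby on Vector", same, moved))
```

A per-page threshold matters in practice: anti-aliasing and font rendering differ slightly across browsers, so an exact-match comparison would complain about every page.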
Re: [Wikitech-l] Multimedia team architecture update
On Thu, Mar 6, 2014 at 8:06 PM, Gergo Tisza gti...@wikimedia.org wrote:

> Hi all,
>
> == The state of unit tests ==
>
> We discussed these issues, and decided that writing the tests was still a good decision at the time, but once we are done with the major code refactorings, we should take some time to refactor the tests as well. Many of our current tests test the implementation of a class; we should replace them with ones that test the specification.

This process is something that I think would be of great interest to a variety of teams:

* When to throw away old tests
* When to create new tests (TDD style, before writing the code that satisfies the test?)
* When to refactor existing tests
* At what point do you make these decisions?

I'd like to encourage you to discuss your approaches along these lines, because I think it would be of great interest across all the WMF dev teams.

-Chris
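The distinction Gergo draws -- tests pinned to an implementation versus tests of the specification -- can be made concrete with a hypothetical example (`Thumbnailer` is made up, not Multimedia team code): the first test below breaks on any refactor, the second only when observable behavior actually changes.

```python
# Hypothetical illustration of implementation-coupled vs. specification
# tests. Thumbnailer is a made-up class, not actual MediaWiki code.

class Thumbnailer:
    def scaled_size(self, width, height, max_width):
        self._ratio = max_width / width            # internal detail
        return (max_width, round(height * self._ratio))

t = Thumbnailer()

# Implementation test: peeks at a private attribute. Renaming _ratio, or
# computing the height a different way, breaks this test even though the
# observable output is unchanged.
t.scaled_size(800, 600, 200)
assert t._ratio == 0.25

# Specification test: states only the observable contract. It survives
# any refactor that preserves behavior.
assert t.scaled_size(800, 600, 200) == (200, 150)

print("both styles pass against this implementation")
```

Replacing tests of the first kind with tests of the second kind is exactly the refactoring-the-tests step described above: the suite shrinks its coupling to the code without losing its grip on the behavior.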
Re: [Wikitech-l] Multimedia team architecture update
Also see Michael Feathers' response to Coplien via Twitter: http://michaelfeathers.typepad.com/michael_feathers_blog/2008/06/the-flawed-theo.html / https://twitter.com/mfeathers/statuses/441598005515669504 On Fri, Mar 7, 2014 at 9:53 AM, David Gerard dger...@gmail.com wrote: On 7 March 2014 15:13, Chris McMahon cmcma...@wikimedia.org wrote: This process is something that I think would be of great interest to a variety of teams: * When to throw away old tests * When to create new tests (TDD style, before writing the code that satisfies the test?) * When to refactor existing tests * At what point do you make these decisions? I'd like to encourage you to discuss your approaches along these lines, because I think it would be of great interest across all the WMF dev teams. This essay has been going around. There's a lot to it IMO. http://www.rbcs-us.com/documents/Why-Most-Unit-Testing-is-Waste.pdf tl;dr unit test is another word for assert. Now, *acceptance* tests ... - d. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Gerrit Commit Wars
In over two years at WMF I have never been involved in a discussion like this, but here goes: In this case, I think it was entirely appropriate to revert immediately and pick up the pieces later. The source of the code is immaterial, if Tim Starling or Brion Vibber had merged this we would have done exactly the same thing. As Steven noted, the immediate issue was that it created a serious problem with the mobile account creation process. This blocked our ability to test other aspects of mobile account creation and login that have changed recently. And, since this occurred on Thursday morning in the run-up to the weekly deployment, we had little time to prevent this going live to production. Beyond that, there are serious concerns with any feature that a) requires javascript support in the client in order to create an account on the wiki and b) does not honor the characters that the user types in the username and password fields. I know of at least one historical instance where violating b) caused a significant problem in UniversalLanguageSelector. We prevented the ULS problem from going live to production at the time, also. -Chris On Thu, Mar 6, 2014 at 1:29 PM, Tyler Romeo tylerro...@gmail.com wrote: Hi everybody, I cannot believe I have to say something about this, but I guess it's no surprise. Wikipedia has a notorious policy against edit warring, where users are encouraged to discuss changes and achieve consensus before blindly reverting. This applies even more so to Gerrit, since changes to software have a lot bigger effect. 
Here's a nice example: https://gerrit.wikimedia.org/r/114400 https://gerrit.wikimedia.org/r/117234 https://gerrit.wikimedia.org/r/117247 Some key points to note here: * The revert commit was not linked to on the original commit * The time between the revert patch being uploaded and +2ed was a mere two minutes * All the reviewers on the revert patch were also reviewers on the original patch This is unacceptable behavior, and is extremely disrespectful to the developers here. If you are going to revert a patch for reasons other than a blatant code review issue (such as a fatal error or the likes), you should *at the very least* give the original patch reviewers time to understand why the patch is being reverted and give their input on the matter. Otherwise it defeats the entire point of the code review process and Gerrit in the first place. The argument being made in this specific case is that the change broke the workflow of mobile, and that the revert was announced on mobile-l. This is not sufficient for a number of reasons: 1) not everybody is subscribed to mobile-l, so you cannot expect the original reviewers to see or know about it 2) this is an issue with MobileFrontend, not MediaWiki core 3) code being merged does not automatically cause a deployment, and if code being deployed breaks something in production, it is the operations team's job to undeploy that change Overall, the lesson to take away here is to be more communicative with other developers, especially when you are negating their changes or decisions. Thanks in advance, *-- * *Tyler Romeo* Stevens Institute of Technology, Class of 2016 Major in Computer Science ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Gerrit Commit Wars
On Thu, Mar 6, 2014 at 4:54 PM, Tyler Romeo tylerro...@gmail.com wrote: On Thu, Mar 6, 2014 at 6:34 PM, Brion Vibber bvib...@wikimedia.org wrote: Is there anything specific in the communications involved that you found was problematic, other than a failure to include a backlink in the initial revert? I think this entire thing was a big failure in basic software development and systems administration. If MobileFrontend is so tightly coupled with the desktop login form, that is a problem with MobileFrontend. In addition, the fact that a practically random code change was launched into production an hour later without so much as a test... It was in fact our automated browser test suite that alerted us that a change to some other area of the software overnight had broken some central MobileFrontend functionality. It was rather unexpected, and we moved quickly to identify the issue and revert it in the short amount of time we had before the code went to production. That's the kind of thing that gets people fired at other companies. But apparently I'm the only person that thinks this, so the WMF can feel free to do what it wants. That sort of thing is not necessary. -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Gerrit Commit Wars
On Thu, Mar 6, 2014 at 5:49 PM, OQ overlo...@gmail.com wrote: So I'm confused on the timeline here. Did the commit get merged before the testsuite found the breakage, or did the commit get merged despite the testsuite failing? The commit was merged late Wednesday. The automated tests that demonstrated the problem failed over Wednesday night and we analyzed the failures early Thursday morning, which is routine. As noted above, code committed late on Wednesday or early Thursday only resides in the test environment on beta labs for a short time before going to production wikis. We intend to improve this situation in the not too distant future, but for now that is the situation on the ground. -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Gerrit Commit Wars
On Thu, Mar 6, 2014 at 6:07 PM, OQ overlo...@gmail.com wrote: So the testsuite only runs on merged code and not pending-merge? That sounds like a large oversight. Picture in your mind every branch pending merge for every extension in gerrit. Imagine how many of those branches are eventually abandoned, imagine how many patch sets each receives, imagine how many times each gets rebased. And even if we had such tests, they would not have exposed today's issue. We run UI-level regression tests against a model of the Wikipedia cluster on beta labs running the master branch *exactly* so that we can expose cross-repo problems, configuration problems, etc. before they go to production. Today's issue was hardly unique. Just one week ago our tests picked up an entirely unrelated but similarly surprising issue that had the MobileFrontend team scrambling on a Thursday morning: https://bugzilla.wikimedia.org/show_bug.cgi?id=62004. We stop bugs *all the time* this way. This is hardly an oversight. These tests and these test environments are very carefully designed to expose exactly the kind of issues that they expose. They have saved us an extraordinary amount of pain by preventing bugs released to production. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Testing changes to the Math extension before they get live at wikipedia
On Mon, Feb 10, 2014 at 5:31 AM, Moritz Schubotz phy...@physikerwelt.de wrote: Dear all, recently some changes were merged to Wikipedia that broke some math rendering for almost 2 days. I'm highly interested in preventing this from happening again. On 27 January an automated test on beta labs identified new missing dependencies for Math: https://bugzilla.wikimedia.org/show_bug.cgi?id=60486. This was fixed. On 28 January an automated test on beta labs identified an error with Math communicating with the Parser that prevented loading any page containing a Math expression: https://bugzilla.wikimedia.org/show_bug.cgi?id=60546. This was fixed. On 29 January Physikerwelt sent a message to Antoine Musso entitled effects on caching saying please be informed that recent changes in the Math extension and core might influence the stability of large MediaWiki instances due to a change in the cache key. That message does not appear in the wikitech-l archives for January, although Physikerwelt seems to have forwarded Antoine's message there. http://lists.wikimedia.org/pipermail/wikitech-l/2014-January/ As a reaction, my goal is to develop the changes in a new branch called math2_0_0 that gets reviewed according to the WMF standards, is tested in a production-like environment, and is reviewed by the community before the changes are merged to the master branch. Beta labs is our production-like environment. It should probably be possible to use beta labs for this. However, beta does run only the master branch of each extension, but does so before the master branch of each extension is deployed to production. In this case the root cause of the error seems to have been that the message about Math's effect on caching was somehow lost. Is there a production-like environment that could be used for that? Of course I could try to create a production-like environment for Math by myself like I did with http://math-test2.instance-proxy.wmflabs.org/wiki/Main_Page ...
but I want to avoid double work and I'm a volunteer... so my time is very limited. Best Physikerwelt -- Mit freundlichen Grüßen Moritz Schubotz Telefon (Büro): +49 30 314 22784 Telefon (Privat): +49 30 488 27330 E-Mail: schub...@itp.physik.tu-berlin.de Web: http://www.physikerwelt.de Skype: Schubi87 ICQ: 200302764 Msn: mor...@schubotz.de ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Conference on Software Archaeology, London 31 January
For anyone who might be in London at the time, this looks really interesting: http://ticosa.org/ ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Module storage is coming
On Thu, Nov 7, 2013 at 8:05 AM, Jon Robson jdlrob...@gmail.com wrote: From personal experience don't touch cache manifests with a barge pole... Bear in mind the majority of browsers provide at least 5mb of local storage and we are talking about caching a few kB at most of minified JavaScript On 7 Nov 2013 00:35, Daniel Friesen dan...@nadir-seen-fire.com wrote: What I'm seeing in Chromium/Chrome is that when I invoke VisualEditor one time localStorage hits ~3.5MB immediately. Hit shift-reload a few times in quick succession and it's pretty easy to max out localStorage for Chromium at 5.xMB. Default for Chrome seems to be 10MB. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
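The quota numbers being traded here (5 MB vs. 10 MB, and what VisualEditor actually consumes) are exactly the kind of thing worth measuring rather than assuming, since browsers differ and keys count against the limit as well as values. A hypothetical sketch of that measurement technique, with FakeStorage standing in for the browser's localStorage (it is not a real browser API):

```python
class FakeStorage:
    """Stand-in for window.localStorage with a fixed byte quota."""

    def __init__(self, quota_bytes):
        self.quota = quota_bytes
        self.used = 0

    def set_item(self, key, value):
        # Real browsers raise QuotaExceededError; we approximate with
        # MemoryError and count both key and value against the quota.
        if self.used + len(key) + len(value) > self.quota:
            raise MemoryError("QuotaExceededError")
        self.used += len(key) + len(value)

def probe_capacity(storage, chunk=1024):
    """Write fixed-size chunks until the quota error fires and report
    roughly how many payload bytes fit (key overhead is ignored)."""
    written = 0
    i = 0
    while True:
        try:
            storage.set_item("probe%d" % i, "x" * chunk)
        except MemoryError:
            return written
        written += chunk
        i += 1

# A browser advertising "5 MB" holds slightly less payload once keys
# are counted, which is why measuring beats assuming.
assert probe_capacity(FakeStorage(5 * 1024 * 1024)) <= 5 * 1024 * 1024
```

In an actual browser the same probe-until-exception loop works against window.localStorage, wrapped in try/catch; the point is that the effective limit is an empirical number, not a spec guarantee.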
Re: [Wikitech-l] [Engineering] Deployment postmortem
One thing that impressed me when I started working with WMF is that reverting in production is as safe as I have ever seen in any production environment. In the 20 months or so I've been here, I think I only remember one change that left behind corrupt data in prod, and that change was made by a volunteer, the bug was manifested in beta labs but we failed to recognize the importance of the bug, and then the change to the code was merged on Thanksgiving Day by someone not on the team affected by the change-- one of those perfect storm sort of problems. We're good at reverting. On Thu, Oct 31, 2013 at 12:26 PM, Toby Negrin tneg...@wikimedia.org wrote: How easy is it to rollback production changes? Is this something that can be consistently done easily with our current tools? At other high-traffic sites I've worked at this has been an important component of production engineering. -Toby On Wed, Oct 30, 2013 at 6:12 PM, Greg Grossmeier g...@wikimedia.org wrote: First: Thanks for responding to this and writing it up. quote name=Yuri Astrakhan date=2013-10-31 time=04:53:44 +0400 == Recommendations == * Allow a bit more time between deployments and observe fatalmonitor before and after Agreed. I put a ton of blame on myself for not slowing down the cadence of LD slots when a bunch of people are trying to get in on the same day. For future LDs I am going to explicitly ask everyone to do what Yuri suggests (monitor fatals after your deploy) before saying that you're done. 5 minutes post-deploy of watching the fatalmonitor isn't unreasonable, I don't think. Relatedly, I think we should reassess the Lightning Deploys :) Not necessarily to get rid of them (probably not), but: 1) how many deploys can go in one LD? How many do we *want* to go? 2) from 1, is the length of the LD long enough/too long? 3) LD management is still pretty high-communication (Alright, who's in line? Who's up next? Are you done yet?) 
There are basic tools that can help with this (Etsy has an IRC pushbot that manages the queue mostly automatically, for instance); I'll look into those/test them. 4) probably more, aka: your thoughts? Greg PS: graph of the fatals attached, just for completeness' sake. -- | Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E | | identi.ca: @greg A18D 1138 8E47 FAC8 1C7D | ___ Engineering mailing list engineer...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/engineering ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] [QA] Welcome, Rummana Yasmeen
On Tue, Oct 29, 2013 at 4:04 PM, Rob Lanphier ro...@wikimedia.org wrote: Hi everyone, I'd like to introduce Rummana Yasmeen, a new Software Test Engineer in our QA team through April in our San Francisco office. Rummana is going to be working with our Visual Editor team primarily on manual testing, finding bugs so you don't have to. It is really great to have someone not only to scrutinize the behavior of VisualEditor in different browsers, different environments, etc. etc., but also to help out with bug triage, communicating information about current issues, and all the human side of software testing. Welcome Rummana! -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Optimizing the deployment train schedule
On Sat, Oct 19, 2013 at 1:35 PM, Antoine Musso hashar+...@free.fr wrote: Le 19/10/13 00:26, Erik Moeller a écrit : Are there other ways to optimize / issues I'm missing or misrepresenting above? Evil plan: deploy automatically on merge. But we are not ready yet :-] We're not ready-- except in the beta cluster we are. The earlier that changes are merged to the master branch, the more time we have for scrutiny of those changes in beta labs, and the deployment there is in fact all automated and hands-off. I still occasionally see code being merged to master very shortly before being deployed, which means that beta gets updated at about the same time as the test wikis, which occasionally causes surprises. -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Officially supported MediaWiki hosting service?
On Tue, Oct 1, 2013 at 7:57 PM, George Herbert george.herb...@gmail.com wrote: On Tue, Oct 1, 2013 at 1:34 PM, Ori Livneh o...@wikimedia.org wrote: Foundation could play in ensuring that MediaWiki exposes the right set of interfaces for deep integration with configuration management and cloud provisioning platforms, and ensuring that these interfaces are intuitive and well-documented. This might actually spur some innovation. Puppet, chef, salt stack, cfengine, CloudFormation, OpenStack, etc? do you mean CloudFoundry, the Pivotal thing? http://www.cloudfoundry.com/ ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] [Wikitech-ambassadors] Fwd: Deployment highlights for the week of Sept 23rd
On Sat, Sep 21, 2013 at 1:21 PM, Nikolas Everett never...@wikimedia.org wrote: On Sat, Sep 21, 2013 at 10:26 AM, Chad innocentkil...@gmail.com wrote: On Fri, Sep 20, 2013 at 11:47 PM, billinghurst billinghu...@gmail.com wrote: If you can't wait you can read the regression tests here: http://git.wikimedia.org/tree/mediawiki%2Fextensions%2FCirrusSearch.git/master/tests%2Fbrowser%2Ffeatures So nice to see what Nik has done here. Information on running these tests is in the README: http://git.wikimedia.org/blob/mediawiki%2Fextensions%2FCirrusSearch.git/a7d5386c659e0afff1bae24967b333b06f639512/tests%2Fbrowser%2FREADME ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Browser tests for extensions
When we began creating cross-browser regression tests using Selenium, we kept them all in a repository named '/qa/browsertests'. We created regression tests for features across all WMF development teams and Platform functions, from the appearance of Preferences to navigating UploadWizard to ArticleFeedback and PageTriage to MobileFrontend to UniversalLanguageSelector etc. etc. etc. We run these tests against some combinations of environments in beta labs cluster, test2wiki, and production. There will probably always be a need for a grab-bag repo of general cross-browser regression tests, but managing tests in this way has some drawbacks: * builds run for all features at the same time, regardless of the development process for each feature * failed builds can be caused by any feature, and it takes some effort to find the source of any failed build We have taken some steps recently to improve how we manage and run these tests. In particular, we have begun moving certain browser tests out of the /qa/browsertests repo and into the repos of the particular extensions or features being tested. From there, we are running individual builds for those features in the appropriate test environments. Our first model for doing this was MobileFrontend. The browser tests for MobileFrontend have resided in /MobileFrontend/tests/acceptance for some time now, and building from that repo has proven quite effective. Not long ago Nik Everett created an extensive set of browser tests for CirrusSearch and very quickly put them under /CirrusSearch/tests/browser. Work on CirrusSearch is proceeding quickly and these tests fail often for now, so it is nice to have them in a repo where they can be examined outside of the main /qa/browsertest builds. Back in May of this year we created a number of browser tests for the UniversalLanguageSelector with help from Runa Bhattacharjee and Niklas Laxström, and Amir Aharoni has been doing work on those in the time since. 
Recently at the request of the Language team we moved the ULS browser tests to the repo at UniversalLanguageSelector/tests/browser, and we're running those in their own build as well. Finally, we have been creating some browser tests for VisualEditor that will be particularly useful for cross-browser acceptance testing and regression testing. Of course the tests are valuable right now as well; they help us identify important issues like https://bugzilla.wikimedia.org/show_bug.cgi?id=53360. These tests today reside under VisualEditor/modules/ve-mw/test/browser. Rachel Thomas, our intern with the Outreach Program for Women, has contributed a number of VisualEditor browser tests. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] [GSoC] IMPORTANT: Mentor Midterm Evaluations 29 July - 2 August
Just confirming that nothing is required for OPW and that this is only for GSoC -Chris On Tue, Jul 23, 2013 at 7:00 AM, Quim Gil q...@wikimedia.org wrote: GSoC mentors: read this email through and submit your mid-term evaluation between 29 July - 2 August in Google Melange. One per project is enough. If the primary mentor can't make it then the secondary mentor needs to step in. VERY IMPORTANT Missing this deadline means the cancellation of your student's GSoC project. It also means damage points for Wikimedia in the current and future GSoC editions. You have the questions of the evaluation below, and you can start preparing it now. Please add Mid-term evaluation: WIP in the table at https://www.mediawiki.org/wiki/Summer_of_Code_2013 now (so your student and the org admins know that you are on it). After submitting it change that line with Mid-term evaluation: OK. Students: do not hesitate pinging your mentors about this evaluation if you don't hear from them. Also remember that your monthly reports are expected during next week. https://www.mediawiki.org/wiki/Summer_of_Code_2013#Reporting Thank you to everybody! PS: fwiw I'm starting a week of holidays tomorrow and then I will be flying to Hong Kong for Wikimania on Aug 1-2. If you need an org admin Lydia Pintscher will be able to help you. 
Original Message Subject: [GSoC Mentors Announce] 2013 Google Summer of Code Mentor Midterm Evaluations 29 July - 2 August Date: Sun, 21 Jul 2013 13:49:11 -0700 From: Carol Smith car...@google.com Reply-To: gsoc-mentors-announce+owners@googlegroups.com To: GSoC Mentors Announce gsoc-mentors-announce@googlegroups.com Hi GSoC 2013 Mentors and Org Admins, Per the program timeline [0], starting on Monday, 29 July you will be able to submit an evaluation of your student(s)' progress on their projects thus far. Here's some important info on midterm evaluations for those not familiar: Midterm evaluations are done via Melange [1]. Starting at 19:00 UTC on 29 July you will be able to submit an evaluation for your student(s). You can find the evaluation links on your dashboard under 'Evaluations', one for each student you are a mentor (or co-mentor) for. If you are curious about who can see evaluations after they are submitted, please check out the FAQ on Evaluations [2]. I have also pre-published the evaluation questions below in this email so you can prepare. The deadline is 19:00 UTC on Friday, 2 August. You may not submit your evaluation before or after the evaluation window. Please ask your org admin to submit your evaluation for you if you absolutely cannot submit yours during the time allotted. Primary mentors, co-mentors, and org admins may all submit evaluations of their students. **Students must have an evaluation on file from both themselves *and* their mentors in order to receive their midterm payments.** There is no excuse for missing the submission of a student's evaluation. You must submit an evaluation for every student you are the primary mentor for this year. You must fill out the entire survey in one session as there's no auto-save in Melange. You may submit, modify, and resubmit an evaluation as many times as you choose up until the deadline. 
Please note that failing a student at the midterm evaluation means *this student is immediately removed from the program.* There is no way to fail a student at the midterm and have the student continue with the program to try to make it up at the final. If your student fails, your student is out. You might find the FLOSS manual on GSoC evaluations [3] helpful as well. There's some excellent wisdom in there from your fellow mentors and org admins on the evaluation process. Finally, a reminder: This year we will not allow any mentor who misses an evaluation deadline to attend the mentor summit (assuming no one else submits the evaluation on the mentor's behalf before the deadline either). Also, any org that misses 2 or more evaluation deadlines (for the midterm, final, or midterm and final combined) will not be invited to attend the mentor summit this year. Please let me know directly if you have questions or concerns. [0] - http://www.google-melange.com/gsoc/events/google/gsoc2013 [1] - http://google-melange.com/ [2] - http://www.google-melange.com/gsoc/document/show/gsoc_program/google/gsoc2013/help_page#9._How_do_evaluations_work [3] -
Re: [Wikitech-l] Remove 'visualeditor-enable' from $wgHiddenPrefs
On Tue, Jul 23, 2013 at 1:42 PM, Roan Kattouw roan.katt...@gmail.com wrote: On Tue, Jul 23, 2013 at 12:44 PM, Daniel Barrett d...@vistaprint.com wrote: Risker asks: Of course those nowiki tags weren't added by the editors, VE doesn't let you do that directly. What I think Robert was talking about (thanks for that analysis, BTW!) is edits where the user typed something like [[Foo]] into the editor, and Parsoid, in order to achieve a truthful rendering of [[Foo]], wraps it in nowiki tags. That's a case of we did what the user asked us to do, although the user probably didn't mean that. ... These kinds of technically-correct-but-probably-not-what-you-wanted issues are a bit tricky. They're on our list of things to deal with though. FWIW, the WYSIWYG editor for the Socialtext wiki handles these cases well, and has since at least 2007 or so. I think Wikia is (or was) using some derivative of this Wikiwyg editor. Regardless, it might be a useful oracle for VE behavior. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Search documentation
On Mon, Jun 17, 2013 at 1:28 PM, S Page sp...@wikimedia.org wrote: * enwiki says Hello dolly in quotes gives different results, mw directly contradicts this. Even on my local wiki, quotes make a difference. * enwiki disagrees with itself what a dash in front of a word does. I did some research a few weeks ago on the current state of Search and there are a number of discrepancies between the documentation and actual behavior. Some of them have BZ tickets, like https://bugzilla.wikimedia.org/show_bug.cgi?id=44238 -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Features vs. Internet Explorers
In recent times QA has expanded our automated cross-browser testing: we re-organized our builds, pointed the tests to beta labs wikis as well as test2wiki, and we've written a number of new tests. In the course of that a lot of our builds for Internet Explorer versions began to fail. I've just cleaned up most of the build failures and discovered what I think are some interesting facts about support for older versions of Internet Explorer across the set of WMF features.

Not supported in IE6:
* AFTv5: by design
* VisualEditor: by design
* UniversalLanguageSelector: by design
* Interlanguage Add links feature: known issue https://bugzilla.wikimedia.org/show_bug.cgi?id=49139
* PageTriage: by design
* PDF export: broken, https://bugzilla.wikimedia.org/show_bug.cgi?id=49485
* Page editing: degraded UI
* ACUX: display garbled but functions, known issue

Not supported in IE7:
* VisualEditor: by design
* UniversalLanguageSelector: by design
* Interlanguage Add links feature: known issue https://bugzilla.wikimedia.org/show_bug.cgi?id=49139
* PageTriage: by design
* PDF export: broken, https://bugzilla.wikimedia.org/show_bug.cgi?id=49485

Not supported in IE8:
* VisualEditor: by design
* UniversalLanguageSelector: minor issue https://bugzilla.wikimedia.org/show_bug.cgi?id=49447
* Interlanguage Add links feature: known issue https://bugzilla.wikimedia.org/show_bug.cgi?id=49139

Not supported in IE9:
* AFTv5: broken for now, https://bugzilla.wikimedia.org/show_bug.cgi?id=49445 (AFTv5 has a history of IE9-only issues)
* VisualEditor: broken for now, https://bugzilla.wikimedia.org/show_bug.cgi?id=49187
* Interlanguage Add links feature: known issue https://bugzilla.wikimedia.org/show_bug.cgi?id=49139

One other interesting note: we have an effective test for GuidedTour (it has turned up regression bugs) that runs properly across all the browsers, so thanks E3 team. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] QA: new mail list, update from Amsterdam, etc.
Because of a surge of interest in recent times, we have started a mail list for QA activities connected with WMF projects. If this interests you, feel free to sign up at https://lists.wikimedia.org/mailman/listinfo/qa. The list is completely open, so also feel free to mention it to anyone else for whom this sort of discussion might be of interest. Below is a sample of the sort of things we will discuss on the QA list: OPW: Congratulations Rachel Thomas (rachel99) on being selected to work on browser test automation with us as part of the Gnome FOSS Outreach Program for Women. Rachel has been a volunteer for some time now and despite starting from scratch with gerrit and Ruby/Cucumber/PageObject has already made valuable contributions to the browser tests. We're really looking forward to working with Rachel over the summer. Amsterdam Hackathon: While Željko and Chris have met any number of times before, this was our first meeting outside of the USA. Together we cleaned up many failing builds at https://wmf.ci.cloudbees.com/; we implemented some tests for the Universal Language Selector; and we gave a presentation on the current state of the browser tests to a number of interested (and interesting!) people. In the course of making the builds green, we: * updated the Guided Tour test to reflect the latest behavior of that feature * repaired a bogus error message from parallel_cucumber causing false build failure reports * adjusted the target test environments among beta labs enwiki and commons, test2wiki, and production to yield optimum coverage with a minimum of red builds. We also extended some test coverage: * checked in a test for Appearances and Datetime Preferences * checked in an interim fix for sidebar expand/collapse tests while we explore setting cookies in IE * wrote a new test for Universal Language Selector based on some work from Runa Bhattacharjee of the Language team, and planned for more ULS tests very soon. 
(Fixing https://bugzilla.wikimedia.org/show_bug.cgi?id=45958 will be helpful) In the very near future we'll be working with tests for VisualEditor as well, which continues to have some interesting bugs: https://bugzilla.wikimedia.org/show_bug.cgi?id=48166. Bug 48166 and a few others were identified as part of a community test exercise at the Telerik Test Summit peer conference in Austin TX not long ago. We are also looking forward to using YuviPanda's new github-gerrit integration. Again, we extend an invitation to anyone interested in testing, test automation, and QA activities in general to join the mail list at https://lists.wikimedia.org/mailman/listinfo/qa -Chris McMahon QA Lead for WMF ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] New developments in automated testing, including mobile browsers apps
On Thu, May 9, 2013 at 8:13 PM, Sumana Harihareswara suma...@wikimedia.org wrote: https://lwn.net/Articles/548910/ A summary of talks at a recent conference on test automation, with a bunch of links for people who want to follow up and watch videos. Chris McMahon was at this conference and may have more specific "we should do foo" recommendations. :)

I was at GTAC in 2007 and 2009 as well. This year's conference seemed to me to indicate that this sort of test automation is in something of a holding pattern right now, with a strong emphasis on existing, accepted practice in mobile testing. All of the videos of the GTAC talks are available here: http://www.youtube.com/playlist?list=PLSIUOFhnxEiCODb8XQB-RUQ0RGNZ2yW7d

I found #5 and #16 the most interesting: #5 because it explains how Expedia chose exactly the same set of browser automation tools that we at WMF did, for very similar reasons, with similar results. It's always nice to find someone else thinking the same way. #16 talks about the architecture of Appium, the open source mobile test framework under development from Sauce Labs. To my mind, Appium is the most interesting approach to automated mobile testing right now. Sauce Labs intends to offer Appium tests as a service Real Soon Now, which we intend to take advantage of. And if you follow such things: in some distant future time I believe that Sauce intends to have Appium control actual physical robots with moving arms and such, poking real devices in 3D space. But that is no time soon, and not part of GTAC... ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] OPW Browser Test Automation - Proposal Summary
On Tue, Apr 30, 2013 at 11:31 AM, Matthew Flaschen mflasc...@wikimedia.org wrote: On 04/30/2013 08:25 AM, Indrani Sen wrote: Scope: To do functionality testing of selected features (i.e. Print/Export PDF option, etc.) of the website on selected browsers (i.e. IE, FF, Chrome, etc.). This has nothing to do with performance testing or load testing of the website. Thank you for your proposal. I think the scope should be more specific. There are of course a vast number of MediaWiki features, including many in extensions. Some of these already have browser tests. Many do not. And quite a number of ongoing projects have no need of browser tests.

Your proposal should try to give a clear idea of what tests (which extensions, which parts of MediaWiki core, etc.) you will work on, as well as any constraints or obstacles that you might encounter in the course of such tests. For example, you mention exporting PDFs. This requires access not only to the browsers, but also to the underlying file system, which changes according to where the test is run (Windows, Linux, Mac, FAT, ext3, etc.). We have solved this problem for testing UploadWizard in multiple browsers, and work on exporting PDFs is already beginning.

You mention test data. Where will this test data reside? Again, we have in place a somewhat complex arrangement of shared public test environments right now. How does your proposal fit in? Have you seen the reporting we already have in place in Jenkins? -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Support for 3D content
On Fri, Apr 19, 2013 at 6:03 AM, Mathieu Stumpf psychosl...@culture-libre.org wrote: Hi, Reading the 2012-13 Plan, I see that multimedia is one of the key activities for MediaWiki. So I was wondering if there was already any plan to integrate 3D model viewers, which would be very interesting for, for example, anatomy articles or simply 3D maths objects. I know that work on PDBHandler is ongoing: http://www.mediawiki.org/wiki/Extension:PDBHandler ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] browser tests now targeting beta cluster
Over the last week Željko and I changed the QA automated browser tests to target test environments on the beta cluster as well as on test2wiki, where they had been running for some time. This was possible because of a number of improvements made to beta since the QA Quarterly Review and the f2f meeting in SF in February:
* automatic updates to the beta database
* enabling Search on beta
* Varnish on beta to support MobileFrontend
* etc.

There is still more to do:
* get more extensions running and configured on beta
* change settings to more closely match prod and/or test2 as appropriate
* moar tests

Automated browser tests have a proven record of exposing real issues in the test2wiki environment before code is put in production. Here are some of the bugs we have identified by way of browser tests so far: https://bugzilla.wikimedia.org/buglist.cgi?bug_status=UNCONFIRMED&bug_status=NEW&bug_status=ASSIGNED&bug_status=REOPENED&email1=cmcmahon%40wikimedia.org&emailtype1=exact&emailassigned_to1=1&emailreporter1=1&list_id=195485

By running these tests on the beta cluster as well, we gain both better test coverage of features in disparate environments and a longer window in which to identify issues, since beta is updated much more often than test2wiki. As always, the current status of the browser test builds is available in Jenkins at https://wmf.ci.cloudbees.com, and any changes to the status of those builds are reported by wmf-jenkins-bot in #wikimedia-dev ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
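Pointing one suite at beta, test2wiki, or production is mostly a matter of parameterizing the target host. A minimal sketch of the idea in Ruby (the environment-variable names and URLs here are assumptions for illustration, not necessarily what our Jenkins jobs actually use):

```ruby
# Pick the wiki under test from the environment, so the same suite can run
# against beta, test2, or any ad-hoc wiki from separate Jenkins jobs.
# MEDIAWIKI_URL and TARGET are hypothetical variable names.
TARGETS = {
  'beta'  => 'http://en.wikipedia.beta.wmflabs.org',
  'test2' => 'http://test2.wikipedia.org',
}

def target_url(env = ENV)
  # An explicit URL always wins, e.g. for a developer's local wiki.
  explicit = env['MEDIAWIKI_URL']
  return explicit if explicit && !explicit.empty?
  # Otherwise map a short target name to a known environment.
  TARGETS.fetch(env.fetch('TARGET', 'test2'))
end

puts target_url({ 'TARGET' => 'beta' })
puts target_url({ 'MEDIAWIKI_URL' => 'http://localhost/wiki' })
```

A Jenkins job then only differs in the environment it exports, which is what lets the same builds stay green against both beta and test2wiki.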
Re: [Wikitech-l] Missing project ideas for GSOC
On Fri, Mar 22, 2013 at 6:55 AM, K. Peachey p858sn...@gmail.com wrote: On Fri, Mar 22, 2013 at 10:30 PM, Željko Filipin zfili...@wikimedia.org wrote: Most of these projects seem to be extension (and PHP?) centric. Can we have more diversity? Browser test automation? Not an extension, not in PHP, but in Ruby[2]. Ops don't want [any more] Ruby on the clusters, so I wouldn't suggest that. We should be focusing on stuff that is achievable and that can easily be shown to benefit our users by actually getting it out there (we have a shocking record for this).

It's not on the cluster: we manage these tests in gerrit[0] and run them completely in the open on a hosted service, https://wmf.ci.cloudbees.com/. This is after many long discussions with various ops folks over the past year. These tests consistently find regression problems[1], and this week alone found regression issues with PageTriage and GuidedTour. It is achieved, it is demonstrably of benefit, it is definitely out there.

[0] https://gerrit.wikimedia.org/r/#/admin/projects/qa/browsertests
[1] http://www.mediawiki.org/wiki/QA/Browser_testing/Examples
___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Mediawiki branches and deployments
On Tue, Mar 12, 2013 at 2:21 PM, Matthew Flaschen mflasc...@wikimedia.org wrote: In the case of the GettingStarted extension v2, we used a feature branch because it was a user-facing change that took a few weeks to get ready for deployment. Because it was user-facing, there were interactions, and we wanted to test it, we wanted to do it in one shot.

If there were a convenient mechanism to do it, this is a great example of something I would like to see enabled on beta labs during development but disabled in production until ready. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Do we need to change the MW release process to better involve the non-WMF community?
On Fri, Feb 22, 2013 at 7:06 PM, Matthew Flaschen mflasc...@wikimedia.org wrote: On 02/22/2013 09:38 PM, Chad wrote: So, I've seen this site tossed around quite a bit recently, and I'm curious: is there any plan to start integrating this jenkins and our other jenkins? Depends on what you mean by "integrate". Right now the sweet spot for the browser tests shown at https://wmf.ci.cloudbees.com/ is to track the deployment schedule over individual code commits and to target integrated institutional test environments like test2wiki and the beta cluster, while https://integration.mediawiki.org/ci/ mostly targets unit-type tests run on the Jenkins host itself. There are a lot of builds there right now already. In the longer term we want to have browser tests targeting more specialized test environments and more granular code commits. There are lots of ways that Jenkins instances can share data, so when that sort of activity comes along we'll figure out the details at that time.

More importantly: is there any chance to get the results of these sorts of tests in Gerrit? I think it's great that we're expanding test coverage, but without feedback on people's patches they're usually unaware that they're breaking things.

As of today, browser test status changes are being reported to #wikimedia-dev by a bot named wmf-jenkins-bot, e.g.: (09:30:18 AM) wmf-jenkins-bot-: Project _debug-irc build #17: SUCCESS in 90 ms: https://wmf.ci.cloudbees.com/job/_debug-irc/17/ Integration with Gerrit as well as Jenkins is certainly feasible, and as the information provided by these tests becomes more closely tied to the code itself rather than to the environments in which the code is deployed, we can put that integration in place as it becomes valuable. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Do we need to change the MW release process to better involve the non-WMF community?
On Thu, Feb 21, 2013 at 2:01 PM, Tyler Romeo tylerro...@gmail.com wrote: I'd like to see MediaWiki gain a more stable release process as well. I think some of the primary things that we're lacking are: - Where is QA? I mean, I know somewhere somebody is probably doing some sort of testing, but having worked as a QA engineer I haven't seen anything in MW that would resemble proper and traditional testing (excluding the unit testing). Where's the list of test cases that need to be performed for each release? How can one make new test cases and add them? etc. Maybe this already exists, but if it does it's definitely not documented well enough.

Two answers, possibly oversimplified: First, supporting MediaWiki for third parties has not been a priority for WMF in recent times, so QA efforts have been focused elsewhere. Second, volunteer QA testing in general *is* a priority for WMF right now, so a volunteer QA effort to test MediaWiki releases would be a likely candidate for WMF support. This sort of project would fall naturally into the effort we're calling Features Testing, and we're looking to support that sort of project by way of a Group: http://www.mediawiki.org/wiki/Groups/Proposals/Features_testing -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Maria DB
On Wed, Feb 13, 2013 at 6:19 AM, Petr Bena benap...@gmail.com wrote: Okay, so what is the outcome? Should we migrate the beta cluster? Are we going to use it in production? At the risk of derailing the conversation to an unrelated subject, I would rather work on finding a way to keep the db on the beta cluster up to date than migrate to a whole different SQL implementation while the db is still not correct. https://bugzilla.wikimedia.org/show_bug.cgi?id=36228 -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Help us test the latest Article Feedback Tool features
Links and details, along with updates as they happen, are available on the testing page at http://www.mediawiki.org/wiki/Article_feedback/Version_5/Testing_Feb_2013

The WMF Editor Engagement team has been updating the Article Feedback Tool, improving both the back-end architecture and the user experience in preparation for further deployments of AFTv5 to the French and German Wikipedias. For the week of 11 February 2013, the Editor Engagement team welcomes any reports of issues or problems with AFTv5. We have arranged a dedicated test environment for the latest version of AFTv5, and we can fully support any testers from the community who care to contribute their time and expertise to examining the new features.

The latest AFTv5 is fully implemented on the host at http://ee-prototype.wmflabs.org/wiki/Main_Page, in particular the page at http://ee-prototype.wmflabs.org/wiki/Special:ArticleFeedbackv5/Golden-crowned_Sparrow

Detailed descriptions of the new features are as always available in the AFTv5 documentation (http://www.mediawiki.org/wiki/Article_feedback/Version_5/Feature_Requirements#Features_under_consideration), but a brief overview of features to be tested includes:

* Link for editors to see feedback on their articles (http://www.mediawiki.org/wiki/Article_feedback/Version_5/Feature_Requirements#Feedback_link_on_article_pages) - provides an easy way to see current feedback for each article
* Simplified moderation tools (http://www.mediawiki.org/wiki/Article_feedback/Version_5/Feature_Requirements#Simpler_moderation_tools) - new page elements to make article moderation more intuitive
* More filters for the feedback page (http://www.mediawiki.org/wiki/Article_feedback/Version_5/Feature_Requirements#Feedback_page_filters) - allow more configurable views of article feedback
* Streamlined editor tool set (http://www.mediawiki.org/wiki/Article_feedback/Version_5/Feature_Requirements#Remove_reader_tools_for_editors) - remove reader tools from the editor's view of article feedback
* More abuse filters (http://www.mediawiki.org/wiki/Article_feedback/Version_5/Feature_Requirements#More_abuse_filters) - increase signal and reduce noise in feedback with abuse filters
* Satisfaction rating (http://www.mediawiki.org/wiki/Article_feedback/Version_5/Feature_Requirements#Satisfaction_Rating) - present an overall view of aggregate feedback per article
* Post without comment (http://www.mediawiki.org/wiki/Article_feedback/Version_5/Feature_Requirements#Post_without_comment) - support simple approval/disapproval without requiring text input

Some of these new features may be tested by the casual user, but some features require special accounts or privileges in order to exercise them. Testers who require special privileges to examine the more sophisticated aspects of AFTv5 are encouraged to reply to this message, update the Talk page for the test plan, or contact the Editor Engagement team directly for help.

Reporting Issues: use this link (https://bugzilla.wikimedia.org/enter_bug.cgi?product=MediaWiki%20extensions&component=ArticleFeedbackv5&keywords=aftv5-1.5) to access an input screen for Bugzilla with the appropriate first fields filled automatically:
- Product: MediaWiki extensions
- Component: ArticleFeedbackv5
- Keywords: aftv5-1.5
___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] testable javascript tutorial
Given the JavaScript LevelUp Bootcamp sessions coming up, I thought this would be of interest: https://shanetomlinson.com/2013/testing-javascript-frontend-part-1-anti-patterns-and-fixes/ ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] FOSDEM presentation - feedback welcome
Still working on details & credits. Your feedback is welcome! It would be great if you could mention browser test automation explicitly on page 4, especially since Željko is giving a talk on the subject at FOSDEM on Saturday: https://fosdem.org/2013/schedule/event/testing_mediawiki/ ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] A testing bug management wheel
We in QA discussed some possibilities for the browser test automation community activities, and we suggest that the first couple of community events be educational. In particular, we think it would be beneficial to start with some introductory topics to be presented as a hangout + IRC chat + documentation on the wiki. Our suggestions for the first two events:
* how anyone can write a Test Scenario to be automated (and why this is important!)
* how to read, understand, and analyze results in the Jenkins system we have for browser automation
-Chris

On Wed, Jan 23, 2013 at 2:56 PM, Quim Gil q...@wikimedia.org wrote: This proposal got a basic agreement and is being implemented at https://www.mediawiki.org/wiki/QA/Weekly_goals A rough start is expected in the first iteration of the four areas, but we hope to have improvements every week. Get involved! Development teams: your proposals for testing & bug management weekly goals are welcome. On 01/16/2013 02:25 PM, Quim Gil wrote: There are ongoing separate discussions about the best way to organize testing sprints and bug days. The more we talk and the more we delay the beginning of continuous activities, the more I believe the solution is common to both: smaller activities, more frequently. Each one of them less ambitious but more precise. Not requiring by default the involvement of developer teams. Especially not requiring the involvement of WMF dev teams. Of course we want to work together with development teams! But just not wait for them. They tend to be busy, willing and at the same time unwilling (a problem we need to solve, but not necessarily before starting a routine of testing and bug management activities). If a dev team (WMF or not) wants to have dedicated testing and bug management activities we will give them the top priority.
Imagine this wheel:
Week 1: manual testing (Chris)
Week 2: fresh bugs (Andre)
Week 3: browser testing (Željko)
Week 4: rotten bugs (Valerie)
All the better if there is a certain correlation between testing and bugs activities, but no problem if there is none. From the point of view of the week coordinators this is how a cycle would look:
Week 1: decide the goal of the next activity.
Weeks 2-3: prepare documentation, recruit participants.
Week 4: DIY activities start. Support via IRC & mailing list. Group sprint on Wed/Thu. DIY activities continue.
Week 4+1: evaluation of results. Goal of the next activity.
During the group sprints there would be secondary DIY tasks for those happy to participate but not fond of the main goal of the week. If one group needs more than one activity per month they can start overflowing into the following week, resulting in simultaneous testing & bugs activities. Compared to the current situation, this wheel looks powerful and at the same time relatively easy to set up. There will be plenty of things to improve and fine-tune, but probably none of them will require stopping the wheel. What do you think?
-- Quim Gil Technical Contributor Coordinator @ Wikimedia Foundation http://www.mediawiki.org/wiki/User:Qgil ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
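For the first suggested topic above, writing a Test Scenario ready to be automated, a hypothetical Cucumber scenario in the Given/When/Then style our browser tests use gives the flavor (feature and step names here are invented for illustration, not taken from the real suite):

```gherkin
Feature: Log in

  Scenario: Main page shows a personal link after login
    Given I am at the main page
    When I log in with valid credentials
    Then I should see a link to my user page
```

The point of the event would be exactly this: anyone who can describe a feature's behavior in this form has written an automatable scenario, no programming required; turning the steps into Ruby step definitions is a separate job.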
Re: [Wikitech-l] A testing bug management wheel
* how to read, understand and analyze results in the Jenkins system we have for browser automation

A good proposal, booked for the week starting on Mar 11. Please help define what the practice could be, the actual contribution made by participants at the end of the week.

It's right here if you want to take a look: https://wmf.ci.cloudbees.com/ -C ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] A testing bug management wheel
On Wed, Jan 16, 2013 at 3:25 PM, Quim Gil q...@wikimedia.org wrote: All the better if there is a certain correlation between testing and bugs activities, but no problem if there is none.

I'm glad you mentioned this; it's something I'd like to bring up with Andre and Valerie. Note that much of the backlog for automated tests is the result of fixed BZ tickets: http://www.mediawiki.org/wiki/Qa/test_backlog. Fixed bugs are great candidates for regression tests because a) what broke once is more likely to break again and b) an issue fixed may indicate more issues in nearby areas of the feature. Our UploadWizard test is a great example of a single test catching multiple issues in the same area over time. So a mechanism by which fixed browser bugs become entered into the automated browser test backlog would be a fine thing.

Compared to the current situation, this wheel looks powerful and at the same time relatively easy to set up. There will be plenty of things to improve and fine-tune, but probably none of them will require stopping the wheel. What do you think?

How would this affect the notion of Groups? http://www.mediawiki.org/wiki/Groups/Proposals ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] A testing bug management wheel
Compared to the current situation, this wheel looks powerful and at the same time relatively easy to set up. There will be plenty of things to improve and fine-tune, but probably none of them will require stopping the wheel. What do you think?

Our objective here is to foster a community interested in participating in bug and testing projects. You've described one way we might create some projects, but I'd like to know more about your ideas for communicating with, creating, and supporting the communities for such projects. What makes the wheel valuable to such a community, and how do they know? ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Language features testing sprint (was Re: A testing bug management wheel)
They are ready to start. Next week.

Keep in mind that we're migrating data centers next week and all the Wikipedias will be subject to intermittent read-only access and possibly other issues. Hopefully we'll be stable by Thursday.

VisualEditor looks like the primary goal, having Milkshake as a secondary option for whoever feels more interested.

To the best of my knowledge the only publicly accessible page for VE exists at http://www.mediawiki.org/wiki/VisualEditor:Test, and I believe one might have to have special rights to edit that page.

Of course we can do further outreach, but the Wikipedias alone should already provide the critical mass of contributors, right?

By what mechanism?

Then we need to define the right environment for testing. Is it a fresh install in Labs? Something else?

I would love to see VE widely enabled in beta labs. I suspect that is a non-trivial project.

The sprint could be on Thursday, starting at Asia-friendly times since this is where most of the potential testers will be based.

Seems risky to me. Others might know differently. -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Language features testing sprint (was Re: A testing bug management wheel)
That is a good point. But even more important is to decide what the testing environment is.

Thanks Isarra, I hadn't known VE was an option on enwp now, nor did I know about http://ve-change-marking.instance-proxy.wmflabs.org/wiki/Main_Page

If next week is too soon and the datacenter migration complicates things, then we should be able to do this the following week. I hope there wouldn't be any reason to delay further. And that would fit the slot of Jan 30 that was left empty by Echo.

I think the last week of January would be less risky, in terms of both stable access and adequate preparation.

I actually make a pretty awesome paella; I'll give you the recipe if you like. One time though I served it to my in-laws and messed up the rice, and no one ever wanted to eat it again.

I hope I don't seem too negative. I have done a few of these test events, and doing them well is not as easy as it would seem. Two overarching concerns are:
* the participants should have fun
* the results should be valuable

If the participants are faced with access issues, or confusing instructions, or spammy messages, or any of a host of other annoyances, they will not come back. Ever. Creating a fun experience takes a significant investment in planning, setup, and communication both before and during the exercise. If the results are not valuable to the project being tested, then that is a waste of a significant investment. And again, if the participants feel like they've spent time on a wasted cause, they will not come back.

I've wanted to get a lot of eyeballs on VE for some time now, so let's figure out some details. -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] WikiEditor-like functionality in the VisualEditor age
And finally, many wikis built their own custom features: ProofreadPage on Wikisource is of particular note here, e.g. https://en.wikisource.org/w/index.php?title=Page:United_States_Statutes_at_Large_Volume_43_Part_2.djvu/15&action=edit -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Željko Filipin presenting at FOSDEM
Željko Filipin, QA Engineer for WMF, will be giving a presentation at FOSDEM in Brussels Feb 2 about our browser automation project. Congratulations Željko, it looks like a great track to be part of. https://fosdem.org/2013/schedule/track/testing_and_automation/ ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] How to contribute to sysadmin / devops
On Tue, Jan 8, 2013 at 12:58 AM, Petr Bena benap...@gmail.com wrote: From my point of view, this is something that will be possible in the future. I thought that once we finish working on the beta cluster, all deployment will be done there, and then once it is found working, it's merged with production. Now it works the other way: changes are done in production, and then merged to labs.

That is not strictly true. Merged code changes go to beta automatically, and because of the rolling deployments to production, are typically available in beta labs for some time before being deployed. However, beta labs has pointed out how much manual effort is involved in deploying. Database updates have not been going to beta labs along with merged code, for example, and I believe we might have a similar issue with graphic elements, icons and such, also lagging the actual merged code. Two things coming up might mitigate this somewhat: we'll be experimenting with git-deploy in beta labs first, and that might make for some improvements. Also, we will have an increasing focus on Continuous Deployment/DevOps in the near future, and I'm hoping beta labs will play a big role in any DevOps work we do.

I think this should change, and once it is done, people should be able to modify the configuration of labs, thus changing production in the future. But given that labs was flagged as stable a few weeks ago, this is going to take a while. But that's how I see it; maybe it's going to be done differently.

On Tue, Jan 8, 2013 at 1:45 AM, Quim Gil q...@wikimedia.org wrote: Hi, Even before joining the WMF I have heard about the opportunities for sysadmins to contribute, the possibility to get involved in Wikipedia that way, how great https://labs.wikimedia.org/ is for this purpose, and the inevitable mention of Puppet at some point.

While improving http://www.mediawiki.org/wiki/How_to_contribute I couldn't find (m)any details about how a volunteer sysadmin could enter this path and make progress until, say, becoming a Wikimedia sysadmin wearing a 'got [Wikimedia logo] root' shirt. http://www.mediawiki.org/wiki/Sysadmin_hub seems to be the closest landing page for a sysadmin volunteer, but that page needs love and there is little to be found there for wannabe contributors. I can put some time into sorting this out, but I need help from the people in the know. We could start by defining the first step that a potential sysadmin volunteer could make in order to become a helpful contributor. -- Quim Gil Technical Contributor Coordinator @ Wikimedia Foundation http://www.mediawiki.org/wiki/User:Qgil ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Adapting Visual Editor for 1.19
On Fri, Jan 4, 2013 at 10:04 AM, David Gerard dger...@gmail.com wrote: On 4 January 2013 17:02, Mark A. Hershberger m...@everybody.org wrote: Is anyone else interested in helping to make this happen? I have no coding ability but would LOVE this for our work 1.19 instances, and would be most pleased to test. I think it would be valuable to have a coordinated effort to test Visual Editor at a time when such a project could provide useful feedback to the VE development team. -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] revamped/updated QA docs on mediawiki.org
I've sorted, linked, tagged, organized, and gardened our collection of QA pages on mw.o to be more useful. Of course there is always more to do, so comments, criticism, edits are welcome. http://www.mediawiki.org/wiki/QA -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Unit tests scream for attention
On Sat, Dec 29, 2012 at 1:47 PM, Bryan Tong Minh bryan.tongm...@gmail.com wrote: This is an annoyance to me as well. So, I went triaging, and finally found the issues that failed the unit tests for me. I have committed fixes for them to gerrit: https://gerrit.wikimedia.org/r/#/c/41362/ https://gerrit.wikimedia.org/r/#/c/41360/

This sort of thing has come up before. Michael Feathers, author of Working Effectively with Legacy Code, published a guideline for unit tests in 2005: http://www.artima.com/weblogs/viewpost.jsp?thread=126923

A test is not a unit test if:
* it talks to the database
* it communicates across the network
* it touches the file system
* it can't run at the same time as any of your other unit tests
* you have to do special things to your environment (such as editing config files) to run it

I am not a great writer of unit tests, especially in PHP, but it is my impression that we have quite a few existing tests that don't follow this guide. In Bryan's example above, having a unit test depend on the existence of gzip would be a code smell. I don't think it's an immediate priority, but over time it would probably make sense to refactor the tests that don't follow this guide to use mocks and stubs and other accepted practices, so that running the unit tests does not require a particular environment. -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
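To make the mocks-and-stubs suggestion concrete, here is a sketch in Ruby for brevity (the actual MediaWiki tests are PHPUnit, and the Compressor/StubBackend names are invented): instead of the code under test shelling out to gzip, the compression backend is injected, so the unit test substitutes a stub and never touches the environment at all.

```ruby
# Production code would pass a real gzip-based backend; a unit test passes
# a stub, so the test needs no gzip binary, network, or filesystem.
class Compressor
  def initialize(backend)
    @backend = backend
  end

  def compress(text)
    return '' if text.empty?   # the logic actually under test
    @backend.deflate(text)
  end
end

# Hand-rolled stub standing in for the environment-dependent backend.
# It records calls so the test can verify the interaction.
class StubBackend
  attr_reader :calls

  def initialize
    @calls = []
  end

  def deflate(text)
    @calls << text
    "deflated(#{text.length})"  # deterministic fake result
  end
end

stub = StubBackend.new
result = Compressor.new(stub).compress('hello world')
puts result          # deterministic, environment-independent
puts stub.calls.size
```

The test then asserts on the deterministic stub output and on the recorded interaction, which satisfies all five of Feathers' rules above regardless of what is installed on the build host.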
[Wikitech-l] DevOps/Continuous Deployment discussion?
Hi, A number of people I know of have ideas and aspirations pertaining to a DevOps-style deployment process, a.k.a. Continuous Deployment. In recent times a number of pieces of such a system have become functional: Zuul, Jenkins enhancements for tests, automated acceptance tests, etc. But looking at mediawiki.org I don't see any sort of central discussion of the overall approach/design/process for DevOps/Continuous Deployment. Is it time to start such a discussion? Or is this premature? -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Deploying to test2 before other wikis
On Tue, Dec 18, 2012 at 11:28 AM, Mark Holmquist mtrac...@member.fsf.orgwrote: Is there any reason why this shouldn't be a stated policy? If not, where should we state the policy so that people are aware of it? Considering that testwiki has a nice documentation page [0] and test2wiki doesn't [1], I think we should probably clear that up before announcing any policy. [0] http://wikitech.wikimedia.org/view/Test.wikipedia.org [1] http://wikitech.wikimedia.org/view/Test2.wikipedia.org I thought I had an account on wikitech, but apparently I was mistaken. Nor can I create an account on wikitech, nor edit that page. If I could, though, there would not be much to add. I don't really see the connection between having a docs page on wikitech for test2 and using test2 in a responsible way for staging production code. -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Deploying to test2 before other wikis
On Tue, Dec 18, 2012 at 9:27 AM, Petr Bena benap...@gmail.com wrote: It would be cool if part of that policy was testing on beta cluster, which is also supposed to be identically configured as production and is even closer to production because the MediaWiki space is cloned from production, and on beta cluster we have replicated each production wiki with its custom configuration This would be desirable, and is reflected in this request: https://bugzilla.wikimedia.org/show_bug.cgi?id=43203 -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Proposal: MediaWiki Group Browser testing
Following the process for requesting the creation of a MediaWiki group, here is a proposal for MediaWiki Group Browser testing http://www.mediawiki.org/wiki/Groups/Proposals/Browser_testing This group welcomes anyone interested in the automated browser testing project of WMF. Technical skills and programming experience are NOT required... Your endorsements, improvements and feedback are welcome at the wiki page. Thank you! PS: see also http://www.mediawiki.org/wiki/Groups/Proposals ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] MediaWiki Groups are official: start your own!
On Wed, Dec 12, 2012 at 4:06 AM, David Gerard dger...@gmail.com wrote: There used to be Wiki Wednesdays in London - not just MediaWiki or Wikipedia - but all sorts of wikis. Mostly corporate users. These petered out from lack of general interest, though. It surprises me, as I'd expect a lot of people using MW in London. Thank you for mentioning Wiki Wednesday. Wiki Wednesday is a long-standing institution that seems to have lost popularity over the past several years. Socialtext used to promote Wiki Wednesday pretty heavily in the Bay Area and elsewhere: https://www.socialtext.net/wikiwed/ . Socialtext today is radically different than it was then, and Wiki Wednesday became much less a priority for them. Wiki Wednesday was the first of many such ideas: http://ashub.blogspot.com/2005/06/tag-tuesday.html Ward Cunningham is still doing Wiki Wednesday activities today: http://twitter.com/WardCunningham/status/238315318345347074 On one hand, I think aligning Mediawiki Groups with the venerable tradition of Wiki Wednesday might be worthwhile. Wiki Wednesday is a concept that already exists, and I think people like Ward would be happy to see it promoted more than it has been. On the other hand, interest in Wiki Wednesday has died down in recent years, and that might reflect badly on a new but similar project. -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Alpha version of the VisualEditor now available on the English Wikipedia
Would it be possible to enable VE on test2 in the same way? I would like to use it in a noisy way, and would rather not make noise on enwiki. -Chris On Tue, Dec 11, 2012 at 8:30 PM, James Forrester jforres...@wikimedia.org wrote: TL;DR: Today we are launching an alpha, opt-in version of the VisualEditor[0] to the English Wikipedia. This will let editors create and modify real articles visually, using a new system where the articles they edit will look the same as when you read them, and their changes show up as they enter them — like writing a document in a word processor. Please let us know what you think[1]. Why launch now? We want our community of existing editors to get an idea of what the VisualEditor will look like in the “real world” and start to give us feedback about how well it integrates with how they edit right now, and their thoughts on what aspects are the priorities in the coming months. The editor is at an early stage and is still missing significant functions, which we will address in the coming months. Because of this, we are mostly looking for feedback from experienced editors at this point, because the editor is insufficient to really give them a proper experience of editing. We don’t want to promise an easier editing experience to new editors before it is ready. As we develop improvements, they will be pushed every fortnight to the wikis, allowing you to give us feedback[1] as we go and tell us what you want us to work on next. How can I try it out? The VisualEditor is now available to all logged-in accounts on the English Wikipedia as a new preference, switched off by default. If you go to your “Preferences” screen and click into the “Editing” section, it will have an option labelled “Enable VisualEditor”. Once enabled, for each article you can edit, you will get a second editor tab labelled “VisualEditor” next to the “Edit” tab. If you click this, after a little pause you will enter the VisualEditor.
From here, you can play around, edit and save real articles and get an idea of what it will be like when complete. At this early stage in our development, we recommend that after saving any edits, you check whether they broke anything. All edits made with the VisualEditor will show up in articles’ history tabs with a “VisualEditor” tag next to them, so you can track what is happening. Things to note Slow to load - It will take some time for long complex pages to load into the VisualEditor, and particularly big ones may time out after 60 seconds. This is because pages have to be loaded through Parsoid which is also in its early stages, and is not yet optimised for deployment and is currently uncached. In the future (a) Parsoid itself will be much faster, (b) Parsoid will not depend on as many slow API calls, and (c) it will be cached. Odd-looking - we currently struggle with making the HTML we produce look like you are used to seeing, so styling and so on may look a little (or even very) odd. This hasn't been our priority to date, as our focus has been on making sure we don't disrupt articles with the VisualEditor by altering the wikitext (correct round-tripping). No editing references or templates - Blocks of content that we cannot yet handle are uneditable; this is mostly references and templates like infoboxes. Instead, when you mouse over them, they will be hatched out and a tooltip will inform you that they have to be edited via wikitext for now. You can select these items and delete them entirely; however, there is not yet a way to add new ones or edit them (this will be a core piece of work post-December). Incomplete editing - Some elements of complex formatting will display and let you edit their contents, but not let users edit their structure or add new entries - such as tables or definition lists. This area of work will also be one of our priorities post-December.
No categories - Articles' meta items will not appear at all - categories, langlinks, magic words etc.; these are preserved (so editing won't disrupt them), but they are not yet editable. Another area for work post-December - our current plan is that they will be edited through a metadata flyout, with auto-suggestions and so on. Poor browser support - Right now, we have only got VisualEditor to work in the most modern versions of Firefox, Chrome and Safari. We will find a way to support (at least) Internet Explorer post-December, but it's going to be a significant piece of work and we have failed to get it ready for now. Articles and User pages only - The VisualEditor will only be enabled for the article and user namespaces (so you can make changes in a personal sandbox), and will not work with talk pages, templates, categories, etc. In time, we will build out the kinds of specialised editing tools needed for non-articles, but our focus has been on articles. Final point This is not the final form of the
Re: [Wikitech-l] Alpha version of the VisualEditor now available on the English Wikipedia
On Wed, Dec 12, 2012 at 8:59 AM, Chad innocentkil...@gmail.com wrote: On Wed, Dec 12, 2012 at 10:58 AM, Chris McMahon cmcma...@wikimedia.org wrote: Would it be possible to enable VE on test2 in the same way? I would like to use it in a noisy way, and would rather not make noise on enwiki. It's also enabled for the user namespace, so people can feel free to play with it and not be afraid of messing up a real article. I was thinking of making some basic automated browser tests for it. test2 is more handy than enwiki for that. Beta labs would be even better. -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
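Basic browser tests of the kind mentioned here are commonly structured as page objects: the test reads as a sequence of user intentions, and the page object hides the raw driver calls. The sketch below uses a made-up stub driver so it runs without a browser; in a real test the driver would be Selenium or Watir pointed at test2 or beta labs, and the URL and element id used here are assumptions for illustration:

```python
class StubDriver:
    """Stand-in for a real Selenium/Watir driver, so this sketch runs
    without a browser. A real test would construct e.g. a Firefox driver."""
    def __init__(self):
        self.visited = []
        self.clicked = []

    def get(self, url):
        self.visited.append(url)

    def click(self, element_id):
        self.clicked.append(element_id)

class ArticlePage:
    """Page object for a wiki article. The base URL and the element id
    below are hypothetical."""
    BASE = "http://test2.wikipedia.org/wiki/"

    def __init__(self, driver, title):
        self.driver = driver
        self.title = title

    def open(self):
        self.driver.get(self.BASE + self.title)
        return self

    def start_visual_editor(self):
        # On a VE-enabled wiki there is a second edit tab next to "Edit";
        # "ca-ve-edit" is an assumed id for it.
        self.driver.click("ca-ve-edit")
        return self

# The test itself stays readable even as selectors and URLs churn underneath.
driver = StubDriver()
page = ArticlePage(driver, "Sandbox").open().start_visual_editor()

assert driver.visited == ["http://test2.wikipedia.org/wiki/Sandbox"]
assert driver.clicked == ["ca-ve-edit"]
```

Keeping the environment (enwiki, test2, beta labs) in one place like `BASE` is what makes it cheap to point the same suite at whichever wiki is handy.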
Re: [Wikitech-l] Mobile apps: time to go native?
Have a link? 'Cheap smartphone' seems a contradiction. $50 Huawei phones running an ancient Android and only getting cheaper. Jimbo's all about them. http://techcrunch.com/2012/12/10/50-android-smartphones-are-disrupting-africa-much-faster-than-you-think-says-wikipedias-jimmy-wales/ This is a big deal at Mozilla also: http://arstechnica.com/information-technology/2012/07/mozillas-b2g-to-be-called-firefox-os-will-ship-in-2013/ ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Manual testing strategy - DRAFT
This is a great outline. I am looking forward to contributing to the areas where I have some experience and expertise, and learning about the areas where Quim does and I don't! -Chris On Wed, Nov 21, 2012 at 4:45 PM, Quim Gil q...@wikimedia.org wrote: Here is a first stab for a draft proposal to organize our volunteer testing activities: http://www.mediawiki.org/wiki/Talk:QA/Strategy#Manual_testing_strategy Written after some lousy discussions with Chris and Sumana, and reading a bunch of related wiki pages. Your feedback is welcome. Ideally this _theory_ will be immediately applicable to some pilots that we can run in the upcoming weeks. The Language and Mobile teams seem to be ready for a try - maybe even before the end of the year. Visual Editor and Editor Engagement teams might come next in January. The door is open for any other project willing to run QA activities with volunteers. Just let me know. -- Quim Gil Technical Contributor Coordinator Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Manual testing strategy - DRAFT
This proposal feels detached from reality. Right now features teams mostly do one of the following, in my experience: 1). Product managers and developers do their own manual QA. For PMs this aligns with verifying requirements, for developers it's checking their own work. It can be a pain in the ass but it works for the most part. 2). A lucky few teams have dedicated QA help, like mobile. I've mentioned this before but I've been pretty quiet about it. QA at WMF is still a pretty new idea, and we're still getting a lot of bits sorted, but if your project has a need for software testing/QA, I am always willing to help, and Zeljko and Michelle are also experts on the subject. In either situation, manual QA tends to be done on a tight deadline, requires an intimate understanding of the goals and requirements, and stays within a very specific scope. Community QA is less about deadlines and more about organizing around windows of opportunity. My best example is the testing session that we ran for AFTv5 just before the first limited release to production. A nice mix of Wikipedians and outside software testers provided well-considered testing, and we changed AFTv5 in significant ways before the release as a result of that feedback. I don't have a lot of experience working at a large open source project, a caveat, so I haven't had the opportunity to see volunteer QA in action. But considering my current working situation, I would rather continue doing my own QA than rely on a volunteer who cannot be held to a deadline and is not personally responsible for the quality of our work. The only solutions in my mind are A) much more robust automated testing. B) hiring experienced QA people. Anything else is just going to slow us down. Automated testing is hugely important, and it has been my focus in recent times; I'll be making some announcements about that very soon.
On the community testing side, it is quite possible to have those who understand the requirements and desired behavior create guided test charters for those who are not necessarily intimately aware of the project goals. One of the biggest impediments we have to community testing (or even user testing by insiders) is the lack of reasonable test environments. The test and test2 environments are not only misnamed, but are of marginal utility. We've been investing in beta labs, and beta is so much better than it used to be, but we still have a way to go there. As I mentioned on this list before, the best way to improve beta labs at this point is to use it. WMF has a small but dedicated QA staff. My idea of the role of QA/testing is that QA/testing is a service we may provide to particular projects. Some projects may not need QA/testing. Some projects may need it from time to time, but not always. Some projects may need community testing, and we can support that also. What I do not want to see is for QA/testing to be some sort of mandatory gateway/hand-off/quality-police function that everything must pass through. That way lies madness. -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Manual testing strategy - DRAFT
QA activity days often lead to a large number of duplicate issues being filed. This is true. But I think there is value when new users (for some value of new) file duplicate issues. In particular, I think it points up a possible need to increase the severity/priority of the issues reported. The other thought I had was about the layered personas being created for the team. Since Chris points out later in this thread that QA is a relatively new concept, being egalitarian would have a much higher chance at a larger percentage of committed members of a QA group. Thank you. Again, I think this comes down to testing activities or charters being designed well, in advance, by those who have some knowledge of the project being tested, for the benefit of those who have less knowledge. The level of expertise from project to project for any particular person will change radically over the course of multiple test exercises. Egalitarian is a good word. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Short URLs for Wikimedia Commons files
I'll note that ShortURL is also live on test2.wikipedia.org, and that it seems to cause an error on every page for IE7. On Wed, Oct 31, 2012 at 6:00 AM, Raimond Spekking raimond.spekk...@gmail.com wrote: On 31.10.2012 at 02:48, Everton Zanella Alvarenga wrote: Hi, wouldn't be nice? Where to propose it? Tom Do you mean the usage of the ShortURL extension on Commons like it is used on some Indic language Wikipedias? https://ta.wikipedia.org/s/1iu resolves to https://ta.wikipedia.org/wiki/%E0%AE%AE%E0%AE%BE%E0%AE%B2%E0%AF%88%E0%AE%A4%E0%AF%8D%E0%AE%A4%E0%AF%80%E0%AE%B5%E0%AF%81%E0%AE%95%E0%AE%B3%E0%AF%8D ++1 by me for Commons to get short file links :-) Raimond. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Improving Documentation on Bug Report Management
Mediawiki.org please. mw.o is host to much more technical discussion than meta. On Wed, Oct 10, 2012 at 4:25 AM, Siebrand Mazeland (WMF) smazel...@wikimedia.org wrote: Hi Andre, Can we please change the venue for this discussion? English Wikipedia is not used and/or visited by all. It seems to me that Meta-Wiki or mediawiki.org is a better place for this. Unless you're trying to establish English Wikipedia's documentation on bug management/triaging, in which case I misunderstood the scope of your proposal, for which I apologise. Cheers! Siebrand On Wed, Oct 10, 2012 at 12:12 PM, Andre Klapper aklap...@wikimedia.org wrote: Hi everybody, being WMF's new bug wrangler I would like to improve documentation on bug management/triaging in order to make it easier to get involved and understand how things work. As a side effect it will also help myself in understanding things better. :) Current content feels inconsistent and scattered across several wikis, plus after reorganizing I'd like to extend it with some pages I consider useful (e.g. starting a proper Triage Guide, and a page listing URLs of upstream bugtrackers). My (old but still valid) proposal is available at https://en.wikipedia.org/wiki/Wikipedia:Village_pump_%28technical%29/Archive_100#Improving_Documentation_on_Handling_Bug_Reports Not sure if I should also post this again in http://en.wikipedia.org/wiki/Wikipedia:Village_pump (and not sure in which section)? If anybody has (good/bad) recent experience with trying to get into bug management: Your feedback is very welcome. No comments means nobody has strong feelings and I'll simply go ahead. 
;) Cheers, andre -- Andre Klapper | Wikimedia Bugwrangler http://blogs.gnome.org/aklapper/ ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- Siebrand Mazeland Product Manager Language Engineering Wikimedia Foundation M: +31 6 50 69 1239 Skype: siebrand Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] [Wmfall] Welcome Michelle Grover as QA to the Mobile Team!
I'd like an excuse to get to the Front Range, though; hopefully Michelle and I can meet face-to-face at some point. http://goo.gl/maps/6m5yz It's been a real pleasure working with Michelle through the hiring process and orientation, and I am looking forward to the QA staff doing some nifty things in the very near future. -Chris On Tue, Oct 9, 2012 at 1:59 PM, Michelle Grover mgro...@wikimedia.org wrote: Thanks everyone! and Yup I join the CO office though Chris is pretty far away from where I live :) On Tue, Oct 9, 2012 at 1:25 PM, Sangeeta Prashar spras...@wikimedia.org wrote: Great to have you on board Michelle! Cheers, Sangeeta On Tue, Oct 9, 2012 at 12:17 PM, Tomasz Finc tf...@wikimedia.org wrote: Greetings all, I am pleased to announce that Michelle Grover joins WMF this week as a Mobile QA contractor. Michelle has worked as a Java Developer, Software Developer in Test, Release Engineer for Adobe, and Mobile QA Automation Lead for Crowdfusion (they developed The Daily, TMZ, and Telepictures mobile applications). She's worked closely with agile teams (SCRUM and XP) over the past 7+ years, implemented Kanban, and set up and configured CI using Hudson, TeamCity, and Cruise Control. She's been married 17 years and has a 6-year-old son. She currently lives in Monument, CO and has spent a lot of time up in the mountains. Michelle will help both the community and mobile team build out a sound process for testing both the mobile web and our apps. The team would like to welcome her and wish her success. --tomasz ___ Wmfall mailing list wmf...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wmfall -- Sangeeta Prashar Recruiting Manager spras...@wikimedia.org kchel...@wikimedia.org 415-839-6885 ext. 6829 Just Wikipedia it.
___ Wmfall mailing list wmf...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wmfall ___ Wmfall mailing list wmf...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wmfall ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Welcome Željko Filipin, QA Engineer
Hello everyone, I am pleased to announce that Željko Filipin joins WMF this week as QA Engineer. Željko is a world-class expert on browser test automation, software testing, and related systems and tools. He will be leading our browser test automation effort as well as doing other QA and testing work. Željko writes an authoritative blog about test automation, and is active (and highly visible) on GitHub and Stack Overflow. He hosted the Watir (Web Application Testing In Ruby) podcast for a long time, and is a long-standing member of the Watir Core Team. Željko lives with his family in Zagreb, Croatia, where he is a competitive table tennis player. I am particularly pleased, proud, and excited to make this announcement because Željko and I have been acquainted for many years. He and I were both early adopters of Watir, the first viable open source browser test automation tool in history. This is the first time we have worked together professionally, and I could not be happier that he is our new QA Engineer. -Chris ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] beta labs is now a fit test environment
When I was hired as QA Lead almost seven months ago, WMF lacked a test environment where * code was routinely deployed ahead of production * the test environment emulated the production environment closely * aspects of the test environment (config, permissions, etc.) could be easily and reliably manipulated for testing purposes Today I am happy to announce that beta labs fulfills those needs. Beta labs is intended to host the upcoming release of Mediawiki, plus those extensions scheduled for deployment to production, for the purpose of testing and investigation. As of a little while ago, Mediawiki, AFTv5, New Pages Feed/Page Curation, and UploadWizard are being deployed to beta labs from git automatically and reliably. The configurations for those extensions are also being managed in git. The environment itself is managed via puppet, and emulates production to the greatest extent possible. Many many thanks to Antoine Musso for making this possible. As of this week, all these extensions are up, running, and configured to be useful. Note that they are not perfect, just useful. For example, right now on beta enwiki both AFTv4 and AFTv5 input forms appear on the same page in many cases, because I was experimenting with what happens when these extensions are not configured correctly. Some actions from the Page Curation toolbar never complete. As these glitches become important to testing, we will get them working correctly, and likely will find out some interesting things about the software along the way. The timing for this announcement is excellent, because new QA Engineers will be joining WMF soon (more on that next week), and beta labs will be a prime target for the browser-level end-to-end automated tests we will shortly be creating. Also, we have been wanting to retire the 'prototype' host for some time, and having AFTv5 etc. on beta labs should make that possible. 
In summary, beta labs is up and running with current code for Mediawiki and critical extensions, and at this point the best way to improve beta labs is to use it. http://en.wikipedia.beta.wmflabs.org/wiki/Special:ArticleFeedbackv5 http://en.wikipedia.beta.wmflabs.org/wiki/Special:NewPagesFeed http://commons.wikimedia.org/wiki/Special:UploadWizard ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] beta labs is now a fit test environment
http://commons.wikimedia.beta.wmflabs.org/wiki/Special:UploadWizard I am a bad editor. -Chris On Fri, Sep 28, 2012 at 9:48 AM, Chris McMahon cmcma...@wikimedia.org wrote: When I was hired as QA Lead almost seven months ago, WMF lacked a test environment where * code was routinely deployed ahead of production * the test environment emulated the production environment closely * aspects of the test environment (config, permissions, etc.) could be easily and reliably manipulated for testing purposes Today I am happy to announce that beta labs fulfills those needs. Beta labs is intended to host the upcoming release of Mediawiki, plus those extensions scheduled for deployment to production, for the purpose of testing and investigation. As of a little while ago, Mediawiki, AFTv5, New Pages Feed/Page Curation, and UploadWizard are being deployed to beta labs from git automatically and reliably. The configurations for those extensions are also being managed in git. The environment itself is managed via puppet, and emulates production to the greatest extent possible. Many many thanks to Antoine Musso for making this possible. As of this week, all these extensions are up, running, and configured to be useful. Note that they are not perfect, just useful. For example, right now on beta enwiki both AFTv4 and AFTv5 input forms appear on the same page in many cases, because I was experimenting with what happens when these extensions are not configured correctly. Some actions from the Page Curation toolbar never complete. As these glitches become important to testing, we will get them working correctly, and likely will find out some interesting things about the software along the way. The timing for this announcement is excellent, because new QA Engineers will be joining WMF soon (more on that next week), and beta labs will be a prime target for the browser-level end-to-end automated tests we will shortly be creating.
Also, we have been wanting to retire the 'prototype' host for some time, and having AFTv5 etc. on beta labs should make that possible. In summary, beta labs is up and running with current code for Mediawiki and critical extensions, and at this point the best way to improve beta labs is to use it. http://en.wikipedia.beta.wmflabs.org/wiki/Special:ArticleFeedbackv5 http://en.wikipedia.beta.wmflabs.org/wiki/Special:NewPagesFeed http://commons.wikimedia.org/wiki/Special:UploadWizard ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] beta labs is now a fit test environment
On Fri, Sep 28, 2012 at 11:10 AM, Arthur Richards aricha...@wikimedia.org wrote: Chris, this sounds really cool. Can you point us to some specs about how the test environment is set up (what is the architecture like, what services are running, etc)? How closely does it emulate the production environment? Does the beta labs environment provide load balanced squid/varnish caching layers, configured similarly to the production cluster? If not, is that something we can hope to see? Is the setup something that we can package up and easily deploy to new instances in labs? This is somewhat out of date and in the queue to be updated, so take it with a grain of salt, but most of that is documented here: https://labsconsole.wikimedia.org/wiki/Deployment/Overview I have to admit I am still learning my way around, I'll defer to Antoine for any detailed answers on the configuration. Also, how can other projects/extensions start getting automatically pushed to the beta labs setup? The rule of thumb so far is that any code with a scheduled deployment date may (and probably should) be deployed to beta labs beforehand. In practice this has so far meant AFTv5, NewPagesFeed + Curation Toolbar, and to some extent TimedMediaHandler (testing TMH was the original motivation to get this environment in place). For example, AFTv5 is about to undergo some back end changes, and I want to have an automated end-to-end test in place for it to be sure the front end is not changed by accident. This is still early days though; if you have a project that could benefit from the beta labs test env, I'm open to discussing anything along those lines. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] beta labs is now a fit test environment
On Fri, Sep 28, 2012 at 11:37 AM, Arthur Richards aricha...@wikimedia.org wrote: This is super awesome. This is something that could be useful for MobileFrontend, although I suspect there will need to be some additional configuration work to mimic how mobile requests get handled on the cluster. I'll ping Antoine :) Also, we'll have a Mobile QA person hired Real Soon Now, so more motivation to get things like this in place. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] beta labs is now a fit test environment
On Fri, Sep 28, 2012 at 11:49 AM, Siebrand Mazeland (WMF) smazel...@wikimedia.org wrote: 1. How can I get access to this environment, so I can fiddle with it, too (or is this done through test puppet changes or something? Depends on what you mean by 'access' and 'fiddle'. Beta labs is configured from the same git project as production, except that there is a CommonSettings-wmflabs.php file, db-wmflabs.php file etc. that override the production settings where desired. For example, you can see my recent commit to CommonSettings-wmflabs.php from two days ago that enabled AFTv5 for 100% of beta enwiki in the queue here: https://gerrit.wikimedia.org/r/gitweb?p=operations%2Fmediawiki-config.git;a=shortlog;h=HEAD 2. Is it possible to simply clone another one of these environments, and if so, how is that done? My understanding is that there are still a few live hacks in place yet to be managed, but since the vast majority of the environment is managed from puppet and the config is in git, that should be possible. I'm not sure of the mechanism by which to accomplish it, though. Cloning beta labs is not a topic we've considered so far. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
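The CommonSettings-wmflabs.php arrangement described above is a layered-configuration pattern: load the shared production settings, then apply environment-specific overrides on top, so beta only has to state what differs. A minimal sketch of the idea in Python follows; the setting names are invented for illustration (loosely modeled on MediaWiki-style globals), and the actual wmf-config mechanism is PHP include files where later assignments simply overwrite earlier ones:

```python
# Base layer, analogous to the shared production config.
production = {
    "wgServer": "https://en.wikipedia.org",
    "wmgUseArticleFeedbackv5": False,   # hypothetical feature flag
    "wgDBname": "enwiki",
}

# Override layer, analogous to CommonSettings-wmflabs.php: it lists
# only the keys where beta labs should differ from production.
wmflabs_overrides = {
    "wgServer": "http://en.wikipedia.beta.wmflabs.org",
    "wmgUseArticleFeedbackv5": True,    # e.g. AFTv5 at 100% on beta enwiki
}

def effective_config(base, overrides):
    """Later layers win; untouched keys fall through from the base layer."""
    merged = dict(base)
    merged.update(overrides)
    return merged

config = effective_config(production, wmflabs_overrides)
assert config["wmgUseArticleFeedbackv5"] is True
assert config["wgDBname"] == "enwiki"  # inherited unchanged from production
```

The payoff is the one Chris describes: a single commit to the override layer changes beta behavior, while the production layer, shared in the same git repo, stays the source of truth for everything not explicitly overridden.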
[Wikitech-l] quality at Google, 2006-2011
This crossed my desk this morning: a long, detailed (and honest!) insider account of Google's efforts to increase code quality and product quality. I think it's relevant to what we're doing at WMF, and what we might do in the future. http://mike-bland.com/2012/07/10/test-mercenaries.html ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Evaluating Google Summer of Code
Most software projects fail (for some definition of fail). Even for highly skilled and highly experienced companies and shops, most software projects fail. I'm not going to look up the Gartner and Forrester and Chaos reports this late on a Monday night, but google away. GSoC is an investment that is not intended to have a short-term payoff. The fact that ANY GSoC code makes it to production is fantastic. GSoC is an investment in the long term. It is intended to provide real concrete experience to promising students in real environments, including all the frustrations and annoyances that everyone on a software team experiences in the real world all the time. Schools simply do not provide that experience. Some fraction of those participants will take those experiences into the future of software development, to make real improvements, both to code and to process. Furthermore, considering GSoC solely in terms of benefit to Mediawiki/Wikipedia is short-sighted. Take a look at the organizations participating: http://www.google-melange.com/gsoc/projects/list/google/gsoc2012 . What would your opinion be if WMF were not on that list? On Mon, Aug 27, 2012 at 5:32 PM, MZMcBride z...@mzmcbride.com wrote: (Splitting this off from John's critique of ConventionExtension.) Hi. MediaWiki has participated in several (Google) Summer of Code iterations now (https://www.mediawiki.org/wiki/Summer_of_Code) and I'm wondering how this partnership program is evaluated. Whenever this program wraps up at the end of the (Northern Hemisphere's) summer, I always sense a worrying amount of frustration and annoyance from all parties involved. The projects are usually overly large and complex and from what I understand, nearly all of the projects from Google Summer of Code don't end up in production environments. If the projects are lucky, they end up in a MediaWiki extension; if they're unlucky, they rot away in a code repo branch somewhere or behind a configuration variable set to false by default.
The end result being that:
* the people who worked on these projects are frustrated and annoyed because they didn't get their code deployed [to Wikimedia wikis, a wide audience, or anyone at all in some cases];
* the people who mentored these students are frustrated and annoyed for similar reasons; and
* the people (end-users) who wanted to see these projects successfully completed are frustrated and annoyed that these features still don't exist.

So I'm left wondering how the cost v. benefit equation works out for this program. How do you evaluate the program and whether MediaWiki ought to remain a continued participant? And, of course, should MediaWiki decide not to participate in Google Summer of Code in 2013, are there other [better] ideas for getting people involved in MediaWiki development? MZMcBride ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Appreciation thread
On Wed, Aug 22, 2012 at 6:16 PM, Mark Holmquist mtrac...@member.fsf.org wrote: Pi to the fifth points to Chris McMahon for being awesome in many ways, most notably the recent EtherEditor testing he helped with.

Back atcha. Mark is cranking out a ton of code on several projects, is a go-to guy on IRC, contributing on every front. And while I'm here: Antoine Musso and Faidon Liambotis are handling some really huge issues so quietly and unobtrusively that you would never know there was an issue in the first place. Matthias Mullie has replaced an entire boutique third-party dev shop on the AFTv5 project with grace and style. Fabrice Florin is one of the best project managers I've ever worked with. Outstanding on every front, from documentation to communication to testing. Finally, everyone chatting on #wikimedia-staff: as a remote person, that IRC channel is my link to the office and WMF culture, and I learn a lot from everything that goes on there.
Re: [Wikitech-l] GSoC wrapup report: TranslateSvg
Once code review is complete, there'll be at least one more testing phase, this time with specific questions, followed by a pitch by me to Wikimedia Commons. Only after that will I even utter the d word in the context of TranslateSvg.

Please let me know when you get to that point; I am really interested in helping with that testing. Congratulations!
Re: [Wikitech-l] About outreach and tech events (as suggested by Sumana!)
The goal would be to lure experienced TDD devs in by focusing the event on testing, make them work with the community during a weekend on the code base, existing tests, writing tests, etc., and mentor others along the way. (I'm still convinced that giving it a cool, flashy name doesn't hurt ;-) We might even have a nice cultural clash as a result :)

The recent discussion on this list about deprecation levels might indicate a lack of DRY (Don't Repeat Yourself) in core code, and an opportunity to refactor for testability, with specific examples.
Re: [Wikitech-l] About outreach and tech events (as suggested by Sumana!)
«Testing Wikipedia» could be a nice catchy name for a series of events in various cities around TDD, with experienced devs mentoring less experienced community members, etc. Even if the experts come and go, everybody learns, some testing and process work gets done, and the community grows and learns.

I'm the QA Lead for WMF, and I can say a little about what we've done so far, and things we'd like to do in the future.

In May we collaborated with the Weekend Testers (Americas)[1] in an online test exercise to validate the new frequent rollout schedule of new software to all of the Wikipedia wikis. WTA meets on the first Saturday of every month, and conveniently, we had deployed the latest version to all of the wikis except English Wikipedia at that time, so we had quite a few professional software testers looking for anomalies in the new wiki software, using English Wikipedia as an oracle for correct behavior. It went well, but the scope was a bit ambitious, so we were lucky that the testers were very professional.

In June we collaborated online with Openhatch.org to validate a near-final version of the new Article Feedback system. This was a much more focused exercise, and it went really well: we found a number of real issues with AFT that we were able to address before rolling it out widely. Some of the participants from the previous test event with WTA helped out, so there was a mix of skill and experience among the testers, and several people remarked about how they had not expected to have so much fun.

We would like to do some more sessions like these. One strong suggestion is to have a test event addressing outstanding Bugzilla issues for particular extensions. This could be an ongoing exercise, either in collaboration with other groups or as a pure-Wikipedia exercise. I have discussed doing this with the leader of WTA, who is also one of the instructors of the Association for Software Testing[2] 'Bug Advocacy' course, but haven't pursued it much further than that.
In the long run we would like these sorts of exercises to foster a critical spirit among Wikipedia users, improve the quality of issue reporting and follow-up in Bugzilla, build liaisons with communities like WTA and Openhatch that would not otherwise exist, and foster a general sense that all the Wikipedia software can be tested at any time, and Bugzilla is always open for all sorts of improvements.

So now to address what you actually said :-)... Although I've read my share of unit tests in many languages, I'm not an expert at it, nor do I have a background in PHP. But I am nearly certain that our existing unit test arrangements could be improved in many ways. Threads on the subject show up on this list from time to time, and I think I can say accurately that we could make better use of mocks and stubs instead of e.g. real database tables, we could do more TDD, our code coverage is probably not very high, and improving that coverage would entail not only writing more unit tests, but also refactoring existing code to make it more testable. I think improving unit testing would be a great ongoing project.

[1] http://weekendtesting.com/chapters/america
[2] http://www.associationforsoftwaretesting.org/training/courses/
[Wikitech-l] Some labs beta cluster milestones
As you might know, we have been making significant improvements to the beta cluster in labs[1]. I wanted to point out two developments of the last few days that I think are important milestones.

First, the TimedMediaHandler extension that we've been testing in the beta commons wiki for quite some time has moved off of the beta cluster and is now being installed on test2wiki in preparation for a production deployment. Much work has gone into TMH, and this move would not have been possible without the extensive improvements to beta commons to support TMH development and testing.

Second, there is a working, testable version of the new Article Feedback Tool Version 5 on the beta enwiki right now. This is the first major user-facing extension under development to be deployed to the beta cluster. The beta cluster will give us a much more sophisticated test and development environment for such extensions (and their combinations) in the very near future.

Versions of MediaWiki core and extensions are deployed to the beta cluster directly from our git repository[2]. This process should be automated in the very near future. An issue with the shared file system GlusterFS has prevented any easy or convenient automated deployment, but our GlusterFS system is being upgraded to the latest version (pretty much any minute now), which should make deployment directly from git much more convenient.

Thanks to Antoine Musso, Faidon Liambotis, Jan Gerber, Michael Dale, and Matthias Mullie for all your work and all your help. The labs beta cluster is becoming an important part of our software development environment. -Chris

[1] http://en.wikipedia.beta.wmflabs.org/wiki/Main_Page for example
[2] https://labsconsole.wikimedia.org/wiki/Deployment/Overview#Configure
Re: [Wikitech-l] Criteria for serious alternative
On Fri, Jul 27, 2012 at 11:29 AM, Rob Lanphier ro...@wikimedia.org wrote: I think our best mitigation strategy is to do as good a job as we possibly can integrating Gerrit with GitHub, combined with other improvements to Gerrit.

One thing I don't think has been explicitly said yet, although Brion hinted at it early on, is the nature of the gerrit-github integration. Gerrit serves the required workflow well for things like core and key extensions and puppet, so having changes made via gerrit reflected in a github view is great for outsiders to explore and experiment with. But it would also be great, especially for certain types of community contributions, if we could approve pull requests from github and have those reflected in gerrit. I realize this is all hand-wavy and stuff, but as Brion pointed out, it's all git. With some thought behind the design, a two-way integration between gerrit and github seems like it would be possible and useful.
Re: [Wikitech-l] Article revision numbers
This is all a fantastic idea. Distributing Wikipedia in a fashion similar to git would make it a lot easier to use in areas where Internet connections are not so common. I wonder: could this sort of feature be implemented in the existing Kiwix codebase? That would be ideal, I think.

Ward is working on it. :) http://wardcunningham.github.com/ https://github.com/WardCunningham/Smallest-Federated-Wiki
Re: [Wikitech-l] Speed up tests, make @group Database smarter
Christian, thanks for the information about DatabaseBase for mocking, that makes sense. I don't know PHP at all, but I know something about how to do automated tests. Besides manipulating tables directly, I've seen a couple of other things in the unit tests that struck me as strange:

* at least one test relies on a hard-coded path to a file on disk
* there is a global timeout variable used by a number of tests, settable (I think) to 1s, 10s, or 60s (as opposed to, for example, polling for a particular state from within the test itself)

On Friday Erik sent a message to Engineering, "Alternatives to the 20% approach", where he discussed a number of aspects of code review. My impression is that production code gets a lot of scrutiny, but the tests do not. I wonder whether reviewing the unit tests for good practice as well would eventually reduce the chore and burden required by the current situation with code review. -Chris

PS It would be great to see some of this information on http://www.mediawiki.org/wiki/Manual:PHP_unit_testing which is pretty sparse right now.

On Sun, Jul 1, 2012 at 5:49 AM, Christian Aistleitner christ...@quelltextlich.at wrote:

Hi Platonides,

On Sat, Jun 30, 2012 at 03:45:14PM +0200, Platonides wrote: On 30/06/12 14:24, Christian Aistleitner wrote: [ Mocking the database ] [...] One would have to abstract database access above the SQL layer (separate methods for select, insert, ...) [...] You still need to implement some complex SQL logic.

One might think so, yes. But as I said, one would mock /above/ the SQL layer. For typical database operations, SQL would not even get generated in the first place! Consider for example code containing

$db->insert( $param1, $param2, ... );

The mock db's insert function would compare $param1, $param2, ... against the invocations the test setup injected. If there is no match, the test fails. If there is a match, the mock returns the corresponding return value right away. No generating SQL. No call to $db->tableName.
No call to $db->makeList. No call to $db->query. No nothing. \o/

But maybe you hinted at DatabaseBase::query? DatabaseBase::query should not be used directly, and it hardly is. We can go straight for parameter comparison there as well. No need to parse the SQL.

Unit testing is about decoupling and testing things in isolation. With DatabaseBase and the corresponding factories, MediaWiki has a layer that naturally decouples business logic from direct database access. Use the decoupling, Luke!

Christian

P.S.: As an example for decoupling and mocking in MW, consider tests/phpunit/maintenance/backupTextPassTest.php:testPrefetchPlain. This test is about dumping a wiki's database using prefetch. The idea behind prefetch is to use an old dump and use texts from this old dump instead of asking the database for every single text of the new dump.

To test dumping using prefetch /without/ mocking, one would have to set up an XML file for the old dump. This old dump's XML would get read, parsed, interpreted, ... upon each single test invocation. Tedious and time consuming. Upon each update of the XML format, we'd also have to update the XML representation of the old dump. Yikes! [1] Besides, it duplicates effort, as reading dumps and interpreting them is a separate issue, dealt with in isolation already in tests/phpunit/maintenance/backupPrefetchTest.php

So the handling of the old dump reading, ... has been mocked out. All that's necessary for this is lines 143--153 and line 160 of tests/phpunit/maintenance/backupTextPassTest.php:testPrefetchPlain. $prefetchMock is the mock for the prefetch (i.e.: old dump). $prefetchMap models the expected parameters and return values of the mocked method. So for example invoking

$prefetchMock->prefetch( $this->pageId1, $this->revId1_1 )

yields "Prefetch_1Text1".

[1] Yes, we had that situation recently, when the parentid tag got introduced [2].
The XML dumps of tests/phpunit/maintenance/backupTextPassTest.php and tests/phpunit/maintenance/backupPrefetchTest.php were updated, so the tests assert that both dumping and prefetch work with parentid. But we did not have to touch the mock, due to this decoupling.

[2] See commit d04b8ceea67660485245beaa4aca1625cf2170aa https://gerrit.wikimedia.org/r/#/c/10743/

--
quelltextlich e.U. \\ Christian Aistleitner
Companies' registry: 360296y in Linz
Christian Aistleitner
Gruendbergstrasze 65a    Email: christ...@quelltextlich.at
4040 Linz, Austria       Phone: +43 732 / 26 95 63
                         Fax:   +43 732 / 26 95 63
Homepage: http://quelltextlich.at/
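Christian's "mock above the SQL layer" idea can be sketched in a few lines. This is a hedged illustration in Python rather than PHPUnit, and the names (`FakeDb`, `save_page`) are hypothetical, not MediaWiki APIs: the test preloads the expected calls with canned return values, and the mock matches the arguments directly, so no SQL is ever generated.

```python
class FakeDb:
    """A hand-rolled mock of a database access layer that sits above SQL.

    The test injects a list of expected invocations; each call is matched
    against the next expectation instead of producing any SQL.
    """

    def __init__(self, expected):
        # expected: list of (method_name, args, return_value) tuples
        self.expected = list(expected)

    def _check(self, method, args):
        if not self.expected:
            raise AssertionError("unexpected call: %s%r" % (method, args))
        exp_method, exp_args, ret = self.expected.pop(0)
        if (method, args) != (exp_method, exp_args):
            raise AssertionError(
                "expected %s%r, got %s%r" % (exp_method, exp_args, method, args))
        return ret  # canned return value, no SQL generated

    def insert(self, table, row):
        # Compare the parameters against the injected expectations.
        return self._check("insert", (table, row))


# Hypothetical business logic under test: it only sees the db interface.
def save_page(db, title, text):
    return db.insert("page", {"title": title, "text": text})


db = FakeDb([("insert", ("page", {"title": "T", "text": "x"}), True)])
assert save_page(db, "T", "x") is True  # matched: canned value returned
```

A mismatched or unexpected call fails the test immediately, which is the behavior Christian describes: the mock never builds SQL, never touches a table, and the business logic is exercised in isolation.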
Re: [Wikitech-l] Speed up tests, make @group Database smarter
P.S.: On a related note ... one could think about mocking the database as a whole for PHPUnit tests. Thereby, one would get rid of unnecessary database coupling for unit testing, get better control/detection of side effects, and really solve the database performance problem for unit tests in one go. I'd like to hear more about this. -Chris
Re: [Wikitech-l] I hate to be that guy
On Wed, Jun 27, 2012 at 2:06 PM, Derric Atzrott datzr...@alizeepathology.com wrote: So I hate to be that guy who doesn't know the simple things, but what is Jenkins? The server has come up in discussion a few times since I joined this mailing list about a month ago.

And since no one has mentioned it yet, you might want to read http://en.wikipedia.org/wiki/Continuous_integration. Jenkins is an open source system for doing CI. It used to be called Hudson. (Both are butlers' names; a butler has always been the project's mascot.) Over time, Jenkins has become a very powerful hub for automated testing and deployment, and most serious software projects integrate with Jenkins via plugins. (Although some of those projects don't do it very well, Fitnesse for example.) -C
Re: [Wikitech-l] Suggestions for tasks for new contributors during hackathon
On the QA front, this came up in a WMF discussion recently, and I proposed it as a Weekend Testing Americas session, but it would work equally well at Wikimania, and it fits our goal of bringing in more community testing nicely.

Wikipedia has a large number of open bug reports, around 8000 right now, and that number is growing by approximately 3000/year last I looked. It is probable that many of those open bugs should be marked RESOLVED or possibly UNCONFIRMED, but our triaging resources are scarce right now. So what I'm proposing is a session to take a manageable number of open bug reports for a particular extension or two, read them, try to reproduce them, and then either a) mark them RESOLVED, or else b) mark them UNCONFIRMED, and/or c) leave a helpful comment on the open bug describing what the tester found when trying to reproduce the issue. An example would be https://bugzilla.wikimedia.org/buglist.cgi?resolution=---&query_format=advanced&component=Moodbar&product=MediaWiki%20extensions&list_id=123286

Our community testing session with Openhatch on June 9 was pretty successful and a lot of fun; this might be a nice way to get people familiar with how we manage issues in Bugzilla, which can be pretty daunting for newcomers. And of course it is a repeatable exercise, so doing it at Wikimania does not prevent doing it again with WTA or anywhere else. -Chris
Re: [Wikitech-l] IE7 tax
On Thu, Jun 14, 2012 at 5:21 AM, Arun Ganesh arun.plane...@gmail.com wrote: 6% of wikimedia project page views are from IE6/7 - because of the following:
- IE6 ships default with XP
- Legal users with SP2+ can upgrade to IE8
- If you have 90s era hardware, no SP for you. Can only be solved by buying some new hardware (or switching to linux)
- IT admins who don't know much about IT and have kept the workforce hostage through their ignorance. Can be solved if the workforce and boss demands it.

I'd like to reframe these examples. First, as I understand it, most IE6/IE7 users globally are running pirated versions of Windows. For financial or political reasons, they will not or cannot acquire legal versions and thus can't upgrade their browsers.

Second is certain types of Enterprise shops. Before I was hired at WMF, I worked for a company that processes complex financial records for pharmacies participating in a US federal program that reimburses pharmacies for the cost of drugs prescribed for indigent patients. Well over 50% of our users were on IE6/7. This was for two reasons: one is that these pharmacies are in the business of selling drugs, and IT is only a tiny part of their operation. The second is that with millions and millions of dollars passing through a system regulated by HIPAA and other laws, the risk of upgrading is seen as higher than the risk of using old tech.

I don't think we can dismiss IE6/7 users cavalierly.
[Wikitech-l] WMF/Openhatch test event for AFTv5 Sat. 9 June
Similar to last month's event with the Weekend Testing group, this Saturday WMF is teaming with Openhatch.org for a test event to get critical eyeballs on the near-final version of the Article Feedback Tool before AFT gets rolled out to a wide audience in the very near future. Like last time, we anticipate some Bugzilla issues being created for AFT on Saturday. The test session is from 10AM-noon Pacific time. The openhatch announcement is: https://openhatch.org/blog/2012/june-9-help-wikipedia-test-some-software-and-get-involved-in-their-community-no-programming-required/ The test plan for the event is: http://www.mediawiki.org/wiki/QA/Article_Feedback_Test_Plan