Re: [Wikitech-l] Module storage is coming
On Nov 4, 2013 2:28 AM, Ori Livneh o...@wikimedia.org wrote:

The roll-out of 1.23wmf2 to your favorite Wikimedia wiki will inaugurate the era of ResourceLoader module storage -- an era which will be marked by terrifying and hilarious new bugs, intermittent client-side failures, and generally inexcusable disruptions to user experience. Sound exciting? Read on!

What is it?
-----------

Module storage refers to the use of localStorage in ResourceLoader as an auxiliary to the browser cache. ResourceLoader, remember, is the MediaWiki subsystem that delivers JavaScript and CSS to your browser. One of its primary functions is to minimize the total number of network requests that your browser needs to make in order to render the page, and to make the remaining requests as slim as possible. One of the ways ResourceLoader does this is by making a list of all the different components your browser needs and then transmitting them in bulk.

The downside of this approach is that a small update to MediaWiki's code, touching perhaps one or two components, will often force the browser to throw out (and re-retrieve) a bunch of unrelated modules that did not change and did not ostensibly need to be re-retrieved. This is because the browser cache operates on the level of network requests; it is not intelligent enough to decompose a request into its constituent parts and to cache these parts separately from one another.

Starting with the 1.23wmf2 branch, ResourceLoader will check if your browser implements localStorage, a mechanism for storing structured data. If it does, ResourceLoader will use it as an auxiliary cache, capable of versioning individual components.

Why?
----

The primary goals are:

* Destabilize MediaWiki's front-end code, by coupling it to an inconsistently-implemented and poorly-understood HTML5 API.
* Make debugging fun again, by adding another layer of caching to MediaWiki.

Yes!! However, as with all new features, unintended side-effects are possible.
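To illustrate the versioned-cache idea: each module is stored under its own key together with a version, so a deployment that touches one module invalidates only that module. This is a sketch, not ResourceLoader's actual code; the helper names are invented, and a plain object stands in for window.localStorage so it runs anywhere.

```javascript
// A plain object standing in for window.localStorage (same get/set API).
var storage = {
	data: {},
	setItem: function ( key, value ) { this.data[ key ] = String( value ); },
	getItem: function ( key ) {
		return Object.prototype.hasOwnProperty.call( this.data, key ) ?
			this.data[ key ] : null;
	}
};

// Cache one module's source under a versioned entry.
function storeModule( name, version, source ) {
	storage.setItem( 'module:' + name,
		JSON.stringify( { version: version, source: source } ) );
}

// Return the cached source only if the cached version still matches;
// a stale or missing module returns null and would be re-fetched alone,
// without invalidating its unchanged neighbours.
function loadModule( name, expectedVersion ) {
	var raw = storage.getItem( 'module:' + name );
	if ( raw === null ) {
		return null;
	}
	var entry = JSON.parse( raw );
	return entry.version === expectedVersion ? entry.source : null;
}
```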
Specific concerns for module storage include the risk of making the network footprint of page loads smaller, resulting in improved latency. But seriously --

* Module storage is enabled only if the underlying browser supports localStorage (~89% of browsers in use, according to http://caniuse.com/#feat=namevalue-storage). Users with older browsers will not see a benefit, but they will not suffer a penalty either.
* Module storage may or may not improve site performance. We need to test it against a wide range of browsers and platforms to know for certain. If it doesn't improve performance, we'll blame it on you, your poor choice of browsers, and your uncooperative attitude, but then we'll scrap it.

How can I test it?
------------------

Glad you asked! Module storage is enabled by default on the beta cluster, and on the test and test2 wikis. The simplest way you can help is by being alert to performance regressions and front-end code breakage. If you know how to use your browser's JavaScript console, you can get stats about module storage usage on the current page by checking the value of 'mw.loader.store.stats'.

When the code is rolled out across the cluster, it will be disabled by default, but you will be able to opt in by manually setting a cookie in your browser. I will follow up with the exact details. If module storage proves stable enough for production use, we'll seek to assess its utility by running a controlled study of some kind. If we can demonstrate that module storage leads to a significant improvement in performance, we'll enable it by default.

Sounds awesome. I'll wait patiently for hilarity to ensue.

---
Ori Livneh
o...@wikimedia.org

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
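Incidentally, the ~89% support figure is usually established by feature-testing with a real write, since merely checking that the localStorage object exists is not reliable: some browsers expose the API but throw when you write to it (e.g. certain private browsing modes). A sketch with an invented function name, not ResourceLoader's exact code:

```javascript
// Feature-test localStorage by performing a real round-trip write.
// Any failure (API missing, disabled, or write-throws) means "no".
function supportsLocalStorage() {
	try {
		var key = '__storage_test__';
		localStorage.setItem( key, key );
		localStorage.removeItem( key );
		return true;
	} catch ( e ) {
		return false;
	}
}
```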
Re: [Wikitech-l] Module storage is coming
On 11/03/2013 08:27 PM, Ori Livneh wrote:

* Module storage is enabled only if the underlying browser supports localStorage (~89% of browsers in use, according to http://caniuse.com/#feat=namevalue-storage). Users with older browsers will not see a benefit, but they will not suffer a penalty either.

Ori,

Thank you for this fun-to-read introduction to Module Storage.

I wonder if this will have any effect on low-bandwidth users. It seems like this would help if someone had only a 2G/GPRS connection, but do a lower percentage of those users have browsers that can use Module Storage? If so, is there anything that we can do to address this issue?

I imagine there are things we could do in the software, but those things would take work and time that might not make sense for limited resources. Would it make sense to see if Mozilla has any interest in helping to spread HTML5-capable browsers via USB keychains and/or local Mozillians? Hrm... I should probably go ask them about that. But I'm curious about your perspective and to see if we have any information on the bandwidth available to various users.

Mark.

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Module storage is coming
On Mon, Nov 4, 2013 at 1:50 PM, Mark A. Hershberger m...@nichework.com wrote:

Ori, Thank you for this fun-to-read introduction to Module Storage.

+1

Željko

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Memento Extension for MediaWiki: Advice on Further Development
On Fri, Nov 1, 2013 at 6:43 PM, Brian Wolff bawo...@gmail.com wrote:

I haven't looked at your code, so not sure about the context - but: In general a hook returns true to denote no further processing should take place.

If we're talking about wfRunHooks hooks, the usual case is that they return *false* to indicate no further processing, and true means to *continue* processing.

-- Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Module storage is coming
On 04.11.2013, 16:50 Mark wrote:

Ori, Thank you for this fun-to-read introduction to Module Storage. I wonder if this will have any effect on low-bandwidth users. It seems like this would help if someone had only a 2G/GPRS connection, but do a lower percentage of those users have browsers that can use Module Storage? If so, is there anything that we can do to address this issue? I imagine there are things we could do in the software, but those things would take work and time that might not make sense for limited resources. Would it make sense to see if Mozilla has any interest in helping to spread HTML5-capable browsers via USB keychains and/or local Mozillians? Hrm... I should probably go ask them about that. But I'm curious about your perspective and to see if we have any information on the bandwidth available to various users.

Most of our mobile traffic is generated by iOS and Android - both of which support localStorage, so it will work for them too. The situation for mobile users in developing countries is less certain - lots of various browsers with wildly varying support of modern features. Still, it's a huge improvement for the majority of users.

-- Best regards,
Max Semenik ([[User:MaxSem]])

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Module storage is coming
On 11/04/2013 10:10 AM, Max Semenik wrote:

Most of our mobile traffic is generated by iOS and Android - both of which support localStorage, so it will work for them too.

People using 2G/GPRS are not necessarily using a mobile device. I've heard from at least one user (hence https://bugzilla.wikimedia.org/55842) that he is using his cell phone as a modem. I suspect this is not a small percentage of users in less developed areas where cell phone towers are plentiful but cables are not. Non-mobile UAs on mobile IPs is what I'm asking about.

Mark.

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Memento Extension for MediaWiki: Advice on Further Development
On 2013-11-04 11:04 AM, Brad Jorsch (Anomie) bjor...@wikimedia.org wrote:

If we're talking about wfRunHooks hooks, the usual case is that they return *false* to indicate no further processing, and true means to *continue* processing.

D'oh. You are of course correct. Sorry for the mistake.

As an aside, perhaps we should introduce constants for this. It's easy to mix up the two values.

-bawolff

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
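bawolff's suggestion (named constants for hook return values) might look like the following sketch. MediaWiki's hooks are PHP; JavaScript is used here purely for illustration, and the constant and function names are invented:

```javascript
// Named constants make the wfRunHooks convention hard to mix up:
// true means "continue processing", false means "stop here".
var CONTINUE_PROCESSING = true;  // let later handlers (and core) run
var ABORT_PROCESSING = false;    // stop: no further processing

// Minimal dispatcher following the same convention: run handlers in
// order and stop as soon as one of them aborts.
function runHooks( handlers ) {
	for ( var i = 0; i < handlers.length; i++ ) {
		if ( handlers[ i ]() === ABORT_PROCESSING ) {
			return false; // a handler aborted; the rest are skipped
		}
	}
	return true; // every handler ran and asked to continue
}
```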
Re: [Wikitech-l] Module storage is coming
I'm looking forward to seeing how this plays out. The only downside I can so far see is that the amount of browser storage available varies drastically [1], and I wonder whether this will cause upsets for those browsers with extremely strict limits.

I've also been toying with the idea of using localStorage and cache manifests for offline usage in mobile for some time, and this will compete for that space and make that experiment, if it ever happens, even more interesting. :-)

[1] http://dev-test.nemikor.com/web-storage/support-test/

On Mon, Nov 4, 2013 at 7:31 AM, Mark A. Hershberger m...@nichework.com wrote:

People using 2G/GPRS are not necessarily using a mobile device. I've heard from at least one user (hence https://bugzilla.wikimedia.org/55842) that he is using his cell phone as a modem. I suspect this is not a small percentage of users in less developed areas where cell phone towers are plentiful but cables are not. Non-mobile UAs on mobile IPs is what I'm asking about.

-- Jon Robson
http://jonrobson.me.uk
@rakugojon

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
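The varying quotas Jon links to are a real hazard: a write that succeeds in one browser can throw in another. One defensive pattern is to treat every write as fallible. A sketch with invented names; the limited store here simulates a browser with a tiny quota so the example runs anywhere:

```javascript
// Simulate a storage backend with a very small quota: writes of new
// keys beyond maxEntries throw, like a real QuotaExceededError.
function makeLimitedStorage( maxEntries ) {
	var data = {};
	return {
		setItem: function ( key, value ) {
			if ( !( key in data ) && Object.keys( data ).length >= maxEntries ) {
				throw new Error( 'QuotaExceededError' ); // simulated quota limit
			}
			data[ key ] = String( value );
		}
	};
}

// Quota-safe write: report failure instead of letting the exception
// break unrelated front-end code. The caller can fall back to the
// network, or evict its own keys and retry.
function trySetItem( storage, key, value ) {
	try {
		storage.setItem( key, value );
		return true;
	} catch ( e ) {
		return false;
	}
}
```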
Re: [Wikitech-l] Memento Extension for MediaWiki: Advice on Further Development
Le Sat, 02 Nov 2013 21:15:01 +0100, Shawn Jones sj...@cs.odu.edu a écrit :

Seb35, I came across your extension a month ago. Ours is different in that it is also implementing the Memento protocol as used by the Internet Archive, Archive-It, and others. I do, however, appreciate your insight in trying to solve many of the same problems. I, too, was trying to address the retrieval of old versions of templates, which brought me to your extension. Your use of BeforeParserFetchTemplateAndtitle inspired parts of our Template solution. We're currently trying to figure out how to handle images. What did you mean by MediaWiki messages? Are you referring to the Messages API as part of I18N?

In my attempt to recreate a more exact display of a past version, I thought about retrieving the old versions of interface messages (including the stylesheets MediaWiki:Common.css/js and other stylesheets); I also thought about modifications of LocalSettings.php and the MediaWiki version in order to recreate previous bugs, but this would be quite difficult and probably not very interesting from a user point of view (and probably not secure).

Seb35

Thanks again, --Shawn

On Nov 1, 2013, at 9:50 PM, Seb35 seb35wikipe...@gmail.com wrote: Hi, No responses to your specific questions, but just to mention I worked some years ago on an extension [1] aiming at retrieving the as-exact-as-possible display of the page at a given past datetime, because the current implementation of oldid is only past wikitext with current context (templates, images, etc.). I mainly implemented the retrieval of old versions of templates, but a lot of other smaller improvements could be done (MediaWiki messages, styles/JS, images, etc.). With this approach, some details are irremediably lost (e.g. the number of articles at a given datetime, some tricky delete-and-move actions, etc.) and additional information would have to be recorded to retrieve the past versions more exactly.
[1] https://www.mediawiki.org/wiki/Extension:BackwardsTimeTravel

~ Seb35

Le Fri, 01 Nov 2013 20:50:06 +0100, Shawn Jones sj...@cs.odu.edu a écrit :

Hi, I'm currently working on the Memento Extension for MediaWiki, as announced earlier today by Herbert Van de Sompel. The goal of this extension is to work with the Memento framework, which attempts to display web pages as they appeared at a given date and time in the past. Our goal is for this to be a collaborative effort focusing on solving issues and providing functionality in the Wikimedia Way as much as possible. Without further ado, I have the following technical questions (I apologize in advance for the fire hose):

1. The Memento protocol has a resource called a TimeMap [1] that takes an article name and returns text formatted as application/link-format. This text contains a machine-readable list of all of the prior revisions (mementos) of this page. It is currently implemented as a SpecialPage which can be accessed like http://www.example.com/index.php/Special:TimeMap/Article_Name. Is this the best method, or is it preferable for us to extend the Action class and add a new action to $wgActions in order to return a TimeMap from the regular page like http://www.example.com/index.php?title=Article_Name&action=gettimemap without using the SpecialPage? Is there another preferred way of solving this problem?

2. We currently make several database calls using the select method of the Database object. After some research, we realized that MediaWiki provides some functions that do what we need without making these database calls directly. One of these needs is to acquire the oldid and timestamp of the first revision of a page, which can be done using the Title->getFirstRevision()->getId() and Title->getFirstRevision()->getTimestamp() methods. Is there a way to get the latest ID and latest timestamp? I see I can do Title->getLatestRevID() to get the latest revision ID; what is the best way to get the latest timestamp?

3. In order to create the correct headers for use with the Memento protocol, we have to generate URIs. To accomplish this, we use the $wgServer global variable (through a layer of abstraction); how do we correctly handle situations if it isn't set by the installation? Is there an alternative? Is there a better way to construct URIs?

4. We use exceptions to indicate when showErrorPage should be run; should the hooks that catch these exceptions and then run showErrorPage also return false?

5. Is there a way to get previous revisions of embedded content, like images? I tried using the ImageBeforeProduceHTML hook, but found that setting the $time parameter didn't return a previous revision of an image. Am I doing something wrong? Is there a better way?

6. Are there any additional coding standards we should be following besides those on the
Re: [Wikitech-l] Module storage is coming
On Mon, Nov 4, 2013 at 11:13 AM, Jon Robson jdlrob...@gmail.com wrote: I'm looking forward to seeing how this plays out. The only downside I can so far see is that the amount of browser storage available varies drastically [1] and I wonder whether this will cause upsets for those browsers with extremely strict limits. Or, for that matter, if it will fill up the allowed storage so user scripts and gadgets can't make effective use of it. -- Brad Jorsch (Anomie) Software Engineer Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Module storage is coming
On 04/11/13 15:31, Mark A. Hershberger wrote:

People using 2G/GPRS are not necessarily using a mobile device. I've heard from at least one user (hence https://bugzilla.wikimedia.org/55842) that he is using his cell phone as a modem. I suspect this is not a small percentage of users in less developed areas where cell phone towers are plentiful but cables are not. Non-mobile UAs on mobile IPs is what I'm asking about.

Er, that's a fairly standard speed for low-cost wired plans as well in much of the US. I'd actually be happy to have something that fast, but I couldn't afford it. Well, it's the US's version of 'low cost', anyway.

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] FOSS OPW Round 7 Featured Project synopsis: Uploadwizard:OSM map embedding
Hi everyone,

As an applicant of FOSS-OPW Round 7, I took a deep and profound interest in one of the featured projects, UploadWizard: OSM map embedding. If chosen or allowed, I am planning to focus on this particular project.

Project Synopsis
***

This project is about enhancing the image upload process for Wikimedia Commons. Commons has millions of images and Wikipedia editors need to be able to quickly find the right ones for their articles, so it is important to store various metadata with the image which will help navigation - the topic of the image, when it was made, and so on. This metadata is usually stored in a complex template language that is specific to MediaWiki, the sight of which usually makes people flee in terror. Since we cannot expect image authors to learn to write something like that, Commons has a tool called UploadWizard which creates all the code for you after you fill out a bunch of forms.

The main goal of the OSM map embedding project would be to provide a map interface and integration with external databases to the UploadWizard. (OSM stands for OpenStreetMap - a free map application which is similar to Google Maps but not encumbered by restrictive copyright.)

The secondary goal is to integrate with some databases of locations and use that in various ways. One way would be checking if there are locations with requested images nearby (that is, someone put out a note that Commons needs good images of a certain place and doesn't currently have any), and warning the uploader about them. Another involves image competitions where people can participate with pictures taken at predefined locations; UploadWizard could get a list of these locations and use it to help the participant in selecting where he took the photo.

- Mentor: Gergő Tisza

Expecting valuable comments and suggestions to improve the project from the vibrant MediaWiki community.
:) ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Lightning deploy Monday for Vector
Thanks Matt!

On Sun, Nov 3, 2013 at 10:51 PM, Matthew Walker mwal...@wikimedia.org wrote:

I can do it; I'm already deploying tomorrow for CentralNotice. I thought you had deploy rights though?

~Matt Walker
Wikimedia Foundation Fundraising Technology Team

On Sun, Nov 3, 2013 at 4:48 PM, Jon Robson jdlrob...@gmail.com wrote: Who would be a good person to lightning deploy the fix [2] to this bug [1] which is affecting all print stylesheets on all wikis? Thanks in advance, Jon [1] https://bugzilla.wikimedia.org/show_bug.cgi?id=56366 [2] https://gerrit.wikimedia.org/r/93412

-- Jon Robson
http://jonrobson.me.uk
@rakugojon

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Memento Extension for MediaWiki: Advice on Further Development
From: Brian Wolff bawo...@gmail.com

D'oh. You are of course correct. Sorry for the mistake. As an aside, perhaps we should introduce constants for this. It's easy to mix up the two values. -bawolff

+1 - great idea!

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Visual Editor is trashing every article on French Wiki
Currently, VisualEditor is badly trashing every article that is edited on the French Wikipedia: every accented character is replaced by strange characters all over the article.

Could you please stop VE immediately? How can such a mess make its way into production? Are there no real tests beforehand?

Nico

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Visual Editor is trashing every article on French Wiki
I'm not involved in the VisualEditor development, but I've tried to pass this on to one of the engineers involved in VisualEditor so he's aware of it. Obviously this is an urgent problem that needs fixing.

Thanks, Dan

On 4 November 2013 20:29, Nicolas Vervelle nverve...@gmail.com wrote: Currently, Visual Editor is trashing badly every article that is edited on French Wiki: every accented character is replaced by strange characters all over the article. Could you please stop immediately VE ? How can such a mess make its way into production ? Is there no real tests before ? Nico

-- Dan Garry
Associate Product Manager for Platform
Wikimedia Foundation

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Visual Editor is trashing every article on French Wiki
Likely related to this bugzilla: https://bugzilla.wikimedia.org/show_bug.cgi?id=50296 It is also happening on English Wikipedia, according to the VE feedback page. Risker On 4 November 2013 15:40, Dan Garry dga...@wikimedia.org wrote: I'm not involved in the VisualEditor development, but I've tried to pass on this to one of the engineers involved in VisualEditor so he's aware of it. Obviously this is an urgent problem that needs fixing. Thanks, Dan On 4 November 2013 20:29, Nicolas Vervelle nverve...@gmail.com wrote: Currently, Visual Editor is trashing badly every article that is edited on French Wiki: every accented character is replaced by strange characters all over the article. Could you please stop immediately VE ? How can such a mess make its way into production ? Is there no real tests before ? Nico ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- Dan Garry Associate Product Manager for Platform Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Visual Editor is trashing every article on French Wiki
Well, yes, it's more than urgent: several articles are being trashed every minute. How can something like that be unleashed onto big wikis without proper testing?

On Mon, Nov 4, 2013 at 9:40 PM, Dan Garry dga...@wikimedia.org wrote: I'm not involved in the VisualEditor development, but I've tried to pass on this to one of the engineers involved in VisualEditor so he's aware of it. Obviously this is an urgent problem that needs fixing. Thanks, Dan

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Visual Editor is trashing every article on French Wiki
Yes, it's happening everywhere, but English doesn't have accented characters, so the damage is more limited.

On Mon, Nov 4, 2013 at 9:43 PM, Risker risker...@gmail.com wrote: Likely related to this bugzilla: https://bugzilla.wikimedia.org/show_bug.cgi?id=50296 It is also happening on English Wikipedia, according to the VE feedback page. Risker

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Visual Editor is trashing every article on French Wiki
This was due to a broken deployment of Parsoid, the new MediaWiki parser used by VisualEditor. A new library dependency defaulted to iso8859-1 instead of utf-8, which caused character munging to occur. Gabriel is working on a postmortem and we'll share this shortly with recommendations on how to avoid such an incident in future. We're very sorry for the breakage. Affected edits likely have already been reverted - but if there's anything in addition we can do, let us know. The team is still actively investigating the issue on #mediawiki-visualeditor on irc.freenode.net, which is the best place to reach them. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
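For the curious, the failure mode Erik describes is easy to reproduce (a Node sketch, not Parsoid's actual code): UTF-8 bytes decoded with an ISO-8859-1 default turn each accented character into two garbage characters.

```javascript
// UTF-8 text decoded with the wrong (ISO-8859-1 / Latin-1) default.
var original = 'déjà vu';
var bytes = Buffer.from( original, 'utf8' ); // the bytes as sent over the wire
var munged = bytes.toString( 'latin1' );     // decoded with the wrong default

// Each accented character now occupies two characters; for example 'é'
// becomes 'Ã©', which matches the "strange characters" reported on frwiki.
```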
Re: [Wikitech-l] Visual Editor is trashing every article on French Wiki
The bug for this was https://bugzilla.wikimedia.org/56583 . -- Matma Rex ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Introducing Beta Features
Hi all,

We're pleased to announce Beta Features [1], a new program that lets you test new features on Wikipedia and other Wikimedia sites before they are released for everyone. Think of it as a digital laboratory where community members can preview upcoming software and give feedback to help improve it. This special preference page lets designers and engineers experiment with new features on a broad scale, but not in a disruptive manner.

Beta Features is ready for testing now on MediaWiki.org [2]. This Thursday, we plan to also deploy it on Wikimedia Commons and Meta-Wiki. Based on test results, we aim to release it on all wikis worldwide on 21 November 2013.

Here are the first features you can test this week:

* Media Viewer — view images in large size or full screen [3]
* VisualEditor Formulæ — edit algebra or equations on your pages [4]
* Typography Refresh — make text more readable (coming soon) [5]

Would you like to try out Beta Features now? After you log in on MediaWiki.org, a small 'Beta' link will appear next to your 'Preferences'. Click on it to see features you can test, check the ones you want, then click 'Save'. Learn more on the Beta Features page. [1]

After you've tested Beta Features, please let us know what you think on our discussion page [6] -- or report any bugs on Bugzilla [7]. For technical documentation, visit the BetaFeatures extension page [8]. You're also welcome to join our IRC office hours chat on Friday, 8 November at 10:30am PST, 18:30 UTC. [9]

Beta Features was developed by the Wikimedia Foundation's Design, Multimedia and VisualEditor teams. Along with other developers, we will be adding new features to this experimental program every few weeks.
For now, we'd like to thank some key team members who made this project possible: Ed Sanders, Jon Robson, Gergő Tisza, May Galloway, Vibha Bamba, Aaron Schulz, Brion Vibber, Bryan Davis, Chris Steipp, Greg Grossmeier, Keegan Peterzell, Quim Gil, Guillaume Paumier, Erik Moeller, Rob Lanphier, Howie Fung and Tomasz Finc, to name but a few. We're also very grateful to all the community and other team members who helped create this system — and look forward to many more productive collaborations in the future. :)

Enjoy, and don't forget to let us know what you think!

Fabrice, James, Mark and Jared
Wikimedia Engineering and Product Group

[1] https://www.mediawiki.org/wiki/About_Beta_Features
[2] https://www.mediawiki.org/wiki/Special:Preferences#mw-prefsection-betafeatures
[3] https://www.mediawiki.org/wiki/Multimedia/About_Media_Viewer
[4] https://www.mediawiki.org/wiki/VisualEditor/Beta_Features/Formulae
[5] https://www.mediawiki.org/wiki/Typography_Update
[6] https://www.mediawiki.org/wiki/Talk:About_Beta_Features
[7] http://wmbug.com/new?product=MediaWiki%20extensions&component=BetaFeatures
[8] https://www.mediawiki.org/wiki/Extension:BetaFeatures
[9] https://meta.wikimedia.org/wiki/IRC_office_hours#Upcoming_office_hours

___
Fabrice Florin
Product Manager, Multimedia
Wikimedia Foundation
http://en.wikipedia.org/wiki/User:Fabrice_Florin_(WMF)
https://www.mediawiki.org/wiki/Multimedia

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Memento Extension for MediaWiki: Advice on Further Development
Hi Shawn, I'm currently working on the Memento Extension for Mediawiki, as announced earlier today by Herbert Van de Sompel. This is very exciting! Coincidentally, at last week's SMWCon (the Semantic MediaWiki conference) in Berlin I gave a presentation to argue that we need some sort of 'time travelling' feature (slides are available at http://slidesha.re/1iIf3F9). One of the other participants also pointed out the Memento protocol. Are you familiar with Semantic MediaWiki (http://www.semantic-mediawiki.org/) as an extension to MediaWiki? I'm curious what it would take to let SMW play nice together with Memento. Kind regards, Remco de Boer ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Thoughts of the better landing page
Hi, I have been researching how to implement the m.wikipedia.org landing page. It should show the same output as the en.m.wikipedia.org/wiki/Special:Zero page, except that it should be localized based on the carrier's default language.

Possible solutions:

* VirtualHost - map to a docroot that contains the same MediaWiki as en.m.wikipedia.org, with a Rewrite rule that allows only ^/$ - wiki/Special:Zero; all other paths return 404.
* Magic hack - something similar to extract2.php that will first load the Zero configuration, configure the language to use, then somehow execute the whole MediaWiki rendering for the special page, and lastly output the rendered content.

Also, it might be necessary to set a proper MW_LANG before loading, but that means some other code needs to access memcached and make an API call.

I think #1 is better; please advise on how to switch the language and whether we should put up any defenses (like removing headers, rejecting non-GET requests, etc). Thanks.

On Tue, Nov 5, 2013 at 1:58 AM, Brion Vibber bvib...@wikimedia.org wrote:

> Generally I agree that #1 is the way to go -- extract2.php is a nasty hack
> and I'd hate for us to have to replicate it. :D
>
> re: changing the language -- something within the Zero extension should be
> able to do that if you just need to change the _display_ language. If you
> have to target it to a specific wiki, I'd recommend doing that in the
> rewrite rules if possible. But that don't sound easy to do. ;)
>
> -- brion
>
> On Mon, Nov 4, 2013 at 1:50 PM, Yuri Astrakhan yastrak...@wikimedia.org wrote:
>
>> Brion, I was thinking of how to do the new landing page better.
>>
>> Needed: m.wikipedia.org should show the same output as the
>> en.m.wikipedia.org/wiki/Special:Zero page, except that it should be
>> localized based on the carrier's default language.
>>
>> Possible solutions:
>>
>> * VirtualHost - map to a docroot that contains the same MediaWiki as
>>   en.m.wikipedia.org, with a Rewrite rule that allows only
>>   ^/$ - wiki/Special:Zero; all other paths return 404. Also, I need some
>>   magic code that dynamically changes the current language based on the
>>   carrier's configuration.
>> * Magic hack - something similar to extract2.php that will first load the
>>   Zero configuration, configure the language to use, then somehow execute
>>   the whole MediaWiki rendering for the special page, and lastly output
>>   the rendered content.
>>
>> I think #1 is better; please advise on how to switch the language and if
>> we should put up any defenses (like removing headers, rejecting non-GET
>> requests, etc). Thanks.
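For the archives, option #1 could look roughly like the following. This is a hedged sketch only, assuming Apache httpd with mod_rewrite; the docroot path and the exact rewrite target are illustrative, not the production configuration:

```apache
# Hypothetical sketch of option #1: serve the m.wikipedia.org landing page
# from the same MediaWiki docroot, but permit only the bare root path.
<VirtualHost *:80>
    ServerName m.wikipedia.org
    # Illustrative path; would be the same docroot as en.m.wikipedia.org.
    DocumentRoot "/srv/mediawiki/docroot"

    RewriteEngine On

    # One of the suggested defenses: refuse anything but GET/HEAD.
    RewriteCond %{REQUEST_METHOD} !^(GET|HEAD)$
    RewriteRule ^ - [R=405,L]

    # The bare root maps to the Zero landing special page; PT lets the
    # normal MediaWiki alias/handler take it from there.
    RewriteRule ^/$ /wiki/Special:Zero [PT,L]

    # Every other path returns 404.
    RewriteRule !^/$ - [R=404,L]
</VirtualHost>
```

The carrier-language switch would still have to happen inside the Zero extension, as Brion suggests, since mod_rewrite has no access to the carrier configuration in memcached.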
[Wikitech-l] Parsoid charset corruption post-mortem
On 11/04/2013 12:58 PM, Erik Moeller wrote:

> This was due to a broken deployment of Parsoid, the new MediaWiki parser
> used by VisualEditor. A new library dependency defaulted to iso8859-1
> instead of utf-8, which caused character munging to occur.

I have created a post-mortem at https://wikitech.wikimedia.org/wiki/Incident_documentation/20131104-Parsoid

The most important recommendation is the creation of automated full-stack web API tests, in addition to our existing parser-test and round-trip test setups. These should catch library issues like this before deployment in the future.

We also updated the Parsoid deploy guidelines to include manual edit checks on non-English wikis, which will help catch any remaining production-specific deployment issues.

Gabriel
Re: [Wikitech-l] Memento Extension for MediaWiki: Advice on Further Development
On Nov 4, 2013, at 14:24, Remco de Boer remcocdeb...@gmail.com wrote:

> Hi Shawn,
>
>> I'm currently working on the Memento Extension for MediaWiki, as announced
>> earlier today by Herbert Van de Sompel.
>
> This is very exciting! Coincidentally, at last week's SMWCon (the Semantic
> MediaWiki conference) in Berlin I gave a presentation to argue that we need
> some sort of 'time travelling' feature (slides are available at
> http://slidesha.re/1iIf3F9). One of the other participants also pointed out
> the Memento protocol.
>
> Are you familiar with Semantic MediaWiki (http://www.semantic-mediawiki.org/)
> as an extension to MediaWiki? I'm curious what it would take to let SMW play
> nice together with Memento.
>
> Kind regards,
> Remco de Boer

Before announcing the Memento extension to this list, we tested it with a locally installed Semantic MediaWiki and all seemed OK. It would be great if someone could test it on a live one with actual real data. We got in touch with the people behind http://neurolex.org/wiki/Main_Page but they are running an older MediaWiki version and are not in a hurry to upgrade because they have a lot of extensions.

From the early days of Memento, we have been very interested in semantic web and linked data applications of the Memento protocol. See, for example:

- http://arxiv.org/abs/1003.3661 - illustrates the power of the protocol to do time-series analysis across versions of linked data descriptions (in DBpedia)
- http://mementoweb.org/depot/native/dbpedia/ - the DBpedia archive that we operate, which is Memento compliant

Greetings,
Herbert