[MediaWiki-CodeReview] [MediaWiki r85403]: New comment added, and revision status changed
User Siebrand changed the status of MediaWiki.r85403. Old Status: new New Status: fixme User Siebrand also posted a comment on MediaWiki.r85403. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85403#c15719 Comment: FIXME: +var_dump('foo'); ___ MediaWiki-CodeReview mailing list mediawiki-coderev...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview
[MediaWiki-CodeReview] [MediaWiki r85296]: New comment added, and revision status changed
User Tbleher changed the status of MediaWiki.r85296. Old Status: new New Status: fixme User Tbleher also posted a comment on MediaWiki.r85296. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85296#c15720 Comment: Surely getSkin() should return a Skin object rather than a Linker object? I use an old version of SMW on my wiki[1] that does $parser->getOptions()->getSkin()->makeSpecialUrl(). This is broken now. [1]: I know I should upgrade, but my version contains some custom code that will have to be forward-ported.
[MediaWiki-CodeReview] [MediaWiki r82778]: New comment added
User Tim Starling posted a comment on MediaWiki.r82778. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/82778#c15721 Comment: In EditPage::showEditForm:fields, you should do what you are doing already. Use $parser->mOutput->setProperty() in the parser hook, remove the ArticleSaveComplete hook, and then load from page_props in the edit page. To get the properties in SkinTemplateToolboxEnd, you should do the following:
* Hook OutputPageParserOutput. That hook is passed an OutputPage object and a ParserOutput object.
* Get the central wiki pages with $parserOutput->getProperty(), then store them in a custom property in the OutputPage object, say
<pre>
$outputPage->interlanguage_pages = $parserOutput->getProperty( 'interlanguage_pages' );
</pre>
* In SkinTemplateToolboxEnd, you get a Skin object. In 1.17, the OutputPage object is just $wgOut; in 1.18 it is $skin->getContext()->getOutput(). You should aim for 1.17 compatibility for now, since we want to deploy this soon. So check for your custom property with isset(), with code like:
<pre>
global $wgOut;
if ( isset( $wgOut->interlanguage_pages ) ) {
    foreach ( $wgOut->interlanguage_pages as $title ) {
        ...
    }
}
</pre>
This will work with the parser cache, giving you up-to-date properties with no extra queries.
[MediaWiki-CodeReview] [MediaWiki r85296]: New comment added
User Dantman posted a comment on MediaWiki.r85296. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85296#c15722 Comment: Why do we have to support old versions of extensions in new versions of core?
[MediaWiki-CodeReview] [MediaWiki r84997]: Revision status changed
User Tim Starling changed the status of MediaWiki.r84997. Old Status: new New Status: ok Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/84997#c0
Re: [Wikitech-l] HipHop
On 05/04/11 04:47, Tim Starling wrote:
> Speaking of fast, I did a quick benchmark of the [[Barack Obama]] article with templates pre-expanded. It took 22 seconds in HipHop and 112 seconds in Zend, which is not bad, for a first attempt. I reckon it would do better if a few of the regular expressions were replaced with tight loops.
<snip>
I have imported into my local wiki the English [[Barack Obama]] article with all its dependencies. I cannot get it parsed under either a 256MB max memory or a 1 minute max execution time limit. HipHop helps, but there is still highly broken code somewhere in our PHP source code. No matter how many hacks we throw at bad code, the algorithm still needs to be fixed.

Also, browsing the generated source turns up silly things like:
if (equal(switch2, NAMSTR(s_ss34c5c84c, "currentmonth"))) goto case_2_0;
if (equal(switch2, NAMSTR(s_ss55b88086, "currentmonth1"))) goto case_2_1;
<snip>
71 string comparisons in total, in quite a hot function. A hashtable would probably be better.

As I understand it, HipHop is just a straight translator from PHP to the C language, but does not actually enhance the code. Just like you would use Google Translate instead of Charles Baudelaire [1].

Your code above comes from Parser.php getVariableValue(), which uses a long switch() structure to map a string to a method call. If you manage to find unoptimized code in the translated code, fix it in the PHP source code :-b

[1] French poet, probably best known in the UK/US for his work translating Edgar Allan Poe from English to French.

-- Ashar Voultoiz ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] HipHop
On 04/05/2011 04:45 PM, Ashar Voultoiz wrote:
> On 05/04/11 04:47, Tim Starling wrote:
>> Speaking of fast, I did a quick benchmark of the [[Barack Obama]] article with templates pre-expanded. It took 22 seconds in HipHop and 112 seconds in Zend, which is not bad, for a first attempt. I reckon it would do better if a few of the regular expressions were replaced with tight loops.
> <snip>
> I have imported into my local wiki the English [[Barack Obama]] article with all its dependencies. I cannot get it parsed under either a 256MB max memory or a 1 minute max execution time limit. HipHop helps, but there is still highly broken code somewhere in our PHP source code. No matter how many hacks we throw at bad code, the algorithm still needs to be fixed.

Let me know when you find that broken code. Try using the profiling feature from Xdebug to narrow down the causes of CPU usage.

> Also, browsing the generated source turns up silly things like:
> if (equal(switch2, NAMSTR(s_ss34c5c84c, "currentmonth"))) goto case_2_0;
> if (equal(switch2, NAMSTR(s_ss55b88086, "currentmonth1"))) goto case_2_1;
> <snip>
> 71 string comparisons in total, in quite a hot function. A hashtable would probably be better.
> As I understand it, HipHop is just a straight translator from PHP to the C language, but does not actually enhance the code. Just like you would use Google Translate instead of Charles Baudelaire [1].

It's not just a translator, it's also a reimplementation of the bulk of the PHP core.

> Your code above comes from Parser.php getVariableValue(), which uses a long switch() structure to map a string to a method call. If you manage to find unoptimized code in the translated code, fix it in the PHP source code :-b

In Zend PHP, switch statements are implemented by making a hashtable at compile time, and then doing a hashtable lookup at runtime. The HipHop implementation is less efficient. So getVariableValue() is not broken, it's just not optimised for HipHop.
-- Tim Starling
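Tim's point about switch dispatch can be illustrated with a small sketch. This is not MediaWiki's actual getVariableValue() code; the function names and cases below are invented for illustration. It contrasts the sequential string comparisons HipHop generates with the single hashtable lookup that Zend's compiled switch effectively performs:

```php
<?php
// Hypothetical illustration (not MediaWiki's real code): sequential
// comparisons, the shape of HipHop's generated dispatch for a switch.
function dispatchSequential( $name ) {
    if ( $name === 'currentmonth' ) return date( 'm' );
    if ( $name === 'currentday' ) return date( 'j' );
    if ( $name === 'currentyear' ) return date( 'Y' );
    return '';
}

// Hashtable dispatch: one array lookup regardless of how many cases
// exist, which is roughly what Zend's compiled switch does at runtime.
function dispatchHashed( $name ) {
    static $handlers = null;
    if ( $handlers === null ) {
        $handlers = array(
            'currentmonth' => 'm',
            'currentday'   => 'j',
            'currentyear'  => 'Y',
        );
    }
    return isset( $handlers[$name] ) ? date( $handlers[$name] ) : '';
}
```

With the 71 cases mentioned above, the sequential form averages dozens of string comparisons per call, while the hashed form stays at a single lookup.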
[MediaWiki-CodeReview] [MediaWiki r85296]: New comment added
User Siebrand posted a comment on MediaWiki.r85296. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85296#c15726 Comment: This is not a reason for marking an issue as FIXME. If the change broke the trunk version of an extension, it might be. Leaving this at FIXME because of #c15686.
[MediaWiki-CodeReview] [MediaWiki r85354]: New comment added, and revision status changed
User Bryan changed the status of MediaWiki.r85354. Old Status: new New Status: fixme User Bryan also posted a comment on MediaWiki.r85354. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85354#c15727 Comment: <pre>+* (bug 27018) Added action=filerevert to revert files to an old version</pre> Is not in 1.17.
Re: [Wikitech-l] Actions and Special Pages
On Tue, Apr 5, 2011 at 4:41 AM, Daniel Friesen li...@nadir-seen-fire.com wrote: Personally, I like tacking on ?action=edit and especially purge. Prefixing Special:Edit/ doesn't sound nice to me. I know I fixed the issues with things like Special:Movepage not sharing the same UI tabs as the rest of the actions. I'm +1 with you on this. I don't have any convincing arguments against either way, but action links just look nicer to me. Bryan
Re: [Wikitech-l] Future: Love for the sister projects!
2011/4/4 MZMcBride z...@mzmcbride.com:
> Brion Vibber wrote:
>>> I'm not a real MediaWiki developer, so it may be a silly question: How hard is it to convert a gadget into an extension? If it's not too hard, wouldn't it be better to redo useful gadgets as extensions? Unless I miss something very basic, this will make them easier to maintain, install, update and localize.
>> In principle, making an extension based on an existing gadget should be pretty straightforward, especially with ResourceLoader taking care of more of the details of JS/CSS fetching. Might be fun to whip up a how-to guide and post it on the tech blog...
> That seems like a bit of a regression, though. The Gadgets extension was implemented as a way for local admins to easily hook into Special:Preferences so that typical users could enable certain pieces of JavaScript without needing to edit any pages. By creating separate extensions for individual JavaScript gadgets, you hit some interesting benefits/detriments:
> Benefits:
> * extensions can easily be enabled globally
> * extensions have great localization support
> * extensions have sane review processes in place
> * extensions can implement functionality when JavaScript isn't available
> Detriments:
> * extensions usually take forever to get enabled (this is a huge point)

The benefit of great localization support is also a huge point, and hugely positive. It's worth the resources to fix the hard, long and frustrating review processes that cause extensions to wait for years in line to be enabled, to get the benefit of localization. It's true that extensions take forever to get enabled, and this is, indeed, a huge point - and it must be fixed. I suppose that it requires resources; well, the Foundation must prioritize this much-needed evolutionary change higher than big revolutionary projects. It's not even a matter of globalization, language equality etc. - it's something that's just plain useful for the big English Wikipedia, too.
-- Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי http://aharoni.wordpress.com We're living in pieces, I want to live in peace. - T. Moore
[MediaWiki-CodeReview] [MediaWiki r85403]: New comment added, and revision status changed
User Happy-melon changed the status of MediaWiki.r85403. Old Status: fixme New Status: new User Happy-melon also posted a comment on MediaWiki.r85403. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85403#c15728 Comment: Lol.
[MediaWiki-CodeReview] [MediaWiki r85354]: New comment added
User ^demon posted a comment on MediaWiki.r85354. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85354#c15729 Comment: Do you mean it should be and I missed it, or that I botched the RELEASE-NOTES merge?
Re: [Wikitech-l] HipHop
On 04/05/2011 05:47 AM, Tim Starling wrote:
> Speaking of fast, I did a quick benchmark of the [[Barack Obama]] article with templates pre-expanded. It took 22 seconds in HipHop and 112 seconds in Zend, which is not bad, for a first attempt. I reckon it would do better if a few of the regular expressions were replaced with tight loops.

Hmm, does HipHop precompile regexen?

-- Ilmari Karonen
[Wikitech-l] Static HTML Dumps
Hi! I am looking for static HTML dumps of the English Wikipedia, but I have found only this page: http://static.wikipedia.org/ This page contains almost three-year-old dumps! Do you know other pages where I could download recent HTML dumps from, or do you have any idea how I could get them? Thank you for your help, Samat
Re: [Wikitech-l] HipHop
On 04/05/2011 02:42 AM, Tim Starling wrote:
> On 04/05/2011 07:31 AM, Platonides wrote:
>> Having to create a reflection class and look for exceptions just to check for class existence is really ugly. However, looking at https://github.com/facebook/hiphop-php/issues/314 it seems to be a declaration-before-use problem (even though class_exists shouldn't be declaring it). I suppose that we could work around that by including all classes at the beginning instead of using the AutoLoader, which shouldn't be needed for compiled code.
> You can't include the class files for compiled classes, they all exist at startup and you get a redeclaration error if you try. I explained this in the documentation page on mediawiki.org which has now appeared. http://www.mediawiki.org/wiki/HipHop The autoloader is not used. It doesn't matter where the class_exists() is. HipHop scans the entire codebase for class_exists() at compile time, and breaks any classes it finds, whether or not the class_exists() is reachable.

These both really sound like bugs in HipHop. I've no idea how hard it would be to fix them, but are we reporting them at least?

-- Ilmari Karonen
Re: [Wikitech-l] Static HTML Dumps
Hello, The HTML dump process isn't running anymore, according to the message on the downloads page. The most recent en.wiki dump is in XML and here: http://dumps.wikimedia.org/enwiki/20110317/ Best, Huib Laurens
Re: [Wikitech-l] HipHop
On Tue, Apr 5, 2011 at 7:45 AM, Ashar Voultoiz hashar+...@free.fr wrote:
> On 05/04/11 04:47, Tim Starling wrote:
>> Speaking of fast, I did a quick benchmark of the [[Barack Obama]] article with templates pre-expanded. It took 22 seconds in HipHop and 112 seconds in Zend, which is not bad, for a first attempt. I reckon it would do better if a few of the regular expressions were replaced with tight loops.
> <snip>
> I have imported into my local wiki the English [[Barack Obama]] article with all its dependencies. I cannot get it parsed under either a 256MB max memory or a 1 minute max execution time limit. HipHop helps, but there is still highly broken code somewhere in our PHP source code. No matter how many hacks we throw at bad code, the algorithm still needs to be fixed.

For comparison: WYSIFTW parses [[Barack Obama]] in 3.5 sec on my iMac, and in 4.4 sec on my MacBook (both Chrome 12). Yes, it doesn't do template/variable replacing, and it's probably full of corner cases that break; OTOH, it's JavaScript running in a browser, which should make it much slower than a dedicated server setup running precompiled PHP. So, maybe another hard look at the MediaWiki parser is in order?

Cheers, Magnus
[MediaWiki-CodeReview] [MediaWiki r85354]: Revision status changed
User Bryan changed the status of MediaWiki.r85354. Old Status: fixme New Status: ok Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85354#c0
Re: [Wikitech-l] HipHop: need developers
On Tue, Apr 5, 2011 at 12:58 AM, Tim Starling tstarl...@wikimedia.org wrote:
> I need to get back to work on reviews and deployments and other such strategic things. I hope the work I've done on HipHop support is enough to get the project started. Initial benchmarks are showing a 5x speedup for article parsing, which is better than we had hoped for. So there are big gains to be had here, both for small users and for large websites like Wikimedia and Wikia. There's a list of things that still need doing under the to do heading at: http://www.mediawiki.org/wiki/HipHop I'll be available to support any developers who want to work on this.
> -- Tim Starling

This is something I'm interested in and hope to put a little more time into in the near future once the 1.17 release is behind us.

-Chad
[MediaWiki-CodeReview] [MediaWiki r82413]: New comment added
User ^demon posted a comment on MediaWiki.r82413. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/82413#c15731 Comment: I couldn't get this to merge cleanly to REL1_17.
Re: [Wikitech-l] HipHop
2011/4/5 Magnus Manske magnusman...@googlemail.com:
> For comparison: WYSIFTW parses [[Barack Obama]] in 3.5 sec on my iMac, and in 4.4 sec on my MacBook (both Chrome 12). Yes, it doesn't do template/variable replacing, and it's probably full of corner cases that break; OTOH, it's JavaScript running in a browser, which should make it much slower than a dedicated server setup running precompiled PHP.

Seriously, the bulk of the time needed to parse these enwiki articles is for template expansion. If you pre-expand them, taking care that the templates inside <ref>...</ref> tags also get expanded, MediaWiki can parse the article in a few seconds, 3-4 on my laptop.
Re: [Wikitech-l] HipHop
On Tue, Apr 5, 2011 at 3:30 PM, Paul Copperman paul.copper...@googlemail.com wrote:
> Seriously, the bulk of the time needed to parse these enwiki articles is for template expansion. If you pre-expand them, taking care that the templates inside <ref>...</ref> tags also get expanded, MediaWiki can parse the article in a few seconds, 3-4 on my laptop.

So is the time spent with the actual expansion (replacing variables), or getting the wikitext for n-depth template recursion? Or is it the parser functions?
[MediaWiki-CodeReview] [MediaWiki r85202]: New comment added
User ^demon posted a comment on MediaWiki.r85202. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85202#c15732 Comment: r84918 was never merged, so r84924 isn't needed in 1.17. findFileFromKey() and newFileFromKey() are both in 1.17 as they should be.
[MediaWiki-CodeReview] [MediaWiki r85430]: Revision status changed
User Platonides changed the status of MediaWiki.r85430. Old Status: new New Status: ok Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85430#c0
[MediaWiki-CodeReview] [MediaWiki r85410]: New comment added, and revision status changed
User Platonides changed the status of MediaWiki.r85410. Old Status: new New Status: fixme User Platonides also posted a comment on MediaWiki.r85410. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85410#c15733 Comment: The AbortAutoAccount description should use more than one line in hooks.txt. $message is an out parameter, so it should be specified as &$message (same problem with AbortAutoblock). It is not clear from the description that $message is a message key, which will be shown to the user through wfMsg().
Re: [Wikitech-l] GSOC and networking
On Mon, 4 Apr 2011 23:19:54 -0400 Jelle Zijlstra jelle.zijls...@gmail.com wrote:
> However, making https://www.mediawiki.org would be another useful project.

That would be useful from a privacy point of view, though just making all logins secure by default would be a good first start. Making the entire site SSL-capable would be an interesting engineering problem: can the increased cost due to higher server load be amortized over the number of users given their typical browsing habits?

On Mon, 4 Apr 2011 23:20:08 -0400 Benjamin Lees emufarm...@gmail.com wrote:
> There is a secure login option, and the login page has a link to it, but the link wasn't showing up in the right place in the Vector skin. I think I've fixed it.[0]

I see it now. Thanks!

On Tue, 5 Apr 2011 13:21:54 +1000 K. Peachey p858sn...@gmail.com wrote:
> also I thought we did have IPv6 setup for the sites.

You're absolutely right. http://wikitech.wikimedia.org/view/IPv6_deployment And it looks like there is more work to do.
Re: [Wikitech-l] HipHop
2011/4/5 Magnus Manske magnusman...@googlemail.com:
> So is the time spent with the actual expansion (replacing variables), or getting the wikitext for n-depth template recursion? Or is it the parser functions?

Well, getting the wikitext shouldn't be very expensive as it is cached in several cache layers. Basically it's just expanding many, many preprocessor nodes. A while ago I did a bit of testing with my template tool on dewiki[1] and found that Wikimedia servers spend approx. 0.2 ms per expanded node, although there's of course much variation depending on current load. My tool counts 303,905 nodes when expanding [[Barack Obama]], so that would account for about 60 s of render time. As already said, YMMV.

Paul Copperman

[1] http://de.wikipedia.org/wiki/Benutzer:P.Copp/scripts/templateutil.js, you can test it with javascript:void(importScriptURI('http://de.wikipedia.org/w/index.php?action=raw&title=Benutzer:P.Copp/scripts/templateutil.js&ctype=text/javascript')) and a click on "Template tools" in the toolbox
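Paul's estimate is easy to check with back-of-envelope arithmetic; the figures below are the ones quoted in his message:

```php
<?php
// Back-of-envelope check of the estimate above: ~0.2 ms per expanded
// preprocessor node, 303,905 nodes for [[Barack Obama]].
$msPerNode = 0.2;
$nodeCount = 303905;
$renderSeconds = $msPerNode * $nodeCount / 1000;
// Roughly 60.8 seconds, consistent with the ~60 s quoted in the post.
```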
[MediaWiki-CodeReview] [MediaWiki r85247]: New comment added
User Bawolff posted a comment on MediaWiki.r85247. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85247#c15734 Comment: One potential use case for not making the Linker static is to consider sites that have multiple URLs (e.g. http://secure.wikimedia.org/wikipedia/en/wiki/ vs http://en.wikipedia.org ). In most contexts one would want to make links that reflect the site you're currently looking at, but in some cases you would want to use the canonical URL for the site (email notifications, and perhaps the IRC notifications, although at the moment we use some black magic to get around that). Being able to have different Linker objects to generate canonical vs normal links might be useful. Of course, at the moment we don't do that, but making the Linker static makes it more difficult to do (on the other hand, it is somewhat of an edge case that sites have multiple URLs).
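Bawolff's idea can be sketched with a toy example. The class below is hypothetical (MediaWiki's Linker has no such constructor or makeLink() signature); it only shows why instance state makes per-site base URLs possible where a purely static class would not:

```php
<?php
// Toy illustration of the comment above, not MediaWiki's actual Linker
// API: each linker instance carries its own base URL, so canonical and
// local links can be generated side by side.
class UrlLinker {
    private $baseUrl;

    public function __construct( $baseUrl ) {
        $this->baseUrl = rtrim( $baseUrl, '/' );
    }

    public function makeLink( $title ) {
        $href = $this->baseUrl . '/wiki/' . rawurlencode( $title );
        return '<a href="' . htmlspecialchars( $href ) . '">'
            . htmlspecialchars( $title ) . '</a>';
    }
}

// Two independently configured linkers for the same wiki.
$local     = new UrlLinker( 'http://en.wikipedia.org' );
$canonical = new UrlLinker( 'http://secure.wikimedia.org/wikipedia/en' );
```

With a fully static Linker, the base URL would have to come from globals, so email notifications and page views could not easily use different URLs in the same request.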
[MediaWiki-CodeReview] [MediaWiki r85247]: New comment added
User Happy-melon posted a comment on MediaWiki.r85247. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85247#c15735 Comment: Sites having multiple URLs is itself black magic/hackish. The most we should ever have to change is the protocol.
[MediaWiki-CodeReview] [MediaWiki r85439]: Revision status changed
User Reedy changed the status of MediaWiki.r85439. Old Status: new New Status: ok Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85439#c0
[MediaWiki-CodeReview] [MediaWiki r85438]: New comment added
User Reedy posted a comment on MediaWiki.r85438. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85438#c15736 Comment: Can you make days longer?
[MediaWiki-CodeReview] [MediaWiki r85443]: Revision status changed
User Reedy changed the status of MediaWiki.r85443. Old Status: new New Status: ok Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85443#c0
[MediaWiki-CodeReview] [MediaWiki r85442]: Revision status changed
User Reedy changed the status of MediaWiki.r85442. Old Status: new New Status: resolved Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85442#c0
[MediaWiki-CodeReview] [MediaWiki r85441]: New comment added
User MZMcBride posted a comment on MediaWiki.r85441. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85441#c15737 Comment: Is this problem limited to this extension?
[MediaWiki-CodeReview] [MediaWiki r85441]: New comment added
User Reedy posted a comment on MediaWiki.r85441. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85441#c15738 Comment: Very likely not. I think Cite (maybe!?) has already been fixed. I'd be highly surprised if it's not a wider problem.
[MediaWiki-CodeReview] [MediaWiki r85327]: New comment added
User Krinkle posted a comment on MediaWiki.r85327. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85327#c15739 Comment: I haven't read all the diffs, but I noticed a few changes that may not be obvious and/or are conflicting with our current conventions. Since HipHop is apparently the way we go, perhaps you (Tim) could take a look at these three pages to make sure they are still correct: * [[Manual:Coding conventions]] * [[Manual:Pre-commit checklist]] * [[Manual:Code]]
[MediaWiki-CodeReview] [MediaWiki r85441]: New comment added
User MaxSem posted a comment on MediaWiki.r85441. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85441#c15740 Comment: Most likely, there are dozens of broken extensions. Oh well.
[MediaWiki-CodeReview] [MediaWiki r82778]: New comment added
User Nikola Smolenski posted a comment on MediaWiki.r82778. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/82778#c15741 Comment: OK, in r85456 I am using LinksUpdate. If it is patched the extension should work with the patch, and if not the extension should still work, without handling more than one link per language.
[Wikitech-l] Rules for text in languages/messages/Messages***.php
I recently fixed a bug (14901) and attached some patches to the bug ticket. Included were some changes to languages/messages/MessagesEn.php. Following the directions given at http://www.mediawiki.org/wiki/Localisation#Changing_existing_messages, I contacted #mediawiki-i18n and asked them for instructions on how to proceed with the internationalization part of the work. After looking over my changes, they pointed out that my new message text violated some requirements, specifically:
+ I had used \n rather than literal newlines in the text
+ I had put newlines in front of some messages
+ I had put trailing whitespace in some messages
Since I was (and am) unfamiliar with the MW internationalization workflow, and since this text is destined for email messages, not display on a wiki page, I was puzzled by these requirements. However, it was explained to me that internationalization occurs by display of the text on translatewiki.net, which means the text must conform to web page display rules. Consequently, the problems specified above interfere with the translation work. I understand I need to fix these problems. However, before proceeding I wonder if there are other constraints that I must observe. I have read the material at http://www.mediawiki.org/wiki/Localisation, but did not find the above constraints mentioned there. Is there another place where they are specified?

-- Dan Nessett
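The three constraints Dan lists are mechanical enough to check automatically. The function below is a hypothetical lint sketch, not part of MediaWiki's or translatewiki's actual tooling:

```php
<?php
// Hypothetical lint sketch for the three message-text rules mentioned
// above: no "\n" escape sequences, no leading newlines, and no trailing
// whitespace at the end of any line of the message.
function messageProblems( $text ) {
    $problems = array();
    // Single-quoted '\n' here is the two characters backslash + n,
    // i.e. an escape sequence typed into the message text.
    if ( strpos( $text, '\n' ) !== false ) {
        $problems[] = 'escaped-newline';
    }
    if ( strlen( $text ) > 0 && $text[0] === "\n" ) {
        $problems[] = 'leading-newline';
    }
    if ( preg_match( '/[ \t]$/m', $text ) ) {
        $problems[] = 'trailing-whitespace';
    }
    return $problems;
}
```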
[MediaWiki-CodeReview] [MediaWiki r85441]: New comment added
User MZMcBride posted a comment on MediaWiki.r85441. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85441#c15742 Comment: Reedy fixed up other extensions in r85450, r85451, and r85452.
Re: [Wikitech-l] Static HTML Dumps
Paul Houle wrote:
> I did a substantial project that worked from the XML dumps. I designed a recursive descent parser in C# that, with a few tricks, almost decodes Wikipedia markup correctly. Getting it right is tricky, for a number of reasons; however, my approach preserved some semantics that would have been lost in the HTML dumps. (...)
> In your case, I'd do the following: install a copy of the MediaWiki software, http://lifehacker.com/#!163707/geek-to-live--set-up-your-personal-wikipedia get a list of all the pages in the wiki by running a database query, and then write a script that does HTTP requests for all the pages and saves them in files. This is programming of the simplest type, but getting good speed could be a challenge. I'd seriously consider using Amazon EC2 for this kind of thing, renting a big DB server and a big web server, then writing a script that does the download in parallel.

He could as well generate the static HTML dumps from that. http://www.mediawiki.org/wiki/Extension:DumpHTML

I think he is better off parsing the articles, though. For linguistic research you don't need things such as the contents of templates, so a simple wikitext stripping would do. And it will be much, much, much, much faster than parsing the whole wiki.
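For the linguistic-research use case described above, "simple wikitext stripping" could look something like the sketch below. This is an assumption about what such stripping involves, not code from MediaWiki or the DumpHTML extension, and it handles only the most common markup:

```php
<?php
// Minimal wikitext-stripping sketch (an illustration of the suggestion
// above, not MediaWiki code): drop templates and refs, reduce links to
// their visible text, and strip bold/italic quote runs.
function stripWikitext( $text ) {
    // Remove {{templates}}, innermost first, to cope with nesting.
    do {
        $text = preg_replace( '/\{\{[^{}]*\}\}/s', '', $text, -1, $count );
    } while ( $count > 0 );
    // [[Target|Label]] -> Label, [[Target]] -> Target.
    $text = preg_replace( '/\[\[(?:[^|\]]*\|)?([^\]]*)\]\]/', '$1', $text );
    // Drop footnote bodies entirely, then any remaining HTML-ish tags.
    $text = preg_replace( '/<ref[^>]*>.*?<\/ref>/s', '', $text );
    $text = strip_tags( $text );
    // Bold/italic markup is runs of two or more single quotes.
    $text = preg_replace( "/'{2,}/", '', $text );
    return trim( $text );
}
```

This throws away template contents, which is exactly the behavior wanted here, and runs orders of magnitude faster than a full parse.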
Re: [Wikitech-l] GSOC and networking
Interesting. That would be an -ops project, not a software development one. It seems a riskier deliverable, although certainly interesting for WMF. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] HipHop
On Tue, Apr 5, 2011 at 9:02 AM, Magnus Manske magnusman...@googlemail.com wrote: Yes, it doesn't do template/variable replacing, and it's probably full of corner cases that break; OTOH, it's JavaScript running in a browser, which should make it much slower than a dedicated server setup running precompiled PHP. To the contrary, I'd expect JavaScript in the most recent version of any browser (even IE) to be *much* faster than PHP, maybe ten times faster on real-world tasks. All browsers now use JIT compilation for JavaScript, and have been competing intensively on raw JavaScript speed for the last three years or so. There are no drop-in alternative PHP implementations, so PHP is happy sticking with a ridiculously slow interpreter forever. Alioth's language benchmarks show JavaScript in Chrome's V8 (they don't say what version) as being at least eight times faster than PHP on most of its benchmarks, although it's slower on two: http://shootout.alioth.debian.org/u32/benchmark.php?test=all&lang=v8&lang2=php Obviously not very scientific, but it gives you an idea. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[MediaWiki-CodeReview] [MediaWiki r82413]: New comment added
User Simetrical posted a comment on MediaWiki.r82413. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/82413#c15743 Comment: Unsurprising. It's probably not a big deal, it can be left out unless we get complaints. The changes are only going to even potentially affect bots that try to screen-scrape these extensions' pages using an XML parser, which is a pretty niche issue. ___ MediaWiki-CodeReview mailing list mediawiki-coderev...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview
[Wikitech-l] Need some help with a problem with the Cite extension
Hi, Today, some users on ro.wp reported that citations used more than once in the text had changed appearance. Instead of ^a,b, we now have something like ↑ 34,0 34,1. This was considered disturbing by some users. Furthermore, the backlinks from the notes section to the article are not functioning. This does not happen on en.wp or other wikis. See http://ro.wikipedia.org/wiki/Dorin_Goian vs. http://en.wikipedia.org/wiki/Polyozellus From [[Special:Version]], I checked that the versions of the Cite extension were identical, then I searched for a relevant bug in Bugzilla, both under Site requests and under the Cite extension, but I could find nothing. Does anybody know what is the matter with the links? I did not file a bug report because I'm not sure if this is a PHP problem or some setting on ro.wp which makes the extension misbehave. Thanks a lot for any pointers, Strainu ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Static HTML Dumps
On 4/5/2011 4:00 PM, Platonides wrote: I think he is better off parsing the articles, though. For linguistic research you don't need things such as the contents of templates, so a simple wikitext stripping would do. And it will be much, much, much, much faster than parsing the whole wiki. Could be true, but what's fascinating for me about Wikipedia is all of the unscrambled eggs that can be found in the middle of otherwise unstructured text. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Rules for text in languages/messages/Messages***.php
On Tue, Apr 5, 2011 at 11:33 AM, Dan Nessett dness...@yahoo.com wrote: I understand I need to fix these problems. However, before proceeding I wonder if there are other constraints that I must observe. I have read the material at http://www.mediawiki.org/wiki/Localisation, but did not find the above constraints mentioned there. Is there another place where they are specified? I've added a few notes about the whitespace stuff: http://www.mediawiki.org/wiki/Localisation#Be_aware_of_whitespace_and_line_breaks -- brion ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[MediaWiki-CodeReview] [MediaWiki r85468]: Revision status changed
User Krinkle changed the status of MediaWiki.r85468. Old Status: new New Status: deferred Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85468#c0 ___ MediaWiki-CodeReview mailing list mediawiki-coderev...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview
Re: [Wikitech-l] HipHop
On 5 April 2011 21:29, Aryeh Gregor simetrical+wikil...@gmail.com wrote: To the contrary, I'd expect JavaScript in the most recent version of any browser (even IE) to be *much* faster than PHP, maybe ten times faster on real-world tasks. All browsers now use JIT compilation for JavaScript, and have been competing intensively on raw JavaScript speed for the last three years or so. There are no drop-in alternative PHP implementations, so PHP is happy sticking with a ridiculously slow interpreter forever. So if we machine-translate the parser into JS, we can get the user to do the work and everyone wins! [*] (Magnus, did you do something like this for WYSIFTW?) - d. [*] if they're using a recent browser on a recent computer, etc etc, ymmv. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[MediaWiki-CodeReview] [MediaWiki r85247]: New comment added
User Dantman posted a comment on MediaWiki.r85247. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85247#c15744 Comment: That's not a good reason for the linker not to be static; that has nothing to do with a skin overloading parts of the linker. That's more a use case for a hook to alter URLs (DumpHTML could use one too), or a custom hook for fragmenting the cache by URL, some sort of multiple-URL config, or substituting local URLs in stuff going into the parser cache with tokens. ___ MediaWiki-CodeReview mailing list mediawiki-coderev...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview
Re: [Wikitech-l] Linker, more on Skins, OutputPage, and context code patterns changes
On Mon, Apr 4, 2011 at 6:06 PM, Daniel Friesen li...@nadir-seen-fire.com wrote: Continuing with my changes to $wgOut, $wgUser, Skin, and SpecialPage patterns. The Linker is now static; $sk->link will still work, however you should not require a Skin anymore to use linker methods. Please use Linker::link directly. I was about to go no! because mixed static/non-static functions throw deprecation warnings these days, but I see you've done this by separating the classes entirely and using a magic __call handler on Skin to forward the calls, neatly sidestepping the issue and avoiding writing a bunch of stubs. Yay! That should work, but feels a little icky. :D But if it's just for back-compat and we're migrating code over, then that's probably not a problem... Main thing to keep in mind is that exceptions thrown from a __call function for unknown stuff will act differently from the fatal error you otherwise get from misspelling a function name, as exceptions are catchable etc. This can sometimes lead to surprises. We also lose the documentation comments on the functions as they're currently used on Skin, but if that's being deprecated, it's not too big a deal. A bigger worry: DumpHTML's SkinOffline overrides makeBrokenLinkObj to change behavior (looks like it never got updated to override link()'s behavior for the broken-link case?). Looks like that might need to be updated to use BeginLink/EndLink hooks or something... but I'm not sure how good a shape DumpHTML is in right now; it might need many more such fixes. -- brion ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
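The pattern Brion describes above -- a static Linker plus a magic __call handler on Skin forwarding legacy instance calls -- can be sketched roughly as follows; the class bodies are heavily simplified and this is not the actual MediaWiki source:

```php
<?php
// Simplified sketch, not the real MediaWiki classes.
class Linker {
	public static function link( $title ) {
		return '<a href="/wiki/' . $title . '">' . $title . '</a>';
	}
}

class Skin {
	// Forward legacy instance calls like $sk->link() to Linker::link().
	public function __call( $name, $args ) {
		if ( method_exists( 'Linker', $name ) ) {
			return call_user_func_array( array( 'Linker', $name ), $args );
		}
		// Note: unlike a misspelled real method (a fatal error), this is
		// a catchable exception -- the behavioral difference noted above.
		throw new Exception( "Call to undefined method Skin::$name" );
	}
}

$sk = new Skin();
echo $sk->link( 'Main_Page' ), "\n";    // back-compat path via __call
echo Linker::link( 'Main_Page' ), "\n"; // preferred direct call
```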
Re: [Wikitech-l] Linker, more on Skins, OutputPage, and context code patterns changes
On 11-04-05 02:16 PM, Brion Vibber wrote: [...] A bigger worry: DumpHTML's SkinOffline overrides makeBrokenLinkObj to change behavior (looks like it never got updated to override link()'s behavior for the broken-link case?). Looks like that might need to be updated to use BeginLink/EndLink hooks or something... but I'm not sure how good a shape DumpHTML is in right now; it might need many more such fixes. -- brion

SkinOffline:
- Overrides a linker method
- Uses the deprecated poweredbyico and copyrightico
- Is completely out of date as a skin: no use of headelement, footerlinks, blah, blah, blah...
- Overrides the method generating content_actions instead of using content_navigation
- Creates an OutputPage instead of using RequestContext

I don't really like the fact that DumpHTML handles skin stuff by copying Monobook and making some overriding hacks in it. I'm starting to think we should drop SkinOffline, create an offline/dump/static render mode inside Skin or OutputPage to be used to make some of the minor tweaks to skin markup relevant to static output, and make DumpHTML work by setting that offline mode, making calls to OutputPage to alter what scripts/styles are loaded, and using hooks to alter the behaviour of links made by the linker. And as a bonus:
- We won't have to modify a clone of MonoBook anymore
- And we'll be able to make dumps from any skin you want, including Vector, Modern, or any custom skin you build.
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name] ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Narayam User Interface Improvements
When Roan Kattouw reviewed, and then rewrote, most of the Narayam extension, I reviewed his code and found many things about the user interface that could be, and need to be, improved. I have both some observations and some suggestions, expressed in the attached PDF. The exact design I'm proposing isn't a requirement, but hopefully it provides a starting point for getting this extension's user interface where we need it to be. - Trevor ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Narayam User Interface Improvements
Hoi, Where can this PDF be found? Thanks, GerardM On 5 April 2011 23:44, Trevor Parscal tpars...@wikimedia.org wrote: When Roan Kattouw reviewed, and then rewrote most of the Narayam extension, I reviewed his code, and found many things about the user interface that could be, and need to be improved. [...] - Trevor ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Narayam User Interface Improvements
OK, not an attachment... File:Narayam-proposal.pdf File:Narayam-proposal-annotated.pdf - Trevor On Apr 5, 2011, at 2:44 PM, Trevor Parscal wrote: When Roan Kattouw reviewed, and then rewrote most of the Narayam extension, I reviewed his code, and found many things about the user interface that could be, and need to be improved. I have both some observations and some suggestions, expressed in the attached PDF. The exact design I'm proposing isn't a requirement, but hopefully is provides a starting point for a way to get this extension's user interface where we need it to be. - Trevor ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Narayam User Interface Improvements
And by that I meant... http://www.mediawiki.org/wiki/File:Narayam-proposal.pdf http://www.mediawiki.org/wiki/File:Narayam-proposal-annotated.pdf (I need to stop clicking send so fast... waiting a minute... OK, this one looks good.) - Trevor On Apr 5, 2011, at 2:55 PM, Trevor Parscal wrote: OK, not an attachment... [...] ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[MediaWiki-CodeReview] [MediaWiki r85450]: Revision status changed
User Platonides changed the status of MediaWiki.r85450. Old Status: new New Status: ok Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85450#c0 ___ MediaWiki-CodeReview mailing list mediawiki-coderev...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview
Re: [Wikitech-l] HipHop
On Tue, Apr 5, 2011 at 10:07 PM, David Gerard dger...@gmail.com wrote: [...] So if we machine-translate the parser into JS, we can get the user to do the work and everyone wins! [*] (Magnus, did you do something like this for WYSIFTW?) Nope, all hand-rolled, just like my last 50 or so parser wannabe implementations ;-) (this one has the advantage of a "dunno what this is, just keep the wikitext" fallback, which helped a lot) Magnus ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[MediaWiki-CodeReview] [MediaWiki r68358]: Revision status changed
User Platonides changed the status of MediaWiki.r68358. Old Status: fixme New Status: reverted Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/68358#c0 ___ MediaWiki-CodeReview mailing list mediawiki-coderev...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview
Re: [Wikitech-l] Need some help with a problem with the Cite extension
Strainu wrote: [...] Does anybody know what is the matter with the links? I did not file a bug report because I'm not sure if this is a PHP problem or some setting on ro.wp which makes the extension misbehave. [...] The problem is actually links to anchors starting with an underscore, which were broken in r68358 (they get stripped). It had already been reported as bug 27474, but I had to rediscover it the hard way. You can work around it by changing http://ro.wikipedia.org/w/index.php?title=MediaWiki:cite_reference_link_prefix&action=edit to not begin with an underscore (the default since r32160, cite_ref-, works fine). I reverted it in r85481. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[MediaWiki-CodeReview] [MediaWiki r79108]: New comment added
User Platonides posted a comment on MediaWiki.r79108. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/79108#c15745 Comment: Maybe you were running with an old parserTest.inc version where $wgMemc = wfGetMainCache(); $messageMemc = wfGetMessageCacheStorage(); $parserMemc = wfGetParserCacheStorage(); still worked by reference. In that case I do get a "parsertest_objectcache doesn't exist" message (although not for message-profiling). ___ MediaWiki-CodeReview mailing list mediawiki-coderev...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview
Re: [Wikitech-l] Need some help with a problem with the Cite extension
Implemented the workaround. Should be fixed now. Ryan Kaldari On 4/5/11 3:42 PM, Platonides wrote: [...] You can work around it by changing http://ro.wikipedia.org/w/index.php?title=MediaWiki:cite_reference_link_prefix&action=edit to not begin with an underscore [...] ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Need some help with a problem with the Cite extension
Well it fixed the functionality of the links, but they still look weird. Ryan Kaldari On 4/5/11 4:02 PM, Ryan Kaldari wrote: Implemented the workaround. Should be fixed now. [...] ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[MediaWiki-CodeReview] [MediaWiki r85403]: New comment added, and revision status changed
User Platonides changed the status of MediaWiki.r85403. Old Status: new New Status: fixme User Platonides also posted a comment on MediaWiki.r85403. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85403#c15746 Comment: $ php tests/parserTests.php
PHP Fatal error: Call to a member function getDefaultUserOptionOverrides() on a non-object in includes/User.php on line 1071
Backtrace:
User.php:1071
User.php:1963
RequestContext.php:108
RequestContext.php:194
parserTest.inc:684
parserTest.inc:427
parserTest.inc:374
parserTest.inc:356
parserTests.php:90
Reason: You are removing the Language::factory( $lang ); ___ MediaWiki-CodeReview mailing list mediawiki-coderev...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview
[Wikitech-l] Need some help with a problem with the Cite extension
Message: 1 Date: Tue, 5 Apr 2011 23:35:00 +0300 From: Strainu strain...@gmail.com Subject: [Wikitech-l] Need some help with a problem with the Cite extension To: Wikimedia developers wikitech-l@lists.wikimedia.org [...] From what I can tell, it seems as if a wrong version of [[ro:mediawiki:cite_references_link_many_format]] somehow got cached somewhere. It was changed to its current value in April 2007. However, it was showing the wrong value in [[Special:AllMessages]]. Anyway, after purging that page (or perhaps this is all a big coincidence), the right version appeared in Special:AllMessages and the correct behaviour was observed on [[Dorin_Goian]] (after purging that page too, to get rid of the parser-cached version of it). So I'm not sure what the deal with that was, but it seems to be a one-time issue. (Or, also entirely possible, it fixed itself while I was looking at it, and I just think purging the page had something to do with it, when really it's all a big coincidence.)
-bawolff ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[MediaWiki-CodeReview] [MediaWiki r85492]: Revision status changed
User Jack Phoenix changed the status of MediaWiki.r85492. Old Status: deferred New Status: new Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85492#c0 ___ MediaWiki-CodeReview mailing list mediawiki-coderev...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview
[MediaWiki-CodeReview] [MediaWiki r85500]: Revision status changed
User Jack Phoenix changed the status of MediaWiki.r85500. Old Status: deferred New Status: new Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85500#c0 ___ MediaWiki-CodeReview mailing list mediawiki-coderev...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview
[MediaWiki-CodeReview] [MediaWiki r85503]: Revision status changed
User Jack Phoenix changed the status of MediaWiki.r85503. Old Status: deferred New Status: new Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85503#c0 ___ MediaWiki-CodeReview mailing list mediawiki-coderev...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview
[MediaWiki-CodeReview] [MediaWiki r85505]: Revision status changed
User Jack Phoenix changed the status of MediaWiki.r85505. Old Status: deferred New Status: new Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85505#c0 ___ MediaWiki-CodeReview mailing list mediawiki-coderev...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview
[MediaWiki-CodeReview] [MediaWiki r85506]: Revision status changed
User Jack Phoenix changed the status of MediaWiki.r85506. Old Status: deferred New Status: new Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85506#c0 ___ MediaWiki-CodeReview mailing list mediawiki-coderev...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview
Re: [Wikitech-l] Narayam User Interface Improvements
On Tue, Apr 5, 2011 at 2:56 PM, Trevor Parscal tpars...@wikimedia.org wrote: And by that I meant... http://www.mediawiki.org/wiki/File:Narayam-proposal.pdf http://www.mediawiki.org/wiki/File:Narayam-proposal-annotated.pdf Some nice thoughts; I definitely like having a more compact trigger at the top of the page (see page 4 of the annotated PDF). Another thing to consider is making input methods available separately from the content or UI language: we have a lot of mixed-language wikis, and not everybody's going to want to switch their preferences (or figure out 'uselang=' URL magic) just to type a little bit. Even on single-language wikis like the Wikipedias, you might want to, say, type a name in a language you know, on a wiki that's in another language you know. This could be a case for a user preference, like enabling additional input methods in full-blown operating systems: usually if you're typing in a language that doesn't use input methods, they'll never get in your way, but you can select some languages and then you get a status bar icon or such where you can pick from your options. Would benefit from global preferences, too! While we're in there, I've added Esperanto conversion rules to Narayam (trunk r85504) -- it seems to work really nicely! I'd strongly consider dropping the old server-side conversion system for this if it's ready to roll by 1.18. (Making quick notes on bugs 3615 and 21781 to this effect.) -- brion ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[MediaWiki-CodeReview] [MediaWiki r85507]: Revision status changed
User Jack Phoenix changed the status of MediaWiki.r85507. Old Status: deferred New Status: new Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85507#c0 ___ MediaWiki-CodeReview mailing list mediawiki-coderev...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview
[MediaWiki-CodeReview] [MediaWiki r85465]: New comment added
User NeilK posted a comment on MediaWiki.r85465. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85465#c15747 Comment: When I loaded this for the first time on prototype, I managed to get the spinner spinning above the UploadWizard. The spinner didn't go away. ___ MediaWiki-CodeReview mailing list mediawiki-coderev...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview
Re: [Wikitech-l] Narayam User Interface Improvements
On Wed, April 6, 2011 3:26 am, Trevor Parscal wrote: And by that I meant... http://www.mediawiki.org/wiki/File:Narayam-proposal.pdf http://www.mediawiki.org/wiki/File:Narayam-proposal-annotated.pdf The positioning of the IME looks good. It does not take much space, and showing it on hover over the icon is nice. Placing a select box inside a dropdown div does not look good to me (page 5 of the PDF). That adds one extra level of user interaction. Can't we simplify this by listing the available input methods with radio buttons in the dropdown div itself? Something like this:

[icon] [personal tools]
+------------------------+
| Select an Input Tool:  |
| O [Input Method 1]     |
| O [Input Method 2]     |
| O [Input Method 3]     |
| O [Disable - Ctrl+M]   |
| [Help Link]            |
+------------------------+

Thanks Santhosh ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[MediaWiki-CodeReview] [MediaWiki r85504]: New comment added, and revision status changed
User MaxSem changed the status of MediaWiki.r85504. Old Status: new New Status: fixme User MaxSem also posted a comment on MediaWiki.r85504. Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85504#c15748 Comment: Misses the .js file. ___ MediaWiki-CodeReview mailing list mediawiki-coderev...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview