[Wikitech-l] Lack of search results from Wikisource services

2011-04-06 Thread Billinghurst
I haven't been able to get an answer or response via IRC, so I will try this 
approach.

When running searches at the English and French Wikisources today, I have not 
been given search results, though the search will successfully run an AJAX 
check, locate a link, and, if it is blue, forward you to that link.

These links do not give results (searches tried by myself and another user):

Secure, logged in, English Wikisource
https://secure.wikimedia.org/wikisource/en/w/index.php?title=Special%3ASearch&redirs=1&search=shakespeare&fulltext=Search&ns0=1&ns4=1&ns8=1&ns10=1&ns12=1&ns14=1&ns100=1&ns102=1&ns104=1&ns106=1&title=Special%3ASearch&advanced=1&fulltext=Advanced+search

Non-secure, not logged in, French WS
http://fr.wikisource.org/w/index.php?title=Sp%C3%A9cial%3ARecherche&redirs=1&search=Shakespeare&fulltext=Search&ns0=1&title=Sp%C3%A9cial%3ARecherche&advanced=1&fulltext=Advanced+search

Some assistance would be great.
Regards Andrew

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Rules for text in languages/messages/Messages***.php

2011-04-06 Thread Niklas Laxström
On 5 April 2011 23:45, Brion Vibber br...@pobox.com wrote:
 On Tue, Apr 5, 2011 at 11:33 AM, Dan Nessett dness...@yahoo.com wrote:

 I understand I need to fix these problems. However, before proceeding I
 wonder if there are other constraints that I must observe. I have read
 the material at http://www.mediawiki.org/wiki/Localisation, but did not
 find the above constraints mentioned there. Is there another place where
 they are specified?


 I've added a few notes about the whitespace stuff:
 http://www.mediawiki.org/wiki/Localisation#Be_aware_of_whitespace_and_line_breaks

Another thing is markup which MediaWiki automatically converts. This is
disabled on translatewiki.net but not when editing messages in a local wiki.
Also, the message names must be valid MediaWiki titles and unique
case-insensitively (a restriction placed by twn).
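The case-insensitive uniqueness rule can be checked mechanically; a hypothetical helper sketch (the message keys below are invented examples):

```python
# Report message keys that collide when compared case-insensitively,
# i.e. keys that would violate the twn restriction described above.
def case_insensitive_duplicates(keys):
    seen = {}
    dupes = []
    for key in keys:
        folded = key.lower()
        if folded in seen:
            dupes.append((seen[folded], key))
        else:
            seen[folded] = key
    return dupes

msgs = ["mainpage", "MainPage", "search"]
print(case_insensitive_duplicates(msgs))  # [('mainpage', 'MainPage')]
```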

  -Niklas

-- 
Niklas Laxström



[MediaWiki-CodeReview] [MediaWiki r85466]: Revision status changed

2011-04-06 Thread MediaWiki Mail
User Nikerabbit changed the status of MediaWiki.r85466.

Old Status: new
New Status: ok

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85466#c0

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r85508]: New comment added

2011-04-06 Thread MediaWiki Mail
User Siebrand posted a comment on MediaWiki.r85508.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85508#c15749

Comment:

Can we please drop all but the CC 3.0 licenses (2.0 and 2.5)? They confuse the 
heck out of uploaders. Those who want 2.0 or 2.5 for whatever reason will add 
it anyway.



[MediaWiki-CodeReview] [MediaWiki r85508]: New comment added

2011-04-06 Thread MediaWiki Mail
User MaxSem posted a comment on MediaWiki.r85508.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85508#c15750

Comment:

Hiding the older licenses makes a lot of sense for self-created images. 
However, there are lots of images licensed under the previous versions 
available in the internets, and people should be able to specify them when 
uploading.



[MediaWiki-CodeReview] [MediaWiki r85479]: Revision status changed

2011-04-06 Thread MediaWiki Mail
User Krinkle changed the status of MediaWiki.r85479.

Old Status: new
New Status: ok

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85479#c0



[Wikitech-l] Core html of a wikisource page

2011-04-06 Thread Alex Brollo
I saved the HTML source of a typical Page: page from it.source; the
resulting text file is ~28 kB. Then I saved only the core HTML, i.e.
the content of <div class="pagetext">, and this file is 2.1 kB; so
there's a more than tenfold ratio between container and real content.

Is there a trick to download the core HTML only? And, most important: could
this save a little bit of server load/bandwidth? I humbly think that the core
HTML alone could be useful as a means to obtain well-formed page
content, and that this could be useful to obtain derived formats of the
page (e.g. ePub).

Alex brollo


[MediaWiki-CodeReview] [MediaWiki r85477]: Revision status changed

2011-04-06 Thread MediaWiki Mail
User Krinkle changed the status of MediaWiki.r85477.

Old Status: new
New Status: ok

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85477#c0



[MediaWiki-CodeReview] [MediaWiki r85469]: Revision status changed

2011-04-06 Thread MediaWiki Mail
User Krinkle changed the status of MediaWiki.r85469.

Old Status: new
New Status: ok

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85469#c0



Re: [Wikitech-l] Core html of a wikisource page

2011-04-06 Thread Daniel Kinzler
On 06.04.2011 09:15, Alex Brollo wrote:
 I saved the HTML source of a typical Page: page from it.source; the
 resulting text file is ~28 kB. Then I saved only the core HTML, i.e.
 the content of <div class="pagetext">, and this file is 2.1 kB; so
 there's a more than tenfold ratio between container and real content.

wow, really? that seems a lot...

 Is there a trick to download the core HTML only? 

there are two ways:

a) the old style render action, like this:
http://en.wikipedia.org/wiki/Foo?action=render

b) the api parse action, like this:
http://en.wikipedia.org/w/api.php?action=parse&page=Foo&redirects=1&format=xml

To learn more about the web API, have a look at 
http://www.mediawiki.org/wiki/API
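As a quick sketch, the two fetch styles above can be written as URL builders (Python used only for illustration; the host and the title "Foo" come from the examples above, and no request is actually made):

```python
# Build the two URL forms described above: the old-style render
# action and the API parse action. Construction only; no fetch.
from urllib.parse import urlencode, quote

def render_url(host, title):
    # a) old-style render action: returns the bare page HTML
    return "http://%s/wiki/%s?action=render" % (host, quote(title))

def api_parse_url(host, title):
    # b) API parse action: HTML wrapped in API meta-info
    params = urlencode({
        "action": "parse",
        "page": title,
        "redirects": 1,
        "format": "xml",
    })
    return "http://%s/w/api.php?%s" % (host, params)

print(render_url("en.wikipedia.org", "Foo"))
print(api_parse_url("en.wikipedia.org", "Foo"))
```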

 And, most important: could
 this save a little bit of server load/bandwidth? 

No, quite the contrary. The full page HTML is heavily cached. If you pull the
full page (without being logged in), it's quite likely that the page will be
served from a front-tier reverse proxy (Squid or Varnish). API requests and
render actions, however, always go through to the actual Apache servers and
cause more load.

However, as long as you don't make several requests at once, you are not putting
any serious strain on the servers. Wikimedia serves more than a hundred thousand
requests per second. One more is not so terrible...

 I humbly think that core
 html alone could be useful as a means to obtain a well formed page
 content,  and that this could be useful to obtain derived formats of the
 page (i.e. ePub).

It is indeed frequently used for that.

cheers,
daniel



Re: [Wikitech-l] Core html of a wikisource page

2011-04-06 Thread Alex Brollo
2011/4/6 Daniel Kinzler dan...@brightbyte.de

 On 06.04.2011 09:15, Alex Brollo wrote:
  I saved the HTML source of a typical Page: page from it.source; the
  resulting text file is ~28 kB. Then I saved only the core HTML, i.e.
  the content of <div class="pagetext">, and this file is 2.1 kB; so
  there's a more than tenfold ratio between container and real content.

 wow, really? that seems a lot...

  Is there a trick to download the core HTML only?

 there are two ways:

 a) the old style render action, like this:
 http://en.wikipedia.org/wiki/Foo?action=render

 b) the api parse action, like this:
 
 http://en.wikipedia.org/w/api.php?action=parse&page=Foo&redirects=1&format=xml
 

 To learn more about the web API, have a look at 
 http://www.mediawiki.org/wiki/API


Thanks Daniel, API stuff is a little hard for me: the more I study, the
less I edit. :-)

Just to have a try, I called the same page: the render action gives a file of
~3.4 kB, the api action a file of ~5.6 kB. Obviously I'm thinking of bot
downloads. Are you suggesting that it would be a good idea to use an
unlogged-in bot to avoid page parsing, and to catch the page code from some
cache? I know that some thousands of calls are nothing for wiki servers,
but... I always try to get good performance, even from the most banal template.

Alex


Re: [Wikitech-l] Core html of a wikisource page

2011-04-06 Thread Daniel Kinzler
Hi Alex
 Thanks Daniel, API stuff is a little hard for me:  the more I study, the
 less I edit. :-)
 
 Just to have a try, I called the same page: the render action gives a file of
 ~3.4 kB, the api action a file of ~5.6 kB.

That's because the render call returns just the HTML, while the API call
includes some meta-info in the XML wrapper.

 Obviously I'm thinking of bot
 downloads. Are you suggesting that it would be a good idea to use an
 unlogged-in bot to avoid page parsing, and to catch the page code from some cache?

No. I'm saying that non-logged-in views of full pages are what causes the least
server load. I'm not saying that this is what you should use. For one thing, it
wastes bandwidth and causes additional work on your side (stripping the skin
cruft).

I would recommend using action=render if you need just the plain old HTML, or
the API if you need a bit more control, e.g. over whether templates are resolved
or not, how redirects are handled, etc.

Whether your bot is logged in when fetching the pages would only matter if you
requested full page HTML. Which, as I said, isn't the best option for what you
are doing. So, logged in or not, it doesn't matter. But do use a distinctive and
descriptive User-Agent string for your bot, ideally containing some contact info:
http://meta.wikimedia.org/wiki/User-Agent_policy.
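A minimal sketch of that User-Agent advice, assuming Python's urllib; the bot name and contact details below are invented placeholders:

```python
# Attach a distinctive, descriptive User-Agent (with contact info)
# to every request the bot makes, per the policy linked above.
import urllib.request

# Invented example UA string; substitute your own bot name and contact.
USER_AGENT = "ExampleSourceBot/0.1 (https://example.org/bot; bot-owner@example.org)"

def make_request(url):
    # Return a Request object carrying the bot's User-Agent header.
    return urllib.request.Request(url, headers={"User-Agent": USER_AGENT})

req = make_request("http://en.wikipedia.org/wiki/Foo?action=render")
print(req.get_header("User-agent"))
```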

Note that as soon as the bot does any editing, it really should be logged in,
and, depending on the wiki's rules, have a bot flag, or have some specific info
on its user page.

 I
 know that some thousands of calls are nothing for wiki servers, but... I
 always try to get a good performance, even from the most banal template.

That's always a good idea :)

-- daniel



Re: [Wikitech-l] Core html of a wikisource page

2011-04-06 Thread Alex Brollo
2011/4/6 Daniel Kinzler dan...@brightbyte.de

  I
  know that some thousands of calls are nothing for wiki servers, but... I
  always try to get a good performance, even from the most banal template.


 That's always a good idea :)

 -- daniel


Thanks Daniel.  So, my edits will drop again. I'll put myself into study
mode :-D

Alex


[Wikitech-l] Peng Wan's proposal to figure out the most popular pages

2011-04-06 Thread Peng Wan
This is Peng Wan. I have submitted my application to Wikimedia for GSoC.
My project title is "Figuring out the most popular pages".

Here is the project's short description:
The feature aims to figure out the most popular and favorite pages on
Wikimedia wikis. The most popular pages are calculated from users clicking on
pages: the click event sends the action to the database, and we can then figure
out the most popular pages by querying the destination URLs. As for the
most favorite pages, I want to add a "like" or "+1" tag on every page. If
a user likes the content of the page, s/he just needs to click the "like"
link to increment the like count in the database.
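The counting idea in the description can be sketched with in-memory counters standing in for the proposed database tables; the page titles used here are invented:

```python
# Tally page views per click and "like" clicks, then query for the
# most popular pages, mirroring the proposal's click-to-database flow.
from collections import Counter

views = Counter()
likes = Counter()

def record_view(title):
    views[title] += 1

def record_like(title):
    likes[title] += 1

def most_popular(n):
    # Titles with the highest view counts, most viewed first.
    return [title for title, _ in views.most_common(n)]

for t in ["Foo", "Bar", "Foo", "Baz", "Foo", "Bar"]:
    record_view(t)
record_like("Bar")

print(most_popular(2))  # ['Foo', 'Bar']
```

A real implementation would persist these counts in a database table, as the proposal says, rather than in process memory.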

Here is my proposal link:
http://www.google-melange.com/gsoc/proposal/review/google/gsoc2011/buaajackson/1#

I would appreciate your advice about my proposal.

Thanks
Peng Wan


Re: [Wikitech-l] Peng Wan's proposal to figure out the most popular pages

2011-04-06 Thread David Gerard
On 6 April 2011 16:02, Peng Wan buaajack...@gmail.com wrote:

 This is Peng Wan. I have submitted my application to Wikimedia for GSoC.
 My project title is "Figuring out the most popular pages".


Does it do anything http://stats.grok.se/ doesn't?


- d.



Re: [Wikitech-l] Peng Wan's proposal to figure out the most popular pages

2011-04-06 Thread Andrew G. West
See also: http://dammit.lt/wikistats/

I've parsed every one of these files (at hour granularity; grok.se 
aggregates at day-level, I believe) since Jan. 2010 into a DB structure 
indexed by page title. It takes up about 400GB of space, at the moment.
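For reference, a single line of those hourly pagecount files can be parsed roughly like this; the "project title count bytes" layout is an assumption based on the commonly documented dump format, and the sample line is invented:

```python
# Parse one space-separated line of an hourly pagecounts dump into a
# record, assuming the "project title count bytes" layout.
def parse_line(line):
    project, title, count, nbytes = line.split(" ")
    return {
        "project": project,
        "title": title,
        "count": int(count),
        "bytes": int(nbytes),
    }

rec = parse_line("en Main_Page 42 1234567")  # invented sample line
print(rec["title"], rec["count"])
```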

While a comprehensive measurement study over this data would be 
interesting (long term trends, traffic spikes during cultural events, 
etc.) -- the technical infrastructure is already in place. I doubt a 
measurement study meets GSoC requirements.

Thanks, -AW


On 04/06/2011 11:05 AM, David Gerard wrote:
 On 6 April 2011 16:02, Peng Wan buaajack...@gmail.com wrote:

 This is Peng Wan. I have submitted my application to Wikimedia for GSoC.
 My project title is "Figuring out the most popular pages".


 Does it do anything http://stats.grok.se/ doesn't?


 - d.


-- 
Andrew G. West, Doctoral Student
Dept. of Computer and Information Science
University of Pennsylvania, Philadelphia PA
Website: http://www.cis.upenn.edu/~westand



Re: [Wikitech-l] Narayam User Interface Improvements

2011-04-06 Thread Trevor Parscal
On Apr 5, 2011, at 7:49 PM, Santhosh Thottingal wrote:
 
 Placing a select box inside a dropdown div does not look good to me (page 5
 of the PDF). That adds one extra level of user interaction. Can't we
 simplify this by listing available input methods with radio buttons in the
 dropdown div itself? Something like this:
 
 [icon]  [personal tools]
 +---+
 | Select an Input Tool: |
 | O  [Input Method 1]   |
 | O  [Input Method 2]   |
 | O  [Input Method 3]   |
 | O  [Disable-Ctrl+ M]  |
 | [Help Link]   |
 +---+

Given how few options there are, this seems like a great idea.

- Trevor



[MediaWiki-CodeReview] [MediaWiki r85525]: Revision status changed

2011-04-06 Thread MediaWiki Mail
User Krinkle changed the status of MediaWiki.r85525.

Old Status: new
New Status: ok

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85525#c0



[MediaWiki-CodeReview] [MediaWiki r85531]: New comment added

2011-04-06 Thread MediaWiki Mail
User ^demon posted a comment on MediaWiki.r85531.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85531#c15751

Comment:

I know this is a first import. A couple of quick notes:
* README and COPYING need eol-style
* Shouldn't use Snoopy; should use MW's Http class.
* Could this not possibly be better as a core ExtAuth plugin? (We already have 
one for a local wiki DB other than our own.)



[MediaWiki-CodeReview] [MediaWiki r85531]: New comment added

2011-04-06 Thread MediaWiki Mail
User Jack Phoenix posted a comment on MediaWiki.r85531.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85531#c15752

Comment:

*''README and COPYING need eol-style''
**Fixed in r85532.
*''Shouldn't use Snoopy, should use MW's Http class.''
**I agree, I just don't have the courage to rewrite it, I know I'll end up 
breaking stuff. ;-)
*''Could this not possibly be better as a core ExtAuth plugin (we already have 
one for a local wiki DB other than our own)''
**Interesting idea; maybe...I'm not the original developer of this extension, 
although I've made a few tweaks to it; I just decided to import this to the 
official SVN seeing that the source code was already publicly available on the 
Internet.



[MediaWiki-CodeReview] [MediaWiki r85533]: New comment added

2011-04-06 Thread MediaWiki Mail
User ^demon posted a comment on MediaWiki.r85533.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85533#c15753

Comment:

Welcome! Needs svn:eol-style native. Please remember to 
[[Subversion/auto-props|set your auto-props]].



[MediaWiki-CodeReview] [MediaWiki r84259]: Revision status changed

2011-04-06 Thread MediaWiki Mail
User ^demon changed the status of MediaWiki.r84259.

Old Status: new
New Status: ok

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/84259#c0



[MediaWiki-CodeReview] [MediaWiki r85532]: Revision status changed

2011-04-06 Thread MediaWiki Mail
User Reedy changed the status of MediaWiki.r85532.

Old Status: new
New Status: ok

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85532#c0



[MediaWiki-CodeReview] [MediaWiki r85377]: New comment added

2011-04-06 Thread MediaWiki Mail
User Brion VIBBER posted a comment on MediaWiki.r85377.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85377#c15754

Comment:

marking as needing merge to 1.17; the issue this fixes does occur on live sites.



[MediaWiki-CodeReview] [MediaWiki r85504]: New comment added, and revision status changed

2011-04-06 Thread MediaWiki Mail
User Brion VIBBER changed the status of MediaWiki.r85504.

Old Status: fixme
New Status: resolved

User Brion VIBBER also posted a comment on MediaWiki.r85504.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85504#c15755

Comment:

Woops! Added in r85545.



[MediaWiki-CodeReview] [MediaWiki r85531]: New comment added

2011-04-06 Thread MediaWiki Mail
User Raymond posted a comment on MediaWiki.r85531.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85531#c15756

Comment:

The i18n file is still Wikia related:

 Please <a target="_new" href="http://www.wikia.com/wiki/Special:UserLogin"> ...

$1 needs support for PLURAL

 +  'mwa-wait' => 'Please wait $1 seconds before trying again.',



[MediaWiki-CodeReview] [MediaWiki r85531]: New comment added

2011-04-06 Thread MediaWiki Mail
User Jack Phoenix posted a comment on MediaWiki.r85531.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85531#c15757

Comment:

*''The i18n file is still Wikia related''
**Yes, because we don't have a way of knowing the source wiki's sitename and 
because people commonly use this with Wikia as the source wiki. It's not 
optimal, but I can live with it.

*''$1 needs support for PLURAL''
**That's as simple as pie: just change wfMsg() to wfMsgExt() with the 'parsemag' 
option and update the message accordingly.



[MediaWiki-CodeReview] [MediaWiki r85536]: New comment added, and revision status changed

2011-04-06 Thread MediaWiki Mail
User Nikerabbit changed the status of MediaWiki.r85536.

Old Status: new
New Status: fixme

User Nikerabbit also posted a comment on MediaWiki.r85536.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85536#c15758

Comment:

PHP Notice:  iconv(): Detected an illegal character in input string in 
/www/w/languages/Language.php on line 1884



[MediaWiki-CodeReview] [MediaWiki r85403]: New comment added, and revision status changed

2011-04-06 Thread MediaWiki Mail
User Happy-melon changed the status of MediaWiki.r85403.

Old Status: fixme
New Status: new

User Happy-melon also posted a comment on MediaWiki.r85403.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85403#c15759

Comment:

Fixed in r85553.



[MediaWiki-CodeReview] [MediaWiki r84805]: New comment added, and revision status changed

2011-04-06 Thread MediaWiki Mail
User Brion VIBBER changed the status of MediaWiki.r84805.

Old Status: new
New Status: fixme

User Brion VIBBER also posted a comment on MediaWiki.r84805.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/84805#c15760

Comment:

This broke Special:Specialpages with CentralAuth enabled:

PHP Fatal error:  Class 'UsersPager' not found in 
/var/www/trunk/extensions/CentralAuth/SpecialGlobalUsers.php on line 3



[MediaWiki-CodeReview] [MediaWiki r81558]: New comment added, and revision status changed

2011-04-06 Thread MediaWiki Mail
User Brion VIBBER changed the status of MediaWiki.r81558.

Old Status: ok
New Status: fixme

User Brion VIBBER also posted a comment on MediaWiki.r81558.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/81558#c15761

Comment:

This triggers a fatal error when thumbnail caching is disabled:

  PHP Fatal error:  Cannot pass parameter 4 by reference in 
/var/www/trunk/includes/filerepo/ForeignAPIRepo.php on line 234

That's the null in: return $this->getThumbUrl( $name, $width, $height, null, 
$params );




[MediaWiki-CodeReview] [MediaWiki r85555]: New comment added

2011-04-06 Thread MediaWiki Mail
User Brion VIBBER posted a comment on MediaWiki.r85555.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85555#c15762

Comment:

This should be merged to 1.17 and live deployment; affects files on Commons 
that have had some info suppressed, the usernames are exposed when the page is 
viewed on other Wikimedia wikis. (Offsite wikis using InstantCommons are not 
affected, as the info isn't exposed through the API.)



[MediaWiki-CodeReview] [MediaWiki r85555]: New comment added

2011-04-06 Thread MediaWiki Mail
User Catrope posted a comment on MediaWiki.r85555.

Full URL: 
https://secure.wikimedia.org/wikipedia/mediawiki/wiki/Special:Code/MediaWiki/85555#c15763

Comment:

For live deployment, use the 1.17wmf1 tag (added)



[MediaWiki-CodeReview] [MediaWiki r85531]: New comment added

2011-04-06 Thread MediaWiki Mail
User Reach Out to the Truth posted a comment on MediaWiki.r85531.

Full URL: 
https://secure.wikimedia.org/wikipedia/mediawiki/wiki/Special:Code/MediaWiki/85531#c15764

Comment:

[[Extension:MediaWikiAuth]] should probably make note of this in the 
installation or configuration documentation.



[MediaWiki-CodeReview] [MediaWiki r81558]: New comment added, and revision status changed

2011-04-06 Thread MediaWiki Mail
User Brion VIBBER changed the status of MediaWiki.r81558.

Old Status: fixme
New Status: resolved

User Brion VIBBER also posted a comment on MediaWiki.r81558.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/81558#c15765

Comment:

Fixed in r85556 on trunk.




[MediaWiki-CodeReview] [MediaWiki r85554]: New comment added

2011-04-06 Thread MediaWiki Mail
User Nikerabbit posted a comment on MediaWiki.r85554.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85554#c15766

Comment:

Twice?
 +   'mwe-upwiz-feedback-cancel' => 'Button label',
 +   'mwe-upwiz-feedback-cancel' => 'Button label',



[MediaWiki-CodeReview] [MediaWiki r85555]: Revision status changed

2011-04-06 Thread MediaWiki Mail
User ^demon changed the status of MediaWiki.r85555.

Old Status: new
New Status: ok

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85555#c0



[MediaWiki-CodeReview] [MediaWiki r85556]: Revision status changed

2011-04-06 Thread MediaWiki Mail
User ^demon changed the status of MediaWiki.r85556.

Old Status: new
New Status: ok

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85556#c0



[MediaWiki-CodeReview] [MediaWiki r85553]: New comment added, and revision status changed

2011-04-06 Thread MediaWiki Mail
User Brion VIBBER changed the status of MediaWiki.r85553.

Old Status: new
New Status: fixme

User Brion VIBBER also posted a comment on MediaWiki.r85553.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85553#c15768

Comment:

This doesn't seem to have fixed everything; while the tests now run, 13 tests 
fail that didn't in r85301 before this stuff changed, mostly related to 
LanguageConverter, which may indicate that language objects are still off 
somehow:

  13 new FAILING test(s) :(
  * Prevent conversion with -{}- tags (language variants)  [Introduced 
between 06-Apr-2011 19:06:48, 1.18alpha (r85302) and 06-Apr-2011 19:10:58, 
1.18alpha (r85562)]
  * Prevent conversion of text with -{}- tags (language variants)  
[Introduced between 06-Apr-2011 19:06:48, 1.18alpha (r85302) and 06-Apr-2011 
19:10:58, 1.18alpha (r85562)]
  * Prevent conversion of links with -{}- tags (language variants)  
[Introduced between 06-Apr-2011 19:06:48, 1.18alpha (r85302) and 06-Apr-2011 
19:10:58, 1.18alpha (r85562)]
  * -{}- tags within headlines (within html for parserConvert())  
[Introduced between 06-Apr-2011 19:06:48, 1.18alpha (r85302) and 06-Apr-2011 
19:10:58, 1.18alpha (r85562)]
  * Explicit definition of language variant alternatives  [Introduced 
between 06-Apr-2011 19:06:48, 1.18alpha (r85302) and 06-Apr-2011 19:10:58, 
1.18alpha (r85562)]
  * Explicit session-wise language variant mapping (A flag and - flag)  
[Introduced between 06-Apr-2011 19:06:48, 1.18alpha (r85302) and 06-Apr-2011 
19:10:58, 1.18alpha (r85562)]
  * Explicit session-wise language variant mapping (H flag for hide)  
[Introduced between 06-Apr-2011 19:06:48, 1.18alpha (r85302) and 06-Apr-2011 
19:10:58, 1.18alpha (r85562)]
  * Adding explicit conversion rule for title (T flag)  [Introduced between 
06-Apr-2011 19:06:48, 1.18alpha (r85302) and 06-Apr-2011 19:10:58, 1.18alpha 
(r85562)]
  * Bug 24072: more test on conversion rule for title  [Introduced between 
06-Apr-2011 19:06:48, 1.18alpha (r85302) and 06-Apr-2011 19:10:58, 1.18alpha 
(r85562)]
  * Raw output of variant escape tags (R flag)  [Introduced between 
06-Apr-2011 19:06:48, 1.18alpha (r85302) and 06-Apr-2011 19:10:58, 1.18alpha 
(r85562)]
  * Nested using of manual convert syntax  [Introduced between 06-Apr-2011 
19:06:48, 1.18alpha (r85302) and 06-Apr-2011 19:10:58, 1.18alpha (r85562)]
  * Do not convert roman numbers to language variants  [Introduced between 
06-Apr-2011 19:06:48, 1.18alpha (r85302) and 06-Apr-2011 19:10:58, 1.18alpha 
(r85562)]
  * Edit comment with bare anchor link (local, as on diff)  [Introduced 
between 06-Apr-2011 19:06:48, 1.18alpha (r85302) and 06-Apr-2011 19:10:58, 
1.18alpha (r85562)]

Passed 641 of 654 tests (98.01%)... 13 tests failed!

Also, it seems odd to set $wgContLang to $langObj, then set it to $context->lang 
immediately after. If that's because something in RequestContext()'s 
constructor requires $wgContLang to be set, then you should mention this in a 
comment so someone else doesn't come along and remove it, thinking it's 
redundant.



Re: [Wikitech-l] Undefined property: SkinVector::$mTitle

2011-04-06 Thread jidanni
All this now needs to be revised too.

function JidanniDontInviteEdit($article,$outputDone){if($article->getID()===0){
switch($article->mTitle->getNamespace()){case NS_CATEGORY:case
NS_CATEGORY_TALK:$outputDone=true;}}return true;}
$wgHooks['ArticleViewHeader'][]='JidanniDontInviteEdit';//Bug 17630
function JidanniLessRedNavigation($sktemplate,$links){
  //var_dump('BEFORE',$links);
  foreach($links['namespaces'] as $ns=>$value){
    if($value['context']=='talk' && $value['class']=='new' &&
      !$sktemplate->mTitle->quickUserCan('createtalk')){
      unset($links['namespaces'][$ns]);}
    if($ns=='category' && $value['class']=='selected new'){
      $value['class']='selected';
      if(isset($links['actions']['watch'])){unset($links['actions']['watch']);}}}
  //var_dump('AFTER',$links);
  return true;}
$wgHooks['SkinTemplateNavigation'][]='JidanniLessRedNavigation';

And who knows what will happen with some other skin.
Now imagine what will happen ten years later, when I am ten years older
and still have to maintain this with even 'less cells to spare upstairs'
(in the brain).

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Open call for Parser bugs

2011-04-06 Thread Diederik van Liere
I love this idea!
Diederik

On Wed, Apr 6, 2011 at 5:21 PM, Mark A. Hershberger
mhershber...@wikimedia.org wrote:

 Starting with this coming Monday's bug triage, I want to try and make
 sure the community's voice is heard.  In order to do that, I've created
 the “triage” keyword in Bugzilla.  Every week, I'll announce a theme and
 use this keyword to keep track of the bugs that will be handled in the
 meeting.

 As we discuss the bug, it will be modified, probably assigned to a
 developer, and the “triage” keyword removed.  Some people may see this
 as bug-spam, but I'd like to keep the email notifications on so that
 people who have expressed an interest will know that we're giving the
 bugs some love.

 This week, I'm going to focus on Parser-related bugs.  There are
 currently 10 bugs with the “triage” keyword applied.  A bug
 triage meeting needs about 30 bugs, so I have room for about 20 more
 right now.  I'll be adding to the list before Monday, but this is your
 chance to get WMF's developers talking about YOUR favorite parser bug by
 adding the “triage” keyword.

 I will reserve the right to remove the “triage” keyword — especially if
 the list becomes unwieldy, or if the bug has nothing to do with parsing
 — but I wanted to start to open up the triage process a bit more and
 begin to provide a way for the community to participate in these
 meetings.

 Mark.

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l



-- 
Check out my about.me profile! http://about.me/diederik

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Core html of a wikisource page

2011-04-06 Thread Aryeh Gregor
On Wed, Apr 6, 2011 at 3:15 AM, Alex Brollo alex.bro...@gmail.com wrote:
 I saved the HTML source of a typical Page: page from it.source, the
 resulting txt file having ~28 kB; then I saved the core HTML only, i.e.
 the content of <div class="pagetext">, and that file has 2.1 kB; so
 there's a more than tenfold ratio between container and real content.

 Is there a trick to download the core HTML only? And, most important: could
 this save a little bit of server load/bandwidth?

It could save a huge amount of bandwidth.  This could be a big deal
for mobile devices, in particular, but it could also reduce TCP
round-trips and make things noticeably snappier for everyone.  I
recently read about an interesting technique on Steve Souders' blog,
which he got by analyzing Google and Bing:

http://www.stevesouders.com/blog/2011/03/28/storager-case-study-bing-google/

The gist is that when the page loads, assuming script is enabled, you
store static pieces of the page in localStorage (available on the
large majority of browsers, including IE8) and set a cookie.  Then if
the cookie is present on a subsequent request, the server doesn't send
the repetitive static parts, and instead sends a script that inserts
the desired contents synchronously from localStorage.  I guess this
breaks horribly if you request a page with script enabled and then
request another page with script disabled, but otherwise the basic
idea seems powerful and reliable, if a bit tricky to get right.
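As a concrete sketch of that technique (all names here are illustrative, not
MediaWiki or Bing APIs): on a first script-enabled load the static chrome is
stashed in localStorage and a flag cookie is set; on later requests the server,
seeing the cookie, omits the static markup and ships a tiny script that
re-inserts it synchronously from storage. The storage and cookie hooks are
injected so the idea is visible without a browser:

```javascript
// Hedged sketch of the localStorage-priming idea described above.
// "storage" stands in for window.localStorage and "setCookie" for a
// document.cookie assignment; both are injected parameters here.
// The key carries a version suffix so stale cached chrome can be invalidated.
function primeCache(storage, setCookie, staticHtml) {
  storage.setItem('chrome.v1', staticHtml); // stash the static page chrome
  setCookie('hasChrome=1');                 // tell the server we have a copy
}

// On a later, slimmed-down page the server emits a script that calls this
// synchronously in place of the omitted static markup.
function restoreChrome(storage) {
  // Fall back to empty if the user cleared storage but the cookie survived.
  return storage.getItem('chrome.v1') || '';
}
```

The failure mode mentioned above (script later disabled while the cookie
persists) is why a real implementation would also need a no-script fallback
or a short cookie lifetime.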

Of course, it would at least double the amount of space pages take in
Squid cache, so for us it might not be a great idea.  Still
interesting, and worth keeping in mind.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[MediaWiki-CodeReview] [MediaWiki r85573]: New comment added, and revision status changed

2011-04-06 Thread MediaWiki Mail
User Platonides changed the status of MediaWiki.r85573.

Old Status: new
New Status: ok

User Platonides also posted a comment on MediaWiki.r85573.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85573#c15769

Comment:

Nice.

The usual approach for the static variable is to add another function to clear 
the cache, but if accessing the regex is fast enough, it is fine. I don't see why 
there would be a slowdown there in the current code. It should hit 
LocalizationCache::loadedItems.

The added testcases pass. Marking ok.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r85553]: New comment added, and revision status changed

2011-04-06 Thread MediaWiki Mail
User Platonides changed the status of MediaWiki.r85553.

Old Status: fixme
New Status: resolved

User Platonides also posted a comment on MediaWiki.r85553.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85553#c15770

Comment:

Fixed in r85581. The $GLOBALS['wgContLang'] = $context->lang; was really 
unneeded, and the cause of the bug (when there's a code override).

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r85553]: New comment added

2011-04-06 Thread MediaWiki Mail
User Brion VIBBER posted a comment on MediaWiki.r85553.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85553#c15771

Comment:

Woot, thanks!

The 'edit comment with bare anchor link' still fails, but that looks unrelated. 
Others are all now A-OK.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r85536]: New comment added

2011-04-06 Thread MediaWiki Mail
User Midom posted a comment on MediaWiki.r85536.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85536#c15772

Comment:

Hm, do note that MediaWiki has an iconv() wrapper that would break with this; 
the change in wmf-deployment was based on the fact that we don't use the wrapper.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r85302]: New comment added, and revision status changed

2011-04-06 Thread MediaWiki Mail
User Platonides changed the status of MediaWiki.r85302.

Old Status: new
New Status: fixme

User Platonides also posted a comment on MediaWiki.r85302.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85302#c15773

Comment:

Argument 1 passed to MediaWiki::__construct() must be an instance of 
RequestContext, none given, called in 
phase3/tests/phpunit/includes/MediaWikiTest.php on line 13 

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r85302]: New comment added, and revision status changed

2011-04-06 Thread MediaWiki Mail
User Happy-melon changed the status of MediaWiki.r85302.

Old Status: fixme
New Status: new

User Happy-melon also posted a comment on MediaWiki.r85302.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85302#c15776

Comment:

I am rapidly losing faith in my IDE's search function... :S. Fixed in 
r85584.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r85553]: New comment added

2011-04-06 Thread MediaWiki Mail
User Platonides posted a comment on MediaWiki.r85553.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85553#c15777

Comment:

See r85582. :)

I had inadvertently broken it in r85481.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r85582]: Revision status changed

2011-04-06 Thread MediaWiki Mail
User Brion VIBBER changed the status of MediaWiki.r85582.

Old Status: new
New Status: ok

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85582#c0

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r85508]: New comment added

2011-04-06 Thread MediaWiki Mail
User NeilK posted a comment on MediaWiki.r85508.

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85508#c15778

Comment:

I agree with MaxSem here. See the follow-up, r85589.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r85523]: Revision status changed

2011-04-06 Thread MediaWiki Mail
User Reedy changed the status of MediaWiki.r85523.

Old Status: new
New Status: reverted

Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/85523#c0

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[Wikitech-l] New SVN committer

2011-04-06 Thread Priyanka Dhanda
Extension and Core:
* Paul Copperman (pcopp)

Paul has contributed several patches to core MediaWiki and FlaggedRevs.

-- 
Priyanka Dhanda
Code Maintenance Engineer
Wikimedia Foundation
http://wikimediafoundation.org
San Francisco, CA


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Peng Wan's proposal to figure out the most popular pages

2011-04-06 Thread Andrew G. West
Not sure I want to throw the API open to the public (the grok.se folks, 
and others, have a fine service for casual experimentation).

However, I am willing to share the data with interested researchers who 
need to do some serious crunching (I have a Java API and could 
distribute database credentials on a per-case basis).

I'll note that I only parse English Wikipedia at this time. I've found 
it useful in my anti-vandalism research (i.e., given that an edit survived 
between time [w] and [x] on article [y], we estimate it received [z] 
views). Thanks, -AW
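The [w]..[x] estimate described above boils down to summing the hourly counts
that fall inside the window in which the edit was live. A rough sketch, with an
invented record shape (this is not the actual database schema or Java API):

```javascript
// Estimate how many views an edit received while it was the live revision,
// given hourly page-view counts for its article. The {hour, views} record
// shape is an assumption for illustration only.
function viewsWhileLive(hourlyCounts, liveFrom, liveUntil) {
  return hourlyCounts
    .filter(h => h.hour >= liveFrom && h.hour < liveUntil) // hours in [w, x)
    .reduce((sum, h) => sum + h.views, 0);                 // total views z
}
```

Hour-granularity data makes this a direct sum; day-level aggregation (as on
grok.se) would force interpolating at the window edges instead.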


On 04/06/2011 08:44 PM, MZMcBride wrote:
 Andrew G. West wrote:
 I've parsed every one of these files (at hour granularity; grok.se
 aggregates at day-level, I believe) since Jan. 2010 into a DB structure
 indexed by page title. It takes up about 400GB of space, at the moment.

 Is your database available to the public? The Toolserver folks have been
 talking about getting the page view stats into usable form for quite some
 time, but nothing's happened yet. If you have an API or something similar,
 that would be fantastic. (stats.grok.se has a rudimentary API that I don't
 imagine many people are aware of.)

 MZMcBride



 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

-- 
Andrew G. West, Doctoral Student
Dept. of Computer and Information Science
University of Pennsylvania, Philadelphia PA
Website: http://www.cis.upenn.edu/~westand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l