Re: [Wikitech-l] #switch limits

2012-09-21 Thread Strainu
I'm just curious: would Lua improve memory usage in this use case?

Strainu


 Original Message 
From: Tim Starling tstarl...@wikimedia.org
Sent: Fri Sep 21 07:07:34 GMT+03:00 2012
To: wikitech-l@lists.wikimedia.org
Subject: [Wikitech-l] #switch limits

Over the last week, we have noticed very heavy apache memory usage on
the main Wikimedia cluster. In some cases, high memory usage resulted
in heavy swapping and site-wide performance issues.

After some analysis, we've identified the main cause of this high
memory usage to be geographical data (données) templates on the
French Wikipedia, and to a lesser extent, the same data templates
copied to other wikis for use on articles about places in Europe.

Here is an example of a problematic template:

https://fr.wikipedia.org/w/index.php?title=Mod%C3%A8le:Donn%C3%A9es_PyrF1-2009&action=edit

That template alone uses 47MB for 37000 #switch cases, and one article
used about 15 similarly sized templates.

The simplest solution to this problem is for the few Wikipedians
involved to stop doing what they are doing, and to remove the template
invocations which have already been introduced. Antoine Musso has
raised the issue on the French Wikipedia's Bistro and some of the
worst cases have already been fixed.

To protect site stability, I've introduced a new preprocessor
complexity limit called the preprocessor generated node count, which
is incremented by about 6 for each #switch case. When the limit is
exceeded, an exception is thrown, preventing the page from being saved
or viewed.

The limit is currently 4 million (~667,000 #switch cases), and it will
soon be reduced to 1.5 million (~250,000 #switch cases). That's a
compromise which allows most of the existing geographical pages to
keep working, but still allows a memory usage of about 230MB.

At some point, we would like to patch PHP upstream to cause memory for
DOM XML trees to be allocated from the PHP request pool, instead of
with malloc(). But to deploy that, we would need to reduce the limit
to the point where the template DOM cache can easily fit in the PHP
memory limit of 128MB.

In the short term, we will be working with the template editors to
ensure that all articles can be viewed with a limit of 1.5 million.
That's not a very viable solution in the long term, so I'd also like
to introduce save-time warnings and tracking categories for pages
which use more than, say, 50% of the limit, to encourage authors to
fix articles without being directly prompted by WMF staff members.

At some point in the future, you may be able to put this kind of
geographical data in Wikidata. Please, template authors, wait
patiently, don't implement your own version of Wikidata using wikitext
templates.

-- Tim Starling





Re: [Wikitech-l] #switch limits

2012-09-21 Thread Tim Starling
On 21/09/12 16:06, Strainu wrote:
 I'm just curious: would Lua improve memory usage in this use case?

Yes, it's an interesting question.

I tried converting that template with 37000 switch cases to a Lua
array. Lua used 6.5MB for the chunk and then another 2.4MB to execute
it, so 8.9MB in total compared to 47MB for wikitext. So it's an
improvement, but we limit Lua memory to 50MB and you would hit that
limit long before you loaded 15 such arrays.
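
For illustration, a rough sketch of the shape of that conversion (module
name, invocation and values are made up; this is not the actual code from
the test):

    -- Module:Donnees_PyrF1-2009 (hypothetical name)
    -- The whole template becomes one big Lua table plus a lookup function.
    local data = {
        ["09001"] = 312,   -- commune code => census value (made-up numbers)
        ["09002"] = 581,
        -- ... roughly 37000 more entries ...
    }

    local p = {}

    -- Called from wikitext as {{#invoke:Donnees_PyrF1-2009|value|09001}}.
    function p.value(frame)
        return data[frame.args[1]] or ''
    end

    return p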

It's still an O(N) solution. What we really want is to avoid loading
the entire French census into memory every time someone wants to read
an article about France.

-- Tim Starling




Re: [Wikitech-l] #switch limits

2012-09-21 Thread Strainu
2012/9/21 Tim Starling tstarl...@wikimedia.org:
 On 21/09/12 16:06, Strainu wrote:
 I'm just curious: would Lua improve memory usage in this use case?

 Yes, it's an interesting question.

 I tried converting that template with 37000 switch cases to a Lua
 array. Lua used 6.5MB for the chunk and then another 2.4MB to execute
 it, so 8.9MB in total compared to 47MB for wikitext. So it's an
 improvement, but we limit Lua memory to 50MB and you would hit that
 limit long before you loaded 15 such arrays.

I'm not sure how the Lua code would look, but perhaps you could tweak
the loading of Lua templates so that you don't load the same code more
than once? I'm totally oblivious to how MediaWiki (or is it PHP?) is
linked to Lua right now, but I'm thinking along the lines of a C
program which loads a library once and can then use it many times over.

With such an approach, you would have 6.5 + 15*2.4 = 42.5 MB of memory
(assuming memory cannot be reused between calls).
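
A sketch of what I have in mind, in plain Lua (names made up; I don't know
how the Lua integration actually wires this up):

    -- Hypothetical sketch: execute each data chunk only once per request and
    -- hand the resulting table to every caller that asks for it.
    local cache = {}

    local function loadDataOnce(name, loader)
        if cache[name] == nil then
            cache[name] = loader()   -- pay the load/execute cost only once
        end
        return cache[name]           -- later callers share the same table
    end

    -- Every infobox on the page would then share one copy of the census table.
    local census = loadDataOnce('PyrF1-2009', function()
        return { ['09001'] = 312, ['09002'] = 581 --[[ ...the rest... ]] }
    end)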


 It's still an O(N) solution. What we really want is to avoid loading
 the entire French census into memory every time someone wants to read
 an article about France.

Well, you said something about Wikidata. But even if the client Wiki
would not need to load the full census, can it be avoided on Wikidata?

Strainu



Re: [Wikitech-l] MediaWiki 1.20 release candidate

2012-09-21 Thread Niklas Laxström
On 21 September 2012 05:37, Mark A. Hershberger m...@everybody.org wrote:
 Last week, I announced the MediaWiki 1.20 release candidate that I
 created on wikitech-l
 (http://lists.wikimedia.org/pipermail/wikitech-l/2012-September/063226.html
 shortened: http://hexm.de/lo).

Earlier you wrote that it is based on the 1.20wmf11 branch. I didn't check
the tarball, but there were pretty severe i18n issues with plurals
around that time. Do you know whether fixes for those issues are
already included or not? The most important one is
https://gerrit.wikimedia.org/r/#/c/23900/
  -Niklas

-- 
Niklas Laxström


Re: [Wikitech-l] #switch limits

2012-09-21 Thread Denny Vrandečić
2012/9/21 Strainu strain...@gmail.com:
 Well, you said something about Wikidata. But even if the client Wiki
 would not need to load the full census, can it be avoided on Wikidata?

Talking about the template that Tim listed:
https://fr.wikipedia.org/w/index.php?title=Mod%C3%A8le:Donn%C3%A9es_PyrF1-2009&action=edit

I was trying to understand the template and its usage. As far as I can
tell it maps a ZIP (or some other identifier) of a commune to a value
(maybe a percentage or population, sorry, the documentation did not
exist and my French is rusty).

So basically it provides all values for a given property. Put
differently, that wiki page implements a database table with the
columns key and value, and holds the whole table. (I think when Ward
Cunningham described a wiki as "the simplest online database that
could possibly work", this is *not* what he envisioned.)

In Wikidata we are not storing the data per property, but per item.
Put differently, every row in that template would become one statement
on the item identified by its key.

So Wikidata would not load the whole census data for every article,
but only the data for the items that are actually requested.
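
Very roughly, and only to illustrate the difference in shape (identifiers
and values made up, sketched here as Lua tables):

    -- What the data template implements today: one property, all communes.
    local population_2009 = {
        ['09001'] = 312,
        ['09002'] = 581,
        -- ... every commune in the census ...
    }

    -- What Wikidata stores instead: one item per commune with its own statements.
    local item_Q123456 = {                -- hypothetical item for one commune
        label = 'Exampleville',
        statements = { population_2009 = 312 },
    }
    -- An article about one commune fetches only its own item, not the table
    -- for the whole census.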

On the other hand, we would indeed load the whole data for one item on
the repository (not the Wikipedias), which might lead to problems with
very big items at some point. We will run tests to see how this
behaves once these features have been developed, and then see if we
need to do something like partitioning by property groups (similar to
what Cassandra does).

I hope that helps,
Denny



Re: [Wikitech-l] #switch limits

2012-09-21 Thread Max Semenik
On 21.09.2012, 11:47 Strainu wrote:

 2012/9/21 Tim Starling tstarl...@wikimedia.org:
 On 21/09/12 16:06, Strainu wrote:
 I'm just curious: would Lua improve memory usage in this use case?

 Yes, it's an interesting question.

 I tried converting that template with 37000 switch cases to a Lua
 array. Lua used 6.5MB for the chunk and then another 2.4MB to execute
 it, so 8.9MB in total compared to 47MB for wikitext. So it's an
 improvement, but we limit Lua memory to 50MB and you would hit that
 limit long before you loaded 15 such arrays.

 I'm not sure how the Lua code would look, but perhaps you could tweak
 the loading of Lua templates so that you don't load the same code more
 than once? I'm totally oblivious to how MediaWiki (or is it PHP?) is
 linked to Lua right now, but I'm thinking along the lines of a C
 program which loads a library once and can then use it many times over.

And what if a page is related to France, Germany and other European
countries at once? Loading this information just once isn't helpful -
it needs to load just what is needed; otherwise smart Wikipedians will
keep inventing creative ways to push the boundaries. :)


 With such an approach, you would have 6.5 + 15*2.4 = 42.5 MB of memory
 (assuming memory cannot be reused between calls).


 It's still an O(N) solution. What we really want is to avoid loading
 the entire French census into memory every time someone wants to read
 an article about France.

 Well, you said something about Wikidata. But even if the client Wiki
 would not need to load the full census, can it be avoided on Wikidata?

(Mumbles something about databases that don't store all information in
one row and don't always read all the rows at once)


-- 
Best regards,
  Max Semenik ([[User:MaxSem]])




Re: [Wikitech-l] #switch limits

2012-09-21 Thread Denny Vrandečić
I took another look at the output that is created with this data. I am
delighted and astonished by the capability and creativity of the
Wikipedia community in solving such tasks with MediaWiki template
syntax, and at the same time horrified that such a solution was
necessary.

Adding to my own explanation of how Wikidata would help here: we plan
to implement some form of query answering capabilities in phase III,
which would actually not work on the full items, as described in my
previous mail, but on some smarter derived representation of the
data. So specific queries -- the possible expressivity is not yet
defined -- would be answered much more efficiently than by evaluating
them on the fly over all relevant items. (That is covered by the
technical proposal as item P3.2 in
http://meta.wikimedia.org/wiki/Wikidata/Technical_proposal#Technical_requirements_and_rationales_3).

Cheers,
Denny

2012/9/21 Denny Vrandečić denny.vrande...@wikimedia.de:
 2012/9/21 Strainu strain...@gmail.com:
 Well, you said something about Wikidata. But even if the client Wiki
 would not need to load the full census, can it be avoided on Wikidata?

 Talking about the template that Tim listed:
 https://fr.wikipedia.org/w/index.php?title=Mod%C3%A8le:Donn%C3%A9es_PyrF1-2009&action=edit

 I was trying to understand the template and its usage. As far as I can
 tell it maps a ZIP (or some other identifier) of a commune to a value
 (maybe a percentage or population, sorry, the documentation did not
 exist and my French is rusty).

 So basically it provides all values for a given property. Put
 differently, that wiki page implements a database table with the
 columns key and value, and holds the whole table. (I think when Ward
 Cunningham described a wiki as "the simplest online database that
 could possibly work", this is *not* what he envisioned.)

 In Wikidata we are not storing the data per property, but per item.
 Put differently, every row in that template would become one statement
 on the item identified by its key.

 So Wikidata would not load the whole census data for every article,
 but only the data for the items that are actually requested.

 On the other hand, we would indeed load the whole data for one item on
 the repository (not the Wikipedias), which might lead to problems with
 very big items at some point. We will run tests to see how this
 behaves once these features have been developed, and then see if we
 need to do something like partitioning by property groups (similar to
 what Cassandra does).

 I hope that helps,
 Denny



-- 
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 B. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.


Re: [Wikitech-l] #switch limits

2012-09-21 Thread Alex Brollo
I too sometimes use large switches (a few hundred cases) and I'm far from
happy about it. For larger switches I use nested switches, but I find it
very difficult to compare the performance of nested switches (e.g. a
1000-element switch can be nested as three levels of 10-element switches)
against a single flat switch. I imagine that performance is some function
of the number of switch levels and the number of switch elements, but I
presume it would be difficult to calculate; could someone explore the
matter with tests?

Another way would be to implement a .split() function to transform a
string into a list, at least; or, much better, JSON parsing of a JSON
string, to get lists and dictionaries from strings saved into pages. I
would guess a dramatic improvement in performance, but I'm far from sure
about it.
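
For what it's worth, a split along those lines is only a few lines of plain
Lua; whether the Lua integration will expose something like it, I don't know:

    -- Minimal sketch (plain Lua 5.1): split a delimiter-separated string into
    -- a list. Assumes sep is a single character with no pattern magic.
    local function split(s, sep)
        sep = sep or ','
        local result = {}
        for field in string.gmatch(s, '([^' .. sep .. ']+)') do
            result[#result + 1] = field
        end
        return result
    end

    -- Example: turn a row stored in a page into a list of values.
    local row = split('09001,312,1.4', ',')   -- { '09001', '312', '1.4' }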

Alex brollo


Re: [Wikitech-l] Wikidata blockers

2012-09-21 Thread Denny Vrandečić
Daniel,

sorry for the tone of my previous answer. Indeed, I mixed up the Sites
management RFC, which I was discussing, and your proposal to change the
Sitelinks table. Although they are related, the former does not depend
on the latter.

While I see that changing both in one go might have advantages, in
order to get the sites management reviewed, merged and implemented in
a timely manner I would suggest that we keep them separate.

The Sitelinks discussion you started has indeed not received answers,
as far as I can tell. I suspect that this might be because it is
currently a sub-part of the sites management topic. I would suggest
starting it as its own RFC; what do you think?

Cheers,
Denny


2012/9/21 Daniel Friesen dan...@nadir-seen-fire.com:
 On Thu, 20 Sep 2012 05:54:02 -0700, Denny Vrandečić
 denny.vrande...@wikimedia.de wrote:

 2012/9/20 Daniel Friesen dan...@nadir-seen-fire.com:

 I already started the discussion ages ago. No-one replied.


 Daniel,

 can you please point to the discussion that you started where no one
 replied? As far as I can tell, I can find discussions that you started
 on-wiki, on this mailing list, and I see comments by you on Gerrit. I
 found these discussions enlightening and I think they improved the
 design and the code - but in all these discussion threads there have
 been replies.

 If there are discussions you have started where no one replied, can
 you please provide links to them? I cannot find them.

 Denny


 https://www.mediawiki.org/wiki/Thread:Talk:Requests_for_comment/New_sites_system/Sitelinks


 --
 ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]





-- 
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 B. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.


Re: [Wikitech-l] Wikidata blockers

2012-09-21 Thread Daniel Kinzler
On 21.09.2012 14:07, Denny Vrandečić wrote:
 Daniel,
 
 sorry for the tone of my previous answer. Indeed, I mixed up the Sites
 management RFC, which I was discussing, and your proposal to change the
 Sitelinks table. Although they are related, the former does not depend
 on the latter.

 While I see that changing both in one go might have advantages, in
 order to get the sites management reviewed, merged and implemented in
 a timely manner I would suggest that we keep them separate.

 The Sitelinks discussion you started has indeed not received answers,
 as far as I can tell. I suspect that this might be because it is
 currently a sub-part of the sites management topic. I would suggest
 starting it as its own RFC; what do you think?

I fully agree, especially since the new sites stuff does not immediately replace
the old interwiki table. It will at some point, I hope, but the current proposal
does not cover this.

-- daniel



Re: [Wikitech-l] Wikidata blockers

2012-09-21 Thread Daniel Friesen
On Fri, 21 Sep 2012 05:07:31 -0700, Denny Vrandečić  
denny.vrande...@wikimedia.de wrote:

Daniel,

sorry for the tone of my previous answer. Indeed, I mixed up the Sites
management RFC, which I was discussing, and your proposal to change the
Sitelinks table. Although they are related, the former does not depend
on the latter.

While I see that changing both in one go might have advantages, in
order to get the sites management reviewed, merged and implemented in
a timely manner I would suggest that we keep them separate.

The Sitelinks discussion you started has indeed not received answers,
as far as I can tell. I suspect that this might be because it is
currently a sub-part of the sites management topic. I would suggest
starting it as its own RFC; what do you think?
It may not be part of the portion you're implementing right now, but it is
part of the RFC. The RFC's goal is the complete replacement of interwiki
links with a sites system and interfaces/APIs to interact with it.



Cheers,
Denny


2012/9/21 Daniel Friesen dan...@nadir-seen-fire.com:

On Thu, 20 Sep 2012 05:54:02 -0700, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:


2012/9/20 Daniel Friesen dan...@nadir-seen-fire.com:


I already started the discussion ages ago. No-one replied.



Daniel,

can you please point to the discussion that you started where no one
replied? As far as I can tell, I can find discussions that you started
on-wiki, on this mailing list, and I see comments by you on Gerrit. I
found these discussions enlightening and I think they improved the
design and the code - but in all these discussion threads there have
been replies.

If there are discussions you have started where no one replied, can
you please provide links to them? I cannot find them.

Denny



https://www.mediawiki.org/wiki/Thread:Talk:Requests_for_comment/New_sites_system/Sitelinks


--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]









--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]



Re: [Wikitech-l] MediaWiki 1.20 release candidate

2012-09-21 Thread Mark A. Hershberger
On 09/21/2012 03:57 AM, Niklas Laxström wrote:
 Earlier you wrote that it is based on the 1.20wmf11 branch. I didn't check
 the tarball, but there were pretty severe i18n issues with plurals
 around that time. Do you know whether fixes for those issues are
 already included or not? The most important one is
 https://gerrit.wikimedia.org/r/#/c/23900/

That commit is not included.  I can merge it in or make a second RC with
1.20wmf12.

What do you think is the better way to go?

-- 
http://hexmode.com/

Human evil is not a problem.  It is a mystery.  It cannot be solved.
  -- When Atheism Becomes a Religion, Chris Hedges


Re: [Wikitech-l] #switch limits

2012-09-21 Thread George Herbert
Alternately: if ever there was a case for automatically creating a whole
hierarchy of new, separate templates for each article, or even just directly
editing the articles and putting the data in...

Templates would make finding and updating the data later somewhat easier, I
think. Just have one per location code.


George William Herbert

On Sep 21, 2012, at 4:37 AM, Alex Brollo alex.bro...@gmail.com wrote:

 I too sometimes use large switches (a few hundred cases) and I'm far from
 happy about it. For larger switches I use nested switches, but I find it
 very difficult to compare the performance of nested switches (e.g. a
 1000-element switch can be nested as three levels of 10-element switches)
 against a single flat switch. I imagine that performance is some function
 of the number of switch levels and the number of switch elements, but I
 presume it would be difficult to calculate; could someone explore the
 matter with tests?

 Another way would be to implement a .split() function to transform a
 string into a list, at least; or, much better, JSON parsing of a JSON
 string, to get lists and dictionaries from strings saved into pages. I
 would guess a dramatic improvement in performance, but I'm far from sure
 about it.
 
 Alex brollo



Re: [Wikitech-l] HTML5 and non valid attributes/elements of previous versions (bug 40329)

2012-09-21 Thread Gabriel Wicke
On 09/20/2012 07:40 PM, MZMcBride wrote:
 Scanning dumps (or really dealing with them in any form) is pretty awful.
 There's been some brainstorming in the past for how to set up a system where
 users (or operators) could run arbitrary regular expressions on all of the
 current wikitext regularly, but such a setup requires _a lot_ of everything
 involved (disk space, RAM, bandwidth, processing power, etc.). Maybe one day
 Labs will have something like this.

We have a dump grepper tool in the Parsoid codebase (see
js/tests/dumpGrepper.js) that takes about 25 minutes to grep an XML dump
of the English Wikipedia. The memory involved is minimal and constant;
the thing is mostly CPU-bound.

It should not be hard to hook this up to a web service. Our parser web
service in js/api could serve as a template for that.

Gabriel



Re: [Wikitech-l] Notification bubble system

2012-09-21 Thread Rob Moen

On Sep 20, 2012, at 3:48 PM, Krinkle wrote:

 If they happened as a direct
 consequence of a user action, maybe it should appear inside the interface 
 where
 it was performed?

Agreed; interaction-related notifications should be localized in the interface
where the action is performed.
This increases visibility and implies a connection to the user action.


[Wikitech-l] Extensions still in SVN

2012-09-21 Thread Chad
Hi,

I'm trying to figure out what needs to happen with the remaining extensions
in SVN that have not yet moved to Git (there are 372 of them). I've taken the
time to make up a list of extensions and put them on the wiki, but I need
some help!

Here's the page:
http://www.mediawiki.org/wiki/Git/Conversion/Extensions_still_in_svn

Mainly what I'm looking for is anyone who knows the status of any of these
372 extensions to take a few minutes to fill in ones they know. If it's
abandoned or obsolete, mark it as such so I can ignore it. If you know an
extension's still used, but maybe doesn't have an active maintainer, let's get
it in Git. Some extensions might still not be ready to move yet, but that's
something I'd like to know too.

Thanks for any help you can give. I'll be looking through the list as well,
but I figured crowd-sourcing the task might help us get it done faster.

Have a great Friday everyone,

-Chad



Re: [Wikitech-l] Extensions still in SVN

2012-09-21 Thread Thomas Gries
Am 21.09.2012 20:03, schrieb Chad:
 Hi,

 I'm trying to figure out what needs to happen with the remaining extensions
 in SVN that have not yet moved to Git (there's 372 of them). I've taken the
 time to make up a list of extensions and put them on the wiki, but I need
 some help!

 Here's the page:
 http://www.mediawiki.org/wiki/Git/Conversion/Extensions_still_in_svn

 Mainly what I'm looking for is anyone who knows the status of any of these
 372 extensions to take a few minutes to fill in ones they know. If it's
 abandoned or obsolete, mark it as such so I can ignore it. If you know an
 extension's still used, but maybe doesn't have an active maintainer, let's get
 it in Git. Some extensions might still not be ready to move yet, but that's
 something I'd like to know too.

 Thanks for any help you can give. I'll be looking through the list as well,
 but I figured crowd-sourcing the task might help us get it done faster.

 Have a great Friday everyone,

 -Chad


I moved these

  * AJAXPoll
  * EtherpadLite
  * OpenID
  * Suhosin
  * WikiArticleFeeds

to "Leave in SVN: Not ready to migrate".
If you want to move them to Git/Gerrit, OK, but then you need to find someone
else to maintain them. I hate Gerrit.


Re: [Wikitech-l] Extensions still in SVN

2012-09-21 Thread Daniel Friesen

On Fri, 21 Sep 2012 11:03:15 -0700, Chad innocentkil...@gmail.com wrote:


Hi,

I'm trying to figure out what needs to happen with the remaining extensions
in SVN that have not yet moved to Git (there are 372 of them). I've taken the
time to make up a list of extensions and put them on the wiki, but I need
some help!

Here's the page:
http://www.mediawiki.org/wiki/Git/Conversion/Extensions_still_in_svn

Mainly what I'm looking for is anyone who knows the status of any of these
372 extensions to take a few minutes to fill in ones they know. If it's
abandoned or obsolete, mark it as such so I can ignore it. If you know an
extension's still used, but maybe doesn't have an active maintainer, let's get
it in Git. Some extensions might still not be ready to move yet, but that's
something I'd like to know too.

Thanks for any help you can give. I'll be looking through the list as well,
but I figured crowd-sourcing the task might help us get it done faster.

Have a great Friday everyone,

-Chad


Where do we put all the extensions I made that people may be using, but
which I haven't had a reason to modify in ages?


--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]




Re: [Wikitech-l] Extensions still in SVN

2012-09-21 Thread Siebrand Mazeland (WMF)
On Fri, Sep 21, 2012 at 11:26 AM, Daniel Friesen
dan...@nadir-seen-fire.com wrote:
 Where do we put all the extensions I made that people may be using, but
 which I haven't had a reason to modify in ages?

In Git.

-- 
Siebrand Mazeland
Product Manager Localisation
Wikimedia Foundation

M: +31 6 50 69 1239
Skype: siebrand

Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate



Re: [Wikitech-l] #switch limits

2012-09-21 Thread Alex Brollo
Some atomic, page-specific data set is needed, and it's perfectly logical
and predictable that creative users will try any trick to force wikicode
and template code into producing such a result.

I deeply appreciate and am enthusiastic about the Wikidata project, but I
wonder about this issue: is Wikidata a good data container for data sets
needed by a single, specific page of a single project?

For example, consider citations of the Bible: they have a widely used
structure, something like "Genesis 4:5" to point to verse 5 of chapter 4 of
Genesis. A good switch can translate this reference into a link+anchor to a
Page: page of a Wikisource edition of the Bible; a different switch will
translate the same reference into a link+anchor pointing to the ns0 version
of that Bible. Can you imagine hosting such a set of data on Wikidata? I
can't; some local data container is needed. #switch does the job perfectly,
and creative users will find this way and use it, since it's what is needed
to get the result.
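
In Lua this kind of translation would be mostly string handling rather than
a giant switch. A rough sketch (the page-name and anchor scheme are invented
for the example; a real module would encode the local conventions):

    -- Hypothetical: map a citation like "Genesis 4:5" to a Page: link + anchor.
    local function bibleLink(citation)
        local book, chapter, verse = string.match(citation, '^(.-)%s+(%d+):(%d+)$')
        if not book then
            return citation            -- not in the expected form; leave it alone
        end
        local page = 'Page:Bible_' .. book .. '_' .. chapter   -- invented scheme
        return '[[' .. page .. '#v' .. verse .. '|' .. citation .. ']]'
    end

    -- bibleLink('Genesis 4:5')  -->  [[Page:Bible_Genesis_4#v5|Genesis 4:5]]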

Simply build something lighter, simpler and more efficient than #switch to
get the same result, and users will use it.

Alex brollo


Re: [Wikitech-l] Code review meeting notes

2012-09-21 Thread Roan Kattouw
On Thu, Sep 20, 2012 at 2:08 PM, Mark Holmquist mtrac...@member.fsf.org wrote:
 Hm. Will this be file-level whitelisting (i.e., this file changed from the
 master branch in this patchset, so we'll show the changes) or is it
 line-level? If the latter, how? Because I'm not sure it's trivial

I believe it's file-level, which eliminates most but not all noise.

Roan
