[Wikitech-l] Development policy around database/SQL use

2015-09-14 Thread Rob Lanphier
Hi folks,

Executive summary:
T108255 is the default option for our Wednesday RfC review (E66)[0].
As part of improving our database use, we need to start gating our
code review on better shared norms of SQL correctness.  We need to
enable strict mode, cleanup/enforce primary keys (T17441), and start
using row-based replication (T109179).  Let's talk about this on
Wednesday.

Details:
We're still not 100% decided what our topic for this week's RfC review
meeting will be, but I'm leaning pretty heavily toward T108255.  Jaime
Crespo (Jynus) asked me about it last week, which inspired me to turn
T108255 into an RfC.  After he cleared up my writeup, I think there's
something for us to talk about.

In particular, I originally thought this was merely about enabling
MariaDB's strict mode, and all of the rainbows and unicorns that would
result from that.  Jaime corrected me, pointing out that there is
other database related cleanup we would need to do to get the benefits
of this.

So, as of this writing, T108255 by title still appears to be about
merely enabling strict mode.  It's tempting to split this ticket into
two tickets:
1.  RfC: Write/enforce SQL correctness guidelines
2.  Enable MariaDB/MySQL's Strict Mode

I may make a separate ticket tomorrow unless someone convinces me that
kittens will die as a result.[1]

Regarding SQL correctness guidelines, we have a mess of stuff on
mediawiki.org, which doesn't seem to be very discoverable, and also
doesn't seem to have any teeth to it.  We have a modest number of
pages marked as "MediaWiki development policies"[2], but of the 5
pages that were there, only 1 was specifically about
databases: the weakly named [[Database optimization]][3].  Since
[[Database optimization]] didn't seem to have gotten the review that
[[Security for developers]] or [[Gerrit/+2]] had, I changed its status
to "{{draft}}".

We *do* have something that actually looks more policy-like: the
"#Database patches" section of the [[Development policy]] page[4].
However, it's not clear that the "Development policy" page gets read,
and it has gotten pretty crufty.  It's tempting to put "{{draft}}" on
that one too.

It seems there are a number of sources we could/should be pulling from
to make a database development policy[5].  T108255 (or some
database-related RfC) should be about pulling all of these together
into a coherent set of guidelines.  These guidelines should be
well-known to frequent committers, and should be well-written for a
beginning developer.

What we need to actually *do* is not merely enable strict mode, but
also cleanup/enforce primary keys (T17441), and start using row-based
replication (T109179). Before all of this is complete, we need to gate
our code review on actually making it work.
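For concreteness, the three changes might look roughly like the following on a MariaDB server. This is only a sketch: the exact sql_mode flags are what the RfC would decide, and the table/column names are hypothetical placeholders.

```sql
-- Sketch only: the exact mode flags would come out of the RfC (T108255).
SET GLOBAL sql_mode = 'STRICT_ALL_TABLES';  -- reject bad data instead of silently truncating it

-- Row-based replication (T109179):
SET GLOBAL binlog_format = 'ROW';

-- Primary-key cleanup (T17441) means migrations of roughly this shape
-- (hypothetical table/column names):
ALTER TABLE some_table ADD PRIMARY KEY (some_id);
```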

The fact that we have a mess of documentation and norms is the reason
why I'm leaning toward this topic for the E66 meeting this week.  If
you believe we should talk about this, please participate at T108255
and help get this as far along as possible so that we can wrap things
up at the E66 meeting.  If you believe we should be talking about
something else in our IRC meeting, please say so in E66 on Phab.

Rob

[0]  IRC meeting:

"RfC: Enable MariaDB/MySQL's Strict Mode"


[1]  If someone decides to jfdi, I would recommend using T108255 for
the "Write/enforce SQL correctness guidelines" RfC, and making a new
ticket for the less important "Enable MariaDB/MySQL's Strict Mode".
The comments on the ticket seem to relate more to the former than the
latter, and the subscribers will probably be more interested in the
former.

[2]


[3] 

[4] 

[5] Other database-related guidance for developers:





___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] mailman upgrade next week - Sep 9th 1400 UTC

2015-09-14 Thread Daniel Zahn
There will be another scheduled maintenance window for this upgrade at:

Friday, September 18, 2015 at 2:00:00 PM UTC
(Friday, September 18, 2015 at 7:00:00 AM PDT)

For 3 hours, mail to lists will not be delivered, and I will follow up
once this is over.

Thank you,

Re: [Wikitech-l] thumb generation

2015-09-14 Thread Gergo Tisza
On Mon, Sep 14, 2015 at 4:49 PM, Platonides  wrote:

> You know it will fail for all kinds of images included through templates
> (particularly infoboxes), right?


Indeed, it is not possible to find out what thumbnails are used by a page
without actually parsing it. Your best bet is to wait until Parsoid dumps
become available (T17017 ), then
go through those with an XML parser and extract the thumb URLs. That's
still slow but not as slow as the MediaWiki parser. (Or you can try to find
a regexp which matches thumbnail URLs but we all know what happens
 when you use a regexp to parse
HTML.) After that, just throw those URLs at the 404 handler.
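The pipeline Gergo describes (scan the dump HTML, collect thumbnail URLs, then hit the 404 handler) can be sketched in Python with a proper HTML parser instead of a regexp. This is only an illustration: the `/thumb/` path convention and the sample HTML are assumptions standing in for real Parsoid dump content.

```python
import re
from html.parser import HTMLParser

class ThumbExtractor(HTMLParser):
    """Collect src attributes of <img> tags that look like thumbnail URLs."""
    # The /thumb/ path segment is how Wikimedia's upload server addresses
    # scaled images; adjust the pattern for other MediaWiki installs.
    THUMB_RE = re.compile(r"/thumb/")

    def __init__(self):
        super().__init__()
        self.thumbs = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        src = dict(attrs).get("src", "")
        if self.THUMB_RE.search(src):
            self.thumbs.append(src)

# Hypothetical page HTML; in practice this would come from the Parsoid dump.
html = (
    '<p>Example page.</p>'
    '<img src="//upload.wikimedia.org/wikipedia/commons/thumb/a/ab/X.jpg/220px-X.jpg">'
    '<img src="//example.org/logo.png">'
)
parser = ThumbExtractor()
parser.feed(html)
print(parser.thumbs)
```

Each collected URL would then be requested once so the 404 handler renders and caches the corresponding thumbnail.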

Re: [Wikitech-l] Idea: Cryptographically signed wiki pages.

2015-09-14 Thread John Erling Blad
You will run into problems with transclusions
http://www.w3.org/standards/techs/xmlsig#w3c_all

On Tue, Sep 15, 2015 at 1:54 AM, Platonides  wrote:

> On 13/09/15 18:20, Purodha Blissenbach wrote:
>
>> The idea is that third parties can publish texts, such as their
>> statutes, via an open or public wiki, and readers can be sure to read,
>> download, sign, and mail the originals. Another use would be to have
>> pledges and petitions signed by many people. Etc. It is not about
>> WMF-run Wikis.
>>
>> Purodha
>>
>
>
> You can already use PGP-armored wikitext if you wanted to (you may want to
> parse it locally, ensure that it doesn't call unsigned templates, etc. but
> the option is there).
>
>

Re: [Wikitech-l] Idea: Cryptographically signed wiki pages.

2015-09-14 Thread Platonides

On 13/09/15 18:20, Purodha Blissenbach wrote:

The idea is that third parties can publish texts, such as their
statutes, via an open or public wiki, and readers can be sure to read,
download, sign, and mail the originals. Another use would be to have
pledges and petitions signed by many people. Etc. It is not about
WMF-run Wikis.

Purodha



You can already use PGP-armored wikitext if you wanted to (you may want 
to parse it locally, ensure that it doesn't call unsigned templates, 
etc. but the option is there).
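For illustration, a clearsigned page in that scheme might look like the following (schematic only; the hash choice and the signature body are placeholders, not real signature data):

```
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA256

== Statutes ==
Article 1. ...
-----BEGIN PGP SIGNATURE-----

...base64 signature data...
-----END PGP SIGNATURE-----
```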



Re: [Wikitech-l] thumb generation

2015-09-14 Thread Platonides

On 15/09/15 01:34, wp mirror wrote:

Idea.  I am thinking of piping the *pages-articles.xml.bz2 dump file
through an AWK script to write all unique [[File:*]] tags into a file. This
can be done quickly. The question then is: given a file with all the media
tags, how can I generate all the thumbs? What MediaWiki function shall I
call? Can this be done using the web API? Any other ideas?

Sincerely Yours,
Kent


You know it will fail for all kinds of images included through templates
(particularly infoboxes), right?




Re: [Wikitech-l] Add a Gerrit label "WIP" to mark changes as work in progress

2015-09-14 Thread Jon Robson
It would be good to formalise this.



On Mon, Sep 14, 2015 at 2:07 PM, Tim Landscheidt  
wrote:
> "C. Scott Ananian"  wrote:
>
>> I'd use this tag more often if I could set it from the gerrit
>> command-line when I upload a patch.  Otherwise it will be pretty
>> inconvenient to keep this in sync with the summary line of my patch.
>
> That should be possible with Gerrit's command-line interface
> (cf. https://gerrit-review.googlesource.com/Documentation/cmd-review.html).
> For example, I just voted on
> https://gerrit.wikimedia.org/r/#/c/238201/ with:
>
> | ssh -p 29418 gerrit.wikimedia.org gerrit review --label Code-Review=+1 
> 4266a950bf7d0984cc5177b0f2f8d76b7d0b3c55
>
> This /should/ work for arbitrary labels.
>
> Tim
>
>



-- 
Jon Robson
* http://jonrobson.me.uk
* https://www.facebook.com/jonrobson
* @rakugojon


[Wikitech-l] thumb generation

2015-09-14 Thread wp mirror
Dear Brian,

On 9/13/15, Brian Wolff  wrote:
> On 9/12/15, wp mirror  wrote:
>> 0) Context
>>
>> I am currently developing new features for WP-MIRROR (see <
>> https://www.mediawiki.org/wiki/Wp-mirror>).
>>
>> 1) Objective
>>
>> I would like WP-MIRROR to generate all image thumbs during the mirror
>> build process. This is so that mediawiki can render pages quickly using
>> precomputed thumbs.
>>
>> 2) Dump importation
>>
>> maintenance/importDump.php - this computes thumbs during importation, but
>> is too slow.
>> mwxml2sql - loads databases quickly, but does not compute thumbs.
>>
>> 3) Question
>>
>> Is there a way to compute all the thumbs after loading databases quickly
>> with mwxml2sql?
>>
>> Sincerely Yours,
>> Kent
>
> Hi. My understanding is that wp-mirror sets up a MediaWiki instance
> for rendering the mirror. One solution would be to set up 404-thumb
> rendering. This makes it so that instead of pre-rendering the needed
> thumbs, MediaWiki will render the thumbs on-demand whenever the web
> browser requests a thumb. There are some instructions for how this works
> at https://www.mediawiki.org/wiki/Manual:Thumb.php . This is probably
> the best solution to your problem.

Right. Currently, wp-mirror does set up mediawiki to use 404-thumb
rendering.

This works fine, but can cause a few seconds of latency when rendering pages.
Also, it would be nice to be able to generate thumb dump tarballs, just
like we used to generate original-size media dump tarballs. I would like
wp-mirror to have such dump features.

> Otherwise, MW needs to know what thumbs are needed for all pages,
> which involves parsing pages (e.g. via refreshLinks.php). This is a
> very slow process. If you already had all the thumbnails generated,
> you could just copy over the thumb directory perhaps, but I'm not sure
> where you would get a pre-generated thumb directory.

Wp-mirror does load the *links.sql.gz dump files into the *links tables,
because this method is two orders of magnitude faster than
maintenance/refreshLinks.php.

>--
>-bawolff

Idea.  I am thinking of piping the *pages-articles.xml.bz2 dump file
through an AWK script to write all unique [[File:*]] tags into a file. This
can be done quickly. The question then is: given a file with all the media
tags, how can I generate all the thumbs? What MediaWiki function shall I
call? Can this be done using the web API? Any other ideas?
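The extraction step above could be prototyped as follows (a sketch in Python rather than AWK; the File/Image namespace list is an assumption to be extended per wiki, and, as noted elsewhere in the thread, this only catches directly embedded files, not ones pulled in via templates):

```python
import re

# Matches [[File:Name.ext|...]] links. Real wikis also use localized or
# alias namespaces ("Image:", "Bild:", ...), so this namespace list is an
# assumption to extend per wiki.
FILE_LINK_RE = re.compile(r"\[\[(?:File|Image):([^|\]]+)", re.IGNORECASE)

def unique_file_links(wikitext):
    """Return unique media titles referenced directly in the wikitext."""
    seen = []
    for m in FILE_LINK_RE.finditer(wikitext):
        title = m.group(1).strip()
        if title not in seen:
            seen.append(title)
    return seen

# Hypothetical input; in practice this would be streamed from the
# decompressed *pages-articles.xml.bz2 dump.
sample = "[[File:Foo.jpg|thumb|caption]] text [[Image:Bar.png]] [[File:Foo.jpg]]"
print(unique_file_links(sample))  # → ['Foo.jpg', 'Bar.png']
```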

Sincerely Yours,
Kent

Re: [Wikitech-l] Interested in working on a WikiWidget for algorithm visualization

2015-09-14 Thread Gergo Tisza
Another possible approach is to have the MediaWiki server proxy and cache
the content from the remote server, and then display it in some kind of
safe sandbox. See T31242 .

On Mon, Sep 14, 2015 at 9:15 AM, Daniel Moisset 
wrote:

> How do you handle the "Offline animated image rendering solution" for your
> current wikiwidgets? What I can think of here is an animated gif
> fallback, but that would lose all interactivity (this is something that
> users will be able to do anyway without any implementation effort)
>

I don't think we have widgets currently (apart from some JS hacks which
change from wiki to wiki and can't deal with offline etc). Videos come
closest, and they just display a thumbnail as fallback for offline and old
browsers.
Depending on the implementation, an interactive algorithm visualisation
might work offline just fine, but you need to support old browsers,
print/PDF and so on. A simple static image (or even just text explaining
that there would be a visualisation here but you don't get to see it) would
work IMO.


> Another (technical) question I have is if there's already something in
> mediawiki that would allow users to upload JSON content, or if something
> like that should be implemented. Having that would allow users to use later
> in markup stuff like {{WikiWidget|WalnutVisualizer:myuser/somecontent}}.
>

There are three ways to go about this:

   - Upload widgets as files, using your own MediaHandler and
   MediaTransformOutput classes to display them. This is what Bryan was
   referring to in his answer, and probably the best suited for your use
   case. There is no support for JSON files specifically, but they are not
   really different from any other file. Licensing and some of the display
   is handled by MediaWiki; you just need to return an HTML blob from the
   MediaTransformOutput class, given the dimensions. Compared to the other
   options, this area of the code is somewhat poorly documented, but you
   can look at the existing media handling extensions.
   - Upload widgets as wiki pages. There is some support for JSON pages
   (see ContentHandler and the JsonContent class). This has some advantages
   if you want to focus on collaboratively creating animations, not just
   displaying them (you can provide a custom editor, show custom diffs, and
   so on), but the widget has to be on its own page, so this is not really
   useful here.
   - Upload widgets as part of the wikitext of a page (something like
   ...JSON blob here...). The Graph extension is a good example of this.
   Still somewhat useful for collaborative creation (no JSON support, so
   you get plaintext diffs and less editor support, but that's still a lot
   better than having to compare uploaded files by hand). You need to
   implement any support you want for licensing and rendering (including
   how the user can specify size/location) on your own. You can combine
   this with the previous option to get the benefits of both, via the
   JsonConfig extension (the Graph extension does this as well).

Re: [Wikitech-l] Add a Gerrit label "WIP" to mark changes as work in progress

2015-09-14 Thread Tim Landscheidt
"C. Scott Ananian"  wrote:

> I'd use this tag more often if I could set it from the gerrit
> command-line when I upload a patch.  Otherwise it will be pretty
> inconvenient to keep this in sync with the summary line of my patch.

That should be possible with Gerrit's command-line interface
(cf. https://gerrit-review.googlesource.com/Documentation/cmd-review.html).
For example, I just voted on
https://gerrit.wikimedia.org/r/#/c/238201/ with:

| ssh -p 29418 gerrit.wikimedia.org gerrit review --label Code-Review=+1 
4266a950bf7d0984cc5177b0f2f8d76b7d0b3c55

This /should/ work for arbitrary labels.

Tim



[Wikitech-l] CI checkin Sep 15th

2015-09-14 Thread Antoine Musso
Hello,

The next continuous integration checkin is tomorrow Tuesday 15th at
14:00 UTC (16:00 CEST) in #wikimedia-office *and Google hangouts*:

https://plus.google.com/hangouts/_/wikimedia.org/ci-weekly

You will need a Google account and will probably have to ask for an
invite via the #wikimedia-office channel.


Meeting Agenda:
https://www.mediawiki.org/wiki/Continuous_integration_meetings/2015-09-15

Past meetings with agenda and minutes are archived at:
https://www.mediawiki.org/wiki/Continuous_integration/meetings

Last week minutes:
https://www.mediawiki.org/wiki/Continuous_integration_meetings/2015-09-08/Minutes

cheers,

-- 
Antoine "hashar" Musso



Re: [Wikitech-l] Add a Gerrit label "WIP" to mark changes as work in progress

2015-09-14 Thread C. Scott Ananian
I'd use this tag more often if I could set it from the gerrit
command-line when I upload a patch.  Otherwise it will be pretty
inconvenient to keep this in sync with the summary line of my patch.
 --scott


[Wikitech-l] Visual diffing updates

2015-09-14 Thread Subramanya Sastry

https://github.com/subbuss/parsoid_visual_diffs

I've pushed a bunch of updates over the last week which should now make 
this usable for comparing HTML files from different sources (not 
restricted to PHP parser and Parsoid). I did this so that this could be 
used to compare the rendering of Tidy and HTML5depurate HTML (T89331).


You can now provide options in a config file (example config file 
included in bin/settings.js.example).


You can provide DOM-post-processing scripts to inject (in the context of 
a browser window) and process the HTML before it is screenshotted. This 
is useful in the PHP and Parsoid cases where we need to make sure all 
the closed-by-default tabs / tables / etc. are opened, custom CSS is 
injected (to use Parsoid styling), chrome is removed, etc. PHP parser 
and Parsoid HTML injectable scripts are in lib/*postprocess.js. Similar 
scripts can be provided for other use cases.


You can run a visualdiffing node service (which is currently still 
targeted at visual diffing PHP parser and Parsoid output, but could 
potentially be extended to not be hardcoded for those use cases). The 
codebase also provides a testreduce client for use with Parsoid's 
testreduce service for running these kinds of tests on a set of pages in
an automated fashion.


This codebase is not pretty or polished but it does the job for the use 
cases it is targeted for. I wanted to spend only as much time on it as 
required to get the job done rather than overengineer it for multiple 
use cases.


As part of the RFC discussion on T89331, there was some interest in
using this outside of Parsoid's visual diffing use case. That is the
motivation for this wider update.


This first pass of cleanup, refactoring, and generalization is primarily 
to support work being done as part of T89331. More testing to be done.


But ideas for using this in other scenarios are welcome. More generally,
other feedback and pull requests are welcome.


Subbu.


[Wikitech-l] Join the Wikimedia Developer Summit 2016

2015-09-14 Thread Rachel Farrand
Hello!

The Wikimedia Developer Summit 2016 will be taking place in San Francisco,
CA between January 4th and January 6th, 2016.

Registration is open along with the call for participation.

*Deadline for travel sponsorship requests and the call for participation is
October 2, 2015.*

https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit_2016

Hope to see you in San Francisco!

Re: [Wikitech-l] Interested in working on a WikiWidget for algorithm visualization

2015-09-14 Thread Daniel Moisset
Hi, thanks for the detailed information... my reply inline:

On Fri, Sep 11, 2015 at 7:20 PM, Bryan Davis  wrote:

> (...)
> The high level and non-negotiable needs for integration with Wikipedia
> or the sister projects would be:
> * Freely licensed client (OSI approved license)
> * Offline animated image rendering solution for user-agents that
> can't/won't use the client
> * Ability to host freely licensed "models" on Commons
> * No client interaction with servers not managed by the Wikimedia
> Foundation on behalf of the larger Wikimedia movement.
>
> Ideally the authoring tools would also be freely licensed and capable
> of being integrated with MediaWiki under the same hosting terms listed
> for the client.
>
>
What I can imagine as feasible is something like what there already is for
any "standard" format like png or ogg: we can define a spec for the
visualization input data (which is already a pretty simple JSON, so it is
just a matter of documentation), and a freely licensed (I'm guessing GPL
here) "player" in JS, implemented as a Wikiwidget. So our tool can be a way
to build and generate these JSONs (but anybody could implement their own)
and export them so Wikimedia can host the data. I think that matches your
requirements; it loses part of the interaction on the article (I think that
requires the interaction with external servers you want to avoid), but it
is still a way to allow any user, not only MediaWiki committers, to
contribute visualizations. At upload time the content creator can assert
that the content has appropriate licensing (as you do now when uploading
other media).

How do you handle the "Offline animated image rendering solution" for your
current wikiwidgets? What I can think of here is an animated GIF
fallback, but that would lose all interactivity (this is something that
users will be able to do anyway without any implementation effort).

Another (technical) question I have is whether there's already something in
MediaWiki that would allow users to upload JSON content, or if something
like that should be implemented. Having that would allow users to later use
markup like {{WikiWidget|WalnutVisualizer:myuser/somecontent}}.

Do you think this approach would work, and would be compatible with the
values of the wikimedia foundation?

Thanks for your time,

D.

[Wikitech-l] UploadWizard long term problems

2015-09-14 Thread Steinsplitter Wiki
Hi,

UploadWizard has a lot of bugs and is sometimes de facto unusable.

See error reports here: 
https://commons.wikimedia.org/wiki/Commons:Upload_Wizard_feedback

See also reports on phabricator.

It is a huge problem that the UploadWizard has not been fixed in years.
Especially during WLM we get a lot of error reports.


Kind regards,
Steinsplitter
  

Re: [Wikitech-l] Idea: Cryptographically signed wiki pages.

2015-09-14 Thread Brian Wolff
On 9/13/15, Purodha Blissenbach  wrote:
> In a discussion in the German Pirate Party the idea came up that we
> might want to have cryptographically signed wiki pages.
> I could not find that this has been implemented already anyhow.
>
> Thus, can we develop an extension which provides cryptographically signed
> wiki pages?
>
> A brief and preliminary sketch would mean that any user who provides a
> matching public key could sign any existing page.
> Before a page + signature is saved, the signature is checked for
> validity.
> Editing a signed page is possible without re-signing it.
> There must be a page display allowing to copy+paste the page with
> signature for external verification.
> There should be a button triggering the verification via an external
> online service.
> Maybe signature display of signed pages should be suppressable.
> Any number of independent signatures must be possible on a page.
>
> Does that make sense? Anything vital forgotten?
>
> Feedback welcome.
>
> Greetings -- Purodha
>

Sounds like the sort of use case that would be well-adapted to ContentHandler.

Whether or not this is a good idea depends on what sort of security
goals you have in mind.

Some thoughts
*Key distribution: Can just anyone sign any page with any key? How do
you communicate to the user if the signature is worth anything? Will
some association be made between user accounts and public keys?
*Intent of signature: You may want to have some way to specify what
the intent of the signature is - Is the signer agreeing with the
document? agreeing to be bound by the document? asserting that they
have reviewed the document for factual accuracy?
* "There should be a button triggering the verification via an
external online service" - probably a good idea; keep in mind, though: if
you don't trust the local server, why would you trust that one of its
links goes to the legitimate external server, etc.
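The sign/verify flow under discussion can be sketched as follows. This is purely illustrative: it uses Python's stdlib `hmac` with a shared secret as a stand-in, whereas a real extension would verify detached OpenPGP or similar public-key signatures against a signer's public key; all names here are hypothetical.

```python
import hashlib
import hmac

# Stand-in for a real public-key scheme: HMAC over a digest of the page.
# A production extension would verify detached OpenPGP signatures instead.
def sign_revision(page_text, key):
    # A signature must bind to one exact revision, so hash the page text.
    digest = hashlib.sha256(page_text.encode("utf-8")).digest()
    return hmac.new(key, digest, hashlib.sha256).hexdigest()

def verify_revision(page_text, key, signature):
    # Constant-time comparison, as any signature check should use.
    return hmac.compare_digest(sign_revision(page_text, key), signature)

key = b"hypothetical-signer-key"
text = "== Statutes ==\nArticle 1 ..."
sig = sign_revision(text, key)
print(verify_revision(text, key, sig))              # True: untouched page
print(verify_revision(text + " edited", key, sig))  # False: edits invalidate
```

This also shows why "editing a signed page without re-signing" needs per-revision signatures: any change to the text invalidates the existing signature.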

--
bawolff


[Wikitech-l] Looking for a co-org-admin for Outreachy 11

2015-09-14 Thread Quim Gil
Hi, we are looking for volunteers willing to get involved in the
organization of Wikimedia's participation in the upcoming round of
Outreachy, a program to involve underrepresented communities in open source
projects. https://www.gnome.org/outreachy/

Experienced org admins like Niharika Kohli, Andre Klapper and myself are
looking forward to supporting new volunteers in this role. We also have a
solid and reasonably well documented process in place helping the
onboarding of new org admins, mentors, and interns. This is a good chance
to grow your tech community management experience in a safe and supportive
context, dedicating about 2-4 hours per week between October and February.

About half a year ago, I asked for volunteers to become co-org-admins of the
upcoming Google Summer of Code and Outreachy round. Niharika Kohli answered
back, and she has been an amazing org admin since then. I'm not
exaggerating when I say that her involvement has been one of the best
things impacting my work this year. She has learned and enjoyed a lot, and
she has been very helpful to the interns and mentors that are concluding
their projects as we speak.

Interested? Just reply to this email or comment at
https://phabricator.wikimedia.org/T112267

-- 
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil