Re: [Wikitech-l] LiquidThreads - how do we kill it?

2014-06-09 Thread Lee Worden

 [weighing in on the nesting/quoting bikeshed: the structured quoting
 which Simple Machines Forum (SMF) provides is a nice compromise: it
 preserves the exact origin of the quoted material, for easy
 backreference, but it also allows flexible editing of the quoted
 content and for combining multiple quotations into a single response.]

Clearly the whole thing should be started over using Xanadu [1] :)

[1]
http://www.theguardian.com/technology/2014/jun/06/vapourware-software-54-years-xanadu-ted-nelson-chapman


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [RFC] Extension registration

2014-06-04 Thread Lee Worden

Date: Wed, 04 Jun 2014 02:12:30 -0700
From: Daniel Friesen dan...@nadir-seen-fire.com

On 2014-06-04, 1:29 AM, Legoktm wrote:

== Extension locations ==
We agreed that we should require extensions to all be in the same
directory, but that directory should be configurable. By default it
will point to $IP/extensions.

I still do NOT like this idea.

By all means there should be one directory for extensions that are
managed by a web/cli installer and the method of loading extensions from
that one directory should be simple even when we're still using a php
settings file. But when someone is intentionally not using that and
doing complex config then we shouldn't stop them from saying to load an
extension from a specific directory.


+1



Re: [Wikitech-l] Pending changes to Extension:MultiUpload

2014-04-09 Thread Lee Worden

On 04/04/2014 06:33 PM, Lee Worden wrote:

The MultiUpload extension has been unmaintained for a while.  I believe
its author has moved on to another job.  I've written a completely
independent piece of extension code (as a byproduct of work I did for
the WorkingWiki extension) that provides a multiple-upload interface.
After consulting with Petrb, who is listed as the current maintainer of
MultiUpload on mediawiki.org, I'm thinking of checking in my
multiple-upload code as a new version of MultiUpload, completely
replacing the current implementation.


This is done now.  A completely reimplemented MultiUpload extension is 
now current, in the mediawiki/extensions/MultiUpload git repo, and in 
the documentation at https://www.mediawiki.org/wiki/Extension:MultiUpload.


Please direct any concerns or requests to me.

Lee Worden





[Wikitech-l] Pending changes to Extension:MultiUpload

2014-04-04 Thread Lee Worden
The MultiUpload extension has been unmaintained for a while.  I believe 
its author has moved on to another job.  I've written a completely 
independent piece of extension code (as a byproduct of work I did for 
the WorkingWiki extension) that provides a multiple-upload interface. 
After consulting with Petrb, who is listed as the current maintainer of 
MultiUpload on mediawiki.org, I'm thinking of checking in my 
multiple-upload code as a new version of MultiUpload, completely 
replacing the current implementation.


I'm posting here in case anyone objects or cares to raise concerns.

Here's a bare-bones demo video: https://www.youtube.com/watch?v=4d90Tt5EGAc

And here's my extension, in a temporary location: 
https://github.com/worden-lee/MultiUpload


Lee Worden


Re: [Wikitech-l] Gerrit Commit Wars

2014-03-08 Thread Lee Worden

On 03/08/2014 02:10 PM, Isarra Yos zhoris...@gmail.com wrote:

On 08/03/14 09:15, Niklas Laxström wrote:

Please do not forget the contributors who want to improve MediaWiki
for their own needs. We also have to balance how much we inconvenience
them to meet the requirements of WMF. In my opinion, the balance is
already in favor of WMF.

   -Niklas


This.

MediaWiki serves many interests, and it's already hard enough for
third-party users to contribute their features/improvements upstream
that most simply don't even try. We should be trying to improve this,
not make it even worse for them, because these are often things that
would prove widely useful even if they are not /currently/ WMF
priorities (it is not uncommon for them to later become such and then have
to be completely reimplemented).


+1.

BTW, anyone care to have a look at
https://gerrit.wikimedia.org/r/#/c/67173/3
? :)

(It's been sitting around for a good while, and it would make my life 
tons easier if it could get into the 1.23 LTS.)


LW


Re: [Wikitech-l] Comet

2013-11-19 Thread Lee Worden
Thanks for responding, Aran, Tyler, and Daniel.  I appreciate your 
thoughtful advice.


I am planning to use SSE.  It's for some slow server-side operations
that typically take 1-3 minutes or so, where I want to give the user
near-real-time updates on what's happening.  I don't need full duplex
or a long-term connection, and I don't want to make wiki admins install
extra server software.  I could just do it with Ajax polling, but I
think SSE will give a more responsive feel.

I have an early version of this working now, which watches a file and
spools it out in an SSE event stream (by violently hijacking the
ApiBase logic and seizing low-level control of the HTTP response - I'm
sure it could be integrated into the output options in the Api class
hierarchy if there's interest).  This seems to work pretty well so far.

For slow operations with side effects, I'm going to do it in two parts: 
client calls api.php to request the operation; it sends back a single 
SSE event containing a key, and goes on with its business, writing 
information about its progress to a file; client calls api.php a second 
time, using the key, to watch that file.  This way it can robustly 
handle a bad connection or timeout by reconnecting without causing any 
unwanted side effects, like restarting the long operation.  The stream 
of output could just as well go in memcache or the database as in a 
file, but for my project it makes more sense to use the file system, 
because some of it will come from redirecting the output of unix commands.
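To make the scheme above concrete, here is a rough sketch in Python (not the actual WorkingWiki/MediaWiki PHP code; the function names, polling interval, and done marker are illustrative assumptions) of the file-watching half: tail a progress file and frame whatever appears as SSE events.

```python
import time

def sse_event(data, event=None):
    """Frame one server-sent event: an optional 'event:' field, one
    'data:' line per payload line, and a blank-line terminator, per
    the text/event-stream format."""
    lines = []
    if event:
        lines.append(f"event: {event}")
    for line in data.splitlines() or [""]:
        lines.append(f"data: {line}")
    return "\n".join(lines) + "\n\n"

def stream_progress_file(path, poll_seconds=0.5, done_marker="DONE"):
    """Tail `path`, yielding newly appended content as SSE events
    until the (hypothetical) done marker appears.  A real handler
    would write these frames into an HTTP response sent with
    Content-Type: text/event-stream."""
    offset = 0
    while True:
        with open(path) as f:
            f.seek(offset)
            chunk = f.read()
            offset = f.tell()
        if chunk:
            yield sse_event(chunk.rstrip("\n"), event="progress")
            if done_marker in chunk:
                return
        else:
            time.sleep(poll_seconds)
```

The two-phase design then amounts to one endpoint that starts the slow job, returns a key, and writes progress to a file named after that key, plus a second endpoint that runs a loop like stream_progress_file() over that file, so a dropped connection can simply reconnect with the same key without restarting the operation.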


LW


From: Aran a...@organicdesign.co.nz

Wouldn't WebSocket be the better choice for a full duplex channel?

On 14/11/13 21:12, Lee Worden wrote:

In the MW extension development I'm doing, I'm thinking of writing
some operations that use [[Comet_(programming)]] to deliver continuous
updates to the client, rather than the Ajax pattern of one request,
one response.

Has anyone messed with this?  Any code I should crib from, or advice
or cautionary tales?  Also, if it develops into something useful, I
could split it out for others to use.

Thanks,
LW



From: Tyler Romeo tylerro...@gmail.com

I have not messed with it personally, but I think it is a good idea. You
should also know that the HTML5 standard has standardized the Comet model
into server-sent events (SSE). [1] Mozilla also provides a nice tutorial on
how to use it. [2] However, one big catch is that this is not currently
implemented in Internet Explorer or mobile browsers. [3] So you'd have to
have your own custom pure-JavaScript implementation for IE support.

WebSocket, as another mentioned, is also an approach you could use.
However, WebSockets are meant for full duplex communication, meaning the
client is also talking back to the server, which may or may not be what you
want. Also using WebSockets means the internals of what is sent over the
socket and what it means is left to you to design, rather than being
standardized. Not to mention the fact that you have to implement WebSockets
in PHP or find a reliable library that will do it for you. And even then,
WebSockets are only supported in IE 10 and later, so you're still a bit
screwed in terms of backwards compatibility.

[1] http://www.w3.org/TR/eventsource/
[2]
https://developer.mozilla.org/en-US/docs/Server-sent_events/Using_server-sent_events
[3] http://caniuse.com/#feat=eventsource



From: Daniel Friesen dan...@nadir-seen-fire.com

Rather than natively using WebSockets you could use a higher-level library.
There are two of these in existence, Socket.IO[1] and SockJS[2].
Socket.IO is the popular implementation.
However when I looked into it I found SockJS to be the superior
implementation IMHO.

These libraries use WebSockets when available and fall back to a variety
of other methods when WebSockets aren't available.
So they support practically every browser.

However you're probably going to have to give up on PHP.

Neither Socket.IO nor SockJS have a PHP server implementation.
The only thing that shows up for Socket.IO is Elephant.IO[3] which is a
Socket.IO *client*.

If you go down to raw WebSockets there is Ratchet[4]. However that
brings up a number of issues that accumulate up to losing every single
point you could possibly have to run it using PHP.
Ratchet needs to spawn its own server. This means it needs a dedicated
port, domain, or a reverse proxy in front of it.
It also means that even though it's written in PHP you can no longer run
it on any form of shared hosting.
PHP is also not designed to run long running daemons. It can do it, but
you will have to watch this very carefully and be prepared to fix any
bugs that show up.
You'd expect that since it's in PHP you at least have the one remaining
advantage that you can directly interact with MediaWiki.
However all of MediaWiki's code is synchronous. This means that if you
directly use MW every time it needs to do something with the database,
memcached, filesystem, other storage, or a slow MW code path your

[Wikitech-l] Comet

2013-11-14 Thread Lee Worden
In the MW extension development I'm doing, I'm thinking of writing some 
operations that use [[Comet_(programming)]] to deliver continuous 
updates to the client, rather than the Ajax pattern of one request, one 
response.


Has anyone messed with this?  Any code I should crib from, or advice or 
cautionary tales?  Also, if it develops into something useful, I could 
split it out for others to use.


Thanks,
LW


Re: [Wikitech-l] Memento Extension for MediaWiki: Advice on Further Development

2013-11-04 Thread Lee Worden

From: Brian Wolff bawo...@gmail.com
On 2013-11-04 11:04 AM, Brad Jorsch (Anomie) bjor...@wikimedia.org
wrote:


On Fri, Nov 1, 2013 at 6:43 PM, Brian Wolff bawo...@gmail.com wrote:

 I haven't looked at your code, so not sure about the context - but: In
 general a hook returns true to denote no further processing should take
 place.


If we're talking about wfRunHooks hooks, the usual case is that they
return *false* to indicate no further processing, and true means to
*continue* processing.


D'oh. You are of course correct. Sorry for the mistake.

As an aside, perhaps we should introduce constants for this. It's easy to
mix up the two values.

-bawolff


+1 - great idea!
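For what it's worth, the suggestion might look something like this sketch (hypothetical constant names, shown in Python for brevity rather than MediaWiki's PHP):

```python
# Named constants for hook return values -- the names are hypothetical,
# and MediaWiki itself is PHP, so this is only the shape of the idea,
# not the real API.
HOOK_CONTINUE = True   # let subsequent handlers run
HOOK_ABORT = False     # stop further processing

def log_page_save(page):
    """Example handler that never blocks later handlers."""
    print(f"saving {page}")
    return HOOK_CONTINUE

def run_hooks(handlers, *args):
    """Minimal runner: stop as soon as one handler returns HOOK_ABORT."""
    for handler in handlers:
        if handler(*args) == HOOK_ABORT:
            return HOOK_ABORT
    return HOOK_CONTINUE
```

The point is simply that `return HOOK_ABORT` reads unambiguously, where a bare `return false` is easy to write with the opposite meaning in mind.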


Re: [Wikitech-l] What are DeferredUpdates good for?

2013-09-17 Thread Lee Worden

From: Aaron Schulz aschulz4...@gmail.com

Until what? A timestamp? That would be more complex and prone to over/under
guessing the right delay (you don't know how long it will take to commit). I
think deferred updates are much simpler as they will just happen when the
request is nearly done, however long that takes.


The two systems could be merged by putting these updates in a local 
queue, and adding them to the real jobqueue at the end of the request?
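The request-local queue flushed into the job queue might be sketched like so (hypothetical names; not MediaWiki's actual DeferredUpdates or JobQueue API):

```python
class RequestUpdateQueue:
    """Sketch of the merged design: collect deferred updates during a
    request, then hand them to a persistent job queue when the request
    ends.  Names here are illustrative, not MediaWiki's real classes."""

    def __init__(self, job_queue):
        self.job_queue = job_queue   # e.g. backed by a DB table
        self.pending = []

    def defer(self, update):
        """Queue an update locally; nothing runs yet."""
        self.pending.append(update)

    def flush_to_job_queue(self):
        """Called once, when the request is nearly done: push every
        pending update into the real job queue, in order."""
        while self.pending:
            self.job_queue.append(self.pending.pop(0))
```

The trade-off versus running deferred updates in-process is that flushed jobs inherit the job queue's durability and retry behavior, at the cost of not completing before the next request arrives.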

LW


[Wikitech-l] editsection styling

2013-09-01 Thread Lee Worden
I just noticed that the editsection links, now with both the visual and 
source editor links, use a pipe character (|) to separate the two.  Is 
this generally the desired way to separate multiple links in a bracketed 
section like this, in MW?  Is there a policy?


I ask because I've been producing editsection-like links for a long time 
in our extension project, with commas in between - for example a LaTeX 
document will come with a list of links like [log, pdf, dvi].  Maybe I 
should switch to using pipes instead of commas.


Sorry in advance for such a bikeshed-friendly question :P ...

Lee Worden
McMaster U., UCB, SF Art Institute
WorkingWiki


Re: [Wikitech-l] MediaWiki to Latex Converter

2013-08-13 Thread Lee Worden

Hi Micru!

What use case do you have in mind?

The main use of latex in WorkingWiki is simply authoring .tex files 
directly...


Lee Worden

p.s. I meant to post the What is WorkingWiki video here that I put on 
twitter during Wikimania: http://www.youtube.com/watch?v=D8pwr-Uizf4



Date: Tue, 13 Aug 2013 16:04:44 -0400
From: David Cuenca dacu...@gmail.com
To: Wikimedia developers wikitech-l@lists.wikimedia.org

I think this is going to be great news for WorkingWiki
https://www.mediawiki.org/wiki/Extension:WorkingWiki
Cheers, Micru

On Tue, Aug 13, 2013 at 3:57 PM, Greg Grossmeier g...@wikimedia.org wrote:

Quoting Dirk Hünniger (2013-08-04 10:05:35 +0200):

Hello,
I made a new debian package, which resolves the security issues you
mentioned.
It is available here:
http://sourceforge.net/projects/wb2pdf/files/mediawiki2latex/6.5/
Yours Dirk


I had to use a mailing list archive to see the full thread here, which
started way back in 2004(!!!):
http://www.gossamer-threads.com/lists/wiki/wikitech/281372

A highlight I noticed: the wb2pdf that Dirk links above went from a
90+meg download to only a 2.8 meg Debian package.

Thanks for your persistence, Dirk.

Greg

--
| Greg Grossmeier            GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg           A18D 1138 8E47 FAC8 1C7D |





Re: [Wikitech-l] Suggestion for solving the disambiguation problem

2013-07-16 Thread Lee Worden
Maybe it could be done with just the Referer field on the second 
request, without needing to log two different page requests and 
correlate them.
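A sketch of that Referer-based variant (helper names are hypothetical, not EventLogging's API; it also assumes the browser sends a usable Referer header, which privacy settings increasingly strip):

```python
# Log disambiguation fix-up hints from the Referer header of the
# *second* request, instead of correlating two logged page views.
from collections import Counter
from urllib.parse import urlparse

def record_click(counter, referer, target):
    """Count one (referring page, chosen target) pair, taking the
    referring page title from the Referer URL's path."""
    page = urlparse(referer).path.rsplit("/", 1)[-1]
    counter[(page, target)] += 1

def suggestions(counter, dab_page):
    """Share of clicks out of `dab_page` for each target reached."""
    clicks = {t: n for (p, t), n in counter.items() if p == dab_page}
    total = sum(clicks.values())
    return {t: n / total for t, n in clicks.items()}
```

In the William Blake / London (poem) example from the quoted proposal, 4 of 5 logged clicks out of London_(disambiguation) going to London (poem) would surface as a 0.8 share, which is exactly the "80% of visitors" hint an editor could be shown.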



Date: Tue, 16 Jul 2013 14:14:42 -0400
From: David Cuenca dacu...@gmail.com

Good idea, it could also help to know which are the links more used in a
disambiguation page to sort them by importance.

Micru

On Tue, Jul 16, 2013 at 2:03 PM, Nicolas Vervelle nverve...@gmail.com wrote:


Interesting idea...


On Mon, Jul 15, 2013 at 11:41 PM, Jon Robson jdlrob...@gmail.com wrote:


 I understand there is an issue that needs solving where various pages
 link to disambiguation pages. These need fixing to point at the
 appropriate thing.
 
 I had a thought on how this might be done using a variant of
 EventLogging...
 
 When a user clicks on a link that is a disambiguation page and then
 clicks on a link on that page we log an event that contains
 
 * page user was on before
 * page user is on now
 
 If we were to collect this data it would allow us to statistically
 suggest what the  correct disambiguation page might be.
 
 To take a more concrete theoretical example:
 * If I am on the Wiki page for William Blake and click on London I am
  taken to https://en.wikipedia.org/wiki/London_(disambiguation)
 * I look through and see London (poem) and click on it
 * An event is fired that links London (poem) to William Blake.
 
 Obviously this won't always be accurate but I'd expect generally this
 would work (obviously we'd need to filter out bots)
 
 Then when editing William Blake say that disambiguation links are
 surfaced. If I go to fix one it might prompt me that 80% of visitors
 go from William Blake to London (poem).
 
 
 Have we done anything like this in the past? (Collecting data from
 readers and informing editors)
 
 I can imagine applying this sort of pattern could have various other
 uses...
 
 
 
 
 --
 Jon Robson
 http://jonrobson.me.uk
 @rakugojon
 



[Wikitech-l] Special:Upload

2013-06-10 Thread Lee Worden

Hi -

In the extension development I'm doing, I need a custom file-upload 
interface.  I'm building it around the existing Special:Upload, and as a 
side benefit I've been building a rewritten version of 
Extension:MultiUpload.


In order to do this in what seems to me a reasonable, future-compatible 
way - particularly by calling Special:Upload's methods rather than 
duplicating their code in extension classes - I've needed to split out 
some of Special:Upload's code into separate functions that can be 
overridden in subclasses.  Those changes are in gerrit and bugzilla now:

https://gerrit.wikimedia.org/r/#/c/67173/
https://bugzilla.wikimedia.org/show_bug.cgi?id=48581

I'm posting here in case people want to discuss those changes.  Ideally 
I'd like to backport that to 1.19 so I can support LTS users.


Also, I'd like to submit my MultiUpload code for review, but I'm not 
sure how to do that, because it looks like Extension:MultiUpload hasn't 
been brought over from svn to gerrit.  I'd either submit it as a commit 
that replaces most of the extension's code, or propose it as a separate 
extension.  Please advise me...


Thanks!
Lee Worden


Re: [Wikitech-l] Corporate needs are different (RE: How can we help Corporations use MW?)

2013-02-09 Thread Lee Worden

On 02/09/2013 03:00 PM, Platonides wrote:

On 08/02/13 21:51, Lee Worden wrote:

As an aside, you could almost certainly do this cheaper with
WorkingWiki.  If you can write a make rule to retrieve the Excel file
from the network drive and make it into html and image files (and maybe
a little wikitext to format the page), you're done.

LW


You could do it with openoffice.org/libreoffice, although I agree that
getting all the dependencies right for running in the server is a bit
tedious. You can also use Excel itself for that (eg. COM automation), as
suggested by vitalif, supposing you are using a Windows server.


Yes, something like that is what I had in mind.

On 02/09/2013 11:06 AM, Antoine Musso wrote:
 In big companies, 10 000$ is cheap. Plus I bet they get a support
 contract coming in.  Overall, that is probably cheaper than paying an
 internal software developer to integrate and then maintain the
 WorkingWiki solution.

 -- Antoine hashar Musso

True.  Others who operate less formally might find it a welcome option, 
OTOH.

LW


Re: [Wikitech-l] Corporate needs are different (RE: How can we help Corporations use MW?)

2013-02-08 Thread Lee Worden

On 02/08/2013 10:23 AM, Daniel Barrett wrote:

O_O $1 excel-to-html? O_OOO
Why not just copy-paste into for example wikEd (google://wikEd)? :-))) Not 
that beautiful, but it works.


Now, I will demonstrate what I mean by "Corporate needs are different." :-)

With our extension, the Excel spreadsheet is rendered live in the wiki page.
So if somebody updates the spreadsheet (on a network drive), the wiki page is
automatically and instantly up to date!  This is totally different from a 
one-time
copy-and-paste, and much more maintainable. (And it's pretty fast too, with 
AJAX and good caching.)

Even better, if your spreadsheet generates a graph or chart, the image gets 
embedded
in the wiki page too, and is automatically kept up to date.  And if your 
spreadsheet
calls out to a database for its data, to generate the chart, then the wiki is 
updated
when the database changes too! Suddenly, MediaWiki has all the charting 
capability of
Excel + SQL.  This is very powerful and definitely worth $10K for a highly 
analytical
company like ours.

We've had this feature for about 2 months, and so far we have 350+ articles with
embedded spreadsheets, updated live.


As an aside, you could almost certainly do this cheaper with 
WorkingWiki.  If you can write a make rule to retrieve the Excel file 
from the network drive and make it into html and image files (and maybe 
a little wikitext to format the page), you're done.
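A make rule of the kind described might look like this (the paths and the converter command are illustrative assumptions; ssconvert from Gnumeric is one tool that can do the Excel-to-HTML step, but any converter would do):

```make
# Fetch the spreadsheet from the network drive, then convert it to
# HTML for embedding in the wiki page.  Both paths and the choice of
# converter are hypothetical.
report.xlsx: /mnt/shared/reports/report.xlsx
	cp $< $@

report.html: report.xlsx
	ssconvert $< $@
```

WorkingWiki would then re-run the rule when the page is refreshed, so the embedded HTML tracks the spreadsheet on the drive.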


LW


Re: [Wikitech-l] An actual bikeshed

2013-01-23 Thread Lee Worden

On 01/23/2013 12:56 PM, Bawolff Bawolff wrote:

On 2013-01-23 3:38 PM, Jon Robson jdlrob...@gmail.com wrote:


Suggested solution:
Maybe some kind of voting system might be of use to force some kind of
consensus rather than leaving problems unsolved. I'm fed up of
receiving emails about the same problem I discussed weeks before that
never got solved. It makes my mailbox ill.

I mean if the question is really what colour is the bikeshed it would
be good for people to propose colours, people to vote on preferred
colours and at the end of say a week the majority colour wins and gets
implemented (or in cases where there is no majority we discuss the
front runners and other possible solutions).



What colour should the polling booth be?

I don't think the answer is voting. Perhaps there are some sheds that don't
need to be painted.

-bawolff

P.s. if someone built a bikeshed in the wmf office they would be my hero


I have put out a call for one: 
http://groups.freecycle.org/oaklandfreecycle/posts/27285249


If I get one, I will donate it to the wmf office.
LW



Re: [Wikitech-l] Release Notes and ContentHandler

2013-01-14 Thread Lee Worden

On 01/13/2013 07:00 PM, wikitech-l-requ...@lists.wikimedia.org wrote:

From: Daniel Kinzler dan...@brightbyte.de
To: Wikimedia developers wikitech-l@lists.wikimedia.org
Subject: Re: [Wikitech-l] Release Notes and ContentHandler
Message-ID:50f2abf9.4020...@brightbyte.de
Content-Type: text/plain; charset=ISO-8859-1

On 13.01.2013 02:02, Lee Worden wrote:

Yes, I think ContentHandler does some of what WW does, and I'll be happy to
integrate with it.  I don't think we'll want to abandon the <source-file> tag,
though, because on pages like
http://lalashan.mcmaster.ca/theobio/math/index.php/Nomogram  and
http://lalashan.mcmaster.ca/theobio/worden/index.php/Selection_Gradients/MacLev,
it's good to be able to intertwine source code with the rest of the page's
wikitext.

ContentHandler does not yet have good support for inclusion - currently, there's
just Content::getWikitextForInclusion(), which is annoying. It would be much
nicer if we had Content::getHTMLForInclusion(). That would allow us to
transclude any kind of content anywhere.

That would mean that instead of <source-file>Foo.tex</source-file>, you could
just use {{:Foo.tex}} to transclude Foo.tex's content. Actually, you could
implement getWikitextForInclusion() to return
<source-file>Foo.tex</source-file>, I guess - but that's cheating ;)


It's not <source-file>Foo.tex</source-file> - that would be ridiculous. 
 If nothing else, I'd use <source-file filename="Foo.tex"/>. :)  [In 
fact, I do support that form, but it's hardly ever used.]


No, it's <source-file filename="Foo.tex">\documentclass{article} [...] 
\end{document}</source-file>.  It stores the actual source code.  The 
file is not transcluded.  This may seem ridiculous at first blush, and 
you might never want to do that with long .tex files or Python modules, 
but think instead of explaining a concept by using a series of little 
4-line programs, to illustrate each step.  You can think of 
<source-file> as <syntaxhighlight> with semantics behind it.  It 
displays a code listing, with highlighting, but it also lets you run the 
code, in context, and display the output.


There are plenty of use cases where you don't want to do that, you just 
want to put each source file on its own page, and I support that now, 
and I'll definitely support the use of ContentHandler when it's time, as 
I've been planning to all along.  But the ability to edit, run, test, 
and refine the code by pressing the Preview button, without saving your 
changes to the wiki until you're satisfied, even if the code is in 
multiple source files, is a valuable affordance, and we may not want to 
give it up.



Also, in a multi-file project, for instance a simple LaTeX project with a .tex
file and a .bib file, it's useful to put the files on a single page so you can
edit and preview them for a while before saving.

That would not change when using the ContentHandler: You would have one page for
the .tex file, one for the .bib file, etc. The difference is that MediaWiki will
know about the different types of content, so it can provide different rendering
methods (syntax highlighted source or html output, as you like), different
editing methods (input forms for bibtext entries?).



Basically, you no longer need nasty hacks to work around MediaWiki's assumption
that pages contain wikitext, because that assumption was removed in 1.21.


Bundling several source files onto a single page, with wikitext 
interspersed, is a form of literate programming.  It's part of the same 
tradition as knitr, R markdown, sage notebooks, etc.  There are valuable 
uses for embedding code in free-form markup, and I think it's useful to 
be able to do it in MediaWiki.


LW


-- daniel




Re: [Wikitech-l] Release Notes and ContentHandler

2013-01-12 Thread Lee Worden

Hi Mark,

Do you know that the WorkingWiki extension pretty much already does this?

http://lalashan.mcmaster.ca/theobio/projects/index.php/WorkingWiki

I haven't looked into the ContentHandler features in any detail, but I 
don't imagine it'll be hard to handle source files stored on pages with 
special content types along with the other ways of storing source files 
that we already support.  And yes, I'm sure I'll do it at some point.


Lee Worden


Date: Fri, 11 Jan 2013 20:19:57 -0500
From: Mark A. Hershberger m...@everybody.org
To: Wikimedia developers wikitech-l@lists.wikimedia.org
Subject: [Wikitech-l] Release Notes and ContentHandler
Message-ID:50f0ba3d.6060...@everybody.org
Content-Type: text/plain; charset=ISO-8859-1

As you may have guessed, I've been working on the release notes for
1.21.  Please look over them and improve them if you can.

In the process, I came across the ContentHandler blurb.  I don't recall
this being discussed on-list, but, from looking at the documentation for
it, it looks pretty awesome.  I've used some of my editorial powers to
say, in the release notes:

Extension developers are expected to create additional types in the
future. These might support LaTeX or other forms of markup.

Is this correct? It sounds like a really big thing, if it is.

-- http://hexmode.com/ Language will always shift from day to day. It is
the wind blowing through our mouths. -- http://hexm.de/np




Re: [Wikitech-l] Release Notes and ContentHandler

2013-01-12 Thread Lee Worden

From: Mark A. Hershberger m...@everybody.org
To: Wikimedia developers wikitech-l@lists.wikimedia.org
Subject: Re: [Wikitech-l] Release Notes and ContentHandler
Message-ID:50f16dc2.70...@everybody.org
Content-Type: text/plain; charset=ISO-8859-1

On 01/12/2013 08:00 AM, Lee Worden wrote:

Do you know that the WorkingWiki extension pretty much already does this?

I was not aware of this extension.

If I am understanding the example at
http://lalashan.mcmaster.ca/theobio/projects/index.php/Greenhouse_paper
correctly, it looks like it uses the <source-file> tag to make it so
that parts of the page are handled by LaTeXML.


Yes, that's what's happening there.  On other pages we store R source 
code, makefiles, and whatever else people want to store.



Just from looking over https://www.mediawiki.org/wiki/ContentHandler
there seems to be a lot of cross over between what the WorkingWiki
extension does and what ContentHandler enables.  Having ContentHandler
in core may mean that you can reduce the amount of code in the
WorkingWiki extension.


Yes, I think ContentHandler does some of what WW does, and I'll be happy 
to integrate with it.  I don't think we'll want to abandon the 
<source-file> tag, though, because on pages like 
http://lalashan.mcmaster.ca/theobio/math/index.php/Nomogram and
http://lalashan.mcmaster.ca/theobio/worden/index.php/Selection_Gradients/MacLev, 
it's good to be able to intertwine source code with the rest of the 
page's wikitext.


Also, in a multi-file project, for instance a simple LaTeX project with 
a .tex file and a .bib file, it's useful to put the files on a single 
page so you can edit and preview them for a while before saving.


LW



Re: [Wikitech-l] ContentHandler examples?

2013-01-12 Thread Lee Worden

From: Ori Livneh o...@wikimedia.org
To: Wikimedia developers wikitech-l@lists.wikimedia.org
Subject: Re: [Wikitech-l] ContentHandler examples?

After working with the API for a while I had a head-explodes moment when I
realized that MediaWiki is now a generic framework for collaboratively
fashioning and editing content objects, and that it provides a generic
implementation of a creative workflow based on the concepts of versioning,
diffing, etc. I think it's a fucking amazing model for the web and I hope
MediaWiki's code and community is nimble enough to fully realize it.
realize it.


I agree.  Wikis can do lots more than create documents, and we've only 
started down that road.  ContentHandler is a step in this direction, and 
our project is as well.  I'd love to have conversations about all this. 
 I'm based in the Bay Area, by the way, even though our project is 
based in Canada.


Lee Worden



Re: [Wikitech-l] Alpha version of the VisualEditor now available on the English Wikipedia

2012-12-12 Thread Lee Worden

From: Roan Kattouw roan.katt...@gmail.com
On Tue, Dec 11, 2012 at 10:55 PM, Lee Worden worden@gmail.com wrote:

Very exciting - congratulations!

I know these are early days for the VisualEditor, but is there a plan for
extension developers to be able to hook in to provide editing for the things
their extensions support?

Yes, absolutely! We've been working on cleaning up and rewriting
various internal APIs in VE such that they can reasonably be used to
write extensions. We've made progress, but we're not done yet, and


Fantastic!


more recently it's received less attention because of yesterday's
release.


Of course.


We're gonna be picking that work back up in January, and once
it's done, we would be happy to work with willing guinea pigs to test
our APIs in the wild and work out the remaining kinks. As for when
that'll actually be scheduled to happen, I defer to James F.

Roan


Great!  I'll stay tuned.  Great work!
Lee



Re: [Wikitech-l] Alpha version of the VisualEditor now available on the English Wikipedia

2012-12-11 Thread Lee Worden

Very exciting - congratulations!

I know these are early days for the VisualEditor, but is there a plan 
for extension developers to be able to hook in to provide editing for 
the things their extensions support?


Lee Worden
http://leeworden.net

On 12/11/2012 10:28 PM, wikitech-l-requ...@lists.wikimedia.org wrote:

From: James Forrester jforres...@wikimedia.org
To: Wikimedia developers wikitech-l@lists.wikimedia.org,
wikimedi...@lists.wikimedia.org
Subject: [Wikitech-l] Alpha version of the VisualEditor now available
on the  English Wikipedia
Message-ID:
CAEWGtDWtQ5a-a=J8DN1vNmQH1J=5hl0k59t+lrwobjfpu+d...@mail.gmail.com
Content-Type: text/plain; charset=UTF-8

TL;DR: Today we are launching an alpha, opt-in version of the
VisualEditor[0] to the English Wikipedia. This will let editors create
and modify real articles visually, using a new system where the
articles they edit will look the same as when you read them, and their
changes show up as they enter them, like writing a document in a
word processor. Please let us know what you think[1].


Why launch now?

We want our community of existing editors to get an idea of what the
VisualEditor will look like in the "real world" and start to give us
feedback about how well it integrates with how they edit right now,
and their thoughts on what aspects are the priorities in the coming
months.

The editor is at an early stage and is still missing significant
functions, which we will address in the coming months. Because of
this, we are mostly looking for feedback from experienced editors at
this point, because the editor is insufficient to really give them a
proper experience of editing. We don't want to promise an easier
editing experience to new editors before it is ready.

As we develop improvements, they will be pushed every fortnight to the
wikis, allowing you to give us feedback[1] as we go and tell us what
next you want us to work on.


How can I try it out?

The VisualEditor is now available to all logged-in accounts on the
English Wikipedia as a new preference, switched off by default. If you
go to your "Preferences" screen and click into the "Editing" section,
it will have an option labelled "Enable VisualEditor".

Once enabled, for each article you can edit, you will get a second
editor tab labelled "VisualEditor" next to the "Edit" tab. If you
click this, after a little pause you will enter the VisualEditor. From
here, you can play around, edit and save real articles and get an idea
of what it will be like when complete.

At this early stage in our development, we recommend that after saving
any edits, you check whether they broke anything. All edits made with
the VisualEditor will show up in articles' history tabs with a
"VisualEditor" tag next to them, so you can track what is happening.


Things to note

Slow to load - It will take some time for long complex pages to load
into the VisualEditor, and particularly big ones may time out after 60
seconds. This is because pages have to be loaded through Parsoid which
is also in its early stages, and is not yet optimised for deployment
and is currently uncached. In the future (a) Parsoid itself will be
much faster, (b) Parsoid will not depend on as many slow API calls,
and (c) it will be cached.

Odd-looking - we currently struggle with making the HTML we produce
look like you are used to seeing, so styling and so on may look a
little (or even very) odd. This hasn't been our priority to date, as
our focus has been on making sure we don't disrupt articles with the
VisualEditor by altering the wikitext (correct round-tripping).

No editing references or templates - Blocks of content that we cannot
yet handle are uneditable; this is mostly references and templates
like infoboxes. Instead, when you mouse over them, they will be
hatched out and a tooltip will inform you that they have to be edited
via wikitext for now. You can select these items and delete them
entirely, however there is not yet a way to add ones in or edit them
currently (this will be a core piece of work post-December).

Incomplete editing - Some elements of complex formatting will
display and let you edit their contents, but not let users edit their
structure or add new entries - such as tables or definition lists.
This area of work will also be one of our priorities post-December.

No categories - Articles' meta items will not appear at all -
categories, langlinks, magic words etc.; these are preserved (so
editing won't disrupt them), but they are not yet editable. Another area
for work post-December - our current plan is that they will be edited
through a metadata flyout, with auto-suggestions and so on.

Poor browser support - Right now, we have only got VisualEditor to
work in the most modern versions of Firefox, Chrome and Safari. We
will find a way to support (at least) Internet Explorer post-December,
but it's going to be a significant piece of work and we have failed to
get it ready for now.


Re: [Wikitech-l] HTMLPurifier

2012-04-24 Thread Lee Worden

I've been considering using it, so I'd be interested too.

On 04/24/2012 11:52 AM, wikitech-l-requ...@lists.wikimedia.org wrote:

From: Chris Steipp cste...@wikimedia.org
Subject: [Wikitech-l] HTMLPurifier

Hello Everyone,

Does anyone know if mediawiki has ever used HTMLPurifier (
http://htmlpurifier.org/) as a library? Or if any extensions have used it?

I'm looking at adding in a library for svg cleaning that depends on it, but
not sure if that's something that can be added in, or if I
should re-implement those features.

Thanks,

Chris
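For anyone else evaluating it: HTMLPurifier's documented entry point is small. A minimal sketch, assuming the library is installed and its autoloader file is on the include path (the tag whitelist below is purely illustrative):

```php
<?php
// Sketch of HTMLPurifier's documented usage. Assumes the library is
// installed; HTMLPurifier.auto.php is the autoloader it ships with.
require_once 'HTMLPurifier.auto.php';

function purifyFragment(string $dirty): string {
    $config = HTMLPurifier_Config::createDefault();
    // Restrict output to a small whitelist of tags and attributes.
    $config->set('HTML.Allowed', 'p,b,i,a[href],ul,ol,li');
    $purifier = new HTMLPurifier($config);
    return $purifier->purify($dirty);
}

// Disallowed markup such as script elements is stripped entirely,
// which is the behavior an SVG-cleaning layer would build on.
```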




Re: [Wikitech-l] Wikitech-l Digest, Vol 103, Issue 78

2012-02-29 Thread Lee Worden
Thibault may not know that WorkingWiki [1] handles LaTeX documents in 
wiki pages. - LW


[1] http://lalashan.mcmaster.ca/theobio/projects/index.php/WorkingWiki


From: Sumana Harihareswara suma...@wikimedia.org
Subject: [Wikitech-l] New committers schuellersa, netbrain, and
thibaultmarin

Thibault Marin (thibaultmarin) works on
http://www.mediawiki.org/wiki/Extension:TimelineTable and says, "I also
have in mind a few other extensions I would like to work on, such as
conversion of LaTeX documents to wiki pages."




Re: [Wikitech-l] RFC: Should we drop the rendering preferences for math?

2011-11-28 Thread Lee Worden


On 11/28/2011 05:46 PM, wikitech-l-requ...@lists.wikimedia.org wrote:
 Date: Mon, 28 Nov 2011 15:16:55 -0800
 From: Brion Vibber br...@pobox.com

 On Mon, Nov 28, 2011 at 3:05 PM, MZMcBride z...@mzmcbride.com wrote:

   Brion Vibber wrote:
 
 [snip my notes about removing the non-PNG non-source options, wanting
 higher-resolution renderings]

   Did you have a chance to evaluate MathJax? http://www.mathjax.org/  I
   know
   it's come up in past math discussions and that a lot of math folks think 
  it
   looks promising. A technical analysis of its feasibility on Wikimedia 
  wikis
   would be great. Killing the less-used, ancient math options is great, but
   perhaps adding one wouldn't be too bad to do too. :-)
 
 That's an excellent thing to bring up -- MathJAX *does* look very
 promising, and things seem to render pretty nicely. Need to make sure that
 we can either do that type of rendering cleanly with the PNG fallback
 (older browsers will still need the PNGs, so it may still be worth spending
 the time to fix baselines).

Not sure, but you might get better performance by translating the TeX to
MathML on the server, and having MathJax render the MathML on the client...

 Size of the library, and compatibility, could be an issue for mobile but is
 worth checking on. (Looks like it *does* work in Android and iPhone
 browsers, so has a good chance of being something we could use as a
 progressive enhancement.)

 It's actually probably a better idea to go ahead in that direction than to
 worry about high-resolution renderings from texvc for now. (Other image
 types, including icons in the UI, will still need high-resolution versions
 though.)

 -- brion




[Wikitech-l] WorkingWiki extension

2011-05-03 Thread Lee Worden
Just added to MediaWiki.org:
http://www.mediawiki.org/wiki/Extension:WorkingWiki

WorkingWiki is a software extension for MediaWiki that makes a wiki into 
a powerful environment for collaborating on publication-quality 
manuscripts and software projects.  It's designed for research labs' 
wikis, but may have diverse other uses as well.

(I probably should have made it public long ago, but there was a big 
refactor and it took a long time to settle out...)

Lee Worden
McMaster University



Re: [Wikitech-l] WYSIWYG and parser plans (was What is wrong with Wikia's WYSIWYG?)

2011-05-02 Thread Lee Worden
On 05/02/11 15:30, wikitech-l-requ...@lists.wikimedia.org wrote:
 Date: Tue, 03 May 2011 00:29:51 +0200
 From: Platonides platoni...@gmail.com
 Subject: Re: [Wikitech-l] WYSIWYG and parser plans (was What is wrong
   with Wikia's WYSIWYG?)
 To: wikitech-l@lists.wikimedia.org
 Message-ID: ipnb0i$omi$1...@dough.gmane.org
 Content-Type: text/plain; charset=ISO-8859-1

 Magnus Manske wrote:
 
   So, why not use my WYSIFTW approach? It will only parse the parts of
   the wikitext that it can turn back, edited or unedited, into wikitext,
   unaltered (including whitespace) if not manually changed. Some parts
   may therefore stay as wikitext, but it's very rare (except lists,
   which I didn't implement yet, but they look intuitive enough).
 
   Magnus
 Crazy idea: What if it was an /extensible/ editor? You could later add a
 module to enable lists, or enable graphicref, but also instruct it
 on how to present to the user some crazy template with a dozen parameters...

Seems like it will need to be extensible, to allow authors of MW 
extensions to add support for cases where they've changed the parser's 
behavior?



Re: [Wikitech-l] sessions in parallel

2009-12-07 Thread lee worden

 Hi -

 I'm debugging my extension code against potential deadlock
 conditions, and am having a problem: when I request 2 pages
 simultaneously in different firefox tabs, judging by the wfDebug
 output it seems like the second page request blocks at
 session_start() and waits until the first page is done.
 Is it supposed to do this?  Does it depend on my configuration?

 Thanks -
 Lee Worden
 McMaster University


 If I remember correctly, the PHP files session handler has
 to acquire
 an exclusive lock on the session file in session_start(). Thus
 preventing your second request until the first is complete.

 Jared

 Thanks!
 I have to log in as 2 people from 2 different browsers, I guess.
 LW


 Or configure MediaWiki & PHP to use something like MemCache.

 http://www.mediawiki.org/wiki/Manual:$wgSessionsInMemcached

 Jared

Noted, thanks.
Lee
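For reference, the standard workaround with PHP's default "files" session handler is to release the session lock as soon as the request is done with session data, rather than holding it for the whole request. A minimal sketch in plain PHP (not MediaWiki-specific):

```php
<?php
// PHP's default "files" handler takes an exclusive lock on the session
// file in session_start(), so a second request in the same session
// blocks until the first finishes. Releasing the lock early with
// session_write_close() lets requests in the same session proceed
// in parallel.
session_start();

// Read (or write) whatever session state this request needs...
$user = $_SESSION['user'] ?? 'anonymous';

// ...then release the lock before doing any slow work.
session_write_close();

// Long-running work here no longer blocks other requests that share
// the same session (note $_SESSION writes after this point are lost).
echo "done for $user\n";
```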



Re: [Wikitech-l] sessions in parallel

2009-12-06 Thread lee worden

 -Original Message-
 From: wikitech-l-boun...@lists.wikimedia.org
 [mailto:wikitech-l-boun...@lists.wikimedia.org] On Behalf Of
 lee worden
 Sent: 04 December 2009 19:14
 To: Wikimedia developers
 Subject: [Wikitech-l] sessions in parallel

 Hi -

 I'm debugging my extension code against potential deadlock
 conditions, and am having a problem: when I request 2 pages
 simultaneously in different firefox tabs, judging by the
 wfDebug output it seems like the second page request blocks
 at session_start() and waits until the first page is done.
 Is it supposed to do this?  Does it depend on my configuration?

 Thanks -
 Lee Worden
 McMaster University


 If I remember correctly, the PHP files session handler has to acquire
 an exclusive lock on the session file in session_start(). Thus
 preventing your second request until the first is complete.

 Jared

Thanks!
I have to log in as 2 people from 2 different browsers, I guess.
LW



[Wikitech-l] sessions in parallel

2009-12-04 Thread lee worden
Hi -

I'm debugging my extension code against potential deadlock conditions, and 
am having a problem: when I request 2 pages simultaneously in different 
firefox tabs, judging by the wfDebug output it seems like the second page 
request blocks at session_start() and waits until the first page is done. 
Is it supposed to do this?  Does it depend on my configuration?

Thanks -
Lee Worden
McMaster University



Re: [Wikitech-l] A potential land mine

2009-08-11 Thread lee worden


On Tue, 11 Aug 2009, dan nessett wrote:


--- On Tue, 8/11/09, Aryeh Gregor simetrical+wikil...@gmail.com wrote:


Then you're doing almost exactly the same thing we're doing
now,
except with MWInit.php instead of LocalSettings.php. 
$IP is normally
set in LocalSettings.php for most page views.  Some
places still must
figure it out independently in either case, e.g.,
config/index.php.



I want to avoid seeming obsessed by this issue, but file position 
dependent code is a significant generator of bugs in other software. The 
difference between MWInit.php and LocalSettings.php is if you get the 
former into a directory that PHP uses for includes, you have a way of 
getting the root path of MW without the caller knowing anything about 
the relative structure of the code distribution tree hierarchy. As you 
pointed out previously, the reason you need to compute $IP before 
including/requiring LocalSettings is you don't know where it is.


Dan


Placing it in the include path could make it hard to run more than one 
version of the MW code on the same server, since both would probably find 
the same file and one of them would likely end up using the other one's 
$IP.


Another way of putting it is, is it really better to hard-code the 
absolute position of the MW root rather than its position relative to the 
files in it?


lw

Re: [Wikitech-l] $300K grant for Wikimedia Commons

2009-07-03 Thread lee worden
It sure does look good!  I have a question: the extension project I'm 
working on needs a customized upload interface for its own Special: pages 
- the Special:Upload classes in the new-upload branch are exactly what I 
need.  I've been doing a similar refactoring of Special:Upload to allow me 
to use subclasses to customize the upload process.  Is anyone willing to 
risk a wild guess about when the new Special:Upload classes will replace 
the current version?


Lee Worden
McMaster University

On Thu, 2 Jul 2009, Chad wrote:


This is excellent news. For those who haven't seen it
in action, Michael Dale's work on the upload workflow
is looking awesome. I'm sure this grant will make it even
better.

-Chad

On Jul 1, 2009 11:52 PM, Steve Bennett stevag...@gmail.com wrote:

On Thu, Jul 2, 2009 at 11:30 AM, Erik Moeller e...@wikimedia.org wrote:
http://upload.wikimedia.org...

The objective of this project is to increase participation in and
contributions to Wikimedia Commons by implementing a 13-month software
development, usability testing and documentation project to improve the
interface for uploading multimedia files to Wikimedia Commons.
The deliverables should include the following key improvements:
• an integrated upload tool that can be accessed directly from the
editing window;
• Wikipedia integration of Wikimedia Commons as a repository to store
freely licensed media;
• an intelligent workflow for fair use media that are not permissible
on Wikimedia Commons;
• an upload form process that emphasizes common defaults above less
frequent use cases;
• separated instructions and tutorials for conveying key policy
information and background on copyright law and licensing.


And thank god for that! :) I can't wait.

Steve


Re: [Wikitech-l] Downloadable client fonts to aid language script support?

2009-05-04 Thread lee worden
On Mon, 4 May 2009, Brion Vibber wrote:

 It might be helpful for some language wikis to link in a free font this
 way, when standard fonts supporting their script are often unavailable.
 Right now on such sites there tends to be a little English link at the
 top such as 'font help' leading to a page like this telling you how to
 download and install a font:
 http://ta.wikipedia.org/wiki/Project:Font_help

It sounds like a good way to provide the STIX fonts for rendering MathML 
as well, since currently one has to point users to a font help page just 
like that, and hope they don't give up before they get to installing the 
fonts and seeing the nice-looking equations.



Re: [Wikitech-l] Serving as xhtml+xml

2009-03-19 Thread lee worden

This has been done before, for instance in the ASCIIMath4Wiki extension [2].
 I don't want to change the Content-type unconditionally, though, only some
of the time, so that we can serve texvc-style images to browsers or users
that don't like the modified content type.


Note that this will interfere with any kind of HTML caching, such as
Squid or file cache, and with the parser cache as well.  It won't work
correctly in most well-configured MediaWiki installs unless you make
sure to fragment the parser cache appropriately, at a minimum.


The patch seems to work with the simple caching on the system I'm using to
test.  It changes more than just the content-type - the DOCTYPE is
different and there's an <?xml declaration before the document, and a
'Vary: Accept' header.


I got all that from another author.  I'll look further into how to do it 
without patches.  It's true I can change the Content-type by setting 
$wgMimeTypes dynamically, but it only works on the first hit, and cached 
pages arrive as text/html.

[Wikitech-l] Serving as xhtml+xml

2009-03-18 Thread lee worden
I'm at work on a MW extension that, among other things, uses LaTeXML [1] 
to make XHTML from full LaTeX documents.  One feature is the option to 
render the equations in MathML, which requires the skins to be patched so 
that they output the page as Content-type: application/xhtml+xml instead 
of text/html.


Attached is a patch for the skins directory that allows changing the 
Content-type dynamically.  After applying this patch, if any code sets the 
global $wgServeAsXHTML to true, the page will be output with the xhtml+xml 
content type.  This seems to work fine with the existing MW XHTML pages.


This has been done before, for instance in the ASCIIMath4Wiki extension 
[2].  I don't want to change the Content-type unconditionally, though, 
only some of the time, so that we can serve texvc-style images to browsers 
or users that don't like the modified content type.


It should be possible to use this patch without breaking any existing 
systems (unless someone else's extension happens to use the same global 
variable name, I guess).


The patch is made on the 1:1.13.3-1ubuntu1 mediawiki package (from Ubuntu 
9.04), and only modifies Monobook.php and Modern.php.  There are other 
skins in my installation here, but they don't seem to work very well and I 
didn't see where to make the change.


Is there a better way to make MathML work in MW?  Might this option be
included in a future MW release?  Any feedback or alternative suggestions
are welcome.
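One way to serve the XHTML content type only some of the time is to negotiate on the request's Accept header and emit a matching Vary header so caches keep the two variants apart. A minimal sketch; the helper name is illustrative, not any MediaWiki API:

```php
<?php
// Serve application/xhtml+xml (needed for inline MathML) only to
// clients that advertise support for it, falling back to text/html.
// The decision is factored into a pure helper so it is easy to test.
function clientAcceptsXhtml(string $accept): bool {
    return strpos($accept, 'application/xhtml+xml') !== false;
}

$accept = $_SERVER['HTTP_ACCEPT'] ?? '';
if (clientAcceptsXhtml($accept)) {
    header('Content-Type: application/xhtml+xml; charset=UTF-8');
} else {
    header('Content-Type: text/html; charset=UTF-8');
}
// Tell caches the response depends on the Accept header, so a cached
// XHTML variant is never handed to a text/html-only client.
header('Vary: Accept');
```

This is also why the 'Vary: Accept' header mentioned above matters for the parser cache and Squid: without it, whichever variant is cached first gets served to everyone.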


Lee Worden
McMaster University Dept of Biology

[1] http://dlmf.nist.gov/LaTeXML/
[2] http://www.mediawiki.org/wiki/Extension:ASCIIMath4Wiki

ps. I'm not sure if this list accepts attachments - if not I'll be happy 
to send it to people on request.___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l