[Wikitech-l] Thoughts: stateless services with open servers?

2015-01-27 Thread Brion Vibber
We had a great discussion at the MediaWiki dev summit just now on
service-oriented architecture and future plans in that direction... as
always the community seems a bit split on the subject.

Breaking things out into separate services can indeed make setup and
maintenance more complex... pro-services/pro-VM/pro-hosting folks like me
tend to say "hey, that can be automated!" (and MediaWiki-Vagrant shows a lot
of awesomeness as a start, though it's very developer-focused).


One thing that comes to mind though is something Gabriel mentioned about
wanting stateless services for things like image manipulation and citation
parsing and whatnot -- in principle many services like this could be opened
up to public use.

For instance, a random MediaWiki instance could use a Wikimedia-hosted
image scaler or graph rasterizer.

Usage limits and API keys could be introduced if we're worried about
overloading.


Whether this can apply also to things like Parsoid might be tricky --
that's the biggest Scary Thing since core editing with VE/Flow is going to
depend on it.

Anyway, just thoughts. :)

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki Developer Summit: The future of Language Converter

2015-01-27 Thread Joel Sahleen
I would love to go to this, but the timing is not good. Like James, I'm already 
committed to another session.

There seem to be a lot of language-related issues being worked on 
independently by different groups of people who don’t necessarily communicate. 
It's hard for a newbie like me to wrap my head around it all. What could we do to 
be a bit more coordinated in our efforts? (Seriously, I’m asking.)


Joel Sahleen, Software Engineer
Language Engineering
Wikimedia Foundation
jsahl...@wikimedia.org




On Jan 27, 2015, at 11:07 AM, James Forrester jforres...@wikimedia.org wrote:

 On 27 January 2015 at 10:49, C. Scott Ananian canan...@wikimedia.org
 wrote:
 
 Well, I'll give James the opposite answer then:
 
 Language Variants are not *just* a social and cultural issue, either.
 There are real technical issues to address, having to do with how we manage
 slightly forked wikis, how we maintain code which is not used by enwiki,
 and the technical limitations of the three basic approaches which have been
 attempted to date (Content Translation tools, Language Converter, forked
 wikis with manual synchronization).
 
 So, let's talk about the technical issues at the Developer summit.  I'll
 show you the code to support LanguageConverter in Parsoid, let's talk about
 what it would take to make this work in Visual Editor and/or for HTML page
 views (targeting mobile performance).
 
 And then we can discuss the highly technical RFC regarding Glossary
 support in core mediawiki:
 
 https://www.mediawiki.org/wiki/Requests_for_comment/Scoped_language_converter
 
 I think there is plenty of technical content to discuss.  And there are
 non-technical issues to keep in mind, which should inform the technical discussion.
 
 
 ​Absolutely.​ I didn't say anything about the technical issues, which do
 need some serious discussion (though I think that an ad-hoc session might
 be too short notice).
 
 
 
 But if you don't think it's worth discussing, then don't come. ;)  There
 are three other different talks at the same time...
 
 
 ​I think it's totally worth discussing, but indeed, unfortunately I
 committed ​to another session. :-(
 
 ​J.​
 -- 
 James D. Forrester
 Product Manager, Editing
 Wikimedia Foundation, Inc.
 
 jforres...@wikimedia.org | @jdforrester
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Thoughts: stateless services with open servers?

2015-01-27 Thread C. Scott Ananian
We already run a public Parsoid service.

Kiwix uses it, as does Nell's Wikipedia.  I believe some non-WMF content
translation efforts also use it.

Usage limits and API keys would probably be a reasonable thing to do,
longer-term.
 --scott

On Tue, Jan 27, 2015 at 11:20 AM, Brion Vibber bvib...@wikimedia.org
wrote:

 On Tue, Jan 27, 2015 at 11:10 AM, James Forrester 
 jforres...@wikimedia.org
 wrote:

  On 27 January 2015 at 11:04, Brion Vibber bvib...@wikimedia.org wrote:
 
   Whether this can apply also to things like Parsoid might be tricky --
   that's the biggest Scary Thing since core editing with VE/Flow is going
  to
   depend on it.
  
 
  ​Running Parsoid as a public service (with some soft-ish API limits)
 would
  allow us to support the oft-cited user who has a dumb PHP-only box and no
  means to install a node service, so that has my support;


 Yay!


  however, I worry
  that WMF might not be the best organisation to provide this if people
  wanted it at large for commercial use.
 

 Agreed... but if not us, then who?

 /me looks around at folks, wonders if anyone wants to commit to running
 such a service as a third-party that we could make super-easy for shared
 PHP-host users to use...

 -- brion
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
(http://cscott.net)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki Developer Summit: The future of Language Converter

2015-01-27 Thread C. Scott Ananian
Asaf, wctaiwan: I totally understand the social and cultural implications
-- that is indeed one of the key things I wish to discuss during this
session so that people understand that language variants are not *just* a
technical issue.

For example, I believe that Urdu and Hindi are also amenable to
LanguageConverter, but the political issues in that case make it highly
unlikely that the two languages would wish to share a wiki.

I think Asaf's question was well-posed: let's see what the technical issues
are, so that the community can have an informed discussion.
  --scott
​
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki Developer Summit: The future of Language Converter

2015-01-27 Thread James Forrester
On 27 January 2015 at 10:32, C. Scott Ananian canan...@wikimedia.org
wrote:

 Asaf, wctaiwan: I totally understand the social and cultural implications
 -- that is indeed one of the key things I wish to discuss during this
 session so that people understand that language variants are not *just* a
 technical issue.


​For precisely this reason, ​​I don't think it would be an appropriate
conversation to have at a technical event like MWDS.​

​J.​
-- 
James D. Forrester
Product Manager, Editing
Wikimedia Foundation, Inc.

jforres...@wikimedia.org | @jdforrester
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki Developer Summit: The future of Language Converter

2015-01-27 Thread James Forrester
On 27 January 2015 at 10:49, C. Scott Ananian canan...@wikimedia.org
wrote:

 Well, I'll give James the opposite answer then:

 Language Variants are not *just* a social and cultural issue, either.
 There are real technical issues to address, having to do with how we manage
 slightly forked wikis, how we maintain code which is not used by enwiki,
 and the technical limitations of the three basic approaches which have been
 attempted to date (Content Translation tools, Language Converter, forked
 wikis with manual synchronization).

 So, let's talk about the technical issues at the Developer summit.  I'll
 show you the code to support LanguageConverter in Parsoid, let's talk about
 what it would take to make this work in Visual Editor and/or for HTML page
 views (targeting mobile performance).

 And then we can discuss the highly technical RFC regarding Glossary
 support in core mediawiki:

 https://www.mediawiki.org/wiki/Requests_for_comment/Scoped_language_converter

 I think there is plenty of technical content to discuss.  And there are
 non-technical issues to keep in mind, which should inform the technical discussion.


​Absolutely.​ I didn't say anything about the technical issues, which do
need some serious discussion (though I think that an ad-hoc session might
be too short notice).



 But if you don't think it's worth discussing, then don't come. ;)  There
 are three other different talks at the same time...


​I think it's totally worth discussing, but indeed, unfortunately I
committed ​to another session. :-(

​J.​
-- 
James D. Forrester
Product Manager, Editing
Wikimedia Foundation, Inc.

jforres...@wikimedia.org | @jdforrester
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Thoughts: stateless services with open servers?

2015-01-27 Thread Brion Vibber
On Tue, Jan 27, 2015 at 11:10 AM, James Forrester jforres...@wikimedia.org
wrote:

 On 27 January 2015 at 11:04, Brion Vibber bvib...@wikimedia.org wrote:

  Whether this can apply also to things like Parsoid might be tricky --
  that's the biggest Scary Thing since core editing with VE/Flow is going
 to
  depend on it.
 

 ​Running Parsoid as a public service (with some soft-ish API limits) would
 allow us to support the oft-cited user who has a dumb PHP-only box and no
 means to install a node service, so that has my support;


Yay!


 however, I worry
 that WMF might not be the best organisation to provide this if people
 wanted it at large for commercial use.


Agreed... but if not us, then who?

/me looks around at folks, wonders if anyone wants to commit to running
such a service as a third-party that we could make super-easy for shared
PHP-host users to use...

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Thoughts: stateless services with open servers?

2015-01-27 Thread Brion Vibber
Another possibility is to shell out to nodejs-based services as an
alternative to running them as ongoing web services.

This may not be super performant, but it should work -- just as we've been
able to shell out to system binaries, Python scripts, ocaml, lua, etc for
years. Would require having node *present* on the system but wouldn't
require running a web service.

Something to consider trying...
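A rough sketch of what the PHP side might look like, assuming a hypothetical
citations.js script that reads wikitext on stdin and writes JSON to stdout
(the script name and its I/O contract are made up here, just to show the shape):

  // Hypothetical: shell out to a node script the same way we already shell
  // out to other binaries. 'citations.js' and $wikitext are assumed inputs.
  $cmd = '/usr/bin/node ' . escapeshellarg( __DIR__ . '/citations.js' );
  $descriptors = array(
      0 => array( 'pipe', 'r' ),  // stdin: we write the wikitext
      1 => array( 'pipe', 'w' ),  // stdout: the script writes JSON
  );
  $proc = proc_open( $cmd, $descriptors, $pipes );
  if ( is_resource( $proc ) ) {
      fwrite( $pipes[0], $wikitext );
      fclose( $pipes[0] );
      $json = stream_get_contents( $pipes[1] );
      fclose( $pipes[1] );
      $exitCode = proc_close( $proc );
      $result = ( $exitCode === 0 ) ? json_decode( $json, true ) : null;
  }

Every call pays the node startup cost, which is the performance caveat above.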

-- brion

On Tue, Jan 27, 2015 at 11:20 AM, Brion Vibber bvib...@wikimedia.org
wrote:

 On Tue, Jan 27, 2015 at 11:10 AM, James Forrester 
 jforres...@wikimedia.org wrote:

 On 27 January 2015 at 11:04, Brion Vibber bvib...@wikimedia.org wrote:

  Whether this can apply also to things like Parsoid might be tricky --
  that's the biggest Scary Thing since core editing with VE/Flow is going
 to
  depend on it.
 

 ​Running Parsoid as a public service (with some soft-ish API limits) would
 allow us to support the oft-cited user who has a dumb PHP-only box and no
 means to install a node service, so that has my support;


 Yay!


 however, I worry
 that WMF might not be the best organisation to provide this if people
 wanted it at large for commercial use.


 Agreed... but if not us, then who?

 /me looks around at folks, wonders if anyone wants to commit to running
 such a service as a third-party that we could make super-easy for shared
 PHP-host users to use...

 -- brion

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Thoughts: stateless services with open servers?

2015-01-27 Thread Luis Villa
On Tue, Jan 27, 2015 at 11:20 AM, Brion Vibber bvib...@wikimedia.org
wrote:

  ​Running Parsoid as a public service (with some soft-ish API limits)
 would
  allow us to support the oft-cited user who has a dumb PHP-only box and no
  means to install a node service, so that has my support;


 Yay!


  however, I worry
  that WMF might not be the best organisation to provide this if people
  wanted it at large for commercial use.
 

 Agreed... but if not us, then who?

 /me looks around at folks, wonders if anyone wants to commit to running
 such a service as a third-party that we could make super-easy for shared
 PHP-host users to use...


This is essentially the wordpress.com model for things like anti-spam and
analytics, right? Not that I'm saying they would do it, but clearly it's a
workable model at some scale.

Luis


-- 
Luis Villa
Deputy General Counsel
Wikimedia Foundation
415.839.6885 ext. 6810

*This message may be confidential or legally privileged. If you have
received it by accident, please delete it and let us know about the
mistake. As an attorney for the Wikimedia Foundation, for legal/ethical
reasons I cannot give legal advice to, or serve as a lawyer for, community
members, volunteers, or staff members in their personal capacity. For more
on what this means, please see our legal disclaimer
https://meta.wikimedia.org/wiki/Wikimedia_Legal_Disclaimer.*
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Thoughts: stateless services with open servers?

2015-01-27 Thread Subramanya Sastry

On 01/27/2015 11:46 AM, C. Scott Ananian wrote:

We already run a public Parsoid service.


But this doesn't serve wiki content from wikis other than Wikimedia 
wikis, and we are unlikely to do so with the existing production WMF cluster.


The discussion is about whether we should/will run a different cluster for 
smallish wikis.


Subbu.



Kiwix uses it, as does Nell's Wikipedia.  I believe some non-WMF content
translation efforts also use it.

Usage limits and API keys would probably be a reasonable thing to do,
longer-term.
  --scott

On Tue, Jan 27, 2015 at 11:20 AM, Brion Vibber bvib...@wikimedia.org
wrote:


On Tue, Jan 27, 2015 at 11:10 AM, James Forrester 
jforres...@wikimedia.org
wrote:


On 27 January 2015 at 11:04, Brion Vibber bvib...@wikimedia.org wrote:


Whether this can apply also to things like Parsoid might be tricky --
that's the biggest Scary Thing since core editing with VE/Flow is going

to

depend on it.


​Running Parsoid as a public service (with some soft-ish API limits)

would

allow us to support the oft-cited user who has a dumb PHP-only box and no
means to install a node service, so that has my support;


Yay!



however, I worry
that WMF might not be the best organisation to provide this if people
wanted it at large for commercial use.


Agreed... but if not us, then who?

/me looks around at folks, wonders if anyone wants to commit to running
such a service as a third-party that we could make super-easy for shared
PHP-host users to use...

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l







___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Thoughts: stateless services with open servers?

2015-01-27 Thread C. Scott Ananian
On Tue, Jan 27, 2015 at 11:46 AM, Brion Vibber bvib...@wikimedia.org
wrote:

 Another possibility is to shell out to nodejs-based services as an
 alternative to running them as ongoing web services.


Parsoid currently does a lot of MediaWiki API querying at startup to get
the current wiki settings.  As a one-off service, this makes it rather
sluggish.  (We have a `parse.js` test binary that lets you experience this
for yourself, if you like.)

I think it's more reasonable to embed v8 into PHP (or vice versa) as an
extension, which lets us amortize this startup cost over a number of
requests.
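For reference, a minimal sketch of what embedding V8 in PHP looks like with
the PECL v8js extension -- this is just the raw extension API under that
assumption, not a claim about how Parsoid itself would be wired up:

  // Requires the PECL v8js extension; $jsSource is assumed to hold the
  // bundled JavaScript we want to run in-process.
  $v8 = new V8Js();
  try {
      // State lives in this V8Js instance, so repeated calls within one
      // PHP worker avoid paying the full startup cost on every request.
      $result = $v8->executeString( $jsSource, 'service.js' );
  } catch ( Exception $e ) {
      wfDebugLog( 'v8js', $e->getMessage() );
  }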
  --scott

-- 
(http://cscott.net)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki Developer Summit: The future of Language Converter

2015-01-27 Thread wctaiwan
I think any decision to split wikis should not be made without
consensus within the affected communities. From a technical
standpoint, it would be much easier to decide in favour of a split,
but I'm worried that the social ramifications of such a decision would
be overlooked in a discussion among developers. It might be a good
idea to present the challenges of continuing to maintain Language
Converter to the communities, but if, for example, it turns out that
having a single Chinese-language wiki is better for our free knowledge
goals, then it might still be worth investing the time needed to
overcome the technical hurdles.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Odd API errors spotted this morning (around 08:25 UTC to 10:20 UTC)

2015-01-27 Thread Brad Jorsch (Anomie)
Something seems to have gone weird with the API this morning, to the point
where something like this would have intermittently failed on enwiki:

 // The POST parameter 'cmtitle' has value 'Category:Articles to be expanded'
 $title = RequestContext::getMain()->getRequest()->getVal( 'cmtitle' );
 $titleObj = Title::newFromText( $title );
 assert( $titleObj->getNamespace() === NS_CATEGORY );

I don't see anything around that timeframe that would have made that
happen. Does anyone have any ideas?


-- 
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki Developer Summit: The future of Language Converter

2015-01-27 Thread C. Scott Ananian
Well, I'll give James the opposite answer then:

Language Variants are not *just* a social and cultural issue, either.
There are real technical issues to address, having to do with how we manage
slightly forked wikis, how we maintain code which is not used by enwiki,
and the technical limitations of the three basic approaches which have been
attempted to date (Content Translation tools, Language Converter, forked
wikis with manual synchronization).

So, let's talk about the technical issues at the Developer summit.  I'll
show you the code to support LanguageConverter in Parsoid, let's talk about
what it would take to make this work in Visual Editor and/or for HTML page
views (targeting mobile performance).

And then we can discuss the highly technical RFC regarding Glossary
support in core mediawiki:
https://www.mediawiki.org/wiki/Requests_for_comment/Scoped_language_converter

I think there is plenty of technical content to discuss.  And there are
non-technical issues to keep in mind, which should inform the technical discussion.

But if you don't think it's worth discussing, then don't come. ;)  There
are three other different talks at the same time...
 --scott
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Performance of parsing links?

2015-01-27 Thread Chad
On Tue Jan 27 2015 at 1:37:36 PM Brion Vibber bvib...@wikimedia.org wrote:

 Probably the fastest thing would be to manually create the <ul>/<li> etc. and
 wrap them around a loop calling the linker functions (Linker::link).

 https://doc.wikimedia.org/mediawiki-core/master/php/html/classLinker.html#a52523fb9f10737404b1dfa45bab61045


Another option could be using LinkBatch.
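Roughly, and assuming the Title objects are already in hand, LinkBatch
pre-fetches page existence in one query so the later per-link rendering
doesn't hit the database once per row -- a sketch:

  // Sketch only: warm the link cache with a single batch query before
  // rendering, so each link doesn't trigger its own existence lookup.
  $batch = new LinkBatch();
  foreach ( $titles as $title ) {
      $batch->addObj( $title );
  }
  $batch->execute();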

-Chad
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Performance of parsing links?

2015-01-27 Thread Brion Vibber
Probably the fastest thing would be to manually create the <ul>/<li> etc. and
wrap them around a loop calling the linker functions (Linker::link).

https://doc.wikimedia.org/mediawiki-core/master/php/html/classLinker.html#a52523fb9f10737404b1dfa45bab61045
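Something like this, roughly -- the $people array (user name => display text)
and the flat structure are assumptions just to illustrate the shape, not
tested code:

  // Rough sketch: build the list HTML directly and call Linker::link()
  // per entry instead of emitting wikitext for the parser to re-parse.
  $html = '<ul>';
  foreach ( $people as $userName => $displayText ) {
      $title = Title::makeTitleSafe( NS_USER, $userName );
      if ( $title ) {
          $html .= '<li>' . Linker::link( $title, htmlspecialchars( $displayText ) ) . '</li>';
      }
  }
  $html .= '</ul>';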

-- brion

On Tue, Jan 27, 2015 at 1:33 PM, Daniel Barrett d...@cimpress.com wrote:

 I'm writing a parser function extension that outputs about 5000 lines of
 text (an organizational chart of a company) as a nested, bulleted list.

 * Bob the CEO
 ** Jane Jones
 ** Mike Smith
 *** etc.

 It takes about 3 seconds (real time) for MediaWiki to render this list,
 which is acceptable. However, if I make it a list of links, which is more
 useful:

 * [[User:Bob | Bob the CEO]]
 ** [[User:Jane | Jane Jones]]
 ** [[User:Mike | Mike Smith]]

 the rendering time more than doubles to 6-8 seconds, which users perceive
 as too slow.

 Is there a faster implementation for rendering a large number of links,
 rather than returning the wikitext list and having MediaWiki render it?

 Thanks,
 DanB

 
 My email address has changed to d...@cimpress.com. Please update your
 address book.

 Cimpress is the new name for Vistaprint NV, the world’s leader in mass
 customization. Read more about Cimpress at www.cimpress.com.
 
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Performance of parsing links?

2015-01-27 Thread Daniel Friesen
You should be able to return something like this to make your parser
function output raw HTML instead of WikiText.

return array( $output, 'noparse' => true, 'isHTML' => true );
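As a fuller (hypothetical) sketch of the handler, where buildOrgChartHtml()
stands in for whatever produces the <ul>/<li> markup:

  // Hypothetical parser function handler: return pre-built HTML and tell
  // the parser not to re-parse it as wikitext.
  function renderOrgChart( Parser $parser /* , ...more args */ ) {
      $output = buildOrgChartHtml();  // assumed helper, not a real function
      return array( $output, 'noparse' => true, 'isHTML' => true );
  }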

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]

On 2015-01-27 1:33 PM, Daniel Barrett wrote:
 I'm writing a parser function extension that outputs about 5000 lines of text 
 (an organizational chart of a company) as a nested, bulleted list.

 * Bob the CEO
 ** Jane Jones
 ** Mike Smith
 *** etc.

 It takes about 3 seconds (real time) for MediaWiki to render this list, which 
 is acceptable. However, if I make it a list of links, which is more useful:

 * [[User:Bob | Bob the CEO]]
 ** [[User:Jane | Jane Jones]]
 ** [[User:Mike | Mike Smith]]

 the rendering time more than doubles to 6-8 seconds, which users perceive as 
 too slow.

 Is there a faster implementation for rendering a large number of links, 
 rather than returning the wikitext list and having MediaWiki render it?

 Thanks,
 DanB

 
 My email address has changed to d...@cimpress.com. Please update your address 
 book.

 Cimpress is the new name for Vistaprint NV, the world’s leader in mass 
 customization. Read more about Cimpress at www.cimpress.com.
 
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Thoughts: stateless services with open servers?

2015-01-27 Thread Stas Malyshev
Hi!

 ​Running Parsoid as a public service (with some soft-ish API limits) would
 allow us to support the oft-cited user who has a dumb PHP-only box and no
 means to install a node service, so that has my support; however, I worry
 that WMF might not be the best organisation to provide this if people
 wanted it at large for commercial use.

I think the prospect of having publicly-accessible services like Parsoid
usable by smaller wiki installs is very cool and exciting. Though it
does not solve all cases - e.g. for an internal company wiki (which I
personally have used extensively as a medium for capturing internal docs,
discussions, plans, etc.) there may be legal and confidentiality reasons
that do not allow using services that cross the firewall.

-- 
Stas Malyshev
smalys...@wikimedia.org

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Thoughts: stateless services with open servers?

2015-01-27 Thread Daniel Friesen
On 2015-01-27 11:04 AM, Brion Vibber wrote:
 One thing that comes to mind though is something Gabriel mentioned about
 wanting stateless services for things like image manipulation and citation
 parsing and whatnot -- in principle many services like this could be opened
 up to public use.

 For instance, a random MediaWiki instance could use a Wikimedia-hosted
 image scaler or graph rasterizer.
This!

This is something I've thought about too but never got around to writing
about.

I found one of Automattic's models rather interesting.

Everyone knows about WordPress.com where they have full free and premium
hosting of WordPress (with limits on plugins, etc...).

But another one of their projects is the Jetpack plugin.

The plugin has multiple features that can be enabled and disabled.
- Some of them are simple small WordPress plugins that add local features.
- Others integrate with services hosted on WordPress.com, giving a third-party
installation features like WordPress.com-hosted stats, use of
Automattic's CDN, WordPress.com login, and warnings when your site is
offline.
- And then there are a few hosted premium features like video hosting
and an automated backup service.

Simple extension(s) that quickly and easily allow a hosted service to
enhance a local MediaWiki installation could be a useful
addition to the MediaWiki ecosystem.

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Thoughts: stateless services with open servers?

2015-01-27 Thread Daniel Friesen
On 2015-01-27 12:51 PM, Stas Malyshev wrote:
 Hi!

 ​Running Parsoid as a public service (with some soft-ish API limits) would
 allow us to support the oft-cited user who has a dumb PHP-only box and no
 means to install a node service, so that has my support; however, I worry
 that WMF might not be the best organisation to provide this if people
 wanted it at large for commercial use.
 I think the prospect of having publicly-accessible services like Parsoid
 usable by smaller wiki installs is very cool and exciting. Though it
 does not solve all cases - e.g. for an internal company wiki (which I
 personally have used extensively as a medium for capturing internal docs,
 discussions, plans, etc.) there may be legal and confidentiality reasons
 that do not allow using services that cross the firewall.
Companies running internal wikis naturally aren't using shared hosting.

They would probably be better served with alternative setups that make
it easy to set up and run the services locally.

Something for things like Vagrant, Docker, Turnkey, etc...

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Performance of parsing links?

2015-01-27 Thread Daniel Barrett
I'm writing a parser function extension that outputs about 5000 lines of text 
(an organizational chart of a company) as a nested, bulleted list.

* Bob the CEO
** Jane Jones
** Mike Smith
*** etc.

It takes about 3 seconds (real time) for MediaWiki to render this list, which 
is acceptable. However, if I make it a list of links, which is more useful:

* [[User:Bob | Bob the CEO]]
** [[User:Jane | Jane Jones]]
** [[User:Mike | Mike Smith]]

the rendering time more than doubles to 6-8 seconds, which users perceive as 
too slow.

Is there a faster implementation for rendering a large number of links, rather 
than returning the wikitext list and having MediaWiki render it?

Thanks,
DanB


My email address has changed to d...@cimpress.com. Please update your address 
book.

Cimpress is the new name for Vistaprint NV, the world’s leader in mass 
customization. Read more about Cimpress at www.cimpress.com.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Archcom plans

2015-01-27 Thread Brion Vibber
I just want to thank everybody who participated in the 'future of the
architecture committee' session at the dev summit today -- we had some
great discussion and good ideas and concerns from a lot of people.


Damon has proposed that the committee plan a provisional governance model
formalizing the Archcom and clarifying what we do, what we don't do, and
what people expect out of us beyond looking over RfCs.

The provisional plan is to have something in place, or ready to ratify, by the
European hackathon in Lyon (in May:
https://www.mediawiki.org/wiki/Lyon_Hackathon_2015 )


More details to come as we work stuff out -- I think we'll want to do some
open discussion on the subject, so don't be shy with ideas, constructive
criticism, etc!

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] All Hands On Deck!

2015-01-27 Thread Jay Ashworth
- Original Message -
 From: Mark A. Hershberger m...@nichework.com

  Can we get a simple English version of the poetry there?
 
 As others have pointed out poetry doesn't translate well, but I've
 managed to come up with an interpretation that I hope is easy for a
 non-native speaker to understand.
 
 (I admit feeling safer doing this since I wasn't at the all staff and
 have no clue what this is about. Burn me at the stake later.)
 
 --- begin ---

 In the end, we'll probably just end up doing our job like we have
 been.
 --- end ---
 
 Hope that is helpful,

So I was right: it was a scoff.  :-)

Cheers,
-- jra
-- 
Jay R. Ashworth  Baylink   j...@baylink.com
Designer The Things I Think   RFC 2100
Ashworth & Associates   http://www.bcp38.info  2000 Land Rover DII
St Petersburg FL USA  BCP38: Ask For It By Name!   +1 727 647 1274

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Please Review These Patches

2015-01-27 Thread Tyler Romeo
I don’t usually email the list for this kind of stuff, but if you have some 
spare time to test an extension, please check out this patch chain. I’d like to 
get it merged before May (when they become a year old).

In order of dependency:
https://gerrit.wikimedia.org/r/132783
https://gerrit.wikimedia.org/r/132784
https://gerrit.wikimedia.org/r/134050
https://gerrit.wikimedia.org/r/134789
https://gerrit.wikimedia.org/r/135597

As a description, these five patches severely refactor Extension:OATHAuth, in 
hopes that I can get it deployed on-wiki so that enwiki users can use 
two-factor authentication.

Thanks,
-- 
Tyler Romeo
0x405D34A7C86B42DF


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] All Hands On Deck!

2015-01-27 Thread Mark A. Hershberger
Antoine Musso hashar+...@free.fr writes:

 On 23/01/2015 09:11, littlewmfb...@yandex.com wrote:
 Oh what a day! Which began when perforce
 a visitor from afar began to exhort
 snip
 https://lists.wikimedia.org/pipermail/wikitech-l/2015-January/080300.html

 Can we get a simple English version of the poetry there?

As others have pointed out poetry doesn't translate well, but I've
managed to come up with an interpretation that I hope is easy for a
non-native speaker to understand.

(I admit feeling safer doing this since I wasn't at the all staff and
have no clue what this is about.  Burn me at the stake later.)

--- begin ---
What a day it has been!  We had an outside speaker who told us
a lot of feel-good things -- learn, travel, see new things.

So we walked around San Francisco and talked to each other about
ourselves.

What a relief that we didn't *really* have to consider new thoughts or
ways of seeing things.

This made us feel great despite some obvious things, like problems with
our projects, our horrible user experience that seems to ignore the
last 10 years of the Social Web, and dissatisfied users who write long
rants.

Next we talked about strategy and new people without experience in the
project told us everything we had done wrong.

Three commands we were given:
1. Be fast *and* correct!
(But this seems to ignore the reality of how long it takes to create
correct code.)

2. Innovate!
(As a pessimist, I don't see this as being in line with the
conservative nature of our current users, and new editors and processes
don't just appear.)

3. Integrate, engage with the community
(But this seems to assume that there is a cohesive community, instead
of the enormous conflicts and chaos that look like the reality.  They
told us to engage with the community, but this didn't seem to recognize
the times we've tried and got beat up by the community.)

In the end, we'll probably just end up doing our job like we have been.
--- end ---

Hope that is helpful,

Mark.

-- 
Mark A. Hershberger
NicheWork LLC
717-271-1084

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Thoughts: stateless services with open servers?

2015-01-27 Thread Federico Leva (Nemo)
The only true precedent we have here is the public Collection server run 
by PediaPress, which is a wonderful service that has been used by several 
hundred wikis for years. It ensures a comparatively smooth installation of 
the Collection extension even for the most niche formats.

https://www.mediawiki.org/wiki/Extension:Collection#Generating_PDFs.2C_OpenDocument-_.26_DocBook-Exports

One could perhaps also consider the DNSBL services run by Spamhaus and others: 
https://www.mediawiki.org/wiki/Manual:Combating_spam#DNSBL But I don't 
know how many wikis are using them, and they're not really effective, so we 
don't really have a service comparable to Akismet. The Antispam 
extension claims to be one, but at the moment it's used by only 3 wikis.
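For anyone curious what a DNSBL check involves, it's just a DNS lookup: reverse
the IPv4 octets and query them under the blacklist zone. A generic sketch, not
MediaWiki's own implementation:

  // Generic DNSBL lookup sketch: a listed IP resolves under the zone.
  $ip = '192.0.2.1';  // example address
  $reversed = implode( '.', array_reverse( explode( '.', $ip ) ) );
  $listed = checkdnsrr( $reversed . '.zen.spamhaus.org.', 'A' );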


Nemo

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] DevSummit appreciation

2015-01-27 Thread Erik Moeller
Just a quick note that I really appreciated everyone's help making the
summit come together. As always, we'll be doing lots of second-guessing of
everything we did and didn't do, and how we want to use future time
together. Before we go into that, I'd like to thank the event team and
_everyone_ who worked to and beyond the point of exhaustion to organize the
event, support attendees, plan sessions, facilitate conversations, and
negotiate sometimes difficult terrain.

Thank you. :)

Erik

-- 
Erik Möller
VP of Product & Strategy, Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] DevSummit appreciation

2015-01-27 Thread Joel Sahleen
Hear, hear! Thanks to everyone who helped put this event together. Great job!

Joel Sahleen, Software Engineer
Language Engineering
Wikimedia Foundation
jsahl...@wikimedia.org




On Jan 27, 2015, at 10:43 PM, Erik Moeller e...@wikimedia.org wrote:

 Just a quick note that I really appreciated everyone's help making the summit 
 come together. As always, we'll be doing lots of second-guessing of 
 everything we did and didn't do, and how we want to use future time together. 
 Before we go into that, I'd like to thank the event team and _everyone_ who 
 worked to and beyond the point of exhaustion to organize the event, support 
 attendees, plan sessions, facilitate conversations, negotiate sometimes 
 difficult terrain. 
 
 Thank you. :)
 
 Erik
 
 -- 
 Erik Möller
 VP of Product & Strategy, Wikimedia Foundation
 ___
 Engineering mailing list
 engineer...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/engineering

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikidata search provider for GNOME Shell

2015-01-27 Thread Jon Robson
Hey Baha! How can I see what this is like without a Gnome environment?

On Fri, Jan 23, 2015 at 2:28 AM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
 On Fri, Jan 23, 2015 at 6:24 AM, Bahodir Mansurov
 bmansu...@wikimedia.org wrote:
 I’ve created a GNOME Shell extension that allows the user to search for 
 Wikidata items directly from the shell. Currently you can search for simple 
 things such as “Obama”, “Book”, etc. I plan on adding support for complex 
 queries such as “the population of the earth” which would show the current 
 population of the earth. In the future I also see this extension handle the 
 submission of new entries or editing existing ones. Check it out here [1] if 
 you’re interested. Pull requests are also welcome ;)

 [1] https://github.com/6ahodir/wikidata-search-provider

 Thanks, Bahodir!
 Forwarding to the Wikidata mailing list so they get to see this too.


 Cheers
 Lydia

 --
 Lydia Pintscher - http://about.me/lydia.pintscher
 Product Manager for Wikidata

 Wikimedia Deutschland e.V.
 Tempelhofer Ufer 23-24
 10963 Berlin
 www.wikimedia.de

 Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

 Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
 unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
 Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l



-- 
Jon Robson
* http://jonrobson.me.uk
* https://www.facebook.com/jonrobson
* @rakugojon

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikidata search provider for GNOME Shell

2015-01-27 Thread Bahodir Mansurov
You can see a screenshot here [1]. It’s similar to Spotlight search on a Mac.

[1] https://extensions.gnome.org/extension/924/wikidata-search-provider/ 
https://extensions.gnome.org/extension/924/wikidata-search-provider/

 On Jan 27, 2015, at 9:02 AM, Jon Robson jdlrob...@gmail.com wrote:
 
 Hey Baha! How can I see what this is like without a Gnome environment?
 
 On Fri, Jan 23, 2015 at 2:28 AM, Lydia Pintscher
 lydia.pintsc...@wikimedia.de wrote:
 On Fri, Jan 23, 2015 at 6:24 AM, Bahodir Mansurov
 bmansu...@wikimedia.org wrote:
 I’ve created a GNOME Shell extension that allows the user to search for 
 Wikidata items directly from the shell. Currently you can search for simple 
 things such as “Obama”, “Book”, etc. I plan on adding support for complex 
 queries such as “the population of the earth” which would show the current 
 population of the earth. In the future I also see this extension handle the 
 submission of new entries or editing existing ones. Check it out here [1] 
 if you’re interested. Pull requests are also welcome ;)
 
 [1] https://github.com/6ahodir/wikidata-search-provider
 
 Thanks, Bahodir!
 Forwarding to the Wikidata mailing list so they get to see this too.
 
 
 Cheers
 Lydia
 
 --
 Lydia Pintscher - http://about.me/lydia.pintscher
 Product Manager for Wikidata
 
 Wikimedia Deutschland e.V.
 Tempelhofer Ufer 23-24
 10963 Berlin
 www.wikimedia.de
 
 Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
 
 Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
 unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
 Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
 
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
 
 
 -- 
 Jon Robson
 * http://jonrobson.me.uk
 * https://www.facebook.com/jonrobson
 * @rakugojon
 
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikidata search provider for GNOME Shell

2015-01-27 Thread Jon Robson
Nice!

On Tue, Jan 27, 2015 at 9:05 AM, Bahodir Mansurov
bmansu...@wikimedia.org wrote:
 You can see a screenshot here [1]. It’s similar to spotlight search on a mac.

 [1] https://extensions.gnome.org/extension/924/wikidata-search-provider/ 
 https://extensions.gnome.org/extension/924/wikidata-search-provider/

 On Jan 27, 2015, at 9:02 AM, Jon Robson jdlrob...@gmail.com wrote:

 Hey Baha! How can I see what this is like without a Gnome environment?

 On Fri, Jan 23, 2015 at 2:28 AM, Lydia Pintscher
 lydia.pintsc...@wikimedia.de wrote:
 On Fri, Jan 23, 2015 at 6:24 AM, Bahodir Mansurov
 bmansu...@wikimedia.org wrote:
 I’ve created a GNOME Shell extension that allows the user to search for 
 Wikidata items directly from the shell. Currently you can search for 
 simple things such as “Obama”, “Book”, etc. I plan on adding support for 
 complex queries such as “the population of the earth” which would show the 
 current population of the earth. In the future I also see this extension 
 handle the submission of new entries or editing existing ones. Check it 
 out here [1] if you’re interested. Pull requests are also welcome ;)

 [1] https://github.com/6ahodir/wikidata-search-provider

 Thanks, Bahodir!
 Forwarding to the Wikidata mailing list so they get to see this too.


 Cheers
 Lydia

 --
 Lydia Pintscher - http://about.me/lydia.pintscher
 Product Manager for Wikidata

 Wikimedia Deutschland e.V.
 Tempelhofer Ufer 23-24
 10963 Berlin
 www.wikimedia.de

 Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

 Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
 unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
 Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l



 --
 Jon Robson
 * http://jonrobson.me.uk
 * https://www.facebook.com/jonrobson
 * @rakugojon

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l



-- 
Jon Robson
* http://jonrobson.me.uk
* https://www.facebook.com/jonrobson
* @rakugojon

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] MediaWiki Developer Summit: The future of Language Converter

2015-01-27 Thread C. Scott Ananian
Another late addition to the developer summit:  let's talk about language
converter, content translation tools, and other ways to bridge language
differences between projects.

2:45pm PST today (Tuesday) in Robertson 3.

Task: https://phabricator.wikimedia.org/T87652

Here's a list of all the wikis currently using (or considering using)
Language Converter:
https://www.mediawiki.org/wiki/Writing_systems#LanguageConverter

Some provocative questions to spur your attendance:

1. Should zhwiki split for mainland China and Taiwan?  What about Malay?

2. Should we be using LanguageConverter on enwiki to handle
British/Australian/Indian/American spelling differences?

3. Perhaps Swiss German and High German should be different wikis? Or use
LanguageConverter? Or use Content Translation tools?

4. How do we best edit content which has slight localized differences?
Templates?  Interface strings?  Can we scale our tools across this range?

See you there!
  --scott



-- 
(http://cscott.net)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l