Re: [Wikitech-l] !ask

2014-01-19 Thread Petr Bena
This can of course be improved, if someone is interested:

https://github.com/benapetr/wikimedia-bot/blob/master/plugins/slap/slap/Class1.cs


Re: [Wikitech-l] !ask

2014-01-19 Thread Petr Bena
There is one more feature available on wm-bot.

First of all, the bot is smart enough to recognize who is an IRC
newbie and who is not. It is possible to direct this only at people
who are known to the bot (their cloak is trusted), so that it doesn't
bite the newbies but rather "slaps" the experienced users who keep bad
habits. The bot can recognize a message that merely asks whether it's
OK to ask, and automatically tells the person to just ask the question
instead, for example:

[08:23] <petan2> hi I have a question
[08:23] <+wm-bot> Hi petan2, just ask! There is no need to ask if you can ask

[08:24] <petan2> is someone here
[08:24] <+wm-bot> Hi petan2, I am here, if you need anything, please
ask, otherwise no one is going to help you... Thank you

I admit that having a bot for this may not be the best solution;
having more active helpers would be better. But I think that people
who just come to a channel, say "someone here?", and quit after five
minutes of no response are even worse.


Re: [Wikitech-l] !ask

2014-01-19 Thread Jay Ashworth
- Original Message -
> From: "MZMcBride" 

> >I've used '!ask' a lot before but I'm going to stop. I hope others do
> >the same.
> 
> So really you're going for !!ask.
> 
> "!ask" used to be more direct (some would say meaner) and I'm a little
> sad to see it go away, but I suppose snotty replies can always be made
> manually. ;-)

Well, perhaps if it expanded as "IRC is a worldwide network, and many users
are logged into it while doing other things.  Ask your question, supplying 
as much detail as possible, and one of them may reply, even if it takes a 
while.  In other words: don't ask to ask, just ask."...?

Cheers,
-- jra
-- 
Jay R. Ashworth  Baylink   j...@baylink.com
Designer The Things I Think   RFC 2100
Ashworth & Associates   http://www.bcp38.info  2000 Land Rover DII
St Petersburg FL USA  BCP38: Ask For It By Name!   +1 727 647 1274


[Wikitech-l] Bugzilla Weekly Report

2014-01-19 Thread reporter
MediaWiki Bugzilla Report for January 13, 2014 - January 20, 2014

Status changes this week

Reports changed/set to UNCONFIRMED:  6 
Reports changed/set to NEW:  23
Reports changed/set to ASSIGNED   :  40
Reports changed/set to REOPENED   :  9 
Reports changed/set to PATCH_TO_RE:  74
Reports changed/set to RESOLVED   :  218   
Reports changed/set to VERIFIED   :  2 

Total reports still open  : 13680 
Total bugs still open : 7998  
Total non-lowest prio. bugs still open: 7787  
Total enhancements still open : 5682  

Reports created this week: 258   

Resolutions for the week:

Reports marked FIXED :  143   
Reports marked DUPLICATE :  29
Reports marked INVALID   :  15
Reports marked WORKSFORME:  17
Reports marked WONTFIX   :  10

Specific Product/Component Resolutions & User Metrics 

Created reports per component

Wikimedia General/Unknown             14
MediaWiki extensions  Flow            13
Wikimedia Site requests               11
Analytics Tech community metrics       8
Wikimedia Continuous integration       8

Created reports per product

MediaWiki extensions  105   
Wikimedia 53
MediaWiki 49
VisualEditor  16
Wikimedia Labs10

Top 5 bug report closers

jforrester [AT] wikimedia.org 36
okeyes [AT] wikimedia.org 19
aklapper [AT] wikimedia.org   11
ryasmeen [AT] wikimedia.org   11
jrobson [AT] wikimedia.org10


Most urgent open issues

Product       | Component     | BugID | Priority | LastChange | Assignee             | Summary
-----------------------------------------------------------------------------------------------
Analytics     | Tech communit | 57038 | Highest  | 2013-12-23 | acs[AT]bitergia.com  | Metrics about contributors with +2 pe
MediaWiki     | JavaScript    | 52659 | Highest  | 2013-12-16 | wikibugs-l[AT]lists. | [Regression]: mediawiki.notification:
MediaWiki ext | CentralAuth   | 54195 | Highest  | 2013-12-30 | csteipp[AT]wikimedia | CentralAuth not caching Special:Centr
MediaWiki ext | Diff          | 58274 | Highest  | 2013-12-10 | wikibugs-l[AT]lists. | Implement an order-aware MapDiffer
MediaWiki ext | Echo          | 53569 | Highest  | 2013-12-18 | wikibugs-l[AT]lists. | [Regression] Echo: Sending 2 e-mails
MediaWiki ext | Flow          | 58016 | Highest  | 2013-12-31 | wikibugs-l[AT]lists. | Flow: Suppression redacts the wrong u
MediaWiki ext | WikidataRepo  | 58166 | Highest  | 2013-12-09 | wikidata-bugs[AT]lis | label/description uniqueness constrai
MediaWiki ext | WikidataRepo  | 57918 | Highest  | 2014-01-13 | wikidata-bugs[AT]lis | show diffs for sorting changes
MediaWiki ext | WikidataRepo  | 58394 | Highest  | 2014-01-15 | wikidata-bugs[AT]lis | "specified index out of bounds" issue
MediaWiki ext | WikidataRepo  | 52385 | Highest  | 2014-01-16 | wikidata-bugs[AT]lis | Query by one property and one value (
MediaWiki ext | WikidataRepo  | 60127 | Highest  | 2014-01-17 | wikidata-bugs[AT]lis | Implement DB schema for query indexes
MediaWiki ext | WikidataRepo  | 58850 | Highest  | 2014-01-17 | wikidata-bugs[AT]lis | wbmergeitems isn't merging claims pro
VisualEditor  | Data Model    | 60117 | Highest  | 2014-01-16 | esanders[AT]wikimedi | VisualEditor: Copying references is c
VisualEditor  | Data Model    | 59653 | Highest  | 2014-01-16 | esanders[AT]wikimedi | VisualEditor: Pasting copied template
VisualEditor  | Editing Tools | 50768 | Highest  | 2013-12-16 | rmoen[AT]wikimedia.o | VisualEditor: Better reference UI for
VisualEditor  | MediaWiki int | 48429 | Highest  | 2014-01-17 | krinklemail[AT]gmail | VisualEditor: Support editing of sect
Wikimedia     | Apache config | 31369 | Highest  | 2014-01-17 | bugzilla+org.wikimed | Non-canonical HTTPS URLs quietly redi
Wikimedia     | Continuous in | 49846 | Highest  | 2014-01-17 | wikibugs-l[AT]lists. | mediawiki/extensions.git does not upd
Wikimedia     | General/Unk

Re: [Wikitech-l] MediaWiki-Vagrant can run MediaWiki under HHVM!

2014-01-19 Thread Ori Livneh
On Sun, Jan 19, 2014 at 5:39 PM, Niklas Laxström wrote:

> Do you know where I can find these hhvm-nightly packages if I want to
> try them out on my own?
>
> Last time I tested hhvm on translatewiki.net, there were fastcgi
> parameter passing problems which blocked further testing there.
>   -Niklas
>

It's part of the package repository provided by Facebook at
http://dl.hhvm.com/ubuntu/. If you're running Precise, you can simply
add this entry to sources.list:

deb http://dl.hhvm.com/ubuntu precise main
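
A minimal setup sketch under those assumptions (the repository line and
the 'hhvm-nightly' package name come from this thread; the file path
and the need for a signing-key import are assumptions that may vary
with your setup):

    # Add the Facebook HHVM repository (equivalent to editing sources.list):
    echo 'deb http://dl.hhvm.com/ubuntu precise main' | sudo tee /etc/apt/sources.list.d/hhvm.list

    # You may also need to import the repository's GPG signing key first.
    sudo apt-get update
    sudo apt-get install hhvm-nightly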

I also forgot to mention in my previous e-mail that if you aren't sure
which interpreter is running, you can simply check under "Installed
software" (or localized equivalent) in Special:Version. HHVM appears as
'5.4.999-hiphop (srv)'.
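
If you prefer the command line, a quick sanity check on the server
itself is (this assumes the hhvm binary is on your PATH, and it only
verifies the installed package, not which interpreter is actually
serving requests):

    hhvm --version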

Re: [Wikitech-l] MediaWiki-Vagrant can run MediaWiki under HHVM!

2014-01-19 Thread Niklas Laxström
Do you know where I can find these hhvm-nightly packages if I want to
try them out on my own?

Last time I tested hhvm on translatewiki.net, there were fastcgi
parameter passing problems which blocked further testing there.
  -Niklas


[Wikitech-l] MediaWiki-Vagrant can run MediaWiki under HHVM!

2014-01-19 Thread Ori Livneh
Thanks to some fantastic work by Erik Bernhardson[0], MediaWiki-Vagrant can
now run MediaWiki on HHVM. The setup runs HHVM behind Apache using FastCGI,
which is a close match to how we think we'll be running it in production.
It uses the 'hhvm-nightly' packages from Facebook, so you can test
MediaWiki core and extensions against cutting-edge builds.

To switch MediaWiki from PHP to HHVM, simply run 'vagrant enable-role
hhvm', followed by 'vagrant provision'. To switch back to PHP, run 'vagrant
disable-role hhvm' and reprovision.
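
A minimal session sketch, using exactly the commands above:

    # Switch MediaWiki from PHP to HHVM:
    vagrant enable-role hhvm
    vagrant provision

    # Switch back to PHP:
    vagrant disable-role hhvm
    vagrant provision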

Please try it out and FILE BUGS for any issues you encounter. This includes
not only provisioning failures (which should be reported under the
MediaWiki-Vagrant product in Bugzilla) but also any instances of PHP code
breaking under HHVM. There is now an 'hhvm' keyword in Bugzilla you can use
to tag your report.

Three cheers for Erik B., and for Facebook's Paul Tarjan, whose recent
packaging work makes this possible.

 [0]: https://gerrit.wikimedia.org/r/#/c/105834/

---
Ori Livneh
o...@wikimedia.org

Re: [Wikitech-l] Status of the new PDF Renderer

2014-01-19 Thread Marco Fleckinger

Hi Matthew,

Great work, thank you for sharing. In my company we need an extension
like this. A few months ago I was unable to find a solution that
accepts UTF-8 encoded Unicode characters greater than 0x7F inside URLs.

On the test wiki I couldn't find an article that wasn't redirected to
another article, so I just created one [1] and tried to render it as
PDF.

Though this is "just" the result of one sprint and there are still some
bugs, it was mostly possible to render the article. Compared to my
earlier experiences, that is a really good result.


Is it also possible to set this up behind a firewall?

[1] 
http://ocg-collection-alpha.wmflabs.org/index.php/Test_German_Umlauts_%C3%A4%C3%B6%C3%BC%C3%84%C3%96%C3%9C%C3%9F%E2%86%92%E2%80%93%E2%80%9E%E2%80%9C


Cheers,

Marco

On 01/18/2014 03:42 AM, Matthew Walker wrote:

All,

We've just finished our second sprint on the new PDF renderer. A
significant chunk of renderer development time this cycle was on non-Latin
script support, as well as puppetization and packaging for deployment. We
have a work in progress pipeline up and running in labs which I encourage
everyone to go try and break. You can use the following featured articles
just to see what our current output is:
 * http://ocg-collection-alpha.wmflabs.org/index.php/Alexis_Bachelot
 *
http://ocg-collection-alpha.wmflabs.org/index.php/Atlantis:_The_Lost_Empire

Some other articles imported on that test wiki:
 * http://ur1.ca/gg0bw

Please note that some of these will fail due to known issues noted below.

You can render any page in the new renderer by clicking the sidebar link
"Download as WMF PDF"; if you "Download as PDF" you'll be using the old
renderer (useful for comparison.) Additionally, you can create full books
via Special:Book -- our renderer is "RDF to Latex (PDF)" and the old
renderer is "e-book (PDF)". You can also try out the "RDF to Text (TXT)"
renderer, but that's not on the critical path. As of right now we do not
have a bugzilla project entry so reply to this email, or email me directly
-- we'll need one of: the name of the page, the name of the collection, or
the collection_id parameter from the URL to debug.

There are some code bits that we know are still missing that we will have
to address in the coming weeks or in another sprint.
 * Attribution for images and text. The APIs are done, but we still need
to massage that information into the document.
 * Message translation -- right now all internal messages are in English
which is not so helpful to non English speakers.
 * Things using the <ref> tag and the Cite extension are not currently
supported (meaning you won't get nice references.)
 * Tables may not render at all, or may break the renderer.
 * Caching needs to be greatly improved.

Looking longer term into deployment on wiki, my plans right now are to get
this into beta labs for general testing and connect test.wikipedia.org up
to our QA hardware for load testing. The major blocker there is acceptance
of the Node.JS 0.10 and TexLive 2012 packages into reprap, our internal
aptitude package source. This is not quite as easy as it sounds: we already
use TexLive 2009 in production for the Math extension and we must apply
thorough tests to ensure we do not introduce any regressions when we update
to the 2012 package. I'm not sure what actual dates for those migrations /
testing will be because it greatly depends on when Ops has time. In the
meantime, our existing PDF cluster based on mwlib will continue to serve
our offline needs. Once our solution is deployed and tested, mwlib
(pdf[1-3]) will be retired here at the WMF and print on demand services
will be provided directly by PediaPress servers.

For the technically curious: we're approximately following the Parsoid
deployment model -- using trebuchet to push out a source repository
(services/ocg-collection) that has the configuration and node dependencies
built on tin along with git submodules containing the actual service code.

It may not look like it on the surface, but we've come a long way and it
wouldn't have been possible without the (probably exasperated) help from
Jeff Green, Faidon, and Ori. Also big thanks to Brad and Max for their
work, and Gabriel for some head thunking. C. Scott and I are not quite off
the hook yet, as indicated by the list above, but hopefully soon enough
we'll be enjoying the cake and cookies from another new product launch.
(And yes, even if you're remote if I promised you cookies as bribes I'll
ship them to you :p)

~Matt Walker

Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-19 Thread MZMcBride
Tim Starling wrote:
>I can see that is convenient, but I think it should be replaced even in
>that use case. UI convenience, link styling and rel=nofollow can be dealt
>with in other ways.

Re: https://meta.wikimedia.org/wiki/Interwiki_map

It's not just convenience. Interwiki links are an easy way to implement
global (across all Wikimedia wikis) templates. They're very simple linker
templates, but templates just the same.

Instead of {{bugzilla|}} for Bugzilla, you use [[bugzilla:]]. Instead of
updating dozens of templates on hundreds of wikis indefinitely, you can
update a centralized interwiki map. The centralized map also helps avoid
conflicts. And if one day one of the targets moves and doesn't leave a
redirect (boo!), we can theoretically update the interwiki map and all of
the links across Wikimedia wikis will continue to work. I believe we use
this feature occasionally.

We could make parser functions such as "{{#bugzilla:}}", but depending on
who you ask, wikitext as a written form is on its way out. I'm not sure
the investment is worth the return.

I suppose it's possible that people are using interwiki markup to disable
the typical link icons, but instead we should be discussing link icons
generally in the user interface. This is pretty far removed from interwiki
links, in my opinion. I do know that people occasionally use redirection
to get around weird link generation behavior when using interwiki markup.
As I recall, space interpretation was the center of that (i.e., query
paths containing "_" v. "+" v. "%20" v. " " &c.).

Regarding rel=nofollow and link trustworthiness: I'm not sure any sane
search engine continues to trust user input these days. I thought lessons
of the past taught developers that people are pretty unscrupulous. :-)

 
MZMcBride




Re: [Wikitech-l] Status of the new PDF Renderer

2014-01-19 Thread Amir E. Aharoni
apologies: s/Are the/Are there/


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



Re: [Wikitech-l] Status of the new PDF Renderer

2014-01-19 Thread Amir E. Aharoni
1. Can this be set up for testing locally? Where is the new software? I'm
not sure that I see it in the master version of Collection in Gerrit.

2. Are the wikis with a non-English content language where this can be
tested?


--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬



[Wikitech-l] On-wiki configuration RfCs (was: Proposed Program Architecture Summit 2014)

2014-01-19 Thread legoktm
On Thu, Jan 16, 2014 at 2:52 PM, Rob Lanphier  wrote:

> Let's put that number at 95%  :-)  I don't want to commit the group to
> anything; it's at the top of the short list, but you never know where
> people will want to take the conversation.
>
> Legoktm, the thing you could do to push it over the top is to write up
> a summary as you see it of the various positions, and post it to this
> list.  Ideally, you would also publish it as a subpage of the
> Architecture Summit 2014 page, similar to how some of the others are
> (e.g. HTML templating[1])  Even if we don't get to it at the summit,
> it still will be a very useful backgrounder to have.  Is that
> something you'd be willing to chip in with?

Sure, I created the summary subpage, and solicited feedback from ^demon
and Yurik. I've included it below for ease:

MediaWiki is currently configured via a series of global variables
that must be set in a PHP file by a user with file system access. This
cluster of RfCs intends to allow the wiki to be configured using an
on-wiki mechanism. Some of the motivations behind this can be found in
the first RfC[1].

The first two RfCs, "Configuration database"[2], and "Configuration
database 2"[3] discuss storing configuration parameters in a database
table, while "Json Config pages in wiki"[4] discusses using a JSON
ContentHandler to store configuration options in a wiki page.
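
As a purely hypothetical illustration of the JSON-page approach (the
keys are real MediaWiki configuration globals, but the page format and
mechanism here are invented for illustration, not taken from the RfCs):

    {
        "wgSitename": "Example Wiki",
        "wgEnableUploads": true,
        "wgDefaultSkin": "vector"
    }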

Some of these features have already been implemented in extensions:
Configure provides basic on-wiki configuration functionality, and
CentralAuth allows custom global user groups to be created on-wiki.
The EventLogging, UploadWizard, and ZeroRatedMobileAccess extensions
all currently store configuration options in JSON formatted pages.

See also: bug 26992[5], "Implement configuration database aka
configuration management aka no shell excuse (tracking)"

[1] 
https://www.mediawiki.org/wiki/Requests_for_comment/Configuration_database#Problems_to_solve
[2] https://www.mediawiki.org/wiki/Requests_for_comment/Configuration_database
[3] https://www.mediawiki.org/wiki/Requests_for_comment/Configuration_database_2
[4] 
https://www.mediawiki.org/wiki/Requests_for_comment/Json_Config_pages_in_wiki
[5] https://bugzilla.wikimedia.org/show_bug.cgi?id=26992

-- Legoktm
