Re: [Wikitech-l] CAPTCHA

2013-03-21 Thread Federico Leva (Nemo)
Only a handful of wikis have restrictive captcha settings (plus pt.wiki,
which is in permanent emergency mode).
https://meta.wikimedia.org/wiki/Newly_registered_user
For them you could request the confirmed flag at
https://meta.wikimedia.org/wiki/SRP
Personally I found it easier to just make the required 10, 50 or however
many edits on a user page. Five minutes at most and you're done.


Nemo

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Flagged revs and Lua modules

2013-03-21 Thread Aaron Schulz
Sounds like a site config issue. All wikis that have NS_TEMPLATE in
$wgFlaggedRevsNamespaces should also have NS_MODULE in there.
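A hypothetical LocalSettings.php sketch of what that check would look like; the exact contents of $wgFlaggedRevsNamespaces on any given wiki, and the guard style, are assumptions for illustration (NS_MODULE is defined by Scribunto):

```php
// Hypothetical sketch: a wiki that flags templates should flag Lua modules
// too, since templates can invoke them via Scribunto.
$wgFlaggedRevsNamespaces = array( NS_MAIN, NS_TEMPLATE );
if ( defined( 'NS_MODULE' ) ) {
    $wgFlaggedRevsNamespaces[] = NS_MODULE; // Scribunto's Module: namespace
}
```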



--
View this message in context: 
http://wikimedia.7.n6.nabble.com/Flagged-revs-and-Lua-modules-tp4999685p497.html
Sent from the Wikipedia Developers mailing list archive at Nabble.com.


Re: [Wikitech-l] [Wmfall] Yuri Astrakhan & Adam Baso join Mobile department partner team

2013-03-21 Thread Nasir Khan
great news. :)



--
Nasir Khan Saikat
http://profiles.google.com/nasir8891
www.nasirkhn.com



On Thu, Mar 21, 2013 at 10:01 AM, Matthew Flaschen
mflasc...@wikimedia.org wrote:

 On 03/18/2013 01:29 PM, Tomasz Finc wrote:
  Greetings all,
 
  I'm pleased to announce that the mobile department has two new staff
  members. Yuri Astrakhan  Adam Baso join as sr. software developers on
  the mobile partner team.

 Welcome!  It's great to have both of you.

 Matt Flaschen



Re: [Wikitech-l] Missing project ideas for GSOC

2013-03-21 Thread Lydia Pintscher
On Thu, Mar 21, 2013 at 1:04 AM, Eugene Zelenko
eugene.zele...@gmail.com wrote:
 Hi!

 I think it would be a good idea to direct some of the Google Summer of
 Code participants' energy toward helping Wikidata, which is still
 missing many must-have features. Some of them, like support for
 projects other than Wikipedia, are postponed to the coming years, but
 something tells me that in most cases it would be a clone of existing
 functionality, except for the one-to-multiple links in Wikisource :-)

Yes, we have a few GSoC project ideas lined up. I will add them to the
wiki in a bit. Features are not postponed to next year but to the
second year of development, which starts in a few days. Patience,
please. Things will happen as fast as they can :)


Cheers
Lydia

--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata

Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 B. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.


Re: [Wikitech-l] IRC office hour on Tue March 19th, 1700 UTC, about Bug management

2013-03-21 Thread Andre Klapper
Thanks to everybody who showed up!

The IRC log can be found at
https://meta.wikimedia.org/wiki/IRC_office_hours/Office_hours_2013-03-19

andre
-- 
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/



[Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-21 Thread Thomas Gries
tl;dr
starting a discussion on how/whether MediaWiki could use ZendOptimizerPlus


Since recently,

  * ZendOptimizerPlus (an opcode cache; source [4])

is under the PHP license and planned to be integrated into PHP 5.5.
The former restrictions on this program (closed source etc.) appear to
be gone, so I'd like to open a discussion on how MediaWiki could be
adapted to make use of it.


Because APC does not work with PHP 5.4 (or is only beta for PHP 5.4),
I wanted to use the new ZendOptimizerPlus with PHP 5.4 ...

  * but MediaWiki apparently does not work with that (CACHE_ACCEL).

Who knows more, and who could help to get the Zend cache working and
APC replaced? Or is this not possible, because Zend Optimizer+ only
does opcode caching?
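For reference, the setting in question is sketched below (a hypothetical LocalSettings.php fragment; the explanation of why a pure opcode cache cannot back it follows from Chad's reply later in this thread):

```php
// Hypothetical LocalSettings.php sketch. CACHE_ACCEL makes MediaWiki store
// objects through an accelerator's shared-memory API (e.g. APC's
// apc_store()/apc_fetch()). A pure opcode cache such as Zend Optimizer+
// only caches compiled bytecode and exposes no such data API, so it
// cannot serve as a CACHE_ACCEL backend.
$wgMainCacheType = CACHE_ACCEL;
```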


The MediaWiki pages [1,2] should be revisited and updated by cache
experts according to current knowledge and versions.


Tom


[1]  https://www.mediawiki.org/wiki/Manual:Cache
[2]
https://meta.wikimedia.org/wiki/PHP_caching_and_optimization#PHPA_or_Zend_Optimizer
[3] https://wiki.php.net/rfc/optimizerplus
[4] https://github.com/zend-dev/ZendOptimizerPlus

[3] says: This RFC proposes integrating the Zend Optimizer+ component
into the Open Source PHP distribution. Optimizer+ is the fastest opcode
cache available for PHP, and presently supports PHP 5.2 through 5.5,
with public builds available for PHP 5.2 through 5.4. It was originally
developed in 1998 and was the first opcode cache available for PHP.

Presently, Optimizer+ is a closed-source, yet free-for-use component. As
a part of implementing this RFC - Zend will make the source code of
Optimizer+ available under the PHP License, so that it can become an
integrated part of PHP with no strings attached. Once that happens,
community contribution would be welcome exactly like it is with any
other PHP component, and the component will be governed by the exact
same processes (RFC et. al) that are employed by the PHP community.



Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-21 Thread Chad
On Thu, Mar 21, 2013 at 10:14 AM, Thomas Gries m...@tgries.de wrote:
 tl;dr
 discussion start How/whether MediaWiki could use ZendOptimizerPlus


 Since a short time

   * ZendOptimizerPlus (Opcode cache; source [4])


 is under PHP license and planned to be integrated in PHP 5.5.
 The former restrictions of this program (closed-source etc.) appear to
 be gone,
 so I'd like to open discussions how MediaWIki could be adapted to make
 use of it.


 Because APC does not work with PHP 5.4, (or APC is beta for PHP 5.4),
 I wanted to make use of the new ZendOptimizer with PHP 5.4 ...

   * but MediaWiki apparently does not work with that. ( CACHE_ACCEL )


 *Who knows more ?*
 and could help to get the Zend cache working and APC replaced ?
 Or is this not possible, because Zend is only for opcode caching ??


 MediaWiki pages [1,2] should be revisited and updated by cache experts
 according to current knowledge and versions.


You're confusing opcode caching with shared memory caching. Having the Zend
Optimizer doesn't prohibit you from using APC's shared memory caching. And
since Zend Optimizer doesn't do shared memory functionality, there's
no support
that needs to be added anywhere (now, if they introduce such a feature, that's
another story).

-Chad


Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-21 Thread Thomas Gries
On 21.03.2013 15:23, Chad wrote:
 You're confusing opcode caching with shared memory caching. 
thanks; as already mentioned, I was aware of that difference.
 Having the Zend
 Optimizer doesn't prohibit you from using APC's shared memory caching.
But APC has issues with PHP 5.4.
What can we MediaWiki developers do to get this (APC) working with PHP 5.4+?

@all
Does anyone have PHP 5.4 and MediaWiki core/master WITH APC working?

  And
 since Zend Optimizer doesn't do shared memory functionality, there's
 no support
 that needs to be added anywhere (now, if they introduce such a feature, that's
 another story).
+1. Wouldn't that be an improvement? Can you contact these people?
I've found that APC is a must-have for larger MediaWikis, and I would
like to see further methods supported in the core or by extensions.

Chad: thanks for your swift answer, which confirms my view, even though
I wasn't very clear in my original mail.




Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-21 Thread Chad
On Thu, Mar 21, 2013 at 10:42 AM, Thomas Gries m...@tgries.de wrote:
 On 21.03.2013 15:23, Chad wrote:
 You're confusing opcode caching with shared memory caching.
 thanks, as already mentioned, I anticipated that difference.
 Having the Zend
 Optimizer doesn't prohibit you from using APC's shared memory caching.
 But APC has issues with PHP 5.4 .
 What can we MediaWiki developers do to get this (APC) working  with PHP 5.4+


Nothing, unless someone wants to contribute upstream with patches or
bug reports.

  And
 since Zend Optimizer doesn't do shared memory functionality, there's
 no support
 that needs to be added anywhere (now, if they introduce such a feature, 
 that's
 another story).
 +1
 Wouldn't that be an improvement, can you contact these people ?
 I found, that APC  is a must have for larger MediaWikis, and would
 like to see further methods supported in the core or by extensions.


Sure, it'd be an improvement--go ahead and file a bug wherever it
belongs upstream (github?). If and when they decide to implement
it, *then* would be the time to make MW changes :)

-Chad


Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-21 Thread Thomas Gries
On 21.03.2013 15:57, Chad wrote:
 Sure, it'd be an improvement--go ahead and file a bug wherever it
 belongs upstream (github?). If and when they decide to implement
 it, *then* would be the time to make MW changes :)

Where can I read more (so that I can then refer to it) about how
MediaWiki uses APC as a memcache?
I found some pages (see my other mail), but they weren't detailed enough.

Do I have to look in the MediaWiki source for the memcache module/API? Where?


Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-21 Thread Bryan Tong Minh
On Thu, Mar 21, 2013 at 4:02 PM, Thomas Gries m...@tgries.de wrote:


 Do I have to look for the MediaWiki source module/memcache API? Where ?

It's called BagOStuff.
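A rough, hypothetical sketch of how MediaWiki code talks to that layer (2013-era helper names; worth double-checking against includes/objectcache/BagOStuff.php before relying on the exact signatures):

```php
// BagOStuff is the abstract key-value cache interface; concrete subclasses
// include APCBagOStuff, SqlBagOStuff and the memcached-backed classes.
// Which one you get depends on $wgMainCacheType.
$cache = wfGetMainCache();                  // returns a BagOStuff instance
$key   = wfMemcKey( 'demo', 'some-key' );   // wiki-prefixed cache key
$cache->set( $key, 'some value', 3600 );    // TTL in seconds
$value = $cache->get( $key );               // false on cache miss
```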


Bryan

Re: [Wikitech-l] Gerrit Wars™: a strategy-guide

2013-03-21 Thread Mark A. Hershberger
On 03/20/2013 10:43 AM, Jasper Wallace wrote:
 On Tue, 19 Mar 2013, MZMcBride wrote:
 

 P.S. mailman: there's a non-ASCII character in the subject line. Attack!
 
 Why? It's correctly encoded:

Because the subject line is displayed in 3 different ways on the
archive page:

http://lists.wikimedia.org/pipermail/wikitech-l/2013-March/thread.html#67742

There we have:

  Gerrit =?utf-8?Q?Wars=E2=84=A2=3A_?=a strategy-guide
  Gerrit =?UTF-8?B?V2Fyc+KEog==?=: a strategy-guide
  Gerrit Wars™: a strategy-guide

I wouldn't be surprised if my message creates a fourth way.

-- 
http://hexmode.com/

[We are] immortal ... because [we have] a soul, a spirit capable of
   compassion and sacrifice and endurance.
-- William Faulkner, Nobel Prize acceptance speech


[Wikitech-l] RFC: Alternative domains for Commons

2013-03-21 Thread Juliusz Gonera

We've been having a hard time making photo uploads work in
MobileFrontend because of CentralAuth's third party cookies problem (we
upload them from Wikipedia web site to Commons API). Apart from the
newest Firefox [1,2], mobile Safari also doesn't accept third party
cookies unless the domain has been visited and it already has at least
one cookie set.

Even though we have probably found a solution for now, it's a very shaky
and inelegant workaround that might stop working at any time (if some
detail of the default browser cookie policy changes again) [3].

I came up with another idea of how this could be solved. The problem we
have right now is that Commons is on a completely different domain than
Wikipedia, so they can't share the login token cookie. However, we could
set up alternative domains for Commons, such as commons.wikipedia.org,
and then the cookie could be shared.

The only issue I see with this solution is that we would have to avoid
messing up SEO (having multiple URLs pointing to the same resource).
This, however, could be handled by redirecting every non-API request to
the main domain (commons.wikimedia.org) and only allowing API requests
on the alternative domains (which is what we use for photo uploads on
mobile).
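The redirect rule could be sketched roughly like this (a hypothetical Apache mod_rewrite fragment for the alternative domain; Wikimedia's real setup would involve the Squid/Varnish layer, so this is only an illustration of the idea):

```apache
# On commons.wikipedia.org: let API requests through, 301 everything else
# to the canonical domain so search engines see one URL per resource.
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/w/api\.php
RewriteRule ^/?(.*)$ https://commons.wikimedia.org/$1 [R=301,L]
```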

This obviously doesn't solve the broader problem of CentralAuth's common
login being broken, but at least would allow easy communication between
Commons and other projects. In my opinion this is the biggest problem
right now. Users can probably live without being automatically logged in
to other projects, but photo uploads on mobile are just broken when we
can't use Commons API.

Please let me know what you think. Are there any other possible
drawbacks of this solution that I missed?

[1] http://webpolicy.org/2013/02/22/the-new-firefox-cookie-policy/
[2] 
https://developer.mozilla.org/en-US/docs/Site_Compatibility_for_Firefox_22

[3] https://gerrit.wikimedia.org/r/#/c/54813/

--
Juliusz


Re: [Wikitech-l] Gerrit Wars™: a strategy-guide

2013-03-21 Thread Luke Welling WMF
Ori's advice rings true with me.  It's something I need to get better at.

On the email title sidetrack: it should not create a 4th way. Without
verifying them, those all look like valid representations of the same
data. MIME encoded-word syntax has only two possible encodings,
quoted-printable and base64. The one with the Q after UTF-8 should be
the quoted-printable version; the one with the B after UTF-8 has been
encoded as base64 instead.

Of course it's possible somebody's client or MTA has munged them, but
the point of encoded words is that even if you only speak RFC 2822 and
not RFC 2047(?) you can still transmit them correctly.
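The equivalence is easy to check with Python's email.header module (a quick illustrative sketch; the two subject strings are the quoted-printable variants Mark lists later in the thread):

```python
from email.header import decode_header

def decode_subject(raw):
    """Decode an RFC 2047 encoded-word header into a unicode string."""
    parts = []
    for chunk, charset in decode_header(raw):
        if isinstance(chunk, bytes):
            chunk = chunk.decode(charset or "ascii")
        parts.append(chunk)
    return "".join(parts)

# Same subject, quoted-printable in two charsets: UTF-8 and windows-1252.
q_utf8 = "=?utf-8?q?Gerrit_Wars=E2=84=A2=3A_a_strategy-guide?="
q_cp1252 = "=?windows-1252?q?Gerrit_Wars=99=3A_a_strategy-guide?="

print(decode_subject(q_utf8))   # Gerrit Wars™: a strategy-guide
print(decode_subject(q_utf8) == decode_subject(q_cp1252))  # True
```

Both byte sequences round-trip to the same unicode string, which is the whole point of encoded words: the wire form differs, the decoded subject does not.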

Luke Welling



 On Thu, Mar 21, 2013 at 11:19 AM, Mark A. Hershberger m...@everybody.org wrote:

 On 03/20/2013 10:43 AM, Jasper Wallace wrote:
  On Tue, 19 Mar 2013, MZMcBride wrote:
 
 
  P.S. mailman: there's a non-ASCII character in the subject line. Attack!
 
  Why? It's correctly encoded:

 Because the way the subject line is displayed 3 different ways on the
 archive page:


 http://lists.wikimedia.org/pipermail/wikitech-l/2013-March/thread.html#67742

 There we have:

   Gerrit =?utf-8?Q?Wars=E2=84=A2=3A_?=a strategy-guide
   Gerrit =?UTF-8?B?V2Fyc+KEog==?=: a strategy-guide
   Gerrit Wars™: a strategy-guide

 I wouldn't be surprised if my message creates a fourth way.

 --
 http://hexmode.com/

 [We are] immortal ... because [we have] a soul, a spirit capable of
compassion and sacrifice and endurance.
 -- William Faulkner, Nobel Prize acceptance speech


[Wikitech-l] EasyRDF light

2013-03-21 Thread Denny Vrandečić
Hey,

as you may remember, we asked about EasyRDF in order to use it in
Wikidata.

We have now cut out the pieces that we do not need, in order to simplify
the review. Most of the parts of EasyRDF that are interesting from a
security standpoint -- parsing, serving, etc. -- have been removed.

Our code is here; we would welcome comments and reviews.

https://github.com/Wikidata/easyrdf_lite

Cheers,
Denny



-- 
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/681/51985.

[Wikitech-l] Encoding subject encoding bike shed

2013-03-21 Thread Mark A. Hershberger
On 03/21/2013 11:45 AM, Luke Welling WMF wrote:
 On the email title sidetrack, it should not create a 4th way.

The pedant in me says there are at least two more ways -- different
capitalization for UTF-8.  But your subject line shows another way.

My client displays all of the subjects the same.

Jasper,
Mine:  =?utf-8?q?Gerrit_Wars=E2=84=A2=3A_a_strategy-guide?=
Yours: =?windows-1252?q?Gerrit_Wars=99=3A_a_strategy-guide?=
MZ's:  Gerrit =?UTF-8?B?V2Fyc+KEog==?=: a strategy-guide
Ori:   Gerrit =?utf-8?Q?Wars=E2=84=A2=3A_?=a strategy-guide

Maybe mailman doesn't understand it when the encoding doesn't start at
the first character, since those are the ones that don't display correctly.

-- 
http://hexmode.com/

[We are] immortal ... because [we have] a soul, a spirit capable of
   compassion and sacrifice and endurance.
-- William Faulkner, Nobel Prize acceptance speech


Re: [Wikitech-l] Encoding subject encoding bike shed

2013-03-21 Thread Luke Welling WMF
Heh, if clients randomly change character sets then I guess there are a
very large number of possible values.

RFC 2047 came out in 1996, so it's reasonable for people to use
non-ASCII characters in titles; the means to do it in a compatible way
has been around for 17 years.

Luke


On Thu, Mar 21, 2013 at 12:04 PM, Mark A. Hershberger m...@everybody.org wrote:

 On 03/21/2013 11:45 AM, Luke Welling WMF wrote:
  On the email title sidetrack, it should not create a 4th way.

 The pedant in me says there are at least two more ways -- different
 capitalization for UTF-8.  But your subject line shows another way.

 My client displays all of the subjects the same.

 Jasper,
 Mine:  =?utf-8?q?Gerrit_Wars=E2=84=A2=3A_a_strategy-guide?=
 Yours: =?windows-1252?q?Gerrit_Wars=99=3A_a_strategy-guide?=
 MZ's:  Gerrit =?UTF-8?B?V2Fyc+KEog==?=: a strategy-guide
 Ori:   Gerrit =?utf-8?Q?Wars=E2=84=A2=3A_?=a strategy-guide

 Maybe mailman doesn't understand when the encoding doesn't start at the
 first character since those are the ones that don't display correctly.

 --
 http://hexmode.com/

 [We are] immortal ... because [we have] a soul, a spirit capable of
compassion and sacrifice and endurance.
 -- William Faulkner, Nobel Prize acceptance speech


Re: [Wikitech-l] Gerrit Wars™: a strategy-guide

2013-03-21 Thread Brad Jorsch
On Wed, Mar 20, 2013 at 10:43 AM, Jasper Wallace jas...@pointless.net wrote:

 On Tue, 19 Mar 2013, MZMcBride wrote:

 
  P.S. mailman: there's a non-ASCII character in the subject line. Attack!

 Why? It's correctly encoded:

 Subject: Re: [Wikitech-l] Gerrit =?UTF-8?B?V2Fyc+KEog==?=: a strategy-guide


Actually, that one is not. There must be a space between the ?= and the
colon.

The same goes for Ori's [Wikitech-l] Gerrit =?utf-8?Q?Wars=E2=84=A2=3A_?=a
strategy-guide, there must be a space between the ?= and the a.

RFC 2047, section 5.

-- 
Brad Jorsch
Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-21 Thread Platonides
Is sending an email to wikitech-ambassadors enough to unblock it?

Such an email should contain a timeframe expectation, though, which
probably only the WMF can give.



Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-21 Thread Quim Gil

On 03/21/2013 02:55 AM, Niklas Laxström wrote:

I've seen a couple of instances where changes to MediaWiki are blocked
until someone informs the community.

Someone is a volunteer.

Community is actually just the Wikimedia project communities. Or at
least the biggest ones which are expected to complain and where the
complaining would hurt.

This situation seems completely unfair to me. WMF should be able to
communicate upcoming changes itself, not throw it to volunteers.
Volunteers can help, but they should not be responsible for this to
happen.


Can you point to the changes that were blocked, or to anything that
would give a better idea to those of us who don't know which cases you
are talking about?


I agree with the principle, but without more details it is difficult to
help fix the problem.


--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil


Re: [Wikitech-l] Gerrit Wars™: a strategy-guide

2013-03-21 Thread Quim Gil

Ori, now you can add another point to the concise strategy-guide:

Deeper posts tend to generate superficial and tangential replies. The 
answer is Silence.


PS: thank you for the post, I enjoyed it.

--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil


Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-21 Thread Tyler Romeo
Just to be clear, APC will not work in PHP 5.5 at all. It actually
conflicts with Zend Optimizer+, and you cannot use both at the same time.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-21 Thread Paul Selitskas
Example:
We deployed a fix to category sorting collations. That was a fix for a
bug (introduced by developers, 3rd-party software, whatever), not an
enhancement. Even so, notifying the community and getting its approval
was requested.

On Thursday, March 21, 2013, Quim Gil q...@wikimedia.org wrote:
 On 03/21/2013 02:55 AM, Niklas Laxström wrote:

 I've seen a couple of instances where changes to MediaWiki are blocked
 until someone informs the community.

 Someone is a volunteer.

 Community is actually just the Wikimedia project communities. Or at
 least the biggest ones which are expected to complain and where the
 complaining would hurt.

 This situation seems completely unfair to me. WMF should be able to
 communicate upcoming changes itself, not throw it to volunteers.
 Volunteers can help, but they should not be responsible for this to
 happen.

 Can you point to the changes blocked, or to anything that would give a
better idea to those of us that don't know what are the cases you are
talking about?

 I agree with the principle, but without more details it is difficult to
help fixing the problem.

 --
 Quim Gil
 Technical Contributor Coordinator @ Wikimedia Foundation
 http://www.mediawiki.org/wiki/User:Qgil


-- 
With regards,
Павел Селіцкас/Pavel Selitskas
Wizardist @ Wikimedia projects

Re: [Wikitech-l] Raw page view no longer showing language links for some pages?

2013-03-21 Thread Sumana Harihareswara
On 03/09/2013 10:00 PM, Brian Cassidy wrote:
 Hello,
 
 I'm the co-author of the WWW::Wikipedia Perl module (
 https://metacpan.org/release/WWW-Wikipedia). It programmatically parses the
 raw source of a Wikipedia page.
 
 Of late, a few changes in behaviour have been reported to me -- all related
 to the language functionality.
 
 As it turns out a number of pages are no longer returning the language
 links in the raw source code like they used to. The canonical test for us
 was to load Russia in English, then grab the Russian link. As you can
 see, the page for Russia no longer has those links (which can normally be
 seen down the left-hand side of the real page):
 
  http://en.wikipedia.org/w/index.php?title=Russia&action=raw
 
 A shorter example is the page for Rotation
 
  http://en.wikipedia.org/w/index.php?title=Rotation&action=raw
 
 I did find that some pages still have language links. See this one for
 Babushka
 
  http://en.wikipedia.org/w/index.php?title=Babushka&action=raw
 
 Has there been some change that no longer outputs those links in some
 instances, or is this an actual bug?
 
 I apologize that this wasn't sent to some official bug tracker, but I
 couldn't find that info off-hand from the Wikipedia site.
 
 Thanks in advance,

Brian, thanks for your note.

Here's our recent blog post on how to file a bug report or feature
request in our Bugzilla installation:
https://blog.wikimedia.org/2013/03/18/how-to-create-a-good-first-bug-report/

If you find that the API doesn't give you some of what you need for
WWW::Wikipedia, please do file a bug.  Thanks!
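For what it's worth, the likely cause is that interlanguage links were being migrated out of raw wikitext and into Wikidata around this time, so action=raw no longer contains them for migrated pages. The API's prop=langlinks returns them regardless of where they are stored. A small sketch of building such a request (the parameter names are the standard api.php ones; the helper function itself is hypothetical):

```python
from urllib.parse import urlencode

def langlinks_url(title, lang="en"):
    """Build an api.php query URL for a page's interlanguage links."""
    params = {
        "action": "query",
        "titles": title,
        "prop": "langlinks",   # langlinks module: works for wikitext- and
        "lllimit": "max",      # Wikidata-stored links alike
        "format": "json",
    }
    return "https://%s.wikipedia.org/w/api.php?%s" % (lang, urlencode(params))

print(langlinks_url("Russia"))
```

Fetching that URL returns the links for "Russia" even though they no longer appear in the raw source.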
-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation


Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-21 Thread Tyler Romeo
Well, as part of the community and a volunteer, I can safely say that I
don't think I (or anybody else) needs notification before bug fixes. :P

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

[Wikitech-l] wmf12 rollback, all wikis (except test2) are on wmf11

2013-03-21 Thread Greg Grossmeier
Tim rolled back wmf12 after a nasty bug last night:
https://bugzilla.wikimedia.org/show_bug.cgi?id=46397

Our fix we deployed to test2 didn't fix it:
https://gerrit.wikimedia.org/r/#/c/55086/

We're still diagnosing/etc.

So, we're staying on wmf11 for now (except on test2, which is running
the broken fix in wmf12).

Please ping me or Robla with any questions.

Greg

-- 
| Greg Grossmeier       GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg           A18D 1138 8E47 FAC8 1C7D |


Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-21 Thread Brian Wolff
On 2013-03-21 3:08 PM, Tyler Romeo tylerro...@gmail.com wrote:

 Well, as part of the community and a volunteer, I can safely say that I
 don't think I (or anybody else) needs notification before bug fixes. :P

 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

That depends on the bug. Some fixes do cause disruption. To pick a
random clear-cut example from a while ago: consider adding the token to
the login API action. It was very important that that got fixed, but it
did cause disruption.

-bawolff

[Wikitech-l] Meetup April 11: GSOC and other open source internship programs

2013-03-21 Thread Quim Gil
Your forwards to potential students / interns / mentors are welcome!
Also for those outside the Bay Area: we will stream the event and accept
questions via IRC.


GSoC and other open source internship programs
Wikipedia Engineering Meetup (San Francisco)

Thursday, April 11, 2013
5:00 PM

http://www.meetup.com/Wikipedia-Engineering-Meetup/events/109096132/

Interested in Google Summer of Code? Now is the right time. Watch & edit
https://www.mediawiki.org/wiki/Summer_of_Code_2013


--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil


Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-21 Thread Quim Gil

On 03/21/2013 11:05 AM, Paul Selitskas wrote:

Example:
We are running a fix in category sorting collations. That was a fix for the
bug (introduced by developers, 3rd party software, whatever), not an
enhancement. Anyway, notifying the community and its approval was requested.


Thank you, having examples helps.

It is a good practice to notify stakeholders when fixing something might 
break or disrupt other things. Usually our bug tracking, code review and 
release/deployment processes should be enough to involve and notify in 
real time whoever needs to be warned or is likely to complain.


If you need more, then there are at least 4 people that can help you.

https://www.mediawiki.org/wiki/Wikimedia_Platform_Engineering#Engineering_Community_Team

If we are talking about bugs, then Andre Klapper aka bugmeister is a 
natural default. If anybody else needs to be involved he will know.


--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil


[Wikitech-l] Important maintenance bug

2013-03-21 Thread Leslie Carr
https://bugzilla.wikimedia.org/show_bug.cgi?id=46428

It would be great if one of the php-knowledgeable peeps could take a
look at this (sadly my php-foo is quite weak).

-- 
Leslie Carr
Wikimedia Foundation
AS 14907, 43821
http://as14907.peeringdb.com/


Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-21 Thread Isarra Yos
Another example would be changing default options in core: recently I
tried to push for making the enhanced recentchanges the default, but one
of the blockers was that I'd need to let the Wikimedia communities know,
as the change would be applied there as well.


Unfortunately I didn't have any idea when such a change could or would 
be merged or deployed, so not only did I not have any timeframe to give 
said communities, I didn't even know when it would be appropriate to 
tell them (if it happens months later, mentioning now would not be very 
helpful) - or even if it ever would really happen at all.
As it was the change just sat in gerrit for a month before James 
Forrester agreed to merge it.
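
(For what it's worth, on a single wiki the default in question can be flipped locally without waiting on core. A minimal sketch, assuming 'usenewrc' is the preference key behind the enhanced view:)

```php
// LocalSettings.php - sketch only; 'usenewrc' is assumed to be the
// preference backing the "enhanced" (grouped) recent changes view.
$wgDefaultUserOptions['usenewrc'] = 1;
```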


In this case it turns out there was another problem and now we're 
waiting on the wikidata folks to resolve that issue (namely that the 
enhanced recentchanges code kind of sucks), but the point is in many 
cases there is just no way volunteers can even know if something will 
actually be merged, let alone the timeframe, and thus expecting us to 
inform folks in these circumstances is a little ridiculous in general.


Don't get me wrong, I'd personally be happy to let folks know of such 
changes, but given how utterly unreliable the review process can be for 
changes coming from volunteers, it's just not a reasonable expectation.



On 21/03/13 16:43, Quim Gil wrote:

On 03/21/2013 02:55 AM, Niklas Laxström wrote:

I've seen a couple of instances where changes to MediaWiki are blocked
until someone informs the community.

Someone is a volunteer.

Community is actually just the Wikimedia project communities. Or at
least the biggest ones which are expected to complain and where the
complaining would hurt.

This situation seems completely unfair to me. WMF should be able to
communicate upcoming changes itself, not throw it to volunteers.
Volunteers can help, but they should not be responsible for this to
happen.


Can you point to the changes blocked, or to anything that would give a 
better idea to those of us who don't know which cases you are 
talking about?


I agree with the principle, but without more details it is difficult 
to help fix the problem.





--
-— Isarra



Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-21 Thread Tyler Romeo
On Thu, Mar 21, 2013 at 2:13 PM, Brian Wolff bawo...@gmail.com wrote:

 That depends on the bug. Some fixes do cause disruption. To pick a random
 clear cut example from a while ago - consider adding the token to the login
 api action. It was very important that got fixed, but it did cause
 disruption.

 -bawolff


Oh yeah. Trust me. I know. Does anybody even remember the thread I sent out
not too long ago? There were like three breaking changes applied to the
core that would have caused fatal errors in my extensions.

-- 
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

Re: [Wikitech-l] Raw page view no longer showing language links for some pages?

2013-03-21 Thread Yuri Astrakhan
This is all due to the introduction of Wikidata (http://wikidata.org).
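
(Editorial aside: since interlanguage links now come from Wikidata, they no longer appear in the wikitext that action=raw returns, but the query API's langlinks property still reports them, Wikidata-provided ones included. A minimal sketch that only builds the request URL, using the standard MediaWiki query API parameter names:)

```python
from urllib.parse import urlencode

def langlinks_url(title, lang="ru"):
    # Build a MediaWiki API request for the interlanguage links of a page.
    # 'lllang' restricts the result to one target language.
    params = {
        "action": "query",
        "prop": "langlinks",
        "titles": title,
        "lllang": lang,
        "format": "json",
    }
    return "https://en.wikipedia.org/w/api.php?" + urlencode(params)

print(langlinks_url("Russia"))
```

A client like WWW::Wikipedia could fall back to this endpoint when the raw source carries no language links.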


On Thu, Mar 21, 2013 at 12:32 PM, Sumana Harihareswara 
suma...@wikimedia.org wrote:

 On 03/09/2013 10:00 PM, Brian Cassidy wrote:
  Hello,
 
  I'm the co-author of the WWW::Wikipedia Perl module (
  https://metacpan.org/release/WWW-Wikipedia). It programmatically parses
 the
  raw source of a Wikipedia page.
 
  Of late, a few changes in behaviour have been reported to me -- all
 related
  to the language functionality.
 
  As it turns out a number of pages are no longer returning the language
  links in the raw source code like they used to. The canonical test for us
  was to load Russia in English, then grab the Russian link. As you can
  see, the page for Russia no longer has those links (which can normally
 be
  seen down the left-hand side of the real page):
 
  http://en.wikipedia.org/w/index.php?title=Russia&action=raw
 
  A shorter example is the page for Rotation
 
  http://en.wikipedia.org/w/index.php?title=Rotation&action=raw
 
  I did find that some pages still have language links. See this one for
  Babushka
 
  http://en.wikipedia.org/w/index.php?title=Babushka&action=raw
 
  Has there been some change that no longer outputs those links in some
  instances, or is this an actual bug?
 
  I apologize that this wasn't sent to some official bug tracker, but I
  couldn't find that info off-hand from the Wikipedia site.
 
  Thanks in advance,

 Brian, thanks for your note.

 Here's our recent blog post on how to file a bug report or feature
 request in our Bugzilla installation:

 https://blog.wikimedia.org/2013/03/18/how-to-create-a-good-first-bug-report/

 If you find that the API doesn't give you some of what you need for
 WWW::Wikipedia, please do file a bug.  Thanks!
 --
 Sumana Harihareswara
 Engineering Community Manager
 Wikimedia Foundation


Re: [Wikitech-l] Important maintenance bug

2013-03-21 Thread Brion Vibber
On Thu, Mar 21, 2013 at 11:46 AM, Leslie Carr lc...@wikimedia.org wrote:
 https://bugzilla.wikimedia.org/show_bug.cgi?id=46428

 If one of the php-knowledgable peeps can take a look at this (sadly my
 php-foo is quite weak).

Hmm... ok this explains why I couldn't find Wikipedia Zero-related
memcache entries when testing from fenari last night. :(

It sounds like maintenance scripts could have a serious split-brain
problem if they're being run in pmtpa while the web servers are in
eqiad, if each cluster uses its own local memcached servers.

Either updates need to be proxied to both clusters (race conditions?)
or it should only be possible to run maintenance scripts in the same
cluster that's running the web servers. The former sounds scary and
latency-intense, while the latter requires more configuration on the
eqiad bastion hosts so MW scripts can be run. :(
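
(The split-brain Brion describes can be reduced to a few lines: if the maintenance script and the web tier each talk only to their own datacenter's memcached, a purge issued in one cluster never reaches the other. This is a deliberately toy model, with each datacenter's memcached reduced to a dict:)

```python
# Toy model of the split-brain; datacenter names follow the thread.
pmtpa_cache = {}
eqiad_cache = {}

def web_request(cache, key):
    # Web tier (running in eqiad) reads from / populates its local cache.
    return cache.setdefault(key, "rendered-v1")

def maintenance_purge(cache, key):
    # Maintenance script (running in pmtpa) purges only its local cache.
    cache.pop(key, None)

web_request(eqiad_cache, "page:Foo")         # eqiad now caches v1
maintenance_purge(pmtpa_cache, "page:Foo")   # purge lands in the wrong DC
stale = web_request(eqiad_cache, "page:Foo") # eqiad still serves v1
print(stale)
```

Either of Brion's remedies (proxying updates to both clusters, or running maintenance scripts in the active cluster) amounts to making the purge and the read hit the same cache.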

-- brion


Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-21 Thread Quim Gil

On 03/21/2013 11:48 AM, Isarra Yos wrote:

Unfortunately I didn't have any idea when such a change could or would
be merged or deployed, so not only did I not have any timeframe to give
said communities, I didn't even know when it would be appropriate to
tell them (if it happens months later, mentioning now would not be very
helpful) - or even if it ever would really happen at all.


This is so clear that anybody will understand it.

I believe mentioning potential problems when you see them coming is 
always helpful. Do it in the related bug report and share the URL with 
the affected parties, e.g. at wikitech-ambassadors. Invite them to follow 
the bug so they have the same information as you, at the same time as 
you, with the same chances of giving feedback and participating as you.


Ideally, by the time a deployment date can be decided they will be the 
ones communicating with their own communities. Otherwise you can simply 
go and say "Remember what we told you (link)? Ok, it's coming now."


--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil


Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-21 Thread Isarra Yos

On 21/03/13 19:15, Quim Gil wrote:

On 03/21/2013 11:48 AM, Isarra Yos wrote:

Unfortunately I didn't have any idea when such a change could or would
be merged or deployed, so not only did I not have any timeframe to give
said communities, I didn't even know when it would be appropriate to
tell them (if it happens months later, mentioning now would not be very
helpful) - or even if it ever would really happen at all.


This is so clear that anybody will understand it.

I believe mentioning potential problems when you see them coming is 
always helpful. Do it in the related bug report and share the URL with 
the affected parties, e.g. at wikitech-ambassadors. Invite them to 
follow the bug so they have the same information as you, at the same time 
as you, with the same chances of giving feedback and participating 
as you.


Ideally, by the time a deployment date can be decided they will be the 
ones communicating with their own communities. Otherwise you can 
simply go and say "Remember what we told you (link)? Ok, it's coming 
now."




You speak of an ideal world, which this is not. Those most affected by 
these things generally do not use bugzilla at all (it's not just an 
extra hassle, but given the peculiar login system it uses, many 
wikimedians have incentive to not even try), so linking the bug won't help.


For that matter, do you have any idea how *many* random proposals like 
this people come up with? Of those that make it to bugzilla at all, only 
some go through, most don't. And those that do can take months, if not 
years, to actually be implemented/merged - even after implementation, 
changes can and often do sit in gerrit for months with no indication of 
progress, even for the most trivial things.


So it seems frankly ridiculous to me to suggest effectively going around 
announcing to folks that 'hey, some things may change sometime this 
year, but then again they may not, but in the meantime you can go to 
this strange site that doesn't accept your login and follow its massive 
forms and disorganised comments!' when instead we could just... I dunno, 
maybe get more staff and other folks who have merge rights to help work 
out a real timeframe (or even if it is likely to get merged at all) 
before expecting *anyone* to announce the matter, then announce 
something more concrete. (Which would be good because communities tend 
to be more receptive to that - give them a timeframe, and they'll speak 
their minds. Give them something vague and it just teaches them to 
ignore it, since who knows if it'll ever actually be a thing.)


Alternately maybe we could just not expect the volunteers to announce 
these themselves in the first place like Niklas suggested, since on top 
of everything else he mentioned, said volunteers tend to lack some key 
details, along with access to the usual announcement methods in the 
first place.


--
-— Isarra



Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-21 Thread Federico Leva (Nemo)
Quim, you seem to be answering the question "how does one communicate 
changes", but the question of this thread is "who is responsible for 
doing so". It's quite a difference.
Usually volunteers know the communities better and have fewer problems 
with the "how" than others, but that's not the point.


Nemo


Re: [Wikitech-l] Missing project ideas for GSOC

2013-03-21 Thread Quim Gil

Thank you for the feedback!

Brion & co, I have tried to distill the essence of your comments and 
write it down as generic guidelines at


https://www.mediawiki.org/wiki/Summer_of_Code_2013#Project_ideas


On 03/20/2013 04:53 PM, Luca de Alfaro wrote:

Would there be interest in integrating the work on authorship computation?
This would not be an extension; it would be ... server-side development
that could fit well with a Summer of Code?



I can't say much since I'm not aware of that project, but what matters 
is to list it with a good description at

https://www.mediawiki.org/wiki/Mentorship_programs/Possible_projects

--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil


Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-21 Thread Quim Gil

On 03/21/2013 02:12 PM, Federico Leva (Nemo) wrote:

Quim, you seem to be answering the question how does one communicate
changes, but the question of this thread is who is responsible for
doing so. It's quite a difference.


I don't think there is a single name for this responsibility, as 
happens with so many tasks in the Wikimedia community.


Andre Klapper and I should be good contact points for bugfixes and 
other changes in MediaWiki requiring communication beyond Bugzilla / 
Gerrit. Put us in CC or contact us directly. The sooner the better. We 
will take it from there.


As for the current case(s?), Niklas or whoever else in the know: send us 
the URLs or the background so we can do something about it.


--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil


[Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-21 Thread Asher Feldman
I'd like to push for a codified set of minimum performance standards that
new mediawiki features must meet before they can be deployed to larger
wikimedia sites such as English Wikipedia, or be considered complete.

These would look like (numbers pulled out of a hat, not actual
suggestions):

- p999 (long tail) full page request latency of 2000ms
- p99 page request latency of 800ms
- p90 page request latency of 150ms
- p99 banner request latency of 80ms
- p90 banner request latency of 40ms
- p99 db query latency of 250ms
- p90 db query latency of 50ms
- 1000 write requests/sec (if applicable; write operations must be free
from concurrency issues)
- guidelines about degrading gracefully
- specific limits on total resource consumption across the stack per request
- etc..

Right now, varying amounts of effort are made to highlight potential
performance bottlenecks in code review, and engineers are encouraged to
profile and optimize their own code.  But beyond "is the site still up for
everyone / are users complaining on the village pump / am I ranting in
irc", we've offered no guidelines as to what sort of request latency is
reasonable or acceptable.  If a new feature (like aftv5, or flow) turns out
not to meet perf standards after deployment, that would be a high priority
bug and the feature may be disabled depending on the impact, or if not
addressed in a reasonable time frame.  Obviously standards like this can't
be applied to certain existing parts of mediawiki, but systems other than
the parser or preprocessor that don't meet new standards should at least be
prioritized for improvement.
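
(Standards like these are only enforceable if they can be computed from sampled latencies. A minimal sketch, using nearest-rank percentiles and thresholds echoing the explicitly made-up numbers above:)

```python
import math

def percentile(samples, p):
    # Nearest-rank percentile: the smallest sample such that at least
    # a fraction p of all samples are <= it.
    ordered = sorted(samples)
    rank = max(1, math.ceil(p * len(ordered)))
    return ordered[rank - 1]

# Budgets in ms; these echo the numbers above, which were pulled out of a hat.
BUDGETS = {0.90: 150, 0.99: 800, 0.999: 2000}

def check_latency(samples):
    # Map each percentile to True (within budget) or False (violation).
    return {p: percentile(samples, p) <= budget
            for p, budget in BUDGETS.items()}

# 1000 simulated request latencies: mostly fast, with a slow tail.
latencies = [40] * 900 + [500] * 90 + [1500] * 10
print(check_latency(latencies))  # all within budget here
```

A deployment gate could simply fail when any value in the result is False.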

Thoughts?

Asher

Re: [Wikitech-l] Reminder about the best way to link to bugs in commits

2013-03-21 Thread Krinkle
On Mar 20, 2013, at 11:59 AM, Niklas Laxström niklas.laxst...@gmail.com wrote:

 On 1 March 2013 23:46, Chad innocentkil...@gmail.com wrote:
 Bug: 1234
 Change-Id: Ia90.
 
 
 So when you do this, you're able to search for bug:1234 via Gerrit.
 By doing this, you're also removing it from the first line (which was
 our old habit, mostly from SVN days), providing you more space to
 be descriptive in that first line.
 
 Few questions:
 
 1) Why is Bug:43778 different from bug:43778 when searching?
 

Because it doesn't literally search for Bug:123 (even though in our case it 
looks that way because the footer is also Bug: 123).

There is a search operator (bug), which is linked to a footer name (Bug:) and a 
match (\\#?\\d{1,6}) for the value that is to be indexed.
Just like project, owner, branch and topic are search operators linked to 
certain values. The operators are case sensitive and always lowercase by 
convention.

The footer being clickable is done independently.
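
(For reference, in Gerrit this pairing lives in the site config. A sketch using Gerrit's trackingid and commentlink sections; the values are illustrative, not a dump of Wikimedia's actual gerrit.config:)

```ini
# gerrit.config - illustrative values only
[trackingid "bug"]
    footer = Bug:
    match = \\#?\\d{1,6}
    system = Bugzilla

# The clickable footer is configured separately:
[commentlink "bugzilla"]
    match = \\b[bB]ug:?\\s*#?(\\d+)
    link = https://bugzilla.wikimedia.org/$1
```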

-- Krinkle




Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-21 Thread Tim Starling
On 21/03/13 20:55, Niklas Laxström wrote:
 I've seen a couple of instances where changes to MediaWiki are blocked
 until someone informs the community.
 
 Someone is a volunteer.
 
 Community is actually just the Wikimedia project communities. Or at
 least the biggest ones which are expected to complain and where the
 complaining would hurt.
 
 This situation seems completely unfair to me. WMF should be able to
 communicate upcoming changes itself, not throw it to volunteers.
 Volunteers can help, but they should not be responsible for this to
 happen.

I would assume that "do it yourself" is usually code for "we don't
consider this deployment to be important enough to spend any time on
it at the moment".

Fair enough, volunteers don't have an automatic right to dictate other
people's priorities, a fact which might need to be communicated with tact.

Also, community managers generally see it as their responsibility to
extract as much work from volunteers as possible, and will ask a
volunteer to do something whether or not a WMF staff member would be
more than happy to do it.

-- Tim Starling



Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-21 Thread Matthew Walker
Asher,

Do we know what our numbers are now? That's probably a pretty good baseline
to start with as a discussion.

p99 banner request latency of 80ms

Fundraising banners? From start of page load; or is this specifically how
fast our API requests run?

On the topic of APIs; we should set similar perf goals for requests to the
API / jobs. This gets very subjective though because now we're talking
about CPU time, memory usage, HDD usage, cache key space usage -- are these
in your scope; or are we simply starting the discussion with response times?

Further down the road -- consistency is going to be important (my box will
profile differently than someone else's) so it seems like this is a good
candidate for 'yet another' continuous integration test. I can easily see
us being able to get an initial feel for response times in the
CI environment. Or maybe we should just continuously hammer the alpha/beta
servers...

On deployment though -- currently the only way I know of to see how
something is performing is to look directly at graphite -- can
icinga/something alert us -- presumably via email? Ideally we would be able
to set up new metrics as we go (obviously start with global page loads; but
maybe I want to keep an eye on banner render time). I would love to get an
email about something I've deployed under-performing.

~Matt Walker
Wikimedia Foundation
Fundraising Technology Team


On Thu, Mar 21, 2013 at 6:40 PM, Asher Feldman afeld...@wikimedia.orgwrote:

 I'd like to push for a codified set of minimum performance standards that
 new mediawiki features must meet before they can be deployed to larger
 wikimedia sites such as English Wikipedia, or be considered complete.

 These would look like (numbers pulled out of a hat, not actual
 suggestions):

 - p999 (long tail) full page request latency of 2000ms
 - p99 page request latency of 800ms
 - p90 page request latency of 150ms
 - p99 banner request latency of 80ms
 - p90 banner request latency of 40ms
 - p99 db query latency of 250ms
 - p90 db query latency of 50ms
 - 1000 write requests/sec (if applicable; writes operations must be free
 from concurrency issues)
 - guidelines about degrading gracefully
 - specific limits on total resource consumption across the stack per
 request
 - etc..

 Right now, varying amounts of effort are made to highlight potential
 performance bottlenecks in code review, and engineers are encouraged to
 profile and optimize their own code.  But beyond is the site still up for
 everyone / are users complaining on the village pump / am I ranting in
 irc, we've offered no guidelines as to what sort of request latency is
 reasonable or acceptable.  If a new feature (like aftv5, or flow) turns out
 not to meet perf standards after deployment, that would be a high priority
 bug and the feature may be disabled depending on the impact, or if not
 addressed in a reasonable time frame.  Obviously standards like this can't
 be applied to certain existing parts of mediawiki, but systems other than
 the parser or preprocessor that don't meet new standards should at least be
 prioritized for improvement.

 Thoughts?

 Asher

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-21 Thread Peter Gehres
From where would you propose measuring these data points?  Obviously
network latency will have a great impact on some of the metrics and a
consistent location would help to define the pass/fail of each test. I do
think another useful benchmark for Ops would be a set of
latency-to-datacenter values, but I know that is a much harder task. Thanks
for putting this together.


On Thu, Mar 21, 2013 at 6:40 PM, Asher Feldman afeld...@wikimedia.orgwrote:

 I'd like to push for a codified set of minimum performance standards that
 new mediawiki features must meet before they can be deployed to larger
 wikimedia sites such as English Wikipedia, or be considered complete.

 These would look like (numbers pulled out of a hat, not actual
 suggestions):

 - p999 (long tail) full page request latency of 2000ms
 - p99 page request latency of 800ms
 - p90 page request latency of 150ms
 - p99 banner request latency of 80ms
 - p90 banner request latency of 40ms
 - p99 db query latency of 250ms
 - p90 db query latency of 50ms
 - 1000 write requests/sec (if applicable; writes operations must be free
 from concurrency issues)
 - guidelines about degrading gracefully
 - specific limits on total resource consumption across the stack per
 request
 - etc..

 Right now, varying amounts of effort are made to highlight potential
 performance bottlenecks in code review, and engineers are encouraged to
 profile and optimize their own code.  But beyond is the site still up for
 everyone / are users complaining on the village pump / am I ranting in
 irc, we've offered no guidelines as to what sort of request latency is
 reasonable or acceptable.  If a new feature (like aftv5, or flow) turns out
 not to meet perf standards after deployment, that would be a high priority
 bug and the feature may be disabled depending on the impact, or if not
 addressed in a reasonable time frame.  Obviously standards like this can't
 be applied to certain existing parts of mediawiki, but systems other than
 the parser or preprocessor that don't meet new standards should at least be
 prioritized for improvement.

 Thoughts?

 Asher

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-21 Thread Yuri Astrakhan
The API is fairly complex to measure and set performance targets for. If a bot requests
5000 pages in one call, together with all links & categories, it might take
a very long time (seconds if not tens of seconds). Comparing that to
another API request that gets an HTML section of a page, which takes a
fraction of a second (especially when coming from cache), is not very
useful.
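
(One way to make batched and single-item API calls comparable at all is to normalize by the number of items returned. A trivial sketch with illustrative numbers:)

```python
def per_item_latency_ms(total_ms, items_returned):
    # Normalize a request's latency by the number of items it returned,
    # so a 5000-page batch isn't judged against a single-page fetch.
    return total_ms / max(1, items_returned)

bot_batch = per_item_latency_ms(12000, 5000)  # 12 s for 5000 pages
html_fetch = per_item_latency_ms(90, 1)       # 90 ms for one section
print(bot_batch, html_fetch)                  # 2.4 vs 90.0
```

Even then, per-item cost varies with the properties requested, so any API budget would likely need to be per query class rather than a single number.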


On Fri, Mar 22, 2013 at 1:32 AM, Peter Gehres li...@pgehres.com wrote:

 From where would you propose measuring these data points?  Obviously
 network latency will have a great impact on some of the metrics and a
 consistent location would help to define the pass/fail of each test. I do
 think another benchmark Ops features would be a set of
 latency-to-datacenter values, but I know that is a much harder taks. Thanks
 for putting this together.


 On Thu, Mar 21, 2013 at 6:40 PM, Asher Feldman afeld...@wikimedia.org
 wrote:

  I'd like to push for a codified set of minimum performance standards that
  new mediawiki features must meet before they can be deployed to larger
  wikimedia sites such as English Wikipedia, or be considered complete.
 
  These would look like (numbers pulled out of a hat, not actual
  suggestions):
 
  - p999 (long tail) full page request latency of 2000ms
  - p99 page request latency of 800ms
  - p90 page request latency of 150ms
  - p99 banner request latency of 80ms
  - p90 banner request latency of 40ms
  - p99 db query latency of 250ms
  - p90 db query latency of 50ms
  - 1000 write requests/sec (if applicable; writes operations must be free
  from concurrency issues)
  - guidelines about degrading gracefully
  - specific limits on total resource consumption across the stack per
  request
  - etc..
 
  Right now, varying amounts of effort are made to highlight potential
  performance bottlenecks in code review, and engineers are encouraged to
  profile and optimize their own code.  But beyond is the site still up
 for
  everyone / are users complaining on the village pump / am I ranting in
  irc, we've offered no guidelines as to what sort of request latency is
  reasonable or acceptable.  If a new feature (like aftv5, or flow) turns
 out
  not to meet perf standards after deployment, that would be a high
 priority
  bug and the feature may be disabled depending on the impact, or if not
  addressed in a reasonable time frame.  Obviously standards like this
 can't
  be applied to certain existing parts of mediawiki, but systems other than
  the parser or preprocessor that don't meet new standards should at least
 be
  prioritized for improvement.
 
  Thoughts?
 
  Asher