[Wikitech-l] RFC: I suggest renaming of pages Cross-site scripting = Cross-site scripting (XSS, XSSI) and Cross-site request forgery = Cross-site request forgery (CSRF)

2013-03-22 Thread Thomas Gries
I suggest renaming of pages

1. Cross-site scripting = Cross-site scripting (XSS, XSSI)
https://www.mediawiki.org/wiki/Cross-site_scripting

2. Cross-site request forgery = Cross-site request forgery (CSRF)
https://www.mediawiki.org/wiki/Cross-site_request_forgery


Before doing that, I want to be sure that you accept it. Do you support
my initiative?

Another page, part of the MW Security Guide, already has (only) XSS in
its name:

DOM-based_XSS
https://www.mediawiki.org/wiki/DOM-based_XSS

See https://www.mediawiki.org/wiki/MSG .


Rationale:

The change would have the advantage that the sections and pages in the
MediaWiki Security Guide (MSG) carry the same, more detailed page titles,
and that the commonly used abbreviations then form part of these titles.


I am asking because these pages are so important and I don't want to be
rude by simply changing them.
Can you confirm that you are fine with this change?

T.




Re: [Wikitech-l] RFC: I suggest renaming of pages Cross-site scripting = Cross-site scripting (XSS, XSSI) and Cross-site request forgery = Cross-site request forgery (CSRF)

2013-03-22 Thread K. Peachey
What actual benefit does having their abbreviation in the title achieve?


Re: [Wikitech-l] RFC: I suggest renaming of pages Cross-site scripting = Cross-site scripting (XSS, XSSI) and Cross-site request forgery = Cross-site request forgery (CSRF)

2013-03-22 Thread Thomas Gries
On 22.03.2013 07:29, K. Peachey wrote:
 What actual benefit does having their abbreviation in the title achieve?
It makes users aware at first glance at the TOC that XSS is this and
CSRF is that, if they did not yet know it.
You, as an expert, can skip over the part in parentheses.

I am the editor of that book, and I like having it in.

BTW, what is the purpose of having the title DOM-based XSS (as it is now)
instead of the more distinct and absolutely clear title
DOM-based Cross-Site Scripting (CSS)?




Re: [Wikitech-l] RFC: I suggest renaming of pages Cross-site scripting = Cross-site scripting (XSS, XSSI) and Cross-site request forgery = Cross-site request forgery (CSRF)

2013-03-22 Thread Daniel Friesen

On Thu, 21 Mar 2013 23:35:27 -0700, Thomas Gries m...@tgries.de wrote:


On 22.03.2013 07:29, K. Peachey wrote:

What actual benefit does having their abbreviation in the title achieve?

It makes users aware at first glance at the TOC that XSS is this and
CSRF is that, if they did not yet know it.
You, as an expert, can skip over the part in parentheses.

I am the editor of that book, and I like having it in.

BTW, what is the purpose of having the title DOM-based XSS (as it is now)
instead of the more distinct and absolutely clear title
DOM-based Cross-Site Scripting (CSS)?


Including the abbreviation in the title seems like quite a non-standard thing
to do. Parentheses are typically used for disambiguation, not for adding extra
versions of the title.


((Side note, XSS not CSS))

--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]



Re: [Wikitech-l] RFC: I suggest renaming of pages Cross-site scripting = Cross-site scripting (XSS, XSSI) and Cross-site request forgery = Cross-site request forgery (CSRF)

2013-03-22 Thread Thomas Gries
On 22.03.2013 08:26, Daniel Friesen wrote:
 You, as an expert, can skip over the part in parentheses.

 Including the abbreviation in the title seems like quite a non-standard
 thing to do. Parentheses are typically used for disambiguation, not for adding
 extra versions of the title.



I fixed it locally on the MSG page;

[[pagename|Pagename (something)]]

The Extension:Collection page renderer obeys my page names and - as wanted -
uses Pagename (something) as the header on pages.
So I could fix my problem locally, and no rename is needed.

Problem solved with a different solution; thread closed.

T.




Re: [Wikitech-l] RFC: I suggest renaming of pages Cross-site scripting = Cross-site scripting (XSS, XSSI) and Cross-site request forgery = Cross-site request forgery (CSRF)

2013-03-22 Thread Daniel Friesen
On 13-03-22 12:41 AM, Thomas Gries wrote:
 On 22.03.2013 08:26, Daniel Friesen wrote:
 You, as an expert, can skip over the part in parentheses.
 Including the abbreviation in the title seems like quite a non-standard
 thing to do. Parentheses are typically used for disambiguation, not for adding
 extra versions of the title.


 I fixed it locally on the MSG page;

 [[pagename|Pagename (something)]]

 The Extension:Collection page renderer obeys my page names and - as wanted -
 uses Pagename (something) as the header on pages.
 So I could fix my problem locally, and no rename is needed.

 Problem solved with a different solution; thread closed.

 T.
Editorially, I'd suggest [[Cross-site request forgery]] (CSRF).

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]


Re: [Wikitech-l] wmf12 rollback, all wikis (except test2) are on wmf11

2013-03-22 Thread Niklas Laxström
On 21 March 2013 20:11, Greg Grossmeier g...@wikimedia.org wrote:
 Tim rolled back wmf12 after a nasty bug last night:
 https://bugzilla.wikimedia.org/show_bug.cgi?id=46397

I assume this included all the extensions as well.
  -Niklas


[Wikitech-l] Making it possible to create method documentation links on MW.org

2013-03-22 Thread Daniel Friesen
Right now a lot of our links to generated documentation for PHP classes are
manually copied.


One of the reasons for this seems to be how badly Doxygen handles the
anchors to sections on individual methods.
Instead of simply using the method name, it generates an md5 hash of a
signature that is impossible to know from just a class and method name,
meaning we can't simply have a template that links to the documentation.


However Doxygen exports a tagfile that includes a list of classes, their  
methods, and the anchors used for their links.


This is currently at: (*warning* large file)
https://doc.wikimedia.org/mediawiki-core/master/php/html/tagfile.xml

What does everyone think of making it so that when Jenkins generates this
documentation, it processes the tagfile, splits it up, converts it into
multiple Lua tables, and then uses the API to update a Module: page on
mediawiki.org?


This way templates can have some Lua code that uses that data to create  
links with something like:


local classes = mw.loadData( 'DoxygenTags mediawiki-core class' )
...

function p.methodLink( className, methodName )
	local class = classes[className]
	...
	-- 'function' is a reserved word in Lua, so use the bracket syntax
	local method = members['function'][methodName]
	...
	return 'https://doc.wikimedia.org/mediawiki-core/master/php/html/'
		.. method.anchorfile .. '#' .. method.anchor
end

...


--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]



Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-22 Thread Platonides
Trying to clarify:

APC can do two things:
1) Keep the compiled PHP opcodes, so PHP execution is faster.
2) Allow the application to store values in the web server memory (kept
across requests).

ZendOptimizer only does 1.

MediaWiki only needs to be changed for 2, since 1 is done automatically
by any PHP opcode cache.

You can't use 2 once you have several servers. An alternative for 2 is
to use memcached or another cache that can be accessed from
multiple servers.

The «APC is a must-have for larger MediaWikis» advice is due to 1. In fact,
Wikimedia is not using APC for 2, but memcached.
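In LocalSettings.php terms (a minimal sketch): 1 needs no MediaWiki setting at all, while 2 is what $wgMainCacheType selects, e.g. the APC user cache on a single web server:

// 1) Opcode caching: nothing to configure in MediaWiki; APC or
//    ZendOptimizerPlus transparently cache the compiled PHP files.

// 2) Data caching: pick the store for application data.
$wgMainCacheType = CACHE_ACCEL;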



Re: [Wikitech-l] CAPTCHA

2013-03-22 Thread Platonides
On 21/03/13 08:05, Federico Leva (Nemo) wrote:
 Restrictive wikis for captchas are only a handful (plus pt.wiki which is
 in permanent emergency mode).
 https://meta.wikimedia.org/wiki/Newly_registered_user
 For them you could request confirmed flag at
 https://meta.wikimedia.org/wiki/SRP
 Personally I found it easier to do the required 10, 50 or whatever edits
 on a userpage. 5 min at most and you're done.
 
 Nemo

Their problem is likely that their accounts are new, not that those
wikis additionally require a minimum number of edits (only a handful of
wikis have that).



Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-22 Thread Thomas Gries
On 22.03.2013 12:16, Platonides wrote:
 Trying to clarify:

 APC can do two things:
 1) Keep the compiled php opcodes, so php execution is faster.
 2) Allow the application to store values in the web server memory (kept
 accross requests).

 ZendOptimizer only does 1.

 MediaWiki only needs to be changed for 2, since 1 is done automatically
 by all php opcode cache.

 You can't use 2 once you have several servers. An alternative for 2 is
 to use memcached or another cache that allows to be accessed from
 multiple servers.

 The «APC  is a must have for larger MediaWikis» is due to 1. In fact,
 wikimedia is not using APC for 2, but memcached.
Hi,

Thanks for your clarification (for me, this was almost clear, but it's
often difficult for me to explain exactly what I want).
Your last piece of info is perfect and should also go onto the mentioned
MediaWiki pages.


My user story:

As an admin, when I upgrade my PHP from 5.3.x to PHP 5.4.x,
and when I cannot or do not want to make use of APC because it is not
working with PHP 5.4.x,
and when I have successfully compiled the present ZendOptimizerPlus from
GitHub,
and when I have successfully compiled the present PHP 5.4.x from php.net,
and when I have successfully made the necessary changes in php.ini,
and PHP runs with it,

I have to change CACHE_ACCEL to CACHE_NONE in my LocalSettings.php,
and will still enjoy opcode caching by ZendOptimizerPlus,
but have no memory cache - currently.


Is this correct ?
Can the setup be improved, and how ?




Re: [Wikitech-l] Making it possible to create method documentation links on MW.org

2013-03-22 Thread Waldir Pimenta
On Fri, Mar 22, 2013 at 9:43 AM, Daniel Friesen
dan...@nadir-seen-fire.comwrote:

 Right now a lot of our links to generated documentation on php classes are
 manually copied.

 One of the reasons for this seems to be how badly Doxygen handles the
 anchors to sections on individual methods.
 Instead of simply using the method name it generates a md5 hash of a
 signature that is impossible to know simply with a class and method name.
 Meaning we can't simply have a template to link to documentation.

 However Doxygen exports a tagfile that includes a list of classes, their
 methods, and the anchors used for their links.

 This is currently at: (*warning* large file)
 https://doc.wikimedia.org/mediawiki-core/master/php/html/tagfile.xml

 What does everyone think of making it so that when Jenkins generates this
 documentation. It processes the tagfile, splits it up and converts it into
 multiple lua tables, then uses the API to update a Module: page on
 mediawiki.org.

 This way templates can have some Lua code that uses that data to create
 links with something like:

 local classes = mw.loadData( 'DoxygenTags mediawiki-core class' )
 ...

 function p.methodLink(className, methodName)
 local class = classes[className]
 ...
 local method = members.function[methodName]
 ...
 return 'https://doc.wikimedia.org/mediawiki-core/master/php/html/'
 .. method.anchorfile .. '#' .. method.anchor
 end

 ...


I would go further and suggest a way to integrate code comments into manual
pages at mediawiki.org, so that we could have good documentation in both
code and mw.org, without needing to sync it manually. For example, the
detailed description of api.php found at
https://doc.wikimedia.org/mediawiki-core/master/php/html/api_8php.html#details
could be integrated into
https://www.mediawiki.org/wiki/Manual:api.php

Similarly, it should be possible to integrate the contents of README files
and the docs folder into mw.org.

If what Daniel suggests is feasible, I assume this also is, and imo would
greatly improve the availability and quality of both in-wiki and code
documentation.

--Waldir

Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-22 Thread Tyler Romeo
On Fri, Mar 22, 2013 at 7:39 AM, Thomas Gries m...@tgries.de wrote:

 I have to change CACHE_ACCEL to CACHE_NONE in my LocalSettings.php,
 and will still enjoy opcode caching by ZendOptimizerPlus,
 but have no memory cache - currently.


 Is this correct ?
 Can the setup be improved, and how ?


Yes, this is correct. It can be improved by setting up a memcached server
(it's quick and easy, and in small wikis can even be run on the same server
as the web server, though not recommended for larger setups) and then using
that as your cache. As an alternative, you can also use CACHE_DB, which
will use the database for caching, although that doesn't really help much
since a cache miss usually means a DB query anyway.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com
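For reference, the memcached variant is only a couple of lines in LocalSettings.php (a minimal sketch; the address is a placeholder for your own daemon):

$wgMainCacheType    = CACHE_MEMCACHED;
$wgMemCachedServers = array( '127.0.0.1:11211' ); // list all servers if you run several

// The CACHE_DB alternative mentioned above:
// $wgMainCacheType = CACHE_DB;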

[Wikitech-l] Gerrit/Jenkins Verification

2013-03-22 Thread Tyler Romeo
I've noticed that sometimes Jenkins +1s changes and other times it +2s them
(for Verified, that is). Is there any specific pattern to this? It's not a
problem or anything; I'm just curious. I feel like it was explained back
when the system was changed to +1 and +2 but I forget and can't seem to
find anything in the archives.
*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

Re: [Wikitech-l] Missing project ideas for GSOC

2013-03-22 Thread Željko Filipin
Hi Quim,

comments are inline.

On Thu, Mar 21, 2013 at 12:43 AM, Quim Gil q...@wikimedia.org wrote:

 (...)
 After this filtering, we seem to be left with:
 (...)
 (If you think your project should also be considered here please speak up!)


Browser test automation[1]?


 Most of these projects seem to be extension (and PHP?) centric. Can we
 have more diversity?


Browser test automation? Not an extension, not in PHP, but in Ruby[2].


 (...) What about the mobile front?


We have some browser test automation for MobileFrontend[3].

Željko
--
[1]
https://www.mediawiki.org/wiki/Mentorship_programs/Possible_projects#Browser_Test_Automation
[2] https://github.com/wikimedia/qa-browsertests
[3]
https://github.com/wikimedia/mediawiki-extensions-MobileFrontend/tree/master/tests/acceptance

Re: [Wikitech-l] Missing project ideas for GSOC

2013-03-22 Thread Guillaume Paumier
Hi,

On Thu, Mar 21, 2013 at 12:43 AM, Quim Gil q...@wikimedia.org wrote:

 Many of the ideas listed there are too generic (Write an extension),
 improvements of existing features (Improve Extension:CSS)

This may sound naive, but why are improvements of existing features
discarded? My thinking was that, if the student didn't have to start
from scratch, they would have more time to polish their work and make
it fit with our strict standards, hence making it more likely for
their work to be merged and deployed.

(Of course, the existing code needs to be good enough not to require a
complete rewrite, but that could be decided on a case-by-case basis.)

-- 
Guillaume Paumier


Re: [Wikitech-l] Missing project ideas for GSOC

2013-03-22 Thread Yuvi Panda
I've a good number of Mobile app ideas to contribute, but do not think
I'll be able to mentor. Should I still put those in? Does putting them
in with your name attached convey 'hey, this guy might be able to
mentor?' to people looking?


--
Yuvi Panda T
http://yuvi.in/blog


Re: [Wikitech-l] Missing project ideas for GSOC

2013-03-22 Thread sankarshan
On Fri, Mar 22, 2013 at 6:16 PM, Yuvi Panda yuvipa...@gmail.com wrote:
 Does putting them
 in with your name attached convey 'hey, this guy might be able to
 mentor?' to people looking?

It usually does. At least interested students tend to assume as much.


--
sankarshan mukhopadhyay
https://twitter.com/#!/sankarshan


Re: [Wikitech-l] Missing project ideas for GSOC

2013-03-22 Thread K. Peachey
On Fri, Mar 22, 2013 at 10:30 PM, Željko Filipin zfili...@wikimedia.org wrote:
 Most of these projects seem to be extension (and PHP?) centric. Can we
 have more diversity?


 Browser test automation? Not an extension, not in PHP, but in Ruby[2].

Ops don't want [any more] Ruby on the clusters, so I wouldn't suggest that.

We should be focusing on stuff that is achievable and that can easily
be shown to benefit our users by actually getting it out there (we
have a shocking record for this).


[Wikitech-l] Glossary vs. Glossaries

2013-03-22 Thread Guillaume Paumier
Hi,

Last November, I started to clean up on the Glossary page on meta, as
an attempt to revive it and expand it to include many technical terms,
notably related to Wikimedia Engineering (see e-mail below).

There were (and are) already many glossaries spread around the wikis:
* one for MediaWiki: https://www.mediawiki.org/wiki/Manual:Glossary
* one for Wikidata: https://www.wikidata.org/wiki/Wikidata:Glossary
* one for Labs: https://wikitech.wikimedia.org/wiki/Help:Terminology
* two for the English Wikipedia:
https://en.wikipedia.org/wiki/Wikipedia:Glossary and
https://en.wikipedia.org/wiki/Wikipedia:WikiSpeak
* etc.

My thinking at the time was that it would be better to include tech
terms in meta's glossary, because fragmentation isn't a good thing for
glossaries: The user probably doesn't want to search a term through a
dozen glossaries (that they know of), and it would be easier if they
could just search in one place.

The fact is, though, that we're not going to merge all the existing
glossaries into one anytime soon, so overlap and duplication will
remain anyway. Also, it feels weird to have tech content on meta, and
the glossary is getting very long (and possibly more difficult to
maintain). Therefore, I'm now reconsidering the decision of mixing
tech terms and general movement terms on meta.

Below are the current solutions I'm seeing to move forward; I'd love
to get some feedback as to what people think would be the best way to
proceed.

* Status quo: We keep the current glossaries as they are, even if they
overlap and duplicate work. We'll manage.

* Wikidata: If Wikidata could be used to host terms and definitions
(in various languages), and wikis could pull this data using
templates/Lua, it would be a sane way to reduce duplication, while
still allowing local wikis to complement it with their own terms. For
example, administrator is a generic term across Wikimedia sites
(even MediaWiki sites), so it would go into the general glossary
repository on Wikidata; but DYK could be local to the English
Wikipedia. With proper templates, the integration between remote and
local terms could be seamless. It seems to me, however, that this
would require significant development work.

* Google custom search: Waldir recently used Google Custom Search to
create a search tool to find technical information across many pages
and sites where information is currently fragmented:
http://lists.wikimedia.org/pipermail/wikitech-l/2013-March/067450.html
. We could set up a similar tool (or a floss alternative) that would
include all glossaries. By advertising the tool prominently on
existing glossary pages (so that users know it exists), this could
allow us to curate more specific glossaries, while keeping them all
searchable with one tool.

Right now, I'm inclined to go with the custom search solution,
because it looks like the easiest and fastest to implement, while
reducing maintenance costs and remaining flexible. That said, I'd love
to hear feedback and opinions about this before implementing anything.

Thanks,

guillaume



On Tue, Nov 20, 2012 at 7:55 PM, Guillaume Paumier
gpaum...@wikimedia.org wrote:
 Hi,

 The use of jargon, acronyms and other abbreviations throughout the
 Wikimedia movement is a major source of communication issues, and
 barriers to comprehension and involvement.

 The recent thread on this list about What is Product? is an example
 of this, as are initialisms that have long been known to be a barrier
 for Wikipedia newcomers.

 A way to bridge people and communities with different vocabularies is
 to write and maintain a glossary that explains jargon in plain English
 terms. We've been lacking a good and up-to-date glossary for Wikimedia
 stuff (Foundation, chapter, movement, technology, etc.).

 Therefore, I've started to clean up and expand the outdated Glossary
 on meta, but it's a lot of work, and I don't have all the answers
 myself either. I'll continue to work on it, but I'd love to get some
 help on this and to make it a collaborative effort.

 If you have a few minutes to spare, please consider helping your
 (current and future) fellow Wikimedians by writing a few definitions
 if there are terms that you can explain in plain English. Additions of
 new terms are much welcome as well:

 https://meta.wikimedia.org/wiki/Glossary

 Some caveats:
 * As part of my work, I'm mostly interested in a glossary from a
 technical perspective, so the list currently has a technical bias. I'm
 hoping that by sending this message to a wider audience, people from
 the whole movement will contribute to the glossary and balance it out.
 * Also, I've started to clean up the glossary, but it still contains
 dated terms and definitions from a few years ago (like the FundCom),
 so boldly edit/remove obsolete content.

-- 
Guillaume Paumier
Technical Communications Manager — Wikimedia Foundation
https://donate.wikimedia.org


Re: [Wikitech-l] Missing project ideas for GSOC

2013-03-22 Thread Brian Wolff
On 2013-03-22 9:37 AM, Guillaume Paumier gpaum...@wikimedia.org wrote:

 Hi,

 On Thu, Mar 21, 2013 at 12:43 AM, Quim Gil q...@wikimedia.org wrote:
 
  Many of the ideas listed there are too generic (Write an extension),
  improvements of existing features (Improve Extension:CSS)

 This may sound naive, but why are improvements of existing features
 discarded? My thinking was that, if the student didn't have to start
 from scratch, they would have more time to polish their work and make
 it fit with our strict standards, hence making it more likely for
 their work to be merged and deployed.

 (Of course, the existing code needs to be good enough not to require a
 complete rewrite, but that could be decided on a case-by-case basis.)

 --
 Guillaume Paumier


I think improvements to existing features are fine, but they should be
existing features that are used by (or have a high potential of being used
by) the WMF. If it's a feature not used by Wikimedia, it should have an
extremely high impact on third parties to compensate.

-bawolff

Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-22 Thread Brian Wolff
On 2013-03-22 9:20 AM, Tyler Romeo tylerro...@gmail.com wrote:

 On Fri, Mar 22, 2013 at 7:39 AM, Thomas Gries m...@tgries.de wrote:

  I have to change CACHE_ACCEL to CACHE_NONE in my LocalSettings.php,
  and will still enjoy opcode caching by ZendOptimizerPlus,
  but have no memory cache - currently.
 
 
  Is this correct ?
  Can the setup be improved, and how ?
 

 Yes, this is correct. It can be improved by setting up a memcached server
 (it's quick and easy, and in small wikis can even be run on the same
server
 as the web server, though not recommended for larger setups) and then
using
 that as your cache. As an alternative, you can also use CACHE_DB, which
 will use the database for caching, although that doesn't really help much
 since a cache miss usually means a DB query anyway.

 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

Some people have claimed that CACHE_DB might even slow things down compared
to CACHE_NONE when used as the main cache type (CACHE_DB is still better than
CACHE_NONE for slow caches like the parser cache). Anyhow, you should do
some profiling when messing with caching settings (or any
performance settings) to see what is effective and what is not.

-bawolff

Re: [Wikitech-l] Gerrit/Jenkins Verification

2013-03-22 Thread Brian Wolff
On 2013-03-22 9:24 AM, Tyler Romeo tylerro...@gmail.com wrote:

 I've noticed that sometimes Jenkins +1s changes and other times it +2s
them
 (for Verified, that is). Is there any specific pattern to this? It's not a
 problem or anything; I'm just curious. I feel like it was explained back
 when the system was changed to +1 and +2 but I forget and can't seem to
 find anything in the archives.
 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

I think +1 is a merge and lint check, whereas +2 is tests that actually execute
the code, so it only happens if you're on the trusted list of users. Jenkins
posts which tests it's doing in a Gerrit comment.

-bawolff

Re: [Wikitech-l] Gerrit/Jenkins Verification

2013-03-22 Thread Tyler Romeo
On Fri, Mar 22, 2013 at 9:42 AM, Brian Wolff bawo...@gmail.com wrote:

 I think +1 is a merge and lint check where +2 is tests that actully execute
 the code, so only happens if your on the trusted list of users. Jenkins
 posts which tests its doing in a gerrit comment.

 -bawolff


Ah, OK. That makes sense. Thanks.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-22 Thread Tyler Romeo
On Fri, Mar 22, 2013 at 9:38 AM, Brian Wolff bawo...@gmail.com wrote:

 Some people have claimed that CACHE_DB might even slow things down compared
 to CACHE_NONE when used as main cache type (cache db is still better than
 cache none for slow caches like the parser cache). Anyhow you should do
 profiling type things when messing with caching settings (or any
 performance settings) to see what is effective and what is not.

 -bawolff


Wouldn't be surprised. ;) The only problem is that with CACHE_NONE, many
things (specifically, throttling mechanisms) won't work since the cache
isn't persistent across requests.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

Re: [Wikitech-l] [Wikimedia-l] Glossary vs. Glossaries

2013-03-22 Thread Federico Leva (Nemo)

Guillaume Paumier, 22/03/2013 14:27:

* Status quo: We keep the current glossaries as they are, even if they
overlap and duplicate work. We'll manage.


Ugly.



* Wikidata: If Wikidata could be used to host terms and definitions
(in various languages), and wikis could pull this data using
templates/Lua, it would be a sane way to reduce duplication, while
still allowing local wikis to complement it with their own terms. For
example, administrator is a generic term across Wikimedia sites
(even MediaWiki sites), so it would go into the general glossary
repository on Wikidata; but DYK could be local to the English
Wikipedia. With proper templates, the integration between remote and
local terms could be seamless. It seems to me, however, that this
would require significant development work.


Will take years.



* Google custom search: Waldir recently used Google Custom Search to
created a search tool to find technical information across many pages
and sites where information is currently fragmented:
http://lists.wikimedia.org/pipermail/wikitech-l/2013-March/067450.html
. We could set up a similar tool (or a floss alternative) that would
include all glossaries. By advertising the tool prominently on
existing glossary pages (so that users know it exists), this could
allow us to curate more specific glossaries, while keeping them all
searchable with one tool.


+1


Right now, I'm inclined to go with the custom search solution,
because it looks like the easiest and fastest to implement, while
reducing maintenance costs and remaining flexible. That said, I'd love
to hear feedback and opinions about this before implementing anything.


Any solution that helps kill overlap and duplication is welcome.
Having four slightly different versions of the same glossary
(mediawiki.org, wikibooks, wikipedia, meta) plus countless accessories
means that none does the job.


Nemo


Re: [Wikitech-l] wmf12 rollback, all wikis (except test2) are on wmf11

2013-03-22 Thread Brad Jorsch
On Fri, Mar 22, 2013 at 4:28 AM, Niklas Laxström
niklas.laxst...@gmail.com wrote:
 On 21 March 2013 20:11, Greg Grossmeier g...@wikimedia.org wrote:
 Tim rolled back wmf12 after a nasty bug last night:
 https://bugzilla.wikimedia.org/show_bug.cgi?id=46397

 I assume this included all the extension as well.

Tim simply put everything except test2wiki back to wmf11. So yes.


Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-22 Thread Thomas Gries
just one message, just arrived:

http://php.net/archive/2013.php#id2013-03-21-1

PHP 5.5.0 beta1 available 21-Mar-2013

The PHP development team announces the release of the first beta of PHP
5.5.0. This release is the first to include the Zend OPCache. Please
help our efforts to provide a stable PHP version and test this version
carefully against several different applications, with Zend OPCache
enabled and report any bug in the bug tracking system.
THIS IS A DEVELOPMENT PREVIEW - DO NOT USE IT IN PRODUCTION!



Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-22 Thread Brian Wolff
On 2013-03-22 10:45 AM, Tyler Romeo tylerro...@gmail.com wrote:

 On Fri, Mar 22, 2013 at 9:38 AM, Brian Wolff bawo...@gmail.com wrote:

  Some people have claimed that CACHE_DB might even slow things down
compared
  to CACHE_NONE when used as main cache type (cache db is still better
than
  cache none for slow caches like the parser cache). Anyhow you should do
  profiling type things when messing with caching settings (or any
  performance settings) to see what is effective and what is not.
 
  -bawolff
 

 Wouldn't be surprised. ;) The only problem is that with CACHE_NONE, many
 things (specifically, throttling mechanisms) won't work since the cache
 isn't persistent across requests.

 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

That would be a MediaWiki bug though. Does throttling actually work with
CACHE_DB now? I remember it used to work only with the memcached backend.
Anyway, if that's been fixed, throttling should be changed to use CACHE_ANY
so it actually works in all configs.

-bawolff
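As a rough sketch of what that would mean in code (CACHE_ANYTHING is the actual constant behind "CACHE_ANY"; the function and key names are made up for illustration):

function exampleHitThrottle( $userName, $max = 5, $period = 3600 ) {
	// CACHE_ANYTHING falls back to whatever store is configured (APC, DB, ...),
	// so the counter persists across requests even with $wgMainCacheType = CACHE_NONE.
	$cache = wfGetCache( CACHE_ANYTHING );
	$key = wfMemcKey( 'example-throttle', md5( $userName ) );
	$cache->add( $key, 0, $period ); // no-op if the counter already exists
	$count = $cache->incr( $key );   // bump the attempt counter
	return $count > $max;            // true = the caller should refuse the action
}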

Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-22 Thread Tyler Romeo
On Fri, Mar 22, 2013 at 10:29 AM, Brian Wolff bawo...@gmail.com wrote:

 That would be a mediawiki bug though. Does throtling actually work with
 cache_db now? I remember it used to only work with the memcached backend.
 Anyways if that's been fixed, throtling should be changed to use CACHE_ANY
 so it actually works in all configs.


Theoretically I'm thinking it should work with CACHE_DB. But yeah, the fact
that the default main cache is CACHE_NONE is unsettling.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

Re: [Wikitech-l] Making it possible to create method documentation links on MW.org

2013-03-22 Thread Daniel Friesen
On Fri, 22 Mar 2013 05:05:13 -0700, Waldir Pimenta wal...@email.com  
wrote:
I would go further and suggest a way to integrate code comments into manual
pages at mediawiki.org, so that we could have good documentation in both
code and mw.org, without needing to sync it manually. For example, the
detailed description of api.php found at
https://doc.wikimedia.org/mediawiki-core/master/php/html/api_8php.html#details
could be integrated into
https://www.mediawiki.org/wiki/Manual:api.php

Similarly, the contents of README files and the docs folder should be
integratable into mw.org.

If what Daniel suggests is feasible, I assume this also is, and imo would
greatly improve the availability and quality of both in-wiki and code
documentation.

--Waldir


This information isn't available in the current Doxygen output. The tagfile
provides an index, but that kind of content is only available in the full
HTML output. You might be able to get at it if we turned on XML output,
but it would have a ridiculous amount of extra cruft you won't want and
will take a lot of work to format.


--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]



Re: [Wikitech-l] Missing project ideas for GSOC

2013-03-22 Thread Chris McMahon
On Fri, Mar 22, 2013 at 6:55 AM, K. Peachey p858sn...@gmail.com wrote:

 On Fri, Mar 22, 2013 at 10:30 PM, Željko Filipin zfili...@wikimedia.org
 wrote:
  Most of these projects seem to be extension (and PHP?) centric. Can we
  have more diversity?
 
 
  Browser test automation? Not an extension, not in PHP, but in Ruby[2].

 Ops don't want [any more] Ruby on the clusters, so I wouldn't suggest that.

 We should be focusing on stuff that is achievable and that can easily
 be shown to benefit our users by actually getting it out there (we
 have a shocking record for this).


It's not on the cluster; we manage these in Gerrit[0] and run them
completely openly on a hosted service, https://wmf.ci.cloudbees.com/.
This is after many long discussions with various ops folks over the past
year.

These tests consistently find regression problems[1], and this week alone
found regression issues with PageTriage and GuidedTour.  It is achieved, it
is demonstrably of benefit, it is definitely out there.

[0]https://gerrit.wikimedia.org/r/#/admin/projects/qa/browsertests
[1] http://www.mediawiki.org/wiki/QA/Browser_testing/Examples

Re: [Wikitech-l] Making it possible to create method documentation links on MW.org

2013-03-22 Thread Antoine Musso
On 22/03/13 10:43, Daniel Friesen wrote:
 What does everyone think of making it so that when Jenkins generates
 this documentation. It processes the tagfile, splits it up and converts
 it into multiple lua tables, then uses the API to update a Module: page
 on mediawiki.org.

If you can come back with an even more complicated suggestion, I am all
for it.  Meanwhile, let's point people to doc.wikimedia.org and let
everyone directly use the Doxygen output instead of mw.org.


-- 
Antoine hashar Musso



Re: [Wikitech-l] Making it possible to create method documentation links on MW.org

2013-03-22 Thread Tim Landscheidt
Antoine Musso hashar+...@free.fr wrote:

 What does everyone think of making it so that when Jenkins generates
 this documentation. It processes the tagfile, splits it up and converts
 it into multiple lua tables, then uses the API to update a Module: page
 on mediawiki.org.

 If you can come back with an even more complicated suggestion, I am all
 for it.  Meanwhile lets point people to doc.wikimedia.org and let
 everyone directly use Doxygen output instead of mw.org.

I think Daniel's idea has some merit to it, though obviously
you are (very) right about the unnecessary complexity.

How about a simple CGI on doc.wikimedia.org that provides
redirects?  Or, even better and more reusable by other
projects: Make Doxygen output predictable anchors in addition
to the hash ones?

Tim
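Such a CGI could be a few lines of PHP reading the tagfile Daniel mentioned. A sketch, assuming the usual compound/member/anchorfile/anchor layout of Doxygen tagfiles and a hypothetical redirect.php?class=...&method=... endpoint:

$class  = isset( $_GET['class'] ) ? $_GET['class'] : '';
$method = isset( $_GET['method'] ) ? $_GET['method'] : '';

// tagfile.xml is the index Jenkins already publishes next to the HTML output.
$tags = simplexml_load_file( 'tagfile.xml' );
foreach ( $tags->compound as $compound ) {
	if ( (string)$compound['kind'] !== 'class' || (string)$compound->name !== $class ) {
		continue;
	}
	foreach ( $compound->member as $member ) {
		if ( (string)$member['kind'] === 'function' && (string)$member->name === $method ) {
			header( 'Location: https://doc.wikimedia.org/mediawiki-core/master/php/html/'
				. $member->anchorfile . '#' . $member->anchor );
			exit;
		}
	}
}
header( 'HTTP/1.0 404 Not Found' );

In practice the parsed tagfile would be cached rather than re-read on every request, given how large it is.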



Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-22 Thread Kevin Israel
On 03/22/2013 07:16 AM, Platonides wrote:
 APC can do two things:
 1) Keep the compiled php opcodes, so php execution is faster.
 2) Allow the application to store values in the web server memory (kept
 accross requests).
 
 ZendOptimizer only does 1. [...]
 The «APC  is a must have for larger MediaWikis» is due to 1. In fact,
 wikimedia is not using APC for 2, but memcached.

With one exception: a [live hack][1] to use apc_inc() instead of rand()
to generate a 32-bit TRANS-ID for HTCP cache purging.

Why is this hack in place? Is it particularly useful for [monitoring
packet loss][2]?

[1]:
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blobdiff;h=897397e41fe14bdf7dcd02eb61a9744880a5e1a3;hb=b7bc01d0ccea6a6a817aed31d781ce6693ee9417;hpb=1256724550556e5e35810bb88b20ef87dbe1ce47

[2]:
https://svn.wikimedia.org/viewvc/mediawiki/trunk/udpmcast/htcpseqcheck.py?view=markup

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand
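For anyone wondering what the hack amounts to, the idea is roughly this (an illustration, not the actual diff):

// Draw the HTCP TRANS-ID from a shared APC counter instead of rand(), so the
// IDs sent by one web server are sequential and a monitor such as
// htcpseqcheck.py can spot gaps, i.e. lost purge packets.
$id = apc_inc( 'htcp-transid' );
if ( $id === false ) {
	// Counter not initialised on this server yet: seed it at a random offset.
	apc_add( 'htcp-transid', mt_rand( 0, 0x7fffffff ) );
	$id = apc_inc( 'htcp-transid' );
}
$transId = $id & 0xffffffff; // keep it within the 32-bit TRANS-ID field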


[Wikitech-l] OAuth critique

2013-03-22 Thread Yuri Astrakhan
There was a discussion recently about OAuth, and I just saw this blog post
http://insanecoding.blogspot.com/2013/03/oauth-great-way-to-cripple-your-api.html
(posted on slashdot:
http://tech.slashdot.org/story/13/03/22/1439235/a-truckload-of-oauth-issues-that-would-make-any-author-quit)
with some heavy criticisms. I am not an expert in OAuth and do not yet have
a pro/against position; this is more of an FYI for those interested.

--yurik

Re: [Wikitech-l] OAuth critique

2013-03-22 Thread Tyler Romeo
Most of those concerns are valid. Daniel Friesen has managed to convince
me that OAuth is absolutely horrible, and that we will probably have to
make our own authentication framework.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com


On Fri, Mar 22, 2013 at 11:59 AM, Yuri Astrakhan
yastrak...@wikimedia.orgwrote:

 There was a discussion recently about OAuth, and I just saw this blog
 post
 http://insanecoding.blogspot.com/2013/03/oauth-great-way-to-cripple-your-api.html
 
 (posted
 on slashdot
 http://tech.slashdot.org/story/13/03/22/1439235/a-truckload-of-oauth-issues-that-would-make-any-author-quit
 )
 with some heavy criticisms. I am not an expert in OAuth and do not yet have
 a pro/against position, this is more of an FYI for those interested.

 --yurik

Re: [Wikitech-l] OAuth critique

2013-03-22 Thread Gerard Meijssen
Hoi,
MAY I QUOTE YOU ???
Thanks,
 GerardM


On 22 March 2013 17:11, Tyler Romeo tylerro...@gmail.com wrote:

 Most of those concerns are valid. Daniel Friesnen has managed to convince
 me that OAuth is absolutely horrible, and that we will probably have to
 make our own authentication framework.

 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com


 On Fri, Mar 22, 2013 at 11:59 AM, Yuri Astrakhan
 yastrak...@wikimedia.orgwrote:

  There was a discussion recently about OAuth, and I just saw this blog
  post
 
 http://insanecoding.blogspot.com/2013/03/oauth-great-way-to-cripple-your-api.html
  
  (posted
  on slashdot
 
 http://tech.slashdot.org/story/13/03/22/1439235/a-truckload-of-oauth-issues-that-would-make-any-author-quit
  )
  with some heavy criticisms. I am not an expert in OAuth and do not yet
 have
  a pro/against position, this is more of an FYI for those interested.
 
  --yurik


Re: [Wikitech-l] RFC: I suggest renaming of pages Cross-site scripting = Cross-site scripting (XSS, XSSI) and Cross-site request forgery = Cross-site request forgery (CSRF)

2013-03-22 Thread Chris Steipp
On Thu, Mar 21, 2013 at 11:35 PM, Thomas Gries m...@tgries.de wrote:
 On 22.03.2013 07:29, K. Peachey wrote:
 What actual benefit does having their abbreviation in the title achieve?
 It makes users aware at first glance at the TOC that XSS is this and
 CSRF is that, if they did not yet know it.
 You, as an expert, can skip over the part in parentheses.

 I am the editor of that book, and I like having it in.

 BTW, what is the purpose of having the title DOM-based XSS (as it is now)
 instead of the more distinct and absolutely clear title
 DOM-based Cross-Site Scripting (CSS)?

Cross-Site Scripting (DOM-based) might be a better name for that
page? I didn't put much thought into it before naming it originally.

It would be good to make the names consistent. Thanks for working on
this, Thomas.


Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-22 Thread Sumana Harihareswara
On 03/21/2013 09:54 PM, Tim Starling wrote:

 Also, community managers generally see it as their responsibility to
 extract as much work from volunteers as possible

The community managers for MediaWiki (Quim and me) don't think like
this.  If you believe we do, please say so.  :-)

 and will ask a
 volunteer to do something whether or not a WMF staff member would be
 more than happy to do it.

 -- Tim Starling

You're implying that we should always check whether there is a WMF
staffer available and eager to do a particular task before asking a
volunteer to help out.  That would be impractical, and would prevent us
from helping eager volunteers learn.

However, that's separate from the question in *this* thread about who is
and who ought to be responsible for managing and communicating about
certain kinds of blockages.  I'd say, as a first approximation: release
managers for WMF and for MediaWiki (so, Greg & hexmode), and product
managers, who are staff members and volunteers.

-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation


Re: [Wikitech-l] OAuth critique

2013-03-22 Thread Chris Steipp
I think the caricature of OAuth there should be taken with a grain of
salt. The author talks about OAuth, but seems to be referring to
OAuth 2 primarily, which is very different from OAuth 1. Also, the
author says that the protocol was designed for authorizing
website-to-website communication, but then says it's insecure in a
desktop app environment, which it is. They also point to the (very
good) article about using OAuth for authentication, which again, the
protocol was not designed for.

So yes, if you don't use the protocol in the way it's intended,
absolutely it's insecure. The same can be said for AES encryption
(like if you use it in cbc mode to protect predictable messages).
Should you trust a system just because it's using OAuth? Definitely
not. But is it insecure just because it's using OAuth? I would say no.
If you disagree, you can even get paid if you can find a flaw in
Facebook's implementation, so you should take them up on it :)



On Fri, Mar 22, 2013 at 9:11 AM, Tyler Romeo tylerro...@gmail.com wrote:
 Most of those concerns are valid. Daniel Friesnen has managed to convince
 me that OAuth is absolutely horrible, and that we will probably have to
 make our own authentication framework.

 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com


 On Fri, Mar 22, 2013 at 11:59 AM, Yuri Astrakhan
 yastrak...@wikimedia.orgwrote:

 There was a discussion recently about OAuth, and I just saw this blog
 post
 http://insanecoding.blogspot.com/2013/03/oauth-great-way-to-cripple-your-api.html
 
 (posted
 on slashdot
 http://tech.slashdot.org/story/13/03/22/1439235/a-truckload-of-oauth-issues-that-would-make-any-author-quit
 )
 with some heavy criticisms. I am not an expert in OAuth and do not yet have
 a pro/against position, this is more of an FYI for those interested.

 --yurik


Re: [Wikitech-l] OAuth critique

2013-03-22 Thread Brion Vibber
On Fri, Mar 22, 2013 at 8:59 AM, Yuri Astrakhan
yastrak...@wikimedia.org wrote:
 There was a discussion recently about OAuth, and I just saw this blog
 posthttp://insanecoding.blogspot.com/2013/03/oauth-great-way-to-cripple-your-api.html
 (posted
 on 
 slashdothttp://tech.slashdot.org/story/13/03/22/1439235/a-truckload-of-oauth-issues-that-would-make-any-author-quit)
 with some heavy criticisms. I am not an expert in OAuth and do not yet have
 a pro/against position, this is more of an FYI for those interested.

OAuth has ... plenty of issues ... ;) but it has its place.

That place is *specifically* in authorizing third-party web
applications to get partial access on behalf of a user without getting
unfettered access to their credentials -- something that should be
useful for wiki-related tools such as on Toolserver and Labs, or on
other third-party hosting.

It shouldn't be used for mobile or desktop apps. It can't replace
CentralAuth. It can't replace login. It can't replace OpenID. And it
shouldn't be shoved into any of those things where it won't fit. :)

-- brion


[Wikitech-l] Reminder: Lua/Scribunto IRC office hours today

2013-03-22 Thread Guillaume Paumier
Hi,

Just a quick reminder that we'll be holding IRC office hours in about
50 minutes. If you have questions about how to use Lua, or issues
you'd like help with, join us in #wikimedia-office on Freenode.

More information about how to connect to IRC is available at
https://meta.wikimedia.org/wiki/IRC_office_hours

On Wed, Mar 13, 2013 at 7:38 PM, Guillaume Paumier
gpaum...@wikimedia.org wrote:
 Greetings,

 As you might have seen on the Wikimedia tech blog (article included below)
 or the tech ambassadors list, a new functionality called Lua is being
 enabled on all Wikimedia sites today. Lua is a scripting language that
 enables Wikimedia editors to write faster and more powerful MediaWiki
 templates.

 If you have questions about how to convert existing templates to Lua (or how
 to create new ones), we'll be holding two support sessions on IRC next week:
 one on Wednesday (for Oceania, Asia & America) and one on Friday (for
 Europe, Africa & America); see m:IRC office hours for details. If you can't
 make it, you can also get help at mw:Talk:Lua scripting.

 If you'd like to learn about this kind of events earlier in advance,
 consider becoming a Tech ambassador by subscribing to the mailing list.



 =

 New Lua templates bring faster, more flexible pages to your wiki

 Posted by Sumana Harihareswara on March 11th, 2013

 Starting Wednesday, March 13th, you’ll be able to make wiki pages even more
 useful, no matter what language you speak: we’re adding Lua as a templating
 language. This will make it easier for you to create and change infoboxes,
 tables, and other useful MediaWiki templates. We’ve already started to
 deploy Scribunto (the MediaWiki extension that enables this); it’s on
 several of the sites, including English Wikipedia, right now.

 You’ll find this useful for performing more complex tasks for which
 templates are too complex or slow — common examples include numeric
 computations, string manipulation and parsing, and decision trees. Even if
 you don’t write templates, you’ll enjoy seeing pages load faster and with
 more interesting ways to present information.

 Background

 MediaWiki developers introduced templates and parser functions years ago to
 allow end-users of MediaWiki to replicate content easily and build tools
 using basic logic. Along the way, we found that we were turning wikitext
 into a limited programming language. Complex templates have caused
 performance issues and bottlenecks, and it’s difficult for users to write
 and understand templates. Therefore, the Lua scripting project aims to make
 it possible for MediaWiki end-users to use a proper scripting language that
 will be more powerful and efficient than ad-hoc, parser functions-based
 logic. The example of Lua’s use in World of Warcraft is promising; even
 novices with no programming experience have been able to make large changes
 to their graphical experiences by quickly learning some Lua.

 Lua on your wiki

 As of March 13th, you’ll be able to use Lua on your home wiki (if it’s not
 already enabled). Lua code can be embedded into wiki templates by employing
 the {{#invoke:}} parser function provided by the Scribunto MediaWiki
 extension. The Lua source code is stored in pages called modules (e.g.,
 Module:Bananas). These individual modules are then invoked on template
 pages. The example: Template:Lua hello world uses the code
 {{#invoke:Bananas|hello}} to print the text “Hello, world!”. So, if you
 start seeing edits in the Module namespace, that’s what’s going on.

 Getting started

 Check out the basic “hello, world!” instructions, then look at Brad Jorsch’s
 short presentation for a basic example of how to convert a wikitext template
 into a Lua module. After that, try Tim Starling’s tutorial.

 To help you preview and test a converted template, try
 Special:TemplateSandbox on your wiki. With it, you can preview a page using
 sandboxed versions of templates and modules, allowing for easy testing
 before you make the sandbox code live.

 Where to start? If you use pywikipedia, try parsercountfunction.py by
 Bináris, which helps you find wikitext templates that currently parse slowly
 and thus would be worth converting to Lua. Try fulfilling open requests for
 conversion on English Wikipedia, possibly using Anomie’s Greasemonkey script
 to help you see the performance gains. On English Wikipedia, some of the
 templates have already been converted — feel free to reuse them on your
 wiki.

 The Lua hub on mediawiki.org has more information; please add to it. And
 enjoy your faster, more flexible templates!

 Sumana Harihareswara, Engineering Community Manager

 =


 --
 Guillaume Paumier
 Technical Communications Manager — Wikimedia Foundation
 https://donate.wikimedia.org



-- 
Guillaume Paumier
Technical Communications Manager — Wikimedia Foundation
https://donate.wikimedia.org


Re: [Wikitech-l] OAuth critique

2013-03-22 Thread Daniel Friesen

Oh yay, I actually convinced someone.

This post is a little different from mine: a random spattering of
high-level qualms with it - OAuth 2 not being a protocol, flow issues
(though a little debatable), and some stuff about enterprise that, besides
being irrelevant to us, sounds like berating the taste of an apple
because it doesn't taste like an orange.


For reference this was my overview of the issues with both the OAuth 1 and  
OAuth 2 standards:

https://www.mediawiki.org/wiki/OAuth/Issues

I didn't get round to an actual specification. But in the interest of
writing one, a while ago I went over every user flow I could think of an
auth system having, made notes and comments on each of them, and then
decided which ones should be rejected.

https://github.com/dantman/protoauth-spec/blob/master/auth-flows.md

--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]

On Fri, 22 Mar 2013 09:11:06 -0700, Tyler Romeo tylerro...@gmail.com  
wrote:



Most of those concerns are valid. Daniel Friesnen has managed to convince
me that OAuth is absolutely horrible, and that we will probably have to
make our own authentication framework.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com


On Fri, Mar 22, 2013 at 11:59 AM, Yuri Astrakhan
yastrak...@wikimedia.orgwrote:


There was a discussion recently about OAuth, and I just saw this blog
post
http://insanecoding.blogspot.com/2013/03/oauth-great-way-to-cripple-your-api.html

(posted
on slashdot
http://tech.slashdot.org/story/13/03/22/1439235/a-truckload-of-oauth-issues-that-would-make-any-author-quit
)
with some heavy criticisms. I am not an expert in OAuth and do not yet have
a pro/against position; this is more of an FYI for those interested.

--yurik



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] OAuth critique

2013-03-22 Thread Matthew Flaschen
On 03/22/2013 12:48 PM, Chris Steipp wrote:
 I think the caricature of OAuth there should be taken with a grain of
 salt. The author talks about OAuth, but seems to be referring to
 OAuth 2 primarily, which is very different from OAuth 1. Also, the
 author says that the protocol was designed for authorizing
 website-to-website communication, but then says it's insecure in a
 desktop app environment, which it is. They also point to the (very
 good) article about using OAuth for authentication, which again, the
 protocol was not designed for.

I agree.  There are valid issues with OAuth, but the article is way over
the top, and some of the statements, like:

Third party software cannot run automated processes on an OAuth API.

are flat out false.

That's exactly how services like IFTTT and Zapier work.  They require a
one-time authentication step, then can run in the background automated
forever (or until revoked).
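
To make that concrete: once the access token from the one-time authorization
step is stored, an unattended job simply signs each request with it. A minimal
sketch using Python's requests_oauthlib; all keys, secrets and the endpoint
below are placeholders, not a real configuration.

# Sketch only: credentials come from a one-time interactive authorization
# and are then stored; nothing here prompts for a password.
from requests_oauthlib import OAuth1Session

api = OAuth1Session(
    client_key="consumer-key",             # placeholder
    client_secret="consumer-secret",       # placeholder
    resource_owner_key="access-token",     # placeholder, stored after grant
    resource_owner_secret="access-secret", # placeholder, stored after grant
)

# This can run from cron indefinitely, until the token is revoked server-side.
resp = api.get(
    "https://example.org/w/api.php",
    params={"action": "query", "meta": "userinfo", "format": "json"},
)
print(resp.json())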

A web site can embed a web browser via a Java Applet or similar, or
have a web browser server side which presents the OAuth log in page to
the user, but slightly modified to have all the data entered pass
through the third party site. Therefore OAuth doesn't even fulfill its
own primary security objective!

is a bit silly, since Java applets are increasingly being sandboxed and
just completely disabled/uninstalled, and some users can certainly tell
the difference between a weird Java browser and a popup in their main
browser.

The biggest real issue is probably the optional components, but I sense
that sites are already forming de facto profiles (i.e. new sites
gravitate toward particular components).

Also it is common that OAuth implementations are using security tokens
which expire, meaning the boss will need to keep reentering his Calendar
credentials again and again.

I don't know of any that requires you to enter your password again.
Some require automatic token renewal, and with others (again, an
increasing number, based on what I can see) the token lasts until
revocation.

Matt Flaschen

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC: Alternative domains for Commons

2013-03-22 Thread Arthur Richards
+ops


On Thu, Mar 21, 2013 at 8:20 AM, Juliusz Gonera jgon...@wikimedia.orgwrote:

 We've been having a hard time making photo uploads work in
 MobileFrontend because of CentralAuth's third party cookies problem (we
 upload them from Wikipedia web site to Commons API). Apart from the
 newest Firefox [1,2], mobile Safari also doesn't accept third party
 cookies unless the domain has been visited and it already has at least
 one cookie set.

 Even though we have probably found a solution for now, it's a very shaky
 and inelegant workaround that might stop working at any time (if some
 detail of the default browser cookie policy changes again) [3].

 I came up with another idea of how this could be solved. The problem we
 have right now is that Commons is on a completely different domain than
 Wikipedia, so they can't share the login token cookie. However, we could
 set up alternative domains for Commons, such as commons.wikipedia.org,
 and then the cookie could be shared.

 The only issue I see with this solution is that we would have to
 prevent messing up SEO (having multiple URLs pointing to the same
 resource). This, however, could be avoided by redirecting every
 non-API request to the main domain (commons.wikimedia.org) and only
 allowing API requests on alternative domains (which is what we use for
 photo uploads on mobile).

 This obviously doesn't solve the broader problem of CentralAuth's common
 login being broken, but at least would allow easy communication between
 Commons and other projects. In my opinion this is the biggest problem
 right now. Users can probably live without being automatically logged in
 to other projects, but photo uploads on mobile are just broken when we
 can't use Commons API.

 Please let me know what you think. Are there any other possible
 drawbacks of this solution that I missed?

 [1] 
 http://webpolicy.org/2013/02/**22/the-new-firefox-cookie-**policy/http://webpolicy.org/2013/02/22/the-new-firefox-cookie-policy/
 [2] https://developer.mozilla.org/**en-US/docs/Site_Compatibility_**
 for_Firefox_22https://developer.mozilla.org/en-US/docs/Site_Compatibility_for_Firefox_22
 [3] 
 https://gerrit.wikimedia.org/**r/#/c/54813/https://gerrit.wikimedia.org/r/#/c/54813/

 --
 Juliusz

 __**_
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/**mailman/listinfo/wikitech-lhttps://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-22 Thread Arthur Richards
Right now, I think many of us profile locally or in VMs, which can be
useful for relative metrics or quickly identifying bottlenecks, but doesn't
really get us the kind of information you're talking about from any sort of
real-world setting, or in any way that would be consistent from engineer to
engineer, or even necessarily from day to day. From network topology to
article counts/sizes/etc and everything in between, there's a lot we can't
really replicate or accurately profile against. Are there plans to put
together and support infrastructure for this? It seems to me that this
proposal is contingent upon a consistent environment accessible by
engineers for performance testing.


On Thu, Mar 21, 2013 at 10:55 PM, Yuri Astrakhan
yastrak...@wikimedia.orgwrote:

 The API is fairly complex to measure and set performance targets for. If a bot
 requests 5000 pages in one call, together with all links & categories, it might
 take a very long time (seconds if not tens of seconds). Comparing that to
 another API request that gets an HTML section of a page, which takes a
 fraction of a second (especially when coming from cache), is not very
 useful.


 On Fri, Mar 22, 2013 at 1:32 AM, Peter Gehres li...@pgehres.com wrote:

  From where would you propose measuring these data points?  Obviously
  network latency will have a great impact on some of the metrics and a
  consistent location would help to define the pass/fail of each test. I do
  think another benchmark for Ops would be a set of
  latency-to-datacenter values, but I know that is a much harder task.
 Thanks
  for putting this together.
 
 
  On Thu, Mar 21, 2013 at 6:40 PM, Asher Feldman afeld...@wikimedia.org
  wrote:
 
   I'd like to push for a codified set of minimum performance standards
 that
   new mediawiki features must meet before they can be deployed to larger
   wikimedia sites such as English Wikipedia, or be considered complete.
  
   These would look like (numbers pulled out of a hat, not actual
   suggestions):
  
   - p999 (long tail) full page request latency of 2000ms
   - p99 page request latency of 800ms
   - p90 page request latency of 150ms
   - p99 banner request latency of 80ms
   - p90 banner request latency of 40ms
   - p99 db query latency of 250ms
   - p90 db query latency of 50ms
   - 1000 write requests/sec (if applicable; writes operations must be
 free
   from concurrency issues)
   - guidelines about degrading gracefully
   - specific limits on total resource consumption across the stack per
   request
   - etc..
  
   Right now, varying amounts of effort are made to highlight potential
   performance bottlenecks in code review, and engineers are encouraged to
   profile and optimize their own code.  But beyond is the site still up
  for
   everyone / are users complaining on the village pump / am I ranting in
   irc, we've offered no guidelines as to what sort of request latency is
   reasonable or acceptable.  If a new feature (like aftv5, or flow) turns
  out
   not to meet perf standards after deployment, that would be a high
  priority
   bug and the feature may be disabled depending on the impact, or if not
   addressed in a reasonable time frame.  Obviously standards like this
  can't
   be applied to certain existing parts of mediawiki, but systems other
 than
   the parser or preprocessor that don't meet new standards should at
 least
  be
   prioritized for improvement.
  
   Thoughts?
  
   Asher
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org/mailman/listinfo/wikitech-l
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC: Alternative domains for Commons

2013-03-22 Thread MZMcBride
Juliusz Gonera wrote:
We've been having a hard time making photo uploads work in
MobileFrontend because of CentralAuth's third party cookies problem (we
upload them from Wikipedia web site to Commons API). Apart from the
newest Firefox [1,2], mobile Safari also doesn't accept third party
cookies unless the domain has been visited and it already has at least
one cookie set.

Even though we have probably found a solution for now, it's a very shaky
and not elegant workaround which might stop working any time (if some
detail of default browser cookie policy changes again) [3].

I came up with another idea of how this could be solved. The problem we
have right now is that Commons is on a completely different domain than
Wikipedia, so they can't share the login token cookie. However, we could
set up alternative domains for Commons, such as commons.wikipedia.org,
and then the cookie could be shared.

The only issue I see with this solution is that we would have to
prevent messing up SEO (having multiple URLs pointing to the same
resource). This, however, could be avoided by redirecting every
non-API request to the main domain (commons.wikimedia.org) and only
allowing API requests on alternative domains (which is what we use for
photo uploads on mobile).

This obviously doesn't solve the broader problem of CentralAuth's common
login being broken, but at least would allow easy communication between
Commons and other projects. In my opinion this is the biggest problem
right now. Users can probably live without being automatically logged in
to other projects, but photo uploads on mobile are just broken when we
can't use Commons API.

Please let me know what you think. Are there any other possible
drawbacks of this solution that I missed?

[1] http://webpolicy.org/2013/02/22/the-new-firefox-cookie-policy/
[2] 
https://developer.mozilla.org/en-US/docs/Site_Compatibility_for_Firefox_22
[3] https://gerrit.wikimedia.org/r/#/c/54813/

Hi Juliusz,

Please draft an RFC at https://www.mediawiki.org/wiki/RFC. :-)

commons.wikipedia.org already redirects to commons.wikimedia.org (for
historical reasons, maybe), so that has to be considered. I think what
you're proposing is also kind of confusing and I'm wondering if there
aren't better ways to approach the problem.

A good RFC will lay out the underlying components in a Background
section, the problem you're attempting to solve in a Problem section,
and then offer possible solutions in a Proposals section. Variants on
this also usually work.

MZMcBride



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC: Alternative domains for Commons

2013-03-22 Thread Brian Wolff
On 2013-03-22 5:22 PM, MZMcBride z...@mzmcbride.com wrote:

 Juliusz Gonera wrote:
 We've been having a hard time making photo uploads work in
 MobileFrontend because of CentralAuth's third party cookies problem (we
 upload them from Wikipedia web site to Commons API). Apart from the
 newest Firefox [1,2], mobile Safari also doesn't accept third party
 cookies unless the domain has been visited and it already has at least
 one cookie set.
 
 Even though we have probably found a solution for now, it's a very shaky
 and not elegant workaround which might stop working any time (if some
 detail of default browser cookie policy changes again) [3].
 
 I came up with another idea of how this could be solved. The problem we
 have right now is that Commons is on a completely different domain than
 Wikipedia, so they can't share the login token cookie. However, we could
 set up alternative domains for Commons, such as commons.wikipedia.org,
 and then the cookie could be shared.
 
 The only issue I see with this solution is that we would have to
 prevent messing up SEO (having multiple URLs pointing to the same
 resource). This, however, could be avoided by redirecting every
 non-API request to the main domain (commons.wikimedia.org) and only
 allowing API requests on alternative domains (which is what we use for
 photo uploads on mobile).
 
 This obviously doesn't solve the broader problem of CentralAuth's common
 login being broken, but at least would allow easy communication between
 Commons and other projects. In my opinion this is the biggest problem
 right now. Users can probably live without being automatically logged in
 to other projects, but photo uploads on mobile are just broken when we
 can't use Commons API.
 
 Please let me know what you think. Are there any other possible
 drawbacks of this solution that I missed?
 
 [1] http://webpolicy.org/2013/02/22/the-new-firefox-cookie-policy/
 [2]
 
https://developer.mozilla.org/en-US/docs/Site_Compatibility_for_Firefox_22
 [3] https://gerrit.wikimedia.org/r/#/c/54813/

 Hi Juliusz,

 Please draft an RFC at https://www.mediawiki.org/wiki/RFC. :-)

 commons.wikipedia.org already redirects to commons.wikimedia.org (for
 historical reasons, maybe), so that has to be considered. I think what
 you're proposing is also kind of confusing and I'm wondering if there
 aren't better ways to approach the problem.

 A good RFC will lay out the underlying components in a Background
 section, the problem you're attempting to solve in a Problem section,
 and then offer possible solutions in a Proposals section. Variants on
 this also usually work.

 MZMcBride



 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

IMO this sounds like a hacky solution. Also, it doesn't work for wikis that
are not Commons.

That said I don't have a better solution atm.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC: Alternative domains for Commons

2013-03-22 Thread Brion Vibber
On Fri, Mar 22, 2013 at 1:22 PM, MZMcBride z...@mzmcbride.com wrote:
 commons.wikipedia.org already redirects to commons.wikimedia.org (for
 historical reasons, maybe), so that has to be considered. I think what
 you're proposing is also kind of confusing and I'm wondering if there
 aren't better ways to approach the problem.

The proposal is to continue to redirect everything *except* API
requests, but to allow the API requests to complete and run as though
they were requested on commons.wikimedia.org.

This would create a new local session cookie on commons.wikipedia.org
based on the *.wikipedia.org CentralAuth session cookie, but this
should be harmless (roughly equivalent to logging into Commons on two
browsers at once).

Of course, in order to use the same functionality on Wikisource,
Wikiversity, Wikivoyage, mediawiki.org etc we'd need similar alternate
commons subdomains under those domains.

-- brion

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-22 Thread MZMcBride
Asher Feldman wrote:
I'd like to push for a codified set of minimum performance standards that
new mediawiki features must meet before they can be deployed to larger
wikimedia sites such as English Wikipedia, or be considered complete.

These would look like (numbers pulled out of a hat, not actual
suggestions):

[...]

Thoughts?

Hi.

Once you have numbers from a non-hat source, please draft an RFC at
https://www.mediawiki.org/wiki/RFC. :-)

MZMcBride 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC: Alternative domains for Commons

2013-03-22 Thread Daniel Zahn
fyi, we have all these:

DNS:

root@sockpuppet:~/pdns-templates# ls -l | grep commons
lrwxrwxrwx 1 root root 13 Jun  7  2012 wikimediacommons.co.uk -> wikimedia.com
lrwxrwxrwx 1 root root 13 Jun  7  2012 wikimediacommons.eu -> wikimedia.com
lrwxrwxrwx 1 root root 13 Jun  7  2012 wikimediacommons.info -> wikimedia.com
lrwxrwxrwx 1 root root 13 Jul 19  2012 wikimediacommons.jp.net -> wikimedia.com
lrwxrwxrwx 1 root root 13 Jul 19  2012 wikimediacommons.mobi -> wikimedia.com
lrwxrwxrwx 1 root root 13 Jun  7  2012 wikimediacommons.net -> wikimedia.com
lrwxrwxrwx 1 root root 13 Jun  7  2012 wikimediacommons.org -> wikimedia.com

Apache:

/apache-config$ grep commons redirects.conf
wikimediacommons.co.uk *.wikimediacommons.co.uk \
wikimediacommons.eu *.wikimediacommons.eu \
wikimediacommons.info *.wikimediacommons.info \
wikimediacommons.jp.net *.wikimediacommons.jp.net \
wikimediacommons.mobi *.wikimediacommons.mobi \
wikimediacommons.net *.wikimediacommons.net \
wikimediacommons.org *.wikimediacommons.org \
wikisource.com *.wikisource.com commons.wikipedia.org \
www.commons.wikipedia.org www.commons.wikimedia.org \
RewriteRule ^/welcometowikipedia$
http://commons.wikimedia.org/wiki/File:Welcome_to_Wikipedia_brochure_EN.pdf
[R=301,L]
RewriteRule ^/instructorbasics$
http://commons.wikimedia.org/wiki/File:Instructor_Basics_How_to_Use_Wikipedia_as_a_Teaching_Tool.pdf
[R=301,L]
RewriteCond %{HTTP_HOST}
(^|\.)wikimediacommons.(net|info|mobi|eu|org|jp\.net)$
RewriteRule ^(.*)$ http://commons.wikimedia.org$1 [R=301,L,NE]
RewriteCond %{HTTP_HOST} (^|\.)wikimediacommons.co.uk$
RewriteRule ^(.*)$ http://commons.wikimedia.org$1 [R=301,L]
RewriteCond %{HTTP_HOST} =commons.wikipedia.org [OR]
RewriteCond %{HTTP_HOST} =www.commons.wikimedia.org
RewriteRule ^(.*)$ http://commons.wikimedia.org$1 [R=301,L,NE]

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] CAPTCHA

2013-03-22 Thread Steven Walling
On Wed, Mar 20, 2013 at 8:23 PM, James Heilman jmh...@gmail.com wrote:

 Hey All

 I have someone helping me add translation done by Translators Without
 Borders of key medical articles. An issue that slows the work is that
 many languages require a CAPTCHA to save the edits. Is there any way
 around this (i.e. to get an account confirmed in all languages)?


This doesn't quite solve your problem, but one enhancement that may reduce
frustration is the addition of a refresh button on the CAPTCHA (
https://bugzilla.wikimedia.org/show_bug.cgi?id=14230).

This is slowly but surely being worked on at
https://gerrit.wikimedia.org/r/#/c/44376/

Steven
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-22 Thread Steven Walling
On Thu, Mar 21, 2013 at 6:40 PM, Asher Feldman afeld...@wikimedia.orgwrote:

 Right now, varying amounts of effort are made to highlight potential
 performance bottlenecks in code review, and engineers are encouraged to
 profile and optimize their own code.  But beyond is the site still up for
 everyone / are users complaining on the village pump / am I ranting in
 irc, we've offered no guidelines as to what sort of request latency is
 reasonable or acceptable.  If a new feature (like aftv5, or flow) turns out
 not to meet perf standards after deployment, that would be a high priority
 bug and the feature may be disabled depending on the impact, or if not
 addressed in a reasonable time frame.  Obviously standards like this can't
 be applied to certain existing parts of mediawiki, but systems other than
 the parser or preprocessor that don't meet new standards should at least be
 prioritized for improvement.

 Thoughts?


As a features product manager, I am totally behind this. I don't take
adding another potential blocker lightly, but performance is a feature, and
not a minor one. For me the hurdle to taking this more seriously, beyond
just "is this thing unusably/annoyingly slow when testing it?", has always
been a way to reliably measure performance, set goals, and a set of
guidelines.

Like MZ suggests, I think the place to discuss that is in an RFC on
mediawiki.org, but in general I want to say that I support creating a
reasonable set of guidelines based on data.
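
For illustration, once latencies are actually being sampled, checking them
against targets like the p90/p99 numbers Asher floated is the easy part. A
minimal sketch (Python 3.8+ for statistics.quantiles; the latency sample below
is synthetic and the thresholds are just the proposed page request numbers):

# Sketch: evaluate p90/p99 request latency against the proposed targets.
# The latency sample is randomly generated for illustration only.
import random
import statistics

latencies_ms = [random.lognormvariate(4.5, 0.6) for _ in range(10_000)]

cuts = statistics.quantiles(latencies_ms, n=100)  # 99 percentile cut points
p90, p99 = cuts[89], cuts[98]

print(f"p90 = {p90:.0f} ms (target 150), p99 = {p99:.0f} ms (target 800)")
if p90 > 150 or p99 > 800:
    print("FAIL: outside the proposed page request latency budget")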

Steven
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-22 Thread Brian Wolff
 been a way to reliably measure performance...


The measure part is important. As it stands I have no way of measuring code
in action (sure, I can set up profiling locally, and actually have, but it's
not the same; OTOH I barely ever look at the local profiling I did set
up...). People throw around words like graphite, but unless I'm mistaken, us
non-staff folks do not have access to whatever that may be.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] 1.21wmf12 re-deployed (was Re: wmf12 rollback, all wikis (except test2) are on wmf11)

2013-03-22 Thread Greg Grossmeier
quote name=Greg Grossmeier date=2013-03-21 time=11:11:15 -0700
 We're still diagnosing/etc.

Thanks to Aaron Schulz for debugging, with help from Chris Steipp and
testing from Aude, we fixed the issue.

We now have re-deployed 1.21wmf12 to the phase 1 and 2 wikis (see:
https://www.mediawiki.org/wiki/MediaWiki_1.21/Roadmap )

To see the changes/reverts that were made, see the log on gerrit, here:
https://gerrit.wikimedia.org/r/#/q/status:merged+project:mediawiki/core+branch:wmf/1.21wmf12+topic:wmf/1.21wmf12,n,z

(if that doesn't work, try http://goo.gl/Rw3nF )


Thanks, all, and have a good weekend,

Greg

-- 
| Greg Grossmeier            GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg           A18D 1138 8E47 FAC8 1C7D |

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] 1.21wmf12 re-deployed (was Re: wmf12 rollback, all wikis (except test2) are on wmf11)

2013-03-22 Thread Paul Selitskas
May this be connected with the deployment?

https://www.wikidata.org/wiki/Wikidata:Project_chat#Special:ItemByTitle_changes_.22_.22_with_.22_.22_in_.22site.22


On Sat, Mar 23, 2013 at 12:26 AM, Greg Grossmeier g...@wikimedia.orgwrote:

 quote name=Greg Grossmeier date=2013-03-21 time=11:11:15 -0700
  We're still diagnosing/etc.

 Thanks to Aaron Schulz for debugging with help from Chris Steipp and
 Aude testing we fixed the issue.

 We now have re-deployed 1.21wmf12 to the phase 1 and 2 wikis (see:
 https://www.mediawiki.org/wiki/MediaWiki_1.21/Roadmap )

 To see the changes/reverts that were made, see the log on gerrit, here:

 https://gerrit.wikimedia.org/r/#/q/status:merged+project:mediawiki/core+branch:wmf/1.21wmf12+topic:wmf/1.21wmf12,n,z

 (if that doesn't work, try http://goo.gl/Rw3nF )


 Thanks, all, and have a good weekend,

 Greg

 --
 | Greg Grossmeier            GPG: B2FA 27B1 F7EB D327 6B8E |
 | identi.ca: @greg           A18D 1138 8E47 FAC8 1C7D |

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
Regards,
Павел Селіцкас/Pavel Selitskas
Wizardist @ Wikimedia projects
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] 1.21wmf12 re-deployed (was Re: wmf12 rollback, all wikis (except test2) are on wmf11)

2013-03-22 Thread Greg Grossmeier
quote name=Paul Selitskas date=2013-03-23 time=00:32:28 +0300
 May this be connected with the deployment?
 
 https://www.wikidata.org/wiki/Wikidata:Project_chat#Special:ItemByTitle_changes_.22_.22_with_.22_.22_in_.22site.22

I don't believe so. We had some issues with lua/wikidata related to this
issue, but the underlying issue was the same (we believe).

See the related bugs:
https://bugzilla.wikimedia.org/show_bug.cgi?id=46397
and
https://bugzilla.wikimedia.org/show_bug.cgi?id=46427

Greg

-- 
| Greg Grossmeier            GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg           A18D 1138 8E47 FAC8 1C7D |

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-22 Thread Matthew Walker

 People throw around words like graphite, but unless im mistaken us
 non staff folks do not have access to whatever that may be.

Graphite refers to the cluster performance logger available at:
http://graphite.wikimedia.org/

Anyone with a labs account can view it -- which, as a committer, you have
(it's the same as your Gerrit login).

otoh i barely ever look at the local profiling i did set up...

This problem still exists with graphite; you have to look at it for it to
do any good :)

~Matt Walker
Wikimedia Foundation
Fundraising Technology Team


On Fri, Mar 22, 2013 at 2:17 PM, Brian Wolff bawo...@gmail.com wrote:

  been a way to reliably measure performance...
 

 The measure part is important. As it stands I have no way of measuring code
 in action (sure i can set up profiling locally, and actually have but its
 not the same [otoh i barely ever look at the local profiling i did set
 up...). People throw around words like graphite, but unless im mistaken us
 non staff folks do not have access to whatever that may be.

 -bawolff
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-22 Thread Brian Wolff
On 2013-03-22 6:46 PM, Matthew Walker mwal...@wikimedia.org wrote:

 
  People throw around words like graphite, but unless im mistaken us
  non staff folks do not have access to whatever that may be.

 Graphite refers to the cluster performance logger available at:
 http://graphite.wikimedia.org/

 Anyone with a labs account can view it -- which as a commiter you do (it's
 the same as your Gerrit login.)

I've tried. My labs login doesn't work.

More generally, since labs accounts are free to make, what is the point of
password-protecting it?


 otoh i barely ever look at the local profiling i did set up...

 This problem still exists with graphite; you have to look at it for it to
 do any good :)

That's lame ;)

-bawolff

 ~Matt Walker
 Wikimedia Foundation
 Fundraising Technology Team


 On Fri, Mar 22, 2013 at 2:17 PM, Brian Wolff bawo...@gmail.com wrote:

   been a way to reliably measure performance...
  
 
  The measure part is important. As it stands I have no way of measuring
code
  in action (sure i can set up profiling locally, and actually have but
its
  not the same [otoh i barely ever look at the local profiling i did set
  up...). People throw around words like graphite, but unless im mistaken
us
  non staff folks do not have access to whatever that may be.
 
  -bawolff
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-22 Thread Greg Grossmeier
Hello Niklas and all,

quote name=Niklas Laxström date=2013-03-21 time=11:55:24 +0200
 I've seen a couple of instances where changes to MediaWiki are blocked
 until someone informs the community.

Just so I know, could you share these? Either on list or in a private
email to me if you don't want to cause more stress in a sensitive
situation.

But, generally, on this issue, I think we need to make this better.

I have some ideas that I want to work on, but, I've been mostly getting
caught up on things thus far (I'm one month in now, pretty soon that
excuse no longer flies!).

For one thing, I'm the one who condenses down all of the changes that
happened in a wmfXX release to the most important ones, see, eg:
https://www.mediawiki.org/wiki/MediaWiki_1.21/wmf12

(NOTE: that page doesn't yet include the reverts we did to fix the Page
Move issue.)

Now, here's the problem with the current process around the creation of
that page:

0) On every other Monday morning (Pacific time) Reedy picks a
commit on master and says "right here, this is wmfXX".

1) That gets deployed to our phase 1 list of wikis (mediawiki.org,
test., test2.)

2) The Release Notes page for that wmfXX is created with the list of
changes.

3) I then start my review of it. This usually takes me about an hour of
concerted effort. There are a lot of changes and I'm unfamiliar with
about 100% of them (some bugs I may be aware of, but not all).

4) I update that release notes page with the important/breaking changes.

5) Now, from here, what should I/we do? There is now a reasonably good
list of important changes for a specific wmfXX release, with references
to bug reports (usually). I don't know EXACTLY which things will be
important to various communities/wikis and I don't want to be noisy
about things. So, your suggestions are welcome on whether I should do
something after that release notes page is done, and if so, what.


Now, this is just one part of the process, yes. But it is one that I
have a large hand in.

Official X.XX releases of MW are more in Mark H's hands, but I don't
think that's the situation you're talking about here.


Thanks,

Greg

-- 
| Greg Grossmeier            GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg           A18D 1138 8E47 FAC8 1C7D |

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Deployment Highlights - Week of March 25th

2013-03-22 Thread Greg Grossmeier
Hello!

This is your weekly preview of higher-risk or general "you should be
aware of" items for the slew of deployments coming in the near term.


== During the week of March 25th ==

* Wikidata Phase 2 is going to test2 on Monday
** Then on Wednesday, if all goes well, it is going to about 10% of
   projects' page views, these wikis:
*** it, he, hu, ru, tr, uk, uz, hr, bs, sr, sh 

* AFTv5 is taking a longer window than normal on Tuesday (9am - 1pm
  Pacific)

* Lightning Deploys on M/T/W/Th at 4pm! :)


Full schedule:
https://wikitech.wikimedia.org/wiki/Deployments#Week_of_March_25th


Best,

Greg

-- 
| Greg Grossmeier            GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg           A18D 1138 8E47 FAC8 1C7D |

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Missing project ideas for GSOC

2013-03-22 Thread phoebe ayers
On Fri, Mar 22, 2013 at 5:36 AM, Guillaume Paumier
gpaum...@wikimedia.org wrote:
 Hi,

 On Thu, Mar 21, 2013 at 12:43 AM, Quim Gil q...@wikimedia.org wrote:

  Many of the ideas listed there are too generic ("Write an extension"),
  improvements of existing features ("Improve Extension:CSS")

 This may sound naive, but why are improvements of existing features
 discarded? My thinking was that, if the student didn't have to start
 from scratch, they would have more time to polish their work and make
 it fit with our strict standards, hence making it more likely for
 their work to be merged and deployed.

I have some ideas for existing features and extensions that could use
a good summer's work, and I just added one of them to the page,
despite not having any ability to personally mentor -- I assumed if
there was interest it could get picked up by someone. I hope this is
ok!

-- phoebe

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] JobQueue

2013-03-22 Thread John
I know Aaron has spent a lot of time on the job queue. But I have
several observations and would like some feedback. The current workers
apparently select jobs from the queue at random. A FIFO method would
make far more sense. We have some jobs that can sit there in the queue
for extended periods of time, while others added after that point may
get completed in a mere few minutes.

Second, exposing job_timestamp via the API should also assist in
identifying issues: whether some job is being ignored or the particular
wiki is just extremely lagged.

I am monitoring a template modification to file links that occurred
about  hours ago; with a job queue between 350,000 and 500,000 items,
this delay seems excessive.
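
For what it's worth, the overall backlog can at least be watched from the
outside via the action API's site statistics, which include a job count
(the figure is only an approximation). A minimal sketch in Python; the wiki
URL below is just an example:

# Sketch: poll the (approximate) job queue length for a wiki.
import requests

API = "https://commons.wikimedia.org/w/api.php"  # example wiki

params = {
    "action": "query",
    "meta": "siteinfo",
    "siprop": "statistics",
    "format": "json",
}
stats = requests.get(API, params=params).json()["query"]["statistics"]
print("approximate jobs queued:", stats["jobs"])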

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l