Re: [Wikitech-l] Jake requests enabling access and edit access to Wikipedia via TOR

2013-12-31 Thread Federico Leva (Nemo)
Sob, perennial discussions. Personally, I consider this issue solved: a
global policy is now in place that allows global exemptions via email requests.

https://meta.wikimedia.org/wiki/NOP

Nemo


Re: [Wikitech-l] Jake requests enabling access and edit access to Wikipedia via TOR

2013-12-31 Thread Martijn Hoekstra
Does Jake have any mechanism in mind to prevent abuse? Is there any
possible mechanism available to prevent abuse?


On Tue, Dec 31, 2013 at 12:09 AM, Tyler Romeo tylerro...@gmail.com wrote:

 On Mon, Dec 30, 2013 at 6:08 PM, Rjd0060 rjd0060.w...@gmail.com wrote:

  Shouldn't the discussion *not* be happening on Bugzilla, but somewhere
  where the wider community is actually present?  Perhaps Meta?
 

 Well the issue is not whether we want Tor users editing or not. We do. The
 issue is finding a software solution that makes it possible.

 --
 Tyler Romeo
 Stevens Institute of Technology, Class of 2016
 Major in Computer Science


Re: [Wikitech-l] npm publish

2013-12-31 Thread Antoine Musso
On 30/12/13 15:57, Dan Andreescu wrote:
 We needed to write an npm module for mediawiki oauth, and we're about to
 publish it to the npm registry.  I just wanted to check whether we had a
 wikimedia account on https://npmjs.org/.  If not, do we want one?

Hello,

I would say: be bold :-D  I created a 'mediawiki' account on
packagist.org, which is the equivalent of npmjs for PHP Composer.  I have
put the credentials on fenari in /home/wikipedia/doc.  You might want to
do something similar.

cheers,

-- 
Antoine hashar Musso



[Wikitech-l] Fwd: Proposal for biweekly Labs showcase

2013-12-31 Thread Diederik van Liere
Bumping my proposal. I am particularly looking forward to responses from
community members.
Have an awesome New Year's Eve!

Best,
Diederik


Heya,


I just posted an initial proposal to start running a biweekly showcase to
feature all the cool things that are happening on Labs. Please have a look
at https://wikitech.wikimedia.org/wiki/Showcase, express your interest and
chime in on the Talk page to help get this off the ground.

Best,
Diederik

Re: [Wikitech-l] npm publish

2013-12-31 Thread Max Semenik
On 31.12.2013, 15:54 Antoine wrote:

 I have
 put the credentials on fenari in /home/wikipedia/doc.  You might want to
 do something similar.

Move to some host that will not die soon?

-- 
Best regards,
  Max Semenik ([[User:MaxSem]])



Re: [Wikitech-l] npm publish

2013-12-31 Thread Jeremy Baron
On Tue, Dec 31, 2013 at 3:26 PM, Max Semenik maxsem.w...@gmail.com wrote:
 Move to some host that will not die soon?

fenari has been the canonical place for these docs for a while. (AIUI)

IMHO, there should be a clearly defined home for them, and not a
situation where people have to guess/hunt for the right host to look on
(i.e. not packagist and dozens of other things on fenari, with npmjs
alone on tin/bast1001). There is already a ticket to move these docs
away from fenari (RT 6402), and it seems like fenari is still the right
place to put new docs as long as it's still their canonical home.

-Jeremy


Re: [Wikitech-l] npm publish

2013-12-31 Thread Antoine Musso
On 31/12/13 16:26, Max Semenik wrote:
 On 31.12.2013, 15:54 Antoine wrote:
 
 I have
 put the credentials on fenari in /home/wikipedia/doc.  You might want to
 do something similar.
 
 Move to some host that will not die soon?

Hopefully they will be migrated out of fenari just like we had them
moved from good old zwinger :]


-- 
Antoine hashar Musso



Re: [Wikitech-l] Google Code-in update (almost there!)

2013-12-31 Thread Quim Gil
Yesterday I couldn't send the weekly report. Here it is, although the
data is from Monday, to keep weekly consistency.

Summary: look at this!
https://www.mediawiki.org/wiki/File:Wikimedia_at_Google_Code-in_2013.png

Google Code-in: 6 weeks past, LESS THAN A WEEK TO GO!

https://www.mediawiki.org/wiki/Google_Code-In

We have 219 tasks completed, 17 in progress, and 42 open.

Org admins Andre and I can't express how thankful we are for the great
team collaboration and mentorship achieved thanks to the contributions
of many official and unofficial mentors in the GCI site, Gerrit, and
Bugzilla. In a couple of months we have learned many good lessons that
we will be able to apply not only in the next GCI edition, but also (and
mainly) in our processes for welcoming new contributors with appropriate
tasks.

We are also getting good feedback from the GCI students who have
survived with us to this stage. As an indication, we currently have
about half as many students as we had after the first week of GCI, yet
they are completing just as many tasks. In fact, we had some problems
related to excess productivity (e.g. two students going after the same
task), but things seem to be relaxing a bit now. We can't blame these
students; GCI rules implicitly promote this hypercompetitiveness, and
the students are young and full of energy.  :)

For the first time I can say that we don't need new mentors, and we
don't actually need new tasks (although additions of coding tasks are
still welcome, and they are still being taken by some brilliant young
developers). Thank you to everybody helping. Let's finish the last sprint!

-- 
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil




[Wikitech-l] MediaWiki Language Extension Bundle 2013.12 release

2013-12-31 Thread Kartik Mistry
Hello,

Happy New Year to all!

I would like to announce the release of MediaWiki Language Extension
Bundle 2013.12. This bundle is compatible with the MediaWiki 1.21.3 and
1.22.0 releases.

* Download: 
https://translatewiki.net/mleb/MediaWikiLanguageExtensionBundle-2013.12.tar.bz2
* sha256sum:  2ef19e31d685dd0d6e50088ec8bbb1cd3e74c694096dfa333efb61dcfee9c3c0

Quick links:
* Installation instructions are at: https://www.mediawiki.org/wiki/MLEB
* Announcements of new releases will be posted to a mailing list:
https://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
* Report bugs to: https://bugzilla.wikimedia.org
* Talk with us at: #mediawiki-i18n @ Freenode

Release notes for each extension are below.

-- Kartik Mistry

== Babel, CLDR and LocalisationUpdates ==
* Only localization updates.

== CleanChanges ==
* Bug 58439: Fix the Monobook skin by removing an extra closing div tag
that created invalid HTML.
* Localization updates.

== Translate ==
=== Noteworthy changes ===
* Translator Sandbox:
** The Translator Sandbox is a feature that simplifies the sign-up process
for new translators. It can be enabled by setting $wgTranslateUseSandbox =
true;, but it will only be useful in conjunction with TwnMainPage, which is
not included in MLEB (at least not yet). A combined configuration sketch
appears after the UniversalLanguageSelector notes below.
** Stash and translator approval pages are now usable.
** When a user is accepted, a bare-bones user page with a Babel box is
created.

* TUX editor:
** Bug 53748: Message-group-related calls were made two orders of
magnitude faster, making Special:Translate's group selector, workflow
state selector, etc. fast enough even on wikis with hundreds or
thousands of groups.
** Added useful shortcuts to the TUX editor. All buttons that insert or
replace text (insertables first, then translation aids, in order;
"discard changes" is skipped) are numbered from 1 upwards and can be
accessed with both Alt+number and the up and down arrow keys.
** Update the status bar at the bottom of the TUX message tables when
saving or proofreading.

* Added support for migrating the MediaWiki localisation format from PHP
files to JSON files. This is part of the Localisation format RFC
(https://www.mediawiki.org/wiki/Requests_for_comment/Localisation_format);
an illustrative before/after sketch follows this list.
** Add support for JSON in mediawiki-defines.txt, allowing the
handling of MediaWiki extensions that use the new format. Documentation
about the updates in MessageGroup configuration can be found at:
https://www.mediawiki.org/wiki/Help:Extension:Translate/Group_configuration/MediaWiki
** Fixed JsonFFS to properly import and export authors.
** targetPattern in group configuration is now optional.
** Migrated the Translate extension itself to JSON-based i18n files
from PHP files. This change is transparent to all users of the extension.
* Support XLIFF 1.1 files in addition to XLIFF 1.2, though without
validation.
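
For illustration, here is what the migration described above looks like;
the extension name and message key below are invented for the example, and
only the file shapes are meant to be accurate. Old-style PHP i18n file
(e.g. MyExtension.i18n.php):

    $messages = array();
    $messages['en'] = array(
        'myextension-desc' => 'Adds an example feature',
    );

New-style JSON i18n file (e.g. i18n/en.json):

    {
        "@metadata": {
            "authors": [ "Example Author" ]
        },
        "myextension-desc": "Adds an example feature"
    }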

=== Breaking changes ===
Special:FirstSteps has been removed from Translate. If you want the
feature, you'll have to download it separately.

== UniversalLanguageSelector ==
=== Noteworthy changes ===
* Language selector now opens an order of magnitude faster (see
https://github.com/wikimedia/jquery.uls/pull/122 for details).

* Initial support for webfonts in MobileFrontend:
** Currently, support for webfonts in MobileFrontend is disabled by
default. It can be enabled by setting $wgULSMobileWebfontsEnabled =
true; in LocalSettings.php file. Note that, even then it will only be
enabled for users who has opted in for the mobile beta mode.
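
As a combined illustration of the two opt-in features mentioned in these
notes, a minimal LocalSettings.php sketch might look like this (both
settings are disabled by default; the rest of the wiki's configuration is
assumed to already be in place):

    # Enable the Translator Sandbox; only useful together with the
    # TwnMainPage extension, which is not included in MLEB.
    $wgTranslateUseSandbox = true;

    # Enable webfonts in MobileFrontend; takes effect only for users
    # who have opted in to the mobile beta mode.
    $wgULSMobileWebfontsEnabled = true;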

=== Fonts ===
* Bug 57767: Enable Amiri font for Soranî language.
* Bug 58381, 58382: Add Lateef and Scheherazade fonts for Soranî and
Farsi languages.
* Update Autonym font to version 20131205:
** Fixed Autonym for Old Slavonic.
** Fixed glyphs for the Glagolitic script.
** Removed the Reserved Font Name (RFN); documentation updates.

-- 
Kartik Mistry/કાર્તિક મિસ્ત્રી | IRC: kart_
{kartikm, 0x1f1f}.wordpress.com


[Wikitech-l] RFC on PHP profiling

2013-12-31 Thread Chad
I'm starting a new RFC to discuss ways we can improve our PHP profiling.

https://www.mediawiki.org/wiki/Requests_for_comment/Better_PHP_profiling

Please feel free to help expand and/or comment on the talk page if you've
got ideas :)

-Chad

Re: [Wikitech-l] RFC on PHP profiling

2013-12-31 Thread Erik Bernhardson
I have also been thinking about the profiling system recently. xhprof is a
really nice library which I use locally (quick plug: try the vagrant
`xhprof` role). There are quite a few wfProfileIn/Out calls currently that
generate sub-millisecond events. I am fairly certain these could be tracked
more accurately with a PHP extension like xhprof.
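
As a rough sketch (not how MediaWiki wires this up; the workload function
and output path are made up for illustration), using the xhprof extension
directly looks something like this:

    // Requires the xhprof PHP extension to be installed and loaded.
    xhprof_enable( XHPROF_FLAGS_CPU | XHPROF_FLAGS_MEMORY );

    runTheCodeUnderTest(); // hypothetical workload

    $data = xhprof_disable();
    // Persist the raw run so it can be rendered with the xhprof UI later.
    file_put_contents( '/tmp/example.xhprof', serialize( $data ) );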

I still see wfProfileIn/Out data as useful, though: in one example, for
generating high-level views for developers into what is happening (another
plug: I wrote a visualization of wfProfileIn/Out data for the debug toolbar
last weekend [1]), and also for the ?forceprofile=1 and ?forcetrace=1
options in production.
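
For comparison, the DIY instrumentation is the usual wfProfileIn/Out
pairing found throughout core; a minimal sketch (the function itself is
made up):

    function doExpensiveThing() {
        wfProfileIn( __METHOD__ );
        // ... the work we want to appear in the profiling output ...
        wfProfileOut( __METHOD__ );
    }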

I think there is room for both our DIY approach and xhprof (but probably
not Xdebug in production; it's way too slow when generating traces).

Erik B.

[1] https://gerrit.wikimedia.org/r/#/c/104318/



On Tue, Dec 31, 2013 at 9:55 AM, Chad innocentkil...@gmail.com wrote:

 I'm starting a new RFC to discuss ways we can improve our PHP profiling.

 https://www.mediawiki.org/wiki/Requests_for_comment/Better_PHP_profiling

 Please feel free to help expand and/or comment on the talk page if you've
 got ideas :)

 -Chad

Re: [Wikitech-l] Gerrit command line tool

2013-12-31 Thread Jon Robson
This now lives here:
https://github.com/jdlrobson/GerritCommandLine

Thanks for all the feedback so far.


On Fri, Dec 27, 2013 at 11:02 AM, Jon Robson jdlrob...@gmail.com wrote:

 I added a license amongst other things in this follow up patchset:
 https://gerrit.wikimedia.org/r/104122


 On Fri, Dec 27, 2013 at 7:32 AM, Tim Landscheidt 
 t...@tim-landscheidt.de wrote:

 Jon Robson jdlrob...@gmail.com wrote:

  I was getting very bored of the tedious job of working out what to code
  review and then how to pull it down, so I made a tool in the
  MobileFrontend repository which lets me do all this from the comfort of
  the command line [1]. With the script (and a couple of python installs,
  as referenced in the command line) you simply run:

  make gerrit

  and it prints a list of options:
  http://imgur.com/e5JBnQI

  Simply type the number of the commit you want to review and it will pull
  it down for you.

  The list of reviews is ordered by the following mechanism (which could
  be tweaked):
  * Patches closest to being merged appear at the top, e.g. +1s at the
  top, -2s at the bottom.
  * If patches have an equal code review score, the older ones get
  preference, as that only seems fair.

  It currently doesn't allow you to do reviews from the command line, but
  that would be a pretty awesome addition.

  Let me know your thoughts.

  [1] https://gerrit.wikimedia.org/r/#/c/103884/

 Very neat!  Could you release it under Apache 2.0 so that it
 can be merged upstream?

 Tim






 --
 Jon Robson
 * http://jonrobson.me.uk
 * https://www.facebook.com/jonrobson
 * @rakugojon




-- 
Jon Robson
* http://jonrobson.me.uk
* https://www.facebook.com/jonrobson
* @rakugojon

Re: [Wikitech-l] $wgDBmysql5

2013-12-31 Thread Platonides

On 30/12/13 23:21, Tyler Romeo wrote:

As the subject implies, there is a variable named $wgDBmysql5. I am
inclined to believe that the purpose of this variable is to use features
only available in MySQL 5.0 or later. The specific implementation affects
the encoding in the database.

https://www.mediawiki.org/wiki/Manual:$wgDBmysql5

Right now MediaWiki installation requirements say you need MySQL 5.0.2 or
later. So my question is: is this configuration variable still necessary,
or should it be deprecated?


I think it is still necessary, for schemas created without it.



[Wikitech-l] Captcha filter list

2013-12-31 Thread George Herbert
Just got a report and screenshot that a new user got this string for their
captcha on en.wikipedia: "nigerblew".

http://snag.gy/JpSUR.jpg

Though several people are pointing out that Niger is a country, I think
it's reasonable to try to avoid things close to the two-g version of that
word; nobody's denigrated by avoiding possibly offensive (if
misinterpreted) words.  I recall there's a filter list?...


-- 
-george william herbert
george.herb...@gmail.com

Re: [Wikitech-l] Captcha filter list

2013-12-31 Thread Steven Walling
On Tue, Dec 31, 2013 at 7:05 PM, George Herbert george.herb...@gmail.com wrote:

 Just got a report and screenshot that a new user got this string for their
 captcha on en.wikipedia: "nigerblew".

 http://snag.gy/JpSUR.jpg

 Though several people are pointing out that Niger is a country, I think
 it's reasonable to try to avoid things close to the two-g version of that
 word; nobody's denigrated by avoiding possibly offensive (if
 misinterpreted) words.  I recall there's a filter list?...


I don't know if there's a filter list, but this has happened before. I've
seen many of our past and current CAPTCHAs from when we were testing the
signup/login redesign. I recall seeing both "headshits" and "obamadick".

The CAPTCHA is two randomly generated English words. Personally, I think it
might just be good to regenerate the CAPTCHAs entirely from time to time (a
sketch of what generation-time filtering could look like is below). AFAIK
it's not that hard, and our CAPTCHAs are weak anyway.
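
For what it's worth, filtering at generation time could look something like
the sketch below. This is not FancyCaptcha's actual generator; the function
name, word list, and placeholder patterns are all invented for illustration:

    // Pick two random English words, rejecting pairs that match any
    // blacklist regex; retry until a clean pair is found.
    function makeCaptchaWord( array $wordList, array $blacklistRegexes ) {
        do {
            $candidate = $wordList[ array_rand( $wordList ) ]
                . $wordList[ array_rand( $wordList ) ];
            $bad = false;
            foreach ( $blacklistRegexes as $regex ) {
                if ( preg_match( $regex, $candidate ) ) {
                    $bad = true; // looks offensive; try another pair
                    break;
                }
            }
        } while ( $bad );
        return $candidate;
    }

    // Placeholder word list and patterns, purely for the sketch.
    $word = makeCaptchaWord(
        array( 'blue', 'wiki', 'river', 'stone' ),
        array( '/badword/', '/alsobad/' )
    );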

Steven

Re: [Wikitech-l] Captcha filter list

2013-12-31 Thread Benjamin Lees
There's a blacklist that has been included with FancyCaptcha for a few
months, although I don't know whether it's the same as the one the WMF uses.

See https://bugzilla.wikimedia.org/show_bug.cgi?id=21025 and the associated
patches.

Re: [Wikitech-l] Captcha filter list

2013-12-31 Thread Tyler Romeo
It's a CAPTCHA, not an article or a piece of actual content. If people are
actually getting offended by randomly generated CAPTCHAs, I think they need
to find something more worthwhile to complain about.

-- 
Tyler Romeo
On Jan 1, 2014 12:27 AM, Benjamin Lees emufarm...@gmail.com wrote:

 There's a blacklist that has been included with FancyCaptcha for a few
 months, although I don't know whether it's the same as the one the WMF
 uses.

 See https://bugzilla.wikimedia.org/show_bug.cgi?id=21025 and the
 associated
 patches.

Re: [Wikitech-l] Captcha filter list

2013-12-31 Thread George Herbert
Tyler, websites everywhere blacklist offensive words (and, with some
regularity, look- and sound-alikes) from their random captcha generators...

I don't personally care, and neither do you, but if we offend people
needlessly, it's an oops.  We need some elements of the site to meet the
lowest common denominator rather than the most enlightened participant.

That all said, it is possible the screenshot was a fake; the brand-new user
posted two things on-wiki: one, a faked source for a BBC actress article,
and two, the screenshot of the captcha.

The on-wiki conclusion was that we were trolled.  Not sure if that means the
screenshot was a Photoshop job or a real one, or whether that was part of
the troll or not.  But the actual edit was a good solid troll.




On Tue, Dec 31, 2013 at 9:30 PM, Tyler Romeo tylerro...@gmail.com wrote:

 It's a CAPTCHA, not an article or piece of actual content. If people are
 actually getting offended by randomly generated CAPTCHAs I think they need
 to find something more worthwhile to complain about.

 --
 Tyler Romeo
 On Jan 1, 2014 12:27 AM, Benjamin Lees emufarm...@gmail.com wrote:

  There's a blacklist that has been included with FancyCaptcha for a few
  months, although I don't know whether it's the same as the one the WMF
  uses.
 
  See https://bugzilla.wikimedia.org/show_bug.cgi?id=21025 and the
  associated
  patches.




-- 
-george william herbert
george.herb...@gmail.com

Re: [Wikitech-l] Captcha filter list

2013-12-31 Thread Marc A. Pelletier
On 01/01/2014 12:34 AM, George Herbert wrote:
 Tyler, websites everywhere blacklist offensive words (and, with some
 regularity, look- and sound-alikes) from their random captcha generators...

Yes, and then we end up with the Scunthorpe problem instead.

I agree it's a little bit silly, and also a losing proposition; even
trying to filter for /actual/ cuss words is hard enough (because the
list of words/fragments someone *might* find offensive is boundless); if
we try to also block misspellings, lookalikes, or cognates, we might as
well block /^[a-z]*$/.

-- Marc



Re: [Wikitech-l] Captcha filter list

2013-12-31 Thread Benjamin Lees
On Wed, Jan 1, 2014 at 12:42 AM, Marc A. Pelletier m...@uberbox.org wrote:

 Yes, and then we end up with the Scunthorpe problem instead.


The Scunthorpe problem is not actually a problem here, because we're just
limiting the CAPTCHAs we serve to users, not filtering their input.

Re: [Wikitech-l] Captcha filter list

2013-12-31 Thread Tim Landscheidt
Marc A. Pelletier m...@uberbox.org wrote:

 Tyler, websites everywhere blacklist offensive words (and, with some
 regularity, look- and sound-alikes) from their random captcha generators...

 Yes, and then we end up with the Scunthorpe problem instead.

 I agree it's a little bit silly, and also a losing proposition; even
 trying to filter for /actual/ cuss words is hard enough (because the
 list of words/fragments someone *might* find offensive is boundless); if
 we try to also block misspellings, lookalikes, or cognates, we might as
 well block /^[a-z]*$/.

Not only that, the selection of blacklisted words may be offensive
itself.  For example,
https://git.wikimedia.org/blob/mediawiki%2Fextensions%2FConfirmEdit/master/blacklist
protects the Christian god with two entries, while Allah is up for
ridicule.  And it might even require Jews to type in the Tetragrammaton
and thus effectively ban them from the site.

IMHO if someone is offended by a captcha, they should click
on reload.

Tim



Re: [Wikitech-l] Captcha filter list

2013-12-31 Thread Benjamin Lees
On Wed, Jan 1, 2014 at 2:00 AM, Tim Landscheidt t...@tim-landscheidt.de wrote:

 Not only that, the selection of blacklisted words may be offensive
 itself.


That doesn't seem like a problem, since the list isn't visible in the user
interface.  Users will not be complaining that they aren't receiving
"niggardly", "cocky", and "cumin" in their CAPTCHAs.


 For example,

 https://git.wikimedia.org/blob/mediawiki%2Fextensions%2FConfirmEdit/master/blacklist
 protects the Christian god with two entries, while Allah is
 up for ridicule.  And it might even require Jews to type in
 the tetragrammaton and thus effectively ban them from a
 site.


I checked out the registration form for a white supremacist forum, and they
just use reCAPTCHA.  No doubt they'll be developing a CAPTJCA or CAPTMCA
soon enough.

There are likely a number of strings that should be added to the default
blacklist.  Patches welcome!