Re: [Wikitech-l] Bug 1542 - Log spam blacklist hits

2013-03-11 Thread anubhav agarwal
Hey Chris, I added some debug lines in the filter function of the SpamBlacklist class: wfDebugLog( 'SpamBlacklist', 'testing' ); in the initial step, but I wasn't able to see any such thing in the debug log in the debug toolbar. Then I added the following line: wfErrorLog( 'Error here', './error.log' );
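For context: wfDebugLog() writes a group's messages to a dedicated file only when that group is configured; otherwise they fall back to the general debug log. A minimal sketch of the LocalSettings.php setup this needs (file paths here are examples):

    $wgDebugLogFile = '/tmp/mw-debug.log';                          // general debug log
    $wgDebugLogGroups['SpamBlacklist'] = '/tmp/spam-blacklist.log'; // per-group log file

    // The call in the extension then needs quoted string arguments:
    wfDebugLog( 'SpamBlacklist', 'testing' );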

Re: [Wikitech-l] Bugzilla Weekly Report

2013-03-11 Thread Andre Klapper
On Sun, 2013-03-10 at 22:35 -0700, Quim Gil wrote: On 03/10/2013 08:00 PM, reporter wrote: Top 5 bug report closers federicoleva [AT] What about showing realname instead? Realname is not a mandatory field, so this would need fallback code to use login_name if realname is empty.
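A minimal sketch of the fallback Andre describes, assuming the report script has both Bugzilla fields at hand (variable names are illustrative):

    // Prefer the optional realname; fall back to the mandatory login_name.
    $display = ( trim( (string)$realname ) !== '' ) ? $realname : $login_name;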

[Wikitech-l] Fwd: Commons Android app v1.0 Beta 3 released to the play store

2013-03-11 Thread Yuvi Panda
I'm going to spam wikitech-l with release updates as well :) -- Forwarded message -- From: Yuvi Panda yuvipa...@gmail.com Date: Mon, Mar 11, 2013 at 3:19 PM Subject: Commons Android app v1.0 Beta 3 released to the play store To: mobile-l mobil...@lists.wikimedia.org A new

Re: [Wikitech-l] Free Disk Space needed for import

2013-03-11 Thread Andre Klapper
Hi, On Sat, 2013-03-09 at 02:16 -0600, wiki wrote: Sorry, I forgot to mention that I have in mind the English wikipedia dump. http://en.wikipedia.org/wiki/Wikipedia:Database_download says The size of the 3 January 2013 dump is approximately 9.0 GB compressed, 41.5 GB uncompressed. andre --

Re: [Wikitech-l] Free Disk Space needed for import

2013-03-11 Thread wiki
Thank you for the response. I think those sizes refer to the exported XML, e.g. 41.5 GB is the English xml.bz2 expanded. I was curious as to how much extra disk space is needed (and consumed) after importing this English XML dump into MySQL, i.e. how much the database tables use. - sam -

Re: [Wikitech-l] RFC: image information API

2013-03-11 Thread Denny Vrandečić
I certainly don't want to cookie-lick this usecase for Wikidata / Wikibase, but I think that using the Wikibase extension for this might be easier than it looks. The one major point would be a bit of coding to allow claims for the media namespace. But indeed, right now we do not have the bandwidth

[Wikitech-l] Category sorting in random order

2013-03-11 Thread Lars Aronsson
This category is sorted strangely, http://sv.wikipedia.org/wiki/Kategori:Svenska_kokboksf%C3%B6rfattare A, B, E, G, H, L, B, M, N, F, J, R, Þ, S, W, Z, Å Not only does B appear twice, and F after N, but the article sorted under Þ has {{STANDARDSORTERING:Tunberger, Pernilla}} where

Re: [Wikitech-l] Who is responsible for accepting backported patch sets for maintained versions?

2013-03-11 Thread Andre Klapper
On Fri, 2013-02-22 at 11:24 -0500, Mark A. Hershberger wrote: I can make and push tarballs without involving Chris. But yes, we need to discuss policy, and start setting expectations, etc. tl;dr: There's now a Backport_to_Stable dropdown in Bugzilla for backporting requests. Set it to ? if

Re: [Wikitech-l] Category sorting in random order

2013-03-11 Thread Bartosz Dziewoński
This is temporary, due to bug 45446 being fixed right now. Give it 24 hours :) (and then purge the cache). https://bugzilla.wikimedia.org/show_bug.cgi?id=45446 -- Matma Rex

Re: [Wikitech-l] Category sorting in random order

2013-03-11 Thread Paul Selitskas
Can you add Belarusian projects as well? 'bewiki' => 'uca-be', 'bewikisource' => 'uca-be', 'be_x_oldwiki' => 'uca-be', I was denied while sending a patch for review. On Mon, Mar 11, 2013 at 3:18 PM, Bartosz Dziewoński matma@gmail.com wrote: This is temporary, due to bug 45446 being fixed right
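For reference, a hedged sketch of what the requested change would look like in the wgCategoryCollation map of the site configuration (the surrounding entries are assumptions):

    'wgCategoryCollation' => array(
        'default'      => 'uppercase', // assumed existing default
        'bewiki'       => 'uca-be',
        'bewikisource' => 'uca-be',
        'be_x_oldwiki' => 'uca-be',
    ),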

Re: [Wikitech-l] Who is responsible for accepting backported patch sets for maintained versions?

2013-03-11 Thread Mark A. Hershberger
On 03/11/2013 08:12 AM, Andre Klapper wrote: On Fri, 2013-02-22 at 11:24 -0500, Mark A. Hershberger wrote: But yes, we need to discuss policy, and start setting expectations, etc. tl;dr: There's now a Backport_to_Stable dropdown in Bugzilla for backporting requests. Set it to ? if you

Re: [Wikitech-l] Identifying pages that are slow to render

2013-03-11 Thread Ariel T. Glenn
On Thu, 07-03-2013 at 21:12 -0400, bawolff wrote: On 2013-03-07 4:06 PM, Matthew Flaschen mflasc...@wikimedia.org wrote: On 03/07/2013 12:00 PM, Antoine Musso wrote: On 06/03/13 23:58, Federico Leva (Nemo) wrote: There's slow-parse.log, but it's private unless a

Re: [Wikitech-l] Replacement for tagging in Gerrit

2013-03-11 Thread Christian Aistleitner
Hi Andre, On Mon, Mar 11, 2013 at 12:43:44AM +0100, Andre Klapper wrote: Are there some thoughts already how to make a match between a project in Gerrit and its related Bugzilla product/component? [Disclaimer: I'm a fan of DOAP metadata files, if used properly.] While DOAP looks great, is

Re: [Wikitech-l] Free Disk Space needed for import

2013-03-11 Thread Ariel T. Glenn
On Mon, 11-03-2013 at 05:35 -0500, wiki wrote: Thank you for the response. I think those sizes refer to the exported XML, e.g. 41.5 GB is the English xml.bz2 expanded. I was curious as to how much extra disk space is needed (and consumed) after importing this

Re: [Wikitech-l] Live recent changes feed

2013-03-11 Thread Brian Wolff
On 2013-03-11 1:11 AM, Tyler Romeo tylerro...@gmail.com wrote: DoS vector: register the URL of someone you don't like. Register 100 variants from the same domain. Push enwikipedia's RC feed there. Or roll out some EC2 instances and open 100 sockets. (And before you say rate-limit

[Wikitech-l] QA weekly goal: browser testing automation for Wikipedia Search

2013-03-11 Thread Quim Gil
WHAT: Writing a Wikipedia search feature description in plain English for our automated testing process. Let's feed the Test backlog! WHEN: We will start with a video streamed demo on Wednesday, March 13, 2013 at 17h UTC [1] and we will be helping volunteers during the rest of the week. This

Re: [Wikitech-l] Live recent changes feed

2013-03-11 Thread Jeroen De Dauw
Hey, Sure, you could add some mechanism to prove you own the domain where you want the RC updates to be sent, but things can get rather complex. Google uses, or at least used to use, the following to do exactly that: On request provide an auth file to the user which includes some unique

Re: [Wikitech-l] Live recent changes feed

2013-03-11 Thread Brian Wolff
On 2013-03-11 3:46 PM, Jeroen De Dauw jeroended...@gmail.com wrote: Hey, Sure, you could add some mechanism to prove you own the domain where you want the RC updates to be sent, but things can get rather complex. Google uses, or at least used to use, the following to do exactly that: On

Re: [Wikitech-l] RFC: image information API

2013-03-11 Thread Brion Vibber
On Mon, Mar 11, 2013 at 4:52 AM, Denny Vrandečić denny.vrande...@wikimedia.de wrote: I certainly don't want to cookie-lick this usecase for Wikidata / Wikibase, but I think that using the Wikibase extension for this might be easier than it looks. The one major point would be a bit of coding to

Re: [Wikitech-l] Live recent changes feed

2013-03-11 Thread Jeroen De Dauw
Hey, what you describe is not what Google does. Google tells the user the path for the file (I believe the usual place is in the root of the domain). The user does not pick the path. Otherwise I could prove I own wikipedia (assuming MIME types weren't checked) by using action=raw. Good
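A hedged sketch of that flow: the service generates the token and dictates the path, so existing content like action=raw output cannot be passed off as proof (the file path is illustrative; MWCryptRand and Http::get are MediaWiki helpers):

    $token = MWCryptRand::generateHex( 32 );                // service-generated secret
    // The service, not the subscriber, fixes where the proof must live:
    $url = 'http://' . $domain . '/mw-rc-verification.txt'; // illustrative path
    $body = Http::get( $url );
    $verified = ( $body !== false && trim( $body ) === $token );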

Re: [Wikitech-l] Live recent changes feed

2013-03-11 Thread Tyler Romeo
Honestly, the solution could be as simple as requiring that the HTTP response have a certain header or something. -- Tyler Romeo, Stevens Institute of Technology, Class of 2015, Major in Computer Science, www.whizkidztech.com | tylerro...@gmail.com
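A sketch of that header variant, with an invented header name, where the subscriber's endpoint echoes the token in a response header instead of a file body:

    $headers = get_headers( 'http://example.org/rc-endpoint', 1 ); // assumed endpoint
    $ok = is_array( $headers )
        && isset( $headers['X-RC-Verify'] )                        // made-up header name
        && $headers['X-RC-Verify'] === $token;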

Re: [Wikitech-l] Live recent changes feed

2013-03-11 Thread Brian Wolff
On 2013-03-11 4:32 PM, Tyler Romeo tylerro...@gmail.com wrote: Honestly, the solution could be as simple as requiring that the HTTP response have a certain header or something. -- Tyler Romeo, Stevens Institute of Technology, Class of 2015, Major in Computer Science, www.whizkidztech.com

[Wikitech-l] Project ideas for GSOC'13

2013-03-11 Thread Quim Gil
We need to work on our application for Google Summer of Code 2013. http://www.mediawiki.org/wiki/Summer_of_Code_2013 A basic piece is the list of projects we are proposing. The main reference is http://www.mediawiki.org/wiki/Mentorship_programs/Possible_projects Make sure the projects you

Re: [Wikitech-l] 40 million links to Wikipedia

2013-03-11 Thread S Page
On Sat, Mar 9, 2013 at 6:32 PM, Peter Kaminski kamin...@istori.com wrote: Here's a big data dataset from Google Research and UMass IESL: 40 million links to Wikipedia pages where the anchor text of the link closely matches the title of the target Wikipedia page, from 10 million web pages, for the

Re: [Wikitech-l] Live recent changes feed

2013-03-11 Thread Tyler Romeo
On Mon, Mar 11, 2013 at 3:53 PM, Brian Wolff bawo...@gmail.com wrote: (provided that a GET request is initially used to verify this; POST requests to arbitrary unverified URLs can be dangerous). Totally agreed. If anything, the GET request could be used to obtain initial information

[Wikitech-l] Submit a talk to Open Source Bridge by Mar 23rd, get WMF subsidy

2013-03-11 Thread Sumana Harihareswara
This year, consider speaking at Open Source Bridge http://opensourcebridge.org/ in Portland, Oregon, USA, June 18-21. OSB is tech talks and hack sessions with the hands-on technologists we want, for Foundation staff recruiting and for volunteer recruiting and collaboration. Good talks, clueful people,

[Wikitech-l] This week's update, now with 7.3% more levity!

2013-03-11 Thread Marc A. Pelletier
Toto, I've a feeling we're not on the toolserver any more. So, things are progressing at a fair clip; there is now a database available to the tools, which means that most tools that do not depend on the presence of the database replicas should be workable in the new environment. Maintainers

Re: [Wikitech-l] Extensions meta repo problem with MaintenanceShell

2013-03-11 Thread Brion Vibber
On Sun, Mar 3, 2013 at 5:50 AM, Brion Vibber br...@pobox.com wrote: Is anybody else seeing this when running 'git submodule update' in a checkout of the extensions repo? fatal: reference is not a tree: beead919cac17528f335d9409dfcada12e606ebd Unable to checkout
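One thing that may be worth trying, assuming the meta repo's recorded pointer is merely stale (it will not help if the referenced commit was simply never pushed to the submodule's remote):

    git pull                                  # update the recorded submodule commits
    git submodule sync                        # pick up any changed submodule URLs
    git submodule update --init --recursive   # re-fetch and check out what is recorded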

Re: [Wikitech-l] Category sorting in random order

2013-03-11 Thread Matthew Flaschen
On 03/11/2013 08:38 AM, Paul Selitskas wrote: Can you add Belarusian projects as well? 'bewiki' => 'uca-be', 'bewikisource' => 'uca-be', 'be_x_oldwiki' => 'uca-be', I was denied while sending a patch for review. Please file a bug if you haven't already. How did you attempt to do a patch?

Re: [Wikitech-l] Category sorting in random order

2013-03-11 Thread Paul Selitskas
Git review, of course. The log is here: http://pastebin.com/iC4N1am0 On Tue, Mar 12, 2013 at 1:46 AM, Matthew Flaschen mflasc...@wikimedia.org wrote: On 03/11/2013 08:38 AM, Paul Selitskas wrote: Can you add Belarusian projects as well? 'bewiki' => 'uca-be', 'bewikisource' => 'uca-be',

Re: [Wikitech-l] Nightly shallow clones of mediawiki/core

2013-03-11 Thread Ori Livneh
On Sun, Mar 10, 2013 at 2:39 PM, Chad innocentkil...@gmail.com wrote: I'm open to any other ideas that could help core. I think it would be very useful to provide regular snapshots of core with a shallow '.git' included. Git-cloning from GitHub is faster than Gerrit but is still quite slow,
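Until such snapshots exist, a shallow clone straight from Gerrit is the closest equivalent; a sketch, assuming the usual anonymous-HTTP URL and a git recent enough to unshallow later:

    git clone --depth 1 https://gerrit.wikimedia.org/r/p/mediawiki/core.git
    cd core
    git fetch --unshallow   # convert to a full clone later, if needed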

Re: [Wikitech-l] Category sorting in random order

2013-03-11 Thread Brian Wolff
To ask the obvious question, is the key you have scp configured to use the same as the one in your gerrit prefs? -bawolff On 2013-03-11 8:05 PM, Paul Selitskas p.selits...@gmail.com wrote: Git review, of course. The log is here: http://pastebin.com/iC4N1am0 On Tue, Mar 12, 2013 at 1:46 AM,

Re: [Wikitech-l] Nightly shallow clones of mediawiki/core

2013-03-11 Thread Platonides
On 11/03/13 00:17, Yuri Astrakhan wrote: Answered inline. Also, I apologize as I think my email was slightly off-topic to Ori's question. On Sun, Mar 10, 2013 at 6:57 PM, Matthew Flaschen mflasc...@wikimedia.org wrote: PHPMyAdmin also has major security issues. It isn't allowed on

Re: [Wikitech-l] Category sorting in random order

2013-03-11 Thread Paul Selitskas
Yes, thanks for that question. I didn't check but this is obviously it. Sorry for spamming :) On Tue, Mar 12, 2013 at 2:15 AM, Brian Wolff bawo...@gmail.com wrote: To ask the obvious question, is the key you have scp configured to use the same as the one in your gerrit prefs? -bawolff On

Re: [Wikitech-l] Category sorting in random order

2013-03-11 Thread Platonides
On 12/03/13 00:05, Paul Selitskas wrote: Git review, of course. The log is here: http://pastebin.com/iC4N1am0 Everyone can send patches to all repositories. If you get an error doing that, it's not that you are denied; it's some configuration problem. In this case, it looks like you didn't have
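A quick way to test the setup bawolff and Platonides point at, with the username a placeholder:

    ssh -p 29418 USERNAME@gerrit.wikimedia.org
    # A working key should print a "Welcome to Gerrit Code Review" banner and exit.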

Re: [Wikitech-l] Category sorting in random order

2013-03-11 Thread Brian Wolff
No worries. The obvious things are often the hardest to spot :-) -bawolff On 2013-03-11 8:24 PM, Paul Selitskas p.selits...@gmail.com wrote: Yes, thanks for that question. I didn't check but this is obviously it. Sorry for spamming :) On Tue, Mar 12, 2013 at 2:15 AM, Brian Wolff

Re: [Wikitech-l] Nightly shallow clones of mediawiki/core

2013-03-11 Thread Mark A. Hershberger
On 03/11/2013 07:09 PM, Ori Livneh wrote: I think it would be very useful to provide regular snapshots of core with a shallow '.git' included. Git-cloning from GitHub is faster than Gerrit but is still quite slow, and their snapshots, too, exclude '.git'. You can get a nightly snapshot of

Re: [Wikitech-l] Nightly shallow clones of mediawiki/core

2013-03-11 Thread Tyler Romeo
On Mon, Mar 11, 2013 at 8:31 PM, Mark A. Hershberger m...@everybody.org wrote: You can get a nightly snapshot of mediawiki from: https://toolserver.org/~krinkle/mwSnapshots/#!/mediawiki-core/master There is no .git, but maybe it could be added. This is spectacular. Simple and easy. --

Re: [Wikitech-l] Nightly shallow clones of mediawiki/core

2013-03-11 Thread Thomas Gries
On 12.03.2013 02:51, Rob Lanphier wrote: I was able to do the clone in 50 seconds over HTTPS Ori, have you tried this with --depth 1? Rob Some weeks ago I already added this comment to our main download _talk_ page