Re: [Wikitech-l] "Nobody" & "Wikidata bugs": notify when you start working on a bug

2012-12-08 Thread Quim Gil

On 12/08/2012 08:25 PM, Roan Kattouw wrote:

On Thu, Dec 6, 2012 at 10:07 AM, Quim Gil  wrote:

Hi, thanks to the metrics reports now we know that the top bug fixers in
November were Nobody (228) and Wikidata bugs (83)... followed by Michael
Dale (28), Roan Kattouw (23), etc.

http://www.mediawiki.org/wiki/Community_metrics/November_2012#People


Those statistics don't actually measure who fixes bugs, they measure
who the fixed bugs were assigned to. Those aren't necessarily the same
person (although I imagine this is rare), but the larger issue is
that, as you say, most bugs have no human assignee.


Yes, it's not perfect. However, I'm curious to see what happens after 
paying attention to this detail during this month. Maybe the next report 
will make more sense, maybe not. Maybe it will help improve our 
processes a bit, or maybe not.


If the whole thing is pointless and more noisy than useful, then I will 
just put it to rest, like I did with the identification of "new 
contributors" based on who Ohloh thought was new. Still, that 
exercise was useful for a few developers who claimed their multiple 
identities in Ohloh. At least Jon Robson seemed to be very happy 
discovering the huge amount of JavaScript code he had 
contributed to several projects over time.  :)


I'll keep doing my best this month, assigning bugs to whoever seems to 
fix them and checking the Gerrit commits when available. Thank you for your 
patience and collaboration.  :)


--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] "Nobody" & "Wikidata bugs": notify when you start working on a bug

2012-12-08 Thread Roan Kattouw
On Thu, Dec 6, 2012 at 10:07 AM, Quim Gil  wrote:
> Hi, thanks to the metrics reports now we know that the top bug fixers in
> November were Nobody (228) and Wikidata bugs (83)... followed by Michael
> Dale (28), Roan Kattouw (23), etc.
>
> http://www.mediawiki.org/wiki/Community_metrics/November_2012#People
>
Those statistics don't actually measure who fixes bugs, they measure
who the fixed bugs were assigned to. Those aren't necessarily the same
person (although I imagine this is rare), but the larger issue is
that, as you say, most bugs have no human assignee. Another statistic
that is used in the BZ reports sent to this list is who closed the bug
(i.e. changed its status to RESOLVED FIXED), but this is also
suboptimal. For instance, in the VisualEditor team, James somewhat
frequently cleans up after developers who fix a bug but forget to
close it, or even to mention the bug in the commit summary, so he's
probably the top bug "fixer" in VE by that metric, even though most of
that is just him taking paperwork off our hands. Another problem is
that bugs can bounce between REOPENED and FIXED multiple times, and
can be set to FIXED by different people each time.

So both metrics are noisy, although I imagine the latter would not
have a 50% signal-to-noise ratio like the former. Getting more
accuracy would be complicated: you'd probably have to look for Gerrit
links on the bug and identify their authors, or something like that.
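A first step toward that Gerrit-based approach could look like the sketch below: scan a bug's comment text for Gerrit change links and collect the change numbers, which could then be resolved to patch authors. This is purely illustrative; the URL pattern and function names are assumptions, not an existing tool.

```python
import re

# Hypothetical sketch: pull Gerrit change numbers out of bug comment
# text, as a first step toward crediting the actual patch author.
# Matches links like https://gerrit.wikimedia.org/r/12345 and the
# /r/#/c/12345 form.
GERRIT_LINK = re.compile(r"https?://gerrit\.wikimedia\.org/r/(?:#/c/)?(\d+)")

def gerrit_changes(comment_text):
    """Return the distinct Gerrit change numbers mentioned in a comment."""
    return sorted({int(n) for n in GERRIT_LINK.findall(comment_text)})

comment = ("Fixed by https://gerrit.wikimedia.org/r/37118 and "
           "https://gerrit.wikimedia.org/r/#/c/37200")
print(gerrit_changes(comment))  # [37118, 37200]
```

Mapping each change number to its author would then be a lookup against Gerrit itself, which is where the real work (and the identity-matching noise) would be.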

Not saying the metric you used is wrong (it has advantages and
disadvantages), but I do think it's a bit misleadingly labeled.

Roan



[Wikitech-l] Tutorial videos & slides on Puppet, security, performance profiling, i18n, RL, extensions, Lua...

2012-12-08 Thread Sumana Harihareswara
Can someone help me out by wiki-gnoming the videos and slides referenced
below, plus those in https://www.mediawiki.org/wiki/Category:Tutorials ?

-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation

On 09/11/2012 11:04 PM, Erik Moeller wrote:
Subject: Tutorial videos from today
> We had an internal tech meeting day today, and among other things,
> three speakers presented brief tutorials:
> 
> * Asher Feldman on profiling/performance optimization:
> https://commons.wikimedia.org/wiki/File:MediaWiki_Performance_Profiling.ogv
> * Chris Steipp on security:
> https://commons.wikimedia.org/wiki/File:MediaWiki_Security.ogv
> * Ryan Lane on Puppet:
> https://commons.wikimedia.org/wiki/File:Puppet_Tutorial.ogv
> 
> As a reminder, there are a number of previous tutorial videos online
> as well, e.g. from the Berlin hackathon and the SF hackathon earlier
> this year:
> 
> https://commons.wikimedia.org/wiki/Category:Hackathon_Berlin_2012
> https://commons.wikimedia.org/wiki/Category:Wikimedia_Hackathon_San_Francisco_2012
> 
> If anyone feels like doing some wiki-gnoming, it'd be nice to have
> more consistent categorization for all these videos on Commons, and
> perhaps a directory on MediaWiki.org if there isn't one already.
> 
> Erik




Re: [Wikitech-l] unit testing foreign wiki access

2012-12-08 Thread Platonides
Do you really need SQL access to Wikidata?
I would expect your code to go through a WikidataClient class, which
could then connect to Wikidata via SQL, HTTP, loading from a local file...
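The class being suggested might be shaped roughly like this (a Python sketch of the idea only; `WikidataClient` as an interface and the backend/method names are assumptions, not existing MediaWiki code):

```python
from abc import ABC, abstractmethod

class WikidataClient(ABC):
    """Hypothetical abstraction: callers ask for an item, and the backend
    decides whether it comes from SQL, HTTP, or a local file."""
    @abstractmethod
    def get_item(self, item_id):
        ...

class FileWikidataClient(WikidataClient):
    """Backend serving items from an in-memory dict (e.g. loaded from a
    local fixture file) -- handy for unit tests, no database needed."""
    def __init__(self, items):
        self._items = items

    def get_item(self, item_id):
        return self._items[item_id]

# A test would swap in the fixture-backed client:
client = FileWikidataClient({"Q42": {"label": "Douglas Adams"}})
print(client.get_item("Q42")["label"])  # Douglas Adams
```

The point is that code under test depends only on the interface, so tests never need a second database.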




Re: [Wikitech-l] Please help MediaWiki I18n/L10n improve

2012-12-08 Thread Federico Leva (Nemo)
Just FYI, TWN staff have seen no help at all in the last few months, so 
everything in 
http://lists.wikimedia.org/pipermail/wikitech-l/2012-August/062960.html 
is still valid, and the https://bugzilla.wikimedia.org/38638 dependency 
list is longer than ever.


The small bit of good news is that good devs can now follow the support 
requests directed at them via a nice RSS feed; see the link at e.g. 
https://translatewiki.net/wiki/User:Nike#Support_TODO


Today I got rid of 16 of the many issues, previously "assigned" to 
Fabrice Florin, which gave me nightmares on 
 by dumping them 
on bugzilla to forget them:



Nemo



Re: [Wikitech-l] unit testing foreign wiki access

2012-12-08 Thread Daniel Kinzler
Hi Christian

On 08.12.2012 22:16, Christian Aistleitner wrote:
> However, we actually do not need those databases and tables for testing.
> For testing, it would be sufficient to have mock database objects [1] that
> pretend that there are underlying databases, tables, etc.

Hm... so, if that mock DB object is used by code that tries to execute
an SQL query against it, will it work? Sounds like that should at least be an
in-memory SQLite instance... The trouble is, we do *not* have an abstraction
layer on top of SQL, just one for different SQL implementations. To
abstract from SQL, we'd need a full DAO layer, and we don't have that.

Anyway: even if that works, one reason not to do it would be the ability to
test against different database engines. The PostgreSQL people are quite keen
on that. But I suppose that could remain an optional feature.

Also: once I have a mock object, how do I inject it into the load
balancer/LBFactory, so that wfGetLB( 'foo' )->getConnection( DB_SLAVE, null,
'foo' ) will return the correct mock object (the one for wiki 'foo')? Global
state is evil...

If you could help me to answer that last question, that would already help me
a lot...

thanks
daniel



Re: [Wikitech-l] unit testing foreign wiki access

2012-12-08 Thread Christian Aistleitner
Hi Daniel,

sorry for chiming in late. But better a late response than none ;-)

On Tue, Dec 04, 2012 at 10:54:08AM +0100, Daniel Kinzler wrote:
> [ LoadBalancer, DBAccessBase, and ORMTable integration help when
>   developing components that access another wiki's database. ]
> But. How do I test these?

In the later parts of your email, you exposed some of the structural
problems we face when trying to set up the required databases, ... at
run^Wtest time.

However, we actually do not need those databases and tables for
testing. For testing, it would be sufficient to have mock database
objects [1] that pretend that there are underlying databases, tables,
etc.
Those mocks would return the very values that our database objects
would return if those databases, tables, etc. existed.

Thereby, we could skip the hassles of setting up a second database or
additional tables etc. Yippie! We'd just have to initialize the mock.

Best regards,
Christian


[1] I am well aware of sounding like a broken record, as I already
suggested mocking the database some time ago when discussing
performance issues for tests
https://lists.wikimedia.org/pipermail/wikitech-l/2012-June/061469.html
But this reference is just to show that going down the database
mocking road would solve both problems at once.


-- 
 quelltextlich e.U.  \\  Christian Aistleitner 
   Companies' registry: 360296y in Linz
Christian Aistleitner
Gruendbergstrasze 65a    Email:    christ...@quelltextlich.at
4040 Linz, Austria       Phone:    +43 732 / 26 95 63
                         Fax:      +43 732 / 26 95 63
                         Homepage: http://quelltextlich.at/
---



Re: [Wikitech-l] Unit tests scream for attention

2012-12-08 Thread Niklas Laxström
On 8 December 2012 21:22, Platonides  wrote:
> They mostly run for me.

Yep, mostly:
Tests: 4653, Assertions: 556494, Failures: 119, Errors: 17,
Incomplete: 3, Skipped: 15.

I've not even looked at most of these failures because of the fatal
errors. Thanks to Jeroen and Brad there is now one fatal error less
(and at least one remains), though I did spend two hours debugging
that issue.
  -Niklas

--
Niklas Laxström


Re: [Wikitech-l] Unit tests scream for attention

2012-12-08 Thread Platonides
They mostly run for me.

Those failing are:
- PreferencesTest and TitleMethodsTest requiring a DB but not in
Database group.
- SanitizerTest::testRemovehtmltagsOnHtml5Tags skipping the closing tag
an ,  and  (???)
- IPTCTest::testIPTCParseForcedUTFButInvalid with php 5.4. Reported in
June as bug 37665

Plus an odd error by DatabaseTest::testStoredFunctions





Re: [Wikitech-l] slight change to the review workflow in Gerrit

2012-12-08 Thread Bryan Tong Minh
On Fri, Dec 7, 2012 at 1:58 PM, Derric Atzrott  wrote:

> >> Why not just add (more) slaves?  Computing power is much
> >> cheaper than developer time.
> >
> >I absolutely agree.
>
> I'm also in agreement on this one.  Having tests run after code review
> negates
> the point of having tests run in the first place.
>
I also fully agree.

I would also like to point out that this new workflow requires all unit tests
to pass out of the box, but they don't -- at least not the last time I checked.


Bryan


Re: [Wikitech-l] HTTP Error 504 Gateway timeout when running a robot

2012-12-08 Thread Maarten Dammers

Hi Strainu,

On 8-12-2012 11:21, Strainu wrote:

Hi,

I'm running a robot that makes relatively small changes in ~250KB
pages. Starting from November, the change is made, but I receive a 504
error. This does not happen when making the change manually and it did
not happen earlier in the year (say, September). The code is a custom,
pywikipediabot-based robot on ro.wp. One example of such change is
http://ro.wikipedia.org/w/index.php?title=Lista_monumentelor_istorice_din_jude%C8%9Bul_Sibiu&diff=7172109&oldid=7125528


Has something changed recently in the server configuration that would
cause such errors to occur? Is there something I can do to avoid them,
except splitting the pages?
The pages are rather large and should probably be split up into more 
manageable pieces. I wrote some documentation about that at 
https://commons.wikimedia.org/wiki/Commons:Monuments_database/Improving_lists


But that doesn't solve the problem. What is the timeout of the normal 
interface (a user editing) and what is the timeout for the API? Could 
someone (maybe ops) answer this question? I assume there is a 
difference between the two. You could confirm this by falling back to 
a non-API write after you receive a timeout.
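In the meantime, a bot-side mitigation (a sketch under assumptions; it does not address the server-side timeout itself) is to treat a 504 on a write as "possibly saved": wait with exponential backoff, re-check whether the edit actually went through, and only retry if it did not. The delay schedule could be computed like this:

```python
# Hypothetical helper for a pywikipedia-style bot: exponential backoff
# delays (in seconds) to wait between checking whether a 504'd edit
# actually saved. Base and cap values are arbitrary examples.
def backoff_delays(retries, base=5.0, cap=60.0):
    """Return delays base, 2*base, 4*base, ... capped at `cap` seconds."""
    return [min(base * (2 ** i), cap) for i in range(retries)]

print(backoff_delays(4))  # [5.0, 10.0, 20.0, 40.0]
```

Since Strainu reports the change *is* made before the 504 arrives, the re-check step is the important part: blindly retrying would produce duplicate edits.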


Maarten




[Wikitech-l] HTTP Error 504 Gateway timeout when running a robot

2012-12-08 Thread Strainu
Hi,

I'm running a robot that makes relatively small changes in ~250KB
pages. Starting from November, the change is made, but I receive a 504
error. This does not happen when making the change manually and it did
not happen earlier in the year (say, September). The code is a custom,
pywikipediabot-based robot on ro.wp. One example of such change is
http://ro.wikipedia.org/w/index.php?title=Lista_monumentelor_istorice_din_jude%C8%9Bul_Sibiu&diff=7172109&oldid=7125528


Has something changed recently in the server configuration that would
cause such errors to occur? Is there something I can do to avoid them,
except splitting the pages?

Thanks,
   Strainu
