Re: [Wikitech-l] Replacement for tagging in Gerrit

2013-03-14 Thread Christian Aistleitner
Hi,

On Thu, Mar 14, 2013 at 12:27:06AM +0100, Krinkle wrote:
 Can git notes be changed after merge?

Yes.

 Do they show in Gerrit diffs, so we don't accidentally lose them if
 someone else pushed an amend without the tags?

No.
At least not yet :-)

 Can they be removed after merge?

Yes.
And they are under revision control themselves. So you can see when a
note was added, by whom, etc.
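
For example, since the notes ref is itself versioned, you can inspect
its history (a sketch; it assumes you have fetched refs/notes/commits
as described in [1] below):

  git log refs/notes/commits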

 Can one query for commits having a certain tag within a
 repo/branch/author? (eg. fixme commits by me in mw core)

As gerrit has no native support yet for querying such objects,
querying relies mostly on command-line tools. Notes are plain text
files stored alongside the main repository.
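
For illustration, setting and inspecting a note from the command line
could look like this (just a sketch; the 'Tag: fixme' format is the
hypothetical convention discussed further below):

  git notes add -m 'Tag: fixme' <commit>
  git notes show <commit>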

As we currently have no fixme tags set in core, but only links to
CodeReview, let me rephrase your demo query
  "fixme commits by me in mw core"
to
  "commits by me in mw core linking to CodeReview with a seven in its
number".

Given you have fetched core's existing commit notes [1], you could
query for that like
  git log --show-notes --author='Christian Aistleitner' --grep 'Special:Code.*7'
. And you'll get [2] as the result: my commits to core from when it was
still on CodeReview whose CodeReview number contains a 7.


So how could queries for tags look like when using git notes?
Your original query might look like

  git log --show-notes --author='Christian Aistleitner' --grep '^Tag: fixme$'

. Since notes are free-form text, we would probably want to add some
'[[:space:]]*', ignore case, etc. But it's up to us to decide on the
rules for how a correct tag has to be represented.
And we have aliasing in git to save us from typing the same things
over and over again.
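
For instance, a more forgiving variant, plus an alias to save the
typing, might look like this (a sketch; the alias name 'fixmes' is
made up):

  git log --show-notes -i --grep='^Tag:[[:space:]]*fixme[[:space:]]*$'
  git config alias.fixmes "log --show-notes -i --grep='^Tag:[[:space:]]*fixme[[:space:]]*$'"
  git fixmes --author='Christian Aistleitner'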

One caveat (for now) is that we would also catch commits where the
commit message itself (instead of a git note) contains the tag. But
that might actually turn out to be a benefit rather than a caveat.

If we want to avoid matching on the commit message as well and limit
the search to notes only, we can always check out refs/notes/commits to
have the plain git notes in the file system and do our querying
directly on the notes files.
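
Checking out may not even be necessary: git grep can search a ref
directly (a sketch; each matching file path is the hash, possibly
split into subdirectories, of the annotated commit):

  git grep 'Tag: fixme' refs/notes/commits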

 If not displayed in Gerrit, then from git-cli via a web tool (how
 does that grep perform, is it fast enough?)

time git log --notes --author='Christian Aistleitner' --grep 'Special:Code.*7' > /dev/null

real    0m0.683s
user    0m0.670s
sys     0m0.011s

So that's well under a second on a plain spinning disk (no SSD) with
cold caches, grepping the whole history of core. That'd be good enough
for me :-)

Admittedly, the above timings do not reflect a good real-world sample,
as our current git notes in core stem from a single commit.
Nevertheless, git has to walk the whole core history, as --grep does
not limit the search to notes. So querying through git history is
typically fast, even for core. And querying through notes is basically
doing the same thing.

Best regards,
Christian


[1] You can do so by issuing
  git fetch origin refs/notes/commits:refs/notes/commits
. Just like git clone, it's slow for the first fetch. Follow-up
fetches are much, much quicker.
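
To avoid typing that every time, you could also configure the remote
to fetch notes automatically (a sketch; adjust the remote name if
yours is not 'origin'):

  git config --add remote.origin.fetch '+refs/notes/*:refs/notes/*'

After that, a plain 'git fetch' keeps the notes up to date as well.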


[2]
--8<--Begin--8<--
commit b23761744f237c5d9b4b1cabee4f5e76fc819213
Author: Christian Aistleitner qch...@users.mediawiki.org
Date:   Tue Mar 20 12:00:18 2012 +

Follow-up to r114256: Removing final assert

Notes:
http://mediawiki.org/wiki/Special:Code/MediaWiki/114257

commit 42c88dc137752e49143627416c585436721f0226
Author: Christian Aistleitner qch...@users.mediawiki.org
Date:   Mon Mar 19 23:36:48 2012 +

Follow up to r114126: Being more conservative for HipHop compiler

Notes:
http://mediawiki.org/wiki/Special:Code/MediaWiki/114217
--8<--End--8<--





-- 
 quelltextlich e.U.  \\  Christian Aistleitner 
   Companies' registry: 360296y in Linz
Christian Aistleitner
Gruendbergstrasze 65aEmail:  christ...@quelltextlich.at
4040 Linz, Austria   Phone:  +43 732 / 26 95 63
 Fax:+43 732 / 26 95 63
 Homepage: http://quelltextlich.at/
---


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Replacement for tagging in Gerrit

2013-03-14 Thread Niklas Laxström
On 14 March 2013 01:07, Christian Aistleitner
christ...@quelltextlich.at wrote:
 Wouldn't git notes be a good match [1]?

 They're basically annotations attached to commits. You can add/remove
 them to any commit without changing the actual commit. And support
 comes directly with git, as for example in

   git log --show-notes

 Queries are just a grep away, and setting them can be done through git
 as well, until we integrate that into gerrit.

Except that you need to have the repository cloned locally first.

And often you need to query multiple repositories.

You can work around both of the above... but then we are again on the
path of needing more than insignificant effort.
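
(To illustrate the workaround: once everything is cloned, a crude
multi-repository query is just a shell loop. A sketch, assuming the
clones all live under one directory:

  for repo in */; do
    ( cd "$repo" && git log --show-notes --grep='^Tag: fixme$' )
  done

But that is exactly the kind of effort I mean.)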

  -Niklas


-- 
Niklas Laxström

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Using the File-Upload-Api - Followup to: A new feedback extension - review urgently needed

2013-03-14 Thread Lukas Benedix

Hi,

Thanks for the many comments on my extension in gerrit. By now I have
handled most of them, but I have a big problem with the image upload
when creating the API module that should replace my shabby SpecialPage API.


I think that I could use MediaWiki's file upload API
(http://www.mediawiki.org/wiki/API:Upload). The problem is that I have
no idea how to start. I need the connection between the given feedback
and the screenshot. Currently the rendered screenshot is inside the
POST request as base64-encoded binary data and written to the same table
as the feedback.
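
(Not an authoritative recipe, just a sketch of what the upload call
might look like with curl, assuming you are already logged in, hold an
edit token, and have decoded the base64 data to a file first; the file
name is made up, and the feedback row could store it to keep the
connection:)

  curl -b cookies.txt \
       -F action=upload -F format=json \
       -F filename=Feedback-screenshot-1234.png \
       -F token="$EDIT_TOKEN" \
       -F file=@screenshot.png \
       -F ignorewarnings=1 \
       https://www.mediawiki.org/w/api.php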



best regards,

Lukas



On Wed 13.03.2013 00:37, Lukas Benedix wrote:

Hi there!

I am Lukas Benedix, a student of computer science at the Freie Universität
Berlin in Germany. In cooperation with the Wikidata developer team, I’m
currently working on my bachelor thesis about usability testing in open
source software projects, and I'd like to provide the Wikidata community
with the feedback mechanisms I developed (only as a test). Wikidata is a
very active, emerging project, which is why I think it's a great platform
for my
project.

And now here's the problem: The deadline of my bachelor thesis is
approaching soon. The test is designed to run for two weeks and I
unfortunately underestimated how much time it needs to get a review for my
extension before deployment.

Is it possible to accelerate that review process somehow? The extension is
in gerrit (https://gerrit.wikimedia.org/r/#/c/50004)

Do you have any advice what I can do?

For further information about my project: Here's a little description I
wrote for the Wikidata community:
http://www.wikidata.org/wiki/User:Lbenedix/UIFeedback

Best regards,
Lukas Benedix



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: New unified SSL certificate deployed

2013-03-14 Thread Jay Ashworth
- Original Message -
 From: Ryan Lane rlan...@gmail.com

 On Wed, Mar 13, 2013 at 9:24 PM, Jay Ashworth j...@baylink.com wrote:
 
  - Original Message -
   From: Ryan Lane rlan...@gmail.com
 
 Hey, Ryan; did you see, perhaps on outages-discussion, the after-action
 report from Microsoft about how their Azure SSL cert expiration screwup
 happened?

   What's the relevance here?

  Does ops have a procedure for avoiding unexpected SSL cert expirations,
  and does this affect it in any way other than making it easier to
  implement? I would think...
 
 
 We didn't have a certificate expiration. We replaced all individual
 certificates, delivered by different top level domains, with a single
 unified certificate. This change was to fix certificate errors being
 shown
 on all non-wikipedia domains for HTTPS mobile users, who were being
 delivered the *.wikipedia.org certificate for all domains.
 
 The unified certificate was missing 6 Subject Alternative Names:
 mediawiki.org, *.mediawiki.org, m.mediawiki.org, *.m.mediawiki.org,
 m.wikipedia.org and *.m.wikipedia.org. Shortly after deploying the
 certificate we noticed it was bad and reverted the affected services (
 mediawiki.org and mobile) back to their individual certificates. The
 change
 only affected a small portion of users for a short period of time.
 
 If you notice, I've already mentioned how we'll avoid and more quickly
 detect problems like this in the future:
 
 Needless to say I'll be writing a script that can be run against a cert
 to ensure it's not missing anything. We'll also be adding monitoring to
 check for invalid certificates for any top level domain.

I don't really think it was necessary to be this defensive, do you?

Well, clearly, you do.  My apologies for trying to be helpful in making 
sure you saw an analysis with useful information in it.

Cheers,
-- jra
-- 
Jay R. Ashworth  Baylink   j...@baylink.com
Designer The Things I Think   RFC 2100
Ashworth  Associates http://baylink.pitas.com 2000 Land Rover DII
St Petersburg FL USA   #natog  +1 727 647 1274

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Next Bugday: March 19 15:00-21:00 UTC

2013-03-14 Thread Valerie Juarez
Hello everyone!

Please join us on the next Wikimedia Bugday:

Tuesday, March 19, 15:00-21:00 UTC [1]
   in #wikimedia-dev on Freenode IRC [2]

We will be triaging bug reports in the LiquidThreads component [3]. The
idea came up on the wikitech-l mailing list [4]. While LiquidThreads is not
actively developed anymore, it is still popular and deployed in many places.
We will be focusing on cleaning up and improving the quality of these reports,
and retesting some of them on mediawiki.org or test2.wikipedia.org.

Everyone is welcome to join, and no technical knowledge is needed! It's an
easy way to get involved in the community or to give something back. This
event is part of a weekly QA activity [5], so we encourage you to triage
throughout the week and record your activity on the bug day etherpad [6].

This information and more can be found here:
https://www.mediawiki.org/wiki/Bug_management/Triage/20130318

For more information on Triaging in general, check out
https://www.mediawiki.org/wiki/Bug_management/Triage

I look forward to seeing you there!
Valerie


[1] Timezone converter: http://www.timeanddate.com/worldclock/converter.html
[2] See http://meta.wikimedia.org/wiki/IRC for more info on IRC chat
[3]
https://bugzilla.wikimedia.org/buglist.cgi?action=wrap&component=LiquidThreads&product=MediaWiki%20extensions&resolution=---&list_id=186677
(216 bugs)
[4]
http://lists.wikimedia.org/pipermail/wikitech-l/2013-February/066847.html
[5] https://www.mediawiki.org/wiki/QA/Weekly_goals
[6] http://etherpad.wmflabs.org/pad/p/BugTriage
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Security training - Wednesday 20, 17:00 UTC

2013-03-14 Thread Quim Gil
If you are developing for MediaWiki and you haven't gone through one of 
our security trainings before, this session is for you:



Security for developers training session and QA
by Chris Steipp.

Wednesday, March 20 at 17:00 UTC
http://www.timeanddate.com/worldclock/fixedtime.html?iso=20130320T1700
#wikimedia-office IRC channel


Before the session you *must* watch the MediaWiki Security presentation:
http://vimeo.com/album/2013929/video/44856793


Extra bonus if you also check the Secure Coding slides and video:
https://commons.wikimedia.org/wiki/File:TechDays2012-SecureCoding.pdf
https://commons.wikimedia.org/wiki/File:MediaWiki_Security.ogv


... and the page you want to watch:
https://www.mediawiki.org/wiki/Security_for_developers

--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Mediawiki-api] Provisional API extension for CAPTCHA on action=createaccount

2013-03-14 Thread Brad Jorsch
I filed that yesterday as bug 46072
(https://bugzilla.wikimedia.org/show_bug.cgi?id=46072). ;)

I'll try to find time to review it before the weekend.

On Thu, Mar 14, 2013 at 3:55 PM, Brion Vibber bvib...@wikimedia.org wrote:

 As some folks may recall, an action=createaccount was added to the API a
 few weeks ago. Unfortunately the first pass didn't include CAPTCHA support,
 so we haven't been able to use it for the live sites yet (which use
 ConfirmEdit's FancyCaptcha mode). We expect to start using this in the
 next couple weeks for the mobile Commons apps, so it's time to make it
 captcha-friendly...

 I've made a first stab at adding support, based on the existing captcha
 interfaces for login and editing:

 MediaWiki core: https://gerrit.wikimedia.org/r/53793
 ConfirmEdit ext: https://gerrit.wikimedia.org/r/53794

 So far I've tested it with the default 'math captcha' mode, with this test
 rig: https://github.com/brion/mw-createaccount-test


 If a captcha needs to be run, action=createaccount will spit back a result
 including a 'captcha' field with several subfields, such as in this
 example:
 https://www.mediawiki.org/wiki/API_talk:Account_creation#Captcha_additions

 Since account creation requires a first request to get a token anyway,
 this shouldn't significantly add to the complexity of using the API.


 Text captchas will have a 'question' subfield to be presented; image
 captchas will have a 'url' field which should be loaded as the image.
 'type' and 'mime' will vary, and probably shouldn't be relied on too closely.

 Pass back the captcha id field in 'captchaid' and the response word in
 'captchaword' parameters along with the final request, and it should pass
 the captcha. If the captcha doesn't pass, you get back new captcha info and
 can continue.
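
 A sketch of what that final request might look like with curl (untested;
 the account values are illustrative, and 'captchaid'/'captchaword' are
 the new parameters from the patches above):

   curl -d action=createaccount -d format=json \
        -d name=ExampleUser -d password=example-pass \
        -d token="$CREATE_TOKEN" \
        -d captchaid="$CAPTCHA_ID" -d captchaword=answer \
        https://www.mediawiki.org/w/api.php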


 Questions:
 * Does this seem to make sense? :)
 * Will other API users kill me for this or will it be easy to use?
 * Any surprises that may turn up?
 * Any bugs/style problems in my code so far?

 -- brion

 ___
 Mediawiki-api mailing list
 mediawiki-...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/mediawiki-api




-- 
Brad Jorsch
Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Mediawiki-api] Provisional API extension for CAPTCHA on action=createaccount

2013-03-14 Thread Matthew Flaschen
On 03/14/2013 04:01 PM, Brad Jorsch wrote:
 I've made a first stab at adding support, based on the existing captcha
 interfaces for login and editing:

 MediaWiki core: https://gerrit.wikimedia.org/r/53793
 ConfirmEdit ext: https://gerrit.wikimedia.org/r/53794

Side note, there is also an in-progress API for getting a new
FancyCaptcha (https://gerrit.wikimedia.org/r/#/c/44376), used for a
refresh captcha button.

Matt Flaschen

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Pronunciation recording tool wanted

2013-03-14 Thread Lars Aronsson

On 03/13/2013 08:16 AM, Nikola Smolenski wrote:
A very nice website that does this already is www.forvo.com, but they
use a by-nc-sa licence. Still, the way it works could serve as inspiration.


Forvo looks very nice, and if they can do the job,
I'm happy that we don't have to. We should try
to collaborate with them.

However, when I try it, it says "Oops! The recorder
is having a fix. Please, try again later. Thanks.",
both yesterday and today.


--
  Lars Aronsson (l...@aronsson.se)
  Aronsson Datateknik - http://aronsson.se



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Pronunciation recording tool wanted

2013-03-14 Thread Matthew Flaschen
On 03/14/2013 05:49 PM, Lars Aronsson wrote:
 Forvo looks very nice, and if they can do the job,
 I'm happy that we don't have to. We should try
 to collaborate with them.

Unfortunately, their license (non-commercial) is not free as in freedom,
and not acceptable for Wikimedia projects.

It's possible they might be able to change that license for particular
recordings.

Matt Flaschen

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-14 Thread Juliusz Gonera

On 03/09/2013 01:06 PM, Tyler Romeo wrote:

I strongly disagree that Gerrit is harder to learn than Github. The only
difficult thing to understand is the web UI, which takes only a few minutes
to really get used to. Let's look at the biggest complaints:


Let's not forget about this one: "...forces a *one commit at a time*
workflow on developers and forces the use of 'git commit --amend' as the
only way to update patches." [1] For me this defeats the purpose of branches.


[1] 
http://www.mediawiki.org/wiki/Git/Gerrit_evaluation#The_case_against_Gerrit


--
Juliusz
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-14 Thread Juliusz Gonera

On 03/08/2013 10:20 AM, Bartosz Dziewoński wrote:
On Fri, 08 Mar 2013 17:07:18 +0100, Antoine Musso hashar+...@free.fr 
wrote:

I guess the whole idea of using GitHub is for public relation and to
attract new people.  Then, if a developer is not willing to learn
Gerrit, its code is probably not worth the effort of us integrating
github/gerrit.  That will just add some more poor quality code to your
review queues.


This, a hundred times. I manage a few (small) open-source projects at
GitHub, and most of the patches I get are not even up to my standards
(and those are significantly lower than WMF's).


Submitting a patch to gerrit and even fixing it after code review is 
not that hard. (Of course any more complicated operations like 
rebasing do suck, but you hopefully won't be doing that with your 
first patch.)


I strongly disagree with this. I also get some poor quality pull 
requests to my projects on GitHub, but once in a while I get something good.


To be honest, if I hadn't worked at WMF I'd never have thought about
learning something as obscure as gerrit just to submit a small patch.
And I wouldn't assume that anyone would want to contribute something
big without knowing the project well.


My reasoning: people get involved in open source projects by starting 
with a small contribution and if we don't make it easy for them, they 
just won't try.


--
Juliusz

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-14 Thread Juliusz Gonera

On 03/08/2013 08:55 AM, Andrew Otto wrote:

I've been hosting my puppet-cdh4 (Hadoop) repository on Github for a while now. 
 I am planning on moving this into Gerrit.

I've been getting pretty high quality pull requests for the last month or so 
from a couple of different users. (Including CentOS support, supporting 
MapReduce v1 as well as YARN, etc.)

  https://github.com/wikimedia/puppet-cdh4/issues?page=1&state=closed

I'm happy to host this in Gerrit, but I suspect that contribution to this 
project will drop once I do. :/


Why do you want to move it to gerrit then?

--
Juliusz

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] en.planet updates stuck - need Python help (upstream problem)

2013-03-14 Thread Daniel Zahn
Hi,

unfortunately en.planet updates are still being stuck.

https://bugzilla.wikimedia.org/show_bug.cgi?id=45806

The issue is in feedparser.py

UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 32:
ordinal not in range(128)

I would really appreciate help on this from somebody more familiar
with Python/Django template/parsing feeds/ unicode problems.

Thanks
---

INFO:planet.runner:Loading cached data
Traceback (most recent call last):
  File "/usr/bin/planet", line 138, in <module>
    splice.apply(doc.toxml('utf-8'))
  File "/usr/lib/pymodules/python2.7/planet/splice.py", line 118, in apply
    output_file = shell.run(template_file, doc)
  File "/usr/lib/pymodules/python2.7/planet/shell/__init__.py", line 66, in run
    module.run(template_resolved, doc, output_file, options)
  File "/usr/lib/pymodules/python2.7/planet/shell/tmpl.py", line 254, in run
    for key,value in template_info(doc).items():
  File "/usr/lib/pymodules/python2.7/planet/shell/tmpl.py", line 193, in template_info
    data=feedparser.parse(source)
  File "/usr/lib/pymodules/python2.7/planet/vendor/feedparser.py", line 3525, in parse
    feedparser.feed(data)
  File "/usr/lib/pymodules/python2.7/planet/vendor/feedparser.py", line 1662, in feed
    sgmllib.SGMLParser.feed(self, data)
  File "/usr/lib/python2.7/sgmllib.py", line 104, in feed
    self.goahead(0)
  File "/usr/lib/python2.7/sgmllib.py", line 143, in goahead
    k = self.parse_endtag(i)
  File "/usr/lib/python2.7/sgmllib.py", line 320, in parse_endtag
    self.finish_endtag(tag)
  File "/usr/lib/python2.7/sgmllib.py", line 360, in finish_endtag
    self.unknown_endtag(tag)
  File "/usr/lib/pymodules/python2.7/planet/vendor/feedparser.py", line 569, in unknown_endtag
    method()
  File "/usr/lib/pymodules/python2.7/planet/vendor/feedparser.py", line 1512, in _end_content
    value = self.popContent('content')
  File "/usr/lib/pymodules/python2.7/planet/vendor/feedparser.py", line 849, in popContent
    value = self.pop(tag)
  File "/usr/lib/pymodules/python2.7/planet/vendor/feedparser.py", line 764, in pop
    mfresults = _parseMicroformats(output, self.baseuri, self.encoding)
  File "/usr/lib/pymodules/python2.7/planet/vendor/feedparser.py", line 2219, in _parseMicroformats
    p.vcard = p.findVCards(p.document)
  File "/usr/lib/pymodules/python2.7/planet/vendor/feedparser.py", line 2161, in findVCards
    sVCards += '\n'.join(arLines) + '\n'
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 32:
ordinal not in range(128)
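
For anyone picking this up: the failure pattern is easy to reproduce,
and the likely shape of a fix is to decode the plain-str items before
joining (a sketch, not a tested patch against feedparser):

  # Python 2: joining promotes to unicode when any item is unicode,
  # decoding str items with the ascii codec, which blows up on UTF-8 bytes.
  lines = ['caf\xc3\xa9', u'r\xe9sum\xe9']  # mixed str/unicode, like arLines
  # u'\n'.join(lines) raises UnicodeDecodeError, as at feedparser.py:2161
  decoded = [l.decode('utf-8', 'replace') if isinstance(l, str) else l
             for l in lines]
  joined = u'\n'.join(decoded)  # use the feed's real encoding, not a guess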



-- 
Daniel Zahn dz...@wikimedia.org
Operations Engineer

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Security training - Wednesday 20, 17:00 UTC

2013-03-14 Thread Thomas Gries
tl;dr
MediaWiki Security Guide
https://www.mediawiki.org/wiki/MSG
PDF or e-book format available


Am 14.03.2013 20:37, schrieb Quim Gil:
 If you are developing for MediaWiki and you haven't gone through one
 of our security trainings before, this session is for you:

Yes: we have a Virtual Library (PDF or e-book format)
https://www.mediawiki.org/wiki/MVL

In that library, collections of interesting articles can be generated,
i.e. you get the latest article versions and can download them to your
PC or tablet for off-line reading.

One such collection covers security-related articles:

MediaWiki Security Guide
https://www.mediawiki.org/wiki/MSG

Please feel free to add further interesting articles to the collection,
and to keep it updated.
And: please make sure not to break it.

Tom



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] en.planet updates stuck - need Python help (upstream problem)

2013-03-14 Thread Daniel Zahn
note: this did not happen from the beginning and does not apply to
other languages (or at least not all of them), so it depends on which
feeds you subscribe to and their (current) content. It is not that easy,
though, to identify which feed causes it, since the update goes through
all of them, writes to the cache directory, and only at the very end
tries to assemble HTML from the cached data, without telling you where
exactly it got stuck.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] en.planet updates stuck - need Python help (upstream problem)

2013-03-14 Thread John
It's a matter of encoding; see my post on the bug.

On Thu, Mar 14, 2013 at 6:29 PM, Daniel Zahn dz...@wikimedia.org wrote:
 note: this did not happen from the beginning and does not apply to
 other languages (or at least not all of them),
 so it depends which feeds you subscribe to and their (current)
 content. It is not that easy though to identify which feed
 causes it, since the update goes through all of them, writes to cache
 directory, and in the very end tries to assemble HTML from
 cached data, not telling you where exactly it got stuck.

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Pronunciation recording tool wanted

2013-03-14 Thread Lars Aronsson

On 03/13/2013 08:16 AM, Nikola Smolenski wrote:
A very nice website that does this already is www.forvo.com, but they
use a by-nc-sa licence.


Ah, now I see this detail: Yes, the -NC- clause
in their license makes them useless for us.
That's a pity.

Having been through the great license shift in
OpenStreetMap, I think we should use cc0 as
far as we can. It remains my suggestion that
any tool should demand cc0, but of course that
will be the choice of the tool developer.


--
  Lars Aronsson (l...@aronsson.se)
  Aronsson Datateknik - http://aronsson.se



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] enwiki dump -- retrying dumping database tables on partial failure?

2013-03-14 Thread Neil Harris

Dear Wikimedia ops team,

The most recent enwiki dump now seems to have finished _almost_ 
successfully, apart from the dumping of the database metadata tables 
such as the pages table and the various links tables, almost all of 
which have failed.


I wonder if there is any chance someone could give this a kick, and 
re-try the dumping of these tables to finish the dump?


Since this seems to have happened several times now, could it be worth 
considering automating re-trying in this sort of situation, to improve 
dump reliability for the future without needing manual intervention?


Thanks,

Neil


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] enwiki dump -- retrying dumping database tables on partial failure?

2013-03-14 Thread Ariel T. Glenn
On Thu, 14-03-2013, at 23:24, Neil Harris wrote:
 Dear Wikimedia ops team,
 
 The most recent enwiki dump now seems to have finished _almost_ 
 successfully, apart from the dumping of the database metadata tables 
 such as the pages table and the various links tables, almost all of 
 which have failed.
 
 I wonder if there is any chance someone could give this a kick, and 
 re-try the dumping of these tables to finish the dump?

It's rerunning the tables now.

 Since this seems to have happened several times now, could it be worth 
 considering automating re-trying in this sort of situation, to improve 
 dump reliability for the future without needing manual intervention?
 
Because the reasons for failure are varied (ranging from hardware
failure to dbs being unreachable to broken MediaWiki code being
deployed), automating restarts isn't practical; typically someone needs
to find out what the underlying cause is and take appropriate action.

Ariel




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-14 Thread Andrew Otto
 I've been hosting my puppet-cdh4 (Hadoop) repository on Github for a while 
 now.  I am planning on moving this into Gerrit.
 Why do you want to move it to gerrit then?

Security reasons.  All puppet repos have to be hosted by WMF and reviewed by
ops before we can use them in production.


On Mar 14, 2013, at 4:11 PM, Juliusz Gonera jgon...@wikimedia.org wrote:

 On 03/08/2013 08:55 AM, Andrew Otto wrote:
 I've been hosting my puppet-cdh4 (Hadoop) repository on Github for a while 
 now.  I am planning on moving this into Gerrit.
 
 I've been getting pretty high quality pull requests for the last month or so 
 from a couple of different users. (Including CentOS support, supporting 
 MapReduce v1 as well as YARN, etc.)
 
   https://github.com/wikimedia/puppet-cdh4/issues?page=1&state=closed
 
 I'm happy to host this in Gerrit, but I suspect that contribution to this 
 project will drop once I do. :/
 
 Why do you want to move it to gerrit then?
 
 -- 
 Juliusz
 
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-14 Thread Tyler Romeo
Can we please be real here? The reason more contributors come in through
GitHub than through Gerrit is that they *already have a GitHub account*.
My browser is always logged into GitHub, and it's at the point where I can
casually just fork a project and begin working on it, whereas with Gerrit
you need to request an account.

Like I said before, if you know how to use Git, you know how to use Gerrit
(and the converse is true as well). The primary thing holding people
back is that it's confusing and not user friendly enough to make an account
and get working. Imagine if people could sign into Gerrit using their
Google accounts like Phabricator allows. I can guarantee participation
would skyrocket.

*--*
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-14 Thread Juliusz Gonera

On 03/14/2013 07:19 PM, Tyler Romeo wrote:

Like I said before, if you know how to use Git, you know how to use Gerrit
(and the contra-positive is true as well). The primary thing holding people
back is that it's confusing and not user friendly enough to make an account
and get working. Imagine if people could sign into Gerrit using their
Google accounts like Phabricator allows. I can guarantee participation
would skyrocket.


I wouldn't be that optimistic; maybe it would increase slightly. Having
an account is one of the factors, but I wouldn't underestimate user
friendliness. The first time I tried to find the URL to clone a repo in
gerrit, it took me probably around a minute. On GitHub it took me about
5 seconds.


--
Juliusz

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-14 Thread Brian Wolff
On 2013-03-14 11:20 PM, Tyler Romeo tylerro...@gmail.com wrote:

 Can we please be real here? The reason more contributors come in through
 GitHub than through Gerrit is because they *already have a GitHub
account*.
 My browser is always logged into GitHub, and it's at the point where I can
 casually just fork a project and begin working on it, whereas with Gerrit
 you need to request an account.

 Like I said before, if you know how to use Git, you know how to use Gerrit
 (and the contra-positive is true as well). The primary thing holding
people
 back is that it's confusing and not user friendly enough to make an
account
 and get working. Imagine if people could sign into Gerrit using their
 Google accounts like Phabricator allows. I can guarantee participation
 would skyrocket.

 *--*
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

I can state from personal experience that creating an account was not the
hard part (particularly because I had my account created for me ;) and
there definitely was a hard part: learning gerrit. I have no idea how
easy/hard it is to do things on github as I don't have an account there,
but at the very least they probably have more usability engineers than
gerrit has.

SVN had a much harder account creation procedure, but I personally felt
the learning curve was much lower (or maybe I was just more familiar with
the ideas involved).

Anyhow, the point of this ramble: gerrit is difficult for newbies (or at
least it was when I was one; many others have said similar things). While
we certainly want to keep gerrit, it's important to recognize this and
mitigate the difficulties where it is reasonable to do so.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-14 Thread K. Peachey
On Fri, Mar 15, 2013 at 12:19 PM, Tyler Romeo tylerro...@gmail.com wrote:
 Can we please be real here? The reason more contributors come in through
 GitHub than through Gerrit is because they *already have a GitHub account*.
 My browser is always logged into GitHub,

?!? Unless you magically had an account created for you at GitHub, that
logic fails.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-14 Thread Erik Moeller
On Thu, Mar 14, 2013 at 7:46 PM, Juliusz Gonera jgon...@wikimedia.org wrote:

 I wouldn't be that optimistic, maybe it would slightly increase. Having an
 account is one of the factors but I wouldn't underestimate user
 friendliness. The first time I tried to find the URL to clone a repo in
 gerrit it took me probably around a minute. On GitHub it probably took me 5
 seconds.

And I wouldn't be too quick to celebrate the increased vendor lock-in
of a large percentage of the open source community into an ecosystem
of partially proprietary tools and services (the GitHub engine itself,
the official GitHub applications, etc.). Gerrit and other open source
git repo management and code review tools are one of the best hopes
for the development of a viable alternative. Unlike GitHub, Gerrit can
be improved by its users over time, and the issues that frustrate and
annoy us about it _can_ be fixed (and indeed, many have been).

Yes to better pull request management from GitHub. But let's stop
complaining about Gerrit, and instead get both functionality and UX
issues into their bug tracker, and help get them fixed.

Erik

-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation

Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-14 Thread Quim Gil
This thread has probably reached its limit of fresh arguments. Can we
continue with proper documentation, reporting and patches?


We care about GitHub and we believe it is an important source of
potential contributors. This is why we are mirroring our repos there,
and this is why we are working on:


Bug 35497 - Implement a way to bring GitHub pull requests into Gerrit
https://bugzilla.wikimedia.org/show_bug.cgi?id=35497

If you want to improve the path from GitHub to Gerrit, please contribute
to that bug report or, e.g., to a landing page (at mediawiki.org or
github.com) to help GitHub users interested in our projects.



About the Gerrit UI: if you have specific complaints or enhancement
requests, please file them. We have been upstreaming reports and also
patches. We are surely not the only Gerrit users thinking that GitHub
does some things better, so there is hope. But saying "it's ugly"
doesn't help.



On 03/09/2013 01:06 PM, Tyler Romeo wrote:

- Registration - The only thing that is probably a problem. We need to
make Gerrit registration easier.


What are the bottlenecks nowadays? If there are still any please file bugs.

--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l