Re: [Wikitech-l] GSOC Proposal: Watchlist Improvements

2012-03-27 Thread Aaron Pramana
Thanks Steven,

I have posted the deliverables for my proposal here:
https://www.mediawiki.org/wiki/User:Blackjack48/GSOC_proposal_for_watchlist_improvements

Is there any place in particular I should post the link besides this
mailing list?

-Aaron
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Committing followups: please no --amend

2012-03-27 Thread Amir E. Aharoni
2012/3/27 Tim Starling tstarl...@wikimedia.org:
 For commits with lots of files, Gerrit's diff interface is too broken
 to be useful. It does not provide a compact overview of the change
 which is essential for effective review.

 Luckily, there are alternatives, specifically local git clients and
 gitweb. However, these don't work when git's change model is broken by
 the use of git commit --amend.

 For commits with a small number of files, such changes are reviewable
 by the use of the patch history table in the diff views. But when
 there are a large number of files, it becomes difficult to find the
 files which have changed, and if there are a lot of changed files, to
 produce a compact combined diff.

 So if there are no objections, I'm going to change [[Git/Workflow]] to
 restrict the recommended applications of git commit --amend, and to
 recommend plain git commit as an alternative. A plain commit seems
 to work just fine. It gives you a separate commit to analyse with
 Gerrit, gitweb and client-side tools, and it provides a link to the
 original change in the dependencies section of the change page.

It sounds similar to what i said in the thread consecutive commits in
Gerrit‏, so i probably support it, but i don't completely understand
how will it work with the `git review' command, which doesn't like
multiple commits. If the documentation will explain how to use `git
review' with follow up commits, it will be fine.

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Cutting MediaWiki loose from wikitext

2012-03-27 Thread Daniel Kinzler
On 27.03.2012 00:09, Platonides wrote:
 It looks really evil publishing that svn branch just days after git
 migration :)
 I think that branch -created months ago- should be migrated to git, so
 we could all despair..^W benefit from git wonderful branching abilities.

Indeed - when I asked Chad about that, he said ask me again once the dust has
settled. I'd be happy to have this in git.

Or... well, maybe I'll just make a patch from that branch, make a fresh branch
in git, and cherry pick the changes, trying to keep things minimal. Yea, that's
probably the best thing to do.

-- daniel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[MediaWiki-CodeReview] [MediaWiki r114496]: New comment added

2012-03-27 Thread MediaWiki Mail
Nikerabbit posted a comment on MediaWiki.r114496.
URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/114496#c32530

Commit summary for MediaWiki.r114496:

really correct api path this time.

Nikerabbit's comment:

in mediawiki.util there is shorcut for this: mw.util.wikiScript( 'api' )

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r114501]: New comment added

2012-03-27 Thread MediaWiki Mail
Nikerabbit posted a comment on MediaWiki.r114501.
URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/114501#c32531

Commit summary for MediaWiki.r114501:

added actual metadata to the template, proper i18n strings

Nikerabbit's comment:

Why no plural for bytes?

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r114511]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Nikerabbit changed the status of MediaWiki.r114511 to ok
URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/114511

Old status:  deferred
New status: ok

Commit summary for MediaWiki.r114511:

Removed support for SMW  1.6

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[Wikitech-l] I don't want a full git clone, I just want an rsync(1) style diet git clone

2012-03-27 Thread jidanni
Gentlemen, I am very thankful for
http://www.mediawiki.org/wiki/Download_from_Git
however it assumes one wants to 'get involved' with MediaWiki, whereas
all I want to do is 'rsync snapshots' of the same files 'that the
tarballs contain'. I.e., I just want to oh, every few days update
(overwrite in place) my running copy of MediaWiki. No need for the
history etc. that a 'clone' pulls in, or further 'pull's refresh.
(The disk size I use should not grow a single bit between updates unless
you upstream added some lines to a file.)

So it would be nice if someone mentioned the commands necessary so I can
add them to that article. Let's pretend we are starting fresh (and not
from SVN like in
http://lists.wikimedia.org/pipermail/wikitech-l/2012-February/058190.html .)

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[MediaWiki-CodeReview] [MediaWiki r114512]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Nikerabbit changed the status of MediaWiki.r114512 to ok
URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/114512

Old status:  deferred
New status: ok

Commit summary for MediaWiki.r114512:

New version: 0.3.2 - support removed for MW  1.16, and SMW  1.6

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


Re: [Wikitech-l] [Wikitext-l] Fwd: Cutting MediaWiki loose from wikitext

2012-03-27 Thread Daniel Kinzler
On 27.03.2012 02:19, Daniel Friesen wrote:
 Non-wikitext data is supposed to give extensions the ability to do things 
 beyond
 WikiText. The data is always going to be an opaque form controlled by the
 extension.
 I don't think that low level serialized data should be visible at all to
 clients. Even if they know it's there.

The serialized form of the data needs to be visible at least in the XML dump
format. How else could we transfer non-wikitext content between wikis?

Using the serialized form may also make sense for editing via the web API,
though I'm not sure yet what the best ways is here:

a) keep using the current general, text based interface with the serialized form
of the content

or b) require a specialized editing API for each content type.

Going with a) has the advantage of that it will simply work with current API
client code. However, if the client modifies the content and writes it back
without being aware of the format, it may corrupt the data. So perhaps we should
return an error when a client tries to edit a non-wikitext page the old way.

The b) option is a bit annoying because it means that we have to define a
potentially quite complex mapping between the content model and API's result
model (nested php arrays). This is easy enough for Wikidata, which uses a JSON
based internal model. But for, say, SVG... well, I guess the specialized mapping
could still be escaped XML as a string.

Note that if we allow a), we can still allow b) at the same time - for Wikidata,
we will definitely implement a special purpose editing interface that supports
stuff like add value for language x to property y, etc.

 Just like database schemas change, I expect extensions to also want to alter 
 the
 format of data as they add new features.

Indeed. This is why in addition to a data model identifier, the serialization
format is explicitly tracked in the database and will be present in dumps and
via the web API.

 Also I've thought about something like this for quite awhile. One of the 
 things
 I'd really like us to do is start using real metadata even within normal
 WikiText pages. We should really replace in-page [[Category:]] with a real
 string of category metadata. Which we can then use to provide good intuitive
 category interfaces. ([[Category:]] would be left in for templates,
 compatibility, etc...).

That could be implemented using a multipart content type. But I don't want to
get into this too deeply - multipart has a lot of cool uses, but it's beyond
what we will do for Wikidata.

 This case especially tells me that raw is not something that should be
 outputting the raw data, but should be something which is implemented by
 whatever implements the normal handling for that serialized data.

you mean action=raw? yes, I agree. action=raw should not return the actual
serialized format. It should probably return nothing or an error for non-text
content. For multipart pages it would just return the main part, without the
extensions.

But the entire multipart stuff needs more thought. It has a lot of great
applications, but it's beyond the scope of Wikidata, and it has some additional
implications (e.g. can the old editing interface be used to edit just the text
while keeping the attachments?).

-- daniel


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Cutting MediaWiki loose from wikitext

2012-03-27 Thread Daniel Kinzler
On 27.03.2012 00:33, Tim Starling wrote:
 For the record: we've discussed this previously and I'm fine with it.
 It's a well thought-out proposal, and the only request I had was to
 ensure that the DB schema supports some similar projects that we have
 in the idea pile, like multiple parser versions.

Thanks Tim! The one important bit I'd like to hear from you is... do you think
it is feasible to get this not only implemented but also reviewed and deployed
by August?... We are on a tight schedule with Wikidata, and this functionality
is a major blocker.

I think implementing ContentHandlers for MediaWiki would have a lot of benefits
for the future, but if it's not feasible to get it in quickly, I have to think
of an alternative way to implement structured data storage.

Thanks
Daniel


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Cutting MediaWiki loose from wikitext

2012-03-27 Thread Daniel Kinzler
On 27.03.2012 00:37, MZMcBride wrote:
 It's an ancient assumption that's built in to many parts of MediaWiki (and
 many outside tools and scripts). Is there any kind of assessment about the
 level of impact this would have?

Not formally, just my own poking at the code base. There is a lot of places in
the code that access revision text, and do something with it, not all can easily
be found or changed (especially true for extensions).

My proposal covers a compatibility layer that will cause legacy code to just see
an empty page when trying to access the contents of a non-wikitext page. Only
code aware of content models will see any non-wikitext content. This should
avoid most problems, and should ensure that things will work as before at least
for everything that is wikitext.

 For example, would the diff engine need to be rewritten so that people can
 monitor these pages for vandalism? 

A diff engine needs to be implemented for each content model. The existing
engine(s) does not need to be rewritten, it will be used for all wikitext pages.

 Will these pages be editable in the same
 way as current wikitext pages? 

No. The entire point of this proposal is to be able to neatly supply specialized
display, editing and diffing of different kinds of content.

 If not, will there be special editors for the
 various data types? 

Indeed.

 What other parts of the MediaWiki codebase will be
 affected and to what extent? 

A few classes (like Revision or WikiPage) need some major additions or changes,
see the proposal on meta. Lots of places should eventually be changed to become
aware of content models, but don't need to be adapted immediately (see above).

 Will text still go in the text table or will
 separate tables and infrastructure be used?

Uh, did you read the proposal?...

All content is serialized just before storing it. It is stored into the text
table using the same code as before. The content model and serialization format
is recorded in the revision table.

Secondary data (index data, analogous to the link tables) may be extracted from
the content and stored in separate database tables, or in some other service, as
needed.

 I'm reminded a little of LiquidThreads for some reason. This idea sounds
 good, but I'm worried about the implementation details, particularly as the
 assumption you seek to upend is so old and ingrained.

It's more like the transition to using MediaHandlers instead of assuming
uploaded files to be images: existing concepts and actions are generalized to
apply to more types of content.

LiquidThreads introduces new concepts (threads, conversations) and interactions
(re-arranging, summarazing, etc) and tries to integrate them with the concepts
used for wiki pages. This seems far more complicated to me.

 The background is that the Wikidata project needs a way to store structured
 data (JSON) on wiki pages instead of wikitext. Having a pluggable system 
 would
 solve that problem along with several others, like doing away with the 
 special
 cases for JS/CSS, the ability to maintain categories etc separate from body
 text, manage Gadgets sanely on a wiki page, or several other things (see the
 link below).
 
 How would this affect categories being stored in wikitext (alongside the
 rest of the page content text)? That part doesn't make any sense to me.

Imagine a data model that works like mime/multipart email: you have a wrapper
that contains the main text as well as attachments. The whole shebang gets
serialized and stored in the text table, as usual. For displaying, editing and
visualizing, you have code that is aware of the multipart nature of the content,
and puts the parts together nicely.

However, the category stuff is a use case I'm just mentioning because it has bee
requested so often in the past (namely, editing categories, interlanguage links,
etc separately from the wiki text); this mechanism is not essential to the
concept of ContentHandlers, and not something I plan to implement for the
Wikidata project. It'S just somethign that will become much easier once we have
ContentHandlers.

-- daniel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Community managed production environment

2012-03-27 Thread Petr Bena
I thought that labs are not going to be used to run some final
products, rather just to develop them? Also is it secure to use same
domain for testing and stable version? The stable version would be
better if it was more restricted.

On Tue, Mar 27, 2012 at 5:36 AM, Ryan Lane rlan...@gmail.com wrote:
 We already started working on a new virtual cluster known as labs
 (wmflabs.org) which purpose is to allow people develop stuff and later
 move it to some production, some time ago. I believe it would be nice
 to have exactly same environment (probably we could just extend
 wmflabs for that) running probably on same platform (virtual cluster
 managed through some site, using nova extension) which would have
 exactly same possibilities but it would be supposed to run final
 products (not a testing environment as labs, but production where
 the stable version would live)

 Why do we need this?

 Wikimedia labs will offer cloned db of production in future which
 would allow it to run community managed tools like
 http://toolserver.org/~quentinv57/tools/sulinfo.php and similar. I
 think it would be best if such tools were developed using labs as a
 testing platform and stable version pushed to this production which
 should only run the stable code. In fact it doesn't even need to be
 physically another cluster, just another set of virtual instances
 isolated from testing environment on labs. The environment would have
 restrictions which we don't have on labs. People would need to use
 puppet and gerrit mostly for everything, and root would not be given
 to everyone in this environment (some projects might be restricted to
 wmf ops only), so that we could even move all stable bots, we
 currently host on wmflabs there, without being afraid of leaking the
 bot credentials and such (that's a reason why bots project is
 restricted atm). Also the applications which ask for wikimedia
 credentials could be allowed there, since the code living on this
 production would be subject of review, and such projects which could
 mean security risk could be managed by wmf ops only (the changes could
 be done by volunteers but would need to be submitted to gerrit).

 We could also move some parts of current production to this community
 managed environment. I talked to Roan Kattouw in past regarding
 moving the configuration of wikimedia sites to some git repository so
 that volunteers could submit some patches to gerrit or handle bugzilla
 reports without needing shell access. Changes to production config
 would be merged by operation enginners, so that it would be completely
 secure.

 In a nutshell:

 This environment could be set up on same platform as wmf labs (no
 extra costs, just hard work :)), stable products (bots, user scripts)
 would be living there, while labs would serve only for development and
 nothing else.

 The production version would live on another domain, like
 wikimedia-tools.org or wmtools.org

 Thanks for your comments and responses


 I don't see the need for a different domain name. tools.wmflabs.org
 should suffice.

 Also, this can be accomplished by having multiple projects.

 - Ryan

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Cutting MediaWiki loose from wikitext

2012-03-27 Thread Alex Brollo
I can't understand details of this talk, but if you like take a look to the
raw code of any ns0 page into it.wikisource and consider that area dati
is removed from wikitext as soon as an user opens the page in edit mode,
and re-builded as the user saves it; or take a look here:
http://it.wikisource.org/wiki/MediaWiki:Variabili.js where date used into
automation/help of edit are collected as js objects.


Alex brollo
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[MediaWiki-CodeReview] [MediaWiki r114501]: New comment added

2012-03-27 Thread MediaWiki Mail
Raindrift posted a comment on MediaWiki.r114501.
URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/114501#c32532

Commit summary for MediaWiki.r114501:

added actual metadata to the template, proper i18n strings

Raindrift's comment:

I figured that single-byte articles are rare enough that keeping things simple 
would be better.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r114501]: New comment added, and revision status changed

2012-03-27 Thread MediaWiki Mail
Nikerabbit changed the status of MediaWiki.r114501 to fixme and commented 
it.
URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/114501#c32533

Old Status: new
New Status: fixme

Commit summary for MediaWiki.r114501:

added actual metadata to the template, proper i18n strings

Nikerabbit's comment:

It's not just about number one - there are complex rules for other languages.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


Re: [Wikitech-l] [Wikitext-l] Fwd: Cutting MediaWiki loose from wikitext

2012-03-27 Thread Daniel Kinzler
On 27.03.2012 09:33, Oren Bochman wrote:
 1. JSON - that's not a very reader friendly format. Also not an ideal format
  for the search engine to consume. This is due to lack of Support for
 metadata and data schema. XML is universally supported, more human friendly
 and support a schema which can be useful way beyond their this initial .

JSON is the internal serialization format. It will not be shown to the user or
used to communicate with clients. Unless of course they use JSON for interaction
with the web API, as most do.

The full text search engine will be fed a completely artificial view of the
data. I agree that JSON wouldn't be good for that, though XML would be far worse
still.

As to which format and data model to use to represent Wikidata records
internally: that's a different discussion, independent of the idea of
introducing ContentHandlers to MediaWiki. Please post to wikidata-l about that.

 2. Be bold but also be smart and give respect where it is due. Bots and
 everyone else who's written tools for and about MediaWiki, who made a basic
 assumption about the page structure would be broken. Many will not so readily
 adapt.

I agree that backwards compatibility is very important. Which is why I took care
not to break any code or client using the old interface on pages that contain
wikitext (i.e. the standard/legacy case). The current interface (both, the web
API as well as methods in MediaWiki core) will function exactly as before for
all pages that contain wikitext.

For pages not containing wikitext, such code can not readily function. There are
two options here (currently controlled by a global setting): pretend the page is
empty (the default) or throw an error (probably better in case of the web API,
but too strict for other uses).

 3. A project like wikidata - in its infancy should make every effort to be 
 backwards compatible, It would be far wiser to be place wikidata into a page 
 with wiki source using an custom xml/ tag or even cdata/ xhtml tag.

I strongly disagree with that, it introduces more problems than it solves; Denny
and I decided against this option specifically in the light of the experience he
collected with embedding structured data in wikitext in Semantic MediaWiki and
Shortipedia.

But again: that's a different discussion, please post your concerns to 
wikidata-l.

Regards,
Daniel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Cutting MediaWiki loose from wikitext

2012-03-27 Thread Daniel Kinzler
On 27.03.2012 09:47, Alex Brollo wrote:
 I can't understand details of this talk, but if you like take a look to the
 raw code of any ns0 page into it.wikisource and consider that area dati
 is removed from wikitext as soon as an user opens the page in edit mode,
 and re-builded as the user saves it; or take a look here:
 http://it.wikisource.org/wiki/MediaWiki:Variabili.js where date used into
 automation/help of edit are collected as js objects.

Yes. Basically, the ContentHandler proposal would introduce native support for
this kind of thing into MediaWiki, instead of implementing it as a hack with
JavaScript. Wouldn't it be nice to get input forms for this data, or have nice
diffs of the structure, or good search results for data records?... Not to
mention the ability to actually query for individual data fields :)

-- daniel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Committing followups: please no --amend

2012-03-27 Thread Trevor Parscal
Good advice here, but I would just say we should mention that git --amend
is still recommended if you committed something and then realized there was
a mistake.

   - Using it to fix a typo or minor error in a commit = awesome.
   - Using it to pile up tons of changes across tons of files = not awesome.

The former makes review EASIER, the latter makes review HARDER.

- Trevor

On Mon, Mar 26, 2012 at 11:24 PM, Amir E. Aharoni 
amir.ahar...@mail.huji.ac.il wrote:

 2012/3/27 Tim Starling tstarl...@wikimedia.org:
  For commits with lots of files, Gerrit's diff interface is too broken
  to be useful. It does not provide a compact overview of the change
  which is essential for effective review.
 
  Luckily, there are alternatives, specifically local git clients and
  gitweb. However, these don't work when git's change model is broken by
  the use of git commit --amend.
 
  For commits with a small number of files, such changes are reviewable
  by the use of the patch history table in the diff views. But when
  there are a large number of files, it becomes difficult to find the
  files which have changed, and if there are a lot of changed files, to
  produce a compact combined diff.
 
  So if there are no objections, I'm going to change [[Git/Workflow]] to
  restrict the recommended applications of git commit --amend, and to
  recommend plain git commit as an alternative. A plain commit seems
  to work just fine. It gives you a separate commit to analyse with
  Gerrit, gitweb and client-side tools, and it provides a link to the
  original change in the dependencies section of the change page.

 It sounds similar to what i said in the thread consecutive commits in
 Gerrit‏, so i probably support it, but i don't completely understand
 how will it work with the `git review' command, which doesn't like
 multiple commits. If the documentation will explain how to use `git
 review' with follow up commits, it will be fine.

 --
 Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
 http://aharoni.wordpress.com
 ‪“We're living in pieces,
 I want to live in peace.” – T. Moore‬

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] I don't want a full git clone, I just want an rsync(1) style diet git clone

2012-03-27 Thread K. Peachey
https://www.mediawiki.org/wiki/Download_from_Git#Latest_development_version_of_MediaWiki
should behave the same as updating from SVN in the past (as to my
understanding).

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] I don't want a full git clone, I just want an rsync(1) style diet git clone

2012-03-27 Thread Daniel Friesen

On Mon, 26 Mar 2012 23:56:58 -0700, jida...@jidanni.org wrote:


Gentlemen, I am very thankful for
http://www.mediawiki.org/wiki/Download_from_Git
however it assumes one wants to 'get involved' with MediaWiki, whereas
all I want to do is 'rsync snapshots' of the same files 'that the
tarballs contain'. I.e., I just want to oh, every few days update
(overwrite in place) my running copy of MediaWiki. No need for the
history etc. that a 'clone' pulls in, or further 'pull's refresh.
(The disk size I use should not grow a single bit between updates unless
you upstream added some lines to a file.)

So it would be nice if someone mentioned the commands necessary so I can
add them to that article. Let's pretend we are starting fresh (and not
from SVN like in
http://lists.wikimedia.org/pipermail/wikitech-l/2012-February/058190.html  
.)


 99Mgit.git  # A --bare clone
167Mgit-clean# A simple clone of master
215Mgit-fetch# A clone with a full fetch of the history of all  
branches

 68Mgit-linked   # A clone linked to git-clean
128Mgit-shallow  # A --depth=5 clone
147Msvn  # A SVN checkout

Your concerns about space aren't really warranted. For something that  
keeps the entire source history it is very efficient at storing that  
history.

A basic checkout (git-clean) is only 20MB more than a svn checkout.

To top it off git supports bare repos (repos with only history and no  
working copy) and linking a repo to another.
ie: If you `git clone /abs/path/to/other/git/repo newrepo` git will use  
some hard links and references to the other repo instead of physically  
copying the data.
By the way 99M (git.git) + 68M (git-linked) = 167M (git-clean), a --bare  
repo plus a linked repo is the same size as a normal clone.


And it gets even better if you have multiple wikis. If you have a setup  
like this:

web/core.git
web/somewiki.com/
web/anotherwiki.com/
Where core.git is a --bare repo and the other two are clones pointing to  
core.git, you will actually be SAVING space with git.

In git that's 99M + 68M * 2 = 235M
In svn that's 147M * 2 = 294M
;) Just two wikis and git is already taking less space than svn.
And it goes in git's favor the more wikis you have:
1 = (git: 167M, svn: 147M; diff: -20M | 113.605442177%)
2 = (git: 235M, svn: 294M; diff: 59M | 79.9319727891%)
5 = (git: 439M, svn: 735M; diff: 296M | 59.7278911565%)
10 = (git: 779M, svn: 1470M; diff: 691M | 52.9931972789%)
50 = (git: 3499M, svn: 7350M; diff: 3851M | 47.6054421769%)
100 = (git: 6899M, svn: 14700M; diff: 7801M | 46.9319727891%)
500 = (git: 34099M, svn: 73500M; diff: 39401M | 46.3931972789%)

--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Committing followups: please no --amend

2012-03-27 Thread Roan Kattouw
On Mon, Mar 26, 2012 at 9:50 PM, Tim Starling tstarl...@wikimedia.org wrote:
 For commits with lots of files, Gerrit's diff interface is too broken
 to be useful. It does not provide a compact overview of the change
 which is essential for effective review.

 Luckily, there are alternatives, specifically local git clients and
 gitweb. However, these don't work when git's change model is broken by
 the use of git commit --amend.

They do; it just wasn't obvious to you how to do it, but that doesn't
mean it can't be done.

$ git fetch https://gerrit.wikimedia.org/r/p/analytics/udp-filters
refs/changes/22/3222/3  git branch foo FETCH_HEAD
$ git fetch https://gerrit.wikimedia.org/r/p/analytics/udp-filters
refs/changes/22/3222/4  git branch bar FETCH_HEAD
$ git diff foo..bar

The two 'git fetch' commands (or at least the part before the ) can
be taken from the change page in Gerrit.

 For commits with a small number of files, such changes are reviewable
 by the use of the patch history table in the diff views. But when
 there are a large number of files, it becomes difficult to find the
 files which have changed, and if there are a lot of changed files, to
 produce a compact combined diff.

 So if there are no objections, I'm going to change [[Git/Workflow]] to
 restrict the recommended applications of git commit --amend, and to
 recommend plain git commit as an alternative. A plain commit seems
 to work just fine. It gives you a separate commit to analyse with
 Gerrit, gitweb and client-side tools, and it provides a link to the
 original change in the dependencies section of the change page.

I have mixed feelings towards this. One time at the office, over
lunch, I was bitching about how Gerrit used amends instead of real
branches, and after discussing this for 15 minutes we felt like we'd
just reverse-engineered the Gerrit developers' decision process
(RobLa: I think we just figured out why the Gerrit people did it this
way.)

There are several advantages to using Gerrit the way it's meant to be
used (with amends rather than followup commits):
* Review comments of one logical change are centralized, rather than
being scattered across multiple changes (this is what I said I'd do
differently if I was writing a code review system now)
* Conflicts must be resolved by rebasing the conflicted topic branch
onto the HEAD of master, so there aren't any merge commits containing
conflict resolutions unless you deliberately create them (and someone
else approves them). I imagine I'd be quite annoyed/confused if I was
using git bisect to track down a bug and it blamed a complex merge
commit that had conflicts
* Every commit that ends up being merged into master is clean, in
that it's been approved and passes the tests. This is the entire point
of continuous integration / pre-commit review / gated trunk / etc.,
and it's a major advantage because:
** you can rewind master to any point in history and it'll be in a sane state
** you can start a branch (including a deployment branch) off any
point in master and be confident that that's a reasonably sane branch
point
** if you find subtle breakage later, you can use git bisect on master
to find out where it broke, and there will not be any botched
intermediate states confusing bisect (e.g. if there's a commit
somewhere that breaks half the code and you merge it in together with
a followup commit that fixes it, bisect might find that commit and
wrongly blame it for the breakage you're looking for; this probability
increases as merged-in feature branches get longer)

Of course you can't blindly place absolute trust in any random commit
on master, but at least you know that it 1) has been reviewed as a
unit and approved, and once we have better Jenkins integration you'll
know that 2) it passed the lint checks and 3) it passed the tests as
they existed at the time. Approving commits that are broken because
you know they were fixed later in some follow-up commit somewhere
violates this entire model, and it exposes you to the danger of
merging the broken commit but not the follow-up, if the follow-up is
held up for some reason. Fortunately, once we have proper Jenkins
integration, this will not be possible because Jenkins will mark the
broken commit as having failed the tests, and you will not be able to
approve and merge it without manually overriding Jenkins's V:-1
review.

An unrelated disadvantage of stacked changes in Gerrit (i.e. changes
that depend on other unmerged changes) is that if B.1 (change B
patchset 1) depends on A.1, and someone amends A.1 later to produce
A.2, B.1 will still depend on A.1 and will need to be rebased to
depend on A.2 instead. I've done this before with a stack of four
commits (amended B and had to rebase C and D), and I can tell you it's
not fun. I think I've figured out a trick for it now (use an
interactive rebase from the top of the stack), but it's not intuitive
and I've never tried it.

That said, if you have multiple commits that are 

Re: [Wikitech-l] Committing followups: please no --amend

2012-03-27 Thread Tim Starling
On 27/03/12 17:24, Amir E. Aharoni wrote:
 It sounds similar to what i said in the thread consecutive commits in
 Gerrit‏, so i probably support it, but i don't completely understand
 how will it work with the `git review' command, which doesn't like
 multiple commits. If the documentation will explain how to use `git
 review' with follow up commits, it will be fine.

I've done a few test commits, it will work. The procedure is to copy
the download command out of the Gerrit change page and paste it into a
shell, same as amending. Git gives you some output which includes:

: If you want to create a new branch to retain commits you
: create, you may do so (now or later) by using -b with the
: checkout command again. Example:
:   git checkout -b new_branch_name

You follow its instructions, creating a branch:

:  git checkout -b my_test_commit

Then edit the files, then instead of amend, just a regular

:  git commit -a
:  git review

It complains that you're committing two things:

: You have more than one commit that you are about to submit.
: The outstanding commits are:
:
: 6e4f490 (HEAD, test2) test 2
: 634b5d7 (junk-2) Test commit 1
:
: Is this really what you meant to do?
: Type 'yes' to confirm:

You answer yes, then it's done. Here's what the result looks like:

https://gerrit.wikimedia.org/r/#change,3794

git review --no-rebase may be necessary to reduce noise on the parent
change pages, I haven't tested that yet.

It can be done with slightly fewer steps if you make the steps more
complicated.

-- Tim Starling


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Cutting MediaWiki loose from wikitext

2012-03-27 Thread Antoine Musso
Daniel Kinzler wrote:
 A very rough prototype is in a dev branch here:
 
   http://svn.wikimedia.org/svnroot/mediawiki/branches/Wikidata/phase3/

I guess we could have that migrated to Gerrit and review the project there.

-- 
Antoine hashar Musso


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Cutting MediaWiki loose from wikitext

2012-03-27 Thread Daniel Kinzler
On 27.03.2012 11:26, Antoine Musso wrote:
 Daniel Kinzler wrote:
 A very rough prototype is in a dev branch here:

   http://svn.wikimedia.org/svnroot/mediawiki/branches/Wikidata/phase3/
 
 I guess we could have that migrated to Gerrit and review the project there.

Sure, fine with me :) Though I will likely make a new branch and merge my
changes again more cleanly. What's there now is really a proof of concept. But
sure, have a look!

-- daniel


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Committing followups: please no --amend

2012-03-27 Thread Antoine Musso
Roan Kattouw wrote:
 * Every commit that ends up being merged into master is clean, in
 that it's been approved and passes the tests. This is the entire point
 of continuous integration / pre-commit review / gated trunk / etc.,
 and it's a major advantage snip

That is exactly why I think, one day, we will be able to get ride of the
wmf branch and deploy several time per day straight from master.

-- 
Antoine hashar Musso


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Open git issues

2012-03-27 Thread Niklas Laxström
I have been collecting list of issues at
http://www.mediawiki.org/wiki/Git/Conversion#Open_issues_after_migration

I'd like start discussion how to solve these issues and in what time frame.

== Code review process ==
I would very much like to have the full patch diffs back in emails so
that I can quickly scan for any i18n issues. Also Gerrit is just slow
enough that I must either waste time waiting for it to load the
interface, or do a context switch by reading next commit. Not to
mention the need to open the diffs for each file in tab.

I can somewhat work with this right now by skipping commits unlikely
to have anything relevant (though you never know, as not even the file
names are mentioned), but when the number of commits pick up the speed
it's not going to work.

== Emails ==
Related to above, I want to scan all (submitted?) changes for the
issues. Currently there is no easy way to subscribe to changes of all
repositories.

In theory I could do the same inside Gerrit, would it provide a easily
navigatable list which records what I have looked at.

== Unicode issue in Gerrit ==
This must be fixed (dataloss). See
https://gerrit.wikimedia.org/r/#change,3505 for example

== Local changes ==
How to handle permanent local changes? There have already been suggestions:
* use git stash (not fun to do for every push)
* use git review --no-rebase (no idea if this is a good idea)
* commit them to local development branch (but then how to rebase
changes-to-commit to master before pushing?)

== How to FIXME ==
We need a way to identify *merged* commits that need fixing. Those
commits should be our collective burden to fix. It must not rely on
the reporter of the issue fixing the issue him/herself or being
persistent enough to get someone to fix it.

I was suggested to use bugzilla, but it's a bit tedious to use and
doesn't as-is have the high visibility like FIXMES used to have.

  -Niklas


-- 
Niklas Laxström

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] rebasing several changes in Gerrit (was: Committing followups: please no --amend)

2012-03-27 Thread Antoine Musso
Roan Kattouw wrote:
 An unrelated disadvantage of stacked changes in Gerrit (i.e. changes
 that depend on other unmerged changes) is that if B.1 (change B
 patchset 1) depends on A.1, and someone amends A.1 later to produce
 A.2, B.1 will still depend on A.1 and will need to be rebased to
 depend on A.2 instead. I've done this before with a stack of four
 commits (amended B and had to rebase C and D), and I can tell you it's
 not fun. I think I've figured out a trick for it now (use an
 interactive rebase from the top of the stack), but it's not intuitive
 and I've never tried it.

We definitely need a documentation about rebasing a serie of dependant
changes, I had to do that this morning.

Given changes:

A - B - C - D

When A is changed, you should do something like:

Get the tip of the commit chain:
  git-review -d D

Then rebase either:

 1) on top of the original branch point:
git rebase A^

 2) on whatever HEAD is rendered to:
git rebase gerrit/master

Then submit your rebased commit chain:

  git-review -f

git-review will warn you about attempting to submit several patchsets at
the same time. It will list the new sha1 for each commit.

Confirm and you get four new patchsets submitted and dependencies updated.


Would need someone to cross check that and then we can amend our Gerrit
documentation.

-- 
Antoine hashar Musso


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Committing followups: please no --amend

2012-03-27 Thread Tim Starling
On 27/03/12 19:49, Roan Kattouw wrote:
 On Mon, Mar 26, 2012 at 9:50 PM, Tim Starling tstarl...@wikimedia.org wrote:
 For commits with lots of files, Gerrit's diff interface is too broken
 to be useful. It does not provide a compact overview of the change
 which is essential for effective review.

 Luckily, there are alternatives, specifically local git clients and
 gitweb. However, these don't work when git's change model is broken by
 the use of git commit --amend.

 They do; it just wasn't obvious to you how to do it, but that doesn't
 mean it can't be done.
 
 $ git fetch https://gerrit.wikimedia.org/r/p/analytics/udp-filters
 refs/changes/22/3222/3  git branch foo FETCH_HEAD
 $ git fetch https://gerrit.wikimedia.org/r/p/analytics/udp-filters
 refs/changes/22/3222/4  git branch bar FETCH_HEAD
 $ git diff foo..bar
 
 The two 'git fetch' commands (or at least the part before the ) can
 be taken from the change page in Gerrit.

It doesn't work, I'm afraid. Because of the implicit rebase on push,
usually subsequent changesets have a different parent. So when you
diff between the two branches, you get all of the intervening commits
which were merged to the master.

Examples from today:

https://gerrit.wikimedia.org/r/#change,3367
Patchsets 1 and 2 have different parents.

https://gerrit.wikimedia.org/r/#change,3363
Patchsets 1, 2 and 3 have different parents.

It's possible to get a diff between them, and I did, but it's tedious.
I figure we should pick a workflow that doesn't waste the reviewer's
time quite so much.

 I have mixed feelings towards this. One time at the office, over
 lunch, I was bitching about how Gerrit used amends instead of real
 branches, and after discussing this for 15 minutes we felt like we'd
 just reverse-engineered the Gerrit developers' decision process
 (RobLa: I think we just figured out why the Gerrit people did it this
 way.)

I could not find a recommendation to use --amend in the Gerrit manual.
The manual says that it is possible to submit subsequent commits to
the same change by just adding a Change-Id footer to the commit
message. That seems to be the reason for the copy button next to the
Change-Id on the change page.

When I tried to do a test push containing several commits which
depended on each other, but with the same Change-Id, Gerrit rejected
it. But I'm not sure if that's a configuration issue or if it just
expected them to be done in separate pushes.

If it was possible to follow the Gerrit manual in this way, and submit
non-amending followup commits with the same Change-Id, then we'd have
the comment grouping advantages without the code review disadvantages.

 * Every commit that ends up being merged into master is clean, in
 that it's been approved and passes the tests. This is the entire point
 of continuous integration / pre-commit review / gated trunk / etc.,
 and it's a major advantage because:
 ** you can rewind master to any point in history and it'll be in a sane state
 ** you can start a branch (including a deployment branch) off any
 point in master and be confident that that's a reasonably sane branch
 point
 ** if you find subtle breakage later, you can use git bisect on master
 to find out where it broke, and there will not be any botched
 intermediate states confusing bisect (e.g. if there's a commit
 somewhere that breaks half the code and you merge it in together with
 a followup commit that fixes it, bisect might find that commit and
 wrongly blame it for the breakage you're looking for; this probability
 increases as merged-in feature branches get longer)

I think significantly increasing review time and breaking authorship
and history information would be a high price to pay for these advantages.

Sure, you can bisect, but when you find the commit and discover that
there were 5 people amending it, how do you work out who was responsible?

I don't think bisect is such a useful feature that it's worth throwing
away every other feature in git just to get it.

 you will not be able to
 approve and merge it without manually overriding Jenkins's V:-1
 review.

It seems like that will be a lot easier and a lot more rare than
finding a diff between amended commits with different parents.

 An unrelated disadvantage of stacked changes in Gerrit (i.e. changes
 that depend on other unmerged changes) is that if B.1 (change B
 patchset 1) depends on A.1, and someone amends A.1 later to produce
 A.2, B.1 will still depend on A.1 and will need to be rebased to
 depend on A.2 instead. I've done this before with a stack of four
 commits (amended B and had to rebase C and D), and I can tell you it's
 not fun. I think I've figured out a trick for it now (use an
 interactive rebase from the top of the stack), but it's not intuitive
 and I've never tried it.

Tell people to not amend their commits then.

-- Tim Starling



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org

Re: [Wikitech-l] Committing followups: please no --amend

2012-03-27 Thread Chad
On Tue, Mar 27, 2012 at 6:06 AM, Tim Starling tstarl...@wikimedia.org wrote:
 I could not find a recommendation to use --amend in the Gerrit manual.
 The manual says that it is possible to submit subsequent commits to
 the same change by just adding a Change-Id footer to the commit
 message. That seems to be the reason for the copy button next to the
 Change-Id on the change page.

 When I tried to do a test push containing several commits which
 depended on each other, but with the same Change-Id, Gerrit rejected
 it. But I'm not sure if that's a configuration issue or if it just
 expected them to be done in separate pushes.

 If it was possible to follow the Gerrit manual in this way, and submit
 non-amending followup commits with the same Change-Id, then we'd have
 the comment grouping advantages without the code review disadvantages.


Yes, that works. And actually, I think it's a little bit easier. It's a
little more manual that trying git one-liners, but it seems to at
least *always* work (and I've used it a couple of times to fix
changes with totally botched history).

 An unrelated disadvantage of stacked changes in Gerrit (i.e. changes
 that depend on other unmerged changes) is that if B.1 (change B
 patchset 1) depends on A.1, and someone amends A.1 later to produce
 A.2, B.1 will still depend on A.1 and will need to be rebased to
 depend on A.2 instead. I've done this before with a stack of four
 commits (amended B and had to rebase C and D), and I can tell you it's
 not fun. I think I've figured out a trick for it now (use an
 interactive rebase from the top of the stack), but it's not intuitive
 and I've never tried it.

 Tell people to not amend their commits then.


Also making sure people don't have unrelated commits showing up as
dependencies makes the process much easier. If two things aren't
related--don't relate them!

-Chad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Wikitext-l] Fwd: Cutting MediaWiki loose from wikitext

2012-03-27 Thread Daniel Kinzler
On 27.03.2012 13:07, vita...@yourcmc.ru wrote:
 JSON is the internal serialization format.
 
 You're suggesting to use MediaWiki as a model :)
 What's stopping you from implementing it as a _file_ handler, not _article_
 handler?

Because of the actions I want to be able to perform on them, most importantly
editing, but also having diff views for the history, automatic merge to avoid
edit conflicts, etc.

These types of interaction is supported by mediawiki for articles, but not for
files.

In constrast, files are rendered/thumbnailed (we don't need that), get included
in articles with a box and caption (we don't want that), and can be
accessed/downloaded directly as a file via http (we definitely don't want that).

So, what we want to do with the structured data fits much better with
MediaWiki's concept of a page than with the concept of a file.

 I mean, _articles_ contain text (now wikitext).
 All non-human readable/editable/diffable data is stored as files.

But that data WILL be readable/editable/diffable! That's the point! Just not as
text, but as something else, using special viewers, editors, and differs. That's
precisely the idea of the ContentHandler.

 Now they all are in File namespace, but maybe it's much more simpler to allow
 storing them in other namespaces and write file handlers for 
 displaying/editing
 them than to break the idea of article?

How does what I propose break the idea of an article? It just means that
articles do not *necessarily* contain text. And it makes sure that whatever it
is that is contained in the article can still be viewed, edited, and compared in
a meaningful way.

-- daniel


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Localisation commits status update

2012-03-27 Thread Raimond Spekking

Am 26.03.2012 11:25, schrieb Niklas Laxström:

Good $localtime from translatewiki.net. I have been working with Chad
and Antoine to resume the daily translation updates from
translatewiki.net after the partial git migration of MediaWiki.


Thanks for all of your work :-)



Now the last issue is something I want your opinion on:
Is it an issue if we drop pretty formatting of non-English message files?
If it is an issue, I request your help to make the
maintenance/language/rebuildLanguage.php script to support arbitrary
location instead of just the current wiki installation. If it is not
an issue, the first diffs will be big because of the change, but
remaining ones will be small as usual.


I would suggest to get rid of the pretty formatting. Besides core 
developers no one has to edit these files.


Furthermore I would be glad if we could remove the dependency from the 
helper files messages.inc and messageTypes.inc


MessagesEn.php (and not messages.inc) should be the reference for all 
other MessagesXx.php like we do it with extensions.


Optional to translate messages and messages to ignore for translations 
could either stay in messageTypes.php or go into Translate specific 
config file mediawiki-defines.txt like already done for extensions.


Raimond.


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Community managed production environment

2012-03-27 Thread Ryan Lane
tools-dev.wmflabs.org and tools.wmflabs.org? I'd prefer to not have to
register and maintain a ton of top levels for this. It makes
management of things way harder in a bunch of ways (cost, maintenance,
legal, documentation, etc).

Labs is meant for semi-production things. So, tools, bots, etc. We'll
never provide a production-level of support for it, though.

On Tue, Mar 27, 2012 at 4:47 PM, Petr Bena benap...@gmail.com wrote:
 I thought that labs are not going to be used to run some final
 products, rather just to develop them? Also is it secure to use same
 domain for testing and stable version? The stable version would be
 better if it was more restricted.

 On Tue, Mar 27, 2012 at 5:36 AM, Ryan Lane rlan...@gmail.com wrote:
 We already started working on a new virtual cluster known as labs
 (wmflabs.org) which purpose is to allow people develop stuff and later
 move it to some production, some time ago. I believe it would be nice
 to have exactly same environment (probably we could just extend
 wmflabs for that) running probably on same platform (virtual cluster
 managed through some site, using nova extension) which would have
 exactly same possibilities but it would be supposed to run final
 products (not a testing environment as labs, but production where
 the stable version would live)

 Why do we need this?

 Wikimedia labs will offer cloned db of production in future which
 would allow it to run community managed tools like
 http://toolserver.org/~quentinv57/tools/sulinfo.php and similar. I
 think it would be best if such tools were developed using labs as a
 testing platform and stable version pushed to this production which
 should only run the stable code. In fact it doesn't even need to be
 physically another cluster, just another set of virtual instances
 isolated from testing environment on labs. The environment would have
 restrictions which we don't have on labs. People would need to use
 puppet and gerrit mostly for everything, and root would not be given
 to everyone in this environment (some projects might be restricted to
 wmf ops only), so that we could even move all stable bots, we
 currently host on wmflabs there, without being afraid of leaking the
 bot credentials and such (that's a reason why bots project is
 restricted atm). Also the applications which ask for wikimedia
 credentials could be allowed there, since the code living on this
 production would be subject of review, and such projects which could
 mean security risk could be managed by wmf ops only (the changes could
 be done by volunteers but would need to be submitted to gerrit).

 We could also move some parts of current production to this community
 managed environment. I talked to Roan Kattouw in past regarding
 moving the configuration of wikimedia sites to some git repository so
 that volunteers could submit some patches to gerrit or handle bugzilla
 reports without needing shell access. Changes to production config
 would be merged by operation enginners, so that it would be completely
 secure.

 In a nutshell:

 This environment could be set up on same platform as wmf labs (no
 extra costs, just hard work :)), stable products (bots, user scripts)
 would be living there, while labs would serve only for development and
 nothing else.

 The production version would live on another domain, like
 wikimedia-tools.org or wmtools.org

 Thanks for your comments and responses


 I don't see the need for a different domain name. tools.wmflabs.org
 should suffice.

 Also, this can be accomplished by having multiple projects.

 - Ryan

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] I don't want a full git clone, I just want an rsync(1) style diet git clone

2012-03-27 Thread jidanni
OK, so you don't offer rsync. And the closest I can get to
$ du -sh med*
17M mediawiki-1.18.2.tar.gz
72M mediawiki-1.18.2
is your 99M product, which I hope you could document on
http://www.mediawiki.org/wiki/Download_from_Git
Apparently it involves a --bare option.

And how might one update it occasionally without accruing things I will
not need. I.e., I just want to do an rsync.

Let's assume
72M mediawiki-1.23
72M mediawiki-1.24
72M mediawiki-1.25

I guess I will just have to accept some bloat with git over this.
I guess there is no way to just get the same content without using git
in some way that it is not intended.

P.S., I use
http://www.mediawiki.org/wiki/Manual:Wiki_family#Ultimate_minimalist_solution
so each additional wiki is just a symlink.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] I don't want a full git clone, I just want an rsync(1) style diet git clone

2012-03-27 Thread Chad
On Tue, Mar 27, 2012 at 10:05 AM,  jida...@jidanni.org wrote:
 OK, so you don't offer rsync. And the closest I can get to
 $ du -sh med*
 17M     mediawiki-1.18.2.tar.gz
 72M     mediawiki-1.18.2

Well if you want the vanilla install without the ability to update via
version control, of course that's going to be smaller. That's no
different now from how it was under SVN.

 is your 99M product, which I hope you could document on
 http://www.mediawiki.org/wiki/Download_from_Git
 Apparently it involves a --bare option.


No. Most people don't want --bare anyway...they're likely
to get confused.

 And how might one update it occasionally without accruing things I will
 not need. I.e., I just want to do an rsync.


`git pull`

 I guess I will just have to accept some bloat with git over this.
 I guess there is no way to just get the same content without using git
 in some way that it is not intended.


The amount of bloat we're talking about here is between
20 and 30M, at most. I see zero problem here.

Would you rather have the initial conversions I did, that are
around 6.5G of bloat? ;-)

-Chad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[MediaWiki-CodeReview] [MediaWiki r113190]: New comment added

2012-03-27 Thread MediaWiki Mail
Krinkle posted a comment on MediaWiki.r113190.
URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/113190#c32534

Commit summary for MediaWiki.r113190:

[SyntaxHighlight GeSHi] Use code and pre instead of span and div to 
allow skins to universally style these without requiring local wikis to 
manually maintain a copy of each skins's pre style and apply it to .mw-geshi

* Fixes
-- (bug 19416) GeSHi-generated blocks should use pre style for the container

Krinkle's comment:

I can't tell for sure, but I believe that may be due to a local CSS rule on the 
wiki you took the snapshot on.

This code isn't live on WMF afaik, but some of the popular WMF wikis have local 
CSS rules to overwrite Geshi's internal stylesheet for it's HTML input (namely 
the  code source-javascript/code class in this case, which has a strong 
reset on it by Geshi itself).

Since Geshi doesn't want those to be overwritten (it has a strong reset on it), 
and because we want core styles for PRE to apply. Therefor Geshi's output 
(wether it are {{tag|p|o}} with {{tag|span|o}} and css for pre-style or one ore 
more {{tag|pre|o}} tags doesn't matter).. that output is wrapped in a 
container. So far this container has been like codenowikidiv dir=ltr 
class=mw-geshi mw-content-ltr/nowiki/code, this revision changes that to 
codenowikipre dir=ltr class=mw-geshi mw-content-ltr/nowiki/code.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


Re: [Wikitech-l] I don't want a full git clone, I just want an rsync(1) style diet git clone

2012-03-27 Thread Antoine Musso
Le 27/03/12 16:13, Chad a écrit :
 Would you rather have the initial conversions I did, that are
 around 6.5G of bloat? ;-)

The ones with the nice theora video of some fishes in an aquarium?

;-D

-- 
Antoine hashar Musso


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[MediaWiki-CodeReview] [MediaWiki r114517]: New comment added

2012-03-27 Thread MediaWiki Mail
Nikerabbit posted a comment on MediaWiki.r114517.
URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/114517#c32535

Commit summary for MediaWiki.r114517:

assert correct content model and format

Nikerabbit's comment:

but got found

 +throw new MWException( No handler for model $modelName registered in 
\$wgContentHandlers );
Or with hooks??

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r114520]: New comment added, and revision status changed

2012-03-27 Thread MediaWiki Mail
Nikerabbit changed the status of MediaWiki.r114520 to fixme and commented 
it.
URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/114520#c32536

Old Status: new
New Status: fixme

Commit summary for MediaWiki.r114520:

use factory method to get difference engine everywhere

Nikerabbit's comment:

Please use tabs for indentation.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1504]: New comment added, and revision status changed

2012-03-27 Thread MediaWiki Mail
Jpostlethwaite changed the status of Wikimedia.r1504 to new and commented 
it.
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1504#c32537

Old Status: fixme
New Status: new

Commit summary for Wikimedia.r1504:

Added validation for the process flag. See r1501.

Jpostlethwaite's comment:

It is not broken.

You do not need to specify process.

It defaults to:

next_sched_contribution

in the module.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1504]: New comment added

2012-03-27 Thread MediaWiki Mail
Khorn (WMF) posted a comment on Wikimedia.r1504.
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1504#c32538

Commit summary for Wikimedia.r1504:

Added validation for the process flag. See r1501.

Khorn (WMF)'s comment:

I wasn't saying it was broken. I was saying it should probably be improved for 
clarity. 

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1504]: New comment added

2012-03-27 Thread MediaWiki Mail
Jpostlethwaite posted a comment on Wikimedia.r1504.
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1504#c32539

Commit summary for Wikimedia.r1504:

Added validation for the process flag. See r1501.

Jpostlethwaite's comment:

You said it would error out.

In the help it says:

 drush rg  --process=next_sched_contribution   # By default 
next_sched_contribution will be processed.

   
It would be better if I put it in the option part of help. I will do that.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1504]: New comment added

2012-03-27 Thread MediaWiki Mail
Jpostlethwaite posted a comment on Wikimedia.r1504.
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1504#c32540

Commit summary for Wikimedia.r1504:

Added validation for the process flag. See r1501.

Jpostlethwaite's comment:

Added more documentation for help in r1544.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r114501]: New comment added

2012-03-27 Thread MediaWiki Mail
Raindrift posted a comment on MediaWiki.r114501.
URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/114501#c32541

Commit summary for MediaWiki.r114501:

added actual metadata to the template, proper i18n strings

Raindrift's comment:

Good point.  Fixed in r114533

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r114501]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Raindrift changed the status of MediaWiki.r114501 to new
URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/114501

Old status:  fixme
New status: new

Commit summary for MediaWiki.r114501:

added actual metadata to the template, proper i18n strings

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1504]: New comment added

2012-03-27 Thread MediaWiki Mail
Khorn (WMF) posted a comment on Wikimedia.r1504.
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1504#c32542

Commit summary for Wikimedia.r1504:

Added validation for the process flag. See r1501.

Khorn (WMF)'s comment:

Yeah: I see in r1382, process is assigned a default way down in what is now 
recurring_globalcollect_process_get_query_options. I'm kind of wondering why 
it's all the way down there, instead of out in the drush file, as nothing else 
can run this code and we have no plans for anything to be able to do so in the 
future. 
brBut, no: As of right now, this is not a required value and will default to 
next_sched_contribution. 

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1504]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Khorn (WMF) changed the status of Wikimedia.r1504 to ok
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1504

Old status:  new
New status: ok

Commit summary for Wikimedia.r1504:

Added validation for the process flag. See r1501.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1544]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Khorn (WMF) changed the status of Wikimedia.r1544 to ok
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1544

Old status:  new
New status: ok

Commit summary for Wikimedia.r1544:

Added more documentation for drush help on rg. See r1504.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1521]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Khorn (WMF) changed the status of Wikimedia.r1521 to ok
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1521

Old status:  new
New status: ok

Commit summary for Wikimedia.r1521:

Moved civicrm_recurring_globalcollect to recurring_globalcollect due to a drush 
bug.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1503]: New comment added, and revision status changed

2012-03-27 Thread MediaWiki Mail
Jpostlethwaite changed the status of Wikimedia.r1503 to new and commented 
it.
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1503#c32543

Old Status: fixme
New Status: new

Commit summary for Wikimedia.r1503:

Added the ability to specify the number of contributions to batch for crg. See 
r1501.

Jpostlethwaite's comment:

Fixed in r1546.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1529]: New comment added

2012-03-27 Thread MediaWiki Mail
Jpostlethwaite posted a comment on Wikimedia.r1529.
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1529#c32544

Commit summary for Wikimedia.r1529:

Made batch_max a setting. See r1503.

Jpostlethwaite's comment:

I originally had that.

The minor problem is that you get dual error messages:

pre
WD crg: You are attempting to batch 100 payments, which is more than the 
maximum allowed: 10. Either batch less payments or increase the maximum.
[error]
You are attempting to batch 100 payments, which is more than the maximum 
allowed: 10. Either batch less payments or increase the maximum.
[error]
/pre

This made me find a bug :) Will commit it shortly.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[Wikitech-l] Gerrit bots notification

2012-03-27 Thread Antoine Musso
Hello,

Whenever an action is made in Gerrit, a notification is sent to some IRC
channels. Previously we had to map each project to an IRC channel fall
backing to #mediawiki.

I have enhanced the python script to support wildcard when filtering on
Gerrit project name. Leslie Carr has deployed the changes some minutes
ago, and now:

projects from:
 - analytics and integrations are sent to #wikimedia-dev
 - operations are sent to #wikimedia-operations
 - labs to #wikimedia-labs
 - mediawiki and mediawiki extensions to #mediawiki

Default is still #mediawiki


If someone need more specific rules, you should tweak the filenames
dictionary in templates/gerrit/hookconfig.py.erb .

As an example, we might want to send MobileFrontend notifications to
#wikimedia-mobile .


-- 
Antoine hashar Musso


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[MediaWiki-CodeReview] [Wikimedia r1529]: New comment added

2012-03-27 Thread MediaWiki Mail
Jpostlethwaite posted a comment on Wikimedia.r1529.
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1529#c32545

Commit summary for Wikimedia.r1529:

Made batch_max a setting. See r1503.

Jpostlethwaite's comment:

Bug fixed in r1547.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1529]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Jpostlethwaite changed the status of Wikimedia.r1529 to new
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1529

Old status:  fixme
New status: new

Commit summary for Wikimedia.r1529:

Made batch_max a setting. See r1503.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1546]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Khorn (WMF) changed the status of Wikimedia.r1546 to ok
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1546

Old status:  new
New status: ok

Commit summary for Wikimedia.r1546:

Added return false to validation. See r1503.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1503]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Khorn (WMF) changed the status of Wikimedia.r1503 to ok
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1503

Old status:  new
New status: ok

Commit summary for Wikimedia.r1503:

Added the ability to specify the number of contributions to batch for crg. See 
r1501.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


Re: [Wikitech-l] Committing followups: please no --amend

2012-03-27 Thread Marcin Cieslak
 Tim Starling tstarl...@wikimedia.org wrote:

 It doesn't work, I'm afraid. Because of the implicit rebase on push,
 usually subsequent changesets have a different parent. 

How does the implicit rebase on push work? Do you mean git-review?

I don't know whether git push remote HEAD:for/master/topic
rebases anything. I still prefer plain git push (where I can also
ay HEAD:refs/changes/XXX) to git-review magic.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[MediaWiki-CodeReview] [Wikimedia r1547]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Khorn (WMF) changed the status of Wikimedia.r1547 to ok
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1547

Old status:  new
New status: ok

Commit summary for Wikimedia.r1547:

This caused a bug where batch was being set to zero, when it shoudl have been 
null. See r1529.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1529]: New comment added

2012-03-27 Thread MediaWiki Mail
Khorn (WMF) posted a comment on Wikimedia.r1529.
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1529#c32546

Commit summary for Wikimedia.r1529:

Made batch_max a setting. See r1503.

Khorn (WMF)'s comment:

Yeah: Watchdog errors get displayed in realtime, and the drush error that 
killed the process is displayed on process exit. If a process-killing error is 
the only thing that happens, it would look like two in a row from the command 
line. If more happened, it would be totally unclear which watchdog message went 
with the thing that caused execution to unwind and halt itself.
brbrHowever, even in instances where a process-killing error is the only 
thing that happens, both are necessary: watchdog to get the log entry somewhere 
we can see it from drupal, and the drush error so the main controller loop 
knows what do to, and Jenkins knows there was an explosion and can e-mail 
people about it. 
brbrIn other words, don't worry about seeing the error on the command line 
twice. The both serve a purpose, and this behavior has been with us for pretty 
much the whole time. 

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1529]: New comment added, and revision status changed

2012-03-27 Thread MediaWiki Mail
Khorn (WMF) changed the status of Wikimedia.r1529 to fixme and commented it.
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1529#c32547

Old Status: new
New Status: fixme

Commit summary for Wikimedia.r1529:

Made batch_max a setting. See r1503.

Khorn (WMF)'s comment:

Additionally: watchdog('BATCHING', ... ) doesn't mean anything. 

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1531]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Khorn (WMF) changed the status of Wikimedia.r1531 to ok
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1531

Old status:  new
New status: ok

Commit summary for Wikimedia.r1531:

Added variable to specify links for watchdog messages. This populates the 
Operations (`link`) field in the report detail.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1545]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Khorn (WMF) changed the status of Wikimedia.r1545 to ok
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1545

Old status:  new
New status: ok

Commit summary for Wikimedia.r1545:

Added more documentation. See r1544.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


Re: [Wikitech-l] Gerrit bots notification

2012-03-27 Thread Patrick Reilly
Yeah, can we modify it to send MobileFrontend notifications
to #wikimedia-mobile.

— Patrick

On Tue, Mar 27, 2012 at 11:28 AM, Antoine Musso hashar+...@free.fr wrote:

 Hello,

 Whenever an action is made in Gerrit, a notification is sent to some IRC
 channels. Previously we had to map each project to an IRC channel fall
 backing to #mediawiki.

 I have enhanced the python script to support wildcard when filtering on
 Gerrit project name. Leslie Carr has deployed the changes some minutes
 ago, and now:

 projects from:
  - analytics and integrations are sent to #wikimedia-dev
  - operations are sent to #wikimedia-operations
  - labs to #wikimedia-labs
  - mediawiki and mediawiki extensions to #mediawiki

 Default is still #mediawiki


 If someone need more specific rules, you should tweak the filenames
 dictionary in templates/gerrit/hookconfig.py.erb .

 As an example, we might want to send MobileFrontend notifications to
 #wikimedia-mobile .


 --
 Antoine hashar Musso


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] git triage

2012-03-27 Thread Mark A. Hershberger

What: GIT Bug Triage
   Where: #wikimedia-dev
  http://webchat.freenode.net/?channels=#wikimedia-dev
When: Wednesday, March 28, 2012 at 19:00UTC  http://hexm.de/gu
Etherpad: http://etherpad.wikimedia.org/BugTriage-2012-03

Niklas and others have run into problems since the git migration.  After
looking at Chad's schedule (but not yet talking to him directly, so
fingers crossed) I've set up a time for a triage tomorrow to cover these
issues.

Hope to see you there!

Mark.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Gerrit bots notification

2012-03-27 Thread Max Semenik
On 27.03.2012, 22:56 Patrick wrote:

 Yeah, can we modify it to send MobileFrontend notifications
 to #wikimedia-mobile.

As well as other mobile extensions: ZeroRatedMobileAccess and
FeaturedFeeds. I think these notifications should be sent to
#mediawiki too, as this concerns WM/MW in general, too.

-- 
Best regards,
  Max Semenik ([[User:MaxSem]])


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[MediaWiki-CodeReview] [Wikimedia r1529]: New comment added, and revision status changed

2012-03-27 Thread MediaWiki Mail
Jpostlethwaite changed the status of Wikimedia.r1529 to new and commented 
it.
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1529#c32548

Old Status: fixme
New Status: new

Commit summary for Wikimedia.r1529:

Made batch_max a setting. See r1503.

Jpostlethwaite's comment:

All issues are fixed in r1548.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r111723]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Krinkle changed the status of MediaWiki.r111723 to resolved
URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/111723

Old status:  new
New status: resolved

Commit summary for MediaWiki.r111723:

First version of ResourceLoaderLanguageModule.
getModifiedTime method is incomplete.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r114531]: New comment added

2012-03-27 Thread MediaWiki Mail
Reedy posted a comment on MediaWiki.r114531.
URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/114531#c32549

Commit summary for MediaWiki.r114531:

Show the folder in which we're executing each subcommand
if passed the -v parameter.

Reedy's comment:

Yay. I was going to ask for something like this to be added

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [MediaWiki r114531]: New comment added, and revision status changed

2012-03-27 Thread MediaWiki Mail
Reedy changed the status of MediaWiki.r114531 to fixme and commented it.
URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/114531#c32550

Old Status: new
New Status: fixme

Commit summary for MediaWiki.r114531:

Show the folder in which we're executing each subcommand
if passed the -v parameter.

Reedy's comment:

pre
reedy@ubuntu64-web-esxi:~/git/mediawiki$ ./subgits pull
[: 19: pull: unexpected operator
Already up-to-date.
/pre

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1475]: New comment added, and revision status changed

2012-03-27 Thread MediaWiki Mail
Jpostlethwaite changed the status of Wikimedia.r1475 to new and commented 
it.
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1475#c32551

Old Status: fixme
New Status: new

Commit summary for Wikimedia.r1475:

Adding the ability for queue2civicrm to process recurring payments from 
ActiveMQ. Including civicrm_recurring_globalcollect_common.inc for the method 
_civicrm_recurring_globalcollect_get_next_sched_contribution_date_for_month(). 
Added the methods: _queue2civicrm_contribution_recur_insert() and 
_queue2civicrm_update_contribution_for_recurring().

Jpostlethwaite's comment:

Fixes are in r1553 and r1554.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1476]: New comment added, and revision status changed

2012-03-27 Thread MediaWiki Mail
Jpostlethwaite changed the status of Wikimedia.r1476 to new and commented 
it.
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1476#c32552

Old Status: fixme
New Status: new

Commit summary for Wikimedia.r1476:

Enabling _queue2civicrm_contribution_recur_insert(). See r1475.

Jpostlethwaite's comment:

This is fixed in r1552.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1476]: New comment added

2012-03-27 Thread MediaWiki Mail
Jpostlethwaite posted a comment on Wikimedia.r1476.
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1476#c32553

Commit summary for Wikimedia.r1476:

Enabling _queue2civicrm_contribution_recur_insert(). See r1475.

Jpostlethwaite's comment:

This is fixed in r1554.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1551]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Jpostlethwaite changed the status of Wikimedia.r1551 to ok
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1551

Old status:  new
New status: ok

Commit summary for Wikimedia.r1551:

followup r1521
Changes to external modules after the civicrm_recurring_globalcollect - 
recurring_globalcollect module name change.
Also, moved the requirement of a file in queue2civi to a more localized place 
and wrapped it for safety.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1551]: New comment added

2012-03-27 Thread MediaWiki Mail
Jpostlethwaite posted a comment on Wikimedia.r1551.
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1551#c32554

Commit summary for Wikimedia.r1551:

followup r1521
Changes to external modules after the civicrm_recurring_globalcollect - 
recurring_globalcollect module name change.
Also, moved the requirement of a file in queue2civi to a more localized place 
and wrapped it for safety.

Jpostlethwaite's comment:

I added a drush_set_error() where the file check takes place.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


Re: [Wikitech-l] Gerrit bots notification

2012-03-27 Thread Antoine Musso
Le 27/03/12 21:10, Max Semenik a écrit :
 I think these notifications should be sent to #mediawiki too, as this
 concerns WM/MW in general, too.

Unfortunately the python script does not support sending to multiple
channels yet.  Something any pythonist should be able to do :-]

Can you possibly open a bug about the ability to send to multiple
channels?  Since that is a Gerrit hook, it should be in Wikimedia
product under the git/gerrit component.

Thanks!

-- 
Antoine hashar Musso


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[MediaWiki-CodeReview] [MediaWiki r113321]: New comment added

2012-03-27 Thread MediaWiki Mail
Kaldari posted a comment on MediaWiki.r113321.
URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/113321#c32555

Commit summary for MediaWiki.r113321:

using sweet new 1.19 method! Bodacious! Also making sure that section headers 
(aka subjects) are always handled correctly.

Kaldari's comment:

recommitted via git at https://gerrit.wikimedia.org/r/#change,3831

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


Re: [Wikitech-l] GSOC2012 Image_recognition

2012-03-27 Thread Sumana Harihareswara
If anyone can mentor this topic, please do say so by this Friday and add
yourself to
https://www.mediawiki.org/wiki/Summer_of_Code_2012#Mentor_signup .
Otherwise I'm going to take it off the ideas page -- it's just not worth
the false attraction.

Thanks.

-- 
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation


On 03/24/2012 06:29 AM, emijrp wrote:
 Perhaps, does she want to developed it out of GSoC? I think that it is a
 very interesting topic.
 
 If she know a bit of image processing, probably she know more about it that
 most of us. But we can help her with the pywikipedia bot to retrieve
 images/wikipages and submit the categorized image results to Wikimedia
 Commons.
 
 2012/3/23 Sumana Harihareswara suma...@wikimedia.org
 
 On 03/21/2012 04:00 PM, Emanuela Boroș wrote:
 Hello,

 My name is Emanuela Boros and I am a second year Software Engineering
 master's student at the Al. Ioan Cuza University of Iasi, Romania and,
 after I graduate, I plan to follow a phd program.

 My research experience is focused mainly towards computer vision, image
 processing, machine learning, multimodal information retrieval. It also
 involves participating at international evaluation campaigns (ImageClef,
 RobotVision) focusing on content-based image retrieval, image
 classification and topological localization using visual information

 I am interested in this project
 https://www.mediawiki.org/wiki/Summer_of_Code_2012#Image_recognition .
 I've
 noticed that there isn't a mentor mentioned for this project and I would
 like to know who should I contact for further clarifications regarding
 this
 project before I start writing my proposal.

 Thank you and best regards,
 Emanuela Boros

 Emanuela, thank you for your interest in MediaWiki.  Given the lack of
 response to you (and to a few other students who are interested in this
 idea), I'm sorry to say that I do not think there is a mentor available
 for this idea.

 We do have a lot of ideas that mentors are interested in guiding:
 https://www.mediawiki.org/wiki/Summer_of_Code_2012#Project_ideas so
 perhaps you will find one of them interesting.  I'm sorry for the trouble.

 --
 Sumana Harihareswara
 Volunteer Development Coordinator
 Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[MediaWiki-CodeReview] [Wikimedia r1563]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Pgehres (WMF) changed the status of Wikimedia.r1563 to deferred
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1563

Old status:  new
New status: deferred

Commit summary for Wikimedia.r1563:

Updated unit testing configuration file. See r1562.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1562]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Pgehres (WMF) changed the status of Wikimedia.r1562 to deferred
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1562

Old status:  new
New status: deferred

Commit summary for Wikimedia.r1562:

Updated script to execute unit tests.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1561]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Pgehres (WMF) changed the status of Wikimedia.r1561 to deferred
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1561

Old status:  new
New status: deferred

Commit summary for Wikimedia.r1561:

Added directory for log results from unit testing.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1560]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Pgehres (WMF) changed the status of Wikimedia.r1560 to deferred
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1560

Old status:  new
New status: deferred

Commit summary for Wikimedia.r1560:

File cleanup. No code changes.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1559]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Pgehres (WMF) changed the status of Wikimedia.r1559 to deferred
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1559

Old status:  new
New status: deferred

Commit summary for Wikimedia.r1559:

Enabling StandaloneGlobalCollectAdapter_Drush_DrushTestCase. See r1539.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


Re: [Wikitech-l] Wikimedia and IPv6

2012-03-27 Thread Erik Moeller
We should certainly try to be ready. If we're going to need to make
further schema updates for IPv6 anyway, I'd rather we try to fix rc_ip
while we're at it.

Aaron's provided some additional info for needed MediaWiki-related
changes, which I've copied here:

https://www.mediawiki.org/wiki/IPv6_support_roadmap

I've created a tracking bug here for blocking issues:

https://bugzilla.wikimedia.org/show_bug.cgi?id=35540

We'll continue to sync up on this internally both from an ops and
development standpoint, but anyone who wants to help move this
forward, please do.

-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation

Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[MediaWiki-CodeReview] [Wikimedia r1558]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Pgehres (WMF) changed the status of Wikimedia.r1558 to deferred
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1558

Old status:  new
New status: deferred

Commit summary for Wikimedia.r1558:

File cleanup. No code changes.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1557]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Pgehres (WMF) changed the status of Wikimedia.r1557 to deferred
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1557

Old status:  new
New status: deferred

Commit summary for Wikimedia.r1557:

Adding boolean constant to check if CiviCRM testing is enabled. See r1556.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1556]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Pgehres (WMF) changed the status of Wikimedia.r1556 to deferred
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1556

Old status:  new
New status: deferred

Commit summary for Wikimedia.r1556:

Adding boolean constant to check if CiviCRM testing is enabled.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1555]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Pgehres (WMF) changed the status of Wikimedia.r1555 to deferred
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1555

Old status:  new
New status: deferred

Commit summary for Wikimedia.r1555:

Clean up and documentation. See r1540.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1538]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Pgehres (WMF) changed the status of Wikimedia.r1538 to deferred
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1538

Old status:  new
New status: deferred

Commit summary for Wikimedia.r1538:

Added more options for testing in executeDrush(). Added methods: 
getDataForContributionsTable(), createRecordInContributionsTable(), 
getRowsFromContributionsTable(), truncateWatchdogTable() and 
createUserContribution().

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1539]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Pgehres (WMF) changed the status of Wikimedia.r1539 to deferred
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1539

Old status:  new
New status: deferred

Commit summary for Wikimedia.r1539:

Added new test suite for testing drush.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1540]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Pgehres (WMF) changed the status of Wikimedia.r1540 to deferred
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1540

Old status:  new
New status: deferred

Commit summary for Wikimedia.r1540:

Added conditionals on TestConfiguration.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[MediaWiki-CodeReview] [Wikimedia r1541]: Revision status changed

2012-03-27 Thread MediaWiki Mail
Pgehres (WMF) changed the status of Wikimedia.r1541 to deferred
URL: http://www.mediawiki.org/wiki/Special:Code/Wikimedia/1541

Old status:  new
New status: deferred

Commit summary for Wikimedia.r1541:

Added conditionals on TestConfiguration. See r1540.

___
MediaWiki-CodeReview mailing list
mediawiki-coderev...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview


[Wikitech-l] Google Summer of Code - Integrate upload from Flickr Taxobox (Ryan Kaldari)

2012-03-27 Thread David Palacios
Hi there,

I'm Currently working on a major in Computer Science and Technology and I
want to get involved in the open source community (I am a newbie).  From
the listed ideas, Integrate upload from Flickr and Taxobox caught my
attention  and I am working on a proposal, but I didn't want to wait more
time to introduce myself to you and the community.

I have experience in web development using javascript, jQuery, HTML5 and
PHP, I also have used MySQL on school projects (I have more experience in
SQL Server because an internship I did). The past 3 days I've been getting
familiar with the resources available for developers (MediaWiki
architecture, Manual:code and Coding conventions ). I am glad to say that I
am familiar with most of the javascript and PHP coding conventions.

I'll let you know when I post it on my userpage, hoping I can receive some
feedback.

Regards,
-- 
David Palacios
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Google Summer of Code - Integrate upload from Flickr Taxobox (Ryan Kaldari)

2012-03-27 Thread Sumana Harihareswara
On 03/27/2012 06:35 PM, David Palacios wrote:
 Hi there,
 
 I'm Currently working on a major in Computer Science and Technology and I
 want to get involved in the open source community (I am a newbie).  From
 the listed ideas, Integrate upload from Flickr and Taxobox caught my
 attention  and I am working on a proposal, but I didn't want to wait more
 time to introduce myself to you and the community.
 
 I have experience in web development using javascript, jQuery, HTML5 and
 PHP, I also have used MySQL on school projects (I have more experience in
 SQL Server because an internship I did). The past 3 days I've been getting
 familiar with the resources available for developers (MediaWiki
 architecture, Manual:code and Coding conventions ). I am glad to say that I
 am familiar with most of the javascript and PHP coding conventions.
 
 I'll let you know when I post it on my userpage, hoping I can receive some
 feedback.
 
 Regards,

Welcome, David, and good luck with your application.  Do speak up in our
IRC channel #mediawiki if you need any immediate help!
https://www.mediawiki.org/wiki/Communication

-- 
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] GSoC 12 Proposal

2012-03-27 Thread Shivansh Srivastava
Hi,

I am Shivansh, pursuing Engineering at BITS Pilani, currently in my 3rd
year.

I am well versed in Web Technologies, that include HTML5, JavaScripts
(jQuery),
CSS; with PHP  MySQL and App Development for Windows Phone 7.

I have worked in my college on several websites with the same knowledge. I
had
also given a talk at the 3rd WikiConference held at Mumbai on Improving
Wiki UI
using AJAX  jQuery  presented with 4 ideas/projects with the Wiki
community.

I am interested in developing the Frontend, as I am pretty comfortable in
designing UI using HTML, jQuery (javascripts), CSS  HTML5, particularly
dealing with Gadgets  Extensions.

I have already worked on the MediaWiki Software,  generally play around
with the extensions, but I mostly try to develop on the user common.js .

After a series of discussions  feedback, I made the proposal page -
https://www.mediawiki.org/wiki/User:Shivansh13#My_Proposal_for_Ideas .
Kindly look at it  let me me know the feasibility of such a project.

Waiting for a reply.

Thank you.
With Regards,

-- 
Shivansh Srivastava | +91-955-243-5407 |
http://in.linkedin.com/pub/shivansh-srivastava/17/a50/b18mr.shivansh.srivast...@gmail.com
mr.shivansh.srivast...@gmail.com3rd Year Undergraduate | B.E. (Hons.) -
Electronics  Instrumentation
BITS-Pilani.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Creating a thumbnail with File::createThumb()

2012-03-27 Thread Daniel Renfro
MW gurus,

I am working on an API module to an extension and would like to create 
thumbnails programmatically for some images (if they don't already have them.)

The includes/filerepo/File.php file contains a createThumb() method, which 
looks like it's what I want. From the comment block directly above the 
aforementioned method:
/**
 * Create a thumbnail of the image having the specified width/height.
 * The thumbnail will not be created if the width is larger than the
 * image's width. Let the browser do the scaling in this case.
 * The thumbnail is stored on disk and is only computed if the thumbnail
 * file does not exist OR if it is older than the image.
 * Returns the URL.
 *
 * 
 */

However, this method always returns the url of the file itself and not the 
thumb. From what I can tell it never generates the thumbnail (it's not in the 
filesystem repo in any directory.) My code is:

?php

# ...query to get a list of recently uploaded images (quite simple) 
$result =  $dbr-select();

# loop through them and get a thumbnail  url
foreach ( $result as $row ) {
  $title = Title::newFromText( $row-page_title, NS_FILE );
  $file = wfFindFile( $title );
  if ( !$file ) {
continue;
  }
  $thumbnail_url = $file-createThumb( 80 ); # width in pixels
  ...add to the API result...
}
...return...

?

I'm sure that my query return valid page titles/namespaces, and that the files 
exist (both in the wiki and in the filesystem.) They are all local, and some 
are quite large. I'd hate to have to send the entire image and make the browser 
do the scaling, as the thumbnail will get reused and the resizing is only done 
once.

Any ideas fellow MW gurus? What am I missing?

-Daniel Renfro

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] I don't want a full git clone, I just want an rsync(1) style diet git clone

2012-03-27 Thread jidanni
 C == Chad  innocentkil...@gmail.com writes:
C No. Most people don't want --bare anyway...they're likely
C to get confused.

So what do I want?

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] I don't want a full git clone, I just want an rsync(1) style diet git clone

2012-03-27 Thread Chad
A regular git clone.

-Chad
On Mar 27, 2012 10:34 PM, jida...@jidanni.org wrote:

  C == Chad  innocentkil...@gmail.com writes:
 C No. Most people don't want --bare anyway...they're likely
 C to get confused.

 So what do I want?

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Gerrit bots notification

2012-03-27 Thread Ryan Lane
It already supports sending to multiple channels. I can't go into
details right now, but I'll follow up on this later with more info.

On Wed, Mar 28, 2012 at 6:06 AM, Antoine Musso hashar+...@free.fr wrote:
 Le 27/03/12 21:10, Max Semenik a écrit :
 I think these notifications should be sent to #mediawiki too, as this
 concerns WM/MW in general, too.

 Unfortunately the python script does not support sending to multiple
 channels yet.  Something any pythonist should be able to do :-]

 Can you possibly open a bug about the ability to send to multiple
 channels?  Since that is a Gerrit hook, it should be in Wikimedia
 product under the git/gerrit component.

 Thanks!

 --
 Antoine hashar Musso


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Committing followups: please no --amend

2012-03-27 Thread Tim Starling
On 28/03/12 05:33, Marcin Cieslak wrote:
 Tim Starling tstarl...@wikimedia.org wrote:

 It doesn't work, I'm afraid. Because of the implicit rebase on push,
 usually subsequent changesets have a different parent. 
 
 How does the implicit rebase on push work? Do you mean git-review?

Yes, git-review.

 I don't know whether git push remote HEAD:for/master/topic
 rebases anything. 

It doesn't. But you often have to do a rebase before Gerrit can merge
the change into master.

-- Tim Starling


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] I don't want a full git clone, I just want an rsync(1) style diet git clone

2012-03-27 Thread jidanni
I know. I will use --depth 1 !

$ git clone --depth 1 https://gerrit.wikimedia.org/r/p/mediawiki/core.git 
mediawiki
Cloning into 'mediawiki'...
error: RPC failed; result=22, HTTP code = 500
fatal: The remote end hung up unexpectedly
$ git clone --depth 2 https://gerrit.wikimedia.org/r/p/mediawiki/core.git 
mediawiki
Cloning into 'mediawiki'...
error: RPC failed; result=22, HTTP code = 500
fatal: The remote end hung up unexpectedly
$ git clone --depth 0 https://gerrit.wikimedia.org/r/p/mediawiki/core.git 
mediawiki
Cloning into 'mediawiki'...
remote: Counting objects: 356168, done
remote: Finding sources: 100% (356168/356168)
Receiving objects:   8% (29137/356168), 6.38 MiB | 112 KiB/s

I don't really know what --depth 0 will get me. Fingers crossed.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Creating a thumbnail with File::createThumb()

2012-03-27 Thread Hunter Fernandes
Since no one has replied at this time, I thought I'd give a crack at an
answer (I really don't know if this will work).

Some goodies from a quick scan of thumb.php:

$img = wfLocalFile( $fileName );
$params = array(
'width' = 100,
'height' = 100,
);
$img-transform( $params, File::RENDER_NOW );
$localpath = $thumb-getPath();
$thumbPath = $img-getThumbPath( $thumbName );

That's my guess. I havn't tested it or anything.
- Hunter F.


On Tue, Mar 27, 2012 at 7:03 PM, Daniel Renfro dren...@vistaprint.comwrote:

 MW gurus,

 I am working on an API module to an extension and would like to create
 thumbnails programmatically for some images (if they don't already have
 them.)

 The includes/filerepo/File.php file contains a createThumb() method, which
 looks like it's what I want. From the comment block directly above the
 aforementioned method:
/**
 * Create a thumbnail of the image having the specified width/height.
 * The thumbnail will not be created if the width is larger than the
 * image's width. Let the browser do the scaling in this case.
 * The thumbnail is stored on disk and is only computed if the thumbnail
 * file does not exist OR if it is older than the image.
 * Returns the URL.
 *
 * 
 */

 However, this method always returns the url of the file itself and not the
 thumb. From what I can tell it never generates the thumbnail (it's not in
 the filesystem repo in any directory.) My code is:

 ?php

 # ...query to get a list of recently uploaded images (quite simple) 
 $result =  $dbr-select();

 # loop through them and get a thumbnail  url
 foreach ( $result as $row ) {
  $title = Title::newFromText( $row-page_title, NS_FILE );
  $file = wfFindFile( $title );
  if ( !$file ) {
continue;
  }
  $thumbnail_url = $file-createThumb( 80 ); # width in pixels
  ...add to the API result...
 }
 ...return...

 ?

 I'm sure that my query return valid page titles/namespaces, and that the
 files exist (both in the wiki and in the filesystem.) They are all local,
 and some are quite large. I'd hate to have to send the entire image and
 make the browser do the scaling, as the thumbnail will get reused and the
 resizing is only done once.

 Any ideas fellow MW gurus? What am I missing?

 -Daniel Renfro

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


  1   2   >