[Wikitech-l] Challenge: pass jshint on all extensions

2014-07-07 Thread Antoine Musso
Hello,

MediaWiki has JavaScript coding conventions, which are enforced on core and
some extensions by running jshint in Jenkins:

  http://www.mediawiki.org/wiki/CC/JS

Unfortunately, a lot of extensions do not meet that expectation yet, and it
would be rather nice to bring them up to par.


We have a tracking bug listing some repositories which are not passing
jshint yet:

https://bugzilla.wikimedia.org/show_bug.cgi?id=60619

It would be rather nice to have all extensions fixed :-]
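For extension maintainers wondering where to start: dropping a `.jshintrc` along these lines into the extension's root directory is what Jenkins picks up. This is an illustrative sketch only; check the conventions page above for the options MediaWiki actually mandates.

```json
{
	"browser": true,
	"curly": true,
	"eqeqeq": true,
	"undef": true,
	"unused": true,
	"predef": [ "mediaWiki", "jQuery" ]
}
```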


-- 
Antoine hashar Musso


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Moving Mathoid to production cluster

2014-07-07 Thread Moritz Schubotz
Hi,

during the last year the Math extension achieved a goal defined back
in 2003: support for MathML. In addition, there is SVG support for
browsers without MathML. (See http://arxiv.org/abs/1404.6179 for the
details.)
I would like to give Wikipedia users a chance to test this long-awaited
new feature.
Therefore we would need a Mathoid instance that is accessible from the
production cluster. Greg Grossmeier already created the required table
in the database. (Sorry for the friction connected with this
process.)
Currently the MathJax team is working on a PhantomJS-less method to
render texvc input to MathML and SVG. Some days ago I tested it,
and it works quite well. I would appreciate a discussion with Ops
to figure out how this can go to production. The original idea
was to use Jenkins to build the Mathoid Debian package. Even though
the package builds without any issues in the Launchpad PPA repo,
Jenkins cannot build it. If there is a reference project
that uses Jenkins to build Debian packages that go to production, that
would really help to figure out what is different for Mathoid and why
the package build fails even though it works on Launchpad.

Best
Physikerwelt

PS: I was informed that there is a related RT ticket that I cannot access:
https://rt.wikimedia.org/Ticket/Display.html?id=6077

-- 
Kind regards,
Moritz Schubotz


[Wikitech-l] tool for quickly uploading 10000 images with description

2014-07-07 Thread Yury Katkov
Hi everyone!

Does anyone know of a tool that can help upload a lot of files and
create a page for every file with a given description? I'd say that it
should be a maintenance script, since for some reason the API upload is
pretty slow. I saw the UploadLocal extension, but it's too manual and it
doesn't work well when the number of files to upload is very large.

Cheers,
-
Yury Katkov

Re: [Wikitech-l] tool for quickly uploading 10000 images with description

2014-07-07 Thread Florian Schmidt
Hello!

I think this is the right one for you:
https://m.mediawiki.org/wiki/Manual:ImportImages.php

Simply upload the images to the server (via FTP, for example) and run the
script as explained on the manual page.

Kind regards
Florian

Sent from my HTC

- Reply message -
From: Yury Katkov katkov.ju...@gmail.com
To: Wikimedia developers wikitech-l@lists.wikimedia.org
Subject: [Wikitech-l] tool for quickly uploading 10000 images with description
Date: Mon., July 7, 2014 16:36


[Wikitech-l] Mentors participating in GSoC Reunion (was Re: IMPORTANT: GSoC OPW mid-term evaluations)

2014-07-07 Thread Quim Gil
Hi

On Tue, Jun 17, 2014 at 12:14 AM, Quim Gil q...@wikimedia.org wrote:


 If you want to be a Wikimedia delegate at the Google Summer of Code
 Reunion, apply before the end of June at
 https://www.mediawiki.org/wiki/Talk:Mentorship_programs/Possible_mentors


Only Siebrand has signed up at the wiki page so far. No other GSoC mentors
interested?

-- 
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil

Re: [Wikitech-l] tool for quickly uploading 10000 images with description

2014-07-07 Thread Yury Katkov
It's a nice one, thanks! I will need to add just a little bit to it to suit
my needs!

-
Yury Katkov



Re: [Wikitech-l] tool for quickly uploading 10000 images with description

2014-07-07 Thread Nasir Khan
The ImportImages.php maintenance script
https://m.mediawiki.org/wiki/Manual:ImportImages.php
is very useful. I used it to upload about 5000 images
to one of my MediaWiki projects.


--
*Nasir Khan Saikat*
www.nasirkhn.com




Re: [Wikitech-l] tool for quickly uploading 10000 images with description

2014-07-07 Thread hoo
I guess you can just use the importImages maintenance script. I usually
use it for small-scale uploads (big videos etc.) to Commons, but it
should work well for larger amounts of media as well.

Cheers,

Marius






Re: [Wikitech-l] question about standardized thumbnail sizes

2014-07-07 Thread C. Scott Ananian
Another means to accomplish the same goal (more standardized thumbnail
sizes) is the "semantic markup for images" proposal mooted about
(but not yet formalized, I don't think).  The idea would be to
strongly encourage authors to use more semantic markup, and to give more
authority to the renderer (responsive theme, PDF, etc.) to choose an
appropriate image size and layout.  Articles just look better if the
images are scaled to consistent sizes.
  --scott

On Thu, Jul 3, 2014 at 2:48 AM, Gilles Dubuc gil...@wikimedia.org wrote:
 I would be in favor of waiting to see if the ongoing work described by
 Gergo is sufficient to address Ops' issues before doing something with the
 thumbnail size standardization RfC. It came about due to operational costs,
 but if thumbnail rendering becomes a lot faster (thanks to bucketing, for
 example), it might not be necessary to standardize image sizes anymore. And
 if it is, I would rather start a new RfC with a narrower proposition.


 On Tue, Jun 24, 2014 at 3:23 AM, Sumana Harihareswara suma...@wikimedia.org
 wrote:

 I asked some folks about


 https://www.mediawiki.org/wiki/Requests_for_comment/Standardized_thumbnails_sizes
 .
 Antoine, the original author, said on the talk page:

 We had several mailing list discussions in 2012 / beginning of 2013
 regarding optimizing thumbnail rendering. That RFC is merely a summary of
 the discussions and is intended to avoid repeating ourselves in each
 discussion. I am not leading the RFC by any means; it would be nice to have
 the new multimedia team take leadership there.

 Gergo of the multimedia team has a question about whether he should start
 a new RfC, and a question for Ops (below), which he said I could forward to
 this list, so I'm doing so. :-)

 If we can settle this onlist, cool. Otherwise I'll be setting up an IRC
 chat for later this week.


 Sumana Harihareswara
 Senior Technical Writer
 Wikimedia Foundation


 On Fri, Jun 20, 2014 at 12:27 PM, Gergo Tisza gti...@wikimedia.org
 wrote:


 Hi Sumana!

 We are working on some form of standardized thumbnail sizes, but it is
 not exactly the same issue that is discussed in the RfC
 https://www.mediawiki.org/wiki/Requests_for_comment/Standardized_thumbnails_sizes
 .

 The problem we have run into is that MediaViewer fits the image size to
 the browser window size (which means a huge variety of image sizes even
 when the browser window is fully enlarged, and practically infinite
 otherwise),
 but thumbnail rendering is very slow and waiting for it would result in a
 crappy user experience. We started using a list of standardized thumbnail
 sizes, so that MediaViewer always requests one of these sizes from the
 browser and rescales them with CSS, but even so the delay remains
 problematic for the first user who requests the image with a given bucket.
 To address that, we are working with ops towards automatically rendering
 the thumbnails in those sizes as soon as the image is uploaded.
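To illustrate the bucketing idea described above: the client picks the smallest pregenerated width that is at least as large as the browser window, and lets CSS scale it down. This is a sketch only; the actual bucket list and selection logic used by MediaViewer may differ.

```python
# Hypothetical bucket list; MediaViewer's real list may differ.
THUMB_BUCKETS = [320, 640, 800, 1024, 1280, 1920, 2560, 2880]

def pick_bucket(window_width):
    """Return the smallest bucket >= window_width, so the image is
    only ever downscaled by CSS, never upscaled."""
    for width in THUMB_BUCKETS:
        if width >= window_width:
            return width
    return THUMB_BUCKETS[-1]  # very wide windows get the largest size

print(pick_bucket(1366))  # -> 1920
```

Because every client asking for a 1366px-wide window gets the same 1920px thumbnail, the rendering cost is paid once per bucket instead of once per window size.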

 Another possibility related to standardized thumbnail sizes that we are
 exploring is to speed up the thumbnail generation for large images by
 having a list of sizes for which the thumbnail is pregenerated and always
 present, and resize one of those thumbnails instead of the original to
 generate the size requested by the user. The goal of this would be to avoid
 overloading the scalers when several large images need to be thumbnailed at
 the same time (GWToolset caused outages this way on a few occasions).

 I can create an RfC about one or both of the above issues if there is
 interest in wider discussion. I don't know whether the current thumbnail
 size standardization RfC should be replaced with those, though; its goals
 are not stated, but seem to be mainly operations concerns (how to make sure
 thumbnails don't take up too much storage space). Maybe ops wants to take
 it over, or provide clearer goals in that regard for the multimedia team to
 work towards.







-- 
(http://cscott.net)


Re: [Wikitech-l] learning Ops infrastructure (was: Re: 404 errors)

2014-07-07 Thread Sumana Harihareswara
Pine, two more:

* https://www.mediawiki.org/wiki/File:Schulz-performance.pdf - the slides
about Full Stack Performance by Aaron Schulz from
http://www.meetup.com/SF-Web-Performance-Group/events/182182062/
* Faidon Liambotis's dotScale talk on the Wikimedia infrastructure:
https://www.youtube.com/watch?v=646mJu5f2cQ

I added both to https://www.mediawiki.org/wiki/Presentations#2014 .



Sumana Harihareswara
Senior Technical Writer
Wikimedia Foundation


On Wed, Jun 4, 2014 at 3:24 PM, ENWP Pine deyntest...@hotmail.com wrote:

 Thanks Sumana,

 That's good info to have. I'll look through those links.

 That diagram may make its way into the presentation that I'm drafting. The
 presentation has ballooned to an alarming length already, but I'm going to
 try to complete it in outline form before pruning.

 If someone else makes presentation slides available about infrastructure
 under a license that allows reuse I would greatly appreciate it.

 By the way, I appreciated the overview of UX in your keynote [1].

 Pine

 [1]
 http://wiki.code4lib.org/index.php/2014_Keynote_by_Sumana_Harihareswara


  Date: Tue, 03 Jun 2014 09:41:34 -0400
  From: Sumana Harihareswara suma...@wikimedia.org
  To: Wikimedia developers wikitech-l@lists.wikimedia.org
  Subject: [Wikitech-l] learning Ops infrastructure (was: Re:  404
errors)
  Message-ID: 538dd08e.1000...@wikimedia.org
  Content-Type: text/plain; charset=UTF-8
 
  Hi, Pine.
 
  I, too, am interested in building our understanding of our TechOps
  infrastructure. https://www.mediawiki.org/wiki/Presentations has some
  explanations of some parts, as does http://wikitech.wikimedia.org/ . I
  welcome more links to guides/overviews.
 
  At the recent Zurich hackathon, other developers agreed that it would be
  good to have a guide to Wikimedia's digital infrastructure, especially
  how MediaWiki is used.
  https://www.mediawiki.org/wiki/Overview_of_Wikimedia_infrastructure is
   a homepage with approximately nothing on it right now except this
  diagram of our server architecture:
 
 https://commons.wikimedia.org/wiki/File:Wikimedia_Server_Architecture_%28simplified%29.svg
 
  You might find the Performance Guidelines illuminating
  https://www.mediawiki.org/wiki/Performance_guidelines and you might also
  like the recent tech talk about how we make Wikipedia fast, by Ori
  Livneh and Aaron Schulz, recently - see
  http://www.youtube.com/watch?v=0PqJuZ1_B6w (I don't know when the video
  is going up on Commons).
 
  --
  Sumana Harihareswara
  Senior Technical Writer
  Wikimedia Foundation
 
 
  On 05/30/2014 06:30 PM, ENWP Pine wrote:
  
   Ori, thanks for following up.
  
   I think I saw somewhere that there is a list of postmortems for tech
 ops disruptions
   that includes reports like this one. Do you know where the list is? I
 tried a web search
   and couldn't find a copy of this report outside of this email list.
  
   I personally find this report interesting and concise, and I am
 interested in
   understanding more about the tech ops infrastructure. Reports like
 this one
   are useful in building that understanding. If there's an overview of
 tech ops
   somewhere I'd be interested in reading that too. The information on
 English
   Wikipedia about WMF's server configuration appears to be outdated.
  
   Thanks,
  
   Pine
  
  



[Wikitech-l] Engineering Community Team - IRC Hour: July 8th @1600 UTC

2014-07-07 Thread Rachel Farrand
Please feel free to join the Engineering Community Team (ECT) tomorrow
(July 8th) at 1600 UTC
http://www.timeanddate.com/worldclock/fixedtime.html?msg=ECT+Office+Hoursiso=20140708T09p1=224ah=1
in #wikimedia-office on IRC. The ECT hosts office hours in
#wikimedia-office the second Tuesday of every month.

Relevant links will be added here
https://www.mediawiki.org/wiki/Engineering_Community_Team/Meetings/2014-07-08
before the meeting begins, and minutes from the meeting will also be added
after it is over.

Thanks!

Re: [Wikitech-l] tool for quickly uploading 10000 images with description

2014-07-07 Thread rupert THURNER
GLAMWikiToolset might be an option as well.
http://m.mediawiki.org/wiki/Extension:GWToolset/Technical_Design

rupert

Re: [Wikitech-l] tool for quickly uploading 10000 images with description

2014-07-07 Thread Derric Atzrott
 glamwikitoolset might be an option as well.
 http://m.mediawiki.org/wiki/Extension:GWToolset/Technical_Design

Yeah, I was just about to mention that.  The GLAM Community
seems reasonably satisfied with it and as far as I know it
supports adding a lot more metadata to images than the
image importer does.

Thank you,
Derric Atzrott



Re: [Wikitech-l] tool for quickly uploading 10000 images with description

2014-07-07 Thread Yury Katkov
Thanks everybody!

I've modified importImages to support not only directories but
also single files. I then wrote a small Python script that runs
importImages for each file in the directory, adding the corresponding
metadata from a CSV file.
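For anyone wanting to replicate this, the driver script Yury describes might look roughly like the following. This is a sketch under stated assumptions: a CSV with `filename,description` columns, and an importImages.php patched to accept a single file; the `--comment` flag and single-file argument are hypothetical and should be adjusted to whatever your patched script accepts.

```python
import csv
import subprocess

def build_command(filename, description):
    """Build the importImages.php invocation for one file.
    The single-file argument and --comment flag are hypothetical;
    adjust to what your patched maintenance script accepts."""
    return ['php', 'maintenance/importImages.php',
            filename, '--comment', description]

def import_from_csv(csv_path, run=subprocess.run):
    # Assumed CSV layout: filename,description (with a header row).
    with open(csv_path, newline='') as f:
        for row in csv.DictReader(f):
            run(build_command(row['filename'], row['description']),
                check=True)
```

Running the maintenance script once per file is slower than a single batch invocation, but it keeps the per-file description handling trivial.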

-
Yury Katkov



Re: [Wikitech-l] MediaWiki Front-End Standardization Chat

2014-07-07 Thread Sumana Harihareswara
On Wed, Jun 25, 2014 at 2:53 PM, Erik Moeller e...@wikimedia.org wrote:

 Raw logs here:

 https://tools.wmflabs.org/meetbot/wikimedia-office/2014/wikimedia-office.2014-06-25-17.30.log.html


Since that link doesn't work anymore:
https://www.mediawiki.org/wiki/User:Sharihareswara_%28WMF%29/2014-06-25_Front-End_Standardization_Chat

Sumana Harihareswara
Senior Technical Writer
Wikimedia Foundation




 Next steps:

 1) Trevor, Roan, Timo, Kaldari and others will refine the proposal at
 https://www.mediawiki.org/wiki/Requests_for_comment/Redo_skin_framework as
 a concrete step to develop a standard UI framework for MediaWiki Core.

 2) The proposal on the table is to implement this new skin framework, port
 existing skins in MW core, and port it to mobile as a skin to ensure that
 we're developing a multi-device, responsive design framework.

 3) We'll reconvene with a dedicated IRC session about the refined RFC, and
 then seek to create technical alignment and determine the exact allocation
 of development effort beyond the team above. This will be in about two
 weeks.

 - - - - -

 The aforementioned work will be coordinated on-wiki and via Bugzilla.

 This proposal will only address parts of UX standardization. It does not
 currently focus on look and feel itself, though it would lay the groundwork
 for more CSS standardization as well. Other aspects - such as more
 consistent browser support expectations - need to be resolved
 independently.

 Erik
 --
 Erik Möller
 VP of Engineering and Product Development, Wikimedia Foundation

Re: [Wikitech-l] tool for quickly uploading 10000 images with description

2014-07-07 Thread James Montalvo
Care to share it?

Re: [Wikitech-l] MW-Vagrant improvements at the Zürich Hackathon

2014-07-07 Thread Adam Wight
Bryan,

I think I need to take you up on the offer to help.  I can do the coding,
but I need some borrowed insight to get started.  I don't think the
wikimania_scholarships model is a good one to follow, I'd much rather add
configurability to the mediawiki::wiki or multiwiki::wiki classes.
Unfortunately, I see a lot of cascading changes being necessary, which
makes me think I'm on the wrong track.

-Adam


On Fri, Jun 13, 2014 at 5:29 PM, Bryan Davis bd...@wikimedia.org wrote:

 Adam,

 I wanted to avoid the complexity of a full multi version setup if we
 could, but if there are more than a couple of roles that would benefit from
 such features it would be possible. The easiest thing for your payment wiki
 may be to follow the pattern of the wikimania_scholarships role and add
 your own apache vhost and git checkout management. I'd be glad to help you
 work on this if you'd like some assistance. We might be able to add a bit
 more configurability to some of the existing puppet classes and defines to
 make working with multiple checkouts of mw-core easier.

 Bryan

 On Friday, June 13, 2014, Adam Wight awi...@wikimedia.org wrote:

 Bryan and Chris,
 The multiwiki work is fantastic, a big thank you for pursuing this!  I
 tried to use your new module to provide a vagrant development environment
 for Fundraising's payments wiki [1], and I ran up against a large and
 very solid-looking wall that I think is worth mentioning.  We maintain a
 special release branch of MediaWiki for payments, with a bit of security
 hardening.  We cannot follow trunk development without carefully reading
 over the new features, and we need to develop against this target so that
 we catch version incompatibilities before deployment.

 I see that multiwiki encapsulates the various wikis by configuration only,
 and they all share the main codebase.  Do you have multiple checkouts of
 MediaWiki-core on your roadmap, or are we a fringe case?  I'd like to help
 support our development under vagrant, but this issue is a bit of a
 blocker.  Any advice would be appreciated.

 Thanks,
 Adam

 [1] https://gerrit.wikimedia.org/r/135326, production is
 https://payments.wikimedia.org


 On Wed, May 21, 2014 at 9:55 AM, Bryan Davis bd...@wikimedia.org wrote:

  On Fri, May 16, 2014 at 2:40 PM, Arthur Richards
  aricha...@wikimedia.org wrote:
  
   CentralAuth/Multiwiki:
   Bryan Davis, Chris Steipp, and Reedy spent a lot of time hacking on
 this,
   and we now have support for multiwiki/CentralAuth in Vagrant! There is
   still some cleanup work being done for the role to remove
  kludge/hacks/etc
   (see https://gerrit.wikimedia.org/r/#/c/132691/).
 
  The CentralAuth role and the associated puppet config that allows
  creation of multiple wikis as Apache virtual hosts on a single
  MediaWiki-Vagrant virtual machine have been merged! Go forth and
  debug/extend CentralAuth. :)
 
  I'd love to see additional roles created that use the multiwiki::wiki
  Puppet define to add interesting things for testing/debugging like RTL
  wikis or other complex features such as WikiData that use a
  collaboration between multiple wikis in the WMF production cluster.
  If you're interested in working on something like this and get stuck
  with the Puppet code needed or find shortcomings in the setup that
  Chris and I developed I'd be glad to try and help work through the
  issues.
 
  Bryan
  --
  Bryan Davis  Wikimedia Foundationbd...@wikimedia.org
  [[m:User:BDavis_(WMF)]]  Sr Software EngineerBoise, ID USA
  irc: bd808v:415.839.6885 x6855
 
 



 --
 Bryan Davis  Wikimedia Foundationbd...@wikimedia.org
 [[m:User:BDavis_(WMF)]]  Sr Software EngineerBoise, ID USA
 irc: bd808v:415.839.6885 x6855


Re: [Wikitech-l] [Wikidata-tech] reviews needed for pubsubhubbub extension

2014-07-07 Thread Rob Lanphier
On Fri, Jul 4, 2014 at 7:16 AM, Lydia Pintscher 
lydia.pintsc...@wikimedia.de wrote:

 Wikimedia Germany has been working with a team of students over the
 past months. Among other things, they have developed a PubSubHubbub
 extension. The idea is that we allow third parties to easily subscribe
 to changes made on Wikidata or any other wiki. Wikidata's changes get
 sent to a hub, which then notifies all subscribers who are interested
 in the change.
 We'd like to get this extension deployed for Wikidata, and possibly
 later for other Wikimedia projects. For this it needs some more eyes to
 review it. RobLa suggested I send an email to wikitech-l about it.
 The review bugs for this are
 https://bugzilla.wikimedia.org/show_bug.cgi?id=67117 and
 https://bugzilla.wikimedia.org/show_bug.cgi?id=67118
 Thanks for your help getting this ready for deployment.


Hi Lydia,

Thanks for providing the basic overview of this.  Could you (or someone on
the team) provide an explanation about how you would like this to be
configured on the Wikimedia cluster?  Is this something that you see anyone
being able to subscribe to, or would this be something that would only be
available to a limited list of third parties?

Also, based on our last conversation, it sounds like we're up against some
time constraints here with respect to the students' time; could you clarify?

Thanks
Rob
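For context on what "subscribing" involves: in the PubSubHubbub protocol, a third party registers its callback URL with the hub via a form-encoded POST using the spec's `hub.*` fields. The sketch below shows the parameters such a request carries; the URLs are placeholders, not actual Wikidata endpoints.

```python
def build_subscribe_request(callback, topic):
    """Parameters for a PubSubHubbub subscription POST, per the
    spec's hub.* form fields. The URLs passed in are placeholders."""
    return {
        'hub.mode': 'subscribe',
        # Subscriber's endpoint; the hub verifies intent by sending
        # it a GET with a challenge that must be echoed back.
        'hub.callback': callback,
        # The feed the subscriber wants change notifications for.
        'hub.topic': topic,
    }

params = build_subscribe_request(
    'https://example.org/push-callback',
    'https://hub.example.org/topic/wikidata-changes',
)
```

Whether anyone may POST such a request or only a vetted list of third parties is exactly the configuration question raised above.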

[Wikitech-l] HTML templating progress; Knockout Components curly brace syntax

2014-07-07 Thread Gabriel Wicke
Hi all,

There are several new and exciting developments in HTML templating land.

First of all, Ryan Kaldari kindly tested the Knockoff / TAssembly HTML
templating library from a Mobile perspective [1,2], and summarized his
findings on the wiki [3]. The main missing feature he identified was the
ability to mark up control flow outside of HTML elements. To address this, I
just added support for Knockout's comment syntax to the Knockoff compiler
[4]. This turned out to be straightforward, and required no changes in the
TAssembly runtime.

Secondly, Dan spotted two very promising developments in upstream KnockoutJS:

== Knockout Components ==
There's a new version of KnockoutJS coming that might bridge a gap we saw at
the architecture summit in January, the one between templating and
widget...ing. Knockout components [5,6] could be that bridge; they're
basically Web Components, but possible now.

== Curly braces ==
The other thing they're adding is text interpolation, so you can write
<span>{{name}}</span> and people don't have to be sad about Knockout's HTML
attribute syntax. This can simplify migration for existing
Handlebars-compatible templates as used by Mobile and Flow.

The KnockoutJS release is currently in early beta [7], with the release
expected later this summer.

Gabriel  Dan

[1]
https://www.mediawiki.org/wiki/Requests_for_comment/HTML_templating_library/Knockoff_-_Tassembly
[2] https://github.com/gwicke/knockoff/, https://github.com/gwicke/tassembly/
[3]
https://www.mediawiki.org/wiki/Requests_for_comment/HTML_templating_library/Knockoff_-_Tassembly/Mobile_spike
[4]
https://github.com/gwicke/knockoff/commit/0e516979c3d1fb35489c58e7b659c3ad0b3cdf94
[5] http://www.knockmeout.net/2014/06/knockout-3-2-preview-components.html
[6]
http://blog.stevensanderson.com/2014/06/11/architecting-large-single-page-applications-with-knockout-js/
[7] https://github.com/knockout/knockout/tree/gh-pages-3.2.0
