Re: [Wikitech-l] Tools for dealing with citations of withdrawn academic journal articles

2015-09-16 Thread Gergő Tisza
On Tue, Aug 18, 2015 at 2:42 AM, Pine W  wrote:

> Is there any easy way to find all citations of specified academic
> articles on Wikipedias in all languages, and the text that is supported by
> those references, so that the citations of questionable articles can be
> removed and the article texts can be quickly reviewed for possible changes
> or removal?
>

Not right now, but Aaron and James are working on it!
http://librarybase.wmflabs.org/
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Fwd: [Tools] Kubernetes picked to provide alternative to GridEngine

2015-09-16 Thread Yuvi Panda
FYI


-- Forwarded message --
From: Yuvi Panda 
Date: Wed, Sep 16, 2015 at 6:25 PM
Subject: [Tools] Kubernetes picked to provide alternative to GridEngine
To: labs-annou...@lists.wikimedia.org


We have picked kubernetes.io to provide an alternative to GridEngine
on Tool Labs! See https://phabricator.wikimedia.org/T106475 for more
details about the evaluation itself,
https://docs.google.com/spreadsheets/d/1YkVsd8Y5wBn9fvwVQmp9Sf8K9DZCqmyJ-ew-PAOb4R4/edit?pli=1#gid=0
for the spreadsheet with the evaluation matrix and
https://lists.wikimedia.org/pipermail/labs-l/2015-August/003955.html
for the previous email listing pros and cons of the other solutions
that we considered.

== Rough Timeline ==

* October: Start beta testing by opening up Kubernetes to whitelisted
tools. Allows people to run arbitrary Docker images in Kubernetes,
both for continuous jobs and web services. If you are interested in
this, please add a comment to https://phabricator.wikimedia.org/T112824
* December: Work on migration tooling to assist in switching from
GridEngine to Kubernetes should be in a good place. This will begin
with a '--provider=kubernetes' parameter to the webservice command
that will allow people to easily switch their webservice to
Kubernetes (see the example after this list). We will have something
similar implemented for jsub / jstart.
* January: The Kubernetes cluster is opened up to all Tools users,
with fairly complete switching between GridEngine and Kubernetes for
at least continuous jobs and webservices.
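
(For the December item above: in practice, moving an existing webservice over
would presumably be a matter of running something like
"webservice --provider=kubernetes start", though the exact syntax is still to
be worked out.)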

== What does this give developers? ==

# A software stack that is made up of widely-used and
actively-developed software components. We are looking to dramatically
reduce the surface of code that we write and maintain in-house.
# (When out of beta) More stability and reliability.
# It allows us to get rid of a lot of our customizations (the entire
webservice setup, for example), which have proven to be a lot of work
and flaky at times.
# We can migrate tools that don't require NFS off of it, and since
NFS has historically been one of our flakiest services, this allows
tools to opt out of it and gain a lot more reliability.
# Using Docker allows more user freedom in a structured way that is
easier to support.
# Has an active upstream / is used elsewhere as well (Google Container
Engine, Microsoft Azure, RedHat OpenShift, AWS, etc.), so there is much
more help / community support available when users run into issues than
we have now.
# Probably more :) It opens up a lot of possibilities and I'm sure
developers will use it creatively :)

== Do I need to change anything? ==

No, especially not if you do not want to. We think Docker and Kubernetes
are exciting technologies, so we would like volunteers who are
interested in exploring these platforms to have the option of getting
their hands dirty. At the same time, it is important for us to avoid
complicating life for people for whom the current setup works well. If
we are successful, the migration to Docker and Kubernetes will be
seamless, and users of Tool Labs will not have to know what either
Docker or Kubernetes are, let alone how to operate them.

We will begin by offering arbitrary Docker image execution only.
Eventually we will make it super-easy to switch between the current
setup and Kubernetes, and allow people to use this without
having to know what Docker is or what Kubernetes is. That will slowly
be built over the next few months.

Absolutely nothing changes for developers or users right now.

== Will GridEngine be going away? ==

Not anytime soon!

Thanks!

--
Yuvi Panda T
http://yuvi.in/blog


-- 
Yuvi Panda T
http://yuvi.in/blog

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] thumb generation

2015-09-16 Thread gnosygnu
As another suggestion, XOWA (http://gnosygnu.github.io/xowa/) can generate
a list of thumbs. It takes about 60 hours to parse the English Wikipedia
dump and generate a table of 4.78 million rows with the following columns:

* file name
* file extension
* repo (commons or local)
* file width
* file height
* thumbtime (for video)
* page (for djvu / pdf)

There's more information in XOWA at
home/wiki/Help:Import/Command-line/Thumbs . I can provide more information
online or offline if you're interested.

If you need the actual thumb files, you can download XOWA databases from
https://archive.org/details/Xowa_enwiki_latest . They have about 5 million
thumbs within SQLite tables. It should be straightforward to write code to
pull the blobs from the database and save them to disk.
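
If it helps, here is a rough, untested sketch of that using Node's sqlite3
module; the table and column names are placeholders, so check them against the
actual schema in the downloaded database:

import * as fs from 'fs';
import * as path from 'path';
import * as sqlite3 from 'sqlite3';

// Usage: ts-node extract_thumbs.ts <xowa-file.sqlite3> [output-dir]
const dbPath = process.argv[2];
const outDir = process.argv[3] || 'thumbs';

if (!fs.existsSync(outDir)) {
  fs.mkdirSync(outDir);
}

const db = new sqlite3.Database(dbPath, sqlite3.OPEN_READONLY);

// "thumb", "file_name" and "file_data" are placeholder names, not the real
// XOWA schema.
db.each(
  'SELECT file_name, file_data FROM thumb',
  (err: Error | null, row: any) => {
    if (err) {
      console.error(err);
      return;
    }
    // Each blob comes back as a Buffer; write it out as an ordinary file.
    fs.writeFileSync(
      path.join(outDir, path.basename(row.file_name)),
      row.file_data
    );
  },
  () => db.close()
);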

Otherwise, as others have indicated, I know of no MediaWiki way to get this
information (via .sql dump file or by api.php). Since XOWA parses wikitext,
it can generate the information easily, though the solution is not
officially a MediaWiki one.

Hope this helps.


On Sun, Sep 13, 2015 at 1:43 AM, wp mirror  wrote:

> 0) Context
>
> I am currently developing new features for WP-MIRROR (see <
> https://www.mediawiki.org/wiki/Wp-mirror>).
>
> 1) Objective
>
> I would like WP-MIRROR to generate all image thumbs during the mirror build
> process. This is so that mediawiki can render pages quickly using
> precomputed thumbs.
>
> 2) Dump importation
>
> maintenance/importDump.php - this computes thumbs during importation, but
> is too slow.
> mwxml2sql - loads databases quickly, but does not compute thumbs.
>
> 3) Question
>
> Is there a way to compute all the thumbs after loading databases quickly
> with mwxml2sql?
>
> Sincerely Yours,
> Kent
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Link: useful article on in-person events

2015-09-16 Thread Pine W
That is a nice article.

In the Wikiverse, conferences are good venues for Q&A, structured dialogue,
building and maintaining personal connections, and brainstorming. Excluded
from this picture is the formal adoption of community policy, but brainstorming,
drafting, and creating policy proposals are all fine and probably benefit
from high-frequency interactions, both spontaneous and planned.



Pine


On Wed, Sep 16, 2015 at 11:33 AM, Frances Hocutt 
wrote:

> An excellent article on best practices and pitfalls for in-person technical
> events: https://modelviewculture.com/pieces/software-in-person
>
> This has some ideas and framings that can be useful for planning for the
> upcoming developer summit.
>
> -Frances
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Super-charging Mobile: An experiment with ServiceWorkers in MediaWiki

2015-09-16 Thread Jon Robson
On Wed, Sep 16, 2015 at 2:18 PM, Jon Robson  wrote:
> Sam asked me to write up my recent adventures with ServiceWorkers and
> making requests for MediaWiki content super super fast so all our
> lovely users can access information quicker. Right now we're trying to
> make the mobile site ridiculously fast by using new shiny standard web
> technologies.
>
> The biggest issue we have on the mobile site right now is that we ship a
> lot of content - HTML and images - since we ship the same thing we do on
> desktop. On
> desktop it's not really a problem from a performance perspective but
> it may be an issue from a download perspective if you have some kind
> of data limit on your broadband and you are addicted to Wikipedia.
>
> The problem, however, is that on mobile the connection speeds are not
> quite up to desktop standards. To take an example, the Barack Obama
> article contains 102 image tags and 186KB of HTML resulting in about
> 1MB of HTML
Correction:
s/HTML/content

. If you're on your mobile phone just to look up his place
> of birth (which is in the lead section) or to see the County results
> of the 2004 U.S. Senate race in Illinois [1] that's a lot of
> unnecessary stuff you are forced to load. You have to load all the
> images and all the text! Ouch!
>
> Gilles D said a while back [2] "The Barack Obama article might be a
> bit of an extreme example due to its length, but in that case the API
> data needed for section 0's text + the list of sections is almost 30
> times smaller than the data needed for all sections's text (5.9kb
> gzipped versus 173.8kb gzipped)."
>
> Somewhat related, some experimenting with webpagetest.org has
> suggested that disabling images on this page has a serious impact on
> first paint (which we believe is due to too many simultaneous
> connections) [3,4]
>
> Given that ServiceWorker is here (in Chrome first [5] but hopefully
> others soon) I wrote a small proof of concept that lazy loads images
> to expose myself to this promising technology.
>
> For those interested I've documented my idea here:
> https://en.m.wikipedia.org/wiki/User:Jdlrobson/lazyloader
> but basically what it does is:
> 1) Intercepts network requests for HTML
> 2) Rewrites the src and srcset attributes to data-src and data-srcset
> attributes
> 3) Uses JavaScript to lazy-load images when they appear on the screen.
> 4) Without JS the ServiceWorker doesn't run, so the web remains unbroken.
> (But as Jake Archibald points out, there are downsides to this
> approach [6].)
>
> It doesn't quite work as a user script due to how scope works in
> service workers, but in production we can use a header [7] to allow
> use of scope: '/', so there's no real problem doing this in
> production; we will, however, have to ensure we can accurately
> measure the impact... [8]
>
> A more radical next step for ServiceWorkers would be to intercept
> network requests for HTML to use an API to serve just the lead section
> [9]. This won't help first ever loads from our users, but it might be
> enough to get going quickly.
>
> If we want to target that first page load we need to really rethink a
> lot of our parser architecture. Fun times.
>
> Would this be a good topic to bring up in January at the dev summit?
>
> [1] 
> https://en.m.wikipedia.org/wiki/Barack_Obama#/media/File:2004_Illinois_Senate_results.svg
> [2] https://phabricator.wikimedia.org/T97570
> [3] https://phabricator.wikimedia.org/T110615
> [4] https://phabricator.wikimedia.org/T105365#1477762
> [5] https://jakearchibald.com/2014/using-serviceworker-today/
> [6] https://twitter.com/jaffathecake/status/644168091216310273
> [7] 
> https://gerrit.wikimedia.org/r/#/c/219960/8/includes/resourceloader/ResourceLoader.php,unified
> [8] https://phabricator.wikimedia.org/T112588
> [9] https://phabricator.wikimedia.org/T100258

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Super-charging Mobile: An experiment with ServiceWorkers in MediaWiki

2015-09-16 Thread Brian Wolff
>
> The biggest issue we have on the mobile site right now is we ship a
> lot of content - HTML and images - since we ship those on desktop. On
> desktop it's not really a problem from a performance perspective but
> it may be an issue from a download perspective if you have some kind
> of data limit on your broadband and you are addicted to Wikipedia.

This is slightly off-topic, but I'd just like to remind everyone that
in some places, bandwidth can be a problem on desktop too. Many places
in the world don't have the network performance that we take for
granted.

[Anyways, nice to hear about exploring new technologies. I don't
really have anything interesting to say on the topic, but sounds like
good work :) ]

--
-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Video: updated support for VP9, Opus source media

2015-09-16 Thread Magnus Manske
Good to see video keeps getting some love in the WikiVerse!

On Wed, Sep 16, 2015 at 6:13 PM Brion Vibber  wrote:

> Thanks to Giuseppe in ops & a bunch of other folks who have helped out,
> we've got an updated server configuration for the 'video scalers' that
> generate media transcodes, supporting VP9 video and Opus audio.
>
> Initially we have one server updated and in production, and the rest should
> follow by tomorrow. (Updated servers are also in place on the beta
> cluster.)
>
> This should make it possible for WebM files using VP9 video and/or Opus
> audio to work as fully first-class citizens in TimedMediaHandler,
> generating medium- and low-resolution transcodes in Ogg and WebM VP8.
>
> It should also fix Ogg audio files using the Opus codec to work
> consistently.
>
>
> Support for generating VP9 transcodes as well is in the code and just needs
> to be switched on -- I'll tweak that configuration once we've got the rest
> of the servers upgraded and have more capacity for batch runs.
>
> (I've got a client-side bot which I'll start once that's ready, to force
> all the files to generate over time.)
>
> -- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Link: useful article on in-person events

2015-09-16 Thread Brian Wolff
On 9/16/15, Frances Hocutt  wrote:
> An excellent article on best practices and pitfalls for in-person technical
> events: https://modelviewculture.com/pieces/software-in-person
>
> This has some ideas and framings that can be useful for planning for the
> upcoming developer summit.
>
> -Frances

Definitely some interesting thoughts. It was a pleasant surprise to
read the by-line and see it was written by Sumana. Nice to read a
familiar name that I haven't seen mentioned in a while.

--
-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Super-charging Mobile: An experiment with ServiceWorkers in MediaWiki

2015-09-16 Thread Jon Robson
Sam asked me to write up my recent adventures with ServiceWorkers and
making requests for MediaWiki content super super fast so all our
lovely users can access information quicker. Right now we're trying to
make the mobile site ridiculously fast by using new shiny standard web
technologies.

The biggest issue we have on the mobile site right now is that we ship a
lot of content - HTML and images - since we ship the same thing we do on
desktop. On
desktop it's not really a problem from a performance perspective but
it may be an issue from a download perspective if you have some kind
of data limit on your broadband and you are addicted to Wikipedia.

The problem, however, is that on mobile the connection speeds are not
quite up to desktop standards. To take an example, the Barack Obama
article contains 102 image tags and 186KB of HTML resulting in about
1MB of HTML. If you're on your mobile phone just to look up his place
of birth (which is in the lead section) or to see the County results
of the 2004 U.S. Senate race in Illinois [1] that's a lot of
unnecessary stuff you are forced to load. You have to load all the
images and all the text! Ouch!

Gilles D said a while back [2] "The Barack Obama article might be a
bit of an extreme example due to its length, but in that case the API
data needed for section 0's text + the list of sections is almost 30
times smaller than the data needed for all sections's text (5.9kb
gzipped versus 173.8kb gzipped)."

Somewhat related, some experimenting with webpagetest.org has
suggested that disabling images on this page has a serious impact on
first paint (which we believe is due to too many simultaneous
connections) [3,4]

Given that ServiceWorker is here (in Chrome first [5] but hopefully
others soon) I wrote a small proof of concept that lazy loads images
to expose myself to this promising technology.

For those interested I've documented my idea here:
https://en.m.wikipedia.org/wiki/User:Jdlrobson/lazyloader
but basically what it does is (a rough sketch follows this list):
1) Intercepts network requests for HTML
2) Rewrites the src and srcset attributes to data-src and data-srcset attributes
3) Uses JavaScript to lazy-load images when they appear on the screen.
4) Without JS the ServiceWorker doesn't run, so the web remains unbroken.
(But as Jake Archibald points out, there are downsides to this
approach [6].)
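
For the curious, the fetch-intercepting half of it boils down to roughly the
following (an untested, simplified sketch of the idea; the actual user script
linked above differs):

// sw.ts - simplified, untested sketch of steps 1 and 2 above; compiled
// against the "webworker" lib so ServiceWorkerGlobalScope/FetchEvent exist.
const sw = self as unknown as ServiceWorkerGlobalScope;

sw.addEventListener('fetch', (event: FetchEvent) => {
  const accept = event.request.headers.get('Accept') || '';
  // 1) Only intercept requests for HTML pages.
  if (accept.indexOf('text/html') === -1) {
    return;
  }
  event.respondWith(
    fetch(event.request)
      .then((response) => response.text())
      .then((html) => {
        // 2) Rename src/srcset on img tags so the browser does not fetch
        //    images eagerly; in-page JavaScript (step 3) restores them as
        //    they scroll into view. Naive regexes, purely for illustration.
        const rewritten = html
          .replace(/(<img[^>]*?)\ssrc=/g, '$1 data-src=')
          .replace(/(<img[^>]*?)\ssrcset=/g, '$1 data-srcset=');
        return new Response(rewritten, {
          headers: { 'Content-Type': 'text/html; charset=UTF-8' },
        });
      })
  );
});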

It doesn't quite work as a user script due to how scope works in
service workers, but in production we can use a header [7] to allow
use of scope: '/', so there's no real problem doing this in
production; we will, however, have to ensure we can accurately
measure the impact... [8]

A more radical next step for ServiceWorkers would be to intercept
network requests for HTML to use an API to serve just the lead section
[9]. This won't help first ever loads from our users, but it might be
enough to get going quickly.

If we want to target that first page load we need to really rethink a
lot of our parser architecture. Fun times.

Would this be a good topic to bring up in January at the dev summit?

[1] 
https://en.m.wikipedia.org/wiki/Barack_Obama#/media/File:2004_Illinois_Senate_results.svg
[2] https://phabricator.wikimedia.org/T97570
[3] https://phabricator.wikimedia.org/T110615
[4] https://phabricator.wikimedia.org/T105365#1477762
[5] https://jakearchibald.com/2014/using-serviceworker-today/
[6] https://twitter.com/jaffathecake/status/644168091216310273
[7] 
https://gerrit.wikimedia.org/r/#/c/219960/8/includes/resourceloader/ResourceLoader.php,unified
[8] https://phabricator.wikimedia.org/T112588
[9] https://phabricator.wikimedia.org/T100258

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] LDAP extension ownership

2015-09-16 Thread Ryan Lane
I haven't been actively maintaining the LDAP extension for MediaWiki for
over two years. There's not really much that needs to change, but some
basic love and care is likely a good idea. The LDAP extension is one of the
most popular MediaWiki extensions, so it wouldn't be great if it was broken
for long periods of time.

If anyone would like to take this extension over, please feel free.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tools for dealing with citations of withdrawn academic journal articles

2015-09-16 Thread Pine W
Ryan, can we get an update about when
https://www.mediawiki.org/wiki/Community_Tech_team/Community_Wishlist_Survey
will be launched? Thanks!

Pine


On Tue, Aug 18, 2015 at 2:24 PM, Ryan Kaldari 
wrote:

> On Tue, Aug 18, 2015 at 2:06 PM, Pine W  wrote:
>
>> 1. I was thinking of a tool that would let users input a variety of ways
>> of referring to the retracted articles, such as DOI numbers (Peaceray is an
>> expert in these). The tool would accept multiple inputs simultaneously,
>> such as all 64 articles that were retracted in a batch. The tool would
>> return to the user a list of all articles in which those references are
>> used as citations, and highlight the paragraphs of the article where the
>> citations are used. This would, I hope, greatly improve the efficiency of
>> the workflow for dealing with retracted journal articles.
>>
>
> Sounds like a reasonable proposal, although I have to wonder if the time
> spent building and maintaining this tool would be more or less than the
> time it would save editors searching for retracted journal articles.
>
>
>> 2. I'm not clear on where I should list a new idea. The list of ideas in 
>> Community
>> Tech team/All Our Ideas/Process
>> 
>> is based on a survey that has already been completed. Is there a
>> Phabricator workboard that would be appropriate for listing a new idea such
>> as this?
>>
>
> Community Tech is currently only accepting new tasks related to the All
> Our Ideas survey results (
> https://www.mediawiki.org/wiki/Community_Tech_team/All_Our_Ideas). We
> will be opening up a new survey next month though (
> https://www.mediawiki.org/wiki/Community_Tech_team/Community_Wishlist_Survey/Process).
> In the meantime, you can post the idea at
> https://meta.wikimedia.org/wiki/Community_Tech_project_ideas to get more
> input on it. More details about all of this will be announced hopefully
> next week.
>
>
>> 3. I would prefer to have everyone using the same system, which is
>> lists.wikimedia.org. It makes sense to me that everyone might migrate
>> eventually to a newer system. I suggest avoiding fragmentation. Researching
>> the possibility of migrating all mailing lists to a newer system sounds
>> like a good project for Community Tech and I could propose that in
>> Phabricator as well if there's a good place to do so.
>>
>
> That's a pretty good point. I'll request to have the mailing list moved to
> lists.wikimedia.org.
>
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] 2015-09-16 Scrum of Scrums meeting notes

2015-09-16 Thread Grace Gellerman
https://www.mediawiki.org/wiki/Scrum_of_scrums/2015-09-16
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Link: useful article on in-person events

2015-09-16 Thread Frances Hocutt
An excellent article on best practices and pitfalls for in-person technical
events: https://modelviewculture.com/pieces/software-in-person

This has some ideas and framings that can be useful for planning for the
upcoming developer summit.

-Frances
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] HTMLForm and default values

2015-09-16 Thread Ricordisamoa
For https://gerrit.wikimedia.org/r/233423 I need 'default' values of 
HTMLForm fields to overwrite values set via query parameters, when the 
latter are set to empty strings. Do you know a clean way to do it?

Thanks in advance.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Video: updated support for VP9, Opus source media

2015-09-16 Thread Brion Vibber
Thanks to Giuseppe in ops & a bunch of other folks who have helped out,
we've got an updated server configuration for the 'video scalers' that
generate media transcodes, supporting VP9 video and Opus audio.

Initially we have one server updated and in production, and the rest should
follow by tomorrow. (Updated servers are also in place on the beta cluster.)

This should make it possible for WebM files using VP9 video and/or Opus
audio to work as fully first-class citizens in TimedMediaHandler,
generating medium- and low-resolution transcodes in Ogg and WebM VP8.

It should also fix Ogg audio files using the Opus codec to work
consistently.


Support for generating VP9 transcodes as well is in the code and just needs
to be switched on -- I'll tweak that configuration once we've got the rest
of the servers upgraded and have more capacity for batch runs.

(I've got a client-side bot which I'll start once that's ready, to force
all the files to generate over time.)

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] mediawiki maintenance scripts have moved

2015-09-16 Thread MZMcBride
Florian Schmidt wrote:
>thanks for this e-mail. Maybe I'm the only one, but the e-mail title is,
>in my opinion, a bit misleading :P As I first read the title, I thought
>that the mediawiki/core maintenance scripts (which are in the
>maintenance/ directory in MediaWiki) themselves were moved, but after reading
>the e-mail text I now know that you mean the WMF setup of some MediaWiki
>maintenance scripts. Maybe you can put "puppet" or something else into
>the title of the e-mail next time, to make the scope of the e-mail clear?
>:P

You were not the only person who got confused by the subject line. :-)

Regarding the MediaWiki core maintenance directory generally, there's
 tracking work to improve it.

MZMcBride



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Collapsible content

2015-09-16 Thread Legault, Phillip [ITSUS]
MediaWiki 1.23.8



I'm using collapsible content for a User Guide page, and there are a lot of 
sections. So I set up each section with one heading, collapsed by default.

I added the following simple CSS, which works fine.

:target {
    color: red;
}



I'm trying to do something with jQuery and the :target pseudo-class to expand the
selected section, so that if I provide a link it will expand that section as well.



I have been searching and still haven’t found anything that works.

Any help would be appreciated

Thanks!



Phil






___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] [x-post] Reminder: Language Engineering Office Hour is on September 16, 2015 (today) at 1500 UTC

2015-09-16 Thread Runa Bhattacharjee
Hello,

A reminder that the online office hour hosted by the Wikimedia Language
Engineering team is scheduled for later today at 1500 UTC. You can join the
hangout or watch the session from:
https://plus.google.com/u/0/events/c1c0msurhua7enopsu3q8l42j3s

Please note that, due to the limitations of Google Hangouts, only a few seats
are available, so do let us know beforehand if you would like to participate
in the hangout. We will also be on the IRC channel #wikimedia-office to
take questions. Please see below for the event details, local time and
original announcement.

Thanks
Runa

== Details ==

# Event: Wikimedia Language Engineering office hour session

# When: September 16th, 2015 (Wednesday) at 15:00 UTC (check local time
http://www.timeanddate.com/worldclock/fixedtime.html?iso=20150916T1500)

# Where: https://plus.google.com/u/0/events/c1c0msurhua7enopsu3q8l42j3s and
on IRC #wikimedia-office (Freenode)

# Agenda: Content Translation updates and Q & A


-- Forwarded message --
From: Runa Bhattacharjee 
Date: Tue, Sep 8, 2015 at 5:21 PM
Subject: Language Engineering office hour and online meeting on 16
September 2015 (Wednesday) at 1500 UTC
To: Wikimedia developers , MediaWiki
internationalisation , "Wikimedia &
GLAM collaboration [Public]" , Wikimedia Mailing
List 


[x-posted announcement]

Hello,

The next office hour of the Wikimedia Language Engineering team is
scheduled for next Wednesday, September 16th at 15:00 UTC. Like our last
office hour, we are again hosting it as an online discussion over
Hangouts/YouTube with a simultaneous IRC conversation. Due to the limitations
of Google Hangouts, only a limited number of slots are available. Hence, do
let us know (on the event page
) if you
would like to participate on the Hangout. The IRC channel #wikimedia-office
and the Q&A channel for the youtube broadcast will be open for interactions
during the session.

Our last online round-table session was held in June 2015. You can watch
the recording here: https://www.youtube.com/watch?v=vbXyHmpZJGE

Please read below for the event details and do let us know if you have any
questions.

Thank you
Runa

== Details ==

# Event: Wikimedia Language Engineering office hour session

# When: September 16th, 2015 (Wednesday) at 15:00 UTC (check local time
http://www.timeanddate.com/worldclock/fixedtime.html?iso=20150916T1500)

# Where: https://plus.google.com/u/0/events/c1c0msurhua7enopsu3q8l42j3s and
on IRC #wikimedia-office (Freenode)

# Agenda: Content Translation updates and Q & A


-- 
Language Engineering Manager
Outreach and QA Coordinator
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] thumb generation

2015-09-16 Thread Federico Leva (Nemo)
Have you looked into what mwoffliner does? 
https://sourceforge.net/p/kiwix/other/ci/master/tree/mwoffliner/mwoffliner.js

Maybe you can even just extract the images from the ZIM files.

Nemo

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Understanding weird behaviour of diffto in API

2015-09-16 Thread Petr Bena
yay, that fixed the problem, thanks :o
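
For the archives, the request that covers all three edits should look roughly
like this (untested sketch, going by Brad's explanation below):

// Start from the revision *before* the first edit (679358832) and diff it
// to the last one (681167909), so the diff spans all three edits.
const params = new URLSearchParams({
  action: 'query',
  prop: 'revisions',
  rvprop: 'ids|tags|user|timestamp|comment',
  rvlimit: '1',
  rvstartid: '679358832',
  rvdiffto: '681167909',
  titles: 'Battle Tendency',
  rawcontinue: '1',
  format: 'json',
});

fetch('https://en.wikipedia.org/w/api.php?' + params.toString())
  .then((res) => res.json())
  .then((data) => console.log(JSON.stringify(data.query.pages, null, 2)));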

On Tue, Sep 15, 2015 at 6:24 PM, Brad Jorsch (Anomie)
 wrote:
> On Tue, Sep 15, 2015 at 11:53 AM, Petr Bena  wrote:
>
>> Just to make clear what my problem is, here is example. There is a
>> user 75.69.113.86 who made 3 consecutive edits to page "Battle
>> Tendency". These edits can be displayed using these 3 API's:
>>
>> *
>> https://en.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=ids%7Ctags%7Cuser%7Ctimestamp%7Ccomment&rvlimit=1&rvstartid=681167574&rvdiffto=prev&titles=Battle%20Tendency&rawcontinue=1&format=xml
>
>
> This diff is from 679358832 to 681167574.
>
>
>> *
>> https://en.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=ids%7Ctags%7Cuser%7Ctimestamp%7Ccomment&rvlimit=1&rvstartid=681167835&rvdiffto=prev&titles=Battle%20Tendency&rawcontinue=1&format=xml
>
>
> This is from 681167574 to 681167835.
>
>
>> *
>> https://en.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=ids%7Ctags%7Cuser%7Ctimestamp%7Ccomment&rvlimit=1&rvstartid=681167909&rvdiffto=prev&titles=Battle%20Tendency&rawcontinue=1&format=xml
>
>
> And this is from 681167835 to 681167909.
>
>
>> The combined diff of all three edits I am trying to get has link
>>
>> https://en.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=ids%7Ctags%7Cuser%7Ctimestamp%7Ccomment&rvlimit=1&rvstartid=681167574&rvdiffto=681167909&titles=Battle%20Tendency&rawcontinue=1&format=xml
>>
>> but as you can see, it's not actually containing all these changes.
>> Why is that?
>
>
> Because you're asking for the diff from 681167574 to 681167909 (which
> covers only the last two of your three links), not the diff from
> *679358832* to 681167909.
>
>
> --
> Brad Jorsch (Anomie)
> Senior Software Engineer
> Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l