[Wikitech-l] [Site issue] s3 wikis read-only until replication catches up

2012-12-11 Thread Erik Moeller
Wikimedia wikis hosted on the s3 cluster (pretty much all but the
very large wikis, click on the s3 box in
https://noc.wikimedia.org/dbtree/ to get a full list) are currently in
read-only mode due to severe replication lag. This problem was
apparently caused by a logic issue in the job queue, which should now
be fixed, but it will still take at least 2-3 hours for replication to
catch up. We apologize for the inconvenience, and are continuing to
monitor the situation.

Erik
-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation

Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] gerrit support question: how to show raw file content after the commit in the browser, not as zip download

2012-12-11 Thread Thomas Gries
While tracking an issue I came to
https://gerrit.wikimedia.org/r/#/c/7986/ (example case)
In the list of files I clicked on
https://gerrit.wikimedia.org/r/#/c/7986/14/includes/UserMailer.php

Now I am desperately seeking a link to show the raw file content
after the commit in the *browser*, but I only found a link (Download)
which starts a zip download. This is not what I wanted.

Is there a solution which I have overlooked?
Tom
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] gerrit support question: how to show raw file content after the commit in the browser, not as zip download

2012-12-11 Thread Antoine Musso
On 11/12/12 10:42, Thomas Gries wrote:
 While tracking an issue I came to
 https://gerrit.wikimedia.org/r/#/c/7986/ (example case)
 In the list of files I clicked on
 https://gerrit.wikimedia.org/r/#/c/7986/14/includes/UserMailer.php
 
 Now I am desperately seeking a link to show the raw file content
 after the commit in the *browser*, but I only found a link (Download)
 which starts a zip download. This is not what I wanted.
 
 Is there a solution which I have overlooked?

You can get the patch locally using:
 - the checkout / patch links under each patchset
 - git-review, if you want to look at the last patchset

Another way is to use Gitweb.  When looking at the Gerrit change at
https://gerrit.wikimedia.org/r/#/c/7986/ , you will find a (gitweb) link
next to each patchset entry.  That points to a page showing the commit
metadata and the list of changed files, each with four links: diff, blob,
blame and history. Blob is what you are looking for. :)

-- 
Antoine "hashar" Musso


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Site issue] s3 wikis read-only until replication catches up

2012-12-11 Thread Ariel T. Glenn
On Tue, 11-12-2012 at 01:10 -0800, Erik Moeller wrote:
 Wikimedia wikis hosted on the s3 cluster (pretty much all but the
 very large wikis, click on the s3 box in
 https://noc.wikimedia.org/dbtree/ to get a full list) are currently in
 read-only mode due to severe replication lag. This problem was
 apparently caused by a logic issue in the job queue, which should now
 be fixed, but it will still take at least 2-3 hours for replication to
 catch up. We apologize for the inconvenience, and are continuing to
 monitor the situation.
 
 Erik

I forgot to follow up here as well: editing was restored at around 11:00
UTC, and all databases are back to normal.  The job runners have been in
operation this entire time and no longer seem to cause any db issues.

Ariel


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] MediaWiki Groups are official: start your own!

2012-12-11 Thread Quim Gil
MediaWiki Groups are now official - and recognized by the Wikimedia 
Affiliations Committee:


https://www.mediawiki.org/wiki/Groups/

Who wants to start one?

I just created

https://www.mediawiki.org/wiki/Groups/Proposals/San_Francisco

as a real test of the process and a real example of a local group. Other 
local groups are welcome. If you are in San Francisco and want to be part 
of this group, add yourself to that page. Any ideas are welcome!


Looking at https://www.mediawiki.org/wiki/Events , local groups in these 
locations would be especially welcome:


* Amsterdam
* Bangalore
* Berlin
* Brussels
* Buenos Aires
* Chennai
* Cologne
* Hong Kong
* Los Angeles
* Paris
* Pune
* San Diego
* Tel Aviv
* Washington DC
... and wherever you are sitting now.  :)


The gates for topical groups are also open. I will start pushing one 
about Testing / QA, also to test the process with a real example. More 
proposals?


Thank you to all the people who provided feedback about the MediaWiki 
Groups proposal, and especially to


* Federico Leva (alias Nemo), who was especially helpful with his close 
marking and attention to detail - 
https://meta.wikimedia.org/wiki/User:Nemo_bis
* Bence Damokos, who was a perfect Affiliations Committee chair and helped 
fine-tune the proposal to make it fit with the Wikimedia User Groups - 
https://meta.wikimedia.org/wiki/User:Bdamokos


--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] gerrit support question: how to show raw file content after the commit in the browser, not as zip download

2012-12-11 Thread Niklas Laxström
On 11 December 2012 11:42, Thomas Gries m...@tgries.de wrote:
 While tracking an issue I came to
 https://gerrit.wikimedia.org/r/#/c/7986/ (example case)
 In the list of files I clicked on
 https://gerrit.wikimedia.org/r/#/c/7986/14/includes/UserMailer.php

 Now I am desperately seeking a link to show the raw file content
 after the commit in the *browser*, but I only found a link (Download)
 which starts a zip download. This is not what I wanted.

 Is there a solution which I have overlooked?

If you just want to view the content, open the diff view, click Preferences,
choose Whole File as the context at the bottom left, and click Update. It
is no good for copy-pasting, though.

  -Niklas


--
Niklas Laxström

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bugzilla upgrade [was: bugzilla.wikimedia.org downtime: Now.]

2012-12-11 Thread Krinkle
On Dec 10, 2012, at 2:09 PM, Andre Klapper aklap...@wikimedia.org wrote:

 On Sat, 2012-12-08 at 05:50 +1000, K. Peachey wrote:
 Can you propose a better option for this than defaulting the cc'er as
 wikibugs?
 
 I doubt there are that many people who want the wikibugs-l list
 to receive a copy of every single change (e.g. CC changes) to a bug,
 but that is probably a discussion for another thread.
 
 There are two people watching wikibugs-l@: Quim Gil and I.
 So I'll likely set up an account like all-bugm...@wikimedia.bugs and
 set it as globalwatcher tomorrow. Then Quim and I can subscribe to it
 to drown in bugmail, and you can use wikitech-l@ as usual.
 

I am also subscribed to wikibugs-l, and I believe Roan is as well (or was 
anyway).

I agree with Chad: it should continue to work as expected. From a globalwatcher 
point of view, whatever change is made to a bug must not affect the fact 
that the globalwatcher address (wikibugs-l in this case) is notified. The 
default CC status of Nobody wikibug...@lists.wikimedia.org is, afaik, just 
there for legacy reasons and as a placeholder. The wikibugs-l Bugzilla account 
isn't actually used, afaik. In the Bugzilla configuration, wikibugs-l is 
configured as the globalwatcher.

The proposal Andre makes here sounds confusing. How is it different from the 
current situation? What problem is it supposed to address?

-- Krinkle


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] gerrit support question: how to show raw file content after the commit in the browser, not as zip download

2012-12-11 Thread Ryan Kaldari
This is actually my biggest annoyance with Gerrit: that I can't view raw 
code from the change view. I can't fathom why they have a zip download 
link but not a view link, which would let me copy code without copying all 
the line numbers.


Ryan Kaldari

On 12/11/12 9:25 AM, Niklas Laxström wrote:

On 11 December 2012 11:42, Thomas Gries m...@tgries.de wrote:

While tracking an issue I came to
https://gerrit.wikimedia.org/r/#/c/7986/ (example case)
In the list of files I clicked on
https://gerrit.wikimedia.org/r/#/c/7986/14/includes/UserMailer.php

Now I am desperately seeking a link to show the raw file content
after the commit in the *browser*, but I only found a link (Download)
which starts a zip download. This is not what I wanted.

Is there a solution which I have overlooked?

If you just want to view the content, in diff view click Preferences,
on the left bottom choose Whole File as context and click Update. It
is no good for copy-pasting though.

   -Niklas


--
Niklas Laxström

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Recent test failures for many extensions

2012-12-11 Thread Niklas Laxström
Jenkins has been failing all commits for certain extensions today. I
assume it is somehow related to the accidental addition of submodules
to the core master branch. I've been told that some of the tests that
Jenkins runs have been disabled.

I'm assuming someone else will post a follow-up to explain all the
"some"s above and make sure that all tests are re-enabled and passing
for all affected extensions.

  -Niklas

--
Niklas Laxström

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Jenkins and extension parser tests

2012-12-11 Thread Merlijn van Deen
tl;dr below, frustration ahead.

I've been trying to get Jenkins to run the parsertests of
LabeledSectionTransclusion, but this has been an incredibly frustrating
experience. It shouldn't be too hard, right? Well, read on...

I thought I'd start with the existing 'testextensions' Jenkins job [1],
which is failing. The reason it is failing is this:
12:02:56 + php tests/phpunit/phpunit.php --log-junit junit-phpunit-ext.xml -- extensions/LabeledSectionTransclusion
12:02:56 Unexpected non-MediaWiki exception encountered, of type PHPUnit_Framework_Exception
12:02:56 exception 'PHPUnit_Framework_Exception' with message 'Neither extensions/LabeledSectionTransclusion.php nor extensions/LabeledSectionTransclusion.php could be opened.' in /usr/share/php/PHPUnit/Util/Skeleton/Test.php:100


After installing PHPUnit (in itself quite a frustrating experience), I
get this response instead:


valhallasw@lisilwen:~/src/core$ php tests/phpunit/phpunit.php
extensions/LabeledSectionTransclusion
PHPUnit 3.7.10 by Sebastian Bergmann.

Configuration read from /home/valhallasw/src/core/tests/phpunit/suite.xml



Time: 0 seconds, Memory: 18.50Mb

No tests executed!

This suggests it might be a problem with the Jenkins configuration.
Of course, the tests are also not running, but that is a second issue.


So, where is the Jenkins configuration? Not in Jenkins itself. After
logging in to Jenkins, there /is/ suddenly a new 'Workspace' option
(why is that disabled for logged-out users?), which shows that the
extension is checked out in the workspace - so that doesn't explain
why PHPUnit is unable to locate the tests.


So let's check the configuration. [2] suggests it should be at [3].
Ugh, gitweb, and no github mirror. Under 'jobs', there are some
extensions, but not LST. There is MediaWiki-Tests-Extensions, which
has some XML files that, at least for me, are incomprehensible. And
are they even relevant to this? In any case, they don't tell me why
PHPUnit is unable to find the LST directory, even though it's right
there [4]. I give up.


By the way - I just realized why the job is not in the Jenkins repository:

valhallasw@lisilwen:~/src/jenkins$ grep -R * -e 'mwext'

jobs/.gitignore:/mwext-*


*Really?*


Okay, so then how do I get PHPUnit to run the parser tests? There is
the NewParserTest class, so I guessed that adding this to
LabeledSectionTransclusion.php would work:

class LSTParserTest extends NewParserTest {
    function __construct() {
        parent::__construct();
        $this->setParserTestFile( __DIR__ . '/lstParserTests.txt' );
    }
}



Well, that does, er, absolutely nothing. [5] suggests I should move it
to a separate file and add the following:


$wgHooks['UnitTestsList'][] = 'efFruitsRegisterUnitTests';

function efFruitsRegisterUnitTests( $files ) {
    $testDir = dirname( __FILE__ ) . '/';
    $files[] = $testDir . 'AppleTest.php';
    return true;
}


What it *doesn't* tell you is that this is actually irrelevant when
you run php tests/phpunit/phpunit.php
extensions/LabeledSectionTransclusion - and it's not really clear to
me when it *is* relevant. After some struggling, it turns out tests are
detected because the filename *ends in Test.php* (which is documented,
well, nowhere). After moving the test case to such a file, it still
didn't quite work, but after some more fiddling, something is working!
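
For reference, here is a minimal sketch of a file that this naming rule would
pick up (the class name, the path and the trivial assertion are made up purely
for illustration; the only hard requirement discussed above is that the file
name ends in Test.php):

// e.g. extensions/LabeledSectionTransclusion/tests/LSTExampleTest.php (hypothetical path)
class LSTExampleTest extends MediaWikiTestCase {
    // Deliberately trivial: if this shows up in the phpunit output,
    // the "file name ends in Test.php" discovery rule worked.
    public function testSuiteDiscoversThisFile() {
        $this->assertTrue( true );
    }
}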


Okay, so now I have a test case that actually runs when I call the
above command. So, let's push it for review [6] - but that doesn't
actually make Jenkins run the test cases, so I still don't know
whether this actually fixes it.


[1]
https://integration.mediawiki.org/ci/job/mwext-LabeledSectionTransclusion-testextensions/
[2] http://www.mediawiki.org/wiki/Continuous_integration/Jenkins
[3] https://gerrit.wikimedia.org/r/gitweb?p=integration/jenkins.git;a=tree
[4]
https://integration.mediawiki.org/ci/job/mwext-LabeledSectionTransclusion-testextensions/ws/extensions/
[5]
http://www.mediawiki.org/wiki/Manual:PHP_unit_testing/Writing_unit_tests_for_extensions


So, tl;dr: debugging Jenkins failures and getting PHPUnit to work is a
frustrating experience, because of a lack of documentation and a lack of
'obviousness' (there is no one 'obvious' way to do it). I've updated [5]
with my experiences, but I still have no idea how the magic that is Jenkins
is configured and how it works. It would be great if there were better
documentation on that.

Some specific things that I think should be fixed:
  - Jenkins should not show new (read-only) options only to logged-in users -
make them public instead
  - The Jenkins configuration should also be mirrored on GitHub
  - The mwext jobs should be in git (preferably, because then it's easy to
check them online), *or* there should be clear documentation on how to
generate those jobs...
  - There should be documentation on how the configuration is actually used
- and preferably on how to debug failures.

Thanks.

Merlijn

Re: [Wikitech-l] Bugzilla upgrade [was: bugzilla.wikimedia.org downtime: Now.]

2012-12-11 Thread Andre Klapper
On Tue, 2012-12-11 at 19:30 +0100, Krinkle wrote:
 The proposal Andre makes here sounds confusing, How is that different
 from the current situation? What problem is it supposed to address?

Re-reading what the hack is supposed to do, I realize that making another
Bugzilla user a globalwatcher won't change anything.

As I've mentioned before, I stopped receiving bugmail for a report after
wikibugs-l@ had been removed as its assignee, when wikibugs-l@ was not
listed as a default CC of the component either. It has worked since the
upgrade, and I cannot prove whether that was related to the hack, but I
guess I can quickly find out after reapplying it.

So, anybody with shell access who's passionate about getting this back:
please feel free to reapply the attached patch
  1) on the Bugzilla server, and
  2) drop it into Gerrit's
wikimedia/bugzilla/modifications/bugzilla-4.2/Bugzilla/BugMail.pm
so Gerrit and Bugzilla stay in sync (Bugzilla isn't hooked up with Puppet
or Gerrit, and you don't want to make things messier, do you?). ;)

andre


PS: Off-topic, just explaining how things work in Bugzilla:

On Tue, 2012-12-11 at 19:30 +0100, Krinkle wrote:
 wikibugs-l as Bugzilla account isn't actually used afaik.

It isn't, but that doesn't matter here. Other Bugzilla instances also
use such virtual accounts, even with non-working email addresses. They
are only created so you can add such accounts to your User Watching
list; Bugzilla will then make you receive the bugmail that such accounts
would theoretically receive. That's all. No email is physically sent
to such virtual accounts. That is contrary to wikibugs-l@, which seems to
be an existing and working email address.

-- 
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Welcome our six OPW interns!

2012-12-11 Thread Quim Gil
I’m glad to announce that Kim Schoonover, Mariya Miteva, Priyanka Nag, 
Sucheta Ghoshal, Teresa Cho and Valerie Juarez will join the MediaWiki 
community as full-time interns between January and March 2013. They have 
been selected as part of the FLOSS Outreach Program for Women.


Check the details at

http://blog.wikimedia.org/2012/12/11/welcome-to-floss-outreach-program-for-women-interns/

We wish our new interns a happy landing and the best of luck in their 
projects! You’ll be hearing more from them over the next few months.


--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Recent test failures for many extensions

2012-12-11 Thread Antoine Musso
On 11/12/12 20:27, Niklas Laxström wrote:
 Jenkins has been failing all commits for certain extensions today. I
 assume it is somehow related to the accidental addition of submodules
 to core master branch. I've been told that some of the tests that
 Jenkins runs have been disabled.
 
 I'm assuming someone else will post a followup to explain all the
 somes above and make sure that all tests are re-enabled and passing
 for all affected extensions.

I fixed that earlier today, around 9-10am CET.

The root cause was indeed the extension submodules that landed in the
master branch of mediawiki/core.

More details:

The extension jobs fetch a copy of the latest mediawiki/core and
then install the extension, with the submitted change, under /extensions/.

A very dumb loader is injected into LocalSettings.php; it lists the
subdirectories of extensions/ and attempts to require_once() a PHP file
named after each of them.  Since the submodules were not updated, those
extension directories were empty and the dumb loader could not find the
extensions' entry points.
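
Roughly, the idea is something like the following sketch (illustrative only,
not the actual injected code; the glob() approach and paths are assumptions
on my part, and the file_exists() guard is what the bug filed below asks for):

// Sketch of a naive extension loader appended to LocalSettings.php.
foreach ( glob( __DIR__ . '/extensions/*', GLOB_ONLYDIR ) as $extDir ) {
    $entryPoint = $extDir . '/' . basename( $extDir ) . '.php';
    // An empty submodule checkout has no entry point, hence bug 42960:
    // the loader should skip such directories instead of failing.
    if ( file_exists( $entryPoint ) ) {
        require_once $entryPoint;
    }
}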

As far as I remember, the extensions impacted were Translate, Echo and
VisualEditor. I have manually deleted the empty directories in each of
these jobs' workspaces.


Bugs logged:

Jenkins extension loader should ignore empty directories
  https://bugzilla.wikimedia.org/show_bug.cgi?id=42960

investigate whether to clean workspace for extension jobs
  https://bugzilla.wikimedia.org/show_bug.cgi?id=42961

Both low priority for now though.

-- 
Antoine "hashar" Musso


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bugzilla upgrade [was: bugzilla.wikimedia.org downtime: Now.]

2012-12-11 Thread K. Peachey
On Wednesday, December 12, 2012, Andre Klapper wrote:

 On Tue, 2012-12-11 at 19:30 +0100, Krinkle wrote:
  wikibugs-l as Bugzilla account isn't actually used afaik.

 …

That is contrary to wikibugs-l@ which seems to
 be an existing and working email address.


Yes, it's a mailing list that anyone can subscribe to, and last I heard it
had quite a few subscribers.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Video on mobile: Firefox works, way is paved for more browser support

2012-12-11 Thread Brion Vibber
Since the switch from OggHandler to TimedMediaHandler we are one step
closer to supporting video on mobile browsers.

In fact, there's one it works in now -- Firefox for Android!

We've been able to close out this Firefox evangelism bug about our broken
mobile video:
https://bugzilla.mozilla.org/show_bug.cgi?id=728486

However, none of the other mobile browsers I've tested supports the Ogg
Theora or WebM formats. Mobile Safari, Chrome, the old stock Android browser,
Opera Mobile, and the IE 10 engine in our Windows 8 tablet app will show
the thumbnail, but won't play the video because they need MP4/H.264.


Looking at the bug for adding transcoding...
https://bugzilla.wikimedia.org/show_bug.cgi?id=39869
https://gerrit.wikimedia.org/r/#/c/25473/
...it looks like we may have support ready to go but disabled by default,
so not yet in use.


Just thought I'd check in on what it'll take to get it going. No immediate
rush, but I'd really love to have videos working on smartphones and
tablets, and not everybody runs Firefox. :)

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Jenkins and extension parser tests

2012-12-11 Thread Platonides
If you only want to run parser tests from a different file, there's no
need to create a new class.

You simply add in the main file:
 $wgParserTestFiles[] = dirname( __FILE__ ) . '/lstParserTests.txt';
(already in lst.php)

It will be automagically picked up when running the phpunit tests (provided
you have the extension enabled in LocalSettings.php, which is a
precondition for them to work anyway).

You can also run them with tests/parserTests.php

With phpunit:
$ make destructive

Tests: 5125, Assertions: 927178, Failures: 2, Incomplete: 3, Skipped: 5.
Tests: 5125, Assertions: 880313, Failures: 2, Incomplete: 3, Skipped: 5.

Add lst to LocalSettings.

$ make destructive
Tests: 5157, Assertions: 883148, Failures: 2, Incomplete: 3, Skipped: 5.
Tests: 5157, Assertions: 942272, Failures: 2, Incomplete: 3, Skipped: 5.
Tests: 5157, Assertions: 913264, Failures: 2, Incomplete: 3, Skipped: 5.

You can also run make parser if you prefer to run fewer tests.
(No, I don't know why the number of assertions randomly changes
between runs with the same config...)


And fyi, they do pass.


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Video on mobile: Firefox works, way is paved for more browser support

2012-12-11 Thread Erik Moeller
On Tue, Dec 11, 2012 at 3:02 PM, Brion Vibber bvib...@wikimedia.org wrote:
 Just thought I'd check in on what it'll take to get it going. No immediate
 rush, but I'd really love to have videos working on smartphones and
 tablets, and not everybody runs Firefox. :)

As a recap, this is about expanding video support to include h.264,
which is patent-encumbered (licensing fees are charged for some uses
of the format). Cf.
https://meta.wikimedia.org/wiki/Mobile_video_codec_policy

Since there are multiple potential paths for changing the policy
(keeping things ideologically pure, allowing conversion on ingestion,
allowing h.264 but only for mobile, allowing h.264 for all devices,
etc.), and since these issues are pretty contentious, it seems like a
good candidate for an RFC which'll help determine if there's an
obvious consensus path forward.

Any takers for advancing the community conversation?

Erik
-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation

Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Video on mobile: Firefox works, way is paved for more browser support

2012-12-11 Thread David Gerard
On 11 December 2012 23:15, Erik Moeller e...@wikimedia.org wrote:

 Since there are multiple potential paths for changing the policy
 (keeping things ideologically pure, allowing conversion on ingestion,
 allowing h.264 but only for mobile, allowing h.264 for all devices,
 etc.), and since these issues are pretty contentious, it seems like a
 good candidate for an RFC which'll help determine if there's an
 obvious consensus path forward.
 Any takers for advancing the community conversation?


It's definitely in the class of things that would require strong
community buy-in, yes.


- d.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Welcome our six OPW interns!

2012-12-11 Thread bawolff
On Tue, Dec 11, 2012 at 4:04 PM, Quim Gil q...@wikimedia.org wrote:
 I’m glad to announce that Kim Schoonover, Mariya Miteva, Priyanka Nag,
 Sucheta Ghoshal, Teresa Cho and Valerie Juarez will join the MediaWiki
 community as full-time interns between January and March 2013. They have
 been selected as part of the FLOSS Outreach Program for Women.

 Check the details at

 http://blog.wikimedia.org/2012/12/11/welcome-to-floss-outreach-program-for-women-interns/

 We wish a happy landing to our new interns and the best luck in their
 projects! You’ll be hearing more from them over the next few months.

 --
 Quim Gil
 Technical Contributor Coordinator @ Wikimedia Foundation
 http://www.mediawiki.org/wiki/User:Qgil

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Good luck and congrats to everyone accepted.

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Mobile apps: time to go native?

2012-12-11 Thread Brion Vibber
Over on the mobile team we've been chatting for a while about the various
trade-offs in native vs HTML-based (PhoneGap/Cordova) development.

Currently our main Wikipedia apps are all HTML-based:
* Android - HTML + Cordova + plugins
* iOS - HTML + Cordova + plugins
* BlackBerry PlayBook (and later BB 10) - HTML + WebWorks
* Firefox OS - HTML
* Windows 8 - HTML + WinRT (mostly separate UI code)

We also used Cordova and HTML for the Wiki Loves Monuments app. While it
worked and was good for quick prototyping, we've run into a number of
problems doing HTML-based development in both apps:

* performance -- especially when combining large scrolling text areas with
static controls
* compatibility -- making something that worked on Android 2.x and 4+, and
on iOS 4.x and 5+, is sometimes fun. Since we're stuck running in the
native web view control on each platform, we're stuck with old bugs and
sometimes have to make huge workarounds.
* platform limitations -- for WLM we had to modify the Cordova platform so
we could do a progress bar on upload. Ouch!
* platform integration -- to use system sharing features and other things
well we often need plugins for Cordova, which means either relying on
third-party code (which may not be as well maintained as Cordova core) or
writing our own (and keeping them up to date as Cordova evolves). Either
way we have to fiddle with native code to handle things like saved pages
already, so we're not playing pure HTML+JS.
* crash reports are often useless, not giving us JavaScript stack traces

For instance we recently ran into a problem where HTTPS stopped working in
release builds on Android 2.2... we got no backtraces from the system, just
knew that *something* somewhere in the WebView's infrastructure stopped
working. It was not fun to track down. If we'd been doing the page
loads in native Java, we probably would have gotten stack traces
automatically.


We've started a couple experimental projects as fully-native Android apps:
a Signpost app that uses native notifications to alert you to new issues,
and a Commons uploader which lets you take photos or videos and upload them
direct to Commons.

Going native on these means:
* good access to notifications, multimedia, etc
* faster startup times
* native use of Android accounts management system when doing editing
features
* more native look & feel
* no damn tricks for scrolling areas :P
* crash reports are more likely to point at something relevant

So we've been thinking about splitting up the HTML-based Wikipedia app as
well. There'll still be a big WebView in the middle of the app -- it shows
web content after all! -- but most of the UI chrome, retrieval of data, and
all the system integration will go into the native-side code.

Essentially we may reorg to:
* HTML core for the WebView -- handle section collapsing, references, table
collapse/expand, etc. Probably share much of this code with MobileFrontend.
* native iOS UI (Obj-C)
* native Android UI (Java)
* separate HTML UI for Firefox OS, BlackBerry PlayBook & BB 10
* separate HTML UI for Windows 8/RT

Firefox OS, BlackBerry 10, and Windows 8 all give us fairly good system
integration within the HTML+JS environment so we probably won't have to do
truly 'native' code on these, which is good since we'll have less time to
devote to them.

iOS and Android remain our top-tier mobile platforms, and we know we can do
better on them than we do so far...


Any thoughts? Wildly in favor or against?

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] MediaWiki Groups are official: start your own!

2012-12-11 Thread bawolff
On Tue, Dec 11, 2012 at 1:14 PM, Quim Gil q...@wikimedia.org wrote:
 MediaWiki Groups are now official - and recognized by the Wikimedia
 Affiliations Committee:

 https://www.mediawiki.org/wiki/Groups/

 Who wants to start one?

 I just created

 https://www.mediawiki.org/wiki/Groups/Proposals/San_Francisco


I'm actually quite curious to see if there are actually enough MW devs
in a single city (other than WMF's home town) to form a group.

To be honest though, I kind of feel that if such groups were going
to form, they probably would have already. Formality rarely makes
people come together who wouldn't by themselves.

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Recent test failures for many extensions

2012-12-11 Thread Platonides
I don't think they are really a problem. core was broken (it shouldn't
contain submodules), thus all tests failed.
As c37975 was an automerge, jenkins didn't have the chance to stop it.

It would be interesting if jenkins would -2 whenever a change is sent to
master branch with a parent in a different branch (wmf/1.21wmf6 in this
case).


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Mobile apps: time to go native?

2012-12-11 Thread MZMcBride
Brion Vibber wrote:
 Over on the mobile team we've been chatting for a while about the various
 trade-offs in native vs HTML-based (PhoneGap/Cordova) development.
 
 [...]
 
 iOS and Android remain our top-tier mobile platforms, and we know we can do
 better on them than we do so far...
 
 Any thoughts? Wildly in favor or against?

It's unclear from your e-mail what the goal of mobile interaction (for lack
of a better term) is. Are you trying to build editing features? File upload
features? Or do you want to just have a decent reader? This seems like a key
component to any discussion of the future of mobile development.

Looking at the big picture, I don't think we'll ever see widespread editing
from mobile devices. The user experience is simply too awful. The best I
think most people are hoping for is the ability to easily fix a typo, maybe,
but even then you have to assess costs vs. benefit. That is, is it really
worth paying two or three full-time employees so that someone can easily
change Barrack to Barack from his or her iPhone? Probably not.

Perhaps mobile uploading could use better native support, but again, is the
cost worth it? Does Commons need more low-quality photos? And even as phone
cameras get better, do those photos need to be _instantly_ uploaded to the
site? There's something to be said for waiting until you get home to upload
photos, especially given how cumbersome the photo upload process is
(copyright, permissions, categorization, etc.). And this all side-steps the
question of whether there are better organizations equipped at handling
photos (such as Flickr or whatever).

That leaves reading. If a relatively recent browser can't read Wikimedia
wikis without performance issues, I think that indicates a problem with
Wikimedia wikis (way too much JavaScript, images are too large, etc.).
Mobile browsers are fairly robust (vastly more robust compared to what they
used to be), so I'm not sure why having a Wikipedia reader is valuable or
why it's worth investing finite resources in. Occasionally I'll hear "but I
want to have a favorite pages feature" or "I want to support offline reading",
but the phone's OS should be able to handle most of this from the built-in
Web browser. Or not, but I think if a phone is incapable of a feature such
as "e-mail this Web page (article) to a friend", it's not an important
enough feature to devote resources to.

Wikimedia wikis have a lot of bugs and the Wikimedia Foundation has finite
resources. I personally only use the Messages and Music apps on my
phone, so I'm not the best person to make the argument for additional mobile
development, but when I look at how terrible the user experience continues
to be for desktop users, it becomes difficult for me to understand why the
Wikimedia Foundation would delve into the world of mobile (apart from
initiatives such as Wikipedia Zero). What benefit to the creation or
dissemination of free educational content are we seeing (or hoping to see)
from native apps?

MZMcBride



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] MediaWiki Groups are official: start your own!

2012-12-11 Thread Platonides
On 12/12/12 00:44, bawolff wrote:
 I'm actually quite curious to see if there are actually enough MW devs
 in a single city (other than WMF's home town) to form a group.
 
 To be honest though, I kind of feel that if such groups were going
 to form, they probably would have already. Formality rarely makes
 people come together that wouldn't by themselves.
 
 -bawolff

It's possible that some people will make a group because they are new and
it'd be cool. So we would have one more group listed. Would it last or be
useful? Who knows.



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] MediaWiki Groups are official: start your own!

2012-12-11 Thread Steven Walling
On Tue, Dec 11, 2012 at 3:44 PM, bawolff bawolff...@gmail.com wrote:

 To be honest though, I kind of feel that if such groups were going
 to form, they probably would have already. Formality rarely makes
 people come together that wouldn't by themselves.


Obviously robust MediaWiki Groups won't materialize overnight, but
providing a good structure and encouraging people to participate is
certainly not unwelcome or impossible to pull off. You'd think contributors
to the software platform behind Wikipedia would be less skeptical about the
power of a few committed individuals to grow a community, but I guess
people don't change. ;-)

Steven
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Mobile apps: time to go native?

2012-12-11 Thread David Gerard
On 12 December 2012 00:04, MZMcBride z...@mzmcbride.com wrote:

 Looking at the big picture, I don't think we'll ever see widespread editing
 from mobile devices. The user experience is simply too awful. The best I
 think most people are hoping for is the ability to easily fix a typo, maybe,
 but even then you have to assess costs vs. benefit. That is, is it really
 worth paying two or three full-time employees so that someone can easily
 change Barrack to Barack from his or her iPhone? Probably not.


OTOH, see recent coverage of Wikipedia in Africa, where it's basically
going to be on phones. Cheap shitty smartphones. That the kids are
*desperate* to get Wikipedia on. Do we want to make those readers into
editors? It'd be nice.


 Perhaps mobile uploading could use better native support, but again, is the
 cost worth it? Does Commons need more low-quality photos? And even as phone
 cameras get better, do those photos need to be _instantly_ uploaded to the
 site? There's something to be said for waiting until you get home to upload
 photos, especially given how cumbersome the photo upload process is
 (copyright, permissions, categorization, etc.). And this all side-steps the
 question of whether there are better organizations equipped at handling
 photos (such as Flickr or whatever).


This is a version of the general argument against participation. There
are reasons it's not favoured.


- d.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] mariadb 5.5 in production for english wikipedia

2012-12-11 Thread Asher Feldman
Hi,

This afternoon, I migrated one of the main production English Wikipedia
slaves, db59, to MariaDB 5.5.28.  We've previously been testing 5.5.27 on
the primary research slave, and I've been testing the current build for the
last few days on a slave in eqiad.  All has looked good, and I spent the
last few days adapting our monitoring and metrics collection tools to the
new version, and building binary packages that meet our needs.

A main gotcha in major version upgrades is performance regressions due to
changes in query plans.  I've seen no sign of this, and my initial
assessment is that performance for our workload is on par with or slightly
improved over the 5.1 facebook patchset.

Taking the times of 100% of all queries over regular sample windows, the
average query time across all enwiki slave queries is about 8% faster with
MariaDB vs. our production build of 5.1-fb.  Some query types are 10-15%
faster, some are 3% slower, and nothing looks aberrant beyond those
bounds.  Overall throughput as measured by qps has generally improved
by 2-10%.  I wouldn't draw any conclusions from this data yet; more is
needed to filter out noise, but it's positive.

MariaDB has some nice performance improvements that our workload doesn't
really hit (better query optimization and index usage during joins, much
better subquery support), but there are also some things, such as full
utilization of the primary key embedded on the right of every secondary
index, that we can take advantage of (and improve our schema around) once
prod is fully upgraded, hopefully over the next 1-2 months.

The main goal of migrating to MariaDB is not performance driven.  Rather,
I think it's in WMF's and the open source community's interest to coalesce
around the MariaDB Foundation as the best route to ensuring a truly open
and well supported future for mysql-derived database technology.
Performance gains along the way are icing on the cake.

-Asher
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Ops] mariadb 5.5 in production for english wikipedia

2012-12-11 Thread Peter Youngmeister
Asher,

This is awesome! Thank you for your hard, careful work and dedication in
taking this huge first step in moving to MariaDB!

--peter


On Tue, Dec 11, 2012 at 4:10 PM, Asher Feldman afeld...@wikimedia.orgwrote:

 Hi,

 This afternoon, I migrated one of the main production English Wikipedia
 slaves, db59, to MariaDB 5.5.28.  We've previously been testing 5.5.27 on
 the primary research slave, and I've been testing the current build for the
 last few days on a slave in eqiad.  All has looked good, and I spent the
 last few days adapting our monitoring and metrics collection tools to the
 new version, and building binary packages that meet our needs.

 A main gotcha in major version upgrades is performance regressions due to
 changes in query plans.  I've seen no sign of this, and my initial
 assessment is that performance for our workload is on par with or slightly
 improved over the 5.1 facebook patchset.

 Taking the times of 100% of all queries over regular sample windows, the
 average query time across all enwiki slave queries is about 8% faster with
 MariaDB vs. our production build of 5.1-fb.  Some queries types are 10-15%
 faster, some are 3% slower, and nothing looks aberrant beyond those
 bounds.  Overall throughput as measured by qps has generally been improved
 by 2-10%.  I wouldn't draw any conclusions from this data yet, more is
 needed to filter out noise, but it's positive.

 MariaDB has some nice performance improvements that our workload doesn't
 really hit (better query optimization and index usage during joins, much
 better sub query support) but there are also some things, such as full
 utilization of the primary key embedded on the right of every secondary
 index that we can take advantage of (and improve our schema around) once
 prod is fully upgraded, hopefully over the next 1-2 months.

 The main goal of migrating to MariaDB is not performance driven.  More so,
 I think it's in WMF's and the open source communities interest to coalesce
 around the MariaDB Foundation as the best route to ensuring a truly open
 and well supported future for mysql derived database technology.
 Performance gains along the way are icing on the cake.

 -Asher


 ___
 Ops mailing list
 o...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/ops


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Mobile apps: time to go native?

2012-12-11 Thread Platonides
On 12/12/12 01:04, MZMcBride wrote:
 Looking at the big picture, I don't think we'll ever see widespread editing
 from mobile devices. The user experience is simply too awful. The best I
 think most people are hoping for is the ability to easily fix a typo, maybe,
 but even then you have to assess costs vs. benefit. That is, is it really
 worth paying two or three full-time employees so that someone can easily
 change Barrack to Barack from his or her iPhone? Probably not.

Then maybe the only feature needed by the mobile apps is a "highlight
article content and email me a bookmark to this when I'm on desktop".


David Gerard wrote:
 OTOH, see recent coverage of Wikipedia in Africa, where it's basically
 going to be on phones. Cheap shitty smartphones. That the kids are
 *desperate* to get Wikipedia on. Do we want to make those readers into
 editors? It'd be nice.

Have a link? 'Cheap smartphone' seems a contradiction.


I think there is a field for mobile and one for desktop. Could you do
one's tasks from the other? Probably, but not as well as if you used the
right tool*.
The proper optimization should be done; it's OK to make things
*possible*, but trying to force everything to work one way (e.g. Windows
8) is the wrong path imho.


* Note that there are valid cases in which you have to use a suboptimal
tool.


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Mobile apps: time to go native?

2012-12-11 Thread David Gerard
On 12 December 2012 00:22, Platonides platoni...@gmail.com wrote:
 David Gerard wrote:

 OTOH, see recent coverage of Wikipedia in Africa, where it's basically
 going to be on phones. Cheap shitty smartphones. That the kids are
 *desperate* to get Wikipedia on. Do we want to make those readers into
 editors? It'd be nice.

 Have a link? 'Cheap smartphone' seems a contradiction.


$50 Huawei phones running an ancient Android and only getting cheaper.
Jimbo's all about them.
http://techcrunch.com/2012/12/10/50-android-smartphones-are-disrupting-africa-much-faster-than-you-think-says-wikipedias-jimmy-wales/


 I think there is a field for mobiles and one for desktop. You could do
 one tasks from the other? Probably, but not as well as if you used the
 right tool*.


You're arguing from the position of someone who can assume a
general-purpose computer with a good Internet connection.


- d.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] MediaWiki Groups are official: start your own!

2012-12-11 Thread Quim Gil

On 12/11/2012 03:44 PM, bawolff wrote:

I'm actually quite curious to see if there are actually enough MW devs
in a single city (Other then WMF's home town) to form a group.


MediaWiki Groups are open to members of different specialties and 
levels of expertise. The richer and more diverse the better. 
Non-technical users willing to contribute and learn are welcome too!

http://www.mediawiki.org/wiki/Groups

This is not about devs alone, but about people interested in all 
MediaWiki aspects, like testing fresh software, translating strings, 
participating in the UX design of a feature, or helping triage forgotten 
bug reports or enhancement requests...


MediaWiki Groups are tools for reaching out to new potential community 
members. If your starting point is "how many MediaWiki core/extension 
hackers are there in my city?", then I recommend widening your scope. 
Otherwise you are right: it's not even worth starting.


Wikipedia is a big thing globally, and there are plenty of tech people 
who would be interested in contributing if they knew how, or whom to 
ask.


One starting point in your city / region would be to check 
http://en.wikipedia.org/wiki/Wikipedia:Meetup , attend the next meetup 
and start infiltrating the MediaWiki / tech agenda there. There is no 
point in keeping the traditional divide between readers/editors and 
tech/coders forever.




To be honest though, I kind of feel that if such groups were going
to form, they probably would have already. Formality rarely makes
people come together that wouldn't by themselves.


Time will tell. We are not attempting to convince you.  :)  As long as 
you point anybody interested in forming a group to the right URL and you 
attend the activities happening near you, it's all fine.


--
Quim Gil
Technical Contributor Coordinator
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Mobile apps: time to go native?

2012-12-11 Thread John Vandenberg
Small (native) apps can do Wikimedia work quite effectively using the API:

Upload image
File categorisation
New page patrol
Flagged revs/Pending changes
OTRS

John Vandenberg.
sent from Galaxy Note
On Dec 12, 2012 7:04 AM, MZMcBride z...@mzmcbride.com wrote:

 Brion Vibber wrote:
  Over on the mobile team we've been chatting for a while about the various
  trade-offs in native vs HTML-based (PhoneGap/Cordova) development.
 
  [...]
 
  iOS and Android remain our top-tier mobile platforms, and we know we can
 do
  better on them than we do so far...
 
  Any thoughts? Wildly in favor or against?

 It's unclear from your e-mail what the goal of mobile interaction (for lack
 of a better term) is. Are you trying to build editing features? File upload
 features? Or do you want to just have a decent reader? This seems like a
 key
 component to any discussion of the future of mobile development.

 Looking at the big picture, I don't think we'll ever see widespread editing
 from mobile devices. The user experience is simply too awful. The best I
 think most people are hoping for is the ability to easily fix a typo,
 maybe,
 but even then you have to assess costs vs. benefit. That is, is it really
 worth paying two or three full-time employees so that someone can easily
 change Barrack to Barack from his or her iPhone? Probably not.

 Perhaps mobile uploading could use better native support, but again, is the
 cost worth it? Does Commons need more low-quality photos? And even as phone
 cameras get better, do those photos need to be _instantly_ uploaded to the
 site? There's something to be said for waiting until you get home to upload
 photos, especially given how cumbersome the photo upload process is
 (copyright, permissions, categorization, etc.). And this all side-steps the
 question of whether there are better organizations equipped at handling
 photos (such as Flickr or whatever).

 That leaves reading. If a relatively recent browser can't read Wikimedia
 wikis without performance issues, I think that indicates a problem with
 Wikimedia wikis (way too much JavaScript, images are too large, etc.).
 Mobile browsers are fairly robust (vastly more robust compared to what they
 used to be), so I'm not sure why having a Wikipedia reader is valuable or
 why it's worth investing finite resources in. Occasionally I'll hear but I
 want to have a favorite pages feature or I want to support offline
 reading,
 but the phone's OS should be able to handle most of this from the built-in
 Web browser. Or not, but I think if a phone is incapable of a feature such
 as e-mail this Web page (article) to a friend, it's not an important
 enough feature to devote resources to.

 Wikimedia wikis have a lot of bugs and the Wikimedia Foundation has finite
 resources. I personally only use the Messages and Music apps on my
 phone, so I'm not the best person to make the argument for additional
 mobile
 development, but when I look at how terrible the user experience continues
 to be for desktop users, it becomes difficult for me to understand why the
 Wikimedia Foundation would delve into the world of mobile (apart from
 initiatives such as Wikipedia Zero). What benefit to the creation or
 dissemination of free educational content are we seeing (or hoping to see)
 from native apps?

 MZMcBride



 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Mobile apps: time to go native?

2012-12-11 Thread MZMcBride
Platonides wrote:
 On 12/12/12 01:04, MZMcBride wrote:
 Looking at the big picture, I don't think we'll ever see widespread editing
 from mobile devices. The user experience is simply too awful. The best I
 think most people are hoping for is the ability to easily fix a typo, maybe,
 but even then you have to assess costs vs. benefit. That is, is it really
 worth paying two or three full-time employees so that someone can easily
 change Barrack to Barack from his or her iPhone? Probably not.
 
 Then maybe the only feature needed by the mobile apps is a highlight
 article content and email me a bookmark to this when I'm on desktop.

Maybe. But if you have an entire mobile team, they're going to quickly run
out of things to do.

David Gerard wrote:
 OTOH, see recent coverage of Wikipedia in Africa, where it's basically
 going to be on phones. Cheap shitty smartphones. That the kids are
 *desperate* to get Wikipedia on. Do we want to make those readers into
 editors? It'd be nice.

Sure, it's difficult to argue against turning readers into editors. I'm
asking if the described mobile-related efforts (i.e., going native) will get
us any closer to our broader goals (creating and disseminating free
educational content). And if so, how?

As I hinted in my previous post, I make a distinction between mobile
app/site development work and initiatives such as Wikipedia Zero (which is
probably what the kids in Africa are reading Wikipedia via). I think it
makes sense to focus effort and energy on making Wikimedia wikis (not just
Wikipedia!) available in more places. I'm not sure it makes a lot of sense
to support editing and other interaction components on mobile devices. But,
again, there's a lot of vagueness and ambiguity in this discussion,
particularly with regard to what the _goals_ of mobile interaction actually
are. Without having defined, measurable goals, it's almost impossible to
make an informed decision here, in my opinion.

David Gerard wrote:
 MZMcBride wrote:
 Perhaps mobile uploading could use better native support, but again, is the
 cost worth it? Does Commons need more low-quality photos? And even as phone
 cameras get better, do those photos need to be _instantly_ uploaded to the
 site? There's something to be said for waiting until you get home to upload
 photos, especially given how cumbersome the photo upload process is
 (copyright, permissions, categorization, etc.). And this all side-steps the
 question of whether there are better organizations equipped at handling
 photos (such as Flickr or whatever).
 
 This is a version of the general argument against participation. There
 are reasons it's not favoured.

Oh come on now, that's not really fair. I'm not arguing against
participation, I'm arguing against shitty photos. I was almost completely
uninvolved, but I seem to remember much ado earlier this year about Wiki
Loves Monuments and mobile support (it even had its own mobile app, I
guess?). But looking at all of the WLM winners
(https://commons.wikimedia.org/wiki/Wiki_Loves_Monuments_2012_winners),
were any of them taken on mobile devices? A quick sampling seems to suggest
that all of the good photos came from Nikon or Sony cameras. That isn't to
say that mobile uploads (muploads) aren't ever going to be valuable to
Wikimedia wikis, but it does raise the legitimate question of whether it's a
good use of finite resources to support such projects. What is the value?

MZMcBride



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Mobile apps: time to go native?

2012-12-11 Thread MZMcBride
John Vandenberg wrote:
 Small (native) apps can do Wikimedia work quite effectively using the api
 
 Upload image
 File categorisation
 New page patrol
 Flagged revs/Pending changes
 OTRS

I think I fundamentally agree with your point, but when I consider that
there is (for example) no API for adding or removing a category from a page
(file or otherwise), it seems like a much better investment of finite
resources to create such an API and let others build mobile apps that use
that API. Or fix other long-standing categorization issues such as the
ability to move/rename categories (https://bugzilla.wikimedia.org/3311).
The lack of infinite developer resources really can't be overlooked or
overstated here, in my view.
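
(For context, the closest thing available today is appending wikitext through
the generic edit API; a rough sketch follows, with the page title, category
name and session handling all assumed rather than taken from any real app:)

// Hypothetical illustration: adding a category by appending wikitext via
// action=edit, since there is no dedicated category API. Assumes $cookies
// and $token were obtained from earlier login and token requests.
$params = array(
    'action' => 'edit',
    'title' => 'File:Example.jpg',
    'appendtext' => "\n[[Category:Example category]]",
    'token' => $token,
    'format' => 'json',
);
$ch = curl_init( 'https://commons.wikimedia.org/w/api.php' );
curl_setopt( $ch, CURLOPT_POST, true );
curl_setopt( $ch, CURLOPT_POSTFIELDS, http_build_query( $params ) );
curl_setopt( $ch, CURLOPT_COOKIE, $cookies );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
$result = curl_exec( $ch );
curl_close( $ch );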

MZMcBride



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Ops] mariadb 5.5 in production for english wikipedia

2012-12-11 Thread Patrick Reilly
Hello Asher,

Thanks so much for your hard work on supporting our MariaDB migration.
I think that it's a big step for the MariaDB Foundation.

— Patrick

On Tue, Dec 11, 2012 at 4:17 PM, Peter Youngmeister p...@wikimedia.org wrote:
 Asher,

 This is awesome! Thank you for your hard, careful work and dedication in
 taking this huge first step in moving to MariaDB!

 --peter


 On Tue, Dec 11, 2012 at 4:10 PM, Asher Feldman afeld...@wikimedia.org
 wrote:

 Hi,

 This afternoon, I migrated one of the main production English Wikipedia
 slaves, db59, to MariaDB 5.5.28.  We've previously been testing 5.5.27 on
 the primary research slave, and I've been testing the current build for the
 last few days on a slave in eqiad.  All has looked good, and I spent the
 last few days adapting our monitoring and metrics collection tools to the
 new version, and building binary packages that meet our needs.

 A main gotcha in major version upgrades is performance regressions due to
 changes in query plans.  I've seen no sign of this, and my initial
 assessment is that performance for our workload is on par with or slightly
 improved over the 5.1 facebook patchset.

 Taking the times of 100% of all queries over regular sample windows, the
 average query time across all enwiki slave queries is about 8% faster with
 MariaDB vs. our production build of 5.1-fb.  Some queries types are 10-15%
 faster, some are 3% slower, and nothing looks aberrant beyond those bounds.
 Overall throughput as measured by qps has generally been improved by 2-10%.
 I wouldn't draw any conclusions from this data yet, more is needed to filter
 out noise, but it's positive.
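
(As an aside, for anyone who wants to run the same kind of comparison on
their own logs, a rough Python sketch of the arithmetic: mean query time per
query type and the relative change between two samples. The input format is
an assumption for illustration, not the actual tooling used here.)

from collections import defaultdict

def mean_by_type(samples):
    # samples: iterable of (query_type, duration_in_seconds) pairs
    # (illustrative format only; this is an assumption, not WMF tooling)
    totals, counts = defaultdict(float), defaultdict(int)
    for query_type, seconds in samples:
        totals[query_type] += seconds
        counts[query_type] += 1
    return {t: totals[t] / counts[t] for t in totals}

def relative_change(old_samples, new_samples):
    # Percentage change in mean time per query type; negative means faster.
    old, new = mean_by_type(old_samples), mean_by_type(new_samples)
    return {t: 100.0 * (new[t] - old[t]) / old[t]
            for t in old if t in new and old[t] > 0}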

 MariaDB has some nice performance improvements that our workload doesn't
 really hit (better query optimization and index usage during joins, much
 better subquery support), but there are also some things, such as full
 utilization of the primary key embedded on the right of every secondary
 index, that we can take advantage of (and improve our schema around) once
 prod is fully upgraded, hopefully over the next 1-2 months.

 The main goal of migrating to MariaDB is not performance-driven.  More so,
 I think it's in WMF's and the open source community's interest to coalesce
 around the MariaDB Foundation as the best route to ensuring a truly open and
 well-supported future for MySQL-derived database technology.  Performance
 gains along the way are icing on the cake.

 -Asher


 ___
 Ops mailing list
 o...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/ops



 ___
 Ops mailing list
 o...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/ops


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Mobile apps: time to go native?

2012-12-11 Thread Brion Vibber
We've gone a bit off topic into the question of whether we should spend any
time on mobile at all, it seems. :)

On Dec 11, 2012 5:11 PM, MZMcBride z...@mzmcbride.com wrote:

 I think I fundamentally agree with your point, but when I consider that
 there is (for example) no API for adding or removing a category from a
page
 (file or otherwise), it seems like a much better investment of finite
 resources to create such an API and let others build mobile apps that use
 that API.

Where are these others who will build the apps for us? I'd like to hire
them.

 Or fix other long-standing categorization issues such as the
 ability to move/rename categories (https://bugzilla.wikimedia.org/3311).
 The lack of infinite developer resources really can't be overlooked or
 overstated here, in my view.

Finite developer resources exist in areas other than API development. I'll
be honest; I think a good mobile app will pull in more users and do more
good than just making a few API tweaks and hoping something happens.

Note also that the mobile dept is making API improvements as we go along...
these things are not exclusive. Rather, our needs drive the API development
we do.

One in progress is the GeoData extension, which is now collecting
coordinates on en.wikipedia.org; once the search back end is deployed, it
will make location-based search a first-class citizen instead of relying on
toolserver hacks and third-party services.
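
(A sketch of what that looks like from the client side, using the geosearch
list that the GeoData extension exposes through the action API. The
parameter names below follow the extension's geosearch module as documented;
treat the exact details as assumptions, and as noted above the dedicated
search back end is not deployed yet.)

import requests

API = "https://en.wikipedia.org/w/api.php"

def nearby_articles(lat, lon, radius_m=1000, limit=10):
    # geosearch parameters per the GeoData extension docs; verify
    # against the live API before relying on them.
    resp = requests.get(API, params={
        "action": "query",
        "list": "geosearch",
        "gscoord": "%s|%s" % (lat, lon),  # latitude|longitude
        "gsradius": radius_m,             # search radius in metres
        "gslimit": limit,
        "format": "json",
    })
    return [page["title"] for page in resp.json()["query"]["geosearch"]]

# e.g. nearby_articles(37.787, -122.400) for articles around downtown San Francisco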

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Ops] mariadb 5.5 in production for english wikipedia

2012-12-11 Thread Terry Chay
Nice!


 The main goal of migrating to MariaDB is not performance-driven.  More so, I
 think it's in WMF's and the open source community's interest to coalesce
 around the MariaDB Foundation as the best route to ensuring a truly open and
 well-supported future for MySQL-derived database technology.  Performance
 gains along the way are icing on the cake.
 


If it works out, then at some point we should probably tell the MariaDB peeps
that they can mention that the WMF uses it. :-)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Mobile apps: time to go native?

2012-12-11 Thread MZMcBride
Brion Vibber wrote:
 We've gone a bit off topic into the question of whether we should spend any
 time on mobile at all, it seems. :)

Well, I think that's expected when the question was (broadly) "where do we go
from here?" I think native or non-native support is a false dichotomy:
there's a broader question of what we want the mobile experience to be. I
don't think the answer to this question really exists currently.

 Finite developer resources exist in areas other than API development. I'll
 be honest; I think a good mobile app will pull in more users and do more
 good than just making a few API tweaks and hoping something happens.

Fair enough. But I really would strongly urge you (and/or the mobile team)
to better define what the goals are for mobile interaction, maybe in an RFC.
What do you want people to be able to do from mobile devices? What's
realistic now? What's realistic five years from now? What's realistic on
4G and what's realistic on whatever slow connection users have elsewhere?

 One in progress is the GeoData extension, which is now collecting
 coordinates on en.wikipedia.org; once the search back end is deployed, it
 will make location-based search a first-class citizen instead of relying on
 toolserver hacks and third-party services.

Hmmm, I actually see this as more of a counter-example: Max was able to
devote some time to creating a GeoData extension that has broadly applicable
use to Wikimedia wikis and to the free content movement. Just imagine what
he could be doing if he weren't battling (for example) the article editing
interface on a Samsung Nexus. I see your point about mobile development
feeding APIs and vice versa; I'm just not sure where the value comes from in
the _Wikimedia Foundation_ working on mobile development instead of devoting
more energy and resources to tools exactly like the GeoData extension.

MZMcBride



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Ops] mariadb 5.5 in production for english wikipedia

2012-12-11 Thread Asher Feldman
On Tue, Dec 11, 2012 at 5:49 PM, Terry Chay tc...@wikimedia.org wrote:

 Nice!


  The main goal of migrating to MariaDB is not performance-driven.  More
 so, I think it's in WMF's and the open source community's interest to
 coalesce around the MariaDB Foundation as the best route to ensuring a
 truly open and well-supported future for MySQL-derived database technology.
  Performance gains along the way are icing on the cake.
 


 If it works out, then at some point we should probably tell the MariaDB
 peeps that they can mention that the WMF uses it. :-)


We've been talking to Monty Widenius who visited the WMF office prior to
the Foundation announcement, and are fostering mutual support between the
Wikimedia and MariaDB Foundations.  Win-win for the open source community
at large!

-A
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Ops] mariadb 5.5 in production for english wikipedia

2012-12-11 Thread Alolita Sharma
Asher,

This is great news! Thanks for your perseverance and setting up
MariaDB 5.5 for en.wp :-) Monty will be thrilled!

Best,
Alolita


On Wed, Dec 12, 2012 at 7:53 AM, Asher Feldman afeld...@wikimedia.org wrote:
 On Tue, Dec 11, 2012 at 5:49 PM, Terry Chay tc...@wikimedia.org wrote:

 Nice!


  The main goal of migrating to MariaDB is not performance-driven.  More
 so, I think it's in WMF's and the open source community's interest to
 coalesce around the MariaDB Foundation as the best route to ensuring a
 truly open and well-supported future for MySQL-derived database technology.
  Performance gains along the way are icing on the cake.
 


 If it works out, then at some point we should probably tell the MariaDB
 peeps that they can mention that the WMF uses it. :-)


 We've been talking to Monty Widenius who visited the WMF office prior to
 the Foundation announcement, and are fostering mutual support between the
 Wikimedia and MariaDB Foundations.  Win-win for the open source community
 at large!

 -A
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Alpha version of the VisualEditor now available on the English Wikipedia

2012-12-11 Thread James Forrester
TL;DR: Today we are launching an alpha, opt-in version of the
VisualEditor[0] to the English Wikipedia. This will let editors create
and modify real articles visually, using a new system where the
articles they edit will look the same as when you read them, and their
changes show up as they type them — like writing a document in a
word processor. Please let us know what you think[1].


Why launch now?

We want our community of existing editors to get an idea of what the
VisualEditor will look like in the “real world” and start to give us
feedback about how well it integrates with how they edit right now,
and their thoughts on what aspects are the priorities in the coming
months.

The editor is at an early stage and is still missing significant
functions, which we will address in the coming months. Because of
this, we are mostly looking for feedback from experienced editors at
this point, because the editor is not yet good enough to give new editors a
proper experience of editing. We don’t want to promise them an easier
editing experience before it is ready.

As we develop improvements, they will be pushed every fortnight to the
wikis, allowing you to give us feedback[1] as we go and tell us what
next you want us to work on.


How can I try it out?

The VisualEditor is now available to all logged-in accounts on the
English Wikipedia as a new preference, switched off by default. If you
go to your “Preferences” screen and click into the “Editing” section,
it will have an option labelled “Enable VisualEditor”.

Once enabled, for each article you can edit, you will get a second
editor tab labelled “VisualEditor” next to the “Edit” tab. If you
click this, after a little pause you will enter the VisualEditor. From
here, you can play around, edit and save real articles and get an idea
of what it will be like when complete.

At this early stage in our development, we recommend that after saving
any edits, you check whether they broke anything. All edits made with
the VisualEditor will show up in articles’ history tabs with a
“VisualEditor” tag next to them, so you can track what is happening.
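
(If you would rather follow these edits through the API than the history
tab, here is a minimal Python sketch using the recent changes list filtered
by change tag. The internal tag name, "visualeditor" below, is an
assumption; check Special:Tags on the wiki for the canonical value.)

import requests

API = "https://en.wikipedia.org/w/api.php"

def recent_visualeditor_edits(limit=50, tag="visualeditor"):
    # tag name is an assumption; confirm it on Special:Tags
    resp = requests.get(API, params={
        "action": "query",
        "list": "recentchanges",
        "rctag": tag,  # only changes carrying this change tag
        "rcprop": "title|user|timestamp|comment",
        "rclimit": limit,
        "format": "json",
    })
    return resp.json()["query"]["recentchanges"]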


Things to note

Slow to load - It will take some time for long complex pages to load
into the VisualEditor, and particularly big ones may time out after 60
seconds. This is because pages have to be loaded through Parsoid, which
is also in its early stages, is not yet optimised for deployment, and is
currently uncached. In the future (a) Parsoid itself will be much
faster, (b) Parsoid will not depend on as many slow API calls, and (c)
it will be cached.

Odd-looking - we currently struggle with making the HTML we produce
look like what you are used to seeing, so styling and so on may look a
little (or even very) odd. This hasn't been our priority to date, as
our focus has been on making sure we don't disrupt articles with the
VisualEditor by altering the wikitext (correct round-tripping).

No editing references or templates - Blocks of content that we cannot
yet handle are uneditable; this is mostly references and templates
like infoboxes. Instead, when you mouse over them, they will be
hatched out and a tooltip will inform you that they have to be edited
via wikitext for now. You can select these items and delete them
entirely; however, there is not yet a way to add new ones or edit them
(this will be a core piece of work post-December).

Incomplete editing - Some elements of complex formatting will
display and let you edit their contents, but not let users edit their
structure or add new entries - such as tables or definition lists.
This area of work will also be one of our priorities post-December.

No categories - Articles' meta items will not appear at all -
categories, langlinks, magic words etc.; these are preserved (so
editing won't disrupt them), but they are not yet editable. Another area
for work post-December - our current plan is that they will be edited
through a metadata flyout, with auto-suggestions and so on.

Poor browser support - Right now, we have only got VisualEditor to
work in the most modern versions of Firefox, Chrome and Safari. We
will find a way to support (at least) Internet Explorer post-December,
but it's going to be a significant piece of work and we have failed to
get it ready for now.

Articles and User pages only - The VisualEditor will only be enabled
for the article and user namespaces (so you can make changes in a
personal sandbox), and will not work with talk pages, templates,
categories, etc. In time, we will build out the kinds of specialised
editing tools needed for non-articles, but our focus has been on
articles.


Final point

This is not the final form of the VisualEditor in lots of different
ways. We know of a number of bugs, and we expect you to find more. We
do not recommend that people try to use the VisualEditor for their
regular editing yet. We would love your feedback on what we have done
so far – whether it’s a problem you discovered, an aspect that you
find 

Re: [Wikitech-l] Alpha version of the VisualEditor now available on the English Wikipedia

2012-12-11 Thread Erik Moeller
On Tue, Dec 11, 2012 at 7:30 PM, James Forrester
jforres...@wikimedia.org wrote:
 TL;DR: Today we are launching an alpha, opt-in version of the
 VisualEditor[0] to the English Wikipedia. This will let editors create
 and modify real articles visually, using a new system where the
 articles they edit will look the same as when you read them, and their
 changes show up as they type them — like writing a document in a
 word processor. Please let us know what you think[1].

Congrats, team! :-D Being able to edit the Real Thing is indeed a huge
deal -- even if it's still the very very very (very) first release to
do so. A long road ahead, but maybe some sleep is in order. ;-)

The future just got a little more real.

Erik
-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation

Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Wikimedia-l] Alpha version of the VisualEditor now available on the English Wikipedia

2012-12-11 Thread Terry Chay
For Trevor to remember...

So we tried visually editing using the PS3 browser...

http://campl.us/nckV

Great job team! :-)

Sent from my free corporate advertising removed at the request of the owner

On Dec 11, 2012, at 10:07 PM, Erik Moeller e...@wikimedia.org wrote:

 On Tue, Dec 11, 2012 at 7:30 PM, James Forrester
 jforres...@wikimedia.org wrote:
 TL;DR: Today we are launching an alpha, opt-in version of the
 VisualEditor[0] to the English Wikipedia. This will let editors create
 and modify real articles visually, using a new system where the
 articles they edit will look the same as when you read them, and their
  changes show up as they type them — like writing a document in a
 word processor. Please let us know what you think[1].
 
 Congrats, team! :-D Being able to edit the Real Thing is indeed a huge
 deal -- even if it's still the very very very (very) first release to
 do so. A long road ahead, but maybe some sleep is in order. ;-)
 
 The future just got a little more real.
 
 Erik
 -- 
 Erik Möller
 VP of Engineering and Product Development, Wikimedia Foundation
 
 Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
 
 ___
 Wikimedia-l mailing list
 wikimedi...@lists.wikimedia.org
 Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki Groups are official: start your own!

2012-12-11 Thread bawolff
On Tue, Dec 11, 2012 at 8:49 PM, Quim Gil q...@wikimedia.org wrote:
 On 12/11/2012 03:44 PM, bawolff wrote:
[..]

 One starting point in your city / region would be to check
 http://en.wikipedia.org/wiki/Wikipedia:Meetup , attend the next meetup and
 start infiltrating the MediaWiki / tech agenda there. There is no point in
 keeping the traditional divide between readers/editors and tech/coders
 forever.

Hey, I would if there were one in either of the two cities I currently
live in (university and home are in different cities). Heck, no city
from either of the two provinces I live in even makes it anywhere on
that page. [Before I walk into the whole "you should start one
yourself" - I'm much too lazy ;) ]



 To be honest though, I kind of feel that if such groups were going
 to form, they probably would have already. Formality rarely makes
 people come together that wouldn't by themselves.


 Time will tell. We are not attempting to convince you.  :)  As long as you
 point anybody interested in forming a group to the right URL and you attend
 the activities happening near you, it's all fine.

By all means, if stuff actually happens I'll be just as happy as the
rest of you :)


--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Alpha version of the VisualEditor now available on the English Wikipedia

2012-12-11 Thread Lee Worden

Very exciting - congratulations!

I know these are early days for the VisualEditor, but is there a plan 
for extension developers to be able to hook in to provide editing for 
the things their extensions support?


Lee Worden
http://leeworden.net
