Re: [Wikitech-l] [WikimediaMobile] The future of skins

2014-08-27 Thread Juliusz Gonera
Someone in one of our meetings mentioned that Twig is a PHP
implementation of Mustache. This doesn't seem to be the case though.
We need a templating solution that works both on the server and the
client.

On Tue, Aug 26, 2014 at 5:21 PM, Trevor Parscal tpars...@wikimedia.org wrote:
 Thanks for summarizing the meeting Jon.

 So, let's get Twig/Swig into core then, eh? :)

 - Trevor


 On Tue, Aug 26, 2014 at 3:53 PM, Jon Robson jrob...@wikimedia.org wrote:

 Shahyar, Juliusz, Trevor, Kaldari, Roan and I sat down yesterday and
 talked about the future of skins. Hopefully this mail summarises what
 we talked about and what we agreed on. Feel free to add anything, or
 ask any questions in the likely event that I've misinterpreted
 something we talked about or this is unclear :)

 Specifically, we talked about how unhappy we are with how difficult it
 currently is for developers to create a skin. The skin class involves
 too many functions and does more than a skin should do, e.g. managing
 classes on the body and worrying about script and style tags.

 Trevor is going to create a base set of widgets, for example a list
 generator to generate things like a list of links to user tools. The
 widgets will be agnostic to how they are rendered - some may use
 templates, some may not.

 We identified two long-term goals for the new skin system:
 1) We would like to get to the point where a new skin can be built by
 simply copying and pasting a master template and writing a new CSS
 file.
 2) It should be possible for us in future to re-render an entire page
 via JavaScript, using the History API's pushState and re-rendering any
 page via the API. (Whether we'd want to do this is another
 consideration, but we would like to have an architecture that is
 powerful enough to support such a thing.)
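
 As a rough illustration of goal 2 (hypothetical code only -- assuming
 jQuery, mw.util and the action=parse API; the function name and the
 content selector are made up):

     function navigateTo( title ) {
         // Fetch the rendered page from the API instead of doing a
         // full page load.
         $.getJSON( mw.util.wikiScript( 'api' ), {
             action: 'parse',
             page: title,
             format: 'json'
         } ).done( function ( data ) {
             // Re-render only the content area of the current skin.
             $( '#mw-content-text' ).html( data.parse.text['*'] );
             // Update the address bar without reloading the page.
             history.pushState( { title: title }, title, mw.util.getUrl( title ) );
         } );
     }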

 As next steps we agreed to do the following:

 1) Trevor is going to build a watch star widget on client and server.
 We identified that the existing watch star code is poorly written and
 has resulted in MobileFrontend rewriting it. We decided to target this
 as it is a simple enough example that it doesn't need a template. It's
 small and contained enough that we hope it will allow us to share
 ideas and codify many of them. Trevor is hoping to begin working on
 this in the week of 2nd September.

 2) We need a templating system in core. Trevor is going to do some
 research on server-side templating systems. We hope that the
 templating RFC [1] gets resolved; however, we are getting to the point
 where we need one as soon as possible and do not want to be blocked by
 the outcome of that RFC, especially given that a Mustache-based
 templating language can address all our current requirements.

 [1]
 https://www.mediawiki.org/wiki/Requests_for_comment/HTML_templating_library
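
 As a small illustration of the kind of sharing we have in mind
 (hypothetical names only -- assuming a Mustache implementation such as
 mustache.js on the client and a compatible PHP library on the server),
 the user-tools list mentioned above could be one template rendered in
 either place:

     // One Mustache template (shown here as a JS string; on the server
     // the same template file would be rendered by a PHP Mustache
     // implementation).
     var userToolsTemplate =
         '<ul class="user-tools">' +
         '{{#links}}<li><a href="{{href}}">{{label}}</a></li>{{/links}}' +
         '</ul>';

     // Client-side rendering with mustache.js.
     var html = Mustache.render( userToolsTemplate, {
         links: [
             { href: mw.util.getUrl( 'Special:Watchlist' ), label: 'Watchlist' },
             { href: mw.util.getUrl( 'Special:MyContributions' ), label: 'Contributions' }
         ]
     } );
     $( '#p-personal' ).html( html );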


Re: [Wikitech-l] iPad bugs we need to fix in VE

2014-06-25 Thread Juliusz Gonera
One more important bug to add to the list:
https://bugzilla.wikimedia.org/show_bug.cgi?id=64575
(input becomes unresponsive in mobile link inspector on iOS Safari)


On Mon, Jun 23, 2014 at 3:17 PM, Juliusz Gonera jgon...@wikimedia.org
wrote:

 There are three important bugs that need to be fixed in VE before we can
 move it to stable for tablets (specifically iPads). The first two were
 added by me just today. The third was reported in May by Rummana and for
 some odd reason had Jon assigned to it, although he has never worked on it
 (I removed him).

 Following Roan's suggestion I recorded videos in the iOS Simulator to
 illustrate what is happening in each case (thanks to Monte for letting me
 use his MacBook with up-to-date Xcode).

 https://bugzilla.wikimedia.org/show_bug.cgi?id=66999
 https://bugzilla.wikimedia.org/show_bug.cgi?id=67002
 https://bugzilla.wikimedia.org/show_bug.cgi?id=65326

 Let us (the mobile team) know if you need any help with those.

 --
 Juliusz


[Wikitech-l] iPad bugs we need to fix in VE

2014-06-23 Thread Juliusz Gonera
There are three important bugs that need to be fixed in VE before we can
move it to stable for tablets (specifically iPads). The first two were
added by me just today. The third was reported in May by Rummana and for
some odd reason had Jon assigned to it, although he has never worked on it
(I removed him).

Following Roan's suggestion I recorded videos in the iOS Simulator to
illustrate what is happening in each case (thanks to Monte for letting me
use his MacBook with up-to-date Xcode).

https://bugzilla.wikimedia.org/show_bug.cgi?id=66999
https://bugzilla.wikimedia.org/show_bug.cgi?id=67002
https://bugzilla.wikimedia.org/show_bug.cgi?id=65326

Let us (the mobile team) know if you need any help with those.

-- 
Juliusz

[Wikitech-l] [RFC] Scoping site CSS

2014-04-07 Thread Juliusz Gonera

Hi,

Sumana pinged me about this RFC, but given the other, higher-priority 
things I am working on, I'm unlikely to get to it in the near future. 
If anyone is interested in picking it up, feel free to do so.


https://www.mediawiki.org/wiki/Requests_for_comment/Scoping_site_CSS

--
Juliusz


[Wikitech-l] Algorithm for assessing quality of articles

2013-10-31 Thread Juliusz Gonera
I haven't read the paper itself, but just in case someone has a moment 
and is interested:

http://www.technologyreview.com/view/520946/can-automated-editorial-tools-help-wikipedias-declining-volunteer-workforce/

--
Juliusz


Re: [Wikitech-l] AbuseFilter error codes and MobileFrontend

2013-09-26 Thread Juliusz Gonera

On 09/25/2013 12:17 PM, Juliusz Gonera wrote:
I've just checked on my local instance and it seems that this patch 
does not change much for MobileFrontend. As I stated in the first 
message, we need to distinguish between warnings and disallowed edits 
because they require different UI (either allowing resubmission or not). 
It seems that without this patch the only way to do this is to look at 
the `code` property, and with this patch I should look at the 
`error-msg` property instead (and keep my fingers crossed that no 
admin came up with a different word for "warning").


I understand that thanks to this patch we can be 100% sure that the 
error comes from AbuseFilter even if the `error-msg` has no 
"abusefilter-" prefix. However, we should also include some 
information which would allow us to figure out which action from the 
"Actions taken when matched" box in the filter config a given API 
response refers to. That would be much more reliable than looking for 
the "abusefilter-warning-" prefix, which can be omitted by admins 
setting up the warning.


I think the best way to proceed right now is to use the `code` key to 
determine what to do and we should figure out how to make it better in 
future. Is there any way I can get notified when your patch gets merged? 
Not sure if there is such an option somewhere in Gerrit's confusing UI...


--
Juliusz


Re: [Wikitech-l] AbuseFilter error codes and MobileFrontend

2013-09-25 Thread Juliusz Gonera

On 09/25/2013 02:38 AM, Andrew Garrett wrote:

On 09/23/2013 06:48 PM, Andrew Garrett wrote:
You should know about this change[1], which corrects the error 
messages to be more in line with the general case, as well as adding 
some metadata. It's not been approved yet, so I'm nudging a few 
reviewers.


You can also determine which mobile edits are hitting filters, by 
looking for edits in the filter log which have 'user_mobile = 1' 
set. I'm not sure I quite remember how to search in that way, but 
I'll look into it.


[1] https://gerrit.wikimedia.org/r/#/c/80137/ 
I'm not sure I understand what the implications of this change are 
for the mobile editor. Does it change the `code` key into an 
`error-msg` key? If so, should I rely on `error-msg` rather than 
`code` after this patch is merged to determine what is a warning and 
what is disallowed?
You will now start to receive a consistent error message of 
'abusefilter-aborted' included in the error.code property of the 
output instead of the hodgepodge of other AbuseFilter errors, which 
were returned in a nonstandard format. You can still find the detailed 
error message in other properties of the error object.


I've just checked on my local instance and it seems that this patch does 
not change much for MobileFrontend. As I stated in the first 
message, we need to distinguish between warnings and disallowed edits 
because they require different UI (either allowing resubmission or not). 
It seems that without this patch the only way to do this is to look at 
the `code` property, and with this patch I should look at the `error-msg` 
property instead (and keep my fingers crossed that no admin came up 
with a different word for "warning").


I understand that thanks to this patch we can be 100% sure that the 
error comes from AbuseFilter even if the `error-msg` has no 
"abusefilter-" prefix. However, we should also include some information 
which would allow us to figure out which action from the "Actions taken 
when matched" box in the filter config a given API response refers to. 
That would be much more reliable than looking for the 
"abusefilter-warning-" prefix, which can be omitted by admins setting 
up the warning.


--
Juliusz


Re: [Wikitech-l] AbuseFilter error codes and MobileFrontend

2013-09-23 Thread Juliusz Gonera

On 09/23/2013 06:48 PM, Andrew Garrett wrote:
You should know about this change[1], which corrects the error 
messages to be more in line with the general case, as well as adding 
some metadata. It's not been approved yet, so I'm nudging a few reviewers.


You can also determine which mobile edits are hitting filters, by 
looking for edits in the filter log which have 'user_mobile = 1' set. 
I'm not sure I quite remember how to search in that way, but I'll look 
into it.


[1] https://gerrit.wikimedia.org/r/#/c/80137/ 


I'm not sure I understand what the implications of this change are 
for the mobile editor. Does it change the `code` key into an `error-msg` 
key? If so, should I rely on `error-msg` rather than `code` after this 
patch is merged to determine what is a warning and what is disallowed?


Or maybe I should just treat `code: abusefilter-aborted` as if it were 
`code: abusefilter-disallowed`?


--
Juliusz


Re: [Wikitech-l] AbuseFilter error codes and MobileFrontend

2013-09-20 Thread Juliusz Gonera

Ugh, that was meant to go to wikitech-l too...

On 09/20/2013 03:38 PM, Juliusz Gonera wrote:

Hi,

A bit of background: Not long ago we launched mobile editing. Soon 
after that we discovered that the mobile editor fails on many wikis 
because we hadn't thought about AbuseFilter support. We're 
trying to fix this now.


Statistics about the most frequent AbuseFilter error codes we're getting:
https://mingle.corp.wikimedia.org/projects/mobile/cards/1162
Related bug:
https://bugzilla.wikimedia.org/show_bug.cgi?id=52049

After getting some initial information from legoktm, my thoughts are:

* Probably no changes to AbuseFilter are necessary and we should 
implement everything in MobileFrontend.


* We should display the warning message (edit.warning in API response) 
for all codes (edit.code in API response) that start with 
abusefilter-warning* and allow the user to resubmit.


* We should display the disallow message (edit.warning in API 
response) for abusefilter-disallow and not allow the user to resubmit.


* We should display the edit.warning message if present, or a general 
one, and not allow the user to resubmit for all the other error codes 
(so far we've seen abusefilter-blanking, abusefilter-blank, 
abusefilter-imza, abusefilter-blocked-display and 
abusefilter-autobiography, but they don't happen too often).


Are my assumptions correct? Any thoughts or suggestions?
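
In case it helps, this is roughly the handling I have in mind on the 
MobileFrontend side (a sketch only -- the helper functions and the 
generic message are made up; edit.code and edit.warning are the API 
properties mentioned above):

    // Sketch of handling an AbuseFilter result returned by action=edit.
    function handleAbuseFilterResult( edit ) {
        var code = edit.code || '',
            message = edit.warning;

        if ( code.indexOf( 'abusefilter-warning' ) === 0 ) {
            // Warning: show the filter's message and let the user resubmit.
            showWarning( message, { allowResubmit: true } );
        } else if ( code.indexOf( 'abusefilter-disallow' ) === 0 ) {
            // Disallowed: show the filter's message, no resubmit.
            showError( message, { allowResubmit: false } );
        } else {
            // Anything else (blanking, blocked-display, etc.): show the
            // filter's message if present, otherwise a generic one.
            showError( message || genericEditErrorMessage, { allowResubmit: false } );
        }
    }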





[Wikitech-l] RFC: Scoping site CSS

2013-07-16 Thread Juliusz Gonera

I wrote an RFC about scoping Common.css and Mobile.css:
https://www.mediawiki.org/wiki/Requests_for_comment/Scoping_site_CSS

In short: this could help us separate CSS rules added by administrators 
from the core UI rules of MediaWiki.


What we would get:
* UI (chrome) CSS that is more predictable and breaks less often
* no crazy UI styling as seen at https://nv.wikipedia.org

Please share your thoughts.

--
Juliusz


Re: [Wikitech-l] New git-review lets you configure 'origin' as the gerrit remote

2013-06-05 Thread Juliusz Gonera

Thank you!


On 05/31/2013 03:14 PM, Ori Livneh wrote:

Hey,

The new version of git-review released today (1.22) includes a patch I
wrote that makes it possible to work against a single 'origin' remote. This
amounts to a workaround for git-review's tendency to frighten you into
thinking you're about to submit more patches than the ones you are working
on. It makes git-review more pleasant to work with, in my opinion.

To enable this behavior, you first need to upgrade to the latest version of
git-review, by running pip install -U git-review. Then you need to create
a configuration file: either /etc/git-review/git-review.conf (system-wide)
or ~/.config/git-review/git-review.conf (user-specific).

The file should contain these two lines:

[gerrit]
defaultremote = origin

Once you've made the change, any new Gerrit repos you clone using an
authenticated URI will just work.

You'll need to perform an additional step to migrate existing repositories.
In each repository, run the following commands:

   git remote set-url origin $(git config --get remote.gerrit.url)
   git remote rm gerrit
   git review -s

Hope you find this useful.

Re: [Wikitech-l] APIStrat conference, San Francisco, October 23, 24, 25

2013-05-10 Thread Juliusz Gonera
One thing I forgot to mention: the organizers of APIStrat would like to 
know by the beginning of next week whether we will participate. Please 
send a short draft of a proposal to Vanessa (she is one of the 
organizers) if you are interested.



On 05/07/2013 02:33 PM, Juliusz Gonera wrote:
A friend of mine is co-organizing a conference about APIs. She asked 
me if there is someone from Wikimedia or the community who would be 
interested in participating. It will take place in Parc 55 Hotel in 
San Francisco on October 23, 24, 25. A bit of information about the 
conference:


APIStrat is a vendor neutral conference, and we are committed to 
promote API usage/knowledge, and specifically with the event, to 
build a great program with interesting content and actually create 
something that is useful for the community. People behind it are Kin 
Lane (@apievangelist) and 3Scale. Here you can see speakers from the 
previous event: 
http://apistrategyconference.com/2013NYC/speakers.php, where they 
brought together over 350 people. Videos of the previous event can be 
found here: http://www.infoq.com/api-strategy-practice/


This time we are expecting between 500-600 attendees, and it will be 
both service providers and API consumers (developers). The previous 
edition was mostly addressed to providers, but we will have an 
exclusive track addressed to developers in this edition.


They'd like us to:

 - talk about how much traffic we deal with, how we do it, and who uses 
 our API (or something similar that we think would be of general 
 interest),
 - participate in the "APIs in Government - Towards a Data Commons" 
 panel, giving the perspective of a non-profit (they aim to have 
 federal, state, education, international, and non-profit 
 representation in this panel).


Seems interesting. Is there anyone who'd be interested in giving a 
talk or participating in a panel there?






[Wikitech-l] APIStrat conference, San Francisco, October 23, 24, 25

2013-05-07 Thread Juliusz Gonera
A friend of mine is co-organizing a conference about APIs. She asked me 
if there is someone from Wikimedia or the community who would be 
interested in participating. It will take place in Parc 55 Hotel in San 
Francisco on October 23, 24, 25. A bit of information about the conference:


APIStrat is a vendor neutral conference, and we are committed to 
promote API usage/knowledge, and specifically with the event, to build 
a great program with interesting content and actually create something 
that is useful for the community. People behind it are Kin Lane 
(@apievangelist) and 3Scale. Here you can see speakers from the 
previous event: http://apistrategyconference.com/2013NYC/speakers.php, 
where they brought together over 350 people. Videos of the previous 
event can be found here: http://www.infoq.com/api-strategy-practice/


This time we are expecting between 500-600 attendees, and it will be 
both service providers and API consumers (developers). The previous 
edition was mostly addressed to providers, but we will have an 
exclusive track addressed to developers in this edition.


They'd like us to:

- talk about how much traffic we deal with, how we do it, and who uses 
our API (or something similar that we think would be of general interest),
- participate in the "APIs in Government - Towards a Data Commons" panel, 
giving the perspective of a non-profit (they aim to have federal, state, 
education, international, and non-profit representation in this panel).


Seems interesting. Is there anyone who'd be interested in giving a talk 
or participating in a panel there?


--
Juliusz


Re: [Wikitech-l] RFC: Alternative domains for Commons

2013-03-28 Thread Juliusz Gonera
On Thu, Mar 28, 2013 at 8:45 AM, Petr Bena benap...@gmail.com wrote:

 I don't know if that is actually good for anything :) but it would
 surely allow you to bypass cookie restrictions everywhere (for api's
 only). On the other way, I think we could just think of using some
 different technology than cookies to avoid mess with DNS


Adding just one domain doesn't seem like a big mess. I'm not sure when
we'll have a technology other than cookies, and I'm also afraid
that the workaround I described in the RFC won't work forever (if it does
work now, we'll know in a few days).

-- 
Juliusz

Re: [Wikitech-l] RFC: Alternative domains for Commons

2013-03-28 Thread Juliusz Gonera
On Thu, Mar 28, 2013 at 9:59 AM, Seb35 seb35wikipe...@gmail.com wrote:

 A small issue in this proposition: sub-subdomains are not currently
 covered by the https certificate.


That would only break commons.api.wikipedia.org, not
commons.wikipedia.org, right?

-- 
Juliusz

Re: [Wikitech-l] RFC: Alternative domains for Commons

2013-03-27 Thread Juliusz Gonera
Thanks, but I'm afraid they won't help in solving our cookie problems. We
need a subdomain of wikipedia.org (and other projects' domains).


On Fri, Mar 22, 2013 at 9:30 PM, Daniel Zahn dz...@wikimedia.org wrote:

 fyi, we have all these:

 DNS:

 root@sockpuppet:~/pdns-templates# ls -l | grep commons
 lrwxrwxrwx 1 root root 13 Jun  7  2012 wikimediacommons.co.uk -> wikimedia.com
 lrwxrwxrwx 1 root root 13 Jun  7  2012 wikimediacommons.eu -> wikimedia.com
 lrwxrwxrwx 1 root root 13 Jun  7  2012 wikimediacommons.info -> wikimedia.com
 lrwxrwxrwx 1 root root 13 Jul 19  2012 wikimediacommons.jp.net -> wikimedia.com
 lrwxrwxrwx 1 root root 13 Jul 19  2012 wikimediacommons.mobi -> wikimedia.com
 lrwxrwxrwx 1 root root 13 Jun  7  2012 wikimediacommons.net -> wikimedia.com
 lrwxrwxrwx 1 root root 13 Jun  7  2012 wikimediacommons.org -> wikimedia.com

 Apache:

 /apache-config$ grep commons redirects.conf
 wikimediacommons.co.uk *.wikimediacommons.co.uk \
 wikimediacommons.eu *.wikimediacommons.eu \
 wikimediacommons.info *.wikimediacommons.info \
 wikimediacommons.jp.net *.wikimediacommons.jp.net \
 wikimediacommons.mobi *.wikimediacommons.mobi \
 wikimediacommons.net *.wikimediacommons.net \
 wikimediacommons.org *.wikimediacommons.org \
 wikisource.com *.wikisource.com commons.wikipedia.org \
 www.commons.wikipedia.org www.commons.wikimedia.org \
 RewriteRule ^/welcometowikipedia$
 http://commons.wikimedia.org/wiki/File:Welcome_to_Wikipedia_brochure_EN.pdf
 [R=301,L]
 RewriteRule ^/instructorbasics$

 http://commons.wikimedia.org/wiki/File:Instructor_Basics_How_to_Use_Wikipedia_as_a_Teaching_Tool.pdf
 [R=301,L]
 RewriteCond %{HTTP_HOST}
 (^|\.)wikimediacommons.(net|info|mobi|eu|org|jp\.net)$
 RewriteRule ^(.*)$ http://commons.wikimedia.org$1 [R=301,L,NE]
 RewriteCond %{HTTP_HOST} (^|\.)wikimediacommons.co.uk$
 RewriteRule ^(.*)$ http://commons.wikimedia.org$1 [R=301,L]
 RewriteCond %{HTTP_HOST} =commons.wikipedia.org [OR]
 RewriteCond %{HTTP_HOST} =www.commons.wikimedia.org
 RewriteRule ^(.*)$ http://commons.wikimedia.org$1 [R=301,L,NE]


Re: [Wikitech-l] RFC: Alternative domains for Commons

2013-03-27 Thread Juliusz Gonera
On Fri, Mar 22, 2013 at 9:22 PM, MZMcBride z...@mzmcbride.com wrote:

 Please draft an RFC at https://www.mediawiki.org/wiki/RFC. :-)


http://www.mediawiki.org/wiki/Requests_for_comment/Alternative_Commons_Domains
Please share your comments.


 commons.wikipedia.org already redirects to commons.wikimedia.org (for
 historical reasons, maybe), so that has to be considered.


Yes, it redirects. But to solve the problem I'm describing, the API would
need to be served from commons.wikipedia.org.


 I think what
 you're proposing is also kind of confusing and I'm wondering if there
 aren't better ways to approach the problem.


I'm open to suggestions, but I'd rather not wait until CentralAuth gets
completely redesigned and rewritten.

-- 
Juliusz



[Wikitech-l] RFC: Alternative domains for Commons

2013-03-21 Thread Juliusz Gonera

We've been having a hard time making photo uploads work in
MobileFrontend because of CentralAuth's third-party cookie problem (we
upload photos from the Wikipedia web site to the Commons API). Apart from
the newest Firefox [1,2], mobile Safari also doesn't accept third-party
cookies unless the domain has been visited and already has at least
one cookie set.

Even though we have probably found a solution for now, it's a shaky
and inelegant workaround which might stop working at any time (if some
detail of the default browser cookie policy changes again) [3].

I came up with another idea of how this could be solved. The problem we
have right now is that Commons is on a completely different domain than
Wikipedia, so they can't share the login token cookie. However, we could
set up alternative domains for Commons, such as commons.wikipedia.org,
and then the cookie could be shared.

The only issue I see with this solution is that we would have to
prevent messing up SEO (having multiple URLs pointing to the same
resource). This, however, could be avoided by redirecting every
non-API request to the main domain (commons.wikimedia.org) and only
allowing API requests on alternative domains (which is what we use for
photo uploads on mobile).
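
To make this concrete, the mobile photo upload would then simply target
the alternative domain, so the session cookies set on *.wikipedia.org
are sent along. A rough sketch (hypothetical -- assuming jQuery as in
MobileFrontend; fileBlob, fileName and editToken are assumed to exist,
and CORS still applies across subdomains, hence the origin parameter):

    var formData = new FormData();
    formData.append( 'action', 'upload' );
    formData.append( 'format', 'json' );
    formData.append( 'filename', fileName );
    formData.append( 'file', fileBlob );
    formData.append( 'token', editToken );

    $.ajax( {
        // Commons API served from a wikipedia.org subdomain, so the
        // shared login cookie is available to the request.
        url: '//commons.wikipedia.org/w/api.php?origin=' +
            encodeURIComponent( location.protocol + '//' + location.host ),
        type: 'POST',
        data: formData,
        processData: false,
        contentType: false,
        xhrFields: { withCredentials: true }
    } ).done( function ( data ) { /* handle the upload result */ } );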

This obviously doesn't solve the broader problem of CentralAuth's common
login being broken, but it would at least allow easy communication between
Commons and the other projects. In my opinion this is the biggest problem
right now. Users can probably live without being automatically logged in
to other projects, but photo uploads on mobile are simply broken when we
can't use the Commons API.

Please let me know what you think. Are there any other possible
drawbacks of this solution that I missed?

[1] http://webpolicy.org/2013/02/22/the-new-firefox-cookie-policy/
[2] 
https://developer.mozilla.org/en-US/docs/Site_Compatibility_for_Firefox_22

[3] https://gerrit.wikimedia.org/r/#/c/54813/

--
Juliusz


Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-15 Thread Juliusz Gonera

On 03/14/2013 08:36 PM, Erik Moeller wrote:
And I wouldn't be too quick to celebrate the increased vendor lock-in 
of a large percentage of the open source community into an ecosystem 
of partially proprietary tools and services (the GitHub engine itself, 
the official GitHub applications, etc.). Gerrit and other open source 
git repo management and code review tools are one of the best hopes 
for the development of a viable alternative. Unlike GitHub, Gerrit can 
be improved by its users over time, and the issues that frustrate and 
annoy us about it _can_ be fixed (and indeed, many have been). Yes to 
better pull request management from GitHub. But let's stop complaining 
about Gerrit, and instead get both functionality and UX issues into 
their bug tracker, and help get them fixed. Erik 


That's a good argument. Someone mentioned GitLab though. If I have more 
time I'll go through the list of our requirements again and see what it offers.


--
Juliusz


Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-14 Thread Juliusz Gonera

On 03/09/2013 01:06 PM, Tyler Romeo wrote:

I strongly disagree that Gerrit is harder to learn than Github. The only
difficult thing to understand is the web UI, which takes only a few minutes
to really get used to. Let's look at the biggest complaints:


Let's not forget about this one: "...forces a *one commit at a time* 
workflow on developers and forces the use of |git commit --amend| as the 
only way to update patches." [1] For me, this defeats the purpose of branches.


[1] 
http://www.mediawiki.org/wiki/Git/Gerrit_evaluation#The_case_against_Gerrit


--
Juliusz

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-14 Thread Juliusz Gonera

On 03/08/2013 10:20 AM, Bartosz DziewoƄski wrote:
On Fri, 08 Mar 2013 17:07:18 +0100, Antoine Musso hashar+...@free.fr 
wrote:

I guess the whole idea of using GitHub is for public relation and to
attract new people.  Then, if a developer is not willing to learn
Gerrit, its code is probably not worth the effort of us integrating
github/gerrit.  That will just add some more poor quality code to your
review queues.


This a hundred times. I manage a few (small) open-source projects at 
GitHub, and most of the patches I get are not even up to my standards 
(and those are significantly lower than WMF's ones).


Submitting a patch to gerrit and even fixing it after code review is 
not that hard. (Of course any more complicated operations like 
rebasing do suck, but you hopefully won't be doing that with your 
first patch.)


I strongly disagree with this. I also get some poor quality pull 
requests to my projects on GitHub, but once in a while I get something good.


To be honest, if I hadn't worked at WMF I'd never have thought about 
learning something as obscure as Gerrit just to submit a small patch. 
And I wouldn't assume that anyone would want to contribute something 
big without knowing the project well.


My reasoning: people get involved in open source projects by starting 
with a small contribution and if we don't make it easy for them, they 
just won't try.


--
Juliusz


Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-14 Thread Juliusz Gonera

On 03/08/2013 08:55 AM, Andrew Otto wrote:

I've been hosting my puppet-cdh4 (Hadoop) repository on Github for a while now. 
 I am planning on moving this into Gerrit.

I've been getting pretty high quality pull requests for the last month or so 
from a couple of different users. (Including CentOS support, supporting 
MapReduce v1 as well as YARN, etc.)

   https://github.com/wikimedia/puppet-cdh4/issues?page=1state=closed

I'm happy to host this in Gerrit, but I suspect that contribution to this 
project will drop once I do. :/


Why do you want to move it to gerrit then?

--
Juliusz


Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-14 Thread Juliusz Gonera

On 03/14/2013 07:19 PM, Tyler Romeo wrote:

Like I said before, if you know how to use Git, you know how to use Gerrit
(and the contra-positive is true as well). The primary thing holding people
back is that it's confusing and not user friendly enough to make an account
and get working. Imagine if people could sign into Gerrit using their
Google accounts like Phabricator allows. I can guarantee participation
would skyrocket.


I wouldn't be that optimistic; maybe it would increase slightly. Having 
an account is one of the factors, but I wouldn't underestimate user 
friendliness. The first time I tried to find the URL to clone a repo in 
Gerrit, it took me around a minute. On GitHub it took me about 5 seconds.


--
Juliusz


Re: [Wikitech-l] Problem with CentralAuth in MobileFrontend

2013-02-28 Thread Juliusz Gonera

On 02/27/2013 06:36 PM, Chris Steipp wrote:

I'm not able to reproduce the error (auto login is working for me) in
Chrome 25 or Firefox.

If someone is still able to reproduce this, can you let me know if:
* The images from various Special:AutoLogin pages are loading when you login
* You do (or don't) have a centralauth_Session cookie
* You do (or don't) have a wiki specific session cookie, such as
commonswiki_session for commons


https://bugzilla.wikimedia.org/show_bug.cgi?id=45578

--
Juliusz


Re: [Wikitech-l] Problem with CentralAuth in MobileFrontend

2013-02-28 Thread Juliusz Gonera

On 02/27/2013 05:13 PM, Paul Selitskas wrote:

Do you use the same protocol in Wikipedia and other projects? When I
first log in via HTTPS and then somehow get to HTTP, I need to log in.


We use the same protocol. We enforce HTTPS after login, and later use 
protocol-agnostic URLs.


--
Juliusz


[Wikitech-l] Problem with CentralAuth in MobileFrontend

2013-02-27 Thread Juliusz Gonera

Hi,

Yesterday we released photo uploads on mobile (actually moved them from 
beta to stable). We're logging the errors we get when people try to upload 
photos, and it seems the most common one is that they're not logged in to 
Commons even though they are logged in to Wikipedia. It seems to be 
happening quite randomly and doesn't seem to depend on any particular 
browser.


We're not sure how to debug this. Have any similar issues ever happened 
on desktop (people not being logged in to other projects)? Is there 
anyone who has a good knowledge of how CentralAuth works and could help us?


Thanks,
Juliusz


Re: [Wikitech-l] DevOps/Continuous Deployment discussion?

2013-02-26 Thread Juliusz Gonera

On 02/20/2013 12:04 PM, Luke Welling WMF wrote:

I am strongly of the opinion that within broad ranges deployment frequency
does not matter.  It really does not matter if you deploy twice an hour or
every second day.


What teams deploy every second day?


But, having the machinery to make it so that you could deploy twice an hour
if you wanted to is all kinds of valuable.

Putting time into building:
  * Continuous integration with build-on-commit
  * Tests with good coverage
  * A staging environment that reflects production
  * Managed configuration
  * Scripted deployment to a large number of machines
pays dividends in uptime, ops sanity and developer productivity even if you
only use that machinery every few days.

We have some of that, but heading further down that road would be a good
thing even if we chose to keep organized periodic deployments.


I couldn't agree more. It's not that I feel a need to deploy 
something every single day; I just want the whole process to be easier 
and less scary, so that it doesn't disrupt everyone's work for half a 
day. Also, it shouldn't be a big deal to push a quick fix in between 
deployments.


--
Juliusz


Re: [Wikitech-l] DevOps/Continuous Deployment discussion?

2013-02-26 Thread Juliusz Gonera

On 02/20/2013 12:15 PM, Mark A. Hershberger wrote:
I think there is a lot of ground to cover before we get more 
continuous deployments, but what were you thinking we needed? 


A simpler and faster deployment process. The part of the process where 
supervision is needed shouldn't, in my opinion, take more than 15 
minutes. After that, propagating everything to all the machines 
should be fully automatic and not require any kind of human supervision. 
A report should be generated at the end. If something goes wrong, then 
an alert should be sent to someone responsible.


I've never deployed on that scale before, but I did read a bit about how 
other projects deal with this and talked to some people. A staging 
environment which mirrors production would definitely help a lot. I hope 
we'll get there soon.


I know that Facebook has something like an internal bleeding-edge 
version that is used by their employees. It works with the same database 
as the official Facebook, so employees can use it just like they would 
use Facebook any day. This gives them an opportunity to test many things 
manually before they go live for millions of people, without actually 
wasting time on boring testing, just by using the product. I don't 
know the details, but I think they sync production with the bleeding 
edge once or twice a week.


--
Juliusz


Re: [Wikitech-l] DevOps/Continuous Deployment discussion?

2013-02-20 Thread Juliusz Gonera
Sorry for digging up an old thread, but today I also started wondering 
if there's a way of making our deployments simpler and faster.


I'm not a big fan of special, highly orchestrated events where the whole 
team gathers, waits, and then looks for regressions after deploying 
dozens of commits at the same time. I've been reading a bit, and it's a 
fact that some projects do Continuous Deployment and it works for them:

http://radar.oreilly.com/2009/03/continuous-deployment-5-eas.html
http://timothyfitz.com/2009/02/10/continuous-deployment-at-imvu-doing-the-impossible-fifty-times-a-day/
http://www.slideshare.net/mikebrittain/mbrittain-continuous-deploymentalm3public

Is there any interest at WMF in taking this path at some point?

--
Juliusz


On 12/26/2012 09:31 AM, Chris McMahon wrote:

Hi,

A number of people I know of have ideas and aspirations pertaining to a
DevOps-style deployment process, a.k.a Continuous Deployment.  In recent
times a number of pieces of such a system have become functional:  Zuul,
Jenkins enhancements for tests, automated acceptance tests, etc.

But looking at mediawiki.org I don't see any sort of central discussion of
overall approach/design/process for DevOps/Continuous Deployment.

Is it time to start such a discussion?  Or is this premature?

-Chris

Re: [Wikitech-l] On code review

2013-01-22 Thread Juliusz Gonera
I liked the post, but I'm not sure what exactly we should change in our 
code reviews. Could you explain?



On 01/21/2013 01:40 PM, Ori Livneh wrote:

There's a useful blog post on code review at Mozilla by Mozilla developer David 
Humphrey on his blog: http://vocamus.net/dave/?p=1569.

I like his breakdown of different types of code reviews. It seems like at Mozilla there 
is a lot of room for the patch submitter to indicate to reviewers what sort of review is 
needed for a particular patch, ranging from requests for manual testing and careful 
scrutiny all the way to what Humphrey calls catechism reviews, in which the 
committer uses a review request to announce her intent and solicit a basic sanity-check.

Unofficially such reviews do not exist at the WMF because we are all infallibly 
meticulous and diligent about testing every branch of every code change. But 
unofficially they do, of course. It'd be nice if such reviews were formally 
sanctioned (with whatever qualifications). I'm interested to hear other 
people's thoughts.

--
Ori Livneh





[Wikitech-l] Short script to check core versions deployed

2013-01-17 Thread Juliusz Gonera

Reposting from mobile-tech:

I wrote a short Ruby script to show which versions of MediaWiki the 
different projects are running on:

https://gist.github.com/b5a97d4dc34f5fc56304

It shows the number of languages running a given project/version.

It might make it easier to figure out whether a patch we need in core is 
already deployed. I guess for now we're interested in wiki and wikivoyage.


Sample output:
$ ruby wikiver.rb
---
all:
  php-1.21wmf7: 863
  php-1.21wmf8: 4
wikibooks:
  php-1.21wmf7: 121
wiki:
  php-1.21wmf7: 332
  php-1.21wmf8: 4
wiktionary:
  php-1.21wmf7: 171
wikiquote:
  php-1.21wmf7: 88
wikisource:
  php-1.21wmf7: 64
wikimedia:
  php-1.21wmf7: 30
wikinews:
  php-1.21wmf7: 33
wikiversity:
  php-1.21wmf7: 15
wikivoyage:
  php-1.21wmf7: 9



[Wikitech-l] Full name in Gerrit

2013-01-03 Thread Juliusz Gonera
Can I change my full name in Gerrit somehow? I don't like the fact that 
when I merge something in Gerrit my commits have "JGonera" as the author 
instead of "Juliusz Gonera".


Juliusz



Re: [Wikitech-l] Full name in Gerrit

2013-01-03 Thread Juliusz Gonera

On 01/03/2013 11:52 AM, Chad wrote:

On Thu, Jan 3, 2013 at 2:30 PM, Matma Rex matma@gmail.com wrote:

There's a bug for everything, and they're all waiting...

https://bugzilla.wikimedia.org/show_bug.cgi?id=40061

(It should only take half a day to make this happen, but apparently nobody
took any action since September last year.)


We're still not on 2.5.


Thanks, it's not that important. I was just wondering if there's an easy 
way to fix it.


Juliusz



Re: [Wikitech-l] Gerrit code review guidelines

2012-12-31 Thread Juliusz Gonera
Thanks, that clears up my doubts. I'll just assume that for now, as long 
as we don't have automatic merging in mobile, I should just merge myself 
instead of giving +2.


This information should definitely be included in one of the documents 
Dan is writing (https://bugzilla.wikimedia.org/show_bug.cgi?id=36437).



On 12/31/2012 10:39 AM, Krinkle wrote:

On Dec 27, 2012, at 7:18 PM, Juliusz Gonera jgon...@wikimedia.org wrote:


Hi,

I'm a bit confused when it comes to various options I have in gerrit and it 
seems the docs are not up to date on that.

* What is the difference between Verified and Code Review? When would I put +1 
in one of them but -1 in the other?
* What is the difference between +1 and +2, especially in Verified?
* Why do we even have +2? +1 means that someone else must approve. What does +2 
mean? No one else has to approve but I'm not merging anyway, why?

It seems the docs (http://www.mediawiki.org/wiki/Code_review_guide) do not 
explain it.

Juliusz

== Verified ==

Verified is for linting and executing unit tests.
* Verified +1 means "Checked for lint issues"
* Verified +2 means "Tested by executing the Jenkins tests"

If you are a human, do not use Verified (most people don't have user 
permissions to set this, anyway).

If you tested the change (either by checking it out locally and using the 
wiki and/or by running phpunit locally) and you have the ability to set Verified, still 
do not set it.

It does not mean the same thing. Because the Jenkins tests are much more 
elaborate than the testing you may do locally (e.g. different database 
backends, and soon also different browsers and test suites: jshint, phplint, 
phpunit, qunit, mysql, sqlite etc.).

We might rename this field to "Automated Testing" for clarification ("Verified" 
is a generic name that is the default in Gerrit) and (where not already) it will eventually be 
restricted to bots only[2].

== Code Review ==

Code Review is for human review of the code.

A positive value means you believe it is perfect and may be merged as-is (under 
the condition that the Jenkins test will pass[1]). If you have merge rights 
then you'd give +2. Others would give +1.

A negative value means there are issues.

[1] So if you set CR+2 you're saying "The code looks great, let Jenkins run it, and 
if it passes, merge it."
[2] Except for wmf-deployment probably, to be able to override it if Jenkins is 
having issues and certain commits need emergency deployment.




Re: [Wikitech-l] Gerrit code review guidelines

2012-12-28 Thread Juliusz Gonera

On 12/27/2012 10:31 AM, Matthew Flaschen wrote:


* What is the difference between +1 and +2, especially in Verified?

I think just how certain you are.


I still don't get it. I either think the code is good and should be 
merged or it's not good enough and shouldn't be merged. I don't see any 
situations in between, but maybe it's just me ;)



+2 means it's ready to merge.  In core, this will cause unit tests to
run, and if they pass, it will automatically merge.

I don't know of any reason (in any code) to vote CR +2 if you don't
think it's ready to merge.


This still seems redundant. If Jenkins runs the tests only when I give +2, 
but I am supposed to give +2 only if I have already run the tests myself 
manually, then what's the point?


And I guess we don't have that auto-merging behavior in mobile.

Juliusz



Re: [Wikitech-l] Gerrit code review guidelines

2012-12-28 Thread Juliusz Gonera

On 12/27/2012 10:36 AM, Alex Monk wrote:

Only some people (project owners, gerrit admins, some WMF staff, etc.) can
give CodeReview+2 (approved), whereas everyone can give CodeReview+1. Only
people able to approve can mess with Verified I think...


But should we mess with Verified? Or should we just leave it to Jenkins?

Juliusz



[Wikitech-l] Gerrit code review guidelines

2012-12-27 Thread Juliusz Gonera

Hi,

I'm a bit confused when it comes to various options I have in gerrit and 
it seems the docs are not up to date on that.


* What is the difference between Verified and Code Review? When would I 
put +1 in one of them but -1 in the other?

* What is the difference between +1 and +2, especially in Verified?
* Why do we even have +2? +1 means that someone else must approve. What 
does +2 mean? "No one else has to approve, but I'm not merging anyway"? Why?


It seems the docs (http://www.mediawiki.org/wiki/Code_review_guide) do 
not explain it.


Juliusz



[Wikitech-l] Webinar: Getting the Most Out of Jenkins and, Selenium, with CloudBees and Sauce Labs

2012-12-06 Thread Juliusz Gonera

Reposting to wikitech-l, in case someone is interested:

http://www.cloudbees.com/webinars/getting-most-out-selenium-and-jenkins-cloudbees-sauce.cb


Re: [Wikitech-l] [Wmfall] Welcome Juliusz Gonera as Software Developer to the Mobile Team!

2012-11-20 Thread Juliusz Gonera
Thank you all for a very warm welcome. Hope to meet everyone in person soon!


On Mon, Nov 19, 2012 at 1:59 PM, Tomasz Finc tf...@wikimedia.org wrote:

 I am pleased to announce that Juliusz Gonera joins WMF today as a
 Software Developer on the Mobile team.

 Juliusz has worked at the University of Virginia, developing software
 for a laboratory that studies the macromolecular structure of
 proteins. Before that he created a system for sending bulk SMS
 messages for a Polish company. Juliusz is a proponent of open source
 and agile methodologies and apart from a few projects of his own [1]
 he contributes to open source software he uses. He has just moved to
 San Francisco and earlier lived in Virginia, Spain and Poland.

 The team would like to welcome him and wish him success.

 [1] - https://github.com/jgonera

 --tomasz
