Re: [Wikitech-l] [Wikimedia-l] Fwd: [Tech/Product] Engineering/Product org structure

2012-11-08 Thread Federico Leva (Nemo)

Sue Gardner, 08/11/2012 07:03:
 I kind of have the sense that people are considering this a done 
deal. [...]


 So to be super-clear: None of this is a done deal at this moment. Lots
 of conversations are happening in various places, and it's all good.
 That's why Erik made the pre-announcement --- to create a window for
 discussion & iteration and further thinking :-)

Thank you for clarifying this. Another thing I found confusing in 
Terry's email is that he called it a «decision» (another terminology 
inconsistency problem? ;-) ).


Howie Fung, 08/11/2012 05:53:

[...]
That's how our project teams are structured.  When it comes to the proposed
organizational structure, Product consists of Product Managers,
Designers, and Analysts (1, 2 and 4) and Engineering consists of engineers
across the different areas Terry describes.  One way to view it is that
Product involves everything outside of writing code for a feature and
developers in Engineering write the code.  It's an oversimplification (e.g.,
analysts write code for analytics work, designers may prototype), but I
think it's directionally useful.

The individuals in Product and Engineering are then matrixed into project
teams like the one described above for Page Curation so project team
structure != organizational structure.


Thank you for this short explanation and its long (useful) premise.
It would seem that the tentative answer to my question, pending more 
insight from Erik when he has time, is more scattered rather than less.


A thing your answer doesn't cover is that not only «project team 
structure != organizational structure» (with Erik's words, «functional 
groupings» != «team groupings»), but also (it seems) project team 
structure != team structure, i.e. not all teams are the same.
By browsing the wonderful 
https://www.mediawiki.org/wiki/Category:WMF_Projects , one can see that 
each project has different ad-hoc teams (specified in the infobox), but 
some teams are more consistent/frequent than others, with the tightest 
groupings being the permanent teams mentioned also on the staff page, 
who mostly move together from one project to another. At the opposite 
end of the spectrum we have free electrons who are not on any team (and 
don't have any day-to-day management?) but move across many projects 
(serving many teams).

It would seem to me that you can't treat everything in the same way?

Steven Walling, 07/11/2012 23:46:
 On Wed, Nov 7, 2012 at 2:16 PM, Platonides platoni...@gmail.com wrote:

 Thanks for your explanation but personally I'm more confused than
 before about the difference between Engineering and Product, also because
 the terminology didn't appear internally consistent. :-)

 I feel like you, Nemo. I am glad of Terry's explanation, but the further
 I read it, the less I felt I understood it. It would benefit from a
 more layman-friendly explanation. Maybe it's just complex for everybody.


 Simplest possible explanation of what Erik is proposing: we need to
 split a large department into two, and add more managers. It's too big
 and too critical for one person to manage.

This is very clear and it's not hard to see that it's needed, but it 
doesn't actually explain the need for a split.
If, for instance, one only needs to make the «functional groupings» more 
equal, so that each C-level has to sign an equal number of holiday 
permissions[1], then instead of inventing boundaries between Engineering 
and Product or other names for almost-fake separations, one could as 
well just keep the two together, call it an area or super-department 
with its VP[2] and then Chief 1 and Chief 2 under it... or whatever.
But the further explanations will help us understand what the aims are 
and how the change is expected to achieve them.


Nemo

[1] Simplifying at most; fake example with probably wrong terminology even.
[2] Speaking of terminology bikeshedding, I never understood that VP of 
Engineering and Product Development were actually /two/ VP roles. 
Previous discussion: 
http://thread.gmane.org/gmane.org.wikimedia.foundation/59115/focus=59147




Re: [Wikitech-l] Data flow from Wikidata to the Wikipedias

2012-11-08 Thread Daniel Kinzler
First off, TL;DR:
(@Tim: did my best to summarize, please correct any misrepresentation.)

* Tim: don't re-parse when sitelinks change.

* Daniel: can be done, but do we really need to optimize for this case? Denny,
can we get better figures on this?

* Daniel: how far do we want to limit the things we make available via parser
functions and/or Lua binding? Could allow more with Lua (faster, and
implementing complex functionality via parser functions is nasty anyway).

* Consensus: we want to coalesce changes before acting on them.

* Tim: we also want to avoid redundant rendering by removing duplicate render
jobs (like multiple re-rendering of the same page) resulting from the changes.

* Tim: large batches (lower frequency) for re-rendering pages that are already
invalidated would allow more dupes to be removed. (Pages would still be rendered
on demand when viewed, but link tables would update later)

* Daniel: sounds good, but perhaps this should be a general feature of the
re-render/linksUpdate job queue, so it's also used when templates get edited.

* Consensus: load items directly from ES (via remote access to the repo's text
table), cache in memcached (see the sketch after this list).

* Tim: also get rid of the local item-to-page mapping, just look each page up on
the repo.

* Daniel: Ok, but then we can't optimize bulk ops involving multiple items.

* Tim: run the polling script from the repo, push to client wiki DBs directly

* Daniel: that's scary, client wikis should keep control of how changes are 
handled.
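
For illustration, a minimal sketch of the consensus item-loading path above
(fetchItemFromRepo() is a made-up stand-in for the remote read of the repo's
text table; $wgMemc and wfMemcKey() are the usual MediaWiki cache plumbing):

function loadItem( $itemId ) {
    global $wgMemc;
    $key = wfMemcKey( 'wikibase-item', $itemId );
    $item = $wgMemc->get( $key );
    if ( $item === false ) {
        // Cache miss: read the item from the repo and cache it for an hour.
        $item = fetchItemFromRepo( $itemId ); // hypothetical remote ES read
        $wgMemc->set( $key, $item, 3600 );
    }
    return $item;
}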


Now, the nitty gritty:

On 08.11.2012 01:51, Tim Starling wrote:
 On 07/11/12 22:56, Daniel Kinzler wrote:
 As far as I can see, we then can get the updated language links before
 the page has been re-parsed, but we still need to re-parse eventually.
 
 Why does it need to be re-parsed eventually?

For the same reason pages need to be re-parsed when templates change: because
links may depend on the data items.

We are currently working on the assumption that *any* aspect of a data item is
accessible via parser functions in the wikitext, and may thus influence any
aspect of that page's parser output.

So, if *anything* about a data item changes, *anything* about the wikipedia page
using it may change too. So that page needs to be re-parsed.

Maybe we'll be able to skip the rendering in some cases, but for normal
property changes, like a new value for the population of a country, all pages
that use the respective data item need to be re-rendered soonish, otherwise the
link tables (especially categories) will get out of whack.

So, let's think about what we *could* optimize:

* I think we could probably disallow access to wikidata sitelinks via parser
functions in wikipedia articles. That would allow us to use an optimized data
flow for changes to sitelinks (aka language links) which does not cause the page
to be re-rendered.

* Maybe we can also avoid re-parsing pages on changes that apply only to
languages that are not used on the respective wiki (let's call them unrelated
translation changes). The tricky bit here is to figure out changes to which
language affect which wiki in the presence of complex language fallback rules
(e.g. nds → de → mul, or even nastier stuff involving circular relations between
language variants).

* Changes to labels, descriptions and aliases of items on wikidata will
*probably* not influence the content of wikipedia pages. We could disallow
access to these aspects of data items to make sure - this would be a shame, but
not terrible. At least not for infoboxes. For automatically generated lists
we'll need the labels at the very least.

* We could keep track of which properties of the data item are actually used on
each page, and then only re-parse if those properties change. That would be
quite a bit of data, and annoying to maintain, but possible. Whether this has a
large impact on the need to re-parse remains to be seen, since it greatly
depends on the infobox templates.
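
To make that concrete, here is a sketch of the check (the wbc_item_usage
table and its fields are invented for illustration; wfGetDB() and select()
are the standard MediaWiki database helpers):

function pageNeedsReparse( $pageId, array $changedProps ) {
    $dbr = wfGetDB( DB_SLAVE );
    // Imaginary tracking table: one row per (page, used property).
    $res = $dbr->select(
        'wbc_item_usage',
        'iu_property',
        array( 'iu_page_id' => $pageId ),
        __METHOD__
    );
    foreach ( $res as $row ) {
        if ( in_array( $row->iu_property, $changedProps ) ) {
            return true; // the page uses a property that just changed
        }
    }
    return false; // no changed property is used on this page
}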

We can come up with increasingly complex rules for skipping rendering, but
except perhaps for the sitelink changes, this seems brittle and confusing. I'd
like to avoid it as much as possible.

 And, when someone
 actually looks at the page, the page does get parsed/rendered right away, and
 the user sees the updated langlinks. So... what do we need the
 pre-parse-update-of-langlinks for? Where and when would they even be used? I
 don't see the point.
 
 For language link updates in particular, you wouldn't have to update
 page_touched, so the page wouldn't have to be re-parsed.

If the languagelinks in the sidebar come from memcached and not the cached
parser output, then yes.

 We could get around this, but even then it would be an optimization for
 language links. But wikidata is soon going to provide data for infoboxes.
 Any aspect of a data item could be used in an {{#if:...}}. So we need to
 re-render the page whenever an item changes.

 Wikidata is somewhere around 61000 physical lines of code now. Surely
 somewhere 

[Wikitech-l] Collaborative code-editing tools (long-distance pair programming)

2012-11-08 Thread Sumana Harihareswara
When you want to pair program with someone far away, what do you use?  I
just read about Collide:

https://lwn.net/Articles/521647/

https://code.google.com/p/collide/

Collide has line numbering, syntax highlighting, auto-completion, and
real-time file tree manipulation but the syntax highlighter doesn't
support PHP yet, just JavaScript, Python, CSS, and HTML.

Do any of you use Cloud9, Brackets, emacs + xhost, or some other
tool/service?  Do you recommend them?  http://etherpad.wmflabs.org/pad/
is all very well and good but it doesn't support syntax highlighting.

-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation



Re: [Wikitech-l] MW coding query: pausing and resuming an edit

2012-11-08 Thread Amelia Ireland
Platonides Platonides at gmail.com writes:

 On 06/11/12 23:35, Amelia Ireland wrote:
  Hello all,
  
  Is it possible to pause and resume a Mediawiki edit?
  
  To explain, I've written a MW extension that accesses an external
  database; this database requires OAuth authentication [1.0, pre-OAuth
  wars version], which is a three-step process requiring the user to be
  redirected to an external site to allow the extension access to the
  external db. If the MW extension already has an access token for the
  extDb, all is well. However, if there isn't a token, there is a
  problem. This is a tag extension, and is triggered by finding a
  certain XML tag in the wiki page, which typically occurs in the
  'preview' or 'submit' of an edit, e.g.
  http://server.com/wiki/index.php?title=Bibliography&action=submit (the
  parser hook is ParserFirstCallInit). The callback URL constructed by
  the OAuth code returns you to the page you were editing, but in its
  pre-edit state: i.e. you lose all your edits.
  
  How can I resume the edit and not lose my edit data?
  
  Thanks for any help, clues, or workarounds!
 
 Store the submitted data in the session, then in the callback add a
 parameter that makes your extension load the data from the session.

Would you be able to provide a few pointers as to how to accomplish this? I am
not familiar with the MW core code, having only really done extension hacking. I
guess I need to be looking at the User class and methods like loadFromSession().
If you could direct me to some existing code I could base mine on, that would be
really helpful.
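
For the record, a rough sketch of the round trip Platonides describes; the
session key, the resume flag and the function names are invented, while
wfSetupSession(), WebRequest and the EditPage::showEditForm:initial hook are
existing MediaWiki pieces:

// Before redirecting to the OAuth provider, stash the pending edit.
function myExtStashEdit( WebRequest $request ) {
    wfSetupSession(); // make sure a session exists
    $_SESSION['myext-pending-edit'] = array(
        'title' => $request->getVal( 'title' ),
        'text'  => $request->getVal( 'wpTextbox1' ), // the edit box contents
    );
}

// In the OAuth callback, send the user back to the edit form with a flag.
function myExtCallbackRedirect( OutputPage $out, Title $title ) {
    $out->redirect( $title->getLocalURL( 'action=edit&myext-resume=1' ) );
}

// When the edit form is rebuilt and the flag is present, restore the text.
function myExtRestoreEdit( EditPage $editPage ) {
    global $wgRequest;
    if ( $wgRequest->getBool( 'myext-resume' )
        && isset( $_SESSION['myext-pending-edit'] )
    ) {
        $editPage->textbox1 = $_SESSION['myext-pending-edit']['text'];
        unset( $_SESSION['myext-pending-edit'] );
    }
    return true;
}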

 It probably makes more sense to give your wiki server credentials to
 read the external db, though.

I don't have any control over this, unfortunately -- the external DB is run by a
company and I have been unsuccessful in even finding out from them how long the
DB access tokens are valid for.

Thanks,
Amelia.




Re: [Wikitech-l] Collaborative code-editing tools (long-distance pair programming)

2012-11-08 Thread Mark Holmquist

Do any of you use Cloud9, Brackets, emacs + xhost, or some other
tool/service?  Do you recommend them?  http://etherpad.wmflabs.org/pad/
is all very well and good but it doesn't support syntax highlighting.


FWIW there is a way to add in syntax highlighting, and I could probably 
create a new instance for that. There was also chatter on the Etherpad 
channel yesterday about writing plugins for compiling and running 
programs on the backend of the server.


Let me know if there's interest.

--
Mark Holmquist
Software Engineer, Wikimedia Foundation
mtrac...@member.fsf.org
http://marktraceur.info



Re: [Wikitech-l] Collaborative code-editing tools (long-distance pair programming)

2012-11-08 Thread Mark Holmquist

FWIW there is a way to add in syntax highlighting, and I could probably
create a new instance for that. There was also chatter on the Etherpad
channel yesterday about writing plugins for compiling and running
programs on the backend of the server.


Additionally, I suppose, we could write a plugin for enabling a grouping 
of pads into projects, which would make it easier to have multiple files 
open at once.


I think the major problem is that any files on the etherpad server will 
need to be downloaded or copy/pasted before you can actually run them, 
which may or may not be ideal. But again, there may be a solution in the 
plugin API.


(backend of the server - sorry, I just woke up)

--
Mark Holmquist
Software Engineer, Wikimedia Foundation
mtrac...@member.fsf.org
http://marktraceur.info



Re: [Wikitech-l] Very low APC user cache utilization at MediaWiki 1.19.2 host

2012-11-08 Thread Antoine Musso
On 07/11/12 13:07, Dmitriy Sintsov wrote:
 
 At one wiki host there is PHP 5.3.3-7+squeeze14 with APC 3.1.3p1 (both are
 quite old, however these are provided by Debian and I am not a regular
 admin of the server).
 The wiki host has about 10-20k visits per day and about 3500 pages.
 It's not the smallest wiki.

Hello Dmitriy,

Please repost your original message as a NEW message. You replied to
an unrelated message named "Engineering/Product org structure", which
means a lot of people will not notice your message since it is hidden in
that thread.

Your message copy:

http://permalink.gmane.org/gmane.science.linguistics.wikipedia.technical/65014

Thanks!

-- 
Antoine hashar Musso




Re: [Wikitech-l] Collaborative code-editing tools (long-distance pair programming)

2012-11-08 Thread Dan Andreescu
+1 on getting Collide up and running.  It's open source and looks like it
already does project management and syntax highlighting.


On Thu, Nov 8, 2012 at 10:45 AM, Mark Holmquist mtrac...@member.fsf.orgwrote:

 FWIW there is a way to add in syntax highlighting, and I could probably
 create a new instance for that. There was also chatter on the Etherpad
 channel yesterday about writing plugins for compiling and running
 programs on the backend of the server.


 Additionally, I suppose, we could write a plugin for enabling a grouping
 of pads into projects, which would make it easier to have multiple files
 open at once.

 I think the major problem is that any files on the etherpad server will
 need to be downloaded or copy/pasted before you can actually run them,
 which may or may not be ideal. But again, there may be a solution in the
 plugin API.

 (backend of the server - sorry, I just woke up)


 --
 Mark Holmquist
 Software Engineer, Wikimedia Foundation
 mtrac...@member.fsf.org
 http://marktraceur.info




Re: [Wikitech-l] Collaborative code-editing tools (long-distance pair programming)

2012-11-08 Thread Mark Holmquist

On 12-11-08 08:32 AM, Dan Andreescu wrote:

+1 on getting collide up and running.  It's open source and looks like it
already does project management and syntax highlighting


From https://code.google.com/p/collide/:


Requires
Client: recent Chrome or Safari
Server: Java 7 JRE
Build: Java 7 JDK, Ant 1.8.4+; all other build dependencies are bundled in.


It's possible that the first requirement is really more of an official 
recommendation from Google, but it's nasty that they recommend two 
non-free ones.


And the server requirements are also (at least partly) non-free, IIRC. 
I'm very willing to be proven wrong on that front. We may be able to use 
openjdk-7-* as drop-in replacements, but I don't know how nicely they'll 
play together.


*This has been a message from your friendly neighborhood FSF member*

Then again, it does seem like a lot less work to run Collide, if we can 
do it with Chromium and OpenJDK.


--
Mark Holmquist
Software Engineer, Wikimedia Foundation
mtrac...@member.fsf.org
http://marktraceur.info



Re: [Wikitech-l] Collaborative code-editing tools (long-distance pair programming)

2012-11-08 Thread Sumana Harihareswara
On 11/08/2012 09:43 AM, Sumana Harihareswara wrote:
 When you want to pair program with someone far away, what do you use?  I
 just read about Collide:
 
 https://lwn.net/Articles/521647/
 
 https://code.google.com/p/collide/
 
 Collide has line numbering, syntax highlighting, auto-completion, and
 real-time file tree manipulation but the syntax highlighter doesn't
 support PHP yet, just JavaScript, Python, CSS, and HTML.
 
 Do any of you use Cloud9, Brackets, emacs + xhost, or some other
 tool/service?  Do you recommend them?  http://etherpad.wmflabs.org/pad/
 is all very well and good but it doesn't support syntax highlighting.

A quick note that another engineer mentioned to me an experimental
plugin for Sublime http://www.sublimetext.com/ that does remote pair
programming.  Sublime is released under a "You can try it forever, but
please buy a license code if you like it." license and runs on OS X,
Windows, and Linux.

Tutorials:
https://tutsplus.com/course/improve-workflow-in-sublime-text-2/.
They'll get you started (especially with plugins).
https://tutsplus.com/lesson/multiple-cursors-and-incremental-search/
is especially persuasive, I'm told.

-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation



[Wikitech-l] [IRC] wm-bot maintenance

2012-11-08 Thread Petr Bena
Hi folks,

I know some of you are tired of these wm-bot messages :) but I didn't send
one regarding today's maintenance and now I feel sorry for that.

I did a big round of maintenance today, replacing the old monolithic core with
a new lightweight core with features bundled as dynamic modules (which makes it
way easier to patch and maintain), but unfortunately, stuff didn't go as
well as I expected. This resulted in an outage of about 30 minutes,
approximately from 16:20 GMT till 17:00 GMT; that means all logs from all
channels that are being publicly logged may be missing during that time.

It's no problem for me to insert these logs by hand if you need, just send
me the missing part with GMT time and I will do that.

Also, because stuff is still not fully recovered, you can expect some
additional issues, especially in the wmf wikis' RC feed relay to freenode (note
the WMF RC feed at irc.wikimedia.org is fine, I am talking about the relay to
freenode only, provided by wm-bot) and in the html dumps, statistics and
exports (these are provided by 2 modules, which both right now produce a lot
of errors in the log).

I apologize for all inconvenience caused by this unannounced maintenance,
and if you don't mind, I will notify you all on this list next time, if I am
about to do something large that could make the bot unavailable for a longer
time. (In fact I was hoping for a quick replacement with no problems, but
that never happens :D)

Thanks, Petr


[Wikitech-l] Work flow for bugfixing

2012-11-08 Thread Stephan Gambke
Hi,

is there a recommended work flow for bugfixing?

Right now what I do is submit a patch to gerrit and, if I remember, set
some tag in bugzilla. At some point somebody approves and merges the
patch. Then, if I remember, I set the bug to resolved/fixed in bugzilla.

There is a bit too much remembering involved for my taste. It's easy to
forget to close the bug in bugzilla, especially if the patch has been
lying around for some time before being merged.

Would it be possible/sensible to automatically close a bug when the
patch is merged? Or did I miss something?

Cheers,
Stephan



Re: [Wikitech-l] Work flow for bugfixing

2012-11-08 Thread Chad
On Thu, Nov 8, 2012 at 12:54 PM, Stephan Gambke s7ep...@gmail.com wrote:
 Hi,

 is there a recommended work flow for bugfixing?

 Right now what I do is submit a patch to gerrit and, if I remember, set
 some tag in bugzilla. At some point somebody approves and merges the
 patch. Then, if I remember, I set the bug to resolved/fixed in bugzilla.

 There is a bit too much remembering involved for my taste. It's easy to
 forget to close the bug in bugzilla, especially if the patch has been
 lying around for some time before being merged.

 Would it be possible/sensible to automatically close a bug when the
 patch is merged? Or did I miss something?


In theory, yes. Someone already started writing a plugin to do the same thing
for Jira[0]. Might be a good starting place for a Bugzilla plugin (and RT?).

-Chad

[0] https://gerrit.googlesource.com/plugins/hooks-jira/



[Wikitech-l] AJAX page patrolling

2012-11-08 Thread hoo
Hi,

I've implemented AJAX patrolling for the MediaWiki core
( https://gerrit.wikimedia.org/r/26440 ) and I would like to gather some
feedback and hopefully get someone to take a look at the code, in the
hope of getting it merged soon (as the change has been in gerrit for more
than a month now).

It acts like this:
When clicking on a patrol link, a spinning animation will show up between
the brackets the patrol link was shown in, and the patrol request will be
sent to the API. In case it succeeds, the patrol links on the page will
disappear (just like on a normal page view) and the user will get a
notification. In case an error occurs, the spinner will be removed and
the old patrol link gets shown again, so that the user can give it
another try (on top of a notification, of course).

Thank you in advance




Re: [Wikitech-l] Work flow for bugfixing

2012-11-08 Thread Daniel Friesen
On Thu, 08 Nov 2012 09:54:43 -0800, Stephan Gambke s7ep...@gmail.com  
wrote:



Hi,

is there a recommended work flow for bugfixing?

Right now what I do is submit a patch to gerrit and, if I remember, set
some tag in bugzilla. At some point somebody approves and merges the
patch. Then, if I remember, I set the bug to resolved/fixed in bugzilla.

There is a bit too much remembering involved for my taste. It's easy to
forget to close the bug in bugzilla, especially if the patch has been
lying around for some time before being merged.

Would it be possible/sensible to automatically close a bug when the
patch is merged? Or did I miss something?

Cheers,
Stephan


That would require two things:
A) Far more integration between Gerrit and Bugzilla than we currently have.
B) An assumption that every commit that mentions a bug actually fixes it.

And I really don't like the idea of B. I can easily see people mentioning  
bugs that are related to a commit in the commit message but not directly  
fixed by it.


--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]




Re: [Wikitech-l] Work flow for bugfixing

2012-11-08 Thread Andre Klapper
On Thu, 2012-11-08 at 13:00 -0500, Chad wrote:
  Would it be possible/sensible to automatically close a bug when the
  patch is merged? Or did I miss something?
 
 
 In theory, yes. Someone already started writing a plugin to do the same thing
 for Jira[0]. Might be a good starting place for a Bugzilla plugin (and RT?).

http://www.theoldmonk.net/gitzilla/ exists but I cannot offer any field
reports.
The current workflow is cumbersome (the patch-in-gerrit keyword which should
express a "patch to review" bug status).
Automatic closing of reports as RESOLVED FIXED via a bot from Gerrit/Git
could also have its corner cases (e.g. merging several patches across
components, supporting reverts of Git merges by reopening the Bugzilla
ticket?, etc.). Would require some thought and testing.

andre
-- 
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/




Re: [Wikitech-l] Work flow for bugfixing

2012-11-08 Thread Amir E. Aharoni
2012/11/8 Stephan Gambke s7ep...@gmail.com:
 Hi,

 is there a recommended work flow for bugfixing?

 Right now what I do is submit a patch to gerrit and, if I remember, set
 some tag in bugzilla. At some point somebody approves and merges the
 patch. Then, if I remember, I set the bug to resolved/fixed in bugzilla.

 There is a bit too much remembering involved for my taste. It's easy to
 forget to close the bug in bugzilla, especially if the patch has been
 lying around for some time before being merged.

This is not a big problem. If you forget to mark the bug as fixed,
somebody will stumble upon it sooner or later, try to reproduce it,
notice that it's fixed and mark it accordingly.
--
Amir



Re: [Wikitech-l] Work flow for bugfixing

2012-11-08 Thread Mark Holmquist

And I really don't like the idea of B. I can easily see people
mentioning bugs that are related to a commit in the commit message but
not directly fixed by it.


Maybe we should start a new branch per-bug instead, and merge the branch 
when the bug is fixed? That might help with this issue.


--
Mark Holmquist
Software Engineer, Wikimedia Foundation
mtrac...@member.fsf.org
http://marktraceur.info
* Sent from Ubuntu GNU/Linux *



Re: [Wikitech-l] Work flow for bugfixing

2012-11-08 Thread Chad
On Thu, Nov 8, 2012 at 1:57 PM, Mark Holmquist mtrac...@member.fsf.org wrote:
 And I really don't like the idea of B. I can easily see people
 mentioning bugs that are related to a commit in the commit message but
 not directly fixed by it.


 Maybe we should start a new branch per-bug instead, and merge the branch
 when the bug is fixed? That might help with this issue.


I don't really like that idea either. How about instead of auto-closing, we
at least have Gerrit tell BZ a patch was committed/submitted? That would
save the "I've put a patch in url" step, and would prompt people on the
CC list to possibly close when Gerrit says "Patch foo was submitted".
Gerrit could probably tweak the keywords as well.

-Chad



Re: [Wikitech-l] Work flow for bugfixing

2012-11-08 Thread Bartosz Dziewoński
2012/11/8 Mark Holmquist mtrac...@member.fsf.org:
 And I really don't like the idea of B. I can easily see people
 mentioning bugs that are related to a commit in the commit message but
 not directly fixed by it.


 Maybe we should start a new branch per-bug instead, and merge the branch
 when the bug is fixed? That might help with this issue.

What about changes like https://gerrit.wikimedia.org/r/#/c/29422/,
which mentions bug 1, but obviously doesn't entirely fix it?

-- Matma Rex



Re: [Wikitech-l] Work flow for bugfixing

2012-11-08 Thread Mark Holmquist

What about changes like https://gerrit.wikimedia.org/r/#/c/29422/,
which mentions bug 1, but obviously doesn't entirely fix it?


It wouldn't be put into the branch, or maybe it would be put into the 
branch but the branch would never get closed. But I like Chad's idea 
better, currently.


--
Mark Holmquist
Software Engineer, Wikimedia Foundation
mtrac...@member.fsf.org
http://marktraceur.info
* Sent from Ubuntu GNU/Linux *



Re: [Wikitech-l] Collaborative code-editing tools (long-distance pair programming)

2012-11-08 Thread Mark Holmquist

Then again, it does seem like a lot less work to run Collide, if we can
do it with Chromium and OpenJDK.


Update: I tried running Collide on my machine. It took some hacking to 
get through the Ant build process, and finally I came to a point where 
the README said "run ./bin/deploy/collide" and I said "there's no 
bin/deploy directory" and the README stood silent.


I have an email out to the list, but I'm not sure they'll be 
super-responsive.


Cloud 9 may be a better option... :)

--
Mark Holmquist
Software Engineer, Wikimedia Foundation
mtrac...@member.fsf.org
http://marktraceur.info
* Sent from Ubuntu GNU/Linux *



[Wikitech-l] Open Tech Chat in an hour - Thursday, November 8 at 12:30pm PST (20:30 UTC)

2012-11-08 Thread Rob Lanphier
Hi everyone!

I'm really sorry for the late notice on this.  I could have sworn I
sent this on Thursday.

We would like to once again have our Open Tech Chat this Thursday,
November 8 at 12:30pm PST (20:30 UTC).  This week, we have a guest:
Nils Adermann, development lead for phpBB and a primary contributor on
Composer[1], a dependency management library we've discussed before on
wikitech-l[2].  Assuming his hotel wifi holds up (he's at a conference
this week), Nils will be available to discuss Composer at this week’s
tech chat.

If there are other topics you’d like for us to talk about, please add
them to the list:
https://www.mediawiki.org/wiki/Meetings/2012-11-08

Details about Google Hangouts and IRC channels and such are at the URL
above.  Looking forward to seeing you there!

Rob
p.s. In case you were wondering why this isn't coming from Erik, we're
going to be rotating the emcee duty around, and this week, it's my
turn.

[1] Composer: http://getcomposer.org/
[2] Discussion about Composer on wikitech-l:
http://www.gossamer-threads.com/lists/wiki/wikitech/307466



Re: [Wikitech-l] Work flow for bugfixing

2012-11-08 Thread Stephan Gambke
On 11/08/2012 07:43 PM, Daniel Friesen wrote:
 Would it be possible/sensible to automatically close a bug when the
 patch is merged? Or did I miss something?
 
 That would require two things:
 A) Far more integration between Gerrit and Bugzilla than we currently have.
 B) An assumption that every commit that mentions a bug actually fixes it.
 
 And I really don't like the idea of B. I can easily see people
 mentioning bugs that are related to a commit in the commit message but
 not directly fixed by it.

I do not know enough about Gerrit and how hard it would be to implement
this, but B could be solved by having some popup that asks the person
merging a patch if the patch fixes the bug mentioned in the commit message.

Cheers,
Stephan



Re: [Wikitech-l] Work flow for bugfixing

2012-11-08 Thread Tim Landscheidt
Daniel Friesen dan...@nadir-seen-fire.com wrote:

 is there a recommended work flow for bugfixing?

 Right now what I do is submit a patch to gerrit and, if I remember, set
 some tag in bugzilla. At some point somebody approves and merges the
 patch. Then, if I remember, I set the bug to resolved/fixed in bugzilla.

 There is a bit too much remembering involved for my taste. It's easy to
 forget to close the bug in bugzilla, especially if the patch has been
 lying around for some time before being merged.

 Would it be possible/sensible to automatically close a bug when the
 patch is merged? Or did I miss something?

 That would require two things:
 A) Far more integration between Gerrit and Bugzilla than we currently have.
 B) An assumption that every commit that mentions a bug actually fixes it.

 And I really don't like the idea of B. I can easily see
 people mentioning bugs that are related to a commit in the
 commit message but not directly  fixed by it.

Then why did you invent B only to rail against it?  Just use
a reasonable pattern, e.g. "This fixes bug #(\d+)\.".
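
A sketch of what that could look like in a change-merged hook (nothing like
this exists yet; closeBugzillaBug() is a hypothetical helper):

function bugsFixedByCommit( $commitMessage ) {
    preg_match_all( '/This fixes bug #(\d+)\./i', $commitMessage, $m );
    return array_map( 'intval', $m[1] ); // only bugs explicitly marked fixed
}

$commitMessage = $argv[1]; // however Gerrit hands the message to the hook
foreach ( bugsFixedByCommit( $commitMessage ) as $bugId ) {
    closeBugzillaBug( $bugId ); // hypothetical: mark RESOLVED FIXED
}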

Tim




Re: [Wikitech-l] Work flow for bugfixing

2012-11-08 Thread Tim Landscheidt
Stephan Gambke s7ep...@gmail.com wrote:

 is there a recommended work flow for bugfixing?

 Right now what I do is submit a patch to gerrit and, if I remember, set
 some tag in bugzilla. At some point somebody approves and merges the
 patch. Then, if I remember, I set the bug to resolved/fixed in bugzilla.

 There is a bit too much remembering involved for my taste. It's easy to
 forget to close the bug in bugzilla, especially if the patch has been
 lying around for some time before being merged.

 Would it be possible/sensible to automatically close a bug when the
 patch is merged? Or did I miss something?

Bug #17322.

Tim




Re: [Wikitech-l] Work flow for bugfixing

2012-11-08 Thread Chris Steipp
 And I really don't like the idea of B. I can easily see
 people mentioning bugs that are related to a commit in the
 commit message but not directly  fixed by it.

 Then why did you invent B only to rail against it?  Just use
 a reasonable pattern, e. g. This fixes bug #(\d+)\.


At my last job, we used 2 different keywords to either associate the
patch with the bug, or close the bug. Maybe something like that would
work?



[Wikitech-l] MW 1.20 backwards compatibility in extensions

2012-11-08 Thread Tim Starling
All extension branches were removed during the migration to Git. Very
few extensions have branches for MW core major version support.
There's no longer a simple way to branch all extensions when a core
release is updated, and nobody has volunteered to write a script.

So we're back to the situation we had in MW 1.9 and earlier, where
it's usually not possible to run any actively maintained extension
against an MW core that's not the current trunk.

Given this, I think code reviewers should insist on backwards
compatibility with MW 1.20 for commits to the master branch of
extensions that are commonly used outside Wikimedia, at least until
the release management issue is solved.

-- Tim Starling




Re: [Wikitech-l] jQuery 1.9 will remove $.browser (deprecated since jQuery 1.3 - January 2009)

2012-11-08 Thread Jérémie Roquet
Hi Timo,

2012/11/7 Krinkle krinklem...@gmail.com:
 Just a reminder that jQuery will (after almost 4 years of deprecation!) drop
 $.browser [1] in jQuery 1.9.

Thanks for the reminder: it was still used in the MediaWiki:Common.js
of the French Wikipedia.
What frightens me a bit is that I was unable to find it using the
internal MediaWiki search engine; I had to search by hand.

Best regards,

-- 
Jérémie


Re: [Wikitech-l] MW 1.20 backwards compatibility in extensions

2012-11-08 Thread Platonides
It's easy to make a script which creates such a branch for each extension,
but I'm not convinced that's the right thing to do.

Also, did we make Extension:Distributor properly support branches on git
extensions?

In an ideal world we would have perfect tests for all extensions, and
jenkins could automatically detect when a change makes it incompatible
with an old (but supported) MediaWiki.

Maybe a theme for a hackathon / Google programs / new volunteers.




[Wikitech-l] Circumvention Tech Summit in Tunis, Tunisia, Nov. 26-28

2012-11-08 Thread Sumana Harihareswara
OpenITP and the Information Security Coalition (ISC) will be hosting
the second annual Circumvention Tech Summit from November 26-28 in
Tunis, Tunisia.  http://openitp.org/?q=cts_tunis_nov_2012

If you want to attend to help people get around internet censorship &
surveillance in reading and editing Wikimedia content, the Wikimedia
Foundation may be able to reimburse your travel expenses:
https://meta.wikimedia.org/wiki/Participation:Support

-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation



Re: [Wikitech-l] MW 1.20 backwards compatibility in extensions

2012-11-08 Thread Tim Starling
On 09/11/12 09:43, Platonides wrote:
 It's easy to make a script which creates such branch for each extension,
 but I'm not convinced that's the right thing to do.
 
 Also, did we make Extension:Distributor properly support branches on git
 extensions?

No.

https://bugzilla.wikimedia.org/show_bug.cgi?id=37946

 In an ideal world we would have perfect tests for all extensions, and
 jenkins could automatically detect when a change makes it incompatible
 with an old (but supported) MediaWiki.

Yeah, right.

-- Tim Starling




Re: [Wikitech-l] MW 1.20 backwards compatibility in extensions

2012-11-08 Thread Daniel Friesen
On Thu, 08 Nov 2012 14:08:53 -0800, Tim Starling tstarl...@wikimedia.org  
wrote:



All extension branches were removed during the migration to Git. Very
few extensions have branches for MW core major version support.
There's no longer a simple way to branch all extensions when a core
release is updated, and nobody has volunteered to write a script.

So we're back to the situation we had in MW 1.9 and earlier, where
it's usually not possible to run any actively maintained extension
against an MW core that's not the current trunk.

Given this, I think code reviewers should insist on backwards
compatibility with MW 1.20 for commits to the master branch of
extensions that are commonly used outside Wikimedia, at least until
the release management issue is solved.

-- Tim Starling


I've always been in favor of the trunk/master of an extension retaining  
compatibility with the latest stable version of MediaWiki until our next  
release (with brand new extensions designed around new features in alpha  
being an exception).


However our LTS support for 1.19 is going to make this more of an issue.

--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]




Re: [Wikitech-l] MW 1.20 backwards compatibility in extensions

2012-11-08 Thread Chad
On Thu, Nov 8, 2012 at 5:08 PM, Tim Starling tstarl...@wikimedia.org wrote:
 All extension branches were removed during the migration to Git. Very
 few extensions have branches for MW core major version support.
 There's no longer a simple way to branch all extensions when a core
 release is updated, and nobody has volunteered to write a script.


Such a script is now done:

https://gerrit.wikimedia.org/r/#/c/32470/

-Chad



Re: [Wikitech-l] Work flow for bugfixing

2012-11-08 Thread Platonides
On 08/11/12 20:00, Chad wrote:
 I don't really like that idea either. How about instead of auto-closing, we
 at least have Gerrit tell BZ a patch was committed/submitted? That would
 save the "I've put a patch in url" step, and would prompt people on the
 CC list to possibly close when Gerrit says "Patch foo was submitted".
 Gerrit could probably tweak the keywords as well.
 
 -Chad

It's the bit I like most.
Although the problem is not notifying bugzilla of the new patch, but the
merge. Maybe the bot should fire on merge, and if there was a bug
mentioned in the merged patch and if in that bug one of the last
comments begins with "Fixed in " + a URL to the merged change,
automatically leave a message "Change X merged by Foo" and close the bug
as RESOLVED FIXED.

Maybe we need a Waiting_merge status in bugzilla.




Re: [Wikitech-l] Work flow for bugfixing

2012-11-08 Thread bawolff

 Maybe we need a Waiting_merge status in bugzilla.


I would like that. I find the patch-in-gerrit keyword very easy to
miss, and really "patch in gerrit" and "open" are two very different
stages of a bug's lifecycle.

-bawolff



Re: [Wikitech-l] jQuery 1.9 will remove $.browser (deprecated since jQuery 1.3 - January 2009)

2012-11-08 Thread Tim Starling
On 07/11/12 13:09, Krinkle wrote:
 In most (if not all) cases of people using $.browser it is because they want
 different behaviour for browsers that don't support a certain something. 
 Please
 take a minute to look at the code and find out what it is you are 
 special-casing
 for that apparently doesn't work in a certain browser.

In OggHandler we used browser detection for several things. It is not
affected by this change because it never used jQuery, but I would
still be interested to know how such code could possibly be migrated
to feature detection.

For example:

if ( this.safari ) {
    // Detect https://bugs.webkit.org/show_bug.cgi?id=25575
    var match = /AppleWebKit\/([0-9]+)/.exec( navigator.userAgent );
    if ( match && parseInt( match[1] ) < 531 ) {
        this.safariControlsBug = true;
    }
}

...

if ( !this.safariControlsBug ) {
    html += ' controls';
}

The issue is that if you use the controls attribute in Safari before
version 531, it segfaults. Last time I checked, JavaScript didn't have
the ability to generate different attributes depending on whether or
not its host segfaults.

I can understand the rationale behind removing jQuery.browser:
apparently most developers are too stupid to be trusted with it. Maybe
the idea is to use per-project reimplementation of jQuery.browser as
an intelligence test. The trouble is, I think even the stupidest
developers are able to copy and paste.

-- Tim Starling




Re: [Wikitech-l] MW 1.20 backwards compatibility in extensions

2012-11-08 Thread Dmitriy Sintsov



On 9 November 2012 02:52:54, Daniel Friesen 
(dan...@nadir-seen-fire.com) wrote:

On Thu, 08 Nov 2012 14:08:53 -0800, Tim Starling tstarl...@wikimedia.org    
wrote:

 All extension branches were removed during the migration to Git. Very
 few extensions have branches for MW core major version support.
 There's no longer a simple way to branch all extensions when a core
 release is updated, and nobody has volunteered to write a script.

 So we're back to the situation we had in MW 1.9 and earlier, where
 it's usually not possible to run any actively maintained extension
 against an MW core that's not the current trunk.

 Given this, I think code reviewers should insist on backwards
 compatibility with MW 1.20 for commits to the master branch of
 extensions that are commonly used outside Wikimedia, at least until
 the release management issue is solved.

 -- Tim Starling

I've always been in favor of the trunk/master of an extension retaining    
compatibility with the latest stable version of MediaWiki until our next    
release. (with brand new extensions designed around new features in alpha    
an exception)

However our LTS support for 1.19 is going to make this more of an issue.


I have done quite a lot of freelance MediaWiki work for small to medium
MediaWiki installations here in Russia. Most owners want new extensions or new
functionality in old extensions, and rarely want to upgrade the core. They
consider a core update too expensive. That was especially true for 1.17, which
introduced a lot of client-side changes.
Maybe someday most of the activity (like Wikidata, Parsoid and so on) will be
outside of core? That way the core might become a smaller part of the total
project and change less often.
Dmitriy
