[Wikitech-l] Parser cache update/migration strategies

2014-09-09 Thread Daniel Kinzler
Hi all!

tl;dr: How to best handle the situation of an old parser cache entry not
containing all the info expected by a newly deployed version of code?


We are currently working to improve our usage of the parser cache for
Wikibase/Wikidata. E.g., we are attaching additional information related to
language links to the ParserOutput, so we can use it in the skin when
generating the sidebar.

However, when we change what gets stored in the parser cache, we still need to
deal with old cache entries that do not yet have the desired information
attached. Here are a few options we have if the expected info isn't in the
cached ParserOutput:

1) ...then generate it on the fly, on every page view, until the parser cache
is purged. This seems bad, especially if generating the required info means
hitting the database.

2) ...then invalidate the parser cache for this page, and then a) just live
with this request missing a bit of output, b) generate it on the fly, or c)
trigger a self-redirect.

3) ...then generate it, attach it to the ParserOutput, and push the updated
ParserOutput object back into the cache. This seems nice, but I'm not sure how
to do that.

4) ...then force a full re-rendering and re-caching of the page, then continue.
I'm not sure how to do this cleanly.


So, the simplest solution seems to be 2, but it means that we potentially
invalidate the parser cache of *every* page on the wiki (though we will not
hit the long tail of rarely viewed pages immediately). It effectively means
that any such change requires all pages to be re-rendered eventually. Is that
acceptable?

Solution 3 seems nice and surgical, just injecting the new info into the cached
object. Is there a nice and clean way to *update* a parser cache entry like
that, without re-generating it in full? Do you see any issues with this
approach? Is it worth the trouble?
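
For illustration, here is a minimal sketch of what option 3 could look
like (not actual Wikibase code: it assumes a hook handler that has the
WikiPage and the *same* ParserOptions the entry was cached under, which
is exactly the tricky part; 'wikibase-sidebar' and computeSidebarData()
are made-up names):

    function updateCachedParserOutput( WikiPage $page, ParserOptions $popts ) {
        $parserCache = ParserCache::singleton();

        // Fetch the cached entry (if any) for these exact options.
        $parserOutput = $parserCache->get( $page, $popts );
        if ( !$parserOutput ) {
            return; // nothing cached, nothing to update
        }

        // Skip entries that already carry the new info.
        if ( $parserOutput->getExtensionData( 'wikibase-sidebar' ) !== null ) {
            return;
        }

        // Attach the missing info and push the object back. If $popts
        // doesn't match the options the entry was saved under, this
        // writes to the wrong key and we'll redo this on every view.
        $parserOutput->setExtensionData( 'wikibase-sidebar', computeSidebarData( $page ) );
        $parserCache->save( $parserOutput, $page, $popts );
    }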


Any input would be great!

Thanks,
daniel

-- 
Daniel Kinzler
Senior Software Developer

Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Sudden growth of unreviewed changesets in Gerrit

2014-09-09 Thread Quim Gil
Hi, our metrics are reflecting a sudden growth of unreviewed changes in
Gerrit. Your attention and interpretations are welcome.

http://korma.wmflabs.org/browser/gerrit_review_queue.html

Look at the green line in the first graph, Volume of open changesets. It
shows the number of open changesets waiting for a review (WIP and -1 are
excluded). In recent months, the number of changesets waiting for review
has gone from 311 (June) to 529 (July) and 790 (August).

The increase is so high that at first I thought the problem was in the metric,
but Alvaro has reviewed the raw data and says that the metric is legit.

https://bugzilla.wikimedia.org/show_bug.cgi?id=70278

Trying to find an explanation, I went to the MediaWiki Core repository, and
there are some unusual curves there as well:

http://korma.wmflabs.org/browser/repository.html?repository=gerrit.wikimedia.org_mediawiki_core

However, even these numbers alone would not explain the increase.

Has there been unusual activity (or lack of reviews) in Gerrit, or are we
missing an important detail in the metrics?

(pause)

In any case, in the past weeks I have been chasing old open reviews and in
general I get the impression that developers/maintainers are too busy to
review so many contributions. The tricky part is that many (most?) of those
contributions come from the same developers/maintainers...

Simplifying the picture a lot, it looks as if we can't review code because
we are busy writing code that won't be reviewed because we are busy writing
code that won't be reviewed because... I'm sure this representation is
unfair, but you get the point.

In many cases, unreviewed patches rotting away means real resources going
to waste -- right? Maybe teams should prioritize attention to open
changesets at the expense of writing new code themselves? It's a real
question and I won't pretend to have an answer. Different teams and
repositories probably have different answers.

-- 
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil

Re: [Wikitech-l] Parser cache update/migration strategies

2014-09-09 Thread aude
On Tue, Sep 9, 2014 at 12:03 PM, Daniel Kinzler dan...@brightbyte.de
wrote:

 tl;dr: How to best handle the situation of an old parser cache entry not
 containing all the info expected by a newly deployed version of code?

 [...]

 3) ...then generate it, attach it to the ParserOutput, and push the updated
 ParserOutput object back into the cache. This seems nice, but I'm not sure
 how to do that.



https://gerrit.wikimedia.org/r/#/c/158879/ is my attempt to update the
ParserOutput cache entry, though it may be too simplistic a solution.

Any feedback or suggestions on how to do this better would be great, or
maybe it's a crazy idea. :P

Cheers,
Katie





-- 
@wikimediadc / @wikidata

Re: [Wikitech-l] Parser cache update/migration strategies

2014-09-09 Thread Nikolas Everett
Also, option 5 could be to continue without the data until the parser cache
is invalidated on its own.
Maybe option 6 could be to continue without the data and invalidate the
cache and completely rerender only some of the time: say 5% of the time
for the first couple of hours, then 25% of the time for a day, then 100% of
the time after that. It'd guarantee that the cache is good after a certain
amount of time without causing a big spike right after deploys.
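
For concreteness, the ramp-up could be as simple as this (hand-wavy
sketch, with the thresholds from above):

    // Probability of forcing invalidation, given seconds since deploy.
    function shouldInvalidate( $secondsSinceDeploy ) {
        if ( $secondsSinceDeploy < 2 * 3600 ) {
            $p = 0.05; // first couple of hours: 5%
        } elseif ( $secondsSinceDeploy < 26 * 3600 ) {
            $p = 0.25; // the following day: 25%
        } else {
            $p = 1.0;  // after that: always
        }
        return ( mt_rand() / mt_getrandmax() ) < $p;
    }
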
All those options are less good than just updating the cache, I think.

Nik
On Sep 9, 2014 6:42 AM, aude aude.w...@gmail.com wrote:

 On Tue, Sep 9, 2014 at 12:03 PM, Daniel Kinzler dan...@brightbyte.de
 wrote:

  tl;dr: How to best handle the situation of an old parser cache entry not
  containing all the info expected by a newly deployed version of code?

 [...]

 https://gerrit.wikimedia.org/r/#/c/158879/ is my attempt to update the
 ParserOutput cache entry, though it may be too simplistic a solution.

 [...]

Re: [Wikitech-l] Parser cache update/migration strategies

2014-09-09 Thread Daniel Kinzler
On 09.09.2014 13:45, Nikolas Everett wrote:
 All those options are less good than just updating the cache, I think.

Indeed. And that *sounds* simple enough. The issue is that we have to be sure to
update the correct cache key, the exact one the OutputPage object in question
was loaded from. Otherwise, we'll be updating the wrong key, and will read the
incomplete object again, and try to update again, and again, on every page view.

Sadly, the mechanism for determining the parser cache key is quite complicated
and rather opaque. The approach Katie tries in I1a11b200f0c looks fine at a
glance, but even if I can verify that it works as expected on my machine, I
have no idea how it will behave on the more strange wikis on the live cluster.

Any ideas who could help with that?

-- daniel




Re: [Wikitech-l] Sudden growth of unreviewed changesets in Gerrit

2014-09-09 Thread Federico Leva (Nemo)
When I look at http://koti.kapsi.fi/~federico/crstats/extensions.txt , I
get the impression that the answer may be simple: the handful of devs who
take care of most code review in MediaWiki have been doing something
else in the last few months. When you remove those few, the queue doesn't
move that much.

Nemo


Re: [Wikitech-l] Sudden growth of unreviewed changesets in Gerrit

2014-09-09 Thread Strainu
Could it be just... holidays?

Strainu

2014-09-09 15:26 GMT+03:00 Federico Leva (Nemo) nemow...@gmail.com:
 When I look at http://koti.kapsi.fi/~federico/crstats/extensions.txt , I
 get the impression that the answer may be simple: the handful of devs who
 take care of most code review in MediaWiki have been doing something
 else in the last few months. [...]


Re: [Wikitech-l] Sudden growth of unreviewed changesets in Gerrit

2014-09-09 Thread Quim Gil
On Tue, Sep 9, 2014 at 2:30 PM, Strainu strain...@gmail.com wrote:

 Could it be just... holidays?


Could be... although the holidays from reviewing code seem to come together
with no holidays from uploading new patches.

Re: [Wikitech-l] Sudden growth of unreviewed changesets in Gerrit

2014-09-09 Thread Bartosz Dziewoński
Is the increase connected to any particular users (or type of users,  
new/experienced)? Any repositories?


There have been a few sweeping changes recently that resulted in a lot of  
search-and-replace changesets being submitted for many extensions. Perhaps  
some of these are stuck somewhere.


--
Matma Rex


Re: [Wikitech-l] Sudden growth of unreviewed changesets in Gerrit

2014-09-09 Thread Marc A. Pelletier
On 09/09/2014 09:00 AM, Quim Gil wrote:
 [...] the holidays from reviewing code seem to come together with
 no holidays from uploading new patches.

Wouldn't volunteers' output _increase_ as their free time increases?  :-)

-- Marc



Re: [Wikitech-l] Sudden growth of unreviewed changesets in Gerrit

2014-09-09 Thread Quim Gil
On Tue, Sep 9, 2014 at 3:08 PM, Bartosz Dziewoński matma@gmail.com
wrote:

 Is the increase connected to any particular users (or type of users,
 new/experienced)? Any repositories?


I have mentioned the case of MediaWiki core. operations/puppet also had a
steep curve this summer:

http://korma.wmflabs.org/browser/repository.html?repository=gerrit.wikimedia.org_operations_puppet

But so far it is guesswork for me.



 There have been a few sweeping changes recently that resulted in a lot of
 search-and-replace changesets being submitted for many extensions. Perhaps
 some of these are stuck somewhere.


Reedy alone seems to have plenty of patches waiting to be reviewed. Maybe
these are the ones you are referring to? See for example

https://gerrit.wikimedia.org/r/#/q/owner:%22Reedy+%253Creedy%2540wikimedia.org%253E%22+status:open,n,002f48b24b1d

Re: [Wikitech-l] UploadWizard bug triage on Sep 09, 17:00UTC

2014-09-09 Thread Andre Klapper
Reminder: This starts in 60 minutes.

On Sun, 2014-08-31 at 21:33 +0200, Andre Klapper wrote:
 you are invited to join us on the next Bugday:
 
  Tuesday, September 09, 2014, 17:00 to 19:00UTC [1]
 in #wikimedia-multimedia on Freenode IRC [2]
 
 We will take a look at open Bugzilla tickets for UploadWizard.
 
 Everyone is welcome to join. No technical knowledge needed! 
 It's an easy way to get involved or to give something back.
 
 More information can be found here:
   https://www.mediawiki.org/wiki/Bug_management/Triage/20140909
 
 For more information on triaging in general, check out 
   https://www.mediawiki.org/wiki/Bug_management/Triage
 
 See you there?
 
 andre
 
 
 [1] Timezone converter:
 http://www.timeanddate.com/worldclock/converter.html
 [2] See http://meta.wikimedia.org/wiki/IRC for more info on IRC

-- 
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/



Re: [Wikitech-l] Parser cache update/migration strategies

2014-09-09 Thread Nikolas Everett
On Tue, Sep 9, 2014 at 8:00 AM, Daniel Kinzler dan...@brightbyte.de wrote:

 On 09.09.2014 13:45, Nikolas Everett wrote:
  All those options are less good than just updating the cache, I think.

 Indeed. And that *sounds* simple enough. The issue is that we have to be
 sure to update the correct cache key, the exact one the OutputPage object
 in question was loaded from. [...]

 Any ideas who could help with that?


No, not really.  My only experience with the parser cache was accidentally
polluting it with broken pages one time.

I suppose one option is to be defensive around reusing the key. I mean, if
you can check the key that was used to fetch from the parser cache, and you
had a cache hit, then you know that a put to that same key will be
overwriting _something_.
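
Something like this, maybe (hedged sketch; $page and $popts are assumed
to come from the surrounding context):

    $parserCache = ParserCache::singleton();

    // Remember the exact key the entry was fetched under.
    $keyAtFetch = $parserCache->getKey( $page, $popts );
    $parserOutput = $parserCache->get( $page, $popts );

    // ... attach the extra data to $parserOutput here ...

    // Only write back if we'd still hit the same key, i.e. the put
    // overwrites the entry we actually read.
    if ( $parserOutput && $parserCache->getKey( $page, $popts ) === $keyAtFetch ) {
        $parserCache->save( $parserOutput, $page, $popts );
    }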

Another thing: I believe uncached calls to the parser are wrapped in pool
counter acquisitions, to make sure no two processes spend duplicate effort.
You may want to acquire that too, so that anything heavy you do
doesn't get done twice.
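
A rough sketch of that, with an illustrative pool type and key (not
necessarily the exact ones core uses for article views):

    $work = new PoolCounterWorkViaCallback(
        'ArticleView',
        'parseroutput-update:' . $page->getId(),
        array(
            'doWork' => function () use ( $page, $popts ) {
                // ... fetch, update and save the ParserOutput here ...
                return true;
            },
        )
    );
    $work->execute();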

Once you start talking about that it might just be simpler to invalidate
the whole entry.

Another option:
Kick off some kind of cache invalidation job that _slowly_ invalidates the
appropriate parts of the cache, something like how the Varnish cache is
invalidated on template change. That gives you marginally more control
than randomized invalidation.

Nik

[Wikitech-l] Individual Engagement Grants Committee recruiting for members

2014-09-09 Thread Pine W
Dear all,

The Individual Engagement Grants Committee is seeking new members. We are a
diverse, consensus-based committee with members on five continents and a
variety of skills. As a group we speak 16 languages and have over 500,000
edits to a variety of Wikimedia sites. We review grant applications twice
per year in four categories: Research, Tools (usually software), Online
community organizing, and Offline outreach and partnerships.

Members may specialize in as few or as many areas as they wish.

We look for the following qualifications. This information is taken from
our Meta page.

Mandatory:

   1. Experience with the Wikimedia movement and at least one Wikimedia
   project
   https://meta.wikimedia.org/wiki/Special:MyLanguage/Wikimedia_projects.
   2. Experience with some aspect of Wikimedia programmatic or
   project-based work, e.g. editor engagement, WikiProjects or other on-wiki
   organizing processes, outreach, events, partnerships, research, education,
   gadget or bot-building, etc.
   3. Ability to edit basic wiki-markup (grant proposal discussions are
   largely conducted on meta-wiki).
   4. Reasonable facility with English, for reviewing and discussing grant
   proposals.
   5. In good community and legal standing (not currently blocked or
   banned, not involved in allegations of unethical financial behavior, etc.).
   6. Availability to actively engage in the selection process during the
   published schedule for that round (time commitment is about 3 hours per
   week, plus 1 extra day for scoring).

Preferable:

   1. Experience leading, coordinating, or managing projects with an
   intended on-wiki or online impact.
   2. Experience handling externally provided money and working within
   budgets, preferably in a non-profit context.
   3. Experience applying for grants or working in grants programs (in the
   Wikimedia, academic, or wider non-profit world).
   4. Ability to read and write in multiple languages.

Acceptable:

   1. Members may apply for an Individual Engagement Grant themselves, but
   they will recuse themselves from reviewing proposals in the same category
   as their own during that round.
   2. Membership does not conflict with membership in other Wikimedia
   committees, including the Grant Advisory Committee
   
https://meta.wikimedia.org/wiki/Special:MyLanguage/Grants:PEG/Grant_Advisory_Committee
   or the Wikimania Scholarships Committee.

Please apply at
https://meta.wikimedia.org/wiki/Grants:IEG/Committee#ieg-candidates. The
close of the application period is September 21. The Committee will review
the applications internally and announce new members by early October,
coinciding with the start of the public review period for new grant
proposals.

For the Committee,

Pine

Re: [Wikitech-l] Sudden growth of unreviewed changesets in Gerrit

2014-09-09 Thread Federico Leva (Nemo)
http://korma.wmflabs.org/browser/gerrit_review_queue.html shows that most
of the increase is in August vs. May: about +550 waiting for review (those
without any comment?) and +200 pending.
The easiest way to identify outliers is diffing this report between the
two dates:
https://www.mediawiki.org/w/index.php?title=Gerrit%2FReports%2FOpen_changesets_by_owner&diff=1128160&oldid=1018652
Devs having an increase of 15 mediawiki/.* patches or more (and
sometimes over 50) are bawolff, deepali, Florianschmidtwelzow,
jackmcbarn, MarkAHershberger, Paladox, Rohan013, Withoutaname. Usually
they're worth looking into and resolving one way or another. :)

Nemo


[Wikitech-l] OPW final report and thanks to you all

2014-09-09 Thread Frances Hocutt
My (belated) final report for my OPW project is up on my progress
reports page: 
https://www.mediawiki.org/wiki/Evaluating_and_Improving_MediaWiki_web_API_client_libraries/Progress_Reports#Final_report
. Thank you to everyone who has helped me learn as much and contribute
as much as I have this summer, and special thanks to my mentors and
technical advisers. This has been a great experience, and I'm happy to
be leaving API:Client code in better shape than I found it!

I hope to stick around the MediaWiki development community as I move
on in my career as a software developer. I've applied for the January
batch of Hacker School. Until that starts, I'll be working on API
applications for Growstuff (http://growstuff.org): a gardening site
that collects crowdsourced data from local gardeners around the world
and freely licenses the resulting data. Wherever I end up after this,
I'm glad to have gotten my start on MediaWiki.

-Frances


Re: [Wikitech-l] OOjs, MobileFrontend and Flow

2014-09-09 Thread Jon Robson
Update: Roan's patch got merged, but Max identified an issue with it in
MobileFrontend which is stopping us switching over to use it in
MobileFrontend [1].

It seems to be related to an event called 'progress' that we use in our
API handler.

You can replicate it by trying to do a wikitext edit in the stable
mode of the mobile site with the above patch.

Uncaught TypeError: Cannot use 'in' operator to search for 'progress'
in undefined
load.php?debug=false&lang=en&modules=ext.mantle%7Cext.mantle.hogan%2Cmodules%2Coo%2Ctemplates%7Cjqu…:68 oo.EventEmitter.emit
load.php?debug=false&lang=en&modules=ext.mantle%7Cext.mantle.hogan%2Cmodules%2Coo%2Ctemplates%7Cjqu…:68 (anonymous function)

Not sure if this is an issue with OOjs or the patch Roan made for us.
Any ideas, VE guys?

[1] https://gerrit.wikimedia.org/r/#/c/129336/


On Fri, Aug 22, 2014 at 11:36 AM, Jon Robson jrob...@wikimedia.org wrote:
 Shahyar, Juliusz, Trevor, Roan and I met to discuss using oojs inside
 the mobile and Flow projects.

 The following 3 patches kick off moving MobileFrontend's class model
 towards that of oojs - many thanks to Roan for doing most of the
 legwork :-):
 https://gerrit.wikimedia.org/r/155593
 https://gerrit.wikimedia.org/r/155589
 https://gerrit.wikimedia.org/r/129336

 In the long term we'd look to swap out the Class.js and
 eventemitter.js files in MobileFrontend for oojs, but this is going to
 be tricky and require some care, possibly mixing both oojs and
 MobileFrontend's class code in the repository at the same time, i.e.
 increasing JavaScript in the short term, but reducing it in the long
 term. The MobileFrontend core development team will need to work out
 how best to manage this transition.

 Since Flow is very DOM-focused, as opposed to many smaller JavaScript
 modules with element management per the currently-accepted use of
 OOjs, it is unclear how we may go about integrating with OOjs fully.
 However, some potential use cases have been identified as candidates
 for implementing OOjs on an interim basis, primarily by abstracting
 some current FlowBoardComponent workflows, such as those which handle
 (re-)rendering of existing and new content fetched from the API.


Re: [Wikitech-l] The future of skins

2014-09-09 Thread Jon Robson
 1) Trevor is going to build a watch star widget on client and server.
 We identified that the existing watch star code is poorly written and
 has resulted in MobileFrontend rewriting it. We decided to target this
 as it is a simple enough example that it doesn't need a template. It's
 small and contained enough that we hope this will allow us to share
 ideas and codify a lot of those. Trevor is hoping to begin working on
 this the week of the 2nd September.

 2) We need a templating system in core. Trevor is going to do some
 research on server side templating systems. We hope that the
 templating RFC [1] can get resolved however we are getting to a point
 that we need one as soon as possible and do not want to be blocked by
 the outcome of this RFC, especially given a mustache based templating
 language can address all our current requirements.

Hey Trevor,
Are you able to update us on the current status of the above?
Have you begun work on them? Are there any patches you can point us at
(even if they are works in progress)?


Re: [Wikitech-l] Standardising icons across projects

2014-09-09 Thread Jon Robson
 1) Monte to explore using SVGs in the Wikipedia apps. He will create
 a font from SVGs. He will work with May to ensure her workflow is as
 easy as before.

Monte, any updates on this?


 2) Trevor is going to look into how we can automatically generate
 different colour versions of SVGs and automatically create PNGs from
 them.

Trevor, any updates?


 3) I am going to aim to agree on a standard icon HTML markup for
 mediawiki ui. We still have an issue with the fact that everyone is
 using different HTML markup to create icons. After reviewing all our
 current approaches [1] it was clear that we were relatively closely
 aligned and that it simply is a case of agreeing on class names.

I'm still working on this. Sam Smith has now suggested a mixin-based
approach [1], which I am not too comfortable with, as I think it works
against standardisation by allowing far too much flexibility, and I don't
see how it will map to us using OOjs for it in future.

A class-based approach [2] that follows the markup in
http://i.imgur.com/bieQ9zn.jpg seems preferable to me.

Please chime in on those patches - I'm super keen to make progress on
this to get our icon generation mechanism under control.

[1] https://gerrit.wikimedia.org/r/139368
[2] https://gerrit.wikimedia.org/r/158632


Re: [Wikitech-l] Parser cache update/migration strategies

2014-09-09 Thread Tim Starling
On 09/09/14 22:00, Daniel Kinzler wrote:
 Sadly, the mechanism for determining the parser cache key is quite complicated
 and rather opaque.

It's only as complicated as it has to be, to support the desired features:

* Options which change the parser output.
* Merging of parser output objects when a given option does not affect
the output for a given input text, even though it may affect the output
for other inputs (see the sketch below).
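
Roughly, the second point works like this (illustrative sketch from
memory; the method names are worth double-checking against core):

    $popts = ParserOptions::newFromUser( $user );
    $parserOutput = $wgParser->parse( $text, $title, $popts );

    // The key hashes only the options the parse actually *used*, so two
    // requests with different settings can share one cache entry.
    $usedOptions = $parserOutput->getUsedOptions(); // e.g. array( 'userlang' )
    $key = $parserCache->getParserOutputKey(
        $page,
        $popts->optionsHash( $usedOptions, $title )
    );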

 The approach Katie tries in I1a11b200f0c looks fine at a glance, but
 even if I can verify that it works as expected on my machine, I have no
 idea how it will behave on the more strange wikis on the live cluster.

It will probably work. It assumes that the parser output for the
current article will always be added to the OutputPage before the
SidebarBeforeOutput hook is called. If that assumption was violated on
some popular URL, then it could waste quite a lot of CPU time.

It also assumes that the context page is always the same as the page
which was parsed and added to the OutputPage -- another assumption
which could have nasty consequences if it is violated.

I think it is fine to just invalidate all pages on the wiki. This can
be done by deploying the parser change, then progressively increasing
$wgCacheEpoch over the course of a week or two, until it is higher
than the change deployment time. If you increase $wgCacheEpoch too
fast, then you will get an overload.
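
I.e., something along these lines in the wiki's configuration (dates
made up; the point is the stepwise increase):

    // Day 1 of the rollout: expire everything cached before June.
    $wgCacheEpoch = '20140601000000';
    // A few days later, bump it again:
    // $wgCacheEpoch = '20140801000000';
    // ...and so on, until the epoch passes the deployment time of the
    // parser change, at which point every pre-change entry is stale.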

-- Tim Starling

