Re: [Wikitech-l] ZERO architecture

2013-05-31 Thread Mark Bergsma
Hi Yuri,

Thanks for writing this up. I'll put some comments and questions inline.

On May 30, 2013, at 7:16 PM, Yuri Astrakhan yastrak...@wikimedia.org wrote:

 *== Technical Requirements ==*
 * increase Varnish cache hits / minimize cache fragmentation
 * Set up and configure new partners without code changes
 * Use partner-supplied IP ranges as a preferred alternative to the geo-ip
 database for the fundraising & analytics teams

Note that a Varnish VMOD to support the latter is being written at the moment 
by Brandon Black.

 *== Current state ==*
 Requests to the zero. domain set X-Subdomain=ZERO and are treated as
 mobile. The backend uses the X-Subdomain and X-CS headers to customize the result.
 The cache is heavily fragmented due to its variance on both of these
 headers in addition to the variance set by MobileFrontend extension and
 MediaWiki core.

...and also, variance due to the different hostname (and thus URL).

 *== Proposals ==*
 In order to reduce Zero-caused fragmentation, we propose to shrink from one
 bucket per carrier (X-CS) to three general buckets:
 * smart phones bucket -- banner and site modifications are done on the
 client in javascript
 * feature phones -- HTML only, the banner is inserted via ESI
 ** for carriers with free images
 ** for carriers without free images
 
 *=== Varnish logic ===*
 * Parse User-Agent to distinguish between desktop / mobile / feature phone:
 X-Device-Type=desktop|mobile|legacy

Using the OpenDDR library?

 * Use an IP -> X-CS lookup (under development by Ops) to convert the client's
 IP into an X-CS header
 * If X-CS && X-Device-Type == 'legacy': use an IP -> X-Images lookup (same
 lookup plugin, different database file) to determine whether the carrier
 allows images

Hopefully we can set the X-Images header straight from the ip database.

 Since each carrier has its own list of free languages, language links on
 feature phones will point to the origin, which will either silently redirect
 or ask for confirmation.

Perhaps we can store the list of supported languages for the carrier in the ip 
database as well?
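The three-bucket selection proposed above can be sketched as follows (a Python illustration only -- the real logic would live in Varnish VCL, and the bucket names and argument shapes here are assumptions):

```python
def select_bucket(device_type, x_cs, carrier_allows_images):
    """Map a classified request onto one of the proposed cache buckets.

    device_type: 'desktop' | 'mobile' | 'legacy', from User-Agent parsing
    x_cs: carrier code from the IP -> X-CS lookup, or None if no match
    carrier_allows_images: result of the IP -> X-Images lookup (legacy only)
    """
    if device_type == 'mobile':
        # Smart phones: banner and site modifications happen client-side
        # in JS, so every carrier can share a single cached object.
        return 'smart'
    if device_type == 'legacy':
        # Feature phones: HTML-only, banner inserted via ESI; the only
        # split is whether the carrier zero-rates images.
        if x_cs and carrier_allows_images:
            return 'legacy-images'
        return 'legacy-noimages'
    return 'desktop'
```

Compared with one bucket per X-CS value, this caps Zero-induced variance at three mobile buckets no matter how many carriers are configured.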

 *=== ZERO vs M ===*
 Even though I think the zero. and m. subdomains should both go the way of the
 dodo to make each article have just one canonical location (no more
 linking & Google issues), this won't happen until we are fully migrated
 to Varnish and make some mobile code changes (and possibly other changes
 that I am not aware of).

What do you mean by until we are fully migrated to Varnish? MobileFrontend 
has always exclusively been on Varnish.

 At the same time, we should try to get rid of ZERO wherever possible.
 There are two technical differences between m & zero: zero shows a link to
 the image instead of the actual image, and a big red zero warning is shown
 if the carrier is not detected. There is also an organizational difference --
 some carriers only whitelist zero, some only m, and some both the zero
 & m subdomains.

I'm still a little confused about m vs ZERO and images vs no images. 
That probably means others are too. :) Can you elaborate a little on that? I 
thought that was pretty much the same, but according to your spreadsheet that 
doesn't seem to be the case?

Overall this sounds reasonable I think, we'll just need to work out the details.

As Arthur also said in this thread, I'd like to keep zero & m completely
aligned, ideally sharing the Varnish cache objects and the mobile device 
detection at the Varnish level as much as possible. I don't think we disagree 
here.

-- 
Mark Bergsma m...@wikimedia.org
Lead Operations Architect
Wikimedia Foundation





___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] ZERO architecture

2013-05-31 Thread Mark Bergsma
 * feature phones -- HTML only, the banner is inserted by the ESI
 ** for carriers with free images
 ** for carriers without free images
 
 
 What about including ESI tags for banners for smart devices as well as
 feature phones, then either use ESI to insert the banner for both device
 types or, alternatively, for smart devices don't let Varnish populate the
 ESI chunk and instead use JS to replace the ESI tags with the banner? That
 way we can still serve the same HTML for smart phones and feature phones
 with images (one less thing for which to vary the cache).

I think the jury is still out on whether it's better to use ESI for banners
in Varnish or use JS for that client-side. I guess we'll have to test and see.
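To make the two options concrete, here is a toy sketch (placeholder markup and function names are invented, not actual MobileFrontend/Zero output): the cached HTML carries one banner placeholder, which either the edge expands (the ESI route) or client-side JS fills in after load.

```python
CACHED_HTML = '<body><esi:include src="/banner"/><p>Article text</p></body>'
PLACEHOLDER = '<esi:include src="/banner"/>'

def render_for_legacy(html, banner_html):
    # ESI route: Varnish expands the include into real markup at the edge,
    # so feature phones receive a complete HTML page.
    return html.replace(PLACEHOLDER, banner_html)

def render_for_smart(html):
    # JS route: ship an empty container; client-side JS fetches and inserts
    # the banner, so all smart-phone carriers share one cached object.
    return html.replace(PLACEHOLDER, '<div id="zero-banner"></div>')
```

Either way the underlying cached object is identical, which is the point: the banner decision stops fragmenting the cache.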

 Are there carrier-specific things that would result in different HTML for
 devices that do not support JS, or can you get away with providing the same
 non-js experience for Zero as MobileFrontend (aside from the
 banner, presumably handled by ESI)? If not currently, do you think it's
 feasible to do that (eg make carrier-variable links get handled via special
 pages so we can always rely on the same URIs)? Again, it would be nice if
 we could just rely on the same HTML to further reduce cache variance. It
 would be cool if MobileFrontend and Zero shared buckets and they were
 limited to:
 
 * HTML + images
 * HTML - images
 * WAP

That would be nice.

 Since we improved MobileFrontend to no longer vary the cache on X-Device,
 I've been surprised to not see a significant increase in our cache hit
 ratio (which warrants further investigation but that's another email). Are
 there ways we can do a deeper analysis of the state of the varnish cache to
 determine just how fragmented it is, why, and how much of a problem it
 actually is? I believe I've asked this before and was met with a response
 of 'not really' - but maybe things have changed now, or others on this list
 have different insight. I think we've mostly approached the issue with a
 lot more assumption than informed analysis, and if possible I think it
 would be good to change that.

Yeah, we should look into that. We've already flagged a few possible culprits, 
and we're also working on the migration of the desktop wiki cluster from Squid 
to Varnish, which has some of the same issues with variance (sessions, XVO, 
cookies, Accept-Language...) as MobileFrontend does. After we've finished 
migrating that and confirmed that it's working well, we want to unify those 
clusters' configurations a bit more, and that by itself should give us 
additional opportunity to compare some strategies there.

We've since also figured out that the way we calculate cache efficiency with
Varnish is not exactly ideal; unlike Squid, cache purges are done as HTTP 
requests to Varnish. Therefore in Varnish, those cache lookups are calculated 
into the cache hit rate, which isn't very helpful. To make things worse, the 
few hundreds of purges a second vs actual client traffic matter a lot more on 
the mobile cluster (with much less traffic but a big content set) than it does 
for our other clusters. So until we can factor that out in the Varnish counters 
(might be possible in Varnish 4.0), we'll have to look at other metrics.
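The distortion is easy to sketch with a quick calculation (all numbers are invented, and treating purge lookups as counted hits is a simplification of how the Varnish counters behave):

```python
def hit_rate(hits, misses):
    """Hit ratio over whatever lookups the counters happen to include."""
    return hits / (hits + misses)

# Hypothetical per-second numbers for a low-traffic, big-content cluster.
client_hits, client_misses = 4000, 1000   # genuine client lookups
purge_lookups = 500                       # purges arriving as HTTP requests

# A naive counter lumps purge lookups in with client hits...
naive = hit_rate(client_hits + purge_lookups, client_misses)
# ...while the rate we actually care about excludes them.
adjusted = hit_rate(client_hits, client_misses)
```

Here the naive figure (~81.8%) overstates the real 80%, and the gap widens as purge volume grows relative to client traffic, which is why backend fetch counts (backend_req) are the more honest metric.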

More useful therefore is to check the actual backend fetches (backend_req), 
and these appear to have gone down some. Annoyingly, every time we restart a 
Varnish instance we get a spike in the Ganglia graphs, making the long-term 
graphs pretty much unusable. To fix that we'll either need to patch Ganglia 
itself or move to some other stats engine (statsd?). So we have a bit of work 
to do there on the Ops front.

Note that we're about to replace all Varnish caches in eqiad by (fewer) newer, 
much bigger boxes, and we've decided to also upgrade the 4 mobile boxes with 
those same specs. And we're also doing that in our new west coast caching data 
center as well as esams. This will increase the mobile cache size a lot, and 
will hopefully help by throwing resources at the problem.

-- 
Mark Bergsma m...@wikimedia.org
Lead Operations Architect
Wikimedia Foundation






Re: [Wikitech-l] Wikitech-l Digest, Vol 118, Issue 102

2013-05-31 Thread anton . bolfing
Thank you for your message.

I am out of the office until Monday, 24 June 2013, and will not be able to
deal with your message until then. Your email will not be forwarded.

Best regards

Anton Bolfing



Thank you for your message.

I'm out of office until Monday, June 26.  I will not respond to your email 
until then.

Best regards

Anton Bolfing




[Wikitech-l] Architecture Guidelines: Writing Testable Code

2013-05-31 Thread Daniel Kinzler
When looking for resources to answer Tim's question at
https://www.mediawiki.org/wiki/Architecture_guidelines#Clear_separation_of_concerns,
I found a very nice and concise overview of principles to follow for writing
testable (and extendable, and maintainable) code:

Writing Testable Code by Miško Hevery
http://googletesting.blogspot.de/2008/08/by-miko-hevery-so-you-decided-to.html.

It's just 10 short and easy points, not some rambling discussion of code 
philosophy.

As far as I am concerned, these points can be our architecture guidelines.
Beyond that, all we need is some best practices for dealing with legacy code.

MediaWiki violates at least half of these principles in pretty much every class.
I'm not saying we should rewrite MediaWiki to conform. But I'd wish that it was
recommended for all new code to follow these principles, and that (local)
just-in-time refactoring of old code in accordance with these guidelines
was encouraged.

-- daniel


[Wikitech-l] Centralized Lua modules for Wikisource (OPW mentor needed)

2013-05-31 Thread David Cuenca
Hi all,

After a talk with Brad Jorsch during the Hackathon (thanks again Brad for
your patience), it became clear to me that Lua modules can be localized
either by using system messages or by getting the project language code
(mw.getContentLanguage().getCode()) and then switching the message. This
second option is less integrated with the translation system, but can serve
as an intermediate step to get things running.
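In outline, that second option might look roughly like this (sketched in Python rather than Lua, with an invented message table; in a real module the language code would come from the mw.getContentLanguage().getCode() call mentioned above):

```python
# Invented message table: per-language strings with an English fallback.
MESSAGES = {
    'en': {'source': 'Source'},
    'de': {'source': 'Quelle'},
    'fr': {'source': 'Source'},
}

def msg(key, lang_code):
    """Pick the message for the wiki's content language, falling back to English."""
    table = MESSAGES.get(lang_code, MESSAGES['en'])
    return table.get(key, MESSAGES['en'][key])
```

The table lives with the module, which is what makes this less integrated with the translation system than real system messages, but it works everywhere the module is copied.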

For Wikisource it would be nice to have a central repository (sitting on
wikisource.org) of localized Lua modules and associated templates. The
documentation could be translated using Extension:Translate. These modules,
templates and associated documentation would then be synchronized with all
the language Wikisources that subscribe to an opt-in list. Users would then
be advised to modify the central module, so that all language versions would
benefit from the improvements. This could be the first experiment in having
a centralized repository of modules.
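The opt-in synchronization could be outlined roughly like this (a hypothetical sketch: fetch_page and save_page stand in for real MediaWiki API calls that a bot framework would supply):

```python
def sync_modules(central_pages, subscribed_wikis, fetch_page, save_page):
    """Push each central module to every opted-in wiki whose copy differs.

    central_pages: {title: wikitext} from the central repository
    fetch_page(wiki, title) -> current wikitext, or None if the page is missing
    save_page(wiki, title, text) -> writes the page
    Both callables are hypothetical stand-ins for real API requests.
    """
    updated = []
    for title, central_text in central_pages.items():
        for wiki in subscribed_wikis:
            if fetch_page(wiki, title) != central_text:
                save_page(wiki, title, central_text)
                updated.append((wiki, title))
    return updated
```

A real bot would also need edit summaries, rate limiting, and protection against overwriting local changes, but the core loop is this small.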

What do you think of this? Would anyone be available to mentor an Outreach
Program for Women project?

Thanks,
David Cuenca --Micru

Re: [Wikitech-l] Centralized Lua modules for Wikisource (OPW mentor needed)

2013-05-31 Thread Yuvi Panda
I'm pretty sure that the 'syncing' can be accomplished by a simple
bot, and it might even already exist(?). Will be happy to help write
the bot if it doesn't exist yet.
--
Yuvi Panda T
http://yuvi.in/blog


Re: [Wikitech-l] Centralized Lua modules for Wikisource (OPW mentor needed)

2013-05-31 Thread Paul Selitskas
That's the worst-case scenario. Bot syncing is a bad practice (and has been
since the beginning of time). Since we have banners (CentralNotice)
and interwiki links (Wikidata) centralized, perhaps we should move forward
and start centralizing on-wiki development efforts. There are also global
gadgets (i.e. gadgets suitable for most wikis), and the above should be the
case for Gadgets 2.0 as well.


On Fri, May 31, 2013 at 6:19 PM, Yuvi Panda yuvipa...@gmail.com wrote:

 I'm pretty sure that the 'syncing' can be accomplished by a simple
 bot, and it might even already exist(?). Will be happy to help write
 the bot if it doesn't exist yet.
 --
 Yuvi Panda T
 http://yuvi.in/blog





-- 
Regards,
Павел Селіцкас/Pavel Selitskas
Wizardist @ Wikimedia projects

Re: [Wikitech-l] Centralized Lua modules for Wikisource (OPW mentor needed)

2013-05-31 Thread Yuvi Panda
On Fri, May 31, 2013 at 9:43 PM, Paul Selitskas p.selits...@gmail.com wrote:
 It's the worst case scenario. Bot syncing is a bad practice (and it was
 such since the beginning of times). Since we have banners (CentralNotice)
 and interwiki links (Wikidata) centralized, perhaps we should move forward
 and start centralizing on-wiki development efforts. There are also global
 gadgets (i.e. suitable for most wikis), and said above should be the case
 for Gadgets 2.0 as well.

Someone official (anomie?) mentioned this a couple of other times too,
and said that it is something 'on the roadmap' but without any concrete
dates, from which I assume it is *at least* a year away.


--
Yuvi Panda T
http://yuvi.in/blog


Re: [Wikitech-l] Quo Vadis, Vagrant

2013-05-31 Thread Yuri Astrakhan
This is awesome news! Just to throw some magic dust into the gears -- has
anyone succeeded in running it under Windows? Alternatively, I wouldn't
mind having a Vagrant VM that includes a GUI & a browser so that no host
interaction is needed.


On Wed, May 29, 2013 at 9:37 AM, Željko Filipin zfili...@wikimedia.org wrote:

 On Mon, May 13, 2013 at 3:20 PM, Ori Livneh ori.liv...@gmail.com wrote:

  If you're interested in using MW-V to document and automate your
  development setup, get in touch -- I'd be happy to help. I'm 'ori-l'
  on IRC, and I'll be at the Amsterdam Hackathon later this month too.
 

 I just wanted to say thanks to Ori. Mediawiki-Vagrant is now ready to run
 Selenium tests[1].

 Željko
 --
 1:

 https://github.com/wikimedia/mediawiki-vagrant/tree/master/puppet/modules/browsertests


Re: [Wikitech-l] Centralized Lua modules for Wikisource (OPW mentor needed)

2013-05-31 Thread Greg Grossmeier
quote name=Yuvi Panda date=2013-05-31 time=22:04:09 +0530
 On Fri, May 31, 2013 at 9:43 PM, Paul Selitskas p.selits...@gmail.com wrote:
  It's the worst case scenario. Bot syncing is a bad practice (and it was
  such since the beginning of times). Since we have banners (CentralNotice)
  and interwiki links (Wikidata) centralized, perhaps we should move forward
  and start centralizing on-wiki development efforts. There are also global
  gadgets (i.e. suitable for most wikis), and said above should be the case
  for Gadgets 2.0 as well.
 
 Someone official (anomie?) mentioned this a couple of other times too,
 but also said that it is something 'on the roadmap' but without any
 concrete dates, by which I assume it is *at least* a year away.

This topic will no doubt come up during our (Platform's) next quarterly
review, which will be in June. That's when we prioritize our effort for
big things during the next quarter.

Greg

-- 
| Greg Grossmeier            GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg           A18D 1138 8E47 FAC8 1C7D |


Re: [Wikitech-l] Centralized Lua modules for Wikisource (OPW mentor needed)

2013-05-31 Thread Quim Gil

As for the OPW part...

On 05/31/2013 08:15 AM, David Cuenca wrote:

What do you think of this? Would be anyone available to mentor an Outreach
Program for Women project?


Round 6 is now running and the next one will start by the end of the year.

Listing this project idea at 
http://www.mediawiki.org/wiki/Mentorship_programs/Possible_projects 
can't harm in any case.


--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil


Re: [Wikitech-l] Centralized Lua modules for Wikisource (OPW mentor needed)

2013-05-31 Thread Helder .
See also:
https://meta.wikimedia.org/wiki/Requests_for_comment/Global_bits_and_pieces

Helder

On Fri, May 31, 2013 at 1:57 PM, Greg Grossmeier g...@wikimedia.org wrote:
 quote name=Yuvi Panda date=2013-05-31 time=22:04:09 +0530
 On Fri, May 31, 2013 at 9:43 PM, Paul Selitskas p.selits...@gmail.com 
 wrote:
  It's the worst case scenario. Bot syncing is a bad practice (and it was
  such since the beginning of times). Since we have banners (CentralNotice)
  and interwiki links (Wikidata) centralized, perhaps we should move forward
  and start centralizing on-wiki development efforts. There are also global
  gadgets (i.e. suitable for most wikis), and said above should be the case
  for Gadgets 2.0 as well.

 Someone official (anomie?) mentioned this a couple of other times too,
 but also said that it is something 'on the roadmap' but without any
 concrete dates, by which I assume it is *at least* a year away.

 This topic will no doubt come up during our (Platform's) next quarterly
 review which will be in June. Thats when we prioritize our effort for
 big things during the next quarter.

 Greg

 --
 | Greg Grossmeier            GPG: B2FA 27B1 F7EB D327 6B8E |
 | identi.ca: @greg           A18D 1138 8E47 FAC8 1C7D |



Re: [Wikitech-l] ZERO architecture

2013-05-31 Thread Adam Baso
Sorry to reply on a thread that will probably not sort nicely on the
mailman web interface or threading mail clients. Anybody know of an
easy way to reply to digest email in Gmail such that mailman will
retain threading?

I'll be working on the conversion of Wikipedia Zero banners to ESI.
I think there will be good lessons here for looking at dynamic loading
in other parts of the interface.

Arthur, to address your questions in the parent to Mark's reply:

is there any reason you'd need to serve different HTML than what is
already being served by MobileFrontend?

Currently, some languages are not whitelisted at carriers, meaning
that users may get billed when they hit language.m.wikipedia.org or
language.zero.wikipedia.org. Thus, a number of <a href> links are
rewritten to include interstitials if the link is going to result in a
charge. By the way, we see some shortcomings in the existing rewrites
that need to be corrected (e.g., some URLs don't have interstitials,
but should), but that's a separate bug.

My thinking is that we start intercepting all clicks via JavaScript if
the browser supports it, or otherwise via a default interceptor on the
corresponding language.(m|zero).wikipedia.org subdomain. In either case,
if the destination link is on a non-whitelisted Wikipedia domain for the
carrier, or is external to Wikipedia, the user should land at an
interstitial page hosted on the same whitelisted subdomain from whence
the user came.
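That interception rule might reduce to something like this (hostnames and the whitelist shape are invented for illustration, and absolute URLs are assumed):

```python
from urllib.parse import urlparse

def needs_interstitial(link_url, carrier_whitelist):
    """True if a click should first land on a data-charge interstitial.

    carrier_whitelist: hostnames the carrier zero-rates, e.g.
    {'en.zero.wikipedia.org', 'en.m.wikipedia.org'} (illustrative values).
    """
    host = urlparse(link_url).netloc
    if not host.endswith('.wikipedia.org'):
        return True                       # external to Wikipedia: always warn
    return host not in carrier_whitelist  # non-whitelisted language/subdomain
```

The same predicate could back both paths: evaluated in client-side JS for capable browsers, or server-side when rewriting links for the default interceptor.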

Out of curiosity, is there WAP support in Zero? I noticed some
comments like '# WAP' in the varnish acls for Zero, so I presume so.
Is the Zero WAP experience different than the MobileFrontend WAP
experience?

No special WAP considerations in ZeroRatedMobileAccess above and
beyond MobileFrontend, as I recall. The '# WAP' comments are just there
to remind us in case a support case comes up with that particular
carrier.

We'll want to keep in mind impacts for USSD/SMS support. I think
Jeremy had some good conversations at the Wikimedia Hackathon in
Amsterdam that will help him to refine how his middleware receives and
transforms content.




[Wikitech-l] Deployment Highlights - Week of June 3rd

2013-05-31 Thread Greg Grossmeier
Hello all,

Next week (starting June 3rd) will be a very interesting week as far as
deployments are concerned. Why? Because on Thursday June 6th we will be
starting the one-week deployment cycle! But, we're also in the tail end
of the old two-week deploy cycle for 1.22wmf5, thus making next week a
bit crowded.

For more information on the one-week deploy cycle, see:
https://wikitech.wikimedia.org/wiki/Deployments/One_week.


== Monday ==
* MediaWiki general deployment window for 1.22wmf5 to English Wikipedia.
  See: https://www.mediawiki.org/wiki/MediaWiki_1.22/Roadmap

== Tuesday ==
* Universal Language Selector (ULS) will be updating to the latest
  master after last week's revert in preparation for the following
  week's deploy which will start the transition to using ULS. See
  https://www.mediawiki.org/wiki/UniversalLanguageSelector/Deployment/Planning
  for more information.

* AFTv5, see: http://etherpad.wikimedia.org/AFT5-release and
  http://www.mediawiki.org/wiki/Article_feedback/Version_5/Release_Plan_2013


== Wednesday ==
* MediaWiki general deployment window for 1.22wmf5 to all 'pedias
  (completion of wmf5 rollout).
  See: https://www.mediawiki.org/wiki/MediaWiki_1.22/Roadmap

* MobileFrontend updates (moved from Tuesday due to conflicting
  meetings). See:
  https://www.mediawiki.org/wiki/Extension:MobileFrontend/Deployments


== Thursday ==

* MediaWiki general deployment window (1.22) the beginning of 1.22wmf6
  (and of the one-week cycle), rolled out to mediawiki.org,
  test.wikipedia.org, and test2.wikipedia.org.
  See:
  https://www.mediawiki.org/wiki/MediaWiki_1.22/Roadmap
  and
  https://wikitech.wikimedia.org/wiki/Deployments/One_week

* E3: Updates to GettingStarted and potentially some updates to
  GuidedTour.
  See:
  https://www.mediawiki.org/wiki/Extension:GettingStarted
  and
  https://www.mediawiki.org/wiki/Extension:GuidedTour


== All week ==

* Lightning deploys!
  https://wikitech.wikimedia.org/wiki/Lightning_deployments



Any questions, just ask!

Greg


-- 
| Greg Grossmeier            GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg           A18D 1138 8E47 FAC8 1C7D |


[Wikitech-l] Welcome, Nik Everett

2013-05-31 Thread Rob Lanphier
Hi everyone

It is my pleasure to introduce Nik Everett as a new Senior Software
Engineer specializing in Search, working remotely from his home in
Raleigh, North Carolina.  Nik joins us from Lulu, a company founded by
Bob Young[1] to enable anyone to publish a book (dealing with print,
distribution, and order fulfillment issues).  Nik was responsible for
a number of DevOps-related tasks at Lulu, mainly centered around test
automation, build infrastructure, and database performance, but also
including maintenance and support of their Solr infrastructure.  He
also founded and chaired Lulu’s architecture review board, which is
similar to a group we’re just now starting up here[2].

Prior to Lulu, Nik worked at Bandwidth.com, Radarfind, and Nortel.  He
has Bachelors Degrees in Computer Science and Mathematics from
University of North Carolina at Chapel Hill.

Nik knows many programming languages.  He prefers writing in Python,
but due to the nature of what we’re doing here, he’ll probably end up
writing more Java and (of course) PHP.  He likes picking up a new
language every so often.  He’s currently playing around with Haskell,
and on good days may be able to explain what a monad is.  :-)

Nik’s nick on Freenode is “manybubbles”, and he's cc'd on this mail.

Rob

[1] One of Bob Young’s other claims to fame is as co-founder and
former CEO of Red Hat.

[2] Not yet an “architecture review board”, but that may be as good a
name as any for this:
http://lists.wikimedia.org/pipermail/wikitech-l/2013-May/069569.html


Re: [Wikitech-l] Welcome, Nik Everett

2013-05-31 Thread Marc A. Pelletier
On 05/31/2013 05:48 PM, Rob Lanphier wrote:
 It is my pleasure to introduce Nik Everett as a new Senior Software
 Engineer specializing in Search, working remotely from his home in
 Raleigh, North Carolina.

Jumping *peppers*!! This is *smiley* time!

Do not forget to *enjoy the sauce*!

-- Marc



Re: [Wikitech-l] Welcome, Nik Everett

2013-05-31 Thread Tyler Romeo
Welcome!

I'm in the midst of trying to get MediaWiki to work with AWS's search
interface, so it's always good to have more search people (although I'm
sure you'll be doing more with WMF's search backend).

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com


On Fri, May 31, 2013 at 5:48 PM, Rob Lanphier ro...@wikimedia.org wrote:

 Hi everyone

 It is my pleasure to introduce Nik Everett as a new Senior Software
 Engineer specializing in Search, working remotely from his home in
 Raleigh, North Carolina.  Nik joins us from Lulu, a company founded by
 Bob Young[1] to enable anyone to publish a book (dealing with print,
 distribution, and order fulfillment issues).  Nik was responsible for
 a number of DevOps-related tasks at Lulu, mainly centered around test
 automation, build infrastructure, and database performance, but also
 including maintenance and support of their Solr infrastructure.  He
 also founded and chaired Lulu’s architecture review board, which is
 similar to a group we’re just now starting up here[2]

 Prior to Lulu, Nik worked at Bandwidth.com, Radarfind, and Nortel.  He
 has Bachelors Degrees in Computer Science and Mathematics from
 University of North Carolina at Chapel Hill.

 Nik knows many programming languages.  He prefers writing in Python,
 but due to the nature of what we’re doing here, he’ll probably end up
 writing more Java and (of course) PHP.  He likes picking up a new
 languages every so often.  He’s currently playing around with Haskell,
 and on good days may be able to explain what a monad is.  :-)

 Nik’s nick on Freenode is “manybubbles”, and he's cc'd on this mail.

 Rob

 [1] One of Bob Young’s other claims to fame is as co-founder and
 former CEO of Red Hat.

 [2] Not yet an “architecture review board”, but that may be as good a
 name as any for this:
 http://lists.wikimedia.org/pipermail/wikitech-l/2013-May/069569.html


Re: [Wikitech-l] Welcome, Nik Everett

2013-05-31 Thread Matthew Flaschen
On 05/31/2013 05:48 PM, Rob Lanphier wrote:
 Hi everyone
 
 It is my pleasure to introduce Nik Everett as a new Senior Software
 Engineer specializing in Search, working remotely from his home in
 Raleigh, North Carolina.  Nik joins us from Lulu, a company founded by
 Bob Young[1] to enable anyone to publish a book (dealing with print,
 distribution, and order fulfillment issues).  Nik was responsible for
 a number of DevOps-related tasks at Lulu, mainly centered around test
 automation, build infrastructure, and database performance, but also
 including maintenance and support of their Solr infrastructure.  He
 also founded and chaired Lulu’s architecture review board, which is
 similar to a group we’re just now starting up here[2]

Welcome!

Matt Flaschen


[Wikitech-l] New git-review lets you configure 'origin' as the gerrit remote

2013-05-31 Thread Ori Livneh
Hey,

The new version of git-review released today (1.22) includes a patch I
wrote that makes it possible to work against a single 'origin' remote. This
amounts to a workaround for git-review's tendency to frighten you into
thinking you're about to submit more patches than the ones you are working
on. It makes git-review more pleasant to work with, in my opinion.

To enable this behavior, you first need to upgrade to the latest version of
git-review, by running pip install -U git-review. Then you need to create
a configuration file: either /etc/git-review/git-review.conf (system-wide)
or ~/.config/git-review/git-review.conf (user-specific).

The file should contain these two lines:

[gerrit]
defaultremote = origin

Once you've made the change, any new Gerrit repos you clone using an
authenticated URI will just work.

You'll need to perform an additional step to migrate existing repositories.
In each repository, run the following commands:

  git remote set-url origin $(git config --get remote.gerrit.url)
  git remote rm gerrit
  git review -s
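If you have many clones, the three commands above can be wrapped in a small loop. A sketch, assuming each old clone still has a 'gerrit' remote (the helper name is mine); you still need to run git review -s in each migrated repo afterwards:

```shell
# Sketch: apply the migration to every git clone under a directory.
# Repos without a 'gerrit' remote are skipped.
migrate_gerrit_remotes() {
    for repo in "$1"/*/; do
        url=$(git -C "$repo" config --get remote.gerrit.url 2>/dev/null) || continue
        git -C "$repo" remote set-url origin "$url"
        git -C "$repo" remote rm gerrit
        echo "migrated: $repo"
    done
}

# Example: migrate_gerrit_remotes ~/src
```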

Hope you find this useful.

Re: [Wikitech-l] New git-review lets you configure 'origin' as the gerrit remote

2013-05-31 Thread Chad
On Fri, May 31, 2013 at 6:14 PM, Ori Livneh o...@wikimedia.org wrote:
 The new version of git-review released today (1.22) includes a patch I
 wrote that makes it possible to work against a single 'origin' remote. This
 amounts to a workaround for git-review's tendency to frighten you into
 thinking you're about to submit more patches than the ones you are working
 on. It makes git-review more pleasant to work with, in my opinion.


Sweet!

 To enable this behavior, you first need to upgrade to the latest version of
 git-review, by running pip install -U git-review. Then you need to create
 a configuration file: either /etc/git-review/git-review.conf (system-wide)
 or ~/.config/git-review/git-review.conf (user-specific).


Bummer we can't just use ~/.gitconfig, but hey, it's still an improvement :)

-Chad


Re: [Wikitech-l] showing videos and images in modal viewers within articles

2013-05-31 Thread Fabrice Florin
Hey Kaldari,

Thanks so much for yesterday's code fix to display videos in a modal viewer 
when you click on article thumbnails!

This was long overdue and makes a huge difference already, as shown in this 
example:

https://en.wikipedia.org/wiki/Congenital_insensitivity_to_pain

I agree with Erik that the ideal behavior would be to play the video 
immediately after you click on the thumbnail, without requiring a second click 
to play in modal view.

Now, is there any chance we could do something similar for still photos? As a 
longtime photographer, it drives me nuts that clicking on an article thumbnail 
takes you to this overwhelming page on Commons, with scary walls of text all 
around the image. (That's one of the reasons I haven't yet migrated any of my 
20k Flickr photos to Commons, BTW.)

Most modern web sites just show you the photo in full screen, with 
only a few icons around it, so you can experience the image as it was intended 
to be seen (typically in a black modal panel, to make it pop up more). You 
still have the option to reveal all the text details if you want them, but they 
are not forced on you as we do today on Wikipedia and Commons.

Hopefully, our new multimedia team will be able to join forces with the design 
team to tackle some of these commonsense UI improvements, once we get up to 
speed this summer.

Thanks again for this welcome prelude :)


Fabrice


___

Fabrice Florin
Product Manager, Editor Engagement
Wikimedia Foundation

http://en.wikipedia.org/wiki/User:Fabrice_Florin_(WMF)


On May 30, 2013, at 1:35 AM, wikitech-l-requ...@lists.wikimedia.org wrote:
 
 From: Erik Moeller e...@wikimedia.org
 Subject: Re: [Wikitech-l] showing videos and images in modal viewers within 
 articles
 Date: May 29, 2013 9:21:48 PM PDT
 To: Ryan Kaldari rkald...@wikimedia.org
 Cc: Wikimedia developers wikitech-l@lists.wikimedia.org
 Reply-To: Wikimedia developers wikitech-l@lists.wikimedia.org
 
 
 Yes, better support for display of images through a modal viewer would
 be great. I'm not sure a modal parameter that has to be explicitly
 set for files is the best approach - I would recommend optimizing the
 default experience when a user clicks an image or video. It's not
 clear that the current behavior based on a pixel threshold is actually
 desirable as the default behavior. (On a side note, the TMH behavior
 should be improved to actually play the video immediately, not require
 a second click to play in modal view.)
 
 Magnus Manske explored an alternative approach pretty extensively in
 response to the October 2011 Coding Challenge, which is worth taking a
 look at:
 https://www.mediawiki.org/wiki/User:Magnus_Manske/wikipic
 
 Cheers,
 Erik
 
 
 From: Brion Vibber bvib...@wikimedia.org
 Subject: Re: [Wikitech-l] showing videos and images in modal viewers within 
 articles
 Date: May 30, 2013 12:42:00 AM PDT
 To: Wikimedia developers wikitech-l@lists.wikimedia.org
 Cc: Erik Moeller emoel...@wikimedia.org
 Reply-To: Wikimedia developers wikitech-l@lists.wikimedia.org
 
 
 I tend to agree that a light box or other modalish zoom should become the
 default behavior.
 
 -- brion
 On May 30, 2013 5:50 AM, Ryan Kaldari rkald...@wikimedia.org wrote:
 
 For years, I have wept and wailed about people adding complicated maps
 and diagrams as 220px thumbnail images to Wikipedia articles. These sorts of
 images are virtually useless within an article unless they are displayed at
 relatively large sizes. Unfortunately, including them at large sizes
 creates a whole new set of problems. Namely, large images mess up the
 formatting of the page and cause headers, edit links, and other images to
 get jumbled around into strange places (or even overlapping each other on
 occasion), especially for people on tablets or other small screens. The
 problem is even worse for videos. Who wants to watch a hi-res video in a
 tiny 220px inline viewer? If there are subtitles, you can't even read them.
 But should we instead include them as giant 1280px players within the
 article? That seems like it would be obnoxious.
 
 What if instead we could mark such complicated images and high-res videos
 to be shown in modal viewers when the user clicks on them? For example:
 [[File:Highres-video1.webm|thumb|right|modal|A high res video]]. When
 you clicked on the thumbnail, instead of going to Commons, a modal viewer
 would overlay across the screen and let you view the video/image at high
 resolution (complete with a link to Commons and the attribution
 information). Believe it or not, this capability already exists for videos
 on Wikipedia, but it's basically a hidden feature of TimedMediaHandler. If
 you include a video in a page and set the size as 200px or less, it
 activates the modal behavior. Unfortunately, the default size for videos is
 220px (as of 2010) so you will almost never see this behavior on a real
 article. If you want to see it, go to https://en.wikipedia.org/wiki/**
 

Re: [Wikitech-l] New git-review lets you configure 'origin' as the gerrit remote

2013-05-31 Thread Matthew Flaschen
On 05/31/2013 06:14 PM, Ori Livneh wrote:
 Hey,
 
 The new version of git-review released today (1.22) includes a patch I
 wrote that makes it possible to work against a single 'origin' remote.

Cool, thank you.

Matt Flaschen


Re: [Wikitech-l] Welcome, Nik Everett

2013-05-31 Thread Dan Andreescu
Awesome, welcome Nik!  And see you June 17th, a few more of us remoteys
will be in the office :)

Dan


On Fri, May 31, 2013 at 6:02 PM, Matthew Flaschen
mflasc...@wikimedia.org wrote:

 On 05/31/2013 05:48 PM, Rob Lanphier wrote:
  Hi everyone
 
  It is my pleasure to introduce Nik Everett as a new Senior Software
  Engineer specializing in Search, working remotely from his home in
  Raleigh, North Carolina.  Nik joins us from Lulu, a company founded by
  Bob Young[1] to enable anyone to publish a book (dealing with print,
  distribution, and order fulfillment issues).  Nik was responsible for
  a number of DevOps-related tasks at Lulu, mainly centered around test
  automation, build infrastructure, and database performance, but also
  including maintenance and support of their Solr infrastructure.  He
  also founded and chaired Lulu’s architecture review board, which is
  similar to a group we’re just now starting up here[2]

 Welcome!

 Matt Flaschen


Re: [Wikitech-l] New git-review lets you configure 'origin' as the gerrit remote

2013-05-31 Thread Greg Grossmeier
quote name=Ori Livneh date=2013-05-31 time=15:14:21 -0700
 Hey,
 
 The new version of git-review released today (1.22) includes a patch I
 wrote that makes it possible to work against a single 'origin' remote.

I just want to point out that both Ori and Krenair contributed to this
release of git-review (a couple patches each):
https://review.openstack.org/gitweb?p=openstack-infra/git-review.git;a=log;h=refs/tags/1.22

Thanks, you two!

Greg

-- 
| Greg Grossmeier            GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg           A18D 1138 8E47 FAC8 1C7D |


Re: [Wikitech-l] Quo Vadis, Vagrant

2013-05-31 Thread Ori Livneh
On Fri, May 31, 2013 at 9:51 AM, Yuri Astrakhan yastrak...@wikimedia.org wrote:

 This is awesome news! Just to throw some magic dust into the gears - has
 anyone succeeded in running it under Windows? Alternatively, I wouldn't
 mind having a vagrant VM that includes a GUI and a browser so that no host
 interaction is needed.


The tests run on Firefox / X11 on the guest VM. To invoke them from a
'vagrant ssh' session, though, you need an X display server. Graphical
Linuxes have all the prerequisites installed, and on OS X you just need to
install XQuartz. On Windows, people have good things to say about NoMachine
(http://www.nomachine.com/download-client-windows.php) but I haven't tested
it myself and while the client is free as in beer, it is not open source.

There is another alternative that you'll probably find simpler: edit your
Vagrantfile and find these two lines:

# To boot the VM in graphical mode, uncomment the following line:
# vb.gui = true

Uncomment the second line and do a vagrant halt / vagrant up. You'll then
be able to run the browser tests on an X display server running on the
guest VM.
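The hand-edit above can also be done mechanically. A sketch (the helper name is mine), with the caveat that, like the manual edit, it modifies the tracked Vagrantfile:

```shell
# Sketch: uncomment the vb.gui line in a Vagrantfile in place
# (same effect as the manual edit above; a .bak backup is kept).
uncomment_gui() {
    sed -i.bak 's/^\([[:space:]]*\)# \(vb\.gui = true\)/\1\2/' "$1"
}

# Example: uncomment_gui Vagrantfile && vagrant halt && vagrant up
```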

Re: [Wikitech-l] Quo Vadis, Vagrant

2013-05-31 Thread Yuri Astrakhan
I so wish there was a way not to mess with files under git's control. Can
that option be exposed through the local user config file?
On May 31, 2013 6:59 PM, Ori Livneh o...@wikimedia.org wrote:

 On Fri, May 31, 2013 at 9:51 AM, Yuri Astrakhan yastrak...@wikimedia.org
 wrote:

  This is awesome news! Just to throw some magic dust into the gears - has
  anyone succeeded in running it under Windows? Alternatively, I wouldn't
  mind having a vagrant VM that includes a GUI and a browser so that no host
  interaction is needed.
 

 The tests run on Firefox / X11 on the guest VM. To invoke them from a
 'vagrant ssh' session, though, you need an X display server. Graphical
 Linuxes have all the prerequisites installed, and on OS X you just need to
 install XQuartz. On Windows, people have good things to say about NoMachine
 (http://www.nomachine.com/download-client-windows.php) but I haven't
 tested
 it myself and while the client is free as in beer, it is not open source.

 There is another alternative that you'll probably find simpler: edit your
 Vagrantfile and find these two lines:

 # To boot the VM in graphical mode, uncomment the following line:
 # vb.gui = true

 Uncomment the second line and do a vagrant halt / vagrant up. You'll then
 be able to run the browser tests on an X display server running on the
 guest VM.

Re: [Wikitech-l] Quo Vadis, Vagrant

2013-05-31 Thread Ori Livneh
On Fri, May 31, 2013 at 4:16 PM, Yuri Astrakhan yastrak...@wikimedia.org wrote:

 I so wish there was a way not to mess with files under git's control. Can
 that option be exposed through the local user config file?


Yep. Create 'Vagrantfile-extra.rb' in the same directory as the Vagrantfile
and add these lines:

Vagrant.configure('2') do |config|
config.vm.provider :virtualbox do |vb|
vb.gui = true
end
end
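Creating that file can be scripted too. A sketch (the function name is mine), with the target directory parameterized; in real use you would run it in the Vagrantfile's directory:

```shell
# Sketch: drop the Vagrantfile-extra.rb shown above next to the
# Vagrantfile, so the tracked Vagrantfile itself stays untouched.
write_vagrant_extra() {
    cat > "${1:-.}/Vagrantfile-extra.rb" <<'EOF'
Vagrant.configure('2') do |config|
    config.vm.provider :virtualbox do |vb|
        vb.gui = true
    end
end
EOF
}

# Example: write_vagrant_extra .
```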

Re: [Wikitech-l] ZERO architecture

2013-05-31 Thread K. Peachey
On Sat, Jun 1, 2013 at 4:25 AM, Adam Baso ab...@wikimedia.org wrote:
 Sorry to reply on a thread that will probably not sort nicely on the
 mailman web interface or threading mail clients. Anybody know of an
 easy way to reply to digest email in Gmail such that mailman will
 retain threading?

Don't use digest mode... (if you insist, use an external service to
reply, e.g. Gmane or whatever it's called; it seems popularish)
