Re: [Wikitech-l] Access from China

2013-09-23 Thread Ryan Lane
Ryu,

As far as we've been told, Wikipedia access from China is open, apart from a
filter on specific pages, assuming you are using HTTP and not HTTPS.

- Ryan


On Mon, Sep 23, 2013 at 7:36 PM, Ryu Cheol rch...@gmail.com wrote:

 Hello all,
 last week I was in Nanjing, China, on business. Of course I tried
  to access Wikipedia, my love. Wow, I found I could access it on my
  iPad. That's quite weird. So I tried again on my notebook computer, and the
  firewall reset the session to stop me from accessing Wikipedia. That's exactly
  what I expected. The Great Firewall of China was so vulnerable. Any idea
  why I could access it on my iPad?

 Best regards
 Cheol
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Access from China

2013-09-23 Thread Ryan Lane
On Mon, Sep 23, 2013 at 8:07 PM, Ryu Cheol rch...@gmail.com wrote:

 On the iPad, I could not check whether it was http or https. You mean that
  on mobile devices such as the iPad, it does not redirect to https? That
  feels a bit inconsistent to me.

  In regions where https is not allowed, for example China, could we keep
  http as the default protocol?
  I thought I had totally lost access. I am asking whether we could change
  the redirect behavior according to the region the access comes from?


Can you email me privately and let me know what your IP address is? We
actually do have something in place to not redirect logged-in users in some
regions (such as China). Maybe the geoip database we are using is incorrect
for your area.
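
For anyone curious how such a region exemption can work, here's a minimal,
hypothetical sketch in Python (this is not WMF's actual code; the geoip2
library, database path, and region list are all assumptions):

    import geoip2.database
    import geoip2.errors

    HTTPS_EXEMPT_REGIONS = {"CN"}  # illustrative list only

    reader = geoip2.database.Reader("/usr/share/GeoIP/GeoLite2-Country.mmdb")

    def should_redirect_to_https(client_ip):
        """Return False for IPs that geolocate to an exempt region."""
        try:
            country = reader.country(client_ip).country.iso_code
        except geoip2.errors.AddressNotFoundError:
            return True  # unknown region: default to the secure redirect
        return country not in HTTPS_EXEMPT_REGIONS

If the database maps an address to the wrong country, the user falls through
to the default redirect, which matches the behavior Ryu observed.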

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] File cache + HTTPS question

2013-10-01 Thread Ryan Lane
On Tue, Oct 1, 2013 at 8:21 AM, Jeroen De Dauw jeroended...@gmail.com wrote:

 The file cache is barely maintained and I am not sure whether anyone is
  still relying on it.   We should probably remove that feature entirely
  and instruct people to setup a real frontend cache instead.
 

 This means we say "you no can has cache" to people using shared hosts. Do
  we really want to do that? If so, the docs ought to be updated, and the
  deprecation announced.


In my opinion we should completely drop support for shared hosts. It's 2013,
and virtual private servers are cheap and superior in every way. Supporting
shared hosting severely limits what we can reasonably do in the software.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] File cache + HTTPS question

2013-10-01 Thread Ryan Lane
On Tue, Oct 1, 2013 at 12:46 PM, Christopher Wilson gwsuper...@gmail.com wrote:

 I'm usually just an observer on these lists, but I'll weigh in as a user
 who runs MediaWiki on a shared host. The host *is* a VPS, but our wiki is
 used by the environmental department of a large international non-profit.
 As such it lives on the enviro server along with some WordPress sites and
 other minor things.

 If we have to give the wiki its own dedicated VPS, it will likely not
 survive, or we will move to another platform. I see some REALLY low costs
 for VPSes being tossed around here, but honestly, we'd probably be looking
 at a minimum of $50/month to give the site its own dedicated VPS. I realize
 that in the grand scheme of things, that's not a huge cost, but at that
 point, management will probably insist on making the site pay for itself
 rather than the current situation of letting it exist on shared resources.


We aren't discussing dropping support for running MediaWiki alongside
other applications; we're discussing dropping support for shared
hosting services, which run hundreds of applications on the same host for
different customers. That's horribly insecure and doesn't allow the user to
install anything at the system level.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] File cache + HTTPS question

2013-10-01 Thread Ryan Lane
On Tue, Oct 1, 2013 at 1:43 PM, Tyler Romeo tylerro...@gmail.com wrote:

 Has anybody ever considered the possibility that maybe people don't know
 (or want to know) how to set up a caching proxy? One of the nice things
 about MediaWiki is that it's extraordinarily easy to set up. All you have
 to do is dump a tar.gz file into a directory, run the web installer and
 call it a day. No sysadmin experience required.


This is only true if you want almost no functionality out of MediaWiki and
you want it to be very slow. MediaWiki is incredibly difficult to properly
run and requires at least some minor sysadmin experience to do so. There's
a reason that almost every MediaWiki install in existence is completely out
of date.

When we reach WordPress's ease of use, then we can assume this.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] File cache + HTTPS question

2013-10-01 Thread Ryan Lane
On Tue, Oct 1, 2013 at 2:53 PM, Tyler Romeo tylerro...@gmail.com wrote:

 On Tue, Oct 1, 2013 at 2:46 PM, Ryan Lane rlan...@gmail.com wrote:

  This is only true if you want almost no functionality out of MediaWiki
  and you want it to be very slow. MediaWiki is incredibly difficult to
  properly run and requires at least some minor sysadmin experience to do
  so. There's a reason that almost every MediaWiki install in existence is
  completely out of date.
 

 Do you have some specific examples?


Extension management, upgrades, proper backend caching, a proper localization
cache, proper job running, running any maintenance script, using a *lot* of
different extensions, etc., etc.


 Also, if that's the case then removing file caching would be a step
 backwards.


Pretending that we support the lowest common denominator while not actually
doing it is the worst of all worlds. We should either support shared hosts
excellently, which is very difficult, or we should just stop acting like we
support them.

I'm not saying we should make the software unusable for shared hosts, but
we also shouldn't worry about supporting them for new features or
maintaining often broken features (like file cache) just because they are
useful on shared hosting. It makes the software needlessly complex for a
dying concept.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] File cache + HTTPS question

2013-10-01 Thread Ryan Lane
On Tue, Oct 1, 2013 at 3:00 PM, Jeroen De Dauw jeroended...@gmail.com wrote:

 Hey,

 This is only true if you want almost no functionality out of MediaWiki and
  you want it to be very slow.
 

 There are quite a few people running MW without a cache or other magic
 config who find it quite suitable for their needs, which can be quite
 non-trivial.

 MediaWiki is incredibly difficult to properly run and requires at least
  some minor sysadmin experience to do so. There's a reason that almost
  every MediaWiki install in existence is completely out of date.
 

 So because MediaWiki already sucks in some regards, it's fine to have it
  suck more in others as well? Is that really the point being made here?


I'm actually arguing that we should prioritize fixing the things that suck
for everyone over the things that suck for shared hosts. We should
especially not harm the large infrastructures just so that we can support
the barely usable ones. If we keep supporting shared hosts, we likely can't
break portions of MediaWiki into services without a lot of duplicated
code and effort (and bugs!).

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Officially supported MediaWiki hosting service?

2013-10-02 Thread Ryan Lane
On Wed, Oct 2, 2013 at 4:50 PM, Jay Ashworth j...@baylink.com wrote:

 - Original Message -
  From: Ryan Lane rlan...@gmail.com

  On Wed, Oct 2, 2013 at 12:48 PM, C. Scott Ananian
  canan...@wikimedia.org wrote:
 
   One intermediate position might be for WMF to distribute virtual
   machine images

   I'd rather provide cloud-init scripts with instructions on how to use
   them. The cloud-init script could pull and run a puppet module, or a salt
   module, etc. Providing images kind of sucks.

 Ok, time for me to throw an oar in the water, as an ops and support guy.

  My perception of Brion's use of "officially supported" was, roughly:
  whoever is providing support to these hosting customers has a direct,
  *formal* line of communication to the development staff.

 The reason you build images, and version the images, is that it provides a
 clear solid baseline for people providing such support to know (and,
 preferably, be able to put their fingers on) exactly the release you're
  running, so they can give you clear and correct answers -- it's not just
  going to be the MediaWiki release number that's the issue there.

 That's impossible to do reliably if you pull the code down and build it
 sui generis on each customer's machine... which is what I understand
 Ryan to be suggesting.

 It's a little more work to build images, but you're not throwing it
 away; you get it back in reduced support costs.


No. I'm suggesting that we provide cloud-init scripts [1] to let people
seed their virtual instances. A script would pull down a puppet, salt, chef,
juju, etc. repository, which would then install all the prerequisites, start
all the necessary services, install MediaWiki, and maybe install and
configure a number of extensions. We could skip cloud-init completely for
some of these: I think Vagrant has EC2 [2] and OpenStack [3] providers, and
Salt Stack has salt-cloud [4] (or you could use salty-vagrant [5]).
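
As a concrete illustration, a seed script along those lines can be as small
as this cloud-config file (the repository URL and module name here are
hypothetical):

    #cloud-config
    # First-boot seed: install puppet, fetch a MediaWiki module, apply it.
    packages:
      - git
      - puppet
    runcmd:
      - git clone https://example.org/mediawiki-puppet.git /opt/mw-puppet
      - puppet apply --modulepath=/opt/mw-puppet/modules -e 'include mediawiki'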

Some of these also already have MediaWiki modules created and usable. In
fact, juju uses MediaWiki for demo purposes relatively often. WMF has a
usable puppet module. I wrote a salt module for webplatform.org, which
we'll be publishing soon.

The nice part about this is that it's all configuration managed, can be
versioned and could also be used to upgrade MediaWiki and all related
infrastructure.

Images are a pain in the ass to build and maintain (I build and maintain
images): you need to keep them updated because the image will be insecure
otherwise, and they are giant. They are also far less flexible.

- Ryan

[1] http://cloudinit.readthedocs.org/en/latest/
[2] https://github.com/mitchellh/vagrant-aws
[3] https://github.com/cloudbau/vagrant-openstack-plugin
[4] https://github.com/saltstack/salt-cloud
[5] https://github.com/saltstack/salty-vagrant
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [IRC] wm-bot semi outage

2013-10-15 Thread Ryan Lane
On Tue, Oct 15, 2013 at 9:15 AM, Petr Bena benap...@gmail.com wrote:

 Hi,

 because of this bug: https://bugzilla.wikimedia.org/show_bug.cgi?id=55690

 wm-bot is unable to write logs to file storage. Unfortunately, as of
 10 minutes ago the MySQL storage broke as well. I have no idea why,
 but I am afraid that the only fix that comes to mind involves a bot
 restart, which isn't possible because all logs that couldn't be written
 are cached in memory (so if I restarted wm-bot we would
 lose 2-3 days of logs for all channels that are in RAM now).

 So, if someone wondered what is up with public channel logging, this
 is the reason why there are logs missing, not only in file storage at
 bots.wmflabs.org/~wm-bot/logs but also in SQL at
 http://tools.wmflabs.org/wm-bot/logs/

 I hope WMF will soon start selling "I love Gluster" t-shirts :-) I would
 get one.


It's possible to switch to NFS, you know ;).

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [IRC] wm-bot semi outage

2013-10-16 Thread Ryan Lane
On Wed, Oct 16, 2013 at 1:45 AM, Petr Bena benap...@gmail.com wrote:

 not that it would actually work better :P I would be most happy to
 switch to local /dev/vdb storage, which never had any problems, but for
 that wm-bot would need to be on its own project


That's not a good idea. Performance would be better (which you probably
don't need), but at the cost of decreased ability to recover. /dev/vdb goes
away with an instance. If a compute node fails and your instance is on it,
your data is gone.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Operations buy in on Architecture of mwlib Replacement

2013-11-14 Thread Ryan Lane
On Thu, Nov 14, 2013 at 8:13 AM, C. Scott Ananian canan...@wikimedia.org wrote:

 And I'll add that there's another axis: gwicke (and others?) have been
 arguing for a broader collection of services architecture for mw.  This
 would decouple some of the installability issues.  Even if PDF rendering
 (say) was a huge monster, Jimmy MediaWiki might still be able to simply
 install the core of the system.  Slow progress making PDF rendering more
 friendly wouldn't need to hamper all the Jane MediaWikis who don't need
 that feature.


Definitely, and others. Apart from decoupling installability issues, it also
breaks the application into separately maintainable applications that can
have teams of people working on them independently. The only thing needed to
ensure compatibility with other teams is a stable API, and that's what API
versioning is for. Having multiple services doesn't complicate things much,
unless you're running on a shared host.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] OpenID deployment delayed until sometime in 2014

2013-11-27 Thread Ryan Lane
On Wed, Nov 27, 2013 at 3:51 AM, Gerard Meijssen
gerard.meijs...@gmail.com wrote:

 According to the website of myopenid [1], they are closing down February 1.
 My hope was for the Wikimedia Foundation to come to the rescue and provide
 a non-commercial alternative to what Google and Facebook offer.

 I am extremely disappointed that we are not. I hope that what is done
 instead has at least a similar impact, rather than leaving it all to the
 commercial boys and girls. Privacy should not be left to commerce and
 national secret services.


Our account security honestly makes us a poor choice for an auth provider,
and you should never have considered us for this anyway. We don't have
password requirements, we don't offer two-factor auth, we don't offer good
ways to rescue a lost account, and we really don't want to be a target of
attack for the purpose of owning other websites.

Also, if you're really concerned about privacy you should be putting your
support behind Mozilla's BrowserID (Persona).

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making MediaWiki extensions installable via Composer

2013-12-08 Thread Ryan Lane
On Mon, Dec 9, 2013 at 12:06 AM, Jeroen De Dauw jeroended...@gmail.com wrote:

 Finding a way to separate MW the library from MW the application may be a
  solution to this conflict. I don't think this would be a trivial
  project, but it doesn't seem impossible either.
 

 That'd be fantastic if it happened, for many other reasons as well. For all
 intents and purposes it is a big castle in the sky though. I expect this to
 not happen in the coming years, unless there is a big shift in opinion on
 what constitutes good software design and architecture in the community.

  The side effect is that it removed the ability to use Composer to
   manage external components used by MW the library, which is Tyler's
   proposed use case [0].
 

 If the core community actually gets to a point where potential usage of
 third-party libraries via Composer is taken seriously, this will
 indeed need to be tackled. I do not think we are quite there yet. For
 instance, if we go down this road, getting a clone of MW will no longer be
 sufficient to have your wiki run, as some libraries will first need to be
 obtained. This forces a change to a very basic workflow. People who dislike
 Composer will thus scream murder. Hence I tend to regard this as a moot
 point for now. Though let's pretend this has already been taken care of and
 look at solutions to the technical problem.

 One approach that is a lot more feasible than making MediaWiki (partially)
 library-like is to specify the MW dependencies somewhere else, not in
 composer.json. Then when you install MediaWiki, these dependencies get
 added automatically to composer.json. And when you update it, new ones also
 get added. In a way, this is similar to the generation of LocalSettings.
 This is the approach I'd investigate further if we actually were at a
 point where a technical solution is needed.


No one has any major objections to composer *support*. People have
objections to composer being required, though, as many of us use git as our
method of installation and deployment and have very valid beliefs that
composer is inferior to git with submodules for a number of reasons.
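
For context, the Composer approach amounts to declaring extensions in the
wiki's composer.json and letting Composer fetch them; the package name and
version constraint below are illustrative:

    {
        "require": {
            "mediawiki/semantic-media-wiki": "~1.9"
        }
    }

The git-with-submodules approach instead pins each extension to an exact
commit inside the deploying repository, one commonly cited reason for
preferring it for deployment.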

I think the biggest obstacle so far is the half-assed way things are
currently happening, on all fronts. There's no real support from
Wikimedia in this regard. There's a "composer is the right way to do it,
and you're an idiot for not using it" attitude from some. There's also a
relatively large lack of understanding, all around, of how we can use this
for both extensions and MediaWiki.

Someone needs to actually take the reins on this, or let's stop wasting
the list's time with it.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wiki - Gerrit was Re: FWD: [Bug 58236] New: No longer allow gadgets to be turned on by default for all users on Wikimedia sites

2013-12-11 Thread Ryan Lane
On Wed, Dec 11, 2013 at 3:21 PM, Jeroen De Dauw jeroended...@gmail.com wrote:

 Hey,

 Has there been thought on how GitHub can potentially help here? I'm not
 sure it fits the workflow well, though I can make the following observations:


Unless you're implying that github writes some code for us, I'm going to
assume this is a troll from you and leave it at that.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mailing list etiquette and trolling

2013-12-11 Thread Ryan Lane
On Wed, Dec 11, 2013 at 5:11 PM, Jeroen De Dauw jeroended...@gmail.com wrote:

 In recent months I've come across a few mails on this list that only
 contained accusations of trolling. Those are very much not constructive and
 only serve to antagonize. I know some forums that have an explicit rule
 against this, which results in a ban on a second violation. If there is a
 definition of the etiquette for this list somewhere, I suggest a
 similar rule be added there. Thoughts?


To be fair, you were proposing that we use a proprietary third-party web
site for editing Wikimedia wiki pages, which would violate our privacy
policy and break our principles of openness. How was I not to think you
were trolling? My only alternative was to think you'd simply lost your
mind.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Reasonator use in Wikipedias

2014-01-21 Thread Ryan Lane
On Tue, Jan 21, 2014 at 7:17 AM, Gerard Meijssen
gerard.meijs...@gmail.com wrote:

 Hoi,

 At this moment Wikipedia red links provide no information whatsoever.
 This is not cool.

 In Wikidata we often have labels for the missing (=red link) articles. We
 can and do provide information from Wikidata in a reasonable way that is
 informative in the Reasonator. We also provide additional search
 information on many Wikipedias.

 In the Reasonator we have now implemented red lines [1]. They indicate
 when a label does not exist in the primary language that is in use.

 What we are considering is creating a template {{Reasonator}} that will
 present information based on what is available in Wikidata. Such a template
 would be a stand-in until an article is actually written. What we would
 provide is information presented in the same way as we provide it
 at this moment in time [2].

 This may open up a can of worms; Reasonator is NOT using any caching. There
 may be lots of other reasons why you might think this proposal is evil. All
 the technical evil has some merit, but you have to consider that
 the other side of the equation is that we are not sharing in the sum of
 all knowledge even when we have much of the missing requested information
 available to us.

 One saving (technical) grace: Reasonator loads roughly as quickly as
 Wikidata does.

 As this is advance warning, I hope that you can help with the issues that
 will come about. I hope that you will consider the impact this will have on
 our traffic and measure to what extent it grows our data.

 The Reasonator pages will not show up prettily on mobile phones .. neither
 does Wikidata, by the way. It does not consider Wikipedia Zero. There may
 be more issues that may require attention. But again, it beats not serving
 the information that we have to those who are requesting it.


I have a strong feeling you're going to bring Labs to its knees.

Sending editors to Labs is one thing, but you're proposing sending readers
to Labs, to a service that isn't cached.

If Reasonator is something we want to support for something like this,
maybe we should consider turning it into a production service?

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] wmf getting ready for puppet3, advice please

2014-01-28 Thread Ryan Lane
On Tue, Jan 28, 2014 at 2:41 AM, Ariel T. Glenn ar...@wikimedia.org wrote:

 Hi puppet wranglers,

 We're trying to refactor the WMF puppet manifests to get rid of reliance
 on dynamic scope, since puppet 3 doesn't permit it.  Until now we've
 done what is surely pretty standard puppet 2.x practice: assign
 values to a variable in the node definition and pick it up in the
 class from there dynamically.  Example: we set $openstack_version
 to a specific version depending on the node, and the nova and
 openstack classes do different things depending on that value.

 Without dynamic scoping that's no longer possible, so I'm wondering
 what people are doing in the real world to address this, and
 what best practices are, if any.  Hiera? Some sort of class
 parameterization?  Something else?

 It should go without saying that we'd like not to have to massively
 rewrite all our manifests, if possible.


In Puppet 3, variables assigned in the node definition are still global. The
node is the only place, other than facts (or hiera), where you can assign
them and have their scope propagate, so this'll continue working. I think
the future path is surely to use hiera rather than assigning variables in
role classes (and making tons of role classes based on realm/cluster/site/etc.).
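
To make that concrete, here's a minimal hiera 1.x sketch (the hierarchy and
values are illustrative, not WMF's actual data):

    # /etc/puppet/hiera.yaml -- lookup hierarchy, most specific first
    :backends:
      - yaml
    :hierarchy:
      - "nodes/%{::fqdn}"
      - common
    :yaml:
      :datadir: /etc/puppet/hieradata

    # /etc/puppet/hieradata/common.yaml might then contain:
    #   openstack_version: folsom
    # and a class reads it with hiera('openstack_version'), with
    # nodes/<fqdn>.yaml overriding the default for specific hosts.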

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Faidon Liambotis promoted to Principal Operations Engineer

2014-02-10 Thread Ryan Lane
On Mon, Feb 10, 2014 at 5:48 AM, Mark Bergsma m...@wikimedia.org wrote:

 I'm pleased to announce that Faidon Liambotis has been promoted to
 Principal Operations Engineer.


Definitely deserved and far too late in coming! Congrats Faidon, you do
great work and you're an awesome person to work with.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] deploying the most recent MediaWiki code: which branch?

2014-02-20 Thread Ryan Lane
On Thu, Feb 20, 2014 at 1:20 PM, Sumana Harihareswara suma...@wikimedia.org
 wrote:

 Reuben Smith of wikiHow asked:
 We're having a hard time figuring out whether we should be basing our
 wikiHow code off Mediawiki's external releases (such as the latest 1.22.2),
 or off the branches that WMF uses for their internal infrastructure (latest
 looks to be wmf/1.23wmf14).

 Do you have any thoughts or guidance on that? We're leaning towards moving
 to using the WMF internal branches, since we use MySQL as well, but I
 wanted to hear from different people about the drawbacks.


 Greg Grossmeier responded:

 The quick answer is:
 Feel free to base it off of either. There shouldn't be any WMF-specific
 things in those wmfXX branches. If there is, it is a commit called
 something like "Commit of various WMF live hacks". That one commit can
 be safely reverted.

 The wmfXX branches are made every week on Thursday morning (Pacific)
 before we deploy. As we get closer to the next release (1.23) the
 MediaWiki Release Managers (our outside contractors, Mark and Markus,
 not myself) will pick a wmfXX to call a Release Candidate.

 Going with a 1.22.x would give you stability at the loss of getting
 fixes faster and it means a bigger upgrade task when 1.23 is out.

 Summary: If you want to keep closer to WMF, pick a wmfXX branch (this
 assumes you'll follow at some pace behind WMF). If you don't want to be
 that bleeding edge, stick with 1.22.x.


Note that unless you're willing to keep up to date with WMF's relatively
fast pace of branching, you're going to miss security updates. No matter
what, if you use git you're going to get security updates more slowly, since
they are released in the tarballs first, then merged into master, then into
branches (is this accurate?). Sometimes the current WMF branch won't even
get the security updates, since they are already merged locally onto
Wikimedia's deployment server.

That said, due to the poor state of extension management in MediaWiki, the
only reasonable way to manage MediaWiki is to use the WMF branches since
they handle dependencies for the most popular extensions. I was hoping that
composer would make managing extensions easier, but I've been tracking it
in SMW and it's actually making things more difficult.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Drop support for PHP 5.3

2014-02-20 Thread Ryan Lane
On Thu, Feb 20, 2014 at 3:22 PM, Trevor Parscal tpars...@wikimedia.org wrote:

 Is that the rule then, we have to make MediaWiki work on anything Ubuntu
 still supports?

 Is there a rule?


We should strongly consider ensuring that the latest stable releases of
Ubuntu and probably RHEL (or maybe fedora) can run MediaWiki.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Drop support for PHP 5.3

2014-02-20 Thread Ryan Lane
On Thu, Feb 20, 2014 at 3:58 PM, James Forrester
jforres...@wikimedia.org wrote:

 On 20 February 2014 15:34, Ryan Lane rlan...@gmail.com wrote:

  On Thu, Feb 20, 2014 at 3:22 PM, Trevor Parscal tpars...@wikimedia.org
  wrote:
 
   Is that the rule then, we have to make MediaWiki work on anything
   Ubuntu still supports?
  
   Is there a rule?
  
 
  We should strongly consider ensuring that the latest stable releases of
  Ubuntu and probably RHEL (or maybe fedora) can run MediaWiki.
 

 Is that a "no, only the latest stable releases", or a "yes, the latest
 stable releases and also the LTS releases"?


If it isn't an LTS, it isn't a stable release.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-06 Thread Ryan Lane
On Thu, Mar 6, 2014 at 3:54 PM, Tyler Romeo tylerro...@gmail.com wrote:

 I think this entire thing was a big failure in basic software development
 and systems administration. If MobileFrontend is so tightly coupled with
 the desktop login form, that is a problem with MobileFrontend. In addition,
 the fact that a practically random code change was launched into production
 an hour later without so much as a test... That's the kind of thing that
 gets people fired at other companies.


At shitty companies maybe.

Things break. You do a post-mortem, track the things that led to the
outage, and try to make sure it doesn't break again, ideally by adding
automated tests, if possible.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-07 Thread Ryan Lane
On Fri, Mar 7, 2014 at 2:54 PM, Tyler Romeo tylerro...@gmail.com wrote:

 On Fri, Mar 7, 2014 at 5:39 PM, George Herbert george.herb...@gmail.com
 wrote:

   With all due respect; hell, yes, development comes in second to
   operational stability.
  
   This is not disrespecting development, which is extremely important by
   any measure. But we're running a top-10 worldwide website, a key worldwide
   information resource for humanity as a whole. We cannot cripple
   development to try and maximize stability, but stability has to be
   priority 1. Any large website's teams will have the same attitude.
  
   I've had operational outages reach the top of everyone's news
   source/feed/newspaper/broadcast. This is an exceptionally unpleasant
   experience.
 

 If you really think stability is top priority, then you cannot possibly
 think that the current deployment process is sane.


Developers shouldn't be blocked on deployment or operations. Development is
expensive and things will break either way. It's good to assume things will
break and:

1. Have a simple way to revert
2. Put tests in for common errors
3. Have post-mortems where information is kept for historical purposes and
bugs are created to track action items that come from them


 Right now you are placing the responsibility on the developers to make sure
 the site is stable, because any change they merge might break production
 since it is automatically sent out. If anything, that gives the appearance
  that the operations team doesn't care about stability, and would rather
  wait until things break and revert them.


Yes! This is a _good_ thing. Developers should feel responsible for what
they build. It shouldn't be operations' job to make sure the site is
stable for code changes. Things should go more in this direction, in fact.

I'm not totally sure what you mean by "automatically sent out",
though. Deploys are manual.


 It is the responsibility of the operations team to ensure stability. Having
 to revert something because that's the only way production will be stable
 is not a proper workflow.


It's the responsibility of the operations team to ensure stability at the
infrastructure level, not at the application level. It's sane to expect to
revert things, because things will break no matter what. Mean time to
recovery is just as important as, or more important than, mean time between
failures.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-08 Thread Ryan Lane
On Sat, Mar 8, 2014 at 1:38 PM, Tyler Romeo tylerro...@gmail.com wrote:

 On Fri, Mar 7, 2014 at 7:04 PM, Ryan Lane rlan...@gmail.com wrote:

   Yes! This is a _good_ thing. Developers should feel responsible for what
   they build. It shouldn't be operations' job to make sure the site is
   stable for code changes. Things should go more in this direction, in fact.
 

 If you want to give me root access to the MediaWiki production cluster,
 then I'll start being responsible for the stability of the site.


You don't work for WMF, so you personally have no responsibility for the
stability of the site. It's the WMF developers who have a responsibility to
ensure code they're pushing out won't break the site. As an aside, root
isn't necessary to maintain MediaWiki on WMF's cluster ;).

  Tell me something, what about the developers of MariaDB? Should they be
  responsible for WMF's stability? If they accidentally release a buggy
  version, are they expected to revert it within hours so that the WMF
  operations team can redeploy? Or will the operations team actually test new
  releases first, and refuse to update until things start working again?


You're confused about the role of operations and development, especially
with respect to organizations that let developers deploy and maintain the
applications they develop.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-08 Thread Ryan Lane
On Sat, Mar 8, 2014 at 2:06 PM, Ryan Lane rlan...@gmail.com wrote:

 On Sat, Mar 8, 2014 at 1:38 PM, Tyler Romeo tylerro...@gmail.com wrote:

 On Fri, Mar 7, 2014 at 7:04 PM, Ryan Lane rlan...@gmail.com wrote:

   Yes! This is a _good_ thing. Developers should feel responsible for what
   they build. It shouldn't be operations' job to make sure the site is
   stable for code changes. Things should go more in this direction, in fact.
 

 If you want to give me root access to the MediaWiki production cluster,
 then I'll start being responsible for the stability of the site.


 You don't work for WMF, so you personally have no responsibility for the
 stability of the site. It's the WMF developers who have a responsibility to
 ensure code they're pushing out won't break the site. As an aside, root
 isn't necessary to maintain MediaWiki on WMF's cluster ;).


Of course, I'm really saying that everyone who has access to deploy has
this responsibility.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-08 Thread Ryan Lane
On Sat, Mar 8, 2014 at 6:21 PM, Tyler Romeo tylerro...@gmail.com wrote:

 On Sat, Mar 8, 2014 at 8:38 PM, Marc A. Pelletier m...@uberbox.org
 wrote:

  The answer is: no, obviously not.  And for that reason the MariaDB
  developers are not allowed to simply push their latest code on our
  infrastructure with a simple +2 to code review.
 

 Yes, and my point is that MediaWiki developers shouldn't be able to do that
 either! Receiving a +2 should not be the only line between a patch and
 deployment. Changes should be tested *before* deployment. Nobody is saying
 that developers are not responsible for writing good and working code, but
 there need to be guards against things like this happening.


Wikimedia uses deployment branches. Just because someone +2/merges into
master doesn't mean it immediately shows up on Wikimedia servers. It needs
to go into a deployment branch, then it needs to get deployed by a person.
Also, we use a gating model, so tests are required to pass before something
is merged. I believe there are some tests that are essential, but take too
long to run, so they aren't gating, but the situation isn't as dire as you
claim.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-08 Thread Ryan Lane
On Sat, Mar 8, 2014 at 7:05 PM, Tyler Romeo tylerro...@gmail.com wrote:

 On Sat, Mar 8, 2014 at 9:48 PM, Ryan Lane rlan...@gmail.com wrote:

   Wikimedia uses deployment branches. Just because someone +2/merges into
   master doesn't mean it immediately shows up on Wikimedia servers. It
   needs to go into a deployment branch, then it needs to get deployed by a
   person. Also, we use a gating model, so tests are required to pass before
   something is merged. I believe there are some tests that are essential,
   but take too long to run, so they aren't gating, but the situation isn't
   as dire as you claim.
 

 OK, then how did this change get deployed if it broke tests?


The jenkins report says it passed tests, hence it was deployed. If
there are other tests that aren't reporting to gerrit, or if there's a test
that needs to be added, maybe that's a post-mortem action to track?

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-08 Thread Ryan Lane
On Sat, Mar 8, 2014 at 9:34 PM, MZMcBride z...@mzmcbride.com wrote:

 Chad wrote:
  On Mar 8, 2014 1:42 PM, Brandon Harris bhar...@wikimedia.org wrote:
 https://en.wikipedia.org/wiki/Badger
 
 New rule: calling "badger" is synonymous with asking for the moderation bit
 for yourself.

 Fine, but first we have to ban the people who top-post and don't trim
 unnecessary parts of their e-mails. :-)

 I personally read the badger link as a politer(?) version of calling
 someone a troll.


It's actually something used on WMF staffer lists, and it's supposed to be a
cute way of telling people a thread is getting out of hand. Some people use
it that way, and I feel Brandon did in this case, but it's most often used
to shut down conversations that people find unpleasant or controversial, and
its actual effect is usually to enrage those it's used against.

I'd wager it has an overwhelmingly negative effect on communication (and
WMF's internal politics likely prove that), and I'm with Chad in saying
anyone who uses it here should be moderated. This is of course off topic,
so I'll leave it at that.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] wikitech downtime Tuesday, April 1st 9AM PST (16:00UTC)

2014-03-26 Thread Ryan Lane
On Wed, Mar 26, 2014 at 1:40 PM, Petr Bena benap...@gmail.com wrote:

 Will the new wiki be faster? Will it stop logging me off randomly even if I
 check "remember me"? Will it serve coffee to new users?


Do you still have an issue where it logs you off randomly? That was very
briefly an issue that I caused when I switched keystone to a redis token
backend and incorrectly purged everyone's wiki tokens. I fixed it a few
days later. If you're still having the issue, you should open a bug.

Also, wikitech pretty much assumes you'll be logged in, so I don't see it
getting much faster, seeing as MediaWiki is slow as hell for logged-in
users no matter what you do.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Abandoning -1 code reviews automatically?

2014-04-14 Thread Ryan Lane
On Sun, Apr 13, 2014 at 1:37 PM, Marcin Cieslak sa...@saper.info wrote:

 2) https://gerrit.wikimedia.org/r/#/c/11562/

 My favourite -1 here is "needs rebase".


Well, obviously trivial rebases should be done automatically by the system
(which OpenStack's system does), and changes that need a rebase due to
conflicts should be fixed. Reviewer time is generally in short supply, so
it makes sense to have the committer do any conflict resolution.


 Regarding Openstack policies: I'd say we should not follow them.

 I used to be the #2 git-review contributor according to launchpad
 until recently. I gave up mainly because of my inability
 to propose some larger change to this relatively simple
 script. For a nice example of this, please see

 https://review.openstack.org/#/c/5720/

 I gave up contributing to this project some time
 after this; I have no time to play politics to submit
 a set of tiny changes and play the rebase game depending
 on the random order in which they might have been reviewed.


This seems like an odd change to use as an example. There seems to be no
politics in play there. All of the reviews were encouraging, but it looked
like there was a release happening during your reviews and it caused a
number of merge conflicts. The change was automatically abandoned, but you
could have restored it and pinged someone on the infra team.


 The next time I find time to improve Johnny the casual
 developer's experience with gerrit I will just rewrite
 git-review from scratch. The amount of red tape
 openstack-infra has built around their projects is
 simply not justifiable for such a simple utility
 as git-review. Time will tell if gerrit-based
 projects generally fare better than others.


Maybe, until you start looking at their statistics:


http://stackalytics.com/?release=icehouse&metric=marks&project_type=openstack&module=&company=&user_id=


If you notice, this release cycle they had 1,200 reviewers. One
organization had 20k reviews over the cycle, and the top 5 each had over
10k reviews. Their process scales way better than Wikimedia's, but that's
also due to the way projects are split up and organized as well.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Affiliation in username

2014-05-09 Thread Ryan Lane
On Wed, May 7, 2014 at 1:22 PM, Jared Zimmerman 
jared.zimmer...@wikimedia.org wrote:

 Affiliations change, and user names are quite difficult to change; this
 sounds like something that would be good for a structured profile, not for
 a user name.


Indeed, or in the user preferences so that it could be accessed natively.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Login to Wikimedia Phabricator with a GitHub/Google/etc account?

2014-05-15 Thread Ryan Lane
On Thu, May 15, 2014 at 5:20 PM, Quim Gil q...@wikimedia.org wrote:

 This is a casual request for comments about the use of 3rd party
 authentication providers for our future Wikimedia Phabricator instance.

 Wikimedia Phabricator is expected to replace Bugzilla, Gerrit and many
 other tools, each of them having their own registration and user account.
 The plan is to offer Wikimedia SUL (your Wikimedia credentials) as the
 default way to login to Phabricator -- details at
 http://fab.wmflabs.org/T40

 However, Phabricator can support authentication using 3rd party providers
 like GitHub, Google, etc. You can get an idea at
 https://secure.phabricator.com/auth/start/

 There are good reasons to plan for Wikimedia SUL only (consistency with the
 rest of Wikimedia projects), and there are good reasons to plan for other
 providers as well (the easiest path for most first-time contributors).

 What do you think? Should we offer alternatives to Wikimedia login? If so,
 which ones?


Will Labs no longer have the same authentication as the rest of the
tooling? Is this something that will be solved before the switch?

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Login to Wikimedia Phabricator with a GitHub/Google/etc account?

2014-05-16 Thread Ryan Lane
On Thu, May 15, 2014 at 7:36 PM, Steven Walling steven.wall...@gmail.com wrote:

 On Thu, May 15, 2014 at 2:20 PM, Quim Gil q...@wikimedia.org wrote:

  However, Phabricator can support authentication using 3rd party providers
  like GitHub, Google, etc. You can get an idea at
  https://secure.phabricator.com/auth/start/
 

 I think since this is already built and would require no extra work, we
 should definitely support GitHub and Persona as well.


Persona is dead. It's no longer being actively developed by Mozilla.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] translatewiki.net compromised: server reinstalled

2014-07-10 Thread Ryan Lane
On Thu, Jul 10, 2014 at 10:09 AM, Siebrand Mazeland siebr...@kitano.nl
wrote:

 This is an email to shell account holders on translatewiki.net and to
 wikitech-l, so that you are informed.

 Today at 08:10 UTC Niklas noticed that the translatewiki.net server had
 been compromised. We saw some suspicious files in /tmp and a few processes
 that didn't belong:

 elastic+ 22862  0.0  0.0   2684  2388 ?S04:53   0:00
 /tmp/freeBSD /tmp/freeBSD 1
 elastic+ 31575  0.0  0.0   2684  2388 ?S06:38   0:00
 /tmp/freeBSD /tmp/freeBSD 1
 elastic+ 31580 16.7  0.0  90816   724 ?Ssl  06:38  16:26
 [.Linux_time_y_2]

 We gathered data and looked at our recent traffic statistics. We drew the
 following conclusions:

 - Only the Elasticsearch account had been compromised. The intruder did not
 gain access to other accounts.
 - The attack was possible because the Elasticsearch process was bound to
 all interfaces instead of only the localhost interface, and dynamic
 scripting was enabled because it is required by CirrusSearch
 (CVE-2014-3120).
 - A virtual machine was started, and given the traffic that was generated
 (about 1TB in the past 4 days), we think this was a DDoS drone. The process
 reported to an IP address in China.
 - A server reinstall is the right thing to do (better safe than sorry).

 The compromised server was taken off-line around 10:00 UTC today.

 Actions taken:
 - Bind Elasticsearch only to localhost from now on:
 https://gerrit.wikimedia.org/r/#/c/145262/
 - Reinstall the server

 Actions to be taken:
 - Configure a firewall to only allow expected traffic to enter and exit the
 translatewiki.net server so that something like the added virtual machine
 could not have communicated to the outside world.
 - As a precaution, shell account holders should change any secret that they
 have used on the translatewiki.net server in the past 7 days.


Did this server have access to private ssh keys that are used to push/merge
code for upstream repos? If so, will they be rotated as well?
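
For admins running their own Elasticsearch behind MediaWiki, the hardening
described above amounts to a couple of lines of elasticsearch.yml; a sketch
(the exact settings in the linked gerrit change may differ):

    # /etc/elasticsearch/elasticsearch.yml (Elasticsearch 1.x era)
    # Bind only to the loopback interface so remote hosts cannot
    # reach the REST API at all:
    network.host: 127.0.0.1
    # Disabling dynamic scripting (script.disable_dynamic: true) would
    # also mitigate CVE-2014-3120, but the report above notes that
    # CirrusSearch required dynamic scripting at the time, so binding
    # to localhost was the practical fix.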

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] News about stolen Internet credentials; reducing Wikimedia reliance on usernames and passwords

2014-08-07 Thread Ryan Lane
On Thu, Aug 7, 2014 at 6:58 AM, Casey Brown li...@caseybrown.org wrote:

 On Thu, Aug 7, 2014 at 8:10 AM, Risker risker...@gmail.com wrote:
  A lot of the solutions normally bandied about involve things like
  two-factor identification, which has the additional password coming
  through a separate route (e.g., gmail two-factor ID sends a second
  password as a text to a mobile, and means having more expensive
  technology) or using technology like dongles that cannot be sent to
  users in certain countries.

 Actually, most modern internet implementations use the TOTP algorithm, an
 open standard that anyone can use for free.
 https://en.wikipedia.org/wiki/Time-based_One-time_Password_Algorithm
 One of the most common methods, other than text messages, is
 the Google Authenticator app, which anyone can download for free on a
 smartphone: https://en.wikipedia.org/wiki/Google_Authenticator


Yep. This. It's already being used for high-risk accounts on
wikitech.wikimedia.org. It's not in good enough shape to be used anywhere
else, since if you lose your device you'd lose your account. Supporting
two-factor auth also requires supporting multiple ways to rescue your
account if you lose your device (and don't write down your scratch tokens,
which is common). Getting this flow to work in a way that actually adds any
security benefit is difficult. See the amount of effort Google has gone
through for this.
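
For the curious, the TOTP algorithm itself is tiny. Here's a minimal
RFC 6238 sketch in Python (the shared secret is illustrative; a real
implementation also needs clock-skew windows, rate limiting, and scratch
codes):

    import base64
    import hashlib
    import hmac
    import struct
    import time

    def totp(secret_b32, step=30, digits=6):
        """Generate a time-based one-time password (RFC 6238)."""
        key = base64.b32decode(secret_b32, casefold=True)
        counter = int(time.time()) // step       # 30-second time window
        msg = struct.pack(">Q", counter)         # 8-byte big-endian counter
        digest = hmac.new(key, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F               # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    print(totp("JBSWY3DPEHPK3PXP"))  # illustrative shared secret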

Let's be a little real here, though. There's honestly no good reason to
target these accounts. There's basically no major damage they can do and
there's very little private information accessible to them, so attackers
don't really care enough to attack them.

We should take basic account security seriously, but we shouldn't go
overboard.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] News about stolen Internet credentials; reducing Wikimedia reliance on usernames and passwords

2014-08-07 Thread Ryan Lane
On Thu, Aug 7, 2014 at 11:27 AM, Pine W wiki.p...@gmail.com wrote:

 There are good reasons people would target checkuser accounts, WMF staff
 email accounts, and other accounts that have access to lots of private info
 like functionary email accounts and accounts with access to restricted IRC
 channels.


WMF uses gmail; they should require two-factor authentication for their
employees if they care about that. Restricted IRC channels also have nothing
to do with Wikimedia wiki account security (and IRC security is a joke
anyway, so if we're really relying on that to be secure, shame on us).

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] planned move to phabricator

2014-08-24 Thread Ryan Lane
On Sun, Aug 24, 2014 at 8:06 PM, svetlana svetl...@fastmail.com.au wrote:

 Hi all.

 I am reading https://www.mediawiki.org/wiki/Phabricator/Migration and it
 says:

 A review of our project management tools [1] was started, including an
 assessment of our needs and requirements, and a discussion of available
 options. A request for comment [2] was set up to gauge interest in
 simplifying our development toolchain and consolidating our tools (gitblit,
 Gerrit, Jenkins, Bugzilla, RT, Trello, and Mingle) into Phabricator. The
 result of the RFC was in favor of Phabricator.

 The RFC linked is
 https://www.mediawiki.org/wiki/Requests_for_comment/Phabricator . It is
 clearly put wrong. People were not given alternatives. They were told
 "let's move to Phabricator" and they said "OK, let's do that". I am
 frustrated. Decide on X, open a "let's do X" RFC, get support - this is not
 how decisions are made.

 Would anyone oppose me writing another RFC detailing all the available
 options (including staying with bugzilla and what cons it has) and posting
 an URL to it here?


The RFC was open for quite some time. You missed the comment period, which
is unfortunate, but the decision has already been made and work is already
underway to switch.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTTPS Wikipedia search for Firefox - update?

2012-12-28 Thread Ryan Lane
On Fri, Dec 28, 2012 at 9:37 AM, Amir E. Aharoni 
amir.ahar...@mail.huji.ac.il wrote:

 Hi,

 Mozilla are asking for update about bug
 https://bugzilla.mozilla.org/show_bug.cgi?id=758857

 Can anybody help?


There's no change. We're still waiting on MediaWiki changes to occur before
we switch logged-in users to HTTPS by default.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] HTTPS Wikipedia search for Firefox - update?

2012-12-29 Thread Ryan Lane
On Fri, Dec 28, 2012 at 10:58 PM, Jeremy Baron jer...@tuxmachine.com wrote:

 On Sat, Dec 29, 2012 at 6:52 AM, bawolff bawolff...@gmail.com wrote:
  On Fri, Dec 28, 2012 at 1:50 PM, Ryan Lane rlan...@gmail.com wrote:
   There's no change. We're still waiting on MediaWiki changes to occur
   before we switch logged-in users to HTTPS by default.
 [...]

   Furthermore, what does making the firefox search box use https have
   to do with having users log in to secure by default? I suppose we
   might not want people to lose their login if they're logged in to
   insecure and search via firefox with secure login - is that what
   you're concerned about, or is it something else?

 I was thinking it was just wanting to ramp up load internally first
 (where it's really easy and fast to ramp back down if needed) and then
 expand to other places when we're more confident.

 Turning on HTTPS by default for all logged in users is one of those
 ways to ramp up load in a controlled and easily reversible way.


This.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] monitoring / control system for bots

2013-01-06 Thread Ryan Lane
On Sat, Jan 5, 2013 at 6:48 AM, Daniel Schwen li...@schwen.de wrote:

 What we rather need is monitoring for the instances. My bots have not been
 the problem; so far the source of unreliable bot operation has been the
 underlying infrastructure, be it the moving of home dirs and the read-only
 fs, or overloaded instances.


Are you subscribed to the labs-l list? We sent out a warning about the home
directory change and about what you'd need to do once we made it. We don't
like to reboot people's instances for them, so we left instances running
with a read-only home directory until they were ready to reboot.

If there are overloaded instances, create more and run fewer bots
on a single instance. If your bot uses a lot of resources, then it can have
its own instance.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] How to contribute to sysadmin / devops

2013-01-08 Thread Ryan Lane
On Tue, Jan 8, 2013 at 9:42 AM, Quim Gil q...@wikimedia.org wrote:

 Re https://www.mediawiki.org/wiki/Sysadmin_hub


 On 01/08/2013 02:52 AM, Bryan Tong Minh wrote:

 That page is for MediaWiki sysadmins, not for Wikimedia sysadmins.


 Yes, I know, but put yourself in the shoes of a skilled and goodwilling
 sysadmin landing at mediawiki.org and willing to contribute. If there is
 a Sysadmin Hub promoted on the homepage and pushed in the search results,
 then that is probably the page s/he will visit sooner or later. Either we
 improve that page to include relevant links for sysadmins willing to
 contribute, or we create a different landing page, or...


 Re Labs and opportunities to contribute

 Thank you for the update on labs and the current (little) chances to get
 involved in Wikimedia sysadmin tasks.

 However, is that really all? Is there anything a sysadmin could help with
 at https://toolserver.org/ , the many bots, products/components in
 bugzilla, or support for other MediaWiki sysadmins?


I'm just a very basic hobbyist sysadmin for my MediaWiki pet project [1], so
 I can hardly put myself in the skin of a proper sysadmin willing to
 volunteer. But there must be something this kind of profile can do to get
 involved in MediaWiki - Wikimedia tech? Right now all we seem to have are
 Wikimedia manual pages.

 PS: but OK, I will put this task back in my backlog until Labs and the
 whole idea of sysadmin contributors is more mature. In the meantime your
 feedback and help is welcome in this thread or at

 https://www.mediawiki.org/wiki/Talk:Sysadmin_hub
 https://www.mediawiki.org/wiki/Talk:How_to_contribute


Actually, the first goal of Labs was to provide a means for operations
volunteers to contribute. It's one of the easier things to do in Labs
currently. Contributing to our beta infrastructure isn't the only thing
someone can do to help out with our infrastructure.

We have nearly 150 projects in Labs. Some of them are development related,
but a large number are for improving infrastructure in production. Here's
some projects that are infrastructure related:

* analytics
* ganglia
* gerrit
* gluster
* integration
* maps
* nginx
* openstack
* otrs
* packaging
* pediapress
* performance
* puppet
* puppet-cleanup
* ... about 2x more

There's a list of TODO items: 
http://www.mediawiki.org/wiki/Wikimedia_Labs#TODO

There's a list of proposals: 
http://www.mediawiki.org/wiki/Wikimedia_Labs#Proposals

There's a list of Labs infrastructure bugs: 
https://bugzilla.wikimedia.org/buglist.cgi?list_id=171774&resolution=---&resolution=LATER&resolution=DUPLICATE&query_format=advanced&component=Infrastructure&product=Wikimedia%20Labs


I think what we're mostly missing is a quick list of easy things to do or
fix as a call to action.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] How to contribute to sysadmin / devops

2013-01-09 Thread Ryan Lane
On Wed, Jan 9, 2013 at 12:31 PM, Leslie Carr lc...@wikimedia.org wrote:

 On Wed, Jan 9, 2013 at 12:02 PM, Quim Gil q...@wikimedia.org wrote:
 
  Can we define ONE task a good sysadmin could take here and now?
 

 Sure, if you want to copy the information from this ticket - that's a good
 task
 https://rt.wikimedia.org/Ticket/Display.html?id=4060


Here's one listed in bugzilla, as well: 
https://bugzilla.wikimedia.org/show_bug.cgi?id=36994

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] creating the context-aware sidebar

2013-01-21 Thread Ryan Lane
On Mon, Jan 21, 2013 at 6:07 AM, Yury Katkov katkov.ju...@gmail.com wrote:

 Hi guys!

 I'm searching for a good sidebar extension for a MediaWiki project; can
 anyone recommend something nice? I've tried SidebarEx, NavBlocks, and
 CustomSidebar


https://www.mediawiki.org/wiki/Extension:DynamicSidebar

Maintained by myself for use on labsconsole.


 Requirements:

 1) Full MediaWiki markup support;


It supports the same level of markup as the normal sidebar.


 2) show different content for different groups of users;


It handles this.


 3) show different content for different categories. Or maybe using the
 sidebar calls inside the Templates that will then be invoked on the
 pages, creating sidebars that are specific to the given page;


It does not support sidebars specific to a given page. It would be
necessary to disable sidebar caching to do so.

Feel free to add this, but make it optional via a config option if you do
so.
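
For whoever picks that up, here's a minimal sketch of what such an option
might look like in LocalSettings.php (the variable name is hypothetical;
nothing like it exists in the extension today):

  # Hypothetical toggle: allow per-page sidebars at the cost of
  # skipping the sidebar cache. Off by default so existing wikis
  # see no performance change.
  $wgDynamicSidebarPerPage = false;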


 4) nice appearance.


It looks and acts like the normal sidebar.

5) ideally a live project that is supported by developers


I maintain it and am happy to take changes in Gerrit.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Integrating MediaWiki into MS SBS

2013-02-02 Thread Ryan Lane
On Fri, Feb 1, 2013 at 6:29 PM, Jay Ashworth j...@baylink.com wrote:

 - Original Message -
  From: Dan Andreescu dandree...@wikimedia.org

  The following manual seems to be the most actively maintained guide
  for getting MediaWiki installed on Windows:
 
  http://www.mediawiki.org/wiki/Manual:Running_MediaWiki_on_Windows
 
  If you run into any problems, I'd suggest adding them to the manual
  along with any resolutions you or others come up with. Good luck!

 I'm not sure he actually wants to run it on Windows.

 He may just need SSO with Active Directory.

https://encrypted.google.com/search?q=mediawiki+active+directory


If that's the case, the LDAP extension works for that:

http://www.mediawiki.org/wiki/Extension:LDAP_Authentication

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Documentation talk, the second

2013-02-06 Thread Ryan Lane
On Wed, Feb 6, 2013 at 12:45 PM, Steven Walling steven.wall...@gmail.com wrote:

 On Wed, Feb 6, 2013 at 11:48 AM, Chad innocentkil...@gmail.com wrote:

  Well, we enabled transwiki importing on labsconsole from wikitech (see
  https://labsconsole.wikimedia.org/wiki/Special:Import). I don't see any
  reason we couldn't start importing things.
 

 Being someone who often uses wikitech and almost never uses labsconsole, it
 doesn't make sense to me why we'd make a documentation space whose scope is
 ostensibly just Labs the hub for everything. What's the rationale?


First, we'd be merging them and renaming labsconsole to wikitech. Second,
it's absurd to have operations documentation spread across two places, and
wikitech is almost solely operations documentation. Third, it's easier to
create an account on labsconsole, so people can actually edit the docs.
Fourth, someone actually maintains the labsconsole wiki. Fifth, it actually
makes more sense to host things like project documentation on wikitech than
it does to host it on mediawiki.org, but it's too hard for folks to edit
wikitech (see #3). By using labsconsole we could also move the project
documentation there as well. Sixth, we have too many wikis.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Documentation talk, the second

2013-02-07 Thread Ryan Lane
On Thu, Feb 7, 2013 at 2:29 AM, Petr Bena benap...@gmail.com wrote:

 Everyone knows your phone is more powerful than half of the production
 cluster, but now back to business.

 We should make a structure for this new documentation base we are going to
 create. I think that besides categories, there might be some namespaces, at
 least Documentation: or something like that, where we could put stuff, and
 it should be divided into:

 Wikimedia operation (what is now mostly on wikitech)
  - cluster a
   -- server 1
   -- server 2 etc
  - cluster b...
  ...


I'm in favor of just not importing the server docs. There may be a couple
of ops folks who don't agree with me here, though.


 Wikimedia labs
  - project A
  -- instance A-1
  -- instance A-2
 ...


I'm in favor of heavily refactoring the project pages. They haven't really
had much love since they were introduced. Can you give more of an idea of
how this would work?

At minimum I'd like to move the projects out of the Nova_Resource
namespace.


 Wikimedia bots
  - bot A
  -- developer docs (for devs)
  -- operation docs (for ops)
  -- user manuals (for end users)

 Wikimedia tools
  - tool A
  -- developer docs (for devs)
  -- operation docs (for ops)
  -- user manuals (for end users)

 other stuff

 this is my proposal, of course it could be done in a different way but we
 should make a final decision before we start importing stuff or we create a
 mess


Documenting tools and bots this way seems sane to me.

BTW, for all of this, are you describing pages with subsections, or lots of
subpages?

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bootstrap based theme for Mediawiki

2013-02-09 Thread Ryan Lane
Patrick had also shown this to me a while back and I've made a demo for a
replacement skin on labsconsole using a modified version of it:

https://labsconsole-test.wmflabs.org/wiki/Main_Page

The strapping skin itself has a few bugs that need to be fixed. I've fixed
them and will be working on upstreaming them.


On Fri, Feb 8, 2013 at 1:37 PM, Yuvi Panda yuvipa...@gmail.com wrote:

 https://github.com/OSAS/strapping-mediawiki looks pretty awesome! (Example
 at http://www.ovirt.org/Home). This also makes bootstrap's classes /
 layouting helpers available to content inside the wiki itself, which is
 pretty cool.

 (thanks to Patrick Reilly on IRC)

 --
 Yuvi Panda T
 http://yuvi.in/blog
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Developer Hub update

2013-02-09 Thread Ryan Lane
On Fri, Feb 8, 2013 at 12:30 PM, Petr Bena benap...@gmail.com wrote:

 Are you going to reference software that has nothing to do with MediaWiki
 on mediawiki.org as well? If not, then keep the hub we have on meta...


I thought it was decided quite a long time ago that meta was for discussing
the projects and not a place to keep random tech documentation, or am I
mistaken?

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Maria DB

2013-02-13 Thread Ryan Lane
As far as I know we're using a custom MariaDB version in production, not
the Ubuntu version. It would be ideal to use the same versions for
everything, but just switching to MariaDB isn't going to solve anything
for us.

Also, we're not using MariaDB everywhere in production. We're using a mix
of MySQL and MariaDB. Let's not change anything till we know how that's
going to work out in production.


On Wed, Feb 13, 2013 at 5:19 AM, Petr Bena benap...@gmail.com wrote:

 Okay - so what is the outcome? Should we migrate the beta cluster? Are we
 going to use it in production?


 On Wed, Feb 13, 2013 at 2:08 PM, Chad innocentkil...@gmail.com wrote:

  On Wed, Feb 13, 2013 at 8:05 AM, bawolff bawolff...@gmail.com wrote:
   Umm there was a thread several months ago about how it is used on
 several
   of the slave dbs, if I recall.
  
 
  Indeed, you're looking for mariadb 5.5 in production for english
  wikipedia
 
  http://www.gossamer-threads.com/lists/wiki/wikitech/319925
 
  -Chad
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Engineering Announcement: Marc-Andre Pelletier joins TechOps

2013-02-16 Thread Ryan Lane
On Fri, Feb 15, 2013 at 11:25 PM, Ct Woo ct...@wikimedia.org wrote:

 All,

 I am pleased to announce Marc-Andre will be joining us as a Technical
 Operations Engineer (contractor) starting this February 25, 2013. His
 primary focus will be to build up the Wikimedia Labs infrastructure and to
 assist the community developers to migrate their tools to that
 infrastructure, especially those residing on the Toolserver today.

 Marc-Andre is an active Wikipedian, better known as 'Coren' on English
 Wikipedia, where he has been a volunteer editor since 2006 and has served
 as administrator and arbitrator. He has also always kept himself involved
 with the technical and procedural aspects of automated editing (so-called
 bots), having written and operated a copyright-violation detection bot
 for several years.

 Marc has been a Unix system administrator and occasional computer science
 instructor for 20+ years, in fields ranging from telecommunication to game
 development. He studied IT at  École de Technologie Supérieure (Canada).

 Please join me to welcome him and you can find him on IRC (freenode.net)
 using the nick 'Coren'.


\o/ Great to have you on the team Coren!

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Sending 200s for empty user pages of valid users

2013-02-21 Thread Ryan Lane
See:

https://bugzilla.wikimedia.org/show_bug.cgi?id=45255

https://gerrit.wikimedia.org/r/#/c/50305/

Just in case this is controversial, I thought I'd bring this topic up on
the list.

We currently send 404s for empty user pages, even for valid users. It would
be ideal to use user page urls as OpenID provider identity urls, but most
users never create their user pages. OpenID expects identity urls to be
200s, not 404s. I'd like us to send 200s rather than 404s for any valid
user's user pages, whether the page has content or not.
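
The difference is easy to check from the command line. Something along
these lines (User:Example is just a placeholder for any registered account
without a created user page) currently prints 404, and would print 200
after the change:

  curl -s -o /dev/null -w '%{http_code}\n' \
    https://en.wikipedia.org/wiki/User:Example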

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikitech-l Digest, Vol 115, Issue 93

2013-02-21 Thread Ryan Lane
Our bugs are at https://bugzilla.wikimedia.org. Was there anything
specific you'd like to help with?


On Thu, Feb 21, 2013 at 4:29 PM, Kancer Ezeroğlu
ezeroglukan...@gmail.com wrote:

 Hi everybody,

 I want to help with Wikimedia's bugs. How can I find Wikimedia's bugs? Can
 somebody please help me?
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Bringing OpenID as a provider to Wikimedia projects

2013-02-22 Thread Ryan Lane
I believe the OpenID extension has matured to the point where it's usable on
the Wikimedia projects, acting as an OpenID provider. The extension still
needs review and such, but I think it's a good time to discuss how we'd
like to implement this on the projects.

My preference for this would be to have a centralized wiki for identity
urls. The identity urls would be based on user pages. I'm proposing this
for a few reasons:

1. It's easier to deal with identity urls in a centralized location, and it
allows us to avoid including the OpenID extension on every wiki
2. We could very strictly limit our vulnerability surface on this wiki by
only including what's necessary
3. At a later point we could decide to limit all authentication to this
location, pointing login links from all projects/wikis here
4. At a later point we could decide to also use this as a global profile
location

I'd prefer if we avoid the bikeshedding of the domain name in this
discussion, if we are all in agreement over the use of a centralized wiki.

Thoughts?

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bringing OpenID as a provider to Wikimedia projects

2013-02-22 Thread Ryan Lane
On Fri, Feb 22, 2013 at 12:40 PM, Greg Grossmeier g...@wikimedia.org wrote:

 quote name=maiki date=2013-02-22 time=12:17:25 -0800
  Is this up for discussion, or are we at the point of planning
  deployment? It isn't apparent to me why any WMF site would be an OpenID
  provider.

 To phrase this differently:

 Do you more prefer that WMF sites consume OpenIDs instead of (or in
 addition to) providing them?


This isn't really a matter of having one or the other. As Marc has
mentioned, we need some non-hacky form of authentication for bots, tools,
out-of-cluster applications, and non-mediawiki applications.

OpenID as a consumer is a more difficult task for a number of reasons. I
like to tackle problems one at a time and making a provider is an easy
first step.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bringing OpenID as a provider to Wikimedia projects

2013-02-22 Thread Ryan Lane
On Fri, Feb 22, 2013 at 1:03 PM, James Forrester
jforres...@wikimedia.org wrote:

 On 22 February 2013 12:54, Yuri Astrakhan yuriastrak...@gmail.com wrote:

  Do you intend to cover both SUL and legacy accounts?
 

 I don't think it's worth anyone's time working out a way of supporting
 non-global accounts, given the ongoing work to fix these as part of SUL
 finalisation, which will hopefully be finished soon. See
 https://www.mediawiki.org/wiki/Admin_tools_development#Roadmap for wider
 work in this area.


Totally agreed. I think it would be a waste of time to support non-global
accounts.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bringing OpenID as a provider to Wikimedia projects

2013-02-22 Thread Ryan Lane
On Fri, Feb 22, 2013 at 2:03 PM, Jay Ashworth j...@baylink.com wrote:

 - Original Message -
  From: Ryan Lane rlan...@gmail.com

  I believe the OpenID extension has matured to the point where it's usable on
  the Wikimedia projects, acting as an OpenID provider. The extension still
  needs review and such, but I think it's a good time to discuss how we'd
  like to implement this on the projects.

 I, too, want to clarify: you're proposing centralizing WMF identity to
 whatever extent it is not already centralized, and then using OpenID
 *within MWF*: so that all WMF sites and installed extensions can auth
 users against our own user database?

 Not authenticating users against external OID providers (which, as nearly
 as I can tell, largely amount to I am whom I say I am), or allowing
 external non-WMF sites to authenticate against our user database.


Any OpenID consumer, whether WMF or not, would be able to use us as an
authentication provider.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bringing OpenID as a provider to Wikimedia projects

2013-02-22 Thread Ryan Lane
On Fri, Feb 22, 2013 at 2:30 PM, Thomas Gries m...@tgries.de wrote:

 Ryan wrote:


  Any OpenID consumer, whether WMF or not, would be able to use us as an
  authentication provider.
 There is currently no option, but an option (to restrict serving OpenIDs
 to certain consumer domains, e.g. only to our domain) could be implemented.


I see no reason in doing so. If third parties want to allow Wikimedia as a
provider, I don't see why we'd object.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bringing OpenID as a provider to Wikimedia projects

2013-02-22 Thread Ryan Lane
On Fri, Feb 22, 2013 at 3:19 PM, Tyler Romeo tylerro...@gmail.com wrote:

 To be absolutely clear, this does *not* solve the problem of bots/tools
 authenticating on behalf of a user. All it does is solve the problem of
 where a bot/tool authenticates under its own user account and, out of pure
 courtesy for the community, asks users to prove their identity before
 allowing them to use the bot/tool. For bots/tools that actually perform
 edits as the user, OpenID would be useless.


You're confusing use cases. What you're talking about is the use case for OAuth.
This thread isn't about OAuth. I believe we have plans to add OAuth next
quarter, but if you wish to continue discussing it, please make a new
thread.

In cases where a tool is keeping an authentication database, and is not
acting on behalf of a user, then OpenID would let the tool eliminate its
username/password store.


 Also, I think Wikipedia acting as an OpenID consumer would be bounds more
 useful than acting as a provider. That's not to say that having both
 wouldn't be a good idea, but the consumer side of it should definitely be a
 priority. Think of sites now like StackOverflow, where creating an account
 is as simple as pressing a few Accept buttons.


Sure, it would be great, but allowing authentication as a consumer is a
much more difficult step, and we're not ready to take it right now. OpenID
as a provider solves some long-standing problems and is a step in the right
direction, let's focus on one thing at a time.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bringing OpenID as a provider to Wikimedia projects

2013-02-22 Thread Ryan Lane
On Fri, Feb 22, 2013 at 4:07 PM, Tyler Romeo tylerro...@gmail.com wrote:

 
  In cases where a tool is keeping an authentication database, and is not
  acting on behalf of a user, then OpenID would let the tool eliminate its
  username/password store.


 This is exactly what I'm saying. It doesn't do this. If a tool has a
 username/password store, i.e., it uses the username and password of each
 user, enabling OpenID wouldn't solve the authentication problem. Like I
 said, it only works in cases where the bot does all of its work under its
 own account.


Let's consider bugzilla.wikimedia.org, for instance. It has its own
credentials store. With OpenID as a provider on the projects, it could be
possible to use your Wikimedia credentials rather than a username/password
specific to bugzilla.

In this situation bugzilla isn't acting on behalf of a user to interact
with another application. An application acting on behalf of a user with
another application is what OAuth does, not OpenID, and this thread isn't
about that.


  Sure, it would be great, but allowing authentication as a consumer is a

 much more difficult step, and we're not ready to take it right now. OpenID
  as a provider solves some long-standing problems and is a step in the
 right
  direction, let's focus on one thing at a time.


 How exactly is it so difficult? You just set the configuration option for
 the extension.


Feel free to bring this question up in another thread. Please search
through the archives before doing so, though. I've answered this question
numerous times over the past 2-3 years.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] Merging wikitech and labsconsole on Thursday at 1 PM PDT

2013-02-26 Thread Ryan Lane
On Tue, Feb 26, 2013 at 2:43 PM, MZMcBride z...@mzmcbride.com wrote:

 Ryan Lane wrote:
  At 1 PM I'll do the following:
 
  1. Mark wikitech as read-only
  2. Add wikitech-old.wikimedia.org, pointing at the old wiki
  3. Change wikitech.wikimedia.org to labsconsole's IP
  4. Import all of the content to labsconsole
  5. Change labsconsole.wikimedia.org to wikitech.wikimedia.org
 
 [...]

 labsconsole.wikimedia.org and wikitech.wikimedia.org have different
 article view paths:

 * https://labsconsole.wikimedia.org/wiki/Main_Page
 * http://wikitech.wikimedia.org/view/Main_Page

 As long as the wikitech.wikimedia.org/view/ URL forms continue to work, I
 don't see any issue here. I suppose there are a few different ways to
 support this.



I have a rewrite rule on the new server for this, that I pushed in
yesterday:

https://gerrit.wikimedia.org/r/#/c/50856/
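
For reference, the rule is roughly of this shape (a sketch, not the
literal change; see the Gerrit link above for the real thing):

  # Redirect old wikitech /view/ article paths to the /wiki/ path
  RewriteEngine On
  RewriteRule ^/view/(.*)$ https://wikitech.wikimedia.org/wiki/$1 [R=301,L]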

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Labs-l] Merging wikitech and labsconsole on Thursday at 1 PM PDT

2013-02-26 Thread Ryan Lane
On Tue, Feb 26, 2013 at 2:46 PM, Petr Bena benap...@gmail.com wrote:

 the labsconsole will remain operational? or it will redirect?


labsconsole will redirect.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Nagios is dead, long live icinga!

2013-02-26 Thread Ryan Lane
On Tue, Feb 26, 2013 at 5:32 PM, Leslie Carr lc...@wikimedia.org wrote:

 As some may have noticed, we are phasing out nagios in favor of icinga
 ( https://www.icinga.org/ )

 nagios.wikimedia.org now redirects to icinga.wikimedia.org ! Please
 let us know if you notice anything that has broken or is inconsistent.


Awesome work Leslie!

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] Merging wikitech and labsconsole on Thursday at 1 PM PDT

2013-02-27 Thread Ryan Lane
On Wed, Feb 27, 2013 at 6:25 AM, Tyler Romeo tylerro...@gmail.com wrote:

 Is using rewrite a good idea, or would it be better to just redirect so
 that there's only one actual URI? (I don't have an answer to that, just
 asking.)


Rewrite is doing a redirect, not pass-through.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] Merging wikitech and labsconsole on Thursday at 1 PM PDT

2013-02-27 Thread Ryan Lane
On Wed, Feb 27, 2013 at 5:20 AM, Federico Leva (Nemo) nemow...@gmail.com wrote:

 Does all of the content include archive (deleted pages)?


Pages that are already deleted will not be migrated. All other pages will
be migrated, including ones that were marked as to be deleted.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] wikitech migration still ongoing

2013-03-01 Thread Ryan Lane
Seems my previous updates on this may not have made it to this list, so
here's another update:

The wikitech dump was rather large and had a lot of revisions (60k
revisions), even though it has a fairly small number of pages (1,600
pages). The dump was about 6.1GB. The vast majority of these revisions are
from the Server Admin log.

Importing this is taking quite a bit longer than anticipated. Until the
import is complete, the wikitech documentation is still available at
http://wikitech-old.wikimedia.org. When the import finishes, I'll be
re-logging any !log messages from the channel that were missed.

Until this happens, the !log messages can be found in the IRC channel logs:

http://bots.wmflabs.org/~wm-bot/logs/%23wikimedia-operations/

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] wikitech migration still ongoing

2013-03-01 Thread Ryan Lane
Migration is done and morebots is logging again. I still need to re-log
past actions, but I'll do that soon. Let me know if you see any migration
issues.


On Fri, Mar 1, 2013 at 12:08 AM, Ryan Lane rlan...@gmail.com wrote:

 Seems my previous updates on this may not have made it to this list, so
 here's another update:

 The wikitech dump was rather large and had a lot of revisions (60k
 revisions), even though it has a fairly small number of pages (1,600
 pages). The dump was about 6.1GB. The vast majority of these revisions are
 from the Server Admin log.

 Importing this is taking quite a bit longer than anticipated. Until the
 import is complete, the wikitech documentation is still available at
 http://wikitech-old.wikimedia.org. When the import finishes, I'll be
 re-logging any !log messages from the channel that were missed.

 Until this happens, the !log messages can be found in the IRC channel logs:

 http://bots.wmflabs.org/~wm-bot/logs/%23wikimedia-operations/

 - Ryan

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] OpenID as a provider project page

2013-03-01 Thread Ryan Lane
 I wrote up some quick documentation on OpenID as a provider. Feel free to
modify it, especially for inaccurately used terminology. It's also likely a
good time to start bikeshed discussions on the urls, as I think it'll end
up taking a while to lock that down.

https://www.mediawiki.org/wiki/OpenID_Provider

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Editing wikipedia using google, openID or facebook

2013-03-07 Thread Ryan Lane
On Thu, Mar 7, 2013 at 12:10 PM, Tyler Romeo tylerro...@gmail.com wrote:

 On Thu, Mar 7, 2013 at 3:05 PM, Antoine Musso hashar+...@free.fr wrote:

  We still have to figure out which account will be used, the URL, whether
  we want a dedicated wiki etc...
 

 Those discussions are unrelated to using OpenID as a client, though.


As I've mentioned before. I'm the one championing OpenID support on the
sites and I have no current plans on enabling OpenID as a consumer. Making
authentication changes is difficult. We're focusing on OpenID as a provider
and OAuth support right now, and that's way more than enough to try to do
this quarter.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Extension:OpenID 3.00 - Security Release

2013-03-07 Thread Ryan Lane
Marc-Andre Pelletier discovered a vulnerability in the MediaWiki OpenID
extension for the case where MediaWiki is used as a “provider” and the wiki
allows renaming of users.

All previous versions of the OpenID extension used user-page URLs as
identity URLs. On wikis that use the OpenID extension as “provider” and
allow user renames, an attacker with rename privileges could rename a user
and then create an account with the same name as the victim. This
would have allowed the attacker to steal the victim’s OpenID identity.

Version 3.00 fixes the vulnerability by using Special:OpenIDIdentifier/id
as the user’s identity URL, id being the immutable MediaWiki-internal
userid of the user. The user’s old identity URL, based on the user’s
user-page URL, will no longer be valid.

The user’s user page can still be used as OpenID identity URL, but will
delegate to the special page.
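
Concretely, delegation means the user page serves link tags pointing at the
special page, roughly like this (a sketch with a placeholder domain and
userid; the provider endpoint name is an assumption):

  <link rel="openid2.provider"
        href="https://example.org/wiki/Special:OpenIDServer" />
  <link rel="openid2.local_id"
        href="https://example.org/wiki/Special:OpenIDIdentifier/12345" />

Consumers handed the user-page URL therefore still end up verifying against
the immutable userid.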

This is a breaking change, as it changes all user identity URLs. Providers
are urged to upgrade and notify users, or to disable user renaming.

Respectfully,

Ryan Lane

https://gerrit.wikimedia.org/r/#/c/52722
Commit: f4abe8649c6c37074b5091748d9e2d6e9ed452f2
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Extension:OpenID 3.00 - Security Release

2013-03-08 Thread Ryan Lane
On Fri, Mar 8, 2013 at 1:07 AM, Yuvi Panda yuvipa...@gmail.com wrote:

 Was this the last blocker to getting the extension deployed?


On wikitech the blockers were the switch of the wiki name (from labsconsole
to wikitech) and this. There are still some issues that need to be worked out
for deployment on the main projects. Also, it needs a full review before
deployment to the projects, and we need to work out how this will affect
the OAuth plans. We have a kickoff meeting for this coming up soon. I'll
send updates when that occurs.

For deployment on wikitech I think I'd like to wait for a full security
review, so it may be a little while.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-09 Thread Ryan Lane
On Sat, Mar 9, 2013 at 12:15 PM, Yuri Astrakhan yuriastrak...@gmail.com wrote:

 Should we re-start the lets migrate to github discussion?


No.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-10 Thread Ryan Lane
On Sun, Mar 10, 2013 at 12:55 AM, Waldir Pimenta wal...@email.com wrote:

 On Sat, Mar 9, 2013 at 8:39 PM, Ryan Lane rlan...@gmail.com wrote:

  On Sat, Mar 9, 2013 at 12:15 PM, Yuri Astrakhan yuriastrak...@gmail.com
  wrote:
 
   Should we re-start the lets migrate to github discussion?
  
  
  No.


 To be fair, though I understand the arguments against using github (though
 accepting pull requests from it is a must!), I always had the impression
 that other options weren't given enough attention during that discussion.
 Particularly GitLab looked very usable, could be used for self-hosting,
 etc., and never even got much content in the case against section (only
 two items, both currently marked as no longer true). I am not entirely
 sure why it was discarded so promptly in favor of the admittedly powerful,
  but clearly problematic/controversial Gerrit. Are there strong reasons to
  dismiss it that weren't stated on that page?


Let's please not start the giant flamewar of solutions again. We went
through a fairly long and fairly difficult process just to continue using
Gerrit. If you feel strongly about this, I'll create a project in Labs, and
you can prove that it's worth switching to. Remember to use the evaluation
process we already went through 
http://www.mediawiki.org/wiki/Git/Gerrit_evaluation#Criteria_by_which_to_judge_a_code_review_tool
and also consider there's probably a giant list of things that we've added
into Gerrit since then.

Remember that if something uses about 100 Ruby gems, ops will want to
skin you alive, even if it's a superior solution.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] New unified SSL certificate deployed

2013-03-12 Thread Ryan Lane
We just finished deploying a new SSL certificate to the sites. Now all *.m
and *. certificates are included in a single certificate, except
mediawiki.org. Unfortunately we somehow forgot mediawiki.org when we
ordered the updated cert. We'll be replacing this soon with another cert
that has mediawiki.org included.

This should fix any certificate errors that folks have been seeing on
non-wikipedia m. domains.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-12 Thread Ryan Lane
On Tue, Mar 12, 2013 at 4:47 PM, Brion Vibber br...@pobox.com wrote:

 Thanks guys!

 -- brion


Don't thank us too quickly. We needed to revert this for mobile. Seems
*.m.wikipedia.org was also missing from the cert. Needless to say I'll be
writing a script that can be run against a cert to ensure it's not missing
anything. We'll also be adding monitoring to check for invalid certificates
for any top level domain.
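
Until that script exists, the manual version of the check is a one-liner
per domain, something along these lines:

  echo | openssl s_client -connect mediawiki.org:443 2>/dev/null \
    | openssl x509 -noout -text | grep -A1 'Subject Alternative Name'

This dumps the SAN list the server actually presents, so a missing name is
obvious before users ever see a warning.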

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-12 Thread Ryan Lane
You mean: https://arbcom.en.wikipedia.org ?

Our certificates have never covered that. That's a sub-sub domain, and our
certs only cover single subdomains. We really need to rename all of our
sub-sub domains to single subdomains for them to be covered (or we need to
include every sub-subdomain in the unified cert, but that's going to bloat
it).


On Tue, Mar 12, 2013 at 9:18 PM, Risker risker...@gmail.com wrote:

 On 12 March 2013 21:15, Ryan Lane rlan...@gmail.com wrote:

  On Tue, Mar 12, 2013 at 4:47 PM, Brion Vibber br...@pobox.com wrote:
 
   Thanks guys!
  
   -- brion
  
  
  Don't thank us too quick. We needed to revert this for mobile. Seems *.
  m.wikipedia.org was also missing from the cert. Needless to say I'll be
  writing a script that can be run against a cert to ensure it's not
 missing
  anything. We'll also be adding monitoring to check for invalid
 certificates
  for any top level domain.
 
 
 I think it might also be missing some of the small/private wikis.  I got
 bad certificate messages for the English Wikipedia Arbcom wiki tonight.

 But thanks for working on this.

 Risker/Anne
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-12 Thread Ryan Lane
On Tue, Mar 12, 2013 at 9:45 PM, Risker risker...@gmail.com wrote:

 Yes, that's the wiki I mean.  And I can see your point about all those
 sub-subdomains; there must be a stack of them.  The domain name was changed
 fairly recently and we got the bad cert messages then, and added our
 exceptions. Tonight we got the messages again.  Perhaps it was because the
 subdomain's cert changed.


Ah. Yes. You're likely to get another round of messages when we get the new
cert in as well. All of the arbcom wikis and other sub-subdomain wikis are
in the same cluster, use the same IP addresses, and as such use the same
certificates. I wonder if we have a bug open about renaming sub-subdomain
wikis...

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-12 Thread Ryan Lane
On Tue, Mar 12, 2013 at 10:06 PM, Risker risker...@gmail.com wrote:


 Well, if you need to rename the wiki again, a bit of notice would be
 appreciated; it was rather a shock when folks went to log in using
 bookmarks, only to find they no longer worked. We still haven't finished
 cleaning up links. :-)


Hm. Kind of annoying that they were renamed, but weren't renamed to
something that would solve the certificate issues. :(

If we rename them to solve the certificate issues, I'll make sure the
community is notified well in advance.


 But as to security certificates, I do want to thank the team - this does
 make a difference for a lot of users.


Great. Glad to hear it!

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New unified SSL certificate (round two) to be deployed this afternoon

2013-03-13 Thread Ryan Lane
This is finished.


On Wed, Mar 13, 2013 at 11:43 AM, Rob Halsell rhals...@wikimedia.org wrote:

 We will be pushing the new unified certificate this afternoon @ 2PM PDT.
 This new certificate will include all our primary top level domains, as
 well as the mobile subdomains.

 Unfortunately, those projects that have a sub.subdomain (example:
 arbcom.de.wikimedia.org) will have to redo their redirection rules and the
 like, as this will break them again.  (As those users saw yesterday)

 The serial/fingerprint of the new certificate is:

 07:24:ee:a9:7c:55:f2:57:5e:28:8b:a4:cc:f2:0e:8e

 A quick grep through the pdns files shows me the following projects will be
 affected by this change (and have to redo their redirections/whatnot):

 arbcom.de.wikipedia.org
 arbcom.en.wikipedia.org
 arbcom.fi.wikipedia.org
 arbcom.nl.wikipedia.org
 noboard.chapters.wikimedia.org

 This is all the ones that I found, but it may not be all inclusive.

 This change will fix certificate issues that have been around for awhile
 now, as listed in:

 https://bugzilla.wikimedia.org/show_bug.cgi?id=34788

 If there are any questions or concerns, feel free to reply back to this
 thread.

 Thanks,


 --
 Rob Halsell
 Operations Engineer
 Wikimedia Foundation, Inc.
 E-Mail: rhals...@wikimedia.org
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-13 Thread Ryan Lane
On Wed, Mar 13, 2013 at 8:12 PM, Jay Ashworth j...@baylink.com wrote:

 - Original Message -
  From: Ryan Lane rlan...@gmail.com

  We just finished deploying a new SSL certificate to the sites. Now all
  *.m and *. certificates are included in a single certificate, except
  mediawiki.org. Unfortunately we somehow forgot mediawiki.org when we
  ordered the updated cert. We'll be replacing this soon with another
  cert that has mediawiki.org included.
 
  This should fix any certificate errors that folks have been seeing on
  non-wikipedia m. domains.

 Hey, Ryan; did you see, perhaps on outages-discussion, the after action
 report from Microsoft about how their Azure SSL cert expiration screwup
 happened?


What's the relevance here?

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Fwd: New unified SSL certificate deployed

2013-03-13 Thread Ryan Lane
On Wed, Mar 13, 2013 at 9:24 PM, Jay Ashworth j...@baylink.com wrote:

 - Original Message -
  From: Ryan Lane rlan...@gmail.com

   Hey, Ryan; did you see, perhaps on outages-discussion, the after action
   report from Microsoft about how their Azure SSL cert expiration screwup
   happened?

  What's the relevance here?

 Does ops have a procedure for avoiding unexpected SSL cert expirations,
 and does this affect it in any way other than making it easier to
  implement?
 I would think...


We didn't have a certificate expiration. We replaced all individual
certificates, delivered by different top level domains, with a single
unified certificate. This change was to fix certificate errors being shown
on all non-wikipedia domains for HTTPS mobile users, who were being
delivered the *.wikipedia.org certificate for all domains.

The unified certificate was missing 6 Subject Alternative Names:
mediawiki.org, *.mediawiki.org, m.mediawiki.org, *.m.mediawiki.org,
m.wikipedia.org and *.m.wikipedia.org. Shortly after deploying the
certificate we noticed it was bad and reverted the affected services (
mediawiki.org and mobile) back to their individual certificates. The change
only affected a small portion of users for a short period of time.

If you notice, I've already mentioned how we'll avoid and more quickly
detect problems like this in the future:

Needless to say I'll be writing a script that can be run against a cert to
ensure it's not missing anything. We'll also be adding monitoring to check
for invalid certificates for any top level domain.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-15 Thread Ryan Lane
On Thu, Mar 14, 2013 at 10:02 PM, Quim Gil q...@wikimedia.org wrote:

 This thread has probably reached that limit of fresh arguments. Can we
 continue with proper documentation, reporting and patches?

 We care about GitHub and we believe it is an important source of potential
 contributors. This is why we are mirroring our repos there, and this is why
 we are working on

 Bug 35497 - Implement a way to bring GitHub pull requests into Gerrit
 https://bugzilla.wikimedia.org/show_bug.cgi?id=35497

 If you want to improve the path from GitHub to Gerrit please contribute to
 that bug report or e.g. in a landing page (at mediawiki.org or github.com)
 to help GitHub users interested in our projects.


 About the Gerrit UI, if you have specific complaints or enhancement
 requests please file them. We have been upstreaming reports and also
 patches. We are surely not the only Gerrit users thinking that GitHub does
 some things better so there is hope. But saying it's ugly doesn't help.



 On 03/09/2013 01:06 PM, Tyler Romeo wrote:

 - Registration - The only thing that is probably a problem. We need to
 make Gerrit registration easier.


 What are the bottlenecks nowadays? If there are still any please file bugs.


Yes, please do. Notice that I'm tracking account creation improvement here,
as well:


http://www.mediawiki.org/wiki/Wikimedia_Labs/Account_creation_improvement_project


- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-26 Thread Ryan Lane
On Tue, Mar 26, 2013 at 3:58 PM, Asher Feldman afeld...@wikimedia.org wrote:

 There are all good points, and we certainly do need better tooling for
 individual developers.

 There are a lot of things a developer can do on just a laptop in terms of
 profiling code, that if done consistently, could go a long way, even
 without it looking anything like production.  Things like understanding if
 algorithms or queries are O(n) or O(2^n), etc. and thinking about the
 potential size of the relevant production data set might  be more useful at
 that stage than raw numbers.  When it comes to gathering numbers in such an
 environment, it would be helpful if either the mediawiki profiler could
 gain an easy visualization interface appropriate for such environments, or
 if we standardized around something like xdebug.

 The beta cluster has some potential as a performance test bed if only it
 could gain a guarantee that the compute nodes it runs on aren't
 oversubscribed or that the beta virts were otherwise consistently
 resourced.  By running a set of performance benchmarks against beta and
 production, we may be able to gain insight on how new features are likely
 to perform.


This is possible in newer versions of OpenStack, using scheduler hinting.
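
For the curious, the hints ride along at instance boot time. A sketch,
assuming the DifferentHostFilter is enabled in the scheduler and using
placeholder image/flavor names and UUID:

  # Ask the scheduler to place the new instance on a different
  # compute host than an existing instance
  nova boot --image ubuntu-12.04 --flavor m1.small \
    --hint different_host=EXISTING_INSTANCE_UUID perf-test-1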

That said, the instances would still be sharing a host with each other,
which can cause some inconsistencies. We'd likely not want to use beta
itself, but something that has limited access for performance testing
purposes only, as we wouldn't want other unrelated testing load to skew
results. Additionally, we'd want to make sure to avoid things like
/data/project or /home (both of which beta is using), even once we've moved
to a more stable shared storage solution, as it could very heavily skew
results based on load from other projects.

We need to upgrade to the Folsom release or greater for a few other
features anyway, and enabling scheduler hinting is pretty simple. I'd say
let's consider adding something like this to the Labs infrastructure
roadmap once the upgrade happens and we've tested out the hinting feature.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal: Wikitech contributors

2013-04-03 Thread Ryan Lane
On Wed, Apr 3, 2013 at 3:30 AM, Ori Livneh o...@wikimedia.org wrote:



 On Tuesday, April 2, 2013 at 5:45 AM, Quim Gil wrote:

  I have been drafting a proposal to attract new contributors, help them
  settle in, and connect them to interesting tasks. It turns out that many
  of these problems are not unique to new contributors. We suffer them as
  well and we are just used to them.
 
  The proposal has evolved into a deeper restructuring of our community
  spaces. We're still drafting it, but a round of wider feedback is
  welcome before opening the official RFC at
 
 
 https://www.mediawiki.org/wiki/Requests_for_comment/Wikitech_contributors
 
  In summary:
 
  * wikitech.wikimedia.org would become the one and only site for our open
  source software contributors, powered by semantic software and an
  ontology of categories shared across wiki pages, Bugzilla and hopefully
  Gerrit.

 That seems wrong. Of the two, MediaWiki.org is clearly the more successful
 wiki. It is larger by all measures, and draws a wide pool of active
 contributors. This is reflected in the quality of the documentation, which
 (pace self-deprecating humor) tends to be quite good. Up until recently a
 good portion of the links on Wikitech's main page pointed to content that
 was flagged as obsolete. It has come some way since then, but not quite
 enough to subsume mediawikiwiki.


mediawiki.org will still exist to document MediaWiki. The domain name
itself makes it a poor fit for hosting our non-MediaWiki software
documentation.

Wikitech, till recently, was a fishbowl wiki that required an
operations-team member to create an account for anyone wanting to edit.
Since being merged with labsconsole it has open registration and its
authentication is integrated with a number of our developer and operations
tools.

All contributors must become at least partially familiar with wikitech
anyway, since it's the registration location for commit access. That, along
with its more generic name, better suits it for hosting our documentation
for non-MediaWiki products.


 The core of MediaWiki is in my mind still radical and exciting: you make
 or find a page, click edit, and just type into it. This seems to have
 gotten buried over the years under a pile of bad ideas and bad
 implementations, and skewered from the outside by
 MegaTronEditKillerLaserBot2000 and the like. The solution is not to pile
 additional layers on top, but to excavate MediaWiki from underneath them
 and make it fast as hell and simple, so that it once again feels like a
 dangerous, underspecified, liberating idea. Semantic this-and-that and
 ontologies of categories entails putting up rails everywhere and signs with
 arrows on them indicating which way you should go. That approach will
 inevitably end up reflecting a narrow, inflexible view of what a wiki is
 and what you do with it. And all the while MediaWiki will creak and groan
 from underneath.


The biggest freedom a wiki provides is the ability to create your own
structures and to change and modify those structures quickly and easily.
Adding semantics gives you more freedom to create more interesting
structures, especially if this is combined with proper templating.

One example of how semantics could improve mediawiki.org is the extension
matrix https://www.mediawiki.org/wiki/Extension_Matrix. That list is
maintained by a bot. Its listing is completely inflexible. The bot is
required because MediaWiki has no mechanism for handling this. Adding
semantics to the extension pages would make the matrix a simple query. We
could also add some statistics about extensions. It wouldn't require
anything from the editor's side as the semantics are hidden away via
templates. Editing extension pages could also be considerably easier, since
the extension page could have a proper form for the templates. We will
hopefully be able to use wikidata for things like this in the future.
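
To make that concrete, the bot-maintained table could collapse into an
inline query along these lines (the property names are invented for
illustration; the real ones would come from whatever the templates
annotate):

  {{#ask: [[Category:Extensions]] [[Has status::stable]]
   |?Has author
   |?Has latest version
   |format=table
  }}

Editors would never touch the query; they'd just fill in the template on
each extension page.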

Spend some time editing a well-designed, semantically enabled wiki. Web
Platform is a good example: http://docs.webplatform.org/wiki/Main_Page.
There's a high degree of structure there. That wiki is way above average
quality from the point of view of a reader specifically because it enables
the editor to easily make the content consistent.

I think the level of structure on webplatform doesn't well suit something
like wikitech or mediawiki.org, but adding some extra structure or at least
adding semantics to existing structure will be very useful.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal: Wikitech contributors

2013-04-03 Thread Ryan Lane
On Apr 3, 2013, at 8:29 AM, Antoine Musso hashar+...@free.fr wrote:

 On 03/04/13 11:18, Ryan Lane wrote:
 One example of how semantics could improve mediawiki.org is the extension
 matrix https://www.mediawiki.org/wiki/Extension_Matrix.

 Can't we get Semantic extensions deployed on mediawiki.org ?


I'd rather we wait for wikidata. I'd eventually like to replace the
SMW use on wikitech with wikidata as well.

- Ryan

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal: Wikitech contributors

2013-04-03 Thread Ryan Lane
On Wed, Apr 3, 2013 at 2:11 PM, Ori Livneh o...@wikimedia.org wrote:



 On Wednesday, April 3, 2013 at 2:18 AM, Ryan Lane wrote:

  Spend some time editing a well designed Semantically enabled wiki. Web
  Platform is a good example: http://docs.webplatform.org/wiki/Main_Page
 .
  There's a high degree of structure there. That wiki is way above average
  quality from the point of view of a reader specifically because it
 enables
  the editor to easily make the content consistent.

 OK, given your experience with Web Platform, I'd like to get your
 assessment of what sort of engineering and design effort was required. What
 you are proposing is considerably more ambitious in scope (Web Platform
 doesn't integrate with bug management and SCM), but some napkin cost
 analysis could be very useful. Web Platform has very good usability and
 design. Even if the basic blocks for a semantically-enabled wikitech are
 there, there is still a large additional investment of designer time and
 effort that would be needed to make it usable and well-integrated. But how
 much? (Input from designers would be useful, too.)


This isn't technically my proposal. It's Quim's, so he can likely better
answer this.

For webplatform the semantic design was implemented by a couple of engineers
in about 3-6 months.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal: Wikitech contributors

2013-04-03 Thread Ryan Lane
On Wed, Apr 3, 2013 at 5:01 PM, Steven Walling steven.wall...@gmail.comwrote:

 Quim, I think even this first iteration is problematic on a bunch of
 fronts. 3 months as a first iteration to build several major features as
 the basic proof of concept should be a sign that you're biting off too much
 in terms of scope.


I think this is somewhat exaggerated. Almost all of the things proposed can
likely be done by defining a set of semantic properties, modifying existing
templates, and then adding queries to the templates we're already using on
other pages. Defining forms is also relatively simple for all of this. I
doubt much or any of this will require any development work.

If we hire someone that already has a lot of SMW experience, this is likely
a pretty easy target.


 I also think it's deeply problematic that you don't seem to have shaped the
 proposal based on the expressed needs of people who have tried to use the
 current system and failed, and that you're seemingly ignoring the use case
 of all the many different kinds of contributors by focusing a comprehensive
 restructure solely for new contributors. When we make something like Echo,
 we're doing it first and foremost to attract new people, but we can't get
 away with ignoring the needs of existing users.


We have a current system?


 In general, I don't think you've fully considered how the current set up
 might serve our needs with less heavy-handed changes than migrating to
 Semantic MediaWiki, and I'm wary of supporting a restructuring of
 documentation systems I depend of every day based on a grand plan of any
 kind.


Almost all of the changes Quim is suggesting will likely be completely
transparent to you and your normal processes. Semantic annotations are
almost always added to templates, and users have no clue that magic is
happening behind the scenes.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal: Wikitech contributors

2013-04-03 Thread Ryan Lane
On Wed, Apr 3, 2013 at 5:19 PM, Isarra Yos zhoris...@gmail.com wrote:

 On 03/04/13 20:48, Matthew Flaschen wrote:

 On 04/03/2013 03:30 AM, Ori Livneh wrote:

 That seems wrong. Of the two, MediaWiki.org is clearly the more
 successful wiki. It is larger by all measures, and draws a wide pool
 of active contributors.

 I don't know that it's appropriate to put WMF-only stuff on the
 MediaWiki site.  Of course, I'm not totally convinced merging the other
 way is a good idea other.  Although there's overlap between MW proper
 and the WMF tech, we have to remember they're distinct.


 I know some of the WMF-specific stuff would still be useful to others trying
  to do something similar - things like server setup, installation issues,
  configuration, good practices and whatnot come to mind. Having examples
  of what others have done - mostly Wikimedia and ShoutWiki in my case -
  proved invaluable when I was setting up my own wikifamily-like thing, at
  least, but they weren't exactly easy to find.


Wikimedia's configuration isn't specified on MediaWiki.org. It's on
wikitech already. The docs you are referring to are non-wikimedia
configuration examples provided by MediaWiki users, and that stuff will
stay there.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal: Wikitech contributors

2013-04-03 Thread Ryan Lane
On Wed, Apr 3, 2013 at 5:31 PM, Steven Walling steven.wall...@gmail.com wrote:

 Let me put it in a simpler way: I don't support moving to Semantic MediaWiki,
 which to me as a user seems like a somewhat arcane and bloated piece of
 software that will require me and lots of people to relearn how we write
 documentation and project tracking, unless you can show why the changes you
 want to make are A) necessary B) require SMW to accomplish them.
 Demonstrating that the high level structure proposed will work before
 making the more drastic change seems like a good way to convince everyone
 that what's being proposed is the right path toward a better wiki for all.


Have you tried SMW? Unless you are heavily editing templates you won't need
to relearn anything and neither will anyone else. The system is hidden from
you behind normal MediaWiki templates. In many cases the templates are also
editable via forms, which actually makes it far easier to edit than regular
MediaWiki. If you are a content organizer who modifies templates and likes
to make structures easier for readers and editors, SMW actually makes
it much easier to do things that are otherwise impossible in MediaWiki
without inflexible bots. Writing bots is a lot harder than learning SMW.

Are you also opposed to wikidata? The vast majority of what's being
proposed for wikitech is functionality that will also be available at some
point in wikidata. When that's available we can switch to that system,
rather than SMW.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal: Wikitech contributors

2013-04-03 Thread Ryan Lane
On Wed, Apr 3, 2013 at 6:06 PM, Steven Walling steven.wall...@gmail.com wrote:

 On Wed, Apr 3, 2013 at 2:59 PM, Ryan Lane rlan...@gmail.com wrote:

  If you are a content organizer who modifies templates and likes
  to make structures easier for readers and editors, SMW actually makes
  it much easier to do things that are otherwise impossible in MediaWiki
  without inflexible bots.
 

 Yes, and that sounds like an advantage for people like Quim, whose job is
 to create structures for other people to use. What I'm saying, as someone
 who is living with our current totally imperfect system of documenting
 software and projects on a day-to-day basis, is that I don't see what the
 advantages are, and that a redesign and a migration need to take into
 account everyone's needs more, not just roles like Quim's and the new
 contributors he is admirably working to attract. The best way to approach a
 project like this is not to propose an up-front migration of an entire wiki
 to a new piece of software, just to prototype a few new features.


The proposal is to move non-MediaWiki documentation out of mediawiki.org
into a more generically named wiki. The proposal isn't for migrating all of
mediawiki.org.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal: Wikitech contributors

2013-04-04 Thread Ryan Lane
On Thu, Apr 4, 2013 at 2:15 AM, Jeroen De Dauw jeroended...@gmail.com wrote:

 Hey,

  Out of curiosity, what's the largest wiki to use Semantic MediaWiki?

 That depends on how you define largest. If you want a list of public
 wikis (which of course is not complete) by property value count, then here
 you go: http://wikiapiary.com/wiki/Semantic_statistics

 SMW is used on many hundreds of wikis, including those of governments and
 large corporations. It has been out there for years. People seem to be
 using it very successfully. It also has by far the largest third-party
 community next to core itself. From my perspective a lot of the concerns
 typically coming from people whenever WMF usage of SMW is in question are
 uninformed paranoia. And there seems to be a good deal of populism going on
 as well, since apparently in some WMF-related circles it is cool to hate
 SMW.


I'd say it's well-informed paranoia. SMW is used on some largish wikis at
Wikia and it apparently causes issues very often. The largest wiki with SMW
at Wikia is very small in comparison to a lot of WMF wikis.

SMW has historically had a touch of featuritis. Most of its features
would need to be disabled on WMF wikis due to performance reasons, and then
what's the point?

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Clicktracking being phased out

2013-04-12 Thread Ryan Lane
On Fri, Apr 12, 2013 at 9:46 AM, Yury Katkov katkov.ju...@gmail.com wrote:

 Hi everyone!

 Aren't you being too quick about this phasing out? First of all the
 term is pretty weird - if the current maintainer doesn't want to support
 the extension anymore we use [[Template:Unmaintained_extension]] and
 ideally search for a new maintainer, don't we? The fact that the
 maintainer is WMF doesn't have to change anything.

 The second thing is that Click Tracking works with the current release
 of MediaWiki, which makes it a useful extension for the next year or
 two. Not many people run the development version of MW that is needed
 for Event Logging.


Did anyone other than WMF use clicktracking?

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Labs Shell Access Requests, 39 pending

2013-04-16 Thread Ryan Lane
On Tue, Apr 16, 2013 at 10:20 AM, rogerchris...@gmail.com wrote:

 Could someone with authority please process the 8 days long backlog at
 Labs Shell Access Requests?:


 https://wikitech.wikimedia.org/wiki/Special:Ask/-5B-5BCategory:Shell-20Access-20Requests-5D-5D-5B-5BIs-20Completed::false-5D-5D/-3FShell-20Justification/-3FModification-20date

 39 requests pending.


We have a lot of these coming in as of late (which is awesome!), so we're
not doing a great job keeping up. Anyone want to help out with this
process? We can give out this specific right to anyone willing to help.
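
If it helps anyone, that Special:Ask URL is just an encoded ask query, so
the backlog can also be pulled programmatically through SMW's 'ask' API
module. A rough, untested sketch — assuming the module is enabled on
wikitech and the API lives at the usual /w/api.php:

<?php
// Fetch the pending shell-access requests via SMW's "ask" API module.
// The query below is the same one encoded in the Special:Ask URL above.
$query = '[[Category:Shell Access Requests]][[Is Completed::false]]'
       . '|?Shell Justification|?Modification date';
$url = 'https://wikitech.wikimedia.org/w/api.php?action=ask&format=json'
     . '&query=' . urlencode( $query );

$ch = curl_init( $url );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
curl_setopt( $ch, CURLOPT_USERAGENT, 'shell-request-lister/0.1 (sketch)' );
$data = json_decode( curl_exec( $ch ), true );
curl_close( $ch );

// SMW keys each result on the page title of the request.
$results = isset( $data['query']['results'] ) ? $data['query']['results'] : array();
foreach ( $results as $title => $info ) {
    echo $title, "\n";
}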

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Labs Shell Access Requests, 39 pending

2013-04-16 Thread Ryan Lane
On Tue, Apr 16, 2013 at 12:48 PM, Thehelpfulone thehelpfulonew...@gmail.com
 wrote:

 On 16 April 2013 19:45, Ryan Lane rlan...@gmail.com wrote:

  We have a lot of these coming in as of late (which is awesome!), so we're
  not doing a great job keeping up. Anyone want to help out with this
  process? We can give out this specific right to anyone willing to help.
 

 Do we actually review the individual requests to look for anything in
 particular, or just grant access for all who request it?


The default is to just grant access. The request step is mainly a mechanism
for removing shell access from trolls and for preventing them from simply
creating new accounts to bypass a block. If we have an issue with a troll,
we'll let the shell admins know and we'll take action.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Labs Shell Access Requests, 39 pending

2013-04-16 Thread Ryan Lane
On Tue, Apr 16, 2013 at 2:21 PM, Tyler Romeo tylerro...@gmail.com wrote:

 What exactly is Labs Shell Access? Come to think of it, I really don't have
 much of an idea of what Labs is in the first place. I assumed it was some
 sort of testing environment, but I've never actually been on it.


Labs is a virtualized environment that's meant for testing, development and
semi-production uses such as bots or tools. Shell access gives you the
ability to ssh into instances (specifically the bastion). Without shell
access, it's impossible to ssh into any instances.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Countdown to SSL for all sessions?

2013-04-30 Thread Ryan Lane
On Tue, Apr 30, 2013 at 2:59 PM, Matthew Walker mwal...@wikimedia.org wrote:

 
  We enabled it for about an hour previously (before reverting due to
  the centralauth bug), and the change was barely noticeable in ganglia.


 Do we have numbers on what this did to the number of active editors during
 that time period? Esp. broken down on a per-country basis?

 I think I want to agree with Petr -- we should not be forcing SSL always;
 we should be respecting what the user requested. That way, if a government
 ever mandates that SSL be disallowed, users may still contribute to the
 site. (Remember we block things like Tor, so they can't even proxy around
 it.)

 Perhaps we should just make it really obvious on the login page (e.g. a big
 button to log in via SSL, a small button not to.)


I know it's heretical to base what we do on more popular sites, especially
the great evil that is facebook, but let's assume they know what they are
doing. Visit facebook: it redirects to https even if you specifically ask
for http, and nowhere on the page does it give you an option to switch back
to http. They have twice as many users as we do.
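
For what it's worth, the redirect itself is the easy part. At our scale it
belongs in the frontend caches rather than in PHP, but as a minimal sketch
of the behavior — assuming a setup where PHP sees the TLS state directly in
$_SERVER['HTTPS'], with no SSL-terminating proxy in front:

<?php
// Force every plain-HTTP request over to HTTPS with a permanent
// redirect (e.g. dropped at the very top of a LocalSettings.php).
// Assumes $_SERVER['HTTPS'] reflects the real connection state.
if ( empty( $_SERVER['HTTPS'] ) || $_SERVER['HTTPS'] === 'off' ) {
    $target = 'https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
    header( 'Location: ' . $target, true, 301 );
    exit;
}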

"Do we have numbers?" seems to be the death knell of discussions on our
lists lately. We have examples we can look at to confirm that what we're
doing is sensible.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Ryan Lane
On Sun, May 12, 2013 at 11:26 PM, Moriel Schottlender mor...@gmail.com wrote:

 Hello everyone,

 I'd like to get your opinions and critique on my very first MediaWiki
 extension, which, I hope, will be helpful to other developers.

 I noticed that there's no easy way of seeing whether the extensions we have
 installed on our MediaWiki need updating, and there are even some
 extensions that get so little regular attention that when they do get
 updated, there's no real way of knowing it (unless we check specifically).

 It can be annoying to run into a bug and then discover that one of our
 extensions (probably the one we least expected) actually has an update
 that already fixed it.

 So, I thought I'd try to solve this issue with my extension. Since
 MediaWiki changes are submitted through gerrit, I thought I'd take
 advantage of that and perform a remote check to see whether any new
 commits have appeared in any of the extensions since they were installed.


I like the idea of this, but if this is used widely it's going to kill our
gerrit server. It hits gitweb directly. Gitweb calls are uncached and are
fairly expensive to run against our server. We replicate all repositories
to github, and they allow this kind of thing. Is there any way you can
change this to use the github replicas?
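
Something along these lines, roughly — untested, and assuming the replicas
live under the 'wikimedia' organization on github with the
mediawiki-extensions-<Name> naming scheme (worth double-checking per repo):

<?php
// Ask the github replica (instead of gerrit's gitweb) for the newest
// commit on an extension repository. github's v3 API supports conditional
// requests and only asks that a User-Agent header be sent.
function getLatestReplicaCommit( $extension ) {
    $url = 'https://api.github.com/repos/wikimedia/mediawiki-extensions-'
         . rawurlencode( $extension ) . '/commits?per_page=1';

    $ch = curl_init( $url );
    curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
    curl_setopt( $ch, CURLOPT_USERAGENT, 'ExtensionStatus/0.1 (sketch)' );
    $json = curl_exec( $ch );
    curl_close( $ch );

    $commits = json_decode( $json, true );
    // [0]['sha'] is the tip commit; compare it against the installed hash.
    return is_array( $commits ) && isset( $commits[0]['sha'] )
        ? $commits[0]['sha']
        : null;
}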

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] migrating hooks doc to doxygen?

2013-06-04 Thread Ryan Lane
On Tue, Jun 4, 2013 at 1:43 PM, Chad innocentkil...@gmail.com wrote:

 On Tue, Jun 4, 2013 at 1:40 PM, Brad Jorsch bjor...@wikimedia.org wrote:
  On Tue, Jun 4, 2013 at 12:00 PM, Antoine Musso hashar+...@free.fr
 wrote:
 
  Since we introduced hooks in MediaWiki, the documentation has been
  maintained in a flat file, /docs/hooks.txt. Over the weekend I
  converted the content of that file so that Doxygen recognizes it.
 
  The patchset is:
https://gerrit.wikimedia.org/r/#/c/66128/
 
  The result is pretty. But personally I'll probably continue to just
  look in hooks.txt if I need the info in there, and the markup in the
  (now-misnamed) file is rather ugly. Not that the existing file isn't
  also ugly, just less so.
 

 I'm with Brad. Considering we document this in the tree and on
 mw.org, I'm not entirely sure what the benefit of having it done
 via Doxygen is.


I've never understood why we have some subsection of the documentation
stuck in the tree. It makes no sense. If we want to include docs with the
software, shouldn't we just dump tagged docs from mediawiki.org into the
tree, per release? Right now we have docs in two places to keep up to date,
and neither place is kept very well.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Needless forks with incompatible license changes

2013-06-05 Thread Ryan Lane
On Tue, Jun 4, 2013 at 7:56 PM, Tyler Romeo tylerro...@gmail.com wrote:

 AuthPlugin never used to handle this kind of stuff. The only extensions
 that use AuthPlugin are those that provide *supplemental* authentication
 services. Notice that E:LDAPAuthentication uses AuthPlugin, but
 E:TwoFactorAuthentication does not. AuthPlugin has never handled additional
 authorization logic, and I don't see any reason why it should.


So... I'm just now noticing this fork of OATHAuth, and it's slightly
infuriating. I would have gladly accepted changes. Worse, the fork's
license has been changed to GPLv3 or later, while OATHAuth is GPLv2 or
later, so I can't even take its changes without changing OATHAuth's
license!

Was there any reason for this? Can we avoid forks like this? Please?

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTTPS Questions

2013-06-06 Thread Ryan Lane
On Thu, Jun 6, 2013 at 6:56 PM, Tyler Romeo tylerro...@gmail.com wrote:

 Quick two questions:

 1) Was that bug with E:CentralAuth that prevented us from turning on
 $wgSecureLogin ever fixed? If so, can we attempt another deployment?

 2) I noticed something in the signpost:

 Walsh told the *Signpost* that while moving to an https default is a goal
  the WMF is actively working on, doing so is not trivial—it is a delicate
  process that the WMF plans to enable in graduated steps, from logged-in
  users to testing on smaller wikis before making it the default for
  anonymous users and readers on all projects.


 Is this at all true? Because from what I've been told on bug reports it
 seems like turning on HTTPS would indeed be a trivial step and that the
 operations team has confirmed we can do it at will. I also question the
 definition of "actively working on". ;)


You are misreading. What is being discussed is HTTPS by default for
anonymous readers.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Labs-l] [labs] bots project is turning to testing environment of tools project

2013-06-15 Thread Ryan Lane
On Sat, Jun 15, 2013 at 4:51 AM, Petr Bena benap...@gmail.com wrote:

 Most of you already know this, but I will repeat it once more, so that
 everyone who is using the bots project is aware of it.

 Since tool labs (the tools project on wikimedia labs) now serves the
 purpose the bots project on labs originally served, we decided to
 convert the bots project into a testing environment for the tools
 project (and shelter both wikimedia labs projects under one big virtual
 project called tool labs).

 I know the naming is very confusing, but I hope that once aliasing of
 project names is possible on wikitech we can just rename the bots
 project to tools-testing or something like that.


Renaming projects will take a lot of effort for little gain. If everything
in bots is puppetized, we should just create a new project, launch
everything there, move the data over and kill the old project.

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
