is in Berlin, soon. I'm not sure if they
have this planned. It's apparently GLAM focused (which excludes devs
like me), so I'd imagine not, unless the bugs targeted are GLAM
related.
- Ryan Lane
been meeting them in person,
much more so than the years prior. The hack-a-tons may not be
effective at bringing in new people, but I feel they are very
effective at solidifying the current community that we have.
- Ryan Lane
On Sun, Feb 13, 2011 at 9:23 AM, Maury Markowitz
maury.markow...@gmail.com wrote:
Are there _no_ performance issues we should be concerned about here?
I know local ISPs did (used to?) throttle all encrypted traffic.
Would this fall into that category?
Well, there's nothing we can really do:
The server side is definitely more the problem here. There is a fairly
substantial performance hit for setting up/breaking down the SSL
connections. To support this we'll need a cluster of systems dedicated
to acting as SSL proxies.
- Ryan Lane
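A rough way to see that setup cost from the client side (a minimal sketch, not anything WMF runs; the hostname is just an example):

<?php
// Rough client-side illustration, not WMF code: time a plain TCP connect
// versus a full TLS connect to the same host. The difference is roughly the
// per-connection setup cost the SSL proxies have to absorb server-side.
$host = 'en.wikipedia.org';

$start = microtime( true );
stream_socket_client( "tcp://$host:80", $errno, $errstr, 10 );
$plainMs = ( microtime( true ) - $start ) * 1000;

$start = microtime( true );
stream_socket_client( "ssl://$host:443", $errno, $errstr, 10 );
$tlsMs = ( microtime( true ) - $start ) * 1000;

printf( "TCP connect: %.1f ms, TLS connect: %.1f ms\n", $plainMs, $tlsMs );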
in-depth
discussion with the other ops folks about this last week. I think I'm
going to put it on my goal list; however, we have a lot of higher
priority tasks, so I wouldn't expect anything too soon.
- Ryan Lane
this if this would actually result in a
substantial amount of traffic to secure, but I seriously doubt it
will. I love the EFF, but they have a fairly shallow reach when it
comes to things like this.
- Ryan Lane
All this boils down to: yes, full HTTPS is best practice, but if you make use
of external APIs or services, it may be hard to achieve.
We don't for anything, I believe.
- Ryan Lane
But really, I should have just ignored this thread since it was mostly a
troll.
Wouldn't each page view mean a connection, and an SSL handshake? Or are
you thinking of keep-alives?
No. Once a connection is made, requests from the same user should go
through the same connection.
- Ryan
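A minimal sketch of that reuse from a client's point of view (the URLs are only examples): with a single curl handle, only the first request pays for the TLS handshake.

<?php
// Minimal sketch of connection reuse: requests made through the same curl
// handle share one HTTPS connection, so only the first pays the handshake.
$ch = curl_init();
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
foreach ( array( 'Main_Page', 'Wikipedia:About' ) as $title ) {
    curl_setopt( $ch, CURLOPT_URL, "https://en.wikipedia.org/wiki/$title" );
    curl_exec( $ch ); // later iterations reuse the already-open connection
}
curl_close( $ch );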
that developers can have a
usable working environment, but I don't think it's stable enough to
run on any site where you care about security or your data.
- Ryan Lane
All I know is doing an svn update followed by php update.php is a lot easier
than any other way of updating.
You can do that with the tagged releases too. You don't have to run
trunk for that.
- Ryan
control system of their choosing,
yet they don't. This thread is discussing a perceived problem with a
tool we are already successfully using. Let's focus on one issue at a
time.
- Ryan Lane
has a problem, and it should be addressed. It isn't a
problem with the MediaWiki developer community though, it's a problem
with the toolserver community, and they need to fix it. But again,
let's focus on one issue at a time.
- Ryan Lane
a totally new problem
into the mix won't help.
- Ryan Lane
think we all get this. If we switch to git this will be addressed.
If there's some specific technical concern you have, let us know.
- Ryan Lane
this. Digest authentication protects against replay attacks, but
I don't believe it can protect against man in the middle attacks.
Respectfully,
Ryan Lane
on the channel, I love the
idea of someone working on AJAX login support. It is perfectly usable
by a bunch of third parties, and would be a great addition to the
software.
Respectfully,
Ryan Lane
It's fixed.
On Sun, Mar 27, 2011 at 2:18 PM, Roan Kattouw roan.katt...@gmail.com wrote:
2011/3/27 Schneelocke schneelo...@gmail.com:
Hi everyone,
did the SSL certificate for secure.wikimedia.org expire?
It did, yes. Informing our ops department, they should be able to sort
this out
I've given extensions, core, and deployment access to a new member of
our ops team: Peter Youngmeister. He should be adding his userinfo as
we speak.
- Ryan Lane
I'm doing squid upgrades to add some patches to fix some issues we
were seeing, and to add new log capabilities. I upgraded too many at
once and it's overloading our apache backend. It should stabilize soon.
Sorry!
On Tue, Apr 19, 2011 at 1:20 PM, John phoenixoverr...@gmail.com wrote:
I have been
* https://bugzilla.wikimedia.org/show_bug.cgi?id=20643 -- Serve SSL/HTTPS
sites out of same domain names as HTTP access: https://en.wikipedia.org/
This'll need more work as it has to deal with the offsite proxies, multiple
domains, etc. But it's been on the slate for a long time and we did
I totally <3 that you wrote it in python.
On Mon, May 2, 2011 at 4:21 PM, Russell N. Nelson - rnnelson
rnnel...@clarkson.edu wrote:
Maybe there's a better tool to tell you what function is defined in what
class in PHP, but I couldn't find one in the time it would take me to write
it, so I
I'd also love to know how it keeps getting unsubscribed. This is the
third time...
On Mon, May 2, 2011 at 7:03 PM, K. Peachey p858sn...@gmail.com wrote:
On Tue, May 3, 2011 at 10:05 AM, Happy-melon happy-me...@live.com wrote:
Wow, now here's a blast from the past... :-D A lot of these stats
On Tue, May 3, 2011 at 10:25 AM, Chad innocentkil...@gmail.com wrote:
On Tue, May 3, 2011 at 2:15 PM, MZMcBride z...@mzmcbride.com wrote:
I realize you have a dry wit, but I imagine this joke was lost on nearly
everyone. You're not really suggesting that everyone who wants to parse
MediaWiki
It is much easier to embed it in other languages, once you get a shared object
with Parser methods exposed ;-)
Which would also require the linking application to be GPL licensed,
which is less than ideal. We shouldn't limit the licensing of
applications that want to write wikitext. An
Which of course allows me to fork the thread and ask why MediaWiki has to be
GPL licensed.
I was just talking about this in IRC :). We could re-license the
parser to be LGPL or BSD so that other implementations can use our
parser more freely.
- Ryan
This is how WMF staff treats volunteers:
[21:17:23] <Ryan_Lane> domas: and now I took your BSD idea, and didn't give
you credit
[21:17:38] * Ryan_Lane wins
[21:17:51] <yuvipanda_> FLAWLESS VICTORY
[21:17:55] <yuvipanda_> except for the IRC logs
You are evil Domas. For those interested,
On Tue, May 3, 2011 at 1:33 PM, Trevor Parscal tpars...@wikimedia.org wrote:
I think the idea that we might break the existing PHP parser out into a
library for general use is rather silly.
Well, if that's the case, why was it brought up in the discussion to
begin with? Here's the comment Tim
Personally, I don't see any problem with a parser library being GPL.
You can still link it with proprietary code as long as you don't
distribute the result, so it would be fine for research projects or
similar that rely on proprietary components. You can always *use*
GPLd code however you
The only mention of the word "link" in the GPLv3 terms and conditions,
outside an example, is in the phrase "link or combine" in section 13:
http://www.gnu.org/licenses/gpl.html
GPLv2 doesn't use it at all in the terms and conditions:
http://www.gnu.org/licenses/gpl-2.0.html
Linking has no
Added Asher Feldman (asher) to extensions, core, and deployment. He's
a new member of the Operations team.
- Ryan
We know of the issue and are working it now. We are having issues with
one of our routers.
- Ryan
On Tue, May 10, 2011 at 8:57 AM, John phoenixoverr...@gmail.com wrote:
For about the last 12 hours I have been unable to reliably connect to WMF
wikis. I've tried both Firefox and IE8, both without
We did some network migrations to stabilize things. We'll do
maintenance soon to fix things more permanently.
Let me know if you are still having issues.
- Ryan
On Tue, May 10, 2011 at 11:43 AM, Ryan Lane rlan...@gmail.com wrote:
We know of the issue and are working it now. We are having
To coordinate efforts occurring in Labs, and to be able to
send/receive announcements, I've created a new labs-l list. If you are
a labs member, or are interested in the work going on there, please
subscribe:
https://lists.wikimedia.org/mailman/listinfo/labs-l
- Ryan
Of course, if the login form code wasn't such a swamp, there'd be a hook
you could use to preprocess the usernames server-side... :-(
The getCanonicalName function in the auth plugins will do this. I was
actually thinking of adding a hook to LdapAuthentication in that
function to let users
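As a rough illustration of what getCanonicalName allows (a hypothetical plugin, not the actual LdapAuthentication code):

<?php
// Hypothetical sketch, not the actual LdapAuthentication code: an AuthPlugin
// subclass can canonicalize a username before MediaWiki looks it up.
class ExampleAuthPlugin extends AuthPlugin {
    public function getCanonicalName( $username ) {
        // Example policy only: trim whitespace and normalize case.
        $username = trim( $username );
        return ucfirst( strtolower( $username ) );
    }
}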
I'd love some help with the OpenStackManager interface. Development of
it is fairly straightforward when the environment is configured, and I
have that done in labs in the openstack project. I can give them
access to that project, of course.
- Ryan
On Fri, Jan 13, 2012 at 6:09 AM, Sumana
I think you're missing the objective of the policy. It aims
at taking away the fear from potential participants to be
harassed. Encouraging people with insufficient social
skills to come is not going to help that cause especially if
pressure is put on the other participants to engage with
I don't think Wikimedia ops can be complicit in turning off editing like
that. Ops operates under the exact opposite goal, doesn't it? To ensure that
the site is continually running, accessible, functioning as much as humanly
possible?
I was under the impression that any SOPA-related action
As was mentioned here, a 503 would be the most appropriate HTTP response
code to serve. It would also prevent non-js users and text-only users from
bypassing, it would avoid the flicker effect, and would cause search
engines to correctly back off trying to index our pages. It would also
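A minimal sketch of that approach (illustrative only, not the code that was actually deployed):

<?php
// Illustrative sketch only: send the blackout page with a 503 and Retry-After
// so crawlers and caches back off rather than indexing or storing the message.
header( 'HTTP/1.1 503 Service Unavailable' );
header( 'Retry-After: 86400' ); // seconds until normal service is expected back
header( 'Content-Type: text/html; charset=utf-8' );
echo '<h1>This site is temporarily unavailable</h1>';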
Strange considering google said to use 503s so it would make their job easier.
https://plus.google.com/u/0/115984868678744352358/posts/Gas8vjZ5fmB
Right. I read that post too. As I said, they told us not to change
return codes. They told us so directly.
I think them telling us directly
I second that.
Also, it seems as if the starting point of this discussion somehow gets lost.
It was about the javascript redirection simply not working for many millions
of browser configurations - like Firefox running NoScript, or Internet
Explorer when configured extra-secure (i.e. because you do
Making sure articles look good in w3m ensures they are machine
processable -- as you never know what kind of machine might be
processing your article.
We have an API for machines. We don't support screen scraping.
- Ryan
Ryan, when I criticized the javascript implementation, I did not intend to be
rude. I see now there might be a lot more considerations besides shocking
visitors and editors, and there seems to have been not much time either. Maybe
I underestimated the task.
I am impressed that you tried
No, there isn't a difference. A blackout where everyone sees a page
with a particular message instead of the article they wanted is
exactly the same as unscheduled downtime where everyone sees a page
with a particular message instead of the article they wanted. If
search engines and caches
Abe is working on our nginx udp logging module. Today he fixed a
pretty annoying log format issue for us at the hackathon.
- Ryan
So, I voted for Saturday at 2pm, but I'm thinking it's unlikely I'll
be able to make it this Saturday. We'll likely need to wait till after
I get back from FOSDEM, which is some time after Feb 11th.
On Tue, Jan 24, 2012 at 7:05 AM, Petr Bena benap...@gmail.com wrote:
Sumana suggested to use this
I requested a configuration file for squid weeks ago, but it seems to be a
bit complicated to remove confidential data from it. I suppose we should
split it into multiple files, having some public and some private, and keep the
public files with no server specific lines in puppet, so that we can
I initially suggested you just dump the files I gave you into puppet
and have a volunteer puppetize it; did you not get that (I might not
have told you), or do you prefer to puppetize it yourself for some
reason (e.g. because it's hard to get right)?
It's hard to get right because it can't
On Fri, Feb 10, 2012 at 5:08 AM, John Du Hart compwhi...@gmail.com wrote:
Unless you've been living under a rock (If you have, how's the wifi under
there?) we're moving to git soon. Along with this will come a change in how
we do code review. However, some people have expressed concerns over
We already have SubPageList, SubPageList2, and SubPageList3 sitting around
(SubPageList and SubPageList3 are in SVN, SubPageList is supposed to be the
one more up to date than either the 2 or 3) inside MW.org and SVN. Heck it's
hard to have as much of a naming mess as we've had with Dynamic
These exist. Gerrit sends notifications when an event occurs (creation
of a new change, comment/review on an existing change, new version of
an existing change, change merged), and it sends these notifications
to all users that have commented on the change, as well as the author.
I believe
I completely agree with that. It would be a complete waste to have Chad
and others' work from the past 5 months thrown away at the last minute for
a solution brought up this late. We can initially use git for the migration
however if we decide to later, Phabricator is still available. :-)
3. jQuery drop down menu - I wanted to implement this functionality on
every page. I had seen the SignUP API wanted this universally. If there
are security issues with AJAX, then there is no need to even implement the
jQuery alongside. (Idea dropped)
Well, just because it would be insecure
I've given a few talks on this at random places. Slides are up on
wikitech in PDF form, and on my blog in ODF format:
http://wikitech.wikimedia.org/index.php?title=File:Ryan_Lane_-_How_to_be_a_part_of_the_MediaWiki_developer_community.pdfpage=1
On Sat, Feb 18, 2012 at 7:39 PM, A jokestr...@gmail.com wrote:
Tonight the Twitter account @wikimediatech sent out two problematic
tweets that began with jfaritu, one of which contains a racial slur.
Not sure who handles this, but please have someone look into it.
Thanks.
A channel troll
That's because the bot records the user's nick in the log message.
On Tue, Feb 21, 2012 at 7:53 AM, Platonides platoni...@gmail.com wrote:
It's interesting to note that the text 'jfaritu' which began the
problematic texts was precisely pointing to the abuser's nick.
As long as everyone's information is known in advance, it can be scripted, yes.
On Mon, Feb 27, 2012 at 3:09 PM, Chad innocentkil...@gmail.com wrote:
Surely this could be scripted.
-Chad
On Feb 27, 2012 6:07 PM, Alexander Klauer graf.z...@gmx.net wrote:
Hi,
Am Montag, 27. Februar 2012
On Tue, Feb 28, 2012 at 4:46 AM, Chad innocentkil...@gmail.com wrote:
On Tue, Feb 28, 2012 at 7:20 AM, Gregory Varnum
gregory.var...@gmail.com wrote:
Could mimic what we did here:
https://www.mediawiki.org/wiki/Git/New_repositories#Step_3:_Request_space_for_your_extension
I think as long as
I don't think that list is up to date. I also think that not everyone
who has access knows how to do that or wants to do that.
The puppet repo (in manifests/site.pp) is the most accurate way of finding this:
$sudo_privs = [ 'ALL = NOPASSWD: /usr/local/sbin/add-ldap-user',
Unfortunately for git-review, I don't think there is a single GUI that
supports it.
There's actually no requirement for using git-review. It simply makes
things easier. If you set up the remote, and make sure to include the
commit-msg hook, you can use anything you want.
I haven't started
I'd hardly call Gerrit a lame horse, more like a horse with funny spots
on it and an extra tail.
Also: what's this mythical best tool? I've not seen it suggested before.
+1
There are alternative solutions, but none of them are viable without
development work. Gerrit is viable right now, in
On Thu, Mar 8, 2012 at 4:03 PM, Platonides platoni...@gmail.com wrote:
I like Gerrit's diff system. Yeah, I know that sounds like trolling; read
below though before discarding this mail.
Sure, inline comments are cool but I think they lack discoverability.
You need to browse all the diffs just in
What if our Labs and thus gerrit logins are different than our
subversion usernames?
Your shell account name is your svn username. It's linked with your
gerrit user name.
- Ryan
Well, sure, but perfect is the enemy of the done. For right now, it can at
least be added as a local user preference (or just be implemented
unconditionally)... I think. That's why I started the thread: to clarify
what needs to be done and who's ready for what. :-)
Please let's not add a silly preference like this. Let's just keep
operating under the idea that all logged in users will use HTTPS, and
keep moving towards that goal.
Okay, I don't disagree. Do you think
https://bugzilla.wikimedia.org/show_bug.cgi?id=29898 should have a different
bug
This is starting in a few minutes. If you are interested, pop into
#wikimedia-tech now.
- Ryan
On Mon, Aug 15, 2011 at 5:23 PM, Ryan Lane rlan...@gmail.com wrote:
The Ubuntu Ensemble community is going to be working on replicating
our environment using Ensemble, inside of the Labs
Roan sent out a new set of HTTPS fixes today, which made us confident
enough to enable protocol-relative URLs and HTTPS on commonswiki and
foundationwiki. We haven't purged the cache yet for these wikis, so
it's very likely some pages will point you back to HTTP. We'll be
purging caches some time
Main thing I notice off the bat is that interwiki links seem to have been
set up to use protocol-relative links that don't actually work yet -- at
https://commons.wikimedia.org/wiki/Atlas there's a link "Stielers Handatlas"
(https://en.wikipedia.org/wiki/en:Stielers_Handatlas) which ends up
We've just released our puppet repository into a public git
repository. For more information, see the blog post about this:
http://blog.wikimedia.org/2011/09/19/ever-wondered-how-the-wikimedia-servers-are-configured/
As noted in the blog post, we are releasing this to treat operations
like a
On Thu, Sep 22, 2011 at 6:29 PM, Brion Vibber br...@pobox.com wrote:
Yay!
I've volunteered to do a quick intro-to-our-scary-git-future session at the
New Orleans hackathon; I'll see if I can lay out a nice workflow
demonstration from a few different perspectives:
* staff or very active
I've talked to some of the ops folks a bit, and we've already agreed
that inline display of diffs is something we really need to add to
Gerrit. I've filed a feature request to this end:
http://code.google.com/p/gerrit/issues/detail?id=1137
It's worth pointing out that Google Code will let
I'm happy to announce we've added native HTTPS support to all of the
projects. See the blog post for more information:
http://blog.wikimedia.org/2011/10/03/native-https-support-enabled-for-all-wikimedia-foundation-wikis/
If you find any bugs, please report in bugzilla.
- Ryan Lane
We're announcing HTTPS support but why isn't this link pointing to
https://blog.wikimedia.org/ ?
I know this is a troll, but I'm going to answer it seriously. We are
supporting https, but we aren't doing https only. We don't have plans
for only doing https. You are welcome to take the link and
Sorry? Note that only a small subset of those with commit access do have
full ssh to run commands. And a good system shouldn't need those hacks
(which is why we are discussing in advance).
I don't see why we can't open up access to basically everyone. I'd
like to make it so that everyone who
You can see the one used live on our sites:
https://gerrit.wikimedia.org/r/gitweb?p=operations/puppet.git;a=blob;f=templates/mysql/prod.my.cnf.erb;h=734009a4c170ffc2f525c4e18fc4c72352cffe45;hb=HEAD
Isn't that the same thing Facebook just caught Congressional shit for?
Not even close. We don't track you from site to site, you just happen
to have a cookie that didn't get deleted that also happens to cause
you to bypass cache. We should likely find out why these aren't
getting deleted.
-
On Mon, Oct 10, 2011 at 1:39 PM, Thomas Gries m...@tgries.de wrote:
WMF Staff Announcement WMF Staff Announcement WMF Staff Announcement .
It would be nice to read more about the actual progress resulting from
the current massive staff expansion.
We put out a very extensive report of
On Mon, Oct 10, 2011 at 2:01 PM, Tomasz Finc tf...@wikimedia.org wrote:
Or just take a look at our software deployments page to see exactly
what changed.
http://wikitech.wikimedia.org/view/Software_deployments
http://wikitech.wikimedia.org/view/Software_deployments/2011_archive
That
Hmm, maybe we should set up a second box in addition to gallium for
this to act as the database host that has all the various DBMSes we
want to test.
https://labsconsole.wikimedia.org/wiki/Main_Page
Just a suggestion ;)
- Ryan
Aha, replication lag would explain it! Since I'm only doing things on
off hours, how long should replication lag be these days?
It shouldn't normally be more than 2-3 seconds. We did have an issue
with the S4 cluster (commonswiki) this weekend where the master had
crashed, and replag was very
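For reference, replication lag ("replag") is just how far a replica trails its master; a minimal sketch of checking it (hypothetical host and credentials):

<?php
// Minimal sketch with a hypothetical host and credentials: replag is how
// many seconds a replica is behind its master, reported by SHOW SLAVE STATUS.
$db = new mysqli( 'replica.example.org', 'user', 'password' );
$status = $db->query( 'SHOW SLAVE STATUS' )->fetch_assoc();
echo 'Seconds behind master: ' . $status['Seconds_Behind_Master'] . "\n";
$db->close();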
As suggested some days ago by Ryan Lane - and I support his view - such
hooks should
- go/come via the core, and in consequence
- via Auth.
The two hooks should be implemented in core as
abstract public function ()
or whatever conforms to our standards (please let me know
then I understood you, let's say, partially. Your last sentence
clarifies the situation in that you indirectly confirm that the
introduction of the two hooks
DeleteAccount and MergeAccountFromTo (in the UserMerge and OpenID
extensions) is - currently - the correct and only way.
If the
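For context, a minimal sketch of the hook mechanism being discussed, using one of the hook names mentioned above (illustrative only, not the actual UserMerge/OpenID code):

<?php
// Illustrative sketch of the hook mechanism under discussion, not the actual
// UserMerge/OpenID code. An extension registers a handler for the event:
$wgHooks['DeleteAccount'][] = function ( &$user ) {
    // Clean up any extension-owned data belonging to $user here.
    return true; // returning true lets other handlers run
};

// ...and the extension that owns the event fires it, assuming $user is the
// account being deleted:
wfRunHooks( 'DeleteAccount', array( &$user ) );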
In case anyone's wondering what I think of this, I was pretty blunt
last time around:
http://lists.wikimedia.org/pipermail/wikitech-l/2011-April/052893.html
To be any more blunt than that, I'd have to press the caps lock key ;)
This post seems to be a much more optimistic view of the 1.17
It's been fixed for three hours now. One thing to note is that the IP
address for the server changed. The DNS entry's cache settings are for
one hour, so you should be able to access it without issues now. I just
tried it and it's working for me. When was the last time you tried it?
On Wed, Jun 8,
In anticipation for proper HTTPS support on Wikimedia sites, we will
be enabling protocol relative URLs. This means that things like logos,
references to resources, and interwiki links will use links like:
//en.wikipedia.org/wiki/Main_Page
instead of absolute links like http://en.wikipedia.org/wiki/Main_Page.
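A minimal LocalSettings.php sketch of what this looks like for a single wiki (assumes a MediaWiki version with protocol-relative support; the domain is just an example):

<?php
// Minimal LocalSettings.php sketch (example domain): a protocol-relative
// $wgServer makes generated URLs inherit the scheme of the page being viewed.
$wgServer = '//en.wikipedia.org';
// Links then render as //en.wikipedia.org/wiki/Main_Page and resolve to
// http:// or https:// depending on how the reader loaded the page.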
My personal preference would be to run *all* logged-in activity over HTTPS,
so every mail link etc should be on SSL. But I think that's still a ways out
yet and will need better SSL acceleration; poor Ryan Lane will kill me if I
keep pushing on that too soon! ;)
Actually, this is exactly
Really great to have you on the team. Looking forward to working with you!
On Thu, Jun 30, 2011 at 4:31 PM, CT Woo ct...@wikimedia.org wrote:
All,
Please join me to welcome Jeff Green to Wikimedia Foundation.
Jeff is taking up the Special Ops position in the Tech Ops department where
one of
[quote]
Summer of Research 2011
http://meta.wikimedia.org/wiki/Research:Wikimedia_Summer_of_Research_2011
Asher Feldman and Ryan Lane (http://www.mediawiki.org/wiki/User:Ryan_lane)
created the systems infrastructure for the Summer of Research team to
perform data mining and analysis work
Over the past couple days Roan Kattouw and I have been pushing out
changes to enable protocol-relative URL support. We've gotten to a
point where we think it is stable and working.
We've enabled this on test.wikipedia.org, and plan on running it for
two weeks before enabling it elsewhere. Please
Gave Aaron Parecki access to extensions. He's been working on
extensions such as:
* MediaWiki-SEO-Title-Tag
* MediaWiki-Changelog-Graphs
* MediaWiki-Glossary-Extension
- Ryan
On Wed, Aug 3, 2011 at 11:10 PM, Asher Feldman afeld...@wikimedia.org wrote:
From the orig post: "Recent Intel CPUs have a feature called AES-NI
(http://en.wikipedia.org/wiki/AES_instruction_set) that accelerates AES
processing. A CPU with AES-NI can perform 5 to 10 times faster than a CPU
without it."
On Sat, Aug 6, 2011 at 2:38 AM, Thomas Morton
morton.tho...@googlemail.com wrote:
I am not sure who might be in a position to correct this, but this list
seems the most likely..
For some reason sep11.wikipedia.org subdomain is forwarding to a spam site -
this was pointed out on OTRS earlier.
Set-up of etherpad-lite: about 3 minutes.
I'm more than happy to make you an instance in Labs to install this
and puppetize it for us.
- Ryan
[1] that is being updated. A lot of the live configs are also on NOC
[2]. Between sites like NOC and access to Puppet (via Git), you've got a
majority of the data AS it is actually used (rather than as it was,
when written, on wiki).
Well, sure, but that's nap-of-earth; without a solid
Andrew Bogott (andrew) - A new devops contractor working on the Labs project.
- Ryan
On Tue, Oct 25, 2011 at 1:43 PM, Platonides platoni...@gmail.com wrote:
Ryan Lane wrote:
https://labsconsole.wikimedia.org/wiki/Main_Page
Just a suggestion ;)
- Ryan
1) This is the first time it is mentioned in this mailing list.
It isn't the first time labs has been mentioned. I was suggesting
As someone who tested
http://nova-controller.tesla.usability.wikimedia.org/ it was quite a
shock to find it out in this way.
Well, we released it at the hack-a-thon for early beta testing. It
isn't ready for MediaWiki development, so I didn't publicize it to
this list just yet.
Once I work
Doing this doesn't work, and never did. Creepy Jimmy, with his "come
to my van, child" eyes, invariably comes back,
like a particularly nasty genital infection, within 2 days at most.
Look, there have been numerous ways discussed to remove the banners
and never see them ever
Agreed. I've wanted most of the stuff Petr has put into this bot in
the operations channels for quite a while. It works well, and is being
actively maintained.
I don't care what language it is in. Learning a new language enough to
do simple fixes and basic maintenance takes a week or two. Who
You need a labs account for this to work. If you'd like one, email me
with the following information:
1. Your preferred wiki user name. This will also be your gerrit/git
user name. If you want your git username to be your real name, then
make this your real name.
2. Your svn account name
3. Your