Re: [Wikitech-l] Change the installer to make Project: the default option for meta namespace name

2014-05-03 Thread Nathan Larson
On Sat, May 3, 2014 at 11:08 PM, Max Semenik  wrote:

> This proposal makes no sense: all these namespaces _are_ dependent on wiki
> language, so even if you force "Project" down the throats of non-English
> users, project talk would still be e.g. Project_ахцәажәара for Abkhazian
> wikis and so on. Thus, you're attempting to disrupt MediaWiki's
> localizability for the sake of your ephemerous Inclumedia project. I don't
> think this is what our users want.


That's a non sequitur, as there has been no mention of Inclumedia anywhere
in the discussion of this bug, and this proposal has no relation to
Inclumedia. Nor does it have much relevance to localisation, as far as I
know. The option of making Project: the name of the meta namespace is already
available, and works fine when selected; this merely makes it the default
option in the installer, so as to encourage what appears to be a better
practice.

If you choose Abkhazian as the language in the installer, it will give you
the option of setting the meta namespace to Проект (Abkhazian for
"Project"). This appears to be set in includes/installer/i18n/ru.json,
where the line reads "config-ns-generic": "Проект". In MessagesAb.php and
so on, the meta talk namespace name is set by
$namespaceNames[NS_PROJECT_TALK]; in that file, it says NS_PROJECT_TALK =>
'$1_ахцәажәара'. If you pick that option, your meta talk namespace will be
Проект_ахцәажәара, as it should be. So it seems this wouldn't interfere
with localisation in the way you describe.

I don't see how the proposed change forces anything down the throats of
anyone any more than the current situation does; some option or another has
to be the default, so the question is what's the best option to encourage
people to pick.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Change the installer to make Project: the default option for meta namespace name

2014-05-03 Thread Nathan Larson
On Sat, May 3, 2014 at 7:11 PM, K. Peachey  wrote:

> On 4 May 2014 05:38, Nathan Larson  wrote:
> >
> > 4) We don't have any other namespace names (User, Template, etc.) that
> are,
> > by default, different from one wiki to the next; what is the logic for
> > making an exception for the meta namespace?
> >
> >
> Because the Template/User namespaces aren't the same thing as the project
> namespace?


The Category namespace isn't the same thing as the Template or User
namespaces either, but by default it's called by the same name from one
wiki to the next. So, what makes the meta namespace so special?

If the argument is "The contents of the meta namespace differ from one wiki
to the next; therefore the name of the meta namespace should also differ",
that same argument could apply to all the other namespaces as well. The
templates of enwiktionary, for example, differ from the templates of
enwiki, but the namespace is called Template: on both.

[Wikitech-l] Change the installer to make Project: the default option for meta namespace name

2014-05-03 Thread Nathan Larson
I propose that we change the installer to make Project:, rather than the
wiki name, the default option for meta namespace
name (i.e. namespace
4 <https://www.mediawiki.org/wiki/Manual:Namespace#Built-in_namespaces>).
See bug 60168 <https://bugzilla.wikimedia.org/show_bug.cgi?id=60168>. Here
is an image of what I have in mind for the installer to show:
https://upload.wikimedia.org/wikipedia/mediawiki/b/b9/ProjectDefaultNamespace.png

Rationales:
1) Standardizing the name makes life easier when people copy content from
one wiki to another. E.g., if you import a template that links to
Project:Policy rather than BarWiki:Policy, you don't need to edit the
imported template, because it already links where it should.

2) Having the meta namespace name be the same as the wiki name causes
collisions with interwiki prefixes when content is copied from one wiki to
another. E.g., suppose FooWiki has an interwiki prefix linking to BarWiki
(barwiki:) and then a user imports a template from BarWiki that links to
BarWiki:Policy. That will now link interwiki to BarWiki's mainspace page
"Policy" rather than to Project:Policy. It is not even possible to import
BarWiki:Policy until the interwiki barwiki: prefix is first removed; the
importer will say "Page 'BarWiki:Policy' is not imported because its name
is reserved for external linking (interwiki)."

3) If you ever change the name of the wiki, you then have to change all
the wikilinks to the FooWiki: namespace; if it were Project:, they would
stay the same. True, you can use namespace aliases
<https://www.mediawiki.org/wiki/Manual:$wgNamespaceAliases>, but that could
defeat part of the point of rebranding, which is to deprecate use of the
old wiki name.
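As a sketch of the alias approach (a LocalSettings.php fragment; the
FooWiki name is carried over from the examples above and is illustrative):

```php
// LocalSettings.php sketch: after rebranding, point the old meta
// namespace name at NS_PROJECT so existing FooWiki: wikilinks still
// resolve, while Project: becomes the canonical name.
$wgMetaNamespace = 'Project';
$wgNamespaceAliases = [
    'FooWiki'      => NS_PROJECT,
    'FooWiki_talk' => NS_PROJECT_TALK,
];
```

The alias keeps old links working, but every page that uses the old name
still displays it, which is why aliasing only partially solves the
rebranding problem.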

4) We don't have any other namespace names (User, Template, etc.) that are,
by default, different from one wiki to the next; what is the logic for
making an exception for the meta namespace?

-- 
Nathan Larson <https://mediawiki.org/wiki/User:Leucosticte>

Re: [Wikitech-l] RFC discussion Friday 14 Feb: TitleValue

2014-02-11 Thread Nathan Larson
On Mon, Feb 10, 2014 at 11:23 PM, Sumana Harihareswara <
suma...@wikimedia.org> wrote:

> The discussion will be in #wikimedia-office at 13:00 UTC on Friday the
> 14th:
>
> http://www.timeanddate.com/worldclock/fixedtime.html?msg=RFC+review&iso=20140214T14&p1=37&ah=1


Cool, now I have something to tell people when they ask what I'll be doing
for Valentine's Day.

Re: [Wikitech-l] Let's improve our password policy

2014-02-06 Thread Nathan Larson
On Thu, Feb 6, 2014 at 9:58 AM, Chris Steipp  wrote:

> 1) As I understand it, the reason we went from 0 to 1 character required is
> spammers were actively trying to find accounts with no password so they
> could edit with an autoconfirmed account. We rely on "number of
> combinations of minimum passwords" to be greater than "number of tries
> before an IP must also solve captcha to login" to mitigate some of this,
> but I think there are straightforward ways for a spammer to get accounts
> with our current setup. And I think increasing the minimum password length
> is one component.
>
> 2) We do have a duty to protect our user's accounts with a reasonable
> amount of effort/cost proportional to the weight we put on those
> identities. I think we would be in a very difficult spot if the foundation
> tried to take legal action against someone for the actions they took with
> their user account, and the user said, "That wasn't me, my account probably
> got hacked. And it's not my fault, because I did the minimum you asked me."
> So I think we at least want to be roughly in line with "industry standard",
> or have a calculated tradeoff against that, which is roughly 6-8 character
> passwords with no complexity requirements. I personally think the
> foundation and community _does_ put quite a lot of weight into user's
> identities (most disputes and voting processes that I've seen have some
> component that assume edits by an account were done by a single person), so
> I think we do have a responsibility to set the bar at a level appropriate
> to that, assuming that all users will do the minimum that we ask. Whether
> it's 4 or 6 characters for us I think is debatable, but I think 1 is not
> reasonable.
>

1) Merely increasing the length could increase required keystrokes without
making it more secure. A couple of comments from the meeting:
 "" ain't secure
 "password" isn't secure either, and that's 8

It seems to me that a pretty secure approach would be to have the system
give the user his 8-12 character password, rather than letting him pick a
password. Then we can be assured that he's not doing stuff like "p@ssword"
to meet the complexity requirements.
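For reference, the knob under discussion is a single setting, and a
system-assigned password could be a small helper; both are hedged sketches
(plain PHP, not existing MediaWiki code; random_int() assumes PHP 7+, and
the alphabet and length range are illustrative):

```php
<?php
// Sketch: server-side generation of an 8-12 character password, so the
// user never gets to pick something like "p@ssword". The alphabet omits
// easily confused characters (0/O, 1/l/I).
function generateAssignedPassword(): string {
    $chars = 'abcdefghijkmnpqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ23456789';
    $length = random_int( 8, 12 );
    $password = '';
    for ( $i = 0; $i < $length; $i++ ) {
        $password .= $chars[ random_int( 0, strlen( $chars ) - 1 ) ];
    }
    return $password;
}

// The minimum-length knob itself is an existing setting:
$wgMinimalPasswordLength = 8;   // value illustrative
```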

2) How plausible is the legal-action scenario you mention? Has the WMF
ever taken, or would it ever take, legal action against someone for actions
taken with their user account? Why would that happen, when any damage done
by a non-checkuser can generally be reverted/deleted/etc.? What would be
the remedy; trying to get money out of the person? It probably wouldn't
amount to much.

Re: [Wikitech-l] Let's improve our password policy

2014-02-05 Thread Nathan Larson
On Wed, Feb 5, 2014 at 2:58 AM, Tyler Romeo  wrote:

> For example, MZMcBride, what if your password is "wiki", and somebody
> compromises your account, and changes your password and email. You don't
> have a committed identity, so your account is now unrecoverable. You now
> have to sign up for Wikipedia again, using the username "MZMcBride2". Of
> course, all your previous edits are still accredited to your previous
> account, and there's no way we can confirm you are the real MZMcBride, but
> at least you can continue to edit Wikipedia... Obviously you are not the
> best example, since I'm sure you have ways of confirming your identity to
> the Wikimedia Foundation, but not everybody is like that. You could argue
> that if you consider your Wikipedia account to have that much value, you'd
> put in the effort to make sure it is secure. To that I say see the above
> paragraph.
>

What if all of the email addresses that a user has ever used were to be
stored permanently? Then in the event of an account hijacking, he could say
to WMF, "As your data will confirm, the original email address for user Foo
was f...@example.com, and I am emailing you from that account, so either my
email account got compromised, or I am the person who first set an email
address for user Foo." The email services have their own procedures for
sorting out situations in which people claim their email accounts were
hijacked.

Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-02-01 Thread Nathan Larson
On Sat, Feb 1, 2014 at 2:20 PM, Tyler Romeo  wrote:

> Your scenario is based on the premise that Wikipedia vandals care enough
> about vandalizing Wikipedia that they would get in their car (assuming
> they're old enough to have a license), drive to the nearest Starbucks,
> vandalize Wikipedia, and then drive somewhere else when they are blocked.


Also, a lot of wifi hotspots, such as those at restaurants, have already
had their Wikipedia editing access blocked.

Re: [Wikitech-l] Non-complicated way to display configuration for a given wiki?

2014-01-24 Thread Nathan Larson
On Fri, Jan 24, 2014 at 9:35 AM, Tim Landscheidt wrote:

> Hi,
>
> to test changes to operations/mediawiki-config, I'd like to
> display the settings for a given wiki.
>
> So far I figured out:
>
> | 
> | $IP = 'wmf-config';
> | $cluster = 'pmtpa';
>
> | require ('/home/tim/public_html/w/includes/SiteConfiguration.php');
> | require ('wmf-config/wgConf.php');
>
> | var_dump ($wgConf->getAll ([...]));
>
> However, all my creativity regarding parameters to getAll()
> only returned an empty array.
>
> What would be the parameters needed for example to get the
> settings for de.wikipedia.org?


Are you saying you want to display the settings for your own wiki or
someone else's wiki (whose backend you can't access)? If it's your own
wiki, you can use ViewFiles to let people access a special page that will
display a configuration file.
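For what it's worth, a sketch of the kind of call that should work with the
snippet above (the tag values for de.wikipedia.org are my assumptions about
wmf-config, not verified):

```php
// SiteConfiguration::getAll( $wiki, $suffix, $params ): $wiki is the
// database name, $suffix the per-family suffix, and $params supplies
// substitutions such as $lang. The values below are assumptions.
var_dump( $wgConf->getAll(
    'dewiki',                                  // database name
    'wiki',                                    // Wikipedia-family suffix
    [ 'lang' => 'de', 'site' => 'wikipedia' ]  // common substitutions
) );
```

An empty array usually means the wiki/suffix pair didn't match anything in
$wgConf's suffix and wiki lists, which is consistent with the symptom you
describe.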

Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Nathan Larson
On Fri, Jan 17, 2014 at 6:33 PM, Erik Moeller  wrote:

> It would allow a motivated person to reset their identity and go
> undetected provided they avoid the kind of articles and behaviors they
> got in trouble over in the first place. It's not clear to me that the
> consequences would be particularly severe or unmanageable beyond that.


People also get banned from Wikimedia projects for off-wiki conduct that
has nothing to do with Wikimedia projects.

Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Nathan Larson
On Fri, Jan 17, 2014 at 1:21 PM, Erik Moeller  wrote:

> I tested the existing process by creating a new riseup.net email
> account via Tor, then requesting account creation and a global
> exemption via stewa...@wikimedia.org. My account creation request was
> granted, but for exemption purposes, I was requested to go through the
> process for any specific wiki I want to edit. In fact, the account was
> created on Meta, but not exempted there.


Thanks for taking the initiative to check that out. Now maybe the stewards
will be paranoid that any further Tor requests might be from you, and act
accordingly. It kinda reminds me of the Rosenhan experiment. Just the
known possibility that there might be a mystery customer keeps the service
providers on their toes, and they are much more likely to mistake an
ordinary customer for the mystery customer than vice versa, as demonstrated
by the non-existent impostor experiment.

Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-17 Thread Nathan Larson
I forgot to mention, another problem is that you can't even import
Wikipedia: namespace pages to your wiki without changing your interwiki
table to get rid of the wikipedia: interwiki prefix first. The importer
will say, "Page 'Wikipedia:Sandbox' is not imported because its name is
reserved for external linking (interwiki)." As I noted in bug 60168, it's
unclear why we would standardize the other namespace names (Help:,
Template:, MediaWiki:, User:, etc.) from one wiki to the next, but not
the Project: namespace.

Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-17 Thread Nathan Larson
On Fri, Jan 17, 2014 at 9:00 AM, go moko  wrote:

> >What if filling the interwiki table with predefined links was an
> installation option, possibly with several lists, and void?


Probably won't (and shouldn't) happen, since we're trying to keep the
installer options close to the bare minimum. Changing or clearing the
interwiki table is pretty easy with the right script or special page. The
fact that, when the installer's interwiki list was changed in 2013, it was
accurate to say "This list has obviously not been properly updated for many
years. There are many long-dead sites that are removed in this patch"
suggests that third party wikis are pretty good at ignoring interwiki links
they don't want or need.

I disagree that "collisions are very rare, and none of the alternatives
seem viable or practical". Collisions (or whatever one would call them)
happen fairly often, and the resulting linking errors can be hard to
notice, because one sees that the link is blue and assumes it's going where
one wanted it to.

It wouldn't be such a problem if wikis would name their project namespace
Project: rather than the name of the wiki. Having it named Project: would
be useful when people are importing user or project pages from Wikipedia
(e.g. if they wanted to import userboxes or policy pages) and don't want
the Wikipedia: links to become interwiki links. I would be in favor of
renaming the project namespaces to Project: on Wikimedia wikis; that's how
it is on MediaWiki.org (to avoid a collision with the MediaWiki: namespace)
and it seems to work out okay. I'll probably start setting up my third
party wikis that way too, because I've run into similar problems when
exporting and importing content among them. Perhaps the installer should
warn that it's not recommended to name the meta namespace after the site
name.

Tim's proposal seems pretty elegant, but in a few situations it will make
links uglier or hide where they point to. E.g. "See also" sections with
interwiki links (like what you see here) could become like the "Further
reading" section you see here, in which one has to either put bare links or
make people hover over the link to see the URL it goes to.

Interwiki page existence detection probably wouldn't be any more difficult
to implement in the absence of interwiki prefixes. We could still have an
interwiki table, but page existence detection would be triggered by certain
URLs rather than prefixes being used. I'm not sure how interwiki
transclusion would work if we didn't have interwikis; we'd have to come up
with some other way of specifying which wiki we're transcluding from,
unless we're going to use URLs for that too.

In short, I think the key is to come up with something that doesn't break
silently when there's a conflict between an interwiki prefix and namespace.
For that purpose, it would suffice to keep interwiki linking and come up
with a new delimiter. But changing the name of the Project: namespace would
work just as well. Migration of links could work analogously to what's
described in bug 60135
.

TTO, you were saying "I'm not getting a coherent sense of a direction to
take" -- that could be a good thing at this point in the discussion; it
could mean people are still keeping an open mind and wanting to hear more
thoughts and ideas rather than making too hasty of a conclusion. But I
guess it is helpful, when conversations fall silent, for someone to push
for action by asking, "...so, in light of all that, what do you want to
do?" :)

Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-16 Thread Nathan Larson
On Thu, Jan 16, 2014 at 10:56 PM, Tim Starling wrote:

> I think the interwiki map should be retired. I think broken links
> should be removed from it, and no new wikis should be added.
>
> Interwiki prefixes, local namespaces and article titles containing a
> plain colon intractably conflict. Every time you add a new interwiki
> prefix, main namespace articles which had that prefix in their title
> become inaccessible and need to be recovered with a maintenance script.
>
> There is a very good, standardised system for linking to arbitrary
> remote wikis -- URLs. URLs have the advantage of not sharing a
> namespace with local article titles.
>
> Even the introduction of new WMF-to-WMF interwiki prefixes has caused
> the breakage of large numbers of article titles. I can see that is
> convenient, but I think it should be replaced even in that use case.
> UI convenience, link styling and rel=nofollow can be dealt with in
> other ways.
>

These are some good points. I've run into a problem many times when
importing pages (e.g. templates and/or their documentation) from Wikipedia,
that pages like [[Wikipedia:Signatures]] become interwiki links to
Wikipedia mainspace rather than redlinks. Also, usually I end up accessing
interwiki prefixes through templates like Template:w anyway. It would be a
simple matter to make those templates generate URLs rather than interwiki
links. The only other way to prevent these conflicts from happening would
be to use a different delimiter besides a single colon; but what would that
replacement be?

Before retiring the interwiki map, we could run a bot to edit all the pages
that use interwiki links, and convert the interwiki links to template uses.
A template would have the same advantage as an interwiki link in making it
easy to change the URLs if the site were to switch domains or change its
URL scheme.

Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-16 Thread Nathan Larson
On Thu, Jan 16, 2014 at 7:35 PM, This, that and the other <
at.li...@live.com.au> wrote:

> I can't say I care about people reading through the interwiki list. It's
> just that with the one interwiki map, we are projecting "our" internal
> interwikis, like strategy:, foundation:, sulutil:, wmch: onto external
> MediaWiki installations.  No-one needs these prefixes except WMF wikis, and
> having these in the global map makes MediaWiki look too WMF-centric.
>

It's a WMF-centric wikisphere, though. Even the name of the software
reflects its connection to Wikimedia. If we're going to have a
super-inclusive interwiki list, then most of those Wikimedia interwikis
will fit right in, because they meet the criteria of having non-spammy
recent changes and significant content in AllPages. If you're saying that
having them around makes MediaWiki "look" too WMF-centric, it sounds like
you are concerned about people reading through the interwiki list and
getting a certain impression, because how else would they even know about
the presence of those interwiki prefixes in the global map?


> I don't see the need for instruction creep here.  I'm for an inclusive
> interwiki map.  Inactive wikis (e.g. RecentChanges shows only sporadic
> non-spam edits) and non-established wikis (e.g. AllPages shows little
> content) should be excluded.  So far, there have been no issues with using
> subjective criteria at meta:Talk:Interwiki map.


I dunno about that. We have urbandict: but not dramatica:, both of which
are unreliable sources but likely to be used on third-party wikis (at least
the ones I edit). We have wikichristian: (~4,000 content pages) but not
rationalwiki: (~6,000 content pages). The latter was rejected a while ago.
Application of the subjective criteria seems to be hit-or-miss.

If we're going to have a hyper-inclusionist system of canonical interwiki
prefixes, we might want to use WikiApiary and/or WikiIndex rather than
MediaWiki.org as the venue. Those wikis already have a page for every wiki;
they could add an interwiki prefix field to those templates and manage the
prefixes by editing pages. Thingles said he'd be interested in WikiApiary's
getting involved. The only downside is that WikiApiary doesn't have
non-MediaWiki wikis. It sounded as though Mark Dilley might be interested
in WikiIndex's playing some role in this too. But even WikiIndex has the
problem of only containing wikis; the table would have to include other
websites as well.

Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-16 Thread Nathan Larson
On Thu, Jan 16, 2014 at 4:53 AM, This, that and the other <
at.li...@live.com.au> wrote:

> 1. Split the existing interwiki map on Meta [2] into a "global interwiki
> map",
>located on MediaWiki.org (draft at [3]), and a "WMF-specific interwiki
> map"
>on Meta (draft at [4]).  Wikimedia-specific interwiki prefixes, like
>bugzilla:, gerrit:, and irc: would be located in the map on Meta,
> whereas
>general-purpose interwikis, like orthodoxwiki: and wikisource: would go
> to
>the "global map" at MediaWiki.org.
>

Why is it worth the trouble of maintaining two separate lists? Do the
Wikimedia-specific interwiki prefixes get in people's way, e.g. when
they're reading through the interwiki list and encounter what is, to them,
useless clutter? As the list starts getting longer (e.g. hundreds,
thousands or tens of thousands of prefixes), people will probably do a Find
on the list rather than scrolling through, so it may not matter much if
there's that little bit of extra clutter. Sometimes I do use those
Wikimedia-specific prefixes on third-party wikis (e.g. if I'm talking about
MediaWiki development issues), and they might also end up getting used if
people import content from Wikimedia wikis.


> * Define a proper scope for the interwiki map.  At the moment it is a bit
>   unclear what should and shouldn't be there.  The fact that we currently
> have
>   a Linux users' group from New Zealand and someone's personal blog on the
> map
>   suggests the scope of the map have not been well thought out over the
> years.
>   My suggested criterion at [3] is:
>

People will say we should keep those interwikis for historical reasons. So,
I think we should have a bot ready to go through the various wikis and make
edits converting those interwiki links to regular links. We should make
this tool available to the third-party wikis too. Perhaps it could be a
maintenance script.


> "Most well-established and active wikis should have interwiki
> prefixes, regardless of whether or not they are using MediaWiki
> software.
> Sites that are not wikis may be acceptable in some cases,
> particularly if they are very commonly linked to (e.g. Google,
> OEIS)."
>

Can we come up with numerical cutoffs for what count as "well-established",
"active", and "very commonly linked to", so that people know what to expect
before they put a proposal forth, or will it be like notability debates,
and come down to people's individual opinions of what should count as "very
commonly linked to" (as well as a certain amount of
ILIKEIT and IDONTLIKEIT, even if users deny that's the basis for their
decision)?
We might get the help of WikiIndex and (especially) WikiApiary in getting
the necessary statistics.


> ** Many of the links are long dead. (snip)
> ** We could add API URLs to fill the iw_api column in the database
>    (currently empty by default).
>

Those two should be uncontroversial.


> Sorry for the long message, but I really think this topic has been
> neglected
> for such a long time.
>

It's okay, it's a complicated subject with a lot of tricky implementation
decisions that need to be made (which is probably part of why it's been
neglected). Thanks for taking the time to do a thorough analysis.

Re: [Wikitech-l] Inclupedia: Developers wanted

2014-01-14 Thread Nathan Larson
On Tue, Jan 14, 2014 at 6:22 PM, Matthew Flaschen
wrote:

> That's not the case.  There are components for software the WMF does not
> use.  This ranges from major projects like Semantic MW to one-off
> extensions that WMF does not have a use for (e.g. Absentee Landlord) to
> tools that work *with* MW but are not part of it (e.g. Tools Labs tools,
> Pywikibot).


I guess it would depend on making the scope of the bug broad enough that it
would seem useful for more than just one site. E.g. one could put
"implement the functionality needed for Inclupedia". That functionality
could be reused for any number of sites, just like SMW's code. On the other
hand, if someone were to say "Switch configuration setting x to true on
Inclupedia" that would be of little interest to non-Inclupedia users, I
would think. I assume that Bugzilla is not intended as the place for all
technical requests for the entire wikisphere?

Re: [Wikitech-l] Inclupedia: Developers wanted

2014-01-13 Thread Nathan Larson
On Mon, Jan 13, 2014 at 2:27 PM, Brian Wolff  wrote:

> Id say that http://getwiki.net/-GetWiki:1.0 was similar to your "superset"
> concept (minus the merging part)


Yeah, per WikiIndex, "Instead of red links,
GetWiki uses green links to point to articles which do not exist locally.
When the user follows such a link, GetWiki tries to dynamically fetch it
from the wiki designated as an external source (in Wikinfo's case, the
English Wikipedia), renders and displays the article text. A local copy is
created only if the page is edited. Effectively, Wikinfo therefore provides
a transparent 'wrapper' around Wikipedia pages which have not yet been
copied."

That sounds pretty easy to implement, compared to what is contemplated for
Inclupedia. If a page had been deleted from Wikipedia, presumably it
wouldn't have been accessible on Wikinfo unless someone had created a fork
of that page on Wikinfo prior to its deletion from Wikipedia. That's a
major absence of Inclupedia (and Deletionpedia) functionality.

Also, a high-traffic live mirror would be contrary to the live mirrors
policy, as the load on WMF servers would increase at the same rate as the
wiki's readership. A site that only polled WMF servers for changes, and
stored local copies of those changes, would not have that problem. To the
contrary, it might take load off of WMF servers, if some readers were to
retrieve mirrored Wikipedia pages from Inclupedia that they would have
otherwise retrieved from Wikipedia.

The approach used by GetWiki of combining a Wikipedia mirror with locally
stored forks is probably more suitable for either (1) a general
encyclopedia with a non-neutral viewpoint (e.g. a Sympathetic Point of
View), or (2) a site that wishes to have a narrower focus than that of a
general encyclopedia. E.g., if you ran a site like Conservapedia
(conservative bias) or Tampa Bay Wiki (narrow focus), while you were
building up the content, you might want to be able to wikilink to articles
non-existent on your local wiki, such as "Florida", and dynamically pull
content from Wikipedia for users who click on those links. In the case of
Conservapedia, Wikipedia's "Florida" content might be considered better
than nothing, pending the creation of a forked version of that content that
would be biased conservatively. In the case of Tampa Bay Wiki, the content
of the "Florida" article might be sufficient for their purposes, so they
could just keep serving the mirrored content from Wikipedia forever.

If a site were to aspire to be a general encyclopedia with a neutral point
of view, it would be better to discourage or disable, as much as possible,
forking of Wikipedia's articles. Once an article is forked, it will require
duplication of Wikipedians' labor in order to keep the content as
well-written, comprehensive, well-researched, and in general as
high-quality and up-to-date as Wikipedia's coverage of the subject. It
would be better to instead mirror those articles, and have users go to
Wikipedia if they want to edit them. If users edit articles that have been
deleted from Wikipedia, on the other hand, there is no forking going on,
and therefore no duplication of labor. Unnecessary duplication of
Wikipedians' labor tends to demoralize and distract users from building up
complementary content; therefore, NPOV general encyclopedia community
builders should consider it anathema.

GetWiki and Wikinfo don't appear to have prospered. It would seem that
GetWiki was created in 2004 and that Wikinfo abandoned it in March 2007 and
switched to MediaWiki. According to Wikipedia, "In 2013, the content was
removed without explanation". I'm glad I didn't devote too much labor to
creating content on Wikinfo. GetWiki.net itself has had no activity since
2012.

The main problem with an NPOV general encyclopedia's forking Wikipedia is
that, for the purposes of most people, there's not a very compelling reason
to do it. They deem the quality of the product Wikipedia offers, within the
purview of its coverage (viz. NPOV notable topics) to be good enough that
it's not worthwhile to create a fork. Wikipedia's shortcoming, from the
standpoint of inclusionists, is not so much insufficient quality of
articles, as insufficient quantity of topics covered. Forking is more
suitable for those who find the quality insufficient to meet their
particular needs.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Inclupedia: Developers wanted

2014-01-13 Thread Nathan Larson
TLDR: I seek fellow developers with whom to collaborate on creating the
largest and most inclusive wiki in the world, Inclupedia.
http://meta.inclumedia.org/wiki/Inclupedia

Inclupedia is a project to make available, on an OpenEdit
<http://wikiindex.org/Category:OpenEdit> wiki, pages deleted from Wikipedia
for notability reasons, as well as articles created from scratch on
Inclupedia. Thus, it will combine elements of Deletionpedia
<https://en.wikipedia.org/wiki/Deletionpedia> and Wikinfo
<http://wikiindex.org/Wikinfo>, as well as the various Wikipedia mirrors
<https://en.wikipedia.org/wiki/Wikipedia:Mirrors_and_forks>. It will be, in
other words, a supplement to, and an up-to-date mirror of, Wikipedia
content.

Inclupedia seeks to accomplish the entire inclusionist
<https://en.wikipedia.org/wiki/Deletionism_and_inclusionism_in_Wikipedia>
agenda through technical means, rather than through a political solution
that would require persuading deletionists and moderates to change their
wiki-philosophies. Inclupedia will have no notability requirements for
articles, and will let people post (almost) whatever they want in
userspace. It will also, however, learn from the failures of the various
Wikipedia forks that could not sustain much activity because they had no
way of becoming a comprehensive encyclopedia without duplicating
Wikipedians' labor.

Complete, seamless, and continuous integration with Wikipedia is required.
That is what will enable Inclupedia to be different from those ill-fated
aspirants to the throne whose abandoned, rotting carcasses now litter the
wikisphere. Inclupedia aspires to be the largest and most inclusive wiki in
the world, since the set of Inclupedia pages (and revisions) will always be
a superset <https://en.wiktionary.org/wiki/superset> of the set of
Wikipedia pages (and revisions).

People sometimes ask, "Hasn't this already been done?" It would seem that
it hasn't, which is why so much of the implementing code has to be designed
and developed rather than borrowed or reverse-engineered. In some ways, the
closest projects to this one may have been the various proprietary sites
that used the Wikimedia update feed service
<https://meta.wikimedia.org/wiki/Wikimedia_update_feed_service> to stay
continuously up-to-date with Wikipedia, but to my knowledge none of them
used MediaWiki as their engine, and their inner workings are a mystery.
Those also tended to be read-only rather than mass-collaborative sites.

The core of what needs to be done is (1) developing one or more bots to
pull post-dump data from Wikipedia and push it to Inclupedia, (2)
developing the capability to merge received data into Inclupedia without
losing any data or suffering inconsistencies, e.g. resulting from
collisions with existing content (as might happen in mirrored page moves
whose destinations already exist on Inclupedia), and (3) developing all the
other capabilities involved in running a site that's both a mirror and a
supplement, e.g. locking mirrored pages against editing by Inclupedians
(unless there will be forking/overriding capability).
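
Step (1) above could be sketched as a polling loop against the standard
MediaWiki action API. This is only a rough illustration, not Inclupedia's
actual design; the merge routine (inclupediaMergeRevision) is a hypothetical
placeholder for the collision-resolving logic step (2) calls for.

```php
<?php
// Sketch: poll Wikipedia's API for recent changes and hand each one to a
// (not-yet-written) merge routine. The API endpoint and query parameters
// are standard MediaWiki; everything named "inclupedia*" is hypothetical.
$api = 'https://en.wikipedia.org/w/api.php';
$params = http_build_query( array(
	'action'  => 'query',
	'list'    => 'recentchanges',
	'rcprop'  => 'title|ids|timestamp',
	'rclimit' => 50,
	'format'  => 'json',
) );
$data = json_decode( file_get_contents( "$api?$params" ), true );
foreach ( $data['query']['recentchanges'] as $rc ) {
	// Hypothetical: insert the revision into the local database, resolving
	// collisions with pages created locally on Inclupedia.
	inclupediaMergeRevision( $rc['title'], $rc['revid'], $rc['timestamp'] );
}
```

A production version would presumably use the continuation parameters
(rccontinue) and track the last-seen timestamp rather than refetching the
latest 50 changes each run.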

I can't exactly post a bug to MediaZilla saying "Create Inclupedia" and
then have a bunch of different bugs it depends on, because non-WMF projects
are beyond the scope of MediaZilla. Some bugs (e.g. bug
59618<https://bugzilla.wikimedia.org/show_bug.cgi?id=59618>)
concerning Inclupedia-reliant functionality are already in MediaZilla, but
there will inevitably arise completely Inclupedia-specific matters that
need to be dealt with in a different venue. Presumably, it'll be necessary
to create a whole new infrastructure of bug reporting, mailing lists, IRC
channels, etc. But, I want to get it right from the beginning, since this
is an opportunity to start from scratch (e.g. maybe there is a better code
review tool than Gerrit?) I have created Meta-Inclu as a venue for project
coordination.

Mostly, I would like help with design decisions, code review, etc. It's
such a big project, it seems almost overwhelming to contemplate doing
singlehandedly, but it's probably doable if there are a few people involved
who can bounce ideas off one another, provide moral support, etc. So, if
you are interested, feel free to email back or create an account at
Meta-Inclu, and we can begin discussing the details of implementation.
http://meta.inclumedia.org/

If there were to be insufficient volunteer support for implementing this
wiki, then the next step might be to try to get funding to pay developers.
There's no guarantee that such funding would be obtainable, though, or that
it wouldn't come with significant strings attached, that would conflict
with the basic principles and vision of the site, or lead to a lot of (what
I might consider) undesirable technical decisions being made. But we do
what we have to do to make what we are passionate about a reality, to the
exten

Re: [Wikitech-l] Captcha filter list

2014-01-04 Thread Nathan Larson
On Wed, Jan 1, 2014 at 2:31 AM, Benjamin Lees  wrote:

> On Wed, Jan 1, 2014 at 2:00 AM, Tim Landscheidt wrote:
> I checked out the registration form for a white supremacist forum, and they
> just use reCAPTCHA.  No doubt they'll be developing a CAPTJCA or CAPTMCA
> soon enough.


Conversely, one could use an empathy CAPTCHA to try to screen out that
type of crowd (although it might merely screen out the politically
incorrect). http://www.wired.com/threatlevel/2012/10/empathy-captcha/
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Captcha filter list

2014-01-01 Thread Nathan Larson
On Wed, Jan 1, 2014 at 2:31 AM, Benjamin Lees  wrote:

> I checked out the registration form for a white supremacist forum, and they
> just use reCAPTCHA.  No doubt they'll be developing a CAPTJCA or CAPTMCA
> soon enough.
>
> There are likely a number of strings that should be added to the default
> blacklist.  Patches welcome!
>

Yes, sites that want to screen out users with certain political, religious,
etc. sensibilities could, with some small tweaks to the code, create a
ConfirmEdit whitelist that causes *all* CAPTCHAs to contain words
calculated to offend those groups. Actually, with QuestyCaptcha, this is
already possible. One could include in $wgCaptchaQuestions answers that
require the user to type in strings denigrating certain groups or deities;
swearing allegiance to certain causes or entities; or stating that one has
certain stigmatized attractions, preferences or desires or engages in
certain stigmatized behaviors. One could also use it to screen for a
certain level of intelligence, knowledge, etc. in a given field by asking
questions only certain people would be able to answer. The possibilities
are endless once one's main priority becomes to repel or screen out, rather
than attract, most potential users.
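
The QuestyCaptcha setup described above could look something like the
following LocalSettings.php fragment (2013-era ConfirmEdit loading style;
the question/answer pair is invented purely for illustration):

```php
<?php
// Hypothetical LocalSettings.php fragment: enable ConfirmEdit's
// QuestyCaptcha module and define one admin-written question.
require_once "$IP/extensions/ConfirmEdit/ConfirmEdit.php";
require_once "$IP/extensions/ConfirmEdit/QuestyCaptcha.php";
$wgCaptchaClass = 'QuestyCaptcha';

// Each entry is a question shown at registration/edit time and the exact
// answer the user must type. The admin controls both, which is what makes
// the screening scenarios described above possible.
$wgCaptchaQuestions[] = array(
	'question' => 'What is the name of this wiki?',
	'answer'   => 'Examplewiki',
);
```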
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] FW: Mass file undeletion tools?

2013-12-30 Thread Nathan Larson
On Mon, Dec 30, 2013 at 9:48 AM, Tuszynski, Jaroslaw W. <
jaroslaw.w.tuszyn...@leidos.com> wrote:

> Lately someone deleted 70 pages from the middle of a several-thousand-page
> Russian encyclopedia used by Wikisource. Everything was done by the
> book, since the uploader forgot to add a license and did not fix the
> issue when notified. However, the deletion was done without checking the
> licenses of the other pages from the book. See
> https://commons.wikimedia.org/wiki/User_talk:Lozman#Copyright_status:_File:.D0.A0.D1.83.D1.81.D1.81.D0.BA.D0.B8.D0.B9_.D1.8D.D0.BD.D1.86.D0.B8.D0.BA.D0.BB.D0.BE.D0.BF.D0.B5.D0.B4.D0.B8.D1.87.D0.B5.D1.81.D0.BA.D0.B8.D0.B9_.D1.81.D0.BB.D0.BE.D0.B2.D0.B0.D1.80.D1.8C_.D0.91.D0.B5.D1.80.D0.B5.D0.B7.D0.B8.D0.BD.D0.B0_4.2_001.jpg


https://www.mediawiki.org/wiki/Extension:UndeleteBatch should work.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mailing list etiquette and trolling

2013-12-11 Thread Nathan Larson
On Thu, Dec 12, 2013 at 12:03 AM, Rob Lanphier  wrote:

> Or perhaps he merely suggested something that you disagreed with (or didn't
> understand), without "losing [his] mind" or being a "troll"?
>
> I'm a little skeptical about Jeroen's GitHub suggestion, but it seems like
> something reasonable people can disagree about.  Could we not accuse Jeroen
> or anyone else of being a troll or losing his/her mind for floating an
> honest proposal?


From my experience on the Internet, I suspect that most of the times when
people are accused of trolling, they are actually serious; and most of the
times when people are trolling, they're taken seriously and no one
expresses any suspicion that they were trolling.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] FWD: [Bug 58236] New: No longer allow gadgets to be turned on by default for all users on Wikimedia sites

2013-12-11 Thread Nathan Larson
On Wed, Dec 11, 2013 at 3:30 PM, Tyler Romeo  wrote:

> In this case we should promptly work to fix this issue. To be honest, the
> only difficult part of our code review process is having to learn Git if
> you do not already know how to use it. If there were a way to submit
> patchsets without using Git somehow (maybe some kind of desktop
> application), it may make things easier.
>

I agree, it would make life a lot easier! Something like TortoiseSVN, but
for Git, then?
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] FWD: [Bug 58236] New: No longer allow gadgets to be turned on by default for all users on Wikimedia sites

2013-12-11 Thread Nathan Larson
On Wed, Dec 11, 2013 at 2:38 PM, Tyler Romeo  wrote:

> I can definitely understand the reasoning behind this. Right now with both
> Gadgets and common.js we are allowing non-reviewed code to be injected
> directly into every page. While there is a bit of trust to be had
> considering only administrators can edit those pages, it is still a
> security risk, and an unnecessary one at that.
>
> I like the idea of having gadgets (and any JS code for that matter) going
> through Gerrit for code review. The one issue is the question of where
> would Gadget code go? Would each gadget have its own code repository? Maybe
> we'd have just one repository for all gadgets as well as common.js
> (something like operations/common.js)? I don't think sending wiki edits to
> Gerrit is too feasible a solution, so if this were implemented it'd have to
> be entirely Gerrit-based.
>

Could FlaggedRevs, perhaps with some modifications, be used to implement a
review process?
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Please use sitenotice when a new version of software is deployed

2013-12-05 Thread Nathan Larson
On Thu, Dec 5, 2013 at 5:00 PM, Dan Garry  wrote:

> How about getting this stuff included in the Signpost? I think that's a
> good medium for it.
>
> There used to be a "Technology report" but I've not seen it for a while...
>
> Dan
>

BRION, they called it.
https://en.wikipedia.org/wiki/Wikipedia:Bugs,_Repairs,_and_Internal_Operational_News
I think they should bring it back.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] can i mod the [~~~~~]magic word's data style?

2013-11-29 Thread Nathan Larson
On Sat, Nov 30, 2013 at 1:12 AM, Sen Slybe  wrote:

> This's my first time talk at a open source  mail list,I got say this is
> very feel great,I personally use mediawiki to record what I seeing,what I
> think,I really like it,maybe first it's kind hard to learn like about the
> wired wiki text language,special the table syntax ,I still this it's
> suck,to many [|] in it,but after that,I just can't help myself to love
> it,it have a lot expend way to use,I can just add a js gadget (which I had
> add a lot),and if I WANA go far,I can make a PHP extension,it's very easy
> to expend,just never like other web program I have use.
> Sure,the mediawiki may look like past time(old ui like a text editor),not
> fashion,but for me,it's good enough to back the simple easy way.
> But the most magic I just have seen by now,the guy who make mw,are bunch
> stronger guy,but just make it work ,and just better and better:)
>

Oh good, another testimonial.
Yes, MediaWiki is a beautiful and wonderful product; some people criticize
the codebase (particularly the UI; I have to admit, it's not all that
intuitive, but those who stick around typically learn to, or have learned
to, love it) but I personally find it to be much better-documented (and
more fun to work with and contribute to) than some of the software I had to
help develop in the corporate world. Of course, any endeavor in which you
can choose to pursue your interests rather than being tied down to what the
employer demands tends to be more fun. Welcome to the 20%, if that's the
direction you choose to go in!

The only major downer is the code review backlog, but you can often get
around that by writing as much of your code as possible as extensions and
choosing the direct-push instead of code review option for the repository.
I guess in a way, I'm probably contributing to the problem by not doing
many code reviews, but I haven't contributed enough code to show up very
often in git blame, so people don't add me as a reviewer very often. When
they do add me (seemingly randomly), I often think "This is a complicated
change, I know nothing about this part of the codebase, and it would take
hours/days to ramp up my level of knowledge to the point where I'd be able
to make an informed comment or decision" so I remove myself.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] can i mod the [~~~~~]magic word's data style?

2013-11-29 Thread Nathan Larson
On Sat, Nov 30, 2013 at 1:09 AM, Brian Wolff  wrote:

> >
> > There's two internationalization changes in play here. First of all,
> > month names get translated. This is something that you cannot change
> > without modifying core (as far as I know).
> >
>
> Actually what am I talking about. The obvious solution to this is to just
> create the page MediaWiki:January on your wiki (and the other months),
> editing the month names to whatever you want.
>
> MediaWiki will use whatever is on the page MediaWiki:January as the word
> for January, which allows you to modify the output without editing
> mediawiki source code.


I was going to do it that way but he said that he only wanted the signature
timestamps to be in English and the rest of the site in Chinese, and I was
concerned that maybe MediaWiki:January might be used for other purposes as
well.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] can i mod the [~~~~~]magic word's data style?

2013-11-29 Thread Nathan Larson
On Fri, Nov 29, 2013 at 11:30 PM, Brian Wolff  wrote:

> On 11/29/13, Sen  wrote:
> > i know it's depend the mediawiki language,but can i change it by my own?i
> > use chinese mediawiki,but i wana the mediawiki signature time format use
> > english.
> >
> > sorry again,if this question should not ask in this maillist,plz let me
> > know i should send which maillist...i very know to the maillist,and my
> > english is pretty bad too.
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> Hi.
>
> This question is probably better suited to the mediawiki-l mailing
> list then this one, but in any case here is the answer:
>
>
> There's two internationalization changes in play here. First of all,
> month names get translated. This is something that you cannot change
> without modifying core (as far as I know).
>
> Second the date format changes (for example, where in the date the
> year is put). This one you can control. Just add to the bottom of
> LocalSettings.php:
>
> $wgDefaultUserOptions['date'] = 'mdy';
>
> and you will get something like "01:22, 11月 30, 2013 (UTC)" instead of
> "2013年11月30日 (六) 01:26 (UTC) ". As noted the month names are still
> shown in Chinese.
>
> If you do something like:
>
> $wgDefaultUserOptions['date'] = 'ISO 8601';
>
> You will get dates that are all numeric like 2013-11-30T01:28:42
> (UTC). Other options for the date format preference include dmy, ymd.
> By default it has the value "default" which is equivalent to the value
> "zh" in the chinese internationalization.
>
> --Bawolff
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>

To implement what he is asking for, I hacked Parser::pstPass2() in
includes/parser/Parser.php as follows:

Change:
 $key = 'timezone-' . strtolower( trim( $tzMsg ) );
to:
 $key = 'signature-timezone-' . strtolower( trim( $tzMsg ) );

Change:
 $d = $wgContLang->timeanddate( $ts, false, false ) . " ($tzMsg)";
to:
 $en = Language::factory ( 'en' );
 $d = $en->timeanddate( $ts, false, false ) . " ($tzMsg)";

Next, write the following PHP script and run it:

https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Can we see mw 1.22 today?

2013-11-29 Thread Nathan Larson
On Fri, Nov 29, 2013 at 10:53 PM, Sen  wrote:

> i wana to,but i dont know any easy way to switch the version,the upgrade
> for me is use the patch,but i mod some system file,so if everytime i need
> to change them,that's seems pretty crazy..
>

If you're finding you need to hack MediaWiki because it's not extensible or
configurable enough, feel free to submit a bug report asking for the new
hooks or configuration settings that should be added. Who knows, it might
benefit someone else too. Then you can submit your modifications as
extension code, and others can collaborate on it with you.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Can we see mw 1.22 today?

2013-11-29 Thread Nathan Larson
On Fri, Nov 29, 2013 at 9:43 PM, Sen Slybe  wrote:

> It's very excited to upgrade for the new version,so I can final have a try
> for the visual editor...
>
> Is here not suppose ask this kind question ,I sorry if that's so.I just
> really like mediawikii and WANA thx for you hard work
>
> Regrades
> Sen
>

Why settle for 1.22 when you can have 1.23alpha? The more people are
successfully encouraged to use it, the more people will encounter and
possibly report any bugs in it, benefiting almost everyone.
https://www.mediawiki.org/wiki/Download_from_git#Download
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Applying nofollow only to external links added in revisions that are still unpatrolled

2013-11-18 Thread Nathan Larson
On Mon, Nov 18, 2013 at 12:58 PM, Gabriel Wicke wrote:

> On 11/18/2013 09:46 AM, Nathan Larson wrote:
> > I do think the implications of changing how nofollow is applied are very
> > different on, say, Wikipedia than they would be on a small or even
> > medium-sized wiki
>
> As I said, at least for Google there should be no difference as it
> ignores rel=nofollow on MediaWiki-powered sites anyway. See
> https://bugzilla.wikimedia.org/show_bug.cgi?id=52617.
>
> Gabriel
>

Do we have any way of knowing that Yong-Gang Wang of Google is correct
about this? I sent a message to this individual
<https://plus.google.com/105349418663822362024/about> (hopefully it's the
same guy) asking for more information. It seems like a pretty major
departure from past Google policy/practice.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Applying nofollow only to external links added in revisions that are still unpatrolled

2013-11-18 Thread Nathan Larson
To aggregate some of the arguments and counter-arguments, I posted
https://www.mediawiki.org/wiki/The_dofollow_FAQ and
https://www.mediawiki.org/wiki/Manual:Costs_and_benefits_of_using_nofollow.
It does seem, from my googling of what the owners of smaller wikis
have
to say about it, that nofollow is less popular outside of WMF with many of
those wiki owners who have taken the time to analyze the issue. On the
other hand, it could be that people who were happy with the default felt
less dissatisfied with MediaWiki devs' decision and therefore didn't feel
as much need to voice their opinions, since they had already gotten their
way and didn't have to take any measures to override the default.

I do think the implications of changing how nofollow is applied are very
different on, say, Wikipedia than they would be on a small or even
medium-sized wiki where the average user watches RecentChanges instead of a
watchlist. In a small town, you can leave your doors unlocked and get away
with it because you don't have as much traffic coming through and the
neighbors would notice and care about (for curiosity, if no other reason)
the presence of anyone who seemed out of place. It's the same way on these
small wikis; it's rare that anyone comes along to try to subtly add a spam
link, and when they do, it's noticed. Likewise, if someone starts marking
spammy edits as patrolled, that gets noticed.

Spambots are not yet able to be subtle, and getting accustomed to the norms
of a wiki and becoming fluent enough in the native language to fit in
requires skilled labor that is more expensive than that needed to simply
pass a CAPTCHA. So, I think that putting dofollow on
patrolled external links would be okay especially on smaller wikis, as the
patrol would stop the spambots from getting a pagerank boost and the labor
costs would deter the subtler ones. Even on Wikipedia, those fighting spam
can take advantage of the same economies of scale as those adding spam,
such as using pattern recognition on the entire wiki to catch people, or
blacklisting individual spammers and taking measures to keep them out (on
the smaller wikis, a person caught spamming can just go to another wiki,
but if you're caught spamming on Wikipedia, there isn't another site of
Wikipedia's size and scope you can go to.)

To say that patrolling wouldn't do enough to keep spam out is basically to
say, at least to some extent, that patrolling is not a very effective
system and that the wiki way doesn't work very well. If Google agrees, they
can stop giving wikis in general, or certain wikis, such influence over
pagerank. The spammers have market incentives to become more sophisticated,
but so does Google, since their earnings depend on keeping their search
results relevant and useful, so that people don't switch to competitors
that do a better job.

The question of what the default configuration should be, or what
configuration should be used on WMF sites, can be addressed in other bugs
besides this one. It doesn't take much coding to change a default setting
from "true" to "false". For now, I would just like to implement the feature
and make it available for those wikis who want to use it. So, is there
support for putting this in the core as an optional feature, and is there
anyone who will do the code review if I write this?
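
For concreteness, the configuration surface being discussed might look like
this. The first two settings exist in MediaWiki core; the third is a
hypothetical name for the toggle this thread proposes, which does not exist:

```php
<?php
// Real core settings (current all-or-nothing behavior plus its escape hatches):
$wgNoFollowLinks = true;                                 // rel=nofollow on all external links
$wgNoFollowDomainExceptions = array( 'mediawiki.org' );  // per-domain whitelist

// Hypothetical setting for the proposed feature (invented name, not in core):
// when true, nofollow would apply only to external links added in revisions
// that have not yet been marked patrolled.
$wgNoFollowUnpatrolledOnly = true;
```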
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Applying nofollow only to external links added in revisions that are still unpatrolled

2013-11-17 Thread Nathan Larson
Following the mediawiki-l discussion
<http://lists.wikimedia.org/pipermail/mediawiki-l/2013-November/042038.html>
about $wgNoFollowLinks and various other discussions, in which some
discontent was expressed with the current two options of either applying or
not applying nofollow to all external links, I wanted to see what support
there might be for applying nofollow only to external links added in
revisions that are still unpatrolled (bug 42599
<https://bugzilla.wikimedia.org/show_bug.cgi?id=42599>).

How common do you think it would be for a use case to arise in which one
could be confident that a revision's being patrolled means that the
external links added in that revision have been adequately reviewed for
spamminess? Nemo had mentioned "sysadmins would be interested in this only
if their wiki has a strict definition of what's patrollable which matches
the assumptions here." In my experience, spam is pretty easy to spot
because the bots aren't very subtle about it.

I would think that if someone went around marking such obviously spammy
edits as patrolled, that if there were any bureaucrats around who cared
about keeping spam off the wiki, his patrol rights would end up getting
taken away. Spam is a form of vandalism, so it would fall under the duties
of patrollers. At Wikipedia, RecentChanges patrollers are expected to be on
the lookout for spam.
https://en.wikipedia.org/wiki/Wikipedia:Recent_changes_patrol#Spam

-- 
Nathan Larson <https://mediawiki.org/wiki/User:Leucosticte>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] The interwiki table as a whitelist of non-spammy sites

2013-11-16 Thread Nathan Larson
On Sat, Nov 16, 2013 at 3:19 AM, Nathan Larson
wrote:

> Do we really need to set criteria for a wiki being a candidate? Are we
> trying to keep the number of approved interwiki links down? I had in mind
> just letting everyone have a prefix and a URL that we would distribute.
>

I meant to say, "approved interwiki prefixes". Anyway, this has been filed
as bug 57131 <https://bugzilla.wikimedia.org/show_bug.cgi?id=57131>.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] The interwiki table as a whitelist of non-spammy sites

2013-11-16 Thread Nathan Larson
On Thu, Nov 14, 2013 at 1:08 PM, Quim Gil  wrote:

> As mentioned by Mark and quoted in my email, http://wikiapiary.com/
> could be a good starting point.
>
> Just improvising a hypothetical starting point for a process to maintain
> the decentralized interwiki table:
>
> In order to become a candidate, a wiki must have the extension installed
> and a quantifiable score based on age, size, license, and lack of
> reports as spammer.
>
> The extension could perhaps check how much a wiki is linked by how many
> wikis of which characteristics, calculating a popularity index of
> sorts. Maybe you can even have a classification of topics filtering the
> langage and type of content that matters to your wiki. By default, only
> wikis above some popularity index would be included in your local
> interwiki table. The admins could fine tune locally.
>
> The master interwiki table could be hosted in Wikiapiary or wherever. It
> would be mirrored in some way by the wikis with the extension installed
> willing to do so.
>
> The maintenance of the table itself doesn't even look like a big deal,
> compared to developing the extension and adding new interwiki features.
> It would be based on the userbase of wiki installing the extension.
>
> Whether Wikimedia projects join the interwiki party or not, that would
> depend on the extension being ready for Wikimedia adoption and a
> decision to deploy it. But that would be a Wikimedia discussion, not an
> Interwiki project discussion.
>
> As said, all of the above is improvised and hypothetical. Sorry in
> advance for any planning flaws.  :)
>

Do we really need to set criteria for a wiki being a candidate? Are we
trying to keep the number of approved interwiki links down? I had in mind
just letting everyone have a prefix and a URL that we would distribute.

Some might get the more preferable prefixes, though; e.g. we would have
some sort of tiebreaker if, say, there were two wikis wanting to call
themselves Foowiki and get the Foowiki: prefix. I guess we should continue
this discussion on WikiApiary. I want to try to get someone (Jamie
Thingelstad?) with authority to install extensions to give the go-ahead on
the specs for the extension; once they're approved, there will be no reason
not to start writing it. I was originally planning to work with WikiIndex,
but WikiApiary seems like a better idea. By the way, I like how they note
that they're monitoring over 9,300 wikis.

It might help that WikiApiary runs SMW. Maybe the extension could be
written to interact with it in some way that would be more elegant than
using tag or parser functions to insert metadata into the database à la the
{{setbpdprop: }} of Extension:BedellPenDragon.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikipedia Issue

2013-11-14 Thread Nathan Larson
On Thu, Nov 14, 2013 at 2:05 PM, Derric Atzrott <
datzr...@alizeepathology.com> wrote:

> The issue seems to be confined to Wikipedia.  I am not seeing it affect
> Mediawiki.org
>
> Additionally the issue seems to be affecting more than just the English
> language Wikipedia.  The lojban language Wikipedia is also experiencing
> difficulties.
>
> Thank you,
> Derric Atzrott


I'm getting it at MediaWiki.org intermittently too. Just lost an edit, in
fact (my cache didn't save it for some reason; maybe it expired).
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] The interwiki table as a whitelist of non-spammy sites

2013-11-14 Thread Nathan Larson
On Thu, Nov 14, 2013 at 12:18 PM, Quim Gil  wrote:

> With my hat of third party wiki admin I personally agree with all this.
>
> Another possibility further down in the roadmap:
>
> Imagine a World in which you could transclude in your wiki content from
> a subset of this interwiki table of wikis, based on license
> compatibility and whatever other filters. To avoid performance problems,
> the check with the sources could be done by a cron periodically, etc.
>
> The most interesting part of this proposal is to start an interwiki
> table of friendly and compatible wikis willing to ease the task of
> linking and sharing content among them. The attributes of the table and
> a decentralized system to curate and maintain the data could open many
> possibilities of collaboration between MediaWiki sites.
>

Is there reason to think that a decentralized system would be likely to
evolve, or that it would be optimal? It seems to me that most stuff in the
wikisphere is centered around WMF; e.g. people usually borrow templates,
the spam blacklist, MediaWiki extensions, and so on, from WMF sites. Most
wikis that attempted to duplicate what WMF does have failed to catch on;
e.g. no encyclopedia that tried to copy Wikipedia's approach (e.g.
allegedly neutral point of view and a serious, rather than humorous, style
of writing) came close to Wikipedia's size and popularity, and no wiki
software caught on as much as MediaWiki. It's just usually more efficient
to have a centralized repository and widely-applied standards so that
people aren't duplicating their labor too much.

But if one were to pursue centralization of interwiki data, what would be
the central repository? Would WMF be likely to be interested? Hardly
anything at https://meta.wikimedia.org/wiki/Proposals_for_new_projects has
been approved for creation, so I'm not sure how one would go about getting
something like this established through WMF.

Some advantages of WMF are that we can be pretty confident its projects
will be around for a while, and none of them are clogged up with the kind of
advertising we see at, say, Wikia. Non-WMF wikis come and go all the time;
one never knows when the owner will get hit by a bus, lose interest, etc.
and then the users are left high and dry. That could be a problem if the
wiki in question is a central repository that thousands of wikis have come
to rely upon.

Perhaps the MediaWiki Foundation could spearhead this? Aside from its
nonexistence, I think that organization could be a pretty good venue for
getting this done. I'll have to bring this up with some of my imaginary
friends who sit on the MWF board of trustees.

[Wikitech-l] The interwiki table as a whitelist of non-spammy sites

2013-11-13 Thread Nathan Larson
TL;DR: How can we collaboratively put together a list of non-spammy sites
that wikis may want to add to their interwiki tables for whitelisting
purposes; and how can we arrange for the list to be efficiently distributed
and imported?

Nemo bis points out that the Interwiki extension
<https://www.mediawiki.org/wiki/Extension:Interwiki> is "the easiest way
to manage whitelisting", since nofollow isn't applied to interwiki links.
Should we then encourage wikis to make more use of interwiki links?
Usually, MediaWiki installations are configured so that
only sysops can add, remove, or modify interwiki prefixes and URLs. If a
user wants to link to another wiki, but it's not on the list, often he will
just use an external link rather than asking a sysop to add the prefix
(since it's a hassle for both parties and often people don't want to bother
the sysops too much in case they might need their help with something else
later). This defeats much of the point of having the interwiki table
available as a potential whitelist, unless the sysops are pretty on top of
their game when it comes to figuring out what new prefixes should be added.
In most cases, they probably aren't; the experience of Nupedia shows that
elitist, top-down systems tend not to work as well as egalitarian,
bottom-up systems.

Currently, interwiki.sql
<https://git.wikimedia.org/blob/mediawiki%2Fcore//maintenance%2Finterwiki.sql>
has 100 wikis, and there doesn't seem to be much rhyme or reason to which
ones are included (Seattlewiki, for example?). I wrote InterwikiMap
<https://www.mediawiki.org/wiki/Extension:InterwikiMap>,
which dumps the contents of the wiki's interwiki table into a backup page
and substitutes in its place the interwiki table of some other wiki (e.g. I
usually use Wikimedia's), with such modifications as the sysops see fit to
make. The extension lets sysops add, remove and modify interwiki prefixes
and URLs in bulk rather than one by one through Special:Interwiki, which is
a pretty tedious endeavor. Unfortunately, as written it is not a very
scalable solution: it cannot accommodate many thousands of wiki prefixes
before the backup wikitables it generates exceed the capacity of wiki
pages, or it breaks for other reasons.

I was thinking of developing a tool that WikiIndex (or some other wiki
about wikis) could use to manage its own interwiki table via edits to
pages. Users would add interwiki prefixes by setting a parameter in a
template, which would in turn invoke a parser function that, when the page
is saved, writes the prefix to the interwiki table. InterwikiMap could be
modified to do incremental updates, polling the API to find out what
changes have recently been made to the interwiki table, rather than
fetching the whole table each time. It would then be possible for
WikiIndex (or whatever other site were to be used) to be the wikisphere's
central repository of canonical interwiki prefixes. See
http://wikiindex.org/index.php?title=User_talk%3AMarkDilley&diff=172654&oldid=172575#Canonical_interwiki_prefixes.2C_II
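As a rough sketch of the template-plus-parser-function idea (everything
here is hypothetical: the function name is invented, and how the stashed
prefix would later be flushed into the interwiki table is left to a
separate hook or job):

```php
<?php
// Hypothetical sketch: a {{#registerinterwiki:prefix|url}} parser function
// that a template could call. It stashes the candidate prefix as a page
// property on parse; a separate process would then sync page properties
// into the interwiki table, rather than writing to the DB mid-parse.
$wgHooks['ParserFirstCallInit'][] = function ( Parser $parser ) {
	$parser->setFunctionHook( 'registerinterwiki',
		function ( Parser $parser, $prefix = '', $url = '' ) {
			// Record the pair; the actual DB write is deferred.
			$parser->getOutput()->setProperty(
				'interwiki-candidate', "$prefix|$url" );
			return ''; // renders nothing in the page itself
		}
	);
	return true;
};
```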

But there's been some question as to whether there would be much demand for
a 200,000-prefix interwiki table, or whether it would be desirable. It
could also provide an incentive for spammers to try to add their sites to
WikiIndex. See
https://www.mediawiki.org/wiki/Talk:Canonical_interwiki_prefixes

It's hard to get stuff added to Meta-Wiki's interwiki map
<https://meta.wikimedia.org/wiki/Interwiki_map> because one of the
criteria is that the prefix has to be one that would be used a lot on
Wikimedia sites. How can we put together a list of non-spammy
sites that wikis would be likely to want to have as prefixes for nofollow
whitelisting purposes, and distribute that list efficiently? I notice that
people are more likely to put together lists of spammy than non-spammy
sites; see e.g. Freakipedia's
list <http://freakipedia.net/index.php5?title=Spam_Site_List>.
(Hmm, I think I'll pimp my websites to that wiki when I get a chance; the
fact that the spam isn't just removed but put on permanent record in a
public denunciation means it's a potential opportunity to gain exposure for
my content. They say there's no such thing as bad publicity. ;) )

-- 
Nathan Larson <https://mediawiki.org/wiki/User:Leucosticte>
Distribution of my contributions to this email is hereby authorized
pursuant to the CC0 license
<http://creativecommons.org/publicdomain/zero/1.0/>.

Re: [Wikitech-l] New Bugzilla users have restricted accounts

2013-11-09 Thread Nathan Larson
Why not make assigning bugs work sort of like FlaggedRevs -- as a new user,
you can make changes, but the changes won't take effect until they're
reviewed and approved? After a bureaucrat sees that you're behaving
responsibly, or after you've made a certain number of changes that have
been approved, then your changes are automatically approved.

With that system, you don't have to make a request of any particular
person; it's whoever is the first to see that there are changes that need
to be reviewed.

Re: [Wikitech-l] Architectural leadership in Wikimedia's technical community

2013-11-05 Thread Nathan Larson
On Tue, Nov 5, 2013 at 9:10 PM, Yuvi Panda  wrote:

> (snip)
> I actually like the formalism a bit - since it at least makes sure
> that they don't rot. BDFLs are good.
>
>
Does it keep them from rotting? It looks like, of the 60 RFCs in draft or
under discussion, 24 were last updated before 2013.
https://www.mediawiki.org/wiki/User:Leucosticte/RFCs_sorted_by_%22updated%22_date


Re: [Wikitech-l] Page title length

2013-10-24 Thread Nathan Larson
>
> Changing the maximum length of a title will require a huge number of
> changes and is not easy to make configurable, so we should do this only
> once, changing the maximum title length to the maximum we can
> feasibly make it.
>


> ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
>

Reposted to https://www.mediawiki.org/wiki/Page_title_size_limitations .


Re: [Wikitech-l] Page title length

2013-10-24 Thread Nathan Larson
On Thu, Oct 24, 2013 at 1:01 PM, Élie Roux wrote:

> Dear MediaWiki developers,
>
> I'm responsible for the development of a new Wiki that will contain many
> Tibetan resources. Traditionally, Tibetan titles of books or even parts of
> books are extremely long, as you can see for instance here:
> http://www.tbrc.org/#!rid=W23922 ,
> and sometimes too long for MediaWiki, for instance the title of
>
> http://www.tbrc.org/?locale=bo#library_work_ViewByOutline-O01DG1049094951|W23922
>
> , which is
>
> ཤངས་པ་བཀའ་བརྒྱུད་ཀྱི་གཞུང་བཀའ་ཕྱི་མ་རྣམས་ཕྱོགས་གཅིག་ཏུ་བསྒྲིལ་བའི་ཕྱག་ལེན་བདེ་ཆེན་སྙེ་མའི་ཆུན་པོ་
>
> . This title is around 90 Tibetan characters, but each character being 3
> bytes, it exceeds the limit for title length of 256 bytes that MediaWiki
> has.
>
> So I have two questions:
>
> 1. If I change this limit to 1023 in the structure of the database
> ('page_title' field of the 'page' table), will other things (such as the
> search engine) break? Is there a way to change it more cleanly?
>
> 2. Could I propose that you make this limit 1023 instead of 255 (or make
> it easily configurable)? This would allow at least 256 characters (even for
> Asian languages) instead of 256 bytes, which seems more consistent with the
> fact that MediaWiki is well internationalized.
>
> Thank you in advance,
> --
> Elie
>

Wouldn't ar_title, log_title, rc_title, and so on also need to have their
sizes increased? I didn't see any bug report proposing an increase in page
title size, although there was this one about comment size.
https://bugzilla.wikimedia.org/show_bug.cgi?id=4715
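
For illustration, a longer limit would mean widening every title column in
tandem (a MySQL sketch; the 1023 figure is from Élie's proposal, and the
column list below is deliberately incomplete):

```sql
-- Sketch only: widening title columns from VARBINARY(255) to VARBINARY(1023).
-- Every table that stores a title would need the same change, and the
-- hardcoded 255-byte length check in Title.php would need updating too.
ALTER TABLE page          MODIFY page_title VARBINARY(1023) NOT NULL;
ALTER TABLE archive       MODIFY ar_title   VARBINARY(1023) NOT NULL;
ALTER TABLE logging       MODIFY log_title  VARBINARY(1023) NOT NULL;
ALTER TABLE recentchanges MODIFY rc_title   VARBINARY(1023) NOT NULL;
-- ...plus redirect, watchlist, pagelinks, templatelinks, imagelinks, etc.
```
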

[Wikitech-l] Deletion schema proposal: Bug 55398

2013-10-18 Thread Nathan Larson
In one of the earlier discussions about revising the deletion schema,
Happy-Melon pointed out, "Right now everyone who has ideas for *an*
implementation isn't working on it because they don't know if it's *the*
implementation we want." It seems to me that if one proposal is clearly
superior, we may as well pick that one, so someone can get to work on it;
but if the proposals are about equally good, then we may as well pick one
at random, or whatever proposal is slightly better than the others, so
someone can get to work on that one.
http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/48109

The RFC over at MediaWiki.org hasn't received a lot of comment. Is there
any objection to my implementing the "new field" option described at
https://www.mediawiki.org/wiki/Requests_for_comment/Page_deletion ?

I.e., I would add a new field, page.pg_deleted (analogous to
revision.rv_deleted). Also, I would add revision.rv_logid to store the
log_id of the deletion event. Upon restoring a page, only the revisions
pertaining to the selected deletion event(s) would be restored.  So,
suppose you revision delete some revisions from a page for copyvio reasons.
Those revisions are now updated as rv_deleted=[whatever value we want to
use to signify the kind of access restrictions imposed by the deletion],
and revision.rv_logid=1, assuming that is the first log action ever
performed on that wiki.

Later, you delete the whole page (update rv_logid=2 for the remainder of
that page's revisions) for notability reasons. Then you decide you were
mistaken about notability, but the copyvios would still be a problem. You
undelete the page, selecting only that second deletion action. You only
restore the revisions that have rv_logid=2, and leave the rv_logid=1
revisions deleted.  This will render the archive table obsolete, so it can
be deprecated/eliminated.
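
In schema terms, the proposal amounts to something like the following (a
sketch only; the field names follow the RFC's naming rather than anything
in shipped core, and the page ID 123 is a placeholder):

```sql
-- Sketch of the proposed new fields (naming per the RFC, not shipped core).
ALTER TABLE page     ADD pg_deleted TINYINT UNSIGNED NOT NULL DEFAULT 0;
ALTER TABLE revision ADD rv_logid   INT UNSIGNED NOT NULL DEFAULT 0;

-- Undeleting only the second deletion event for page 123 would then
-- restore just the revisions tagged with that log entry, leaving the
-- rv_logid=1 (copyvio) revisions deleted:
UPDATE revision
   SET rv_logid = 0
 WHERE rev_page = 123 AND rv_logid = 2;
UPDATE page SET pg_deleted = 0 WHERE page_id = 123;
```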

I would like to code this for v1.23.
https://bugzilla.wikimedia.org/show_bug.cgi?id=55398 Thanks.

-- 
Nathan Larson
https://mediawiki.org/wiki/User:Leucosticte

Re: [Wikitech-l] Priorities in Bugzilla

2012-11-28 Thread Nathan Larson
On Wed, Nov 28, 2012 at 12:56 PM, bawolff  wrote:

> What I would really like to see is banning users from touching the
> priority field. The field is made rather useless by bug reporters who
> feel their pet issue is the most important thing to ever happen - and
> suddenly there's a whole lot of issues at highest. Of course I would
> still like to see triaging people setting the field - just not the
> randoms who think by marking their bug highest priority it will
> actually be treated like highest priority.
>

Although Bugzilla isn't a wiki, it might still be helpful to follow the
wiki principle of letting it remain easy to fix people's bad changes (e.g.
setting the wrong priority) rather than tying their hands from making those
changes. Could we instead draw users' attention to the page listing what
all the different priorities mean, e.g. by having Bugzilla ask "Are you
sure this meets the criteria for a highest priority bug?" and then linking
to http://www.mediawiki.org/wiki/Bugzilla/Fields#Priority ?

-- 
Nathan Larson
703-399-9376
http://mediawiki.org/wiki/User:Leucosticte


Re: [Wikitech-l] About RESOLVED LATER

2012-11-13 Thread Nathan Larson
On Tue, Nov 13, 2012 at 2:01 PM, Mark A. Hershberger wrote:

> On 11/13/2012 11:54 AM, Martijn Hoekstra wrote:
> >> Once no RESOLVED LATER tickets remain, I can remove the resolution.
> >>
> >> Silence means approval.
> >>
> >
> > No it doesn't.
>
> The above exchange really confuses me.
>
> I'm not sure what else silence could mean when someone explicitly tells
> you (as Andre has) "If no one voices any objections, I'm going to do
> this."  Could you clarify?
>
> And, if you think silence doesn't mean approval, wouldn't you be better
> served by speaking up and voicing your concerns? It certainly seems like
> you'd be better served by clearly stating your objection than by a terse
> reply such as you've given.
>
> Mark.
>
>
Perhaps it should've been "silence means acquiescence"?


Re: [Wikitech-l] About RESOLVED LATER

2012-11-05 Thread Nathan Larson
On Mon, Nov 5, 2012 at 5:25 PM, Quim Gil  wrote:

> What about removing the LATER resolution from our Bugzilla? It feels like
> sweeping reports under the carpet. If a team is convinced that something
> won't be addressed any time soon then they can WONTFIX. If anybody feels
> differently then they can take the report back and fix it.
>
> Before we could simply reopen the 311 reports filed under RESOLVED LATER:
>

Should we create a new status, e.g. BLOCKED or LATER (rather than RESOLVED
LATER) for these?


Re: [Wikitech-l] Best policies for WikiFarms

2012-10-16 Thread Nathan Larson
>
> I doubt WMF uses wgCacheDirectory at all. Its default is false and WMF
> has Memcached. It doesn't even affect WikiFarms since it's not set by
> default. You have to explicitly set it yourself.
>

It's absent from InitialiseSettings.php, but in CommonSettings.php it's set
as $wgCacheDirectory = '/tmp/mw-cache-' . $wmfVersionNumber;

* http://noc.wikimedia.org/conf/highlight.php?file=CommonSettings.php
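
For a wiki farm the upshot is that each wiki, or at least each deployed
code version, wants its own cache path; a minimal sketch of the kind of
per-wiki setting involved (the path is illustrative):

```php
<?php
// Hypothetical farm config: derive the localisation-cache directory from
// the wiki's database name so that wikis sharing one codebase don't
// clobber each other's cache files.
$wgCacheDirectory = "/tmp/mw-cache-{$wgDBname}";
```
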