Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-17 Thread This, that and the other

Tim Starling  wrote in message news:lba9ld$8pj$1...@ger.gmane.org...


I think the interwiki map should be retired. I think broken links
should be removed from it, and no new wikis should be added.

Interwiki prefixes, local namespaces and article titles containing a
plain colon intractably conflict. Every time you add a new interwiki
prefix, main namespace articles which had that prefix in their title
become inaccessible and need to be recovered with a maintenance script.

There is a very good, standardised system for linking to arbitrary
remote wikis -- URLs. URLs have the advantage of not sharing a
namespace with local article titles.

Even the introduction of new WMF-to-WMF interwiki prefixes has caused
the breakage of large numbers of article titles. I can see that is
convenient, but I think it should be replaced even in that use case.
UI convenience, link styling and rel=nofollow can be dealt with in
other ways.

-- Tim Starling


The one main advantage of interwiki mapping is the convenience you mention.  Interwiki 
prefixes save a great amount of unnecessary typing and remembering of URLs.  Whenever we go 
to any WMF wiki, we can simply type [[gerrit:12345]] and know that the link will 
point where we want it to.


Some possible alternatives to our current system would include:
* to make people manually type out URLs everywhere (silly)
* to use cross-wiki linking templates instead of interwikis.  This has its own set 
of problems: cross-wiki transclusion is another area in sore need of attention (see 
bug 4547); we need to decide which wikis get their own linking templates; how do we 
deal with collisions between local and global (cross-wiki) templates?  etc.  To me, 
it doesn't seem worth the effort.
* to introduce a new syntax for interwiki links that does not collide with internal 
links (too ambitious?)


I personally favour keeping interwikis as we know them, as collisions are very rare, 
and none of the alternatives seem viable or practical.  Maybe the advent of 
interactive editing systems like VisualEditor and Flow will make them obsolete, but 
until then, editors need the convenience and flexibility that they offer when 
writing wikitext.


It seems as though your proposal, Tim, relates to the WMF cluster.  I'd be 
interested to know what your thoughts are with relation to the interwiki table in 
external MediaWiki installations.


TTO
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l 





[Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Petr Bena
Hi,

There are many articles on Wikipedia that contain different units.
Some use cm, which is common in Europe; others use inches, which are
more widely used in the US; the same goes for other unit types.

I think it would be cool if an extension were created which would allow
everyone to specify what units they prefer, and the values in articles
would be converted automatically based on that preference.

For example, you would say the object has a width of {{unit|cm=20}}, and
people who prefer cm would see "20 cm" in the article text, but people
who prefer inches would see "7.87 inch". This could even be based on
geolocation for IP users.

What do you think?
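As a rough illustration, the proposed conversion could look something like this (a minimal sketch; the `convert` function and the tiny unit table are invented for illustration, not part of any existing extension):

```javascript
// Minimal sketch of the proposed preference-based conversion.
// The unit table is hypothetical and far smaller than a real one would be.
const CM_PER = { cm: 1, inch: 2.54 };

function convert(value, from, to) {
  // Convert via centimetres as a common base unit.
  return value * CM_PER[from] / CM_PER[to];
}

// A reader preferring inches would see roughly 7.87 for {{unit|cm=20}}.
const shown = convert(20, 'cm', 'inch').toFixed(2);
```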


Re: [Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Petr Bena
there could even be some cool JavaScript-based toolbar that would
allow people to switch the unit, or see a unit hint when they roll over
the text

On Fri, Jan 17, 2014 at 9:58 AM, Petr Bena benap...@gmail.com wrote:
 .Hi,

 There are many articles on wikipedia that contain different units.
 Some use cm, that are common in europe, other use inches that are more
 widely used in US, same with other unit types.

 I think it would be cool if an extension was created which would allow
 everyone to specify what units they prefer, and the values in articles
 would be converted automatically based on preference.

 For example you would say the object has width of {{unit|cm=20}} and
 people who prefer cm would see 20 cm in article text, but people who
 prefer inches would see 7.87 inch. This could be even based on
 geolocation for IP users

 What do you think?


Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-17 Thread This, that and the other
Nathan is right that I am contradicting myself a bit.  It's true that if you don't 
look at the interwiki map, you'll never know what's there - you'll never know that 
WMF is stuffing the default map full of its own junk.  What I really meant to say is 
that external users will feel short-changed that we get to add our internal 
interwikis to the global map, yet they aren't allowed to add their internal wikis 
(equivalent to our strategy, outreach, etc.) to the global map, for whatever reason.


I'm not getting a coherent sense of a direction to take.  Do we split the existing 
interwiki map into a local and a global map (as I originally proposed)?  Do we start 
from scratch, rewriting the interwiki map from a blank slate, or do we start with 
what we've got?  Do we flood external MW users with a ton of new prefixes, or do we 
ship a mostly empty table to new MW installations?  Do we scale right back and limit 
ourselves to a small core of interwiki prefixes?  Do we take up Tim's idea and toss 
interwikis altogether? 





Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-17 Thread Bináris
2014/1/17 This, that and the other at.li...@live.com.au

 Nathan Larson  wrote in message news:CAF-JeUxsM-jQ85nij+OALA=
 rlolnppmhx7yhka1_hiz7m0a...@mail.gmail.com...

 Nice quoting. :-)



 I can't say I care about people reading through the interwiki list. It's
 just that with the one interwiki map, we are projecting our internal
 interwikis, like strategy:, foundation:, sulutil:, wmch: onto external
 MediaWiki installations.  No-one needs these prefixes except WMF wikis, and
 having these in the global map makes MediaWiki look too WMF-centric.


One central interwiki map with an extra flag? It could be branched for WMF
and general use, while being maintained together.

Re: [Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Gerard Meijssen
Hoi,
The best place for such an effort would be Wikidata. There are many units
that could do with some TLC. Particularly with the arrival of values in
Wikidata, it is expedient to do this.

One other area where attention would be relevant is dates. Did you know
that some calendars (that are still in use) have cycles of sixty years?
Thanks,
 Gerard


On 17 January 2014 09:59, Petr Bena benap...@gmail.com wrote:

 there could be even some cool javascript based toolbar that would
 allow people to switch the unit, or see unit hint when they roll over
 the text

 On Fri, Jan 17, 2014 at 9:58 AM, Petr Bena benap...@gmail.com wrote:
  .Hi,
 
  There are many articles on wikipedia that contain different units.
  Some use cm, that are common in europe, other use inches that are more
  widely used in US, same with other unit types.
 
  I think it would be cool if an extension was created which would allow
  everyone to specify what units they prefer, and the values in articles
  would be converted automatically based on preference.
 
  For example you would say the object has width of {{unit|cm=20}} and
  people who prefer cm would see 20 cm in article text, but people who
  prefer inches would see 7.87 inch. This could be even based on
  geolocation for IP users
 
  What do you think?


Re: [Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Antoine Musso
Le 17/01/14 09:58, Petr Bena a écrit :
 There are many articles on wikipedia that contain different units.
 Some use cm, that are common in europe, other use inches that are more
 widely used in US, same with other unit types.
 
 I think it would be cool if an extension was created which would allow
 everyone to specify what units they prefer, and the values in articles
 would be converted automatically based on preference.
 
 For example you would say the object has width of {{unit|cm=20}} and
 people who prefer cm would see 20 cm in article text, but people who
 prefer inches would see 7.87 inch. This could be even based on
 geolocation for IP users

Hello,

I like the idea.  One thing to take into account is that the unit
conversion should be done on the client side to avoid fragmentation of
the parser cache.

A possibility would be for the template to output BOTH metric and
imperial units, then use JS/CSS to hide the irrelevant one.  Hence
{{unit|cm=20}} would generate something like:

 <span class="mw-unitsystem-imperial">7.874 inches</span>
 <span class="mw-unitsystem-metric">20 cm</span>

Then a user preference can be used to hide one of the classes.
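A minimal sketch of how a gadget might do that hiding (class names follow the example above; `mw.util.addCSS` is the usual way a MediaWiki gadget injects CSS, but treat the wiring as an assumption):

```javascript
// Sketch: build the CSS rule that hides the unit system the user did
// NOT choose. Class names follow the example above.
function unitSystemCss(preferred) {
  const other = preferred === 'metric' ? 'imperial' : 'metric';
  return '.mw-unitsystem-' + other + ' { display: none; }';
}

// In a gadget this could be applied with something like:
//   mw.util.addCSS(unitSystemCss('metric'));
```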


-- 
Antoine hashar Musso



Re: [Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Petr Bena
Yes, but what I mean is the user interface for Wikipedia readers, so
that it's easier for them to understand what the value is. I think
this issue has two parts.

One is the implementation in the user interface for readers (probably
some JS widget that pops up when you roll over the unit, allowing you
to immediately convert it or change the default preferences) who are
reading articles on Wikipedia or any other WMF project.

The other part is the implementation for editors. It could either use
Wikidata as a backend, which is probably a good solution here, or just
some magic word like {{DYNAMICUNIT|unit_type=value}}, etc.

I think the best approach would be a combination of both Wikidata and
the magic word, because most Wikipedia editors aren't familiar with
Wikidata at all and may find it complicated to insert the values
directly there. If Wikidata were the only option, it would need to be
EXTREMELY simple for users to insert a value into an article, with
zero need for any knowledge of how Wikidata works; otherwise nobody is
going to use it.

On Fri, Jan 17, 2014 at 10:28 AM, Gerard Meijssen
gerard.meijs...@gmail.com wrote:
 Hoi,
 The best place for such an effort would be Wikidata.. There are many units
 that could do with some TLC. Particularly with the arrival of values in
 Wikidata it is expedient to do this.

 One other area where attention would be relevant are dates.. Did you know
 that some calendars (that are used) have cycles of sixty years ??
 Thanks,
  Gerard


 On 17 January 2014 09:59, Petr Bena benap...@gmail.com wrote:

 there could be even some cool javascript based toolbar that would
 allow people to switch the unit, or see unit hint when they roll over
 the text

 On Fri, Jan 17, 2014 at 9:58 AM, Petr Bena benap...@gmail.com wrote:
  .Hi,
 
  There are many articles on wikipedia that contain different units.
  Some use cm, that are common in europe, other use inches that are more
  widely used in US, same with other unit types.
 
  I think it would be cool if an extension was created which would allow
  everyone to specify what units they prefer, and the values in articles
  would be converted automatically based on preference.
 
  For example you would say the object has width of {{unit|cm=20}} and
  people who prefer cm would see 20 cm in article text, but people who
  prefer inches would see 7.87 inch. This could be even based on
  geolocation for IP users
 
  What do you think?


Re: [Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Petr Bena
Yes, this is precisely what I mean. It could be some JS gadget that
does this: just as the Google Translate gadget makes it easy to
translate a word by rolling over the text, this would let users see
the value in different units the same way. However, this gadget needs
to know what is supposed to be converted, so there is also some need
to alter the current articles as they are.

Maybe it would be possible for it to automagically recognize what is a
unit using some regex or something like it, but I am afraid it
wouldn't be very reliable if the unit and value weren't clearly
specified in the wikitext. There are some special values; for example,
recently I was reading about hard drives, and values like storage
density per square inch (for example Tib/sq. inch) and similar are
pretty exotic and hard to match with an automatic algorithm without
explicitly specifying what kind of unit it is.
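A sketch of what such regex-based detection might look like, and why it is fragile (the pattern below is invented and deliberately simple):

```javascript
// Deliberately simple sketch of regex-based unit detection.
// It already misses compound units like "Tib/sq. inch", which is
// exactly the reliability problem described above.
const SIMPLE_UNIT = /(\d+(?:\.\d+)?)\s*(cm|mm|km|m|inches|inch|ft|mi)\b/g;

function findUnits(text) {
  return Array.from(text.matchAll(SIMPLE_UNIT),
                    m => ({ value: parseFloat(m[1]), unit: m[2] }));
}
```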

On Fri, Jan 17, 2014 at 10:38 AM, Antoine Musso hashar+...@free.fr wrote:
 Le 17/01/14 09:58, Petr Bena a écrit :
 There are many articles on wikipedia that contain different units.
 Some use cm, that are common in europe, other use inches that are more
 widely used in US, same with other unit types.

 I think it would be cool if an extension was created which would allow
 everyone to specify what units they prefer, and the values in articles
 would be converted automatically based on preference.

 For example you would say the object has width of {{unit|cm=20}} and
 people who prefer cm would see 20 cm in article text, but people who
 prefer inches would see 7.87 inch. This could be even based on
 geolocation for IP users

 Hello,

 I like the idea.  One thing to take in account is that the unit
 conversion should be done on the client side to avoid fragmentation of
 the parser cache.

 A possibility would be for the template to output BOTH metrics and
 imperial units, then use JS/CSS to hide the irrelevant one.  Hence
 {{unit|cm=20}} would generate something like:

  span class=mw-unitsystem-imperial7,874 inches/span
  span class=mw-unitsystem-metric20 cm/span

 Then using user preference to hide one of the class.


 --
 Antoine hashar Musso



Re: [Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Marc Ordinas i Llopis
On Fri, Jan 17, 2014 at 9:58 AM, Petr Bena benap...@gmail.com wrote:

 For example you would say the object has width of {{unit|cm=20}} and
 people who prefer cm would see 20 cm in article text, but people who
 prefer inches would see 7.87 inch.


This is a great idea! As proposed it'd be very helpful, but maybe it'd be
better if it showed the original text and a conversion on mouse-over (maybe
with a small icon to indicate it, like external links).


 This could be even based on
 geolocation for IP users


Oh, please, don't use IP geolocation for anything. It's terrible for people
travelling, using proxies, living abroad, living in places where more than
one language is commonly spoken, learning new languages… If you want an
initial default, use Accept-Language (like, inches for en-US and cm for
everyone else :) and allow the user to modify it.
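A sketch of that Accept-Language fallback (treating only en-US as imperial is just the half-joking rule from the message, not a complete mapping):

```javascript
// Sketch: derive an initial unit-system default from Accept-Language
// instead of IP geolocation. Only en-US maps to imperial here, per the
// example above; a real rule would need more locales.
function defaultUnitSystem(acceptLanguage) {
  const primary = (acceptLanguage || '').split(',')[0].trim().toLowerCase();
  return primary === 'en-us' ? 'imperial' : 'metric';
}
```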

Thanks,
Marc

Re: [Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Jasper Deng
I would like to ask, how are significant figures going to be dealt with?
300 could mean anything from one to three significant figures, for example.
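One possible (hedged) answer is to count significant digits from the value as written and round the conversion to match; how to count the trailing zeros of "300" remains exactly the judgment call raised here. The helpers below are invented for illustration:

```javascript
// Sketch: count significant digits from the value as written, then round
// the converted result to that many digits. "300" counts as three
// significant figures here, which is precisely the ambiguity noted above.
function sigFigs(numStr) {
  const digits = numStr.replace(/[^0-9]/g, ''); // keep digits only
  return digits.replace(/^0+/, '').length;      // drop leading zeros
}

function roundToSigFigs(value, figs) {
  return Number(value.toPrecision(figs));
}

// 20 cm -> inches, rounded to the two significant figures of "20".
const inches = roundToSigFigs(20 / 2.54, sigFigs('20'));
```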


On Fri, Jan 17, 2014 at 1:47 AM, Marc Ordinas i Llopis 
marc...@wikimedia.org wrote:

 On Fri, Jan 17, 2014 at 9:58 AM, Petr Bena benap...@gmail.com wrote:

  For example you would say the object has width of {{unit|cm=20}} and
  people who prefer cm would see 20 cm in article text, but people who
  prefer inches would see 7.87 inch.


 This is a great idea! As proposed it'd be very helpful, but maybe it'd be
 better if it showed the original text and a conversion on mouse-over (maybe
 with a small icon to indicate it, like external links).


  This could be even based on
  geolocation for IP users
 
 
 Oh, please, don't use IP geolocation for anything. It's terrible for people
 travelling, using proxies, living abroad, living in places where more than
 one languages are commonly spoken, learning new languages… If you want to
 get an initial default, use Accept-Language (like, inches for en-US and cm
 for anyone else :) and allow the user to modify it.

 Thanks,
 Marc

Re: [Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Happy Melon
Enwiki's {{convert}} template is a behemoth of a structure which is
intended to do this.  I once made an attempt to write a PHP-side extension
to do it (look at the revision history of the ParserFunctions extension),
but it never took off [1].  I don't think there was ever any enthusiasm to
take the ability to tinker with the formatting and output ({{convert}}
has a million and one different stylistic variations and parameters) away
from wiki template editors.

--HM

[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=40039


On 17 January 2014 10:56, Jasper Deng jas...@jasperswebsite.com wrote:

 I would like to ask, how are significant figures going to be dealt with?
 300 could mean anything from one to three significant figures, for example.


 On Fri, Jan 17, 2014 at 1:47 AM, Marc Ordinas i Llopis 
 marc...@wikimedia.org wrote:

  On Fri, Jan 17, 2014 at 9:58 AM, Petr Bena benap...@gmail.com wrote:
 
   For example you would say the object has width of {{unit|cm=20}} and
   people who prefer cm would see 20 cm in article text, but people who
   prefer inches would see 7.87 inch.
 
 
  This is a great idea! As proposed it'd be very helpful, but maybe it'd be
  better if it showed the original text and a conversion on mouse-over
 (maybe
  with a small icon to indicate it, like external links).
 
 
   This could be even based on
   geolocation for IP users
  
  
  Oh, please, don't use IP geolocation for anything. It's terrible for
 people
  travelling, using proxies, living abroad, living in places where more
 than
  one languages are commonly spoken, learning new languages… If you want to
  get an initial default, use Accept-Language (like, inches for en-US and
 cm
  for anyone else :) and allow the user to modify it.
 
  Thanks,
  Marc

Re: [Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Petr Bena
That's why I don't think this is the best job for a template. It
should definitely be done by some kind of extension or gadget instead.

On Fri, Jan 17, 2014 at 11:16 AM, Happy Melon
happy.melon.w...@gmail.com wrote:
 Enwiki's {{convert}} template is a behemoth of a structure which is
 intended to do this.  I once made an attempt to write a PHP-side extension
 to do it (look at the revision history of the ParserFunctions extension),
 but it never took off [1].  I don't think there was ever any enthusiasm to
 take the ability to tinker with the formatting and output ({{convert}}
 has a million and one different stylistic variations and parameters) away
 from wiki template editors.

 --HM

 [1] https://bugzilla.wikimedia.org/show_bug.cgi?id=40039


 On 17 January 2014 10:56, Jasper Deng jas...@jasperswebsite.com wrote:

 I would like to ask, how are significant figures going to be dealt with?
 300 could mean anything from one to three significant figures, for example.


 On Fri, Jan 17, 2014 at 1:47 AM, Marc Ordinas i Llopis 
 marc...@wikimedia.org wrote:

  On Fri, Jan 17, 2014 at 9:58 AM, Petr Bena benap...@gmail.com wrote:
 
   For example you would say the object has width of {{unit|cm=20}} and
   people who prefer cm would see 20 cm in article text, but people who
   prefer inches would see 7.87 inch.
 
 
  This is a great idea! As proposed it'd be very helpful, but maybe it'd be
  better if it showed the original text and a conversion on mouse-over
 (maybe
  with a small icon to indicate it, like external links).
 
 
   This could be even based on
   geolocation for IP users
  
  
  Oh, please, don't use IP geolocation for anything. It's terrible for
 people
  travelling, using proxies, living abroad, living in places where more
 than
  one languages are commonly spoken, learning new languages… If you want to
  get an initial default, use Accept-Language (like, inches for en-US and
 cm
  for anyone else :) and allow the user to modify it.
 
  Thanks,
  Marc

Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-17 Thread go moko





 From: Bináris wikipo...@gmail.com
To: Wikimedia developers wikitech-l@lists.wikimedia.org 
Sent: Friday, January 17, 2014 10:07 AM
Subject: Re: [Wikitech-l] Revamping interwiki prefixes
 

2014/1/17 This, that and the other at.li...@live.com.au

 Nathan Larson  wrote in message news:CAF-JeUxsM-jQ85nij+OALA=
 rlolnppmhx7yhka1_hiz7m0a...@mail.gmail.com...

 Nice qouting. :-)



 I can't say I care about people reading through the interwiki list. It's
 just that with the one interwiki map, we are projecting our internal
 interwikis, like strategy:, foundation:, sulutil:, wmch: onto external
 MediaWiki installations.  No-one needs these prefixes except WMF wikis, and
 having these in the global map makes MediaWiki look too WMF-centric.


One central intwerwiki map with an extra flag? May be branched for WMF and
general as well as maintined together.


What if filling the interwiki table with predefined links were an installation 
option, possibly with a choice of several lists, or an empty one?


Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-17 Thread Nathan Larson
On Fri, Jan 17, 2014 at 9:00 AM, go moko gom...@yahoo.com wrote:

 What if filling the interwiki table with predefined links was an
 installation option, possibly with several lists, and void?


Probably won't (and shouldn't) happen, since we're trying to keep the
installer options close to the bare minimum. Changing or clearing the
interwiki table is pretty easy with the right script or special page. The
fact that, when the installer's interwiki list was changed in 2013, it was
accurate to say "This list has obviously not been properly updated for many
years. There are many long-dead sites that are removed in this patch"
suggests that third-party wikis are pretty good at ignoring interwiki links
they don't want or need.

I disagree that "collisions are very rare, and none of the alternatives
seem viable or practical". Collisions (or whatever one would call them)
happen fairly often, and the resulting linking errors
(https://en.wikiquote.org/w/index.php?title=User%3ALeucosticte&diff=1503289&oldid=1503284)
can be hard to notice, because one sees the link is blue and assumes it's
going where one wanted it to.

It wouldn't be such a problem if wikis would name their project namespace
Project: rather than the name of the wiki. Having it named Project: would
be useful when people are importing user or project pages from Wikipedia
(e.g. if they wanted to import userboxes or policy pages) and don't want
the Wikipedia: links to become interwiki links. I would be in favor of
renaming the project namespaces to Project: on Wikimedia wikis; that's how
it is on MediaWiki.org (to avoid a collision with the MediaWiki: namespace)
and it seems to work out okay. I'll probably start setting up my third
party wikis that way too, because I've run into similar problems when
exporting and importing content among them. Perhaps the installer should
warn that it's not recommended to name the meta namespace after the site
name.

Tim's proposal seems pretty elegant, but in a few situations it will make
links uglier or hide where they point to. E.g. "See also" sections with
interwiki links (like the one at
https://en.wikipedia.org/wiki/Help:Interwiki_linking#See_also) could
become like the "Further reading" section at
https://en.wikipedia.org/wiki/Wikipedia:Policies_and_guidelines#Further_reading,
in which one has to either put bare links or make people hover over the
link to see the URL it goes to.

Interwiki page existence detection probably wouldn't be any more difficult
to implement in the absence of interwiki prefixes. We could still have an
interwiki table, but page existence detection would be triggered by certain
URLs rather than prefixes being used. I'm not sure how interwiki
transclusion would work if we didn't have interwikis; we'd have to come up
with some other way of specifying which wiki we're transcluding from,
unless we're going to use URLs for that too.
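A loose sketch of that URL-triggered existence detection, keying the wiki lookup on URL prefixes rather than interwiki prefixes (the table contents below are invented examples):

```javascript
// Loose sketch: page-existence detection keyed on URL prefixes instead
// of interwiki prefixes. The table contents here are invented examples.
const KNOWN_WIKIS = [
  { urlPrefix: 'https://en.wikipedia.org/wiki/',
    api: 'https://en.wikipedia.org/w/api.php' },
];

function wikiForUrl(url) {
  // A matching entry tells us which wiki's API to query for existence.
  return KNOWN_WIKIS.find(w => url.startsWith(w.urlPrefix)) || null;
}
```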

In short, I think the key is to come up with something that doesn't break
silently when there's a conflict between an interwiki prefix and namespace.
For that purpose, it would suffice to keep interwiki linking and come up
with a new delimiter. But changing the name of the Project: namespace would
work just as well. Migration of links could work analogously to what's
described in bug 60135 (https://bugzilla.wikimedia.org/show_bug.cgi?id=60135).

TTO, you were saying "I'm not getting a coherent sense of a direction to
take" -- that could be a good thing at this point in the discussion; it
could mean people are still keeping an open mind and wanting to hear more
thoughts and ideas rather than reaching too hasty a conclusion. But I
guess it is helpful, when conversations fall silent, for someone to push
for action by asking, "...so, in light of all that, what do you want to
do?" :)

Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-17 Thread Nathan Larson
I forgot to mention, another problem is that you can't even import
Wikipedia: namespace pages to your wiki without first changing your
interwiki table to get rid of the wikipedia: interwiki prefix. The
importer will say, "Page 'Wikipedia:Sandbox' is not imported because its
name is reserved for external linking (interwiki)." As I noted
(https://bugzilla.wikimedia.org/show_bug.cgi?id=60168#c2) in bug 60168,
it's unclear why we would standardize the other namespace names (Help:,
Template:, MediaWiki:, User:, etc.) from one wiki to the next, but not
Project:.

Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-17 Thread Gabriel Wicke
On 01/16/2014 07:56 PM, Tim Starling wrote:
 I think the interwiki map should be retired. I think broken links
 should be removed from it, and no new wikis should be added.
 
 Interwiki prefixes, local namespaces and article titles containing a
 plain colon intractably conflict. Every time you add a new interwiki
 prefix, main namespace articles which had that prefix in their title
 become inaccessible and need to be recovered with a maintenance script.
 
 There is a very good, standardised system for linking to arbitrary
 remote wikis -- URLs. URLs have the advantage of not sharing a
 namespace with local article titles.


The underlying issue here is that we are still using wikitext as our
primary storage format, rather than treating it as the textual user
interface it is. With HTML storage this issue disappears, as interwiki
links are stored with full URLs. When using the wikitext editor,
prefixes are introduced correctly and on demand, so you get the
convenience without the conflicts.

Currently Flow is the only project using HTML storage. We are working on
preparing this for MediaWiki proper though, so in the longer term the
interwiki conflict issue should disappear.

Gabriel


Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-17 Thread Bartosz Dziewoński

On Fri, 17 Jan 2014 17:33:52 +0100, Nathan Larson nathanlarson3...@gmail.com 
wrote:


it's unclear why we would standardize the other namespace
names (Help:, Template:, MediaWiki:, User:, etc.) from one wiki to the
next, but not Project:


The User: namespace differs per-wiki in many languages other than English,
to reflect the name of the wiki (something akin to "Wikipedia editor",
"Wikisource editor", etc.).

I don't think standardizing Project: is a good idea, in particular because
the name could be confused with WikiProjects, which on some wikis (also
non-English ones) have a separate namespace.

--
Matma Rex


Re: [Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Gabriel Wicke
On 01/17/2014 12:58 AM, Petr Bena wrote:
 I think it would be cool if an extension was created which would allow
 everyone to specify what units they prefer, and the values in articles
 would be converted automatically based on preference.

Whatever you do, make it client-side. It would be great to expose type
information with an attribute in HTML, so that it can be used for
client-side unit conversions. Ideally the typed information comes
directly from Wikidata (and can be fed to nice table/pie chart/whatever
widgets), but it should also not be too hard to mark up data that lives
on the wiki the same way.
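For instance, the parser output could carry the typed value in data attributes for a client-side widget to pick up (the attribute names here are invented for illustration, not an existing convention):

```javascript
// Sketch: emit the typed value in data attributes so client-side code
// can convert it later. The attribute names are invented.
function renderUnitSpan(value, unit) {
  return '<span data-unit="' + unit + '" data-value="' + value + '">' +
         value + ' ' + unit + '</span>';
}
```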

Gabriel


[Wikitech-l] HTML Storage for Wikidata (was Re: Revamping interwiki prefixes)

2014-01-17 Thread Jay Ashworth
- Original Message -
 From: Gabriel Wicke gwi...@wikimedia.org

 Currently Flow is the only project using HTML storage. We are working on
 preparing this for MediaWiki proper though, so in the longer term the
 interwiki conflict issue should disappear.

Where, by "HTML storage", I hope you actually mean something that isn't HTML
storage, since HTML is a *presentation* markup language, not a semantic one,
and thus singularly unsuited to the sort of semantic storage a wiki
engine requires...

Cheers,
-- jra
-- 
Jay R. Ashworth  Baylink   j...@baylink.com
Designer The Things I Think   RFC 2100
Ashworth  Associates   http://www.bcp38.info  2000 Land Rover DII
St Petersburg FL USA  BCP38: Ask For It By Name!   +1 727 647 1274


[Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Erik Moeller
On Mon, Jan 13, 2014 at 9:10 AM, Chris Steipp cste...@wikimedia.org wrote:
 To satisfy Applebaum's request, there needs to be a mechanism whereby
 someone can edit even if *all of their communications with Wikipedia,
 including the initial contact* are coming over Tor or equivalent.
 Blinded, costly-to-create handles (minted by Wikipedia itself) are one
 possible way to achieve that; if there are concrete reasons why that
 will not work for Wikipedia, the people designing these schemes would
 like to know about them.

 This should be possible, according to https://meta.wikimedia.org/wiki/NOP,
 which Nemo also posted. The user sends an email to the stewards (using tor
 to access email service of their choice). Account is created, and user can
 edit Wikimedia wikis. Or is there still a step that is missing?

I tested the existing process by creating a new riseup.net email
account via Tor, then requesting account creation and a global
exemption via stewa...@wikimedia.org. My account creation request was
granted, but for exemption purposes, I was requested to go through the
process for any specific wiki I want to edit. In fact, the account was
created on Meta, but not exempted there.

The reason I gave is as follows:

My reason for editing through Tor is that I would like to write about
sensitive issues (e.g. government surveillance practices) and prefer not
to be identified when doing so. I have some prior editing experience, but
would rather not disclose further information about it to avoid any
correlation of identities.

This seems like a valid reason for a global exemption to me, so I'm
not sure the current global policy is sufficient.

Erik
-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation


[Wikitech-l] Tampa datacenter issues

2014-01-17 Thread Erik Moeller
Hi all,

We had a fibre cut of our connection to our Tampa DC this morning. ETA
of a fix is still pending, but the cuts have been located and crews
are being dispatched. Meanwhile public traffic is being rerouted via
the public Internet, so most services should be reachable. Tampa is
our secondary DC which we're decommissioning, so there was no impact
on our main sites. Impacted temporarily were: inbound Wikimedia.org
email, Bugzilla, Labs. Fundraising is still impacted, but shouldn't be
(we're debugging).

Thanks to the ops team for their quick response.

Erik

-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation


Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Nathan Larson
On Fri, Jan 17, 2014 at 1:21 PM, Erik Moeller e...@wikimedia.org wrote:

 I tested the existing process by creating a new riseup.net email
 account via Tor, then requesting account creation and a global
 exemption via stewa...@wikimedia.org. My account creation request was
 granted, but for exemption purposes, I was requested to go through the
 process for any specific wiki I want to edit. In fact, the account was
 created on Meta, but not exempted there.


Thanks for taking the initiative to check that out. Now maybe the stewards
will be paranoid that any further Tor requests might be from you, and act
accordingly. It kinda reminds me of the Rosenhan experiment
(https://en.wikipedia.org/wiki/Rosenhan_experiment).
Just the known possibility that there might be a mystery customer keeps the
service providers on their toes, and they are much more likely to mistake
an ordinary customer for the mystery customer than vice versa, as
demonstrated by the non-existent impostor experiment
(https://en.wikipedia.org/wiki/Rosenhan_experiment#The_non-existent_impostor_experiment).

Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Nathan
On Fri, Jan 17, 2014 at 1:21 PM, Erik Moeller e...@wikimedia.org wrote:

 On Mon, Jan 13, 2014 at 9:10 AM, Chris Steipp cste...@wikimedia.org
 wrote:
  To satisfy Applebaum's request, there needs to be a mechanism whereby
  someone can edit even if *all of their communications with Wikipedia,
  including the initial contact* are coming over Tor or equivalent.
  Blinded, costly-to-create handles (minted by Wikipedia itself) are one
  possible way to achieve that; if there are concrete reasons why that
  will not work for Wikipedia, the people designing these schemes would
  like to know about them.

  This should be possible, according to https://meta.wikimedia.org/wiki/NOP,
  which Nemo also posted. The user sends an email to the stewards (using tor
  to access the email service of their choice). An account is created, and the
  user can edit Wikimedia wikis. Or is there still a step that is missing?

 I tested the existing process by creating a new riseup.net email
 account via Tor, then requesting account creation and a global
 exemption via stewa...@wikimedia.org. My account creation request was
 granted, but for exemption purposes, I was requested to go through the
 process for any specific wiki I want to edit. In fact, the account was
 created on Meta, but not exempted there.

 The reason I gave is as follows:

 My reason for editing through Tor is that I would like to write about
 sensitive issues (e.g. government surveillance practices) and prefer not
 to be identified when doing so. I have some prior editing experience, but
 would rather not disclose further information about it to avoid any
 correlation of identities.

 This seems like a valid reason for a global exemption to me, so I'm
 not sure the current global policy is sufficient.


I use an anonymous encrypted VPN when accessing the Internet from home, and
found myself unable to edit on en.wp. I requested IPBE
(https://en.wikipedia.org/w/index.php?title=User_talk:Nathan&diff=572887054&oldid=568388651)
and was initially denied because I was able to turn off the VPN to edit.
The exemption was only granted because an administrator familiar with me was
watching my talk page and stepped in.

Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Tyler Romeo
On Fri, Jan 17, 2014 at 1:21 PM, Erik Moeller e...@wikimedia.org wrote:

 I tested the existing process by creating a new riseup.net email
 account via Tor, then requesting account creation and a global
 exemption via stewa...@wikimedia.org. My account creation request was
 granted, but for exemption purposes, I was requested to go through the
 process for any specific wiki I want to edit. In fact, the account was
 created on Meta, but not exempted there.


I feel like a much better experiment would be to:

1) Do what you just did
2) Request Tor access on a specific wiki
3) Edit for a while and become an established editor
4) Then ask for a global exemption

If anything, it is good for stewards to not randomly grant global exemptions
to anybody who walks in off the street.

In any case, I would try testing out the enwiki-specific exemption process
and see how that works out for you.

-- 
Tyler Romeo
Stevens Institute of Technology, Class of 2016
Major in Computer Science

Re: [Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Tim Landscheidt
Gabriel Wicke gwi...@wikimedia.org wrote:

 I think it would be cool if an extension was created which would allow
 everyone to specify what units they prefer, and the values in articles
 would be converted automatically based on preference.

 Whatever you do, make it client-side. It would be great to expose type
 information with an attribute in HTML, so that it can be used for
 client-side unit conversions. Ideally the typed information comes
 directly from wikidata (and can be fed to nice table/pie chart/whatever
 widgets), but it should also not be too hard to mark up data that lives
 on the wiki the same way.

There's also http://microformats.org/wiki/measure for a more
standardized solution that could work on other websites as
well.

Tim



Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Marc A. Pelletier
On 01/17/2014 01:21 PM, Erik Moeller wrote:
 This seems like a valid reason for a global exemption to me, so I'm
 not sure the current global policy is sufficient.

To be fair, Erik, I don't think it's reasonable to expect that one would be
granted IPBE (especially globally) simply by remembering not to add
"and vandalize" to the request.

On English Wikipedia, at least, IPBE is normally only granted to someone
who has some positive history and has an actual /need/ for the bit.  The
reason for this is simple: it's been abused over and over again
historically.  The number of times I personally caught someone misusing
a proxy for socking who happened to have a "good hand" account with
IPBE on that same proxy is much higher than the number of IPBEs I've seen
used legitimately.

The problem isn't straight-up vandalism (IPBE is no help there -- the
account would get swiftly blocked) but socking.  POV warriors know how to
misuse proxies and anonymity to multiply their consensus, and having
IPBE while editing through any sort of anonymizing proxy (including Tor)
defeats what little means checkusers have to curb socking.

-- Marc



Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Thomas Gries
Am 17.01.2014 20:08, schrieb Marc A. Pelletier:
 The problem isn't straight-up vandalism (IPBE is no help there -- the
 account would get swiftly blocked) but socking.  POV warriors know how to
 misuse proxies and anonymity to multiply their consensus, and having
 IPBE while editing through any sort of anonymizing proxy (including Tor)
 defeats what little means checkusers have to curb socking.


There are (or may be) solutions for this and related issues; see:

(book) Peter Wayner, "Disappearing Cryptography", third edition, pages 225-227,
2009. ISBN 978-0-12-374479-1, chapter 10.7.3, "Stopping Bad Users":

"Bad users of the onion routing network can ruin the reputation of other users.
The Wikipedia, for instance, often blocks TOR exit nodes completely because some
people have used the network to hide their identities while defacing the wiki's
entries..."

In the further passages, Peter Wayner explains that one straightforward
solution is to use some form of certificates with a *blind signature*, a
technique borrowed from some of the early schemes for anonymous digital
cash (a typical example with Alice follows; worth reading in full).

and

http://dx.doi.org/10.1109/TDSC.2009.38

(article) Tsang, P.P.; Kapadia, A.; Cornelius, C.; Smith, S.W.,
Nymble: Blocking Misbehaving Users in Anonymizing Networks. 
IEEE Transactions on Dependable and Secure Computing, vol.8, no.2, pp.256-269,
March-April 2011.

I mentioned both in https://bugzilla.wikimedia.org/show_bug.cgi?id=59146 .
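To make the blind-signature idea concrete, here is a toy sketch using textbook RSA. Everything here is illustrative only (a tiny fixed key, no padding, a fixed blinding factor), not the full construction Wayner or the Nymble paper describe; the point is that the signer certifies a token without ever seeing it, so the signed token cannot later be linked back to the signing request.

```javascript
// Toy RSA blind signature. Parameters are tiny textbook values;
// real deployments need proper key sizes, padding, and random blinding.
const n = 3233n, e = 17n, d = 2753n; // p = 61, q = 53

function modPow(base, exp, mod) {
  let result = 1n;
  base %= mod;
  while (exp > 0n) {
    if (exp & 1n) result = (result * base) % mod;
    base = (base * base) % mod;
    exp >>= 1n;
  }
  return result;
}

function modInverse(a, m) { // extended Euclid
  let [oldR, r] = [a % m, m];
  let [oldS, s] = [1n, 0n];
  while (r !== 0n) {
    const q = oldR / r;
    [oldR, r] = [r, oldR - q * r];
    [oldS, s] = [s, oldS - q * s];
  }
  return ((oldS % m) + m) % m;
}

// User: blind the message with a factor r (fixed here for clarity).
const msg = 65n, r = 7n;
const blinded = (msg * modPow(r, e, n)) % n;

// Signer: signs the blinded value, never seeing msg itself.
const blindSig = modPow(blinded, d, n);

// User: unblind; the result is a valid RSA signature on msg.
const sig = (blindSig * modInverse(r, n)) % n;
console.log(modPow(sig, e, n) === msg); // true: the signature verifies
```

The signer can later verify that a token carries its signature, yet cannot match any token to a particular signing session, which is exactly the unlinkability property Marc's socking objection then attacks.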




Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Risker
On 17 January 2014 14:08, Marc A. Pelletier m...@uberbox.org wrote:

 On 01/17/2014 01:21 PM, Erik Moeller wrote:
  This seems like a valid reason for a global exemption to me, so I'm
  not sure the current global policy is sufficient.

 To be fair, Erik, I don't think it's reasonable to expect that one would be
 granted IPBE (especially globally) simply by remembering not to add
 "and vandalize" to the request.

 On English Wikipedia, at least, IPBE is normally only granted to someone
 who has some positive history and has an actual /need/ for the bit.  The
 reason for this is simple: it's been abused over and over again
 historically.  The number of times I personally caught someone misusing
 a proxy for socking who happened to have a "good hand" account with
 IPBE on that same proxy is much higher than the number of IPBEs I've seen
 used legitimately.

 The problem isn't straight-up vandalism (IPBE is no help there -- the
 account would get swiftly blocked) but socking.  POV warriors know how to
 misuse proxies and anonymity to multiply their consensus, and having
 IPBE while editing through any sort of anonymizing proxy (including Tor)
 defeats what little means checkusers have to curb socking.



I agree with Marc on this, and further would say that the reason given by
Erik in his application for IPBE is pretty much a red flag that a user is
going to be editing in a controversial and non-neutral manner. It's also a
red flag that the user's probably been blocked for doing it before, and
thinks this will be a workaround that will prevent him/her from being
blocked this time.

Risker/Anne

Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Marc A. Pelletier
On 01/17/2014 02:15 PM, Thomas Gries wrote:
 In the further passages, Peter Wayner explains that one straightforward
 solution is to use some form of certificates with a *blind signature*, a
 technique borrowed from some of the early schemes for anonymous digital
 cash (a typical example with Alice follows; worth reading in full).

That's no help, because it'd be trivial for any person to get any number
of those certificates, and we're back to one editor holding multiple
identities in a way that cannot be collated by checkusers.

Unless you're willing to attach a price tag that has value to the
socker (and no, time and effort aren't it -- POV warriors have plenty
of both when they have a Truth to Defend™), it offers no solution.

-- Marc



Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Erik Moeller
On Fri, Jan 17, 2014 at 11:08 AM, Marc A. Pelletier m...@uberbox.org wrote:
 The problem isn't straight-up vandalism (IPBE is no help there -- the
 account would get swiftly blocked) but socking.  POV warriors know how to
 misuse proxies and anonymity to multiply their consensus, and having
 IPBE while editing through any sort of anonymizing proxy (including Tor)
 defeats what little means checkusers have to curb socking.

I understand. Wikimedia's current abuse prevention strategies rely on
limits to user privacy being maintained, and any technical solution
that attempts to broaden access for Tor users is unlikely to be
successful at any significant scale unless this changes, no matter how
clever a solution it is.

The Board or global community could decide that protecting users'
right to anonymity is more important than having abuse prevention
tools relying on IP disclosure, but in the absence of such a
Board-level decision or community-wide vote, I don't think the
situation relative to Tor users will change. My personal view is that
we should transition away from tools relying on IP disclosure, given
the global state of Internet surveillance and censorship which makes
tools like Tor necessary.

Erik

-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation


Re: [Wikitech-l] HTML Storage for Wikidata (was Re: Revamping interwiki prefixes)

2014-01-17 Thread Gabriel Wicke
On 01/17/2014 10:17 AM, Jay Ashworth wrote:
 - Original Message -
 From: Gabriel Wicke gwi...@wikimedia.org
 
 Currently Flow is the only project using HTML storage. We are working on
 preparing this for MediaWiki proper though, so in the longer term the
 interwiki conflict issue should disappear.
 
 Where, by "HTML storage", I hope you actually mean something that isn't HTML
 storage, since HTML is a *presentation* markup language, not a semantic one,
 and thus singularly unsuited to the sort of semantic storage a wiki
 engine requires...

I mean our HTML5+RDFa DOM spec format [1], which is semantic markup that
also displays as expected. It exposes all the semantic information of
Wikitext in RDFa, which is why Parsoid can provide a wikitext editing
interface to it.

Gabriel

[1]: https://www.mediawiki.org/wiki/Parsoid/MediaWiki_DOM_spec
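For a flavour of the format, a simplified fragment in the spirit of that DOM spec might look like the following (attribute values abbreviated; the spec linked above is authoritative):

```html
<!-- A wikilink carries its semantics in rel=: -->
<a rel="mw:WikiLink" href="./San_Francisco">San Francisco</a>

<!-- A template expansion is marked with typeof= and keeps the original
     call in data-mw, so Parsoid can round-trip it back to wikitext: -->
<span about="#mwt1" typeof="mw:Transclusion"
      data-mw='{"parts":[{"template":{"target":{"wt":"convert",
        "href":"./Template:Convert"},"params":{"1":{"wt":"5"}},"i":0}}]}'>
  5 km (3.1 mi)
</span>
```

So the stored HTML renders as-is, while the RDFa attributes preserve enough semantics to reconstruct the wikitext.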


[Wikitech-l] Deployment and Roadmap highlights - week of January 20th

2014-01-17 Thread Greg Grossmeier
Hello and welcome to the latest edition of the deployment and roadmap
highlights.

Full up-to-date schedule, as always, can be found online at:
https://wikitech.wikimedia.org/wiki/Deployments


== Throughout the week ==

* The Technical Operations team will be bringing the west coast caching
  center (ULSFO) online (again, after a failed first attempt due to
  connectivity limitations). This work will happen throughout the week.

* On Thursday and Friday there is the MediaWiki Architecture Summit in
  San Francisco. See:
  https://www.mediawiki.org/wiki/Architecture_Summit_2014

* Due to the US public holiday on Monday (Martin Luther King Jr Day) and
  the Architecture Summit, the week of the 20th will be a low/no deploy
  week. As such, the only things on the calendar are placeholders for
  Lightning Deploys on Tues, Wed, and Thurs (Pacific time).

Let me know if you have any questions,

Greg

-- 
| Greg Grossmeier         GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg        A18D 1138 8E47 FAC8 1C7D |



Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Risker
On 17 January 2014 16:26, Erik Moeller e...@wikimedia.org wrote:

 On Fri, Jan 17, 2014 at 11:08 AM, Marc A. Pelletier m...@uberbox.org
 wrote:
  The problem isn't straight-up vandalism (IPBE is no help there -- the
  account would get swiftly blocked) but socking.  POV warriors know how to
  misuse proxies and anonymity to multiply their consensus, and having
  IPBE while editing through any sort of anonymizing proxy (including Tor)
  defeats what little means checkusers have to curb socking.

 I understand. Wikimedia's current abuse prevention strategies rely on
 limits to user privacy being maintained, and any technical solution
 that attempts to broaden access for Tor users is unlikely to be
 successful at any significant scale unless this changes, no matter how
 clever a solution it is.

 The Board or global community could decide that protecting users'
 right to anonymity is more important than having abuse prevention
 tools relying on IP disclosure, but in the absence of such a
 Board-level decision or community-wide vote, I don't think the
 situation relative to Tor users will change. My personal view is that
 we should transition away from tools relying on IP disclosure, given
 the global state of Internet surveillance and censorship which makes
 tools like Tor necessary.




Well, Erik, the vast majority of socks are blocked without checkuser
evidence, and always have been, on all projects; the evidence is often in
the edits, and doesn't need any privacy-invading tools to confirm.
I get the notion of reducing or eliminating the public visibility of IP
addresses and am quite supportive of it; IPv6 addresses in particular can
often disclose far too much personal information.

End of the day, though, absent blocking problematic IP addresses and ranges
(which really can't be done unless the person blocking actually knows the
IP address or range), the socks and spammers just keep coming.  This
problem isn't unique to WMF projects, and I don't believe anyone has come
up with a solution that allows open/unregistered editing without also using
IP information for blocking or limiting access.

Risker

Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Marc A. Pelletier
On 01/17/2014 04:26 PM, Erik Moeller wrote:
 I understand. Wikimedia's current abuse prevention strategies rely on
 limits to user privacy being maintained, and any technical solution
 that attempts to broaden access for Tor users is unlikely to be
 successful at any significant scale unless this changes, no matter how
 clever a solution it is.


Not necessarily.  Abuse prevention fundamentally requires only one
thing: being able to tell that edit X was made by the same person
as edit Y with N% probability.  That's the fundamental judgment made by
administrators and checkusers when deciding whether to block a user or
a source of edits.

User IP and UA are among the datapoints used for that
determination (both by checkusers and, indirectly, by administrators via
autoblocks or range blocks), but any other method by which that
determination can be made would serve just as well.
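That framing can be caricatured in a few lines. The signals and weights below are invented purely for illustration; real checkuser practice rests on different data and human judgment, not a fixed formula:

```javascript
// Naive same-author scoring over a handful of signals. The signal
// names and weights are made up for this sketch; any other linkable
// evidence could be substituted for IP and UA, which is Marc's point.
function sameAuthorScore(a, b) {
  const signals = [
    { match: a.ipPrefix === b.ipPrefix, weight: 0.5 },
    { match: a.userAgent === b.userAgent, weight: 0.3 },
    { match: a.editHour === b.editHour, weight: 0.2 },
  ];
  // Sum matched weights; a real system would use calibrated likelihoods.
  return signals.reduce((p, s) => p + (s.match ? s.weight : 0), 0);
}

const x = { ipPrefix: "198.51.100", userAgent: "UA-1", editHour: 14 };
const y = { ipPrefix: "198.51.100", userAgent: "UA-1", editHour: 3 };
console.log(sameAuthorScore(x, y)); // 0.8
```

The interesting design question is which signals can replace `ipPrefix` while preserving user privacy, which is exactly the hard problem described above.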

That we have not yet found a satisfactory method of attributing online
actions to an individual without placing some limits on their privacy
does not mean we never will -- or at least that we won't be able to tip
the balance towards more privacy rather than less.

It's a Hard Problem.  Businesses tend to fix it by tying online
identities to some physical (and finite) token of existence (like a
Credit Card); something which we emphatically would never want to do
because that vastly /reduces/ privacy.  We don't care to know who
someone *is*, just whether they are the same one as before.

IMO, efforts should be directed towards that more fundamental goal;
everything else will fall into place from there.

-- Marc



Re: [Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Petr Bena
It can't be done purely client-side. It must be done on both sides, so that
users can save their preference to the database without having to set it
every time their cookies get wiped (which in my case is about 10
times a day, just from switching devices and browsers).

On Fri, Jan 17, 2014 at 7:15 PM, Gabriel Wicke gwi...@wikimedia.org wrote:
 On 01/17/2014 12:58 AM, Petr Bena wrote:
 I think it would be cool if an extension was created which would allow
 everyone to specify what units they prefer, and the values in articles
 would be converted automatically based on preference.

 Whatever you do, make it client-side. It would be great to expose type
 information with an attribute in HTML, so that it can be used for
 client-side unit conversions. Ideally the typed information comes
 directly from wikidata (and can be fed to nice table/pie chart/whatever
 widgets), but it should also not be too hard to mark up data that lives
 on the wiki the same way.

 Gabriel



Re: [Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Petr Bena
Also, this is not anything I am going to do just by myself. This is
something WE should do. I have basically zero knowledge of this kind of
thing; I was just thinking it would be a nice thing to have. Some
interface developer who is a master of JS and the like is needed to make
this. I can help, of course, but basically I am just proposing an idea
:-)

On Fri, Jan 17, 2014 at 11:11 PM, Petr Bena benap...@gmail.com wrote:
 it can't be done client side. It must be done on both sides, so that
 user can save their preference into database without having to set it
 everytime they get their cookies wiped (which in my case is like 10
 times a day just by switching devices and browsers)

 On Fri, Jan 17, 2014 at 7:15 PM, Gabriel Wicke gwi...@wikimedia.org wrote:
 On 01/17/2014 12:58 AM, Petr Bena wrote:
 I think it would be cool if an extension was created which would allow
 everyone to specify what units they prefer, and the values in articles
 would be converted automatically based on preference.

 Whatever you do, make it client-side. It would be great to expose type
 information with an attribute in HTML, so that it can be used for
 client-side unit conversions. Ideally the typed information comes
 directly from wikidata (and can be fed to nice table/pie chart/whatever
 widgets), but it should also not be too hard to mark up data that lives
 on the wiki the same way.

 Gabriel



Re: [Wikitech-l] Tampa datacenter issues

2014-01-17 Thread Tim Landscheidt
Erik Moeller e...@wikimedia.org wrote:

 We had a fibre cut of our connection to our Tampa DC this morning. ETA
 of a fix is still pending, but the cuts have been located and crews
 are being dispatched. Meanwhile public traffic is being rerouted via
 the public Internet, so most services should be reachable. Tampa is
 our secondary DC which we're decommissioning, so there was no impact
 on our main sites. Impacted temporarily were: inbound Wikimedia.org
 email, Bugzilla, Labs. Fundraising is still impacted, but shouldn't be
 (we're debugging).

Small correction: Labs is still impacted, as neither the
replica servers nor Wikipedia & Co. can be reached without
the link, so tools and bots relying on that are out of
service at the moment.

Tim



Re: [Wikitech-l] Tampa datacenter issues

2014-01-17 Thread Tim Landscheidt
I wrote:

 We had a fibre cut of our connection to our Tampa DC this morning. ETA
 of a fix is still pending, but the cuts have been located and crews
 are being dispatched. Meanwhile public traffic is being rerouted via
 the public Internet, so most services should be reachable. Tampa is
 our secondary DC which we're decommissioning, so there was no impact
 on our main sites. Impacted temporarily were: inbound Wikimedia.org
 email, Bugzilla, Labs. Fundraising is still impacted, but shouldn't be
 (we're debugging).

 Small correction: Labs is still impacted, as neither the
 replica servers nor Wikipedia & Co. can be reached without
 the link, so tools and bots relying on that are out of
 service at the moment.

... and when I hit C-c C-c, the link came back up.  So now
Labs should be fully working again.

Tim



Re: [Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Gabriel Wicke
On 01/17/2014 02:11 PM, Petr Bena wrote:
 it can't be done client side. It must be done on both sides, so that
 user can save their preference into database without having to set it
 everytime they get their cookies wiped (which in my case is like 10
 times a day just by switching devices and browsers)

Sorry if I wasn't clear; yes, saving the preference is fine. I am mainly
concerned about keeping the page content independent of the preference,
so that we can serve the same cached content to anonymous and logged-in
users. Currently that is not yet possible, but we are working on
eliminating the last preference dependencies so that logged-in users can
get the same performance as anonymous users.

Gabriel


Re: [Wikitech-l] Tampa datacenter issues

2014-01-17 Thread Leslie Carr
Did that on purpose ;)

But the links are still recovering, we're on a single link and we're
not sure of their stability, so be wary!

On Fri, Jan 17, 2014 at 2:26 PM, Tim Landscheidt t...@tim-landscheidt.de 
wrote:
 I wrote:

 We had a fibre cut of our connection to our Tampa DC this morning. ETA
 of a fix is still pending, but the cuts have been located and crews
 are being dispatched. Meanwhile public traffic is being rerouted via
 the public Internet, so most services should be reachable. Tampa is
 our secondary DC which we're decommissioning, so there was no impact
 on our main sites. Impacted temporarily were: inbound Wikimedia.org
 email, Bugzilla, Labs. Fundraising is still impacted, but shouldn't be
 (we're debugging).

 Small correction: Labs is still impacted, as neither the
 replica servers nor Wikipedia & Co. can be reached without
 the link, so tools and bots relying on that are out of
 service at the moment.

 ... and when I hit C-c C-c, the link came back up.  So now
 Labs should be fully working again.

 Tim





-- 
Leslie Carr
Wikimedia Foundation
AS 14907, 43821
http://as14907.peeringdb.com/


Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Erik Moeller
On Fri, Jan 17, 2014 at 1:38 PM, Risker risker...@gmail.com wrote:
 End of the day, though, absent blocking problematic IP addresses and ranges
 (which really can't be done unless the person blocking actually knows the
 IP address or range), the socks and spammers just keep coming.  This
 problem isn't unique to WMF projects, and I don't believe anyone has come
 up with a solution that allows open/unregistered editing without also using
 IP information for blocking or limiting access.

I'm not arguing for open editing from Tor. I do think it would be nice
if global exemptions could in fact be obtained reasonably easily by
emailing stewa...@wikimedia.org. While it's true that such requests
could be misused, the following are also true:

- We regulate the influx of requests and the exemptions we grant. This
means that we can use wait periods, interview questions, and other
mechanisms to avoid it turning into a free-for-all. This is
effectively the same mechanism riseup.net uses to grant anonymous
email addresses.

- We know all the accounts that we have granted global exemptions to
and therefore can investigate behavior _across the group_ of Tor users
fairly easily, or even subsets of that group such as exemptions
granted in a certain time window, by a certain user, etc.

It would allow a motivated person to reset their identity and go
undetected provided they avoid the kind of articles and behaviors they
got in trouble over in the first place. It's not clear to me that the
consequences would be particularly severe or unmanageable beyond that.

Erik
-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation


Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Nathan Larson
On Fri, Jan 17, 2014 at 6:33 PM, Erik Moeller e...@wikimedia.org wrote:

 It would allow a motivated person to reset their identity and go
 undetected provided they avoid the kind of articles and behaviors they
 got in trouble over in the first place. It's not clear to me that the
 consequences would be particularly severe or unmanageable beyond that.


People also get banned from Wikimedia projects for off-wiki conduct that
has nothing to do with Wikimedia projects.

Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Faidon Liambotis

On Fri, Jan 17, 2014 at 01:26:04PM -0800, Erik Moeller wrote:

The Board or global community could decide that protecting users'
right to anonymity is more important than having abuse prevention
tools relying on IP disclosure, but in the absence of such a
Board-level decision or community-wide vote, I don't think the
situation relative to Tor users will change. My personal view is that
we should transition away from tools relying on IP disclosure, given
the global state of Internet surveillance and censorship which makes
tools like Tor necessary.


Hear, hear. I couldn't agree more.

My own view:

This matter isn't about dissidents in oppressive regimes or suspected 
criminals. It never was, but that has become especially apparent to 
everyone this summer.


The whole world (literally everyone) is being constantly surveilled, and 
our communications are recorded for decades to come. Everyone is a suspect 
and everyone has a file. We'll never again be sure, for example, that 
actions we perform today, as innocent as they are now (like a 
Wikipedia edit), won't be used against us in 5 or 10 years to link us 
to a crime or a group.


All access & edits to Wikipedia being monitored isn't some paranoid 
theory anymore; we can be more than sure of it. Tor is one of the very 
few ways to resist this pervasive surveillance and work around the 
panopticon of modern states. We *must* find a way to support it as a 
first-class citizen, for exactly the same reasons Wikipedia has been 
protective of users' privacy and has a stringent privacy policy.


(I was at 30C3; I got a bazillion complaints from numerous people about 
this every time I mentioned my affiliation, even before Jake's talk)


Regards,
Faidon

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Petr Bena
Yes, that sounds pretty good to me. I also think it would be much
better for this to be handled on the client side. I don't really see any
problem with caching, but what I'm not sure about is whether this could
be implemented without having to update the articles with some magic
words etc. But even if we had to, it would still be just a one-time
update.
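A minimal sketch of the idea under discussion: the page carries only the value and its source unit, and the reader's saved preference is applied afterwards, so the same cached content can be served to everyone. The unit table and function names below are illustrative assumptions, not MediaWiki code.

```python
# Sketch only: unit table and names are illustrative, not MediaWiki code.
CONVERSIONS = {
    ("km", "mi"): lambda v: v * 0.621371,
    ("mi", "km"): lambda v: v / 0.621371,
    ("kg", "lb"): lambda v: v * 2.20462,
    ("C", "F"): lambda v: v * 9 / 5 + 32,
}

def render_value(value, source_unit, preferred_unit):
    """Convert a marked-up value into the reader's preferred unit.

    The cached page would carry only (value, source_unit); the saved
    preference is applied afterwards, so identical cached content can
    be served to anonymous and logged-in readers alike.
    """
    if source_unit == preferred_unit:
        return f"{value:g} {source_unit}"
    convert = CONVERSIONS.get((source_unit, preferred_unit))
    if convert is None:
        # No conversion rule known: fall back to the value as authored.
        return f"{value:g} {source_unit}"
    return f"{convert(value):g} {preferred_unit}"
```

The same function could run either in the browser or at render time; the point is that the stored page text stays preference-independent.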

On Fri, Jan 17, 2014 at 11:30 PM, Gabriel Wicke gwi...@wikimedia.org wrote:
 On 01/17/2014 02:11 PM, Petr Bena wrote:
 it can't be done client side. It must be done on both sides, so that
 users can save their preference into the database without having to set it
 every time they get their cookies wiped (which in my case is like 10
 times a day just by switching devices and browsers)

 Sorry if I wasn't clear; yes, saving the preference is fine. I am mainly
 concerned about keeping the page content independent of the preference,
 so that we can serve the same cached content to anonymous and logged-in
 users. Currently that is not yet possible, but we are working on
 eliminating the last preference dependencies so that logged-in users can
 get the same performance as anonymous users.

 Gabriel



[Wikitech-l] Status of the new PDF Renderer

2014-01-17 Thread Matthew Walker
All,

We've just finished our second sprint on the new PDF renderer. A
significant chunk of renderer development time this cycle went to non-Latin
script support, as well as puppetization and packaging for deployment. We
have a work-in-progress pipeline up and running in Labs, which I encourage
everyone to go try and break. You can use the following featured articles
just to see what our current output is:
* http://ocg-collection-alpha.wmflabs.org/index.php/Alexis_Bachelot
*
http://ocg-collection-alpha.wmflabs.org/index.php/Atlantis:_The_Lost_Empire

Some other articles imported on that test wiki:
* http://ur1.ca/gg0bw

Please note that some of these will fail due to known issues noted below.

You can render any page in the new renderer by clicking the sidebar link
"Download as WMF PDF"; if you "Download as PDF" you'll be using the old
renderer (useful for comparison). Additionally, you can create full books
via Special:Book -- our renderer is "RDF to LaTeX (PDF)" and the old
renderer is "e-book (PDF)". You can also try out the "RDF to Text (TXT)"
renderer, but that's not on the critical path. As of right now we do not
have a Bugzilla project entry, so reply to this email or email me directly
-- we'll need one of: the name of the page, the name of the collection, or
the collection_id parameter from the URL to debug.

There are some code bits that we know are still missing that we will have
to address in the coming weeks or in another sprint.
* Attribution for images and text. The APIs are done, but we still need
to massage that information into the document.
* Message translation -- right now all internal messages are in English
which is not so helpful to non English speakers.
* Things using the cite tag and the Cite extension are not currently
supported (meaning you won't get nice references).
* Tables may not render at all, or may break the renderer.
* Caching needs to be greatly improved.

Looking longer term into deployment on wiki, my plans right now are to get
this into beta labs for general testing and to connect test.wikipedia.org up
to our QA hardware for load testing. The major blocker there is acceptance
of the Node.js 0.10 and TeX Live 2012 packages into reprap, our internal
apt package source. This is not quite as easy as it sounds: we already
use TeX Live 2009 in production for the Math extension, and we must apply
thorough tests to ensure we do not introduce any regressions when we update
to the 2012 package. I'm not sure what the actual dates for those migrations /
testing will be, because it greatly depends on when Ops has time. In the
meantime, our existing PDF cluster based on mwlib will continue to serve
our offline needs. Once our solution is deployed and tested, mwlib
(pdf[1-3]) will be retired here at the WMF, and print-on-demand services
will be provided directly by PediaPress servers.

For the technically curious: we're approximately following the Parsoid
deployment model -- using Trebuchet to push out a source repository
(services/ocg-collection) that has the configuration and node dependencies
built on tin, along with git submodules containing the actual service code.

It may not look like it on the surface, but we've come a long way and it
wouldn't have been possible without the (probably exasperated) help from
Jeff Green, Faidon, and Ori. Also big thanks to Brad and Max for their
work, and Gabriel for some head thunking. C. Scott and I are not quite off
the hook yet, as indicated by the list above, but hopefully soon enough
we'll be enjoying the cake and cookies from another new product launch.
(And yes, even if you're remote if I promised you cookies as bribes I'll
ship them to you :p)

~Matt Walker

[Wikitech-l] Review Milestone reached

2014-01-17 Thread bawolff
Aaron Schulz has become the first person to have approved (+2'ed) >=
1000 patchsets to MediaWiki core [1]. I thought this nice round number
deserved a note, and a good job. Thank you Aaron for all your hard
work reviewing things.

--bawolff


[1] https://toolserver.org/~nemobis/crstats/core.txt
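For the curious, tallying +2 votes from Gerrit change data could look roughly like this. The record shape below mimics Gerrit's detailed-labels output, but treat the field names as an assumption for illustration rather than the exact API schema (the linked stats were computed independently).

```python
# Sketch only: the record shape mimics Gerrit's detailed labels, but the
# exact field names here are an assumption for illustration.
def count_plus_twos(changes, reviewer):
    """Count Code-Review +2 votes cast by `reviewer` across change records."""
    total = 0
    for change in changes:
        votes = change.get("labels", {}).get("Code-Review", {}).get("all", [])
        for vote in votes:
            # A +2 shows up as value == 2 on the Code-Review label.
            if vote.get("name") == reviewer and vote.get("value") == 2:
                total += 1
    return total
```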


Re: [Wikitech-l] Review Milestone reached

2014-01-17 Thread Kartik Mistry
On Sat, Jan 18, 2014 at 8:33 AM, bawolff bawolff...@gmail.com wrote:
 Aaron Schulz has become the first person to have approved (+2'ed) >=
 1000 patchsets to MediaWiki core [1]. I thought this nice round number
 deserved a note, and a good job. Thank you Aaron for all your hard
 work reviewing things.

Milestone it is! Thanks Aaron!

-- 
Kartik Mistry/કાર્તિક મિસ્ત્રી | IRC: kart_
{kartikm, 0x1f1f}.wordpress.com
