Re: [Wikitech-l] Please use sitenotice when a new version of software is deployed

2013-12-06 Thread John Vandenberg
On Dec 6, 2013 1:12 PM, Gerard Meijssen gerard.meijs...@gmail.com wrote:

 Hoi,
 Let us be honest indeed. The others are not heard when they scream and
 shout in their little corner of the world..
 Thanks,
  Gerard

How true. Alex asked because of a major upgrade problem on Wikisource which
interrupted almost everyone. See wikisource-l for details.

I think there are times when a community would like a mass notice of some
sort, and it will differ based on the project and community size and the
likely impact of the upgrade.

--
John Vandenberg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Italian Wikipedia complements search results with Wikidata based functionality

2013-12-04 Thread John Vandenberg
It would be nice if Wikidata could store the appropriate pagename for
articles that don't exist yet.

Not very important, as a suitable pagename can be guessed in most cases
using occupation.
On Dec 4, 2013 11:55 PM, Andy Mabbett a...@pigsonthewing.org.uk wrote:

 On 2 December 2013 17:49, Gerard Meijssen gerard.meijs...@gmail.com
 wrote:
  The Italian Wikipedia is the first project where people who use search
  will find results added from Wikidata.

 This is great; and now on pl.WP too.

 What about a big button, in the relevant host language, saying "Start
 a Wikipedia article on this subject" (with a smaller "how to" link
 alongside)?

 The edit window could be pre-populated.

 (I believe the German-language community have also been discussing this
 idea)

 --
 Andy Mabbett
 @pigsonthewing
 http://pigsonthewing.org.uk

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Should MediaWiki CSS prefer non-free fonts?

2013-10-26 Thread John Vandenberg
On Oct 27, 2013 4:19 AM, MZMcBride z...@mzmcbride.com wrote:

 Brad Jorsch (Anomie) wrote:
 On Sat, Oct 26, 2013 at 12:52 PM, MZMcBride z...@mzmcbride.com wrote:
  There's an open question in my mind as to what constitutes a non-free
 font,
 
 In this context, I mean non-free in the context of libre rather than
 gratis.[1]
 
 [1]: https://en.wikipedia.org/wiki/Gratis_versus_libre

 Right. The libre part is what I consider a legal issue, though I think I
 understand more clearly now that you're talking about technical policy
 here.

 There are a number of fonts that can be downloaded for free (gratis)
 but are under terms along the lines of a CC -NC or -ND license, and
 there are more that are distributed with various popular operating
 systems so many people already have them for free in the loosest
 sense. I'm not counting these as free here.

 Thank you for clarifying this point. It might be helpful to have a list of
 gratis/libre fonts and a list of gratis/non-libre fonts, if such lists
 don't exist already.

I started building a list a while ago, for similar reasons, limited to
fonts being used on the projects.

https://meta.wikimedia.org/wiki/Fonts

--
John
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Getting birthday from the sidebar of a wiki page and showing in the home page

2013-10-03 Thread John Vandenberg
On Oct 3, 2013 10:05 PM, Mark A. Hershberger m...@everybody.org wrote:

 On 10/03/2013 09:18 AM, Shrinivasan T wrote:
  Wikidata + JS to the rescue!??
 
  Thanks for the reply.
 
  I am very new to the wikidata.

 Me too.

 Unfortunately I don't see a way to get this information with any API query.

 You can find someone's birthday, but I don't see a way to look for
 entities that are people (P735 -- personal name) with birthdays (P569).

 Mark.

An automated sidebar that includes any entities with a significant date
falling on today's day of the year would be useful for small projects that
don't want to spend lots of time building a hand-picked list.

E.g. the list on Wikisources would include works published on that day of
the year.

--
John
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Improving anti-vandalism tools (twinkle, huggle etc) - suspicious edits queue

2013-09-27 Thread John Vandenberg
On Sep 27, 2013 5:06 PM, Bartosz Dziewoński matma@gmail.com wrote:

 On Fri, 27 Sep 2013 11:51:34 +0200, Petr Bena benap...@gmail.com wrote:

 We are getting somewhere else than I wanted... I didn't want to
 discuss what should be reverted on sight or not. The problem is that right
 now a lot of vandal-fighters see a certain amount of dubious edits which
 they skip because they can't verify whether they are correct or not; these
 are then ignored and get lost in the editing history. That's a fact. This
 problem could be easily solved if these specific edits could be
 highlighted somehow so that they would get the attention of people who
 understand the topic well enough to check if they are OK. But there is
 no such system / mechanism that would allow us to do that. I think
 this is worth implementing somehow because it could significantly
 improve the reliability of encyclopedia content. There is a lot of
 vandalism that remains unnoticed even for months


 Really, you have just described FlaggedRevs. It could be enabled by
default for all articles and would solve all of your problems. Many large
Wikipedias already use it, including pl.wp and de.wp.

And en.wp does have Pending Changes enabled. It would be great to have dev
resources thrown at improving it to resolve the issues preventing wider use.

All wikis have abuse filters, which provide tagging of suspicious edits.
That extension has lots of bugs and feature requests against it; fixing them
will make it more flexible. Does Huggle make use of the abuse filter tags?

--
John Vandenberg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Improving anti-vandalism tools (twinkle, huggle etc) - suspicious edits queue

2013-09-27 Thread John Vandenberg
On Sep 27, 2013 8:18 PM, Derric Atzrott datzr...@alizeepathology.com
wrote:

   Is it possible to
  enable them in reverse-mode so that all edits are flagged as good, but
  editors can flag them as bad? If not, I can't see how it could be
  useful for this purpose...
 
 
  flagging as bad? Do you mean reverting?
 
  I just don't see what you are trying to accomplish. Sorry.
 
 
 I think he's looking to flag as suspect, not to revert as bad. Are you
 really not following, or just not agreeing?
 
 A possibility is to use a maintenance template, like {{cn}} or {{dubious}},
 but this solution shares with using FlaggedRevs for it - which would be a
 great solution - that it might be viewed as negative by the en.wp community.

 I thought FlaggedRevs prevented the newest version of the page from being
shown until it has been approved?

 Flagged Revisions allows for Editor and Reviewer users to rate revisions
of articles and set those revisions as the default revision to show upon
normal page view. These revisions will remain the same even if included
templates are changed or images are overwritten.

 I think he also wants the edit to go through and be visible right away.
 I believe that he is trying to assume good faith in these types of edits.
 Trust, but verify, if you will.

 I'm not entirely sure that FlaggedRevs is the best solution here.

The 'trust, but verify' model is patrolled edits, which nemo mentioned
earlier.

Combine unpatrolled edits with abuse filter tags, and a nice interface like
Huggle, and this sounds like a great tool.

--
John
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Improving anti-vandalism tools (twinkle, huggle etc) - suspicious edits queue

2013-09-27 Thread John Vandenberg
On Fri, Sep 27, 2013 at 8:52 PM, Petr Bena benap...@gmail.com wrote:
 Not really, I can't see how tags help at all here. We are talking
 about any kind of edit (nothing that can be matched by regex) which
 seems suspicious to a vandal-fighter (human) who can't make sure whether
 it's vandalism or not. Neither the abuse filter nor patrolled edits
 can help here (unless we mark every single edit as patrolled and those
 people who see such a suspicious edit would mark it as un-patrolled or
 something like that)

If I understand correctly, you want user-applied tags on revisions,
which is bug 1189.

https://bugzilla.wikimedia.org/show_bug.cgi?id=1189

All edits start out unpatrolled.

If your interface shows the union of unpatrolled edits and a
huggle-user-selected tag (be it tor, abusefilter, or a manually added
tag) ...

'Experts' edit the page if required, and then mark the revision as
patrolled so it no longer appears in the queue.
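The queue described above — the union of unpatrolled edits and tag-matched edits, with patrolling removing items — can be sketched as follows. This is an illustrative sketch, not Huggle's actual implementation; the field names (`revid`, `patrolled`, `tags`) are assumptions modelled loosely on the MediaWiki recentchanges API.

```python
def build_queue(edits, watched_tags):
    """Return revids needing review: unpatrolled, or carrying a watched tag."""
    return [e["revid"] for e in edits
            if not e["patrolled"] or watched_tags & set(e["tags"])]

edits = [
    {"revid": 101, "patrolled": False, "tags": []},
    {"revid": 102, "patrolled": True,  "tags": ["abusefilter-suspicious"]},
    {"revid": 103, "patrolled": True,  "tags": []},
]

watched = {"abusefilter-suspicious", "tor"}
queue = build_queue(edits, watched)
print(queue)  # [101, 102]

# An 'expert' reviews revision 101 and marks it patrolled;
# rebuilding the queue drops it.
edits[0]["patrolled"] = True
print(build_queue(edits, watched))  # [102]
```

In a real tool the `edits` list would come from the API rather than being hard-coded, but the queue logic would be the same.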

--
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] VE: why editing a paragraph opens the whole page?

2013-08-03 Thread John Vandenberg
On Aug 4, 2013 1:31 AM, Tyler Romeo tylerro...@gmail.com wrote:

 I think we can agree that VE has some performance considerations, but if
 you take a look at the bug report, it's explained why it would be so
 incredibly difficult to implement section editing.

Not really. Some sections on some pages are complex, like that example.
Most sections on most pages are bog standard.

Detect the complicated cases and mark those cases non-section-editable, or
mark the set of sections involved as 'must be section edited as one group'.

p.s. Try VE on 26kbps - that is all my mother can obtain currently, unless
she gets satellite, which is too expensive for her - she lives ~10 km from
what is considered to be a large city in Australia. And she is over 70
and has more than 500 edits. She is happy with the wikitext editor (she
learnt it herself from the messy help pages we have) and thinks it is
simpler than word-processing apps (which she has to relearn regularly
because they keep changing every few years).

--
John
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] VE: why editing a paragraph opens the whole page?

2013-08-03 Thread John Vandenberg
On Aug 4, 2013 6:14 AM, David Gerard dger...@gmail.com wrote:

 On 3 August 2013 21:06, John Vandenberg jay...@gmail.com wrote:

  p.s. Try VE on 26kbps - that is all my mother can obtain currently, unless
  she gets satellite, which is too expensive for her - she lives ~10 km from


 I would actually be interested to know if new Javascript, new
 functionality, things like the VE, etc. are routinely tested on slow
 connections. Is there anything like this in place?

I was there yesterday, only briefly, so I didn't have time to do any perf
testing, but it was clearly worse compared to the last time I tried it. I
suspect ULS is playing its part in the problem, but the user doesn't care
which feature is to blame. I am sure there are 'dialup' simulators, but
nothing beats a real modem struggling to avoid dropouts when it's the last
connection on a fair dinkum noisy line.

Just consider what the image insertion dialog is doing, and it is clear the
VE is not being designed to accommodate dialup users at all. Understandable.
Hopefully soon we'll be able to replace stock widgets with low-bandwidth
equivalents.

John
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Voting disabled in bugzilla for some products

2013-07-28 Thread John Vandenberg
I may be mistaken, but I thought voting was enabled for the product
VisualEditor, and now it is not.

Could someone confirm this?

It is currently disabled for:
Commons App
Huggle
openZIM
Parsoid
(Spam)
Tool Labs tools
VisualEditor
Wiki Loves Monuments
WikiLoves Monuments Mobile
Wikimedia Labs
Wikipedia App

Voting provides a way to watch a bug without receiving any bugmail.  It
also lets people give a +1 without adding a comment.

Is the reason for disabling voting a perception that large
numbers of votes magically create new developers?  If so, would it be
re-enabled if the voting interface were relabelled and better
documented? i.e.

https://bugzilla.wikimedia.org/show_bug.cgi?id=34490

--
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] WMF deployment highlights - week of July 29th, 2013

2013-07-26 Thread John Vandenberg
My guess is nlwp was excluded because of high impact bugs, rather than a
non-technical reason.

https://bugzilla.wikimedia.org/show_bug.cgi?id=49400

The high-ish impact bug I know of for German Wikipedia is

https://bugzilla.wikimedia.org/show_bug.cgi?id=51119

(but that is on dab pages, which probably don't receive many edits)
John Vandenberg.
sent from Galaxy Note
On Jul 27, 2013 6:15 AM, Juergen Fenn schneeschme...@googlemail.com
wrote:

 2013/7/26 Greg Grossmeier g...@wikimedia.org:

  == Monday ==
  * VisualEditor will be enabled for non-logged in (anonymous) users on
the following Wikipedias:
  ** German, Spanish, French, Italian, Polish, and Swedish

 I just would like to point out that it would be a sign of respect
 towards the community to postpone the deployment of VE on German
 Wikipedia until the RFC has been held there. AFAICS the majority of
 users rejects VE (and more beta releases that are in the pipeline) at
 this point of time if it comes as an opt-out version only. The German
 community seeks to be treated as the Dutch community.

 https://de.wikipedia.org/wiki/Wikipedia:Meinungsbilder/Software-Updates

 Regards,
 Jürgen.

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Remove 'visualeditor-enable' from $wgHiddenPrefs

2013-07-23 Thread John Vandenberg
On Tue, Jul 23, 2013 at 4:32 PM, Subramanya Sastry
ssas...@wikimedia.org wrote:
 On 07/22/2013 10:44 PM, Tim Starling wrote:

 Round-trip bugs, and bugs which cause a given wikitext input to give
 different HTML in Parsoid compared to MW, should have been detected
 during automated testing, prior to beta deployment. I don't know why
 we need users to report them.


 500+ edits are being done per hour using Visual Editor [1] (less at this
 time given that it is way past midnight -- I have seen about 700/hour at
 times).  I did go and click on over 100 links and examined the diffs.  I did
 that twice in the last hour.  I am happy to report clean diffs on all edits
 I checked both times.

 I did run into a couple of nowiki-insertions which
 is, strictly speaking not erroneous and based on user input, but is more a
 usability issue.

What is a dirty diff?  One that inserts junk unexpectedly, unrelated
to the user's input?

The broken table injection bugs are still happening.

https://en.wikipedia.org/w/index.php?title=Sai_Baba_of_Shirdi&curid=144175&diff=565442800&oldid=565354286

If the parser isn't going to be fixed quickly to ignore tables it
doesn't understand, we need to find the templates and pages with these
broken tables - preferably using SQL and heuristics - and fix them.  The
same needs to be done for all the other wikis; otherwise they are
going to have the same problems happening randomly, causing lots of
grief.

I presume this is also a dirty diff

https://en.wikipedia.org/w/index.php?title=Dubbing_%28filmmaking%29&curid=8860&diff=565438776&oldid=565408739

In addition to nowikis, there are also wikilinks that are not what the
user intended

https://en.wikipedia.org/w/index.php?title=Ben_Tre&curid=1822927&diff=565439119&oldid=561995413
https://en.wikipedia.org/w/index.php?title=Celton_Manx&curid=28176434&diff=565439020&oldid=565436056

Here are three edits attempting to add a section header and a sentence,
with a wikilink in the section header.
(In the process they added other junk into the page, probably unintentionally.)

https://en.wikipedia.org/w/index.php?title=Port_of_Davao&action=history&offset=2013072307&limit=4

A leading line feed in a parameter - what the?

https://en.wikipedia.org/w/index.php?title=List_of_Sam_%26_Cat_episodes&curid=39469556&diff=565437324&oldid=565416618

External links which should be converted to internal links:

https://en.wikipedia.org/w/index.php?title=Krishna_%28TV_series%29&diff=565437699&oldid=564621100

That is all in the last hour, and I've only checked ~100 diffs.

I appreciate that some of these are a result of user input, and the RC
feed of non-VE edits will have similar problems exhibited by newbies,
albeit different because it is the source editor.  And it is great to
see so many constructive VE edits.  But you're not going to get much
love by claiming that it is now stable and not causing broken diffs.
In addition, VE can crash a Google Chrome tab, and it can cause
data loss (unsaved changes) in most browser configurations.

-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] VE and nowiki

2013-07-23 Thread John Vandenberg
On Wed, Jul 24, 2013 at 6:11 AM, Steve Summit s...@eskimo.com wrote:
 VE will (correctly, from its point of view) translate that to
 nowiki[[pagename]]/nowiki in the page source.

 This is a likely enough mistake, and the number of times you
 really want explicit double square brackets is small enough, that
 it's worth thinking about (if it hasn't been already) having VE
 detect and DTRT when a user types that.

There is an enhancement for that.

https://bugzilla.wikimedia.org/show_bug.cgi?id=51897

IMO the current behaviour isn't correct.  There are so few instances
in which nowiki is desirable that the current VE should refuse to accept
wikitext (at least [[ and {{, and maybe ==, # and * at the beginning of a
line, etc.), until at least such time as they have sorted out all the
other bugs causing this to happen unintentionally.  If the user needs
to input [[ or {{ they can use the source editor.  Or the VE could
walk the user through each nowiki and either a) ask the user to
confirm they want the obvious fix done automatically for them, or b)
help them fix the problem.  Before saving.
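A heuristic of the kind suggested above — detecting wikitext syntax in VE input before save — could be sketched like this. This is purely illustrative, not VE's actual logic; the pattern list simply covers the characters named above.

```python
import re

# Illustrative check for wikitext syntax that VE would otherwise escape
# with nowiki: [[ and {{ anywhere, and ==, # or * at the start of a line.
WIKITEXT_PATTERNS = [
    re.compile(r"\[\["),           # wikilink opener
    re.compile(r"\{\{"),           # template opener
    re.compile(r"^==", re.M),      # heading markup at line start
    re.compile(r"^[#*]", re.M),    # list markup at line start
]

def looks_like_wikitext(text):
    """True if the text contains markup a VE user probably meant literally."""
    return any(p.search(text) for p in WIKITEXT_PATTERNS)

print(looks_like_wikitext("See [[Main Page]] for details"))  # True
print(looks_like_wikitext("A plain sentence."))              # False
print(looks_like_wikitext("* item one\n* item two"))         # True
```

On a match, the editor could then prompt the user to confirm or fix the input rather than silently wrapping it in nowiki.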

nowiki is also being inserted at beginning of lines.

https://es.wikipedia.org/w/index.php?title=Yacimiento_de_Son_Forn%C3%A9s&diff=prev&oldid=68561708
https://fr.wikipedia.org/w/index.php?title=Joel_Lindpere&curid=5580501&diff=95038161&oldid=95037498

instead of lists

https://es.wikipedia.org/w/index.php?title=Jorge_Eslava&diff=68542729&oldid=68451155

There are 24 open bugs for 'visualeditor nowiki'

https://bugzilla.wikimedia.org/buglist.cgi?title=Special%3ASearch&quicksearch=visualeditor%20nowiki&list_id=219994

--
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] dirty diffs and VE

2013-07-23 Thread John Vandenberg
On Wed, Jul 24, 2013 at 2:06 AM, Subramanya Sastry
ssas...@wikimedia.org wrote:
 Hi John and Risker,

 First off, I do want to once again clarify that my intention in the previous
 post was not to claim that VE/Parsoid is perfect.  It was more that we've
 fixed sufficient bugs at this point that the most significant bugs (bugs,
 not missing features) that need fixing (and are being fixed) are those that
 have to do with usability tweaks.

How do you know that?  Have you performed automated tests on all
Wikipedia content?  Or are you waiting for users to find these bugs?

 My intention in that post was also not
 one to put some distance between us and the complaints, just to clarify that
 we are fixing things as fast as we can and it can be seen in the recent
 changes stream.

 John: specific answers to the edit diffs you highlighted in your post.  I
 acknowledge your intention to make sure we dont make false claims about
 VE/Parsoid's usability.   Thanks for taking the time for digging them up.
 My answers below are made with an intention of figuring out what the issues
 are so they can be fixed where they need to be.


 On 07/23/2013 02:50 AM, John Vandenberg wrote:

 On Tue, Jul 23, 2013 at 4:32 PM, Subramanya Sastry
 ssas...@wikimedia.org wrote:

 On 07/22/2013 10:44 PM, Tim Starling wrote:

 Round-trip bugs, and bugs which cause a given wikitext input to give
 different HTML in Parsoid compared to MW, should have been detected
 during automated testing, prior to beta deployment. I don't know why
 we need users to report them.


 500+ edits are being done per hour using Visual Editor [1] (less at this
 time given that it is way past midnight -- I have seen about 700/hour at
 times).  I did go and click on over 100 links and examined the diffs.  I
 did
 that twice in the last hour.  I am happy to report clean diffs on all
 edits
 I checked both times.

 I did run into a couple of nowiki-insertions which
 is, strictly speaking not erroneous and based on user input, but is more
 a
 usability issue.

 What is a dirty diff?  One that inserts junk unexpectedly, unrelated
 to the user's input?


 That is correct.  Strictly speaking, yes, any changes to the wikitext markup
 that arose from what the user didn't change.

 The broken table injection bugs are still happening.


 https://en.wikipedia.org/w/index.php?title=Sai_Baba_of_Shirdi&curid=144175&diff=565442800&oldid=565354286

  If the parser isn't going to be fixed quickly to ignore tables it
  doesn't understand, we need to find the templates and pages with these
  broken tables - preferably using SQL and heuristics - and fix them.  The
  same needs to be done for all the other wikis; otherwise they are
  going to have the same problems happening randomly, causing lots of
  grief.


 This may be related to https://bugzilla.wikimedia.org/show_bug.cgi?id=51217
 and I have a tentative fix for it as of yesterday.

Fixes are of course appreciated.  The pace of bugfixes is not the problem ...

 VE and Parsoid devs have put in a lot and lot of effort to recognize broken
 wikitext source, fix it or isolate it,

My point was that you don't appear to be doing analysis of how much of all
Wikipedia content is broken; at least I don't see a public document
listing which templates and pages are causing the parser problems, so
the communities on each Wikipedia can fix them ahead of deployment.

I believe there is a bug about automated testing of the parser against
existing pages, which would identify problems.

I scanned the Spanish 'visualeditor' tag's 50 recent changes earlier
and found a dirty diff, which I believe hasn't been raised in bugzilla
yet.

https://bugzilla.wikimedia.org/show_bug.cgi?id=51909

50 VE edits on eswp is more than one day of recentchanges.  Most of
the top 10 wikis have roughly the same level of testing going on.
That should be a concern.  The number of VE edits is about to increase
on another nine Wikipedias, with very little real impact analysis
having been done.  That is a shame, because the enwp deployment has
provided us with a list of problems which will impact those wikis if
they are using the same syntax, be it weird or broken or otherwise
troublesome.

 and protect it across edits, and
 roundtrip it back in original form to prevent corruption.  I think we have
 been largely successful but we still have more cases to go that are being
 exposed here which we will fix.  But, occasionally, these kind of errors do
 show up -- and we ask for your patience as we fix these.  Once again, this
 is not a claim to perfection, but a claim that this is not a significant
 source of corrupt edits.  But, yes even a 0.1% error rate does mean a big
 number in the absolute when thousands of pages are being edited -- and we
 will continue to pare this down.

Is 0.1% a real data point, or a stab in the dark?  Because I found two
in 100 on enwp; Robert found at least one in 200 on enwp; and I found
1 in 50 on eswp.

 In addition to nowikis, there are also wikilinks

Re: [Wikitech-l] dirty diffs and VE

2013-07-23 Thread John Vandenberg
On Wed, Jul 24, 2013 at 9:02 AM, Subramanya Sastry
ssas...@wikimedia.org wrote:
 On 07/23/2013 05:28 PM, John Vandenberg wrote:

 On Wed, Jul 24, 2013 at 2:06 AM, Subramanya Sastry
 ssas...@wikimedia.org wrote:

 Hi John and Risker,

 First off, I do want to once again clarify that my intention in the
 previous
 post was not to claim that VE/Parsoid is perfect.  It was more that we've
 fixed sufficient bugs at this point that the most significant bugs
 (bugs,
 not missing features) that need fixing (and are being fixed) are those
 that
 have to do with usability tweaks.

 How do you know that?  Have you performed automated tests on all
 Wikipedia content?  Or are you waiting for users to find these bugs?


 http://parsoid.wmflabs.org:8001/stats

 This is the url for our round trip testing on 160K pages (20K each from 8
 wikipedias).

Fantastic!  How frequently are those tests re-run?  Could you add a
last-run-date on that page?

Was a regression testsuite built using the issues encountered during
the last parser rewrite?

--
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] dirty diffs and VE

2013-07-23 Thread John Vandenberg
On Wed, Jul 24, 2013 at 9:02 AM, Subramanya Sastry
ssas...@wikimedia.org wrote:
 http://parsoid.wmflabs.org:8001/stats

 This is the url for our round trip testing on 160K pages (20K each from 8
 wikipedias).

Very minor point .. there are ~400 missing pages on the list; is that
intentional? ;-)

One is 'Mos:time', which is in NS 0 and does actually exist as a
redirect to the WP: Manual of Style:
https://en.wikipedia.org/wiki/Mos:time

...
 But, 99.6% means that 0.4% of pages still had corruptions, and that 15% of
 pages had syntactic dirty diffs.

So 15% is 24000 pages which can bust, but may not if the edit doesnt
touch the bustable part.

Does /topfails cycle through all 24000, 40 pages at a time?

Could you provide a dump of the list of 24,000 bustable pages?  Split
by project?  Each community could then investigate those pages for
broken tables and, more critically, templates which emit broken
wikisyntax that is causing your team grief.

Do you have stats on each of those eight Wikipedias? i.e. are there
noticeable differences in the percentages on different Wikipedias? If
so, can you report those percentages for each project?  I'm guessing
Chinese is an example where the percentages are higher..?

--
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] SSL certificate

2013-01-22 Thread John Vandenberg
On Jan 23, 2013 8:09 AM, Thehelpfulone thehelpfulonew...@gmail.com
wrote:

 On 22 January 2013 19:44, Manuel Schneider manuel.schnei...@wikimedia.ch
wrote:

  Am 22.01.2013 20:42, schrieb Leslie Carr:
   On Tue, Jan 22, 2013 at 11:28 AM, Manuel Schneider
   manuel.schnei...@wikimedia.ch wrote:
   I'd like to notify you that, thanks to the generosity and cooperation of
   the previous owner, wikimania.org and wikimania.com are now owned by
   Wikimedia CH.
  
   The domain is currently set up in the same way as it was before, but we
   now have the possibility to use ssl.wikimania.org for the registrations
   and scholarship tools already hosted by Wikimedia CH.
  
  
   Are you interested in transferring the domain to the WMF which already
   has a SSL cluster and an established procedure for handling new ssl
   certificates?
 
  we can do so, when we have a proper concept how to bring this together
  with the tools hosted at WMCH server now.
  My original inquiry was to get ssl.wikimania.org from WMF which should
  point to the WMCH server. As the response was that we don't own that
  domain that was what I fixed first.
 

 Were the domains not previously set to redirect to the latest wikimania
 wiki?

They were.

I hope that will be fixed quickly.

--
John
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Why are we still using captchas on WMF sites?

2013-01-22 Thread John Vandenberg
There is a Firefox extension to get past the captchas...

-- Forwarded message --
From: Graham Pearce graha...@jazi.net
Date: Wed, Jan 23, 2013 at 5:41 PM
Subject: Re: Fwd: [Wikitech-l] Why are we still using captchas on WMF sites?
To: John Vandenberg jay...@gmail.com


Yes, I have. There isn't an essay about it as such, but in this edit to the help
page about using JAWS:
http://en.wikipedia.org/w/index.php?title=Wikipedia:Using_JAWS&diff=32016&oldid=320037024

I offered to help people use WebVisum, a Firefox extension to get past
the captchas:
http://webvisum.com

It's an invite-only system. Nobody took up my offer, but I notice now
that there's a page to request invitations:
http://www.webvisum.com/en/main/invitationrequest

So maybe it's not so necessary ...

Feel free to pass this reply on to whoever you like.

Graham


On 23/01/2013 12:07 PM, John Vandenberg wrote:

 Have you heard about this?

 Is there a wiki essay about this accessibility problem?


 -- Forwarded message --
 From: Chris Grant chrisgrantm...@gmail.com
 Date: Wed, Jan 23, 2013 at 2:33 PM
 Subject: Re: [Wikitech-l] Why are we still using captchas on WMF sites?
 To: Wikimedia developers wikitech-l@lists.wikimedia.org


 On Wed, Jan 23, 2013 at 3:53 AM, Bawolff Bawolff bawo...@gmail.com wrote:

 Someone should write a browser addon to automatically decode and fill in
 captchas for blind users. (Only half joking)

 Don't joke, I have a blind relative whose screen reader does just that
 (simple captchas only).

 There are other services like http://www.azavia.com/zcaptcha which is
 specifically for the blind, but hell it's probably cheaper to use the same
 captcha-reading services that the spammers do.

 -- Chris

 On Wed, Jan 23, 2013 at 3:53 AM, Bawolff Bawolff bawo...@gmail.com wrote:

 On 2013-01-22 3:30 PM, aude aude.w...@gmail.com wrote:

 On Tue, Jan 22, 2013 at 8:18 PM, Luke Welling WMF 

 lwell...@wikimedia.org

 wrote:

 That was not the end of the problem I was referring to. We know our
 specific captcha is broken at turning away machines. As far as I am

 aware

 we do not know how many humans are being turned away by the difficulty

 of

 it.


 It's at least impossible for blind users to solve the captcha without an
 audio captcha.  (unless they manage to find the toolserver account creation
 thing and are motivated enough to do that)

 I am not convinced of the benefits of captcha versus other spam filtering
 techniques.

 Cheers,
 Katie



 Someone should write a browser addon to automatically decode and fill in
 captchas for blind users. (Only half joking)

 -bawolff
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l



 --
 @wikimediadc / @wikidata
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l





-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Can we help Tor users make legitimate edits?

2012-12-27 Thread John Vandenberg
Add to that list the underlying XFF blocking bug.

https://bugzilla.wikimedia.org/show_bug.cgi?id=23343

Back on topic, is it necessary for good users seeking IP block exemption to
be checkusered? I doubt it.

IP block exemption is rarely given because it allows someone to keep
editing on their main account when a sock is blocked.

Tor exemption should be separate from IP block exemption.

John Vandenberg.
sent from Galaxy Note
On Dec 28, 2012 12:13 PM, Federico Leva (Nemo) nemow...@gmail.com wrote:

 I rather think that devs' time would be best spent ensuring that our tools
 against Tor users and open proxies are effective and reliable. Huge amounts
 of volunteers' time are spent combating abuse of them, with inadequate
 tools.
 See for instance:
 https://bugzilla.wikimedia.org/show_bug.cgi?id=30716
 https://bugzilla.wikimedia.org/show_bug.cgi?id=42438
 https://bugzilla.wikimedia.org/show_bug.cgi?id=8475

 As for legitimate users, probably the most useful thing to do would be
 ensuring that the TorBlock extension shows an understandable error message
 and sends people to a translatable page with instructions valid for all
 language editions of our projects with current policies (most projects
 will have none).

 Nemo

 __**_
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Can we help Tor users make legitimate edits?

2012-12-27 Thread John Vandenberg
On Dec 28, 2012 12:47 PM, John Vandenberg jay...@gmail.com wrote:

 Add to that list the underlying XFF blocking bug.

 https://bugzilla.wikimedia.org/show_bug.cgi?id=23343

 Back on topic, is it necessary for good users seeking IP block exemption
to be checkusered? I doubt it.

 IP block exemption is rarely given because it allows someone to keep
editing on their main account when a sock is blocked.

Hmm. I guess Tor does too ;)

 Tor exemption should be separate from IP block exemption.

..so ignore that comment ;)

 John Vandenberg.
 sent from Galaxy Note

 On Dec 28, 2012 12:13 PM, Federico Leva (Nemo) nemow...@gmail.com
wrote:

 I rather think that devs' time would be best spent ensuring that our
tools against Tor users and open proxies are effective and reliable. Huge
amounts of volunteers' time are spent combating abuse of them, with
inadequate tools.
 See for instance:
 https://bugzilla.wikimedia.org/show_bug.cgi?id=30716
 https://bugzilla.wikimedia.org/show_bug.cgi?id=42438
 https://bugzilla.wikimedia.org/show_bug.cgi?id=8475

 As for legitimate users, probably the most useful thing to do would be
ensuring that the TorBlock extension shows an understandable error message
and sends people to a translatable page with instructions valid for all
language editions of our projects with current policies (most projects
will have none).

 Nemo

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Mobile apps: time to go native?

2012-12-11 Thread John Vandenberg
Small (native) apps can do Wikimedia work quite effectively using the api

Upload image
File categorisation
New page patrol
Flagged revs/Pending changes
OTRS

John Vandenberg.
sent from Galaxy Note
On Dec 12, 2012 7:04 AM, MZMcBride z...@mzmcbride.com wrote:

 Brion Vibber wrote:
  Over on the mobile team we've been chatting for a while about the various
  trade-offs in native vs HTML-based (PhoneGap/Cordova) development.
 
  [...]
 
  iOS and Android remain our top-tier mobile platforms, and we know we can
 do
  better on them than we do so far...
 
  Any thoughts? Wildly in favor or against?

 It's unclear from your e-mail what the goal of mobile interaction (for lack
 of a better term) is. Are you trying to build editing features? File upload
 features? Or do you want to just have a decent reader? This seems like a
 key
 component to any discussion of the future of mobile development.

 Looking at the big picture, I don't think we'll ever see widespread editing
 from mobile devices. The user experience is simply too awful. The best I
 think most people are hoping for is the ability to easily fix a typo,
 maybe,
 but even then you have to assess costs vs. benefit. That is, is it really
 worth paying two or three full-time employees so that someone can easily
 change Barrack to Barack from his or her iPhone? Probably not.

 Perhaps mobile uploading could use better native support, but again, is the
 cost worth it? Does Commons need more low-quality photos? And even as phone
 cameras get better, do those photos need to be _instantly_ uploaded to the
 site? There's something to be said for waiting until you get home to upload
 photos, especially given how cumbersome the photo upload process is
 (copyright, permissions, categorization, etc.). And this all side-steps the
 question of whether there are better organizations equipped at handling
 photos (such as Flickr or whatever).

 That leaves reading. If a relatively recent browser can't read Wikimedia
 wikis without performance issues, I think that indicates a problem with
 Wikimedia wikis (way too much JavaScript, images are too large, etc.).
 Mobile browsers are fairly robust (vastly more robust compared to what they
 used to be), so I'm not sure why having a Wikipedia reader is valuable or
 why it's worth investing finite resources in. Occasionally I'll hear but I
 want to have a favorite pages feature or I want to support offline
 reading,
 but the phone's OS should be able to handle most of this from the built-in
 Web browser. Or not, but I think if a phone is incapable of a feature such
 as e-mail this Web page (article) to a friend, it's not an important
 enough feature to devote resources to.

 Wikimedia wikis have a lot of bugs and the Wikimedia Foundation has finite
 resources. I personally only use the Messages and Music apps on my
 phone, so I'm not the best person to make the argument for additional
 mobile
 development, but when I look at how terrible the user experience continues
 to be for desktop users, it becomes difficult for me to understand why the
 Wikimedia Foundation would delve into the world of mobile (apart from
 initiatives such as Wikipedia Zero). What benefit to the creation or
 dissemination of free educational content are we seeing (or hoping to see)
 from native apps?

 MZMcBride



 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Wikimediauk-l] AdBlock Plus wants to protect you from a malicious website, namely Wikimedia UK

2012-11-26 Thread John Vandenberg
I received the same error when I visited
http://incubator.wikimedia.org/, suggesting that I wanted to go to
incubator.wikipedia.org, and asking if I wanted to block
wikimedia.org.

On Sat, Nov 24, 2012 at 11:46 PM, Andy Mabbett
a...@pigsonthewing.org.uk wrote:
 On 24 November 2012 16:41, Tom Morris t...@tommorris.org wrote:

 AdBlock Plus has a facility to protect users from typosquatting
 (as in, typing poopal.com rather than paypal.com)

 http://cl.ly/image/0M1l1D0K1x0g

 Ouch.

 Does someone want to work out how to complain to the relevant parties?

 Is there no facility to report false positives on their website?

 --
 Andy Mabbett
 @pigsonthewing
 http://pigsonthewing.org.uk

 ___
 Wikimedia UK mailing list
 wikimediau...@wikimedia.org
 http://mail.wikimedia.org/mailman/listinfo/wikimediauk-l
 WMUK: http://uk.wikimedia.org



-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] suggestion: replace CAPTCHA with better approaches

2012-07-25 Thread John Vandenberg
On Thu, Jul 26, 2012 at 5:00 AM, Derric Atzrott
datzr...@alizeepathology.com wrote:
This way if people feel motivated to cheat at captchas they will end up
 helping Wikipedia. It is up to us to try to balance things out.

I'm pretty sure users will be less annoyed at solving captchas that
 actually contribute some value.

 Obligatory XKCD: https://xkcd.com/810/

;-)

 The best CAPTCHAs are the kind that do this.  Look at how hard it is to beat
 reCAPTCHA because they have taken this approach.  One must be careful though
 that the CAPTCHA is constructed such that it won't be as simple as a lookup,
 and will actually require some thought (so that probably eliminates
 the noun, verb, adjective idea).

 This idea has my support.

We should use fewer CAPTCHAs.

If the problem is spam, we should build better new URL review
systems.  There are externally managed spam lists that we could use to
identify spammers.

'New URLs' could be defined as domain names that have not been in the
external links table for more than 24 hrs.

Addition of these new URLs could be smartly throttled.

Edits by non-autoconfirmed users which include 'new URLs' could be
throttled so that each new URL can only be added to a single article for
the first 24 hours.  That allows a new user to make use of a new domain
name unimpeded, however they can only use it on one page for the first 24
hrs.  If the new URL is spam, it will hopefully be removed within 24
hrs, which resets the clock for the spammer, i.e. they can only add
the spam to one page each 24 hrs.
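Roughly, that per-domain throttle could look like the sketch below (all names and the in-memory table are invented for illustration; a real implementation would track this in the database, not in memory):

```python
import time

SEEN_WINDOW = 24 * 3600  # a domain counts as "new" for its first 24 hours

# domain -> (first_seen_timestamp, page the domain was first added to)
_new_domains = {}

def allow_new_url(domain, page, known_domains, now=None):
    """Return True if a non-autoconfirmed edit may add this domain to this page.

    Domains already in the external links table are always allowed; a new
    domain is pinned to the single page it first appeared on for 24 hours.
    Once the window expires (e.g. the link was reverted as spam and
    re-added later), the clock resets on a new page.
    """
    now = time.time() if now is None else now
    if domain in known_domains:
        return True
    first_seen, first_page = _new_domains.get(domain, (None, None))
    if first_seen is None or now - first_seen > SEEN_WINDOW:
        _new_domains[domain] = (now, page)
        return True
    return page == first_page
```

With this shape, a spammer whose link keeps getting reverted can still only reach one page per 24-hour window, which matches the behaviour described above.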

Another idea is for the wiki to ask the user that adds new URLs to
review three recent edits that included new URLs and ask the user to
indicate whether or not the new URL was SPAM and should be removed.
This may be unworkable because the spam-bot could use the linksearch
tool to check whether a link is good or not.

--
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Inactive sysops + improving security

2012-04-13 Thread John Vandenberg
On Fri, Apr 13, 2012 at 3:56 PM, Petr Bena benap...@gmail.com wrote:
 That was just an example, interesting are of course even checkuser
 accounts and such. I don't see any reason why system which notify you
 that someone is trying to login as you is something bad for stewards.

That feature would be awesome.

https://bugzilla.wikimedia.org/show_bug.cgi?id=9838

-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Inactive sysops + improving security

2012-04-12 Thread John Vandenberg
On Thu, Apr 5, 2012 at 12:42 AM, Petr Bena benap...@gmail.com wrote:
 Let me clarify:
 we are talking about accounts which are interesting for hackers such
 as these of stewards.

Really?  You're clearly designing a system targeting inactive sysops
who disappear, but it doesn't seem suitable for inactive stewards who
disappear.
The former happens and nobody notices or cares, whereas the latter
rarely happens without anyone noticing/caring, and never happens for
more than a year because of steward re-elections.

Unless the software is going to poll stewards who have been inactive
for a month or two, it won't be more effective than the current system,
and the current system of steward re-elections isn't replaceable with
your design (or any other software solution).

-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Inactive sysops + improving security

2012-04-04 Thread John Vandenberg
On Wed, Apr 4, 2012 at 5:43 PM, Petr Bena benap...@gmail.com wrote:
 I have seen there are a lot of wikis where people are concerned about
 inactive sysops. They managed to set up a strange rule where sysop
 rights are removed from inactive users to improve the security.
 However the sysops are allowed to request the flag to be restored
 anytime. This doesn't improve security even a bit, as a hacker
 who got into one of the inactive accounts could just post a request
 and get the sysop rights just as if they had hacked an active user.

 For this reason I think we should create a new extension auto sysop
 removal, which would remove the flag from all users who didn't log in
 to the system for some time; if they logged back in, a confirmation
 code would be sent to their email, so that they could reactivate the sysop
 account. This would be much simpler and it would actually make hacking
 into sysop accounts much harder. I also believe it would be nice if the
 system sent an email to the holder of an account when someone makes more than 5
 bad login attempts, so they are warned that someone is likely trying
 to compromise their account.

What happens if the ex-sysop has lost access to their original email
address .. ?

-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Inactive sysops + improving security

2012-04-04 Thread John Vandenberg
On Wed, Apr 4, 2012 at 6:25 PM, Petr Bena benap...@gmail.com wrote:
 On Wed, Apr 4, 2012 at 10:15 AM, John Vandenberg jay...@gmail.com wrote:
 What happens if the ex-sysop has lost access to their original email
 address .. ?


 If the sysop lost their email, they are in the same trouble as any
 other user who lost their email and forgot their password. It simply shouldn't
 happen.

It does happen.

It's a significant problem if it is an ordinary user, but they can
always create a new account and assert that they are the old user and
nobody really cares.

It's a major issue if it is a sysop, because it can be very hard to
verify the identity of someone claiming to be a sysop, so 'crats and
arbcom and the community try and find a web of trust that extends back
in time.  Not fun.

Also, you might want to query the databases to see how many admins
don't have an email address set.  I wouldn't be surprised if there are
a few.  IMO any that don't have an email set should have their sysop
bit removed.

-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Inactive sysops + improving security

2012-04-04 Thread John Vandenberg
On Wed, Apr 4, 2012 at 7:12 PM, Petr Bena benap...@gmail.com wrote:
 Actually sysops could be just required to have the email set in the
 project guidelines. If they don't do that and their account expires,
 they lose the sysop bit. I don't see it as a big deal. I hope that sysops
 are clever enough to at least be able to follow their own rules.

Rules?  Which rules?  There are seven projects which have multiple
languages, and over 282 languages.  Good luck updating all their
policies, and talking to all the sysops who aren't complying with the
policy, etc.

--
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Inactive sysops + improving security

2012-04-04 Thread John Vandenberg
On Wed, Apr 4, 2012 at 7:28 PM, Petr Bena benap...@gmail.com wrote:
 Indeed :-)

 But if I didn't think it's weird, I wouldn't start this. I am always
 trying to find a solution from programmer point of view for a problems
 which community sometimes try to solve by hand.

But the community isn't complaining about this being time consuming.
Where they desysop inactive sysops, it is a social community exercise
and your technical solution is probably not going to have the same
effect.  On English Wikisource we have annual reconfirmations and we
use our brains to gauge whether a sysop is likely to return soon.  The
security aspect is only a small part of it.

-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] HTTPS preference?

2012-03-10 Thread John Vandenberg
On Sun, Mar 11, 2012 at 5:58 AM, MZMcBride z...@mzmcbride.com wrote:
 Hi.

 https://bugzilla.wikimedia.org/show_bug.cgi?id=29898 is about adding a user
 preference for HTTP vs. HTTPS while a user is logged in.

 I'd really like to see this bug resolved, as I regularly encounter HTTP
 links and the lack of auto-redirection is becoming a larger and larger
 usability problem for me. (I don't use HTTPS-Everywhere on my personal
 computer.)

Why isn't HTTPS Everywhere a good solution for you?

-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Foundation-l] COMPLETED: Mailing lists server migration today

2012-01-22 Thread John Vandenberg
Hi Mark,

There are a lot of http links to mail.wiki[pm]edia.org, where mailman
and pipermail used to work.  They are in email footers and headers, and in
Wikipedia.

https://google.com/search?q=%22mail.wikipedia.org%22+-site:lists.wikimedia.org

Thanks,
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Foundation-l] COMPLETED: Mailing lists server migration today

2012-01-22 Thread John Vandenberg
On Mon, Jan 23, 2012 at 4:40 PM, K. Peachey p858sn...@gmail.com wrote:
 This really should be filed in bugzilla.

aye.

https://bugzilla.wikimedia.org/show_bug.cgi?id=33897

-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Foundation-l] COMPLETED: Mailing lists server migration today

2012-01-22 Thread John Vandenberg
Also related:

https://bugzilla.wikimedia.org/show_bug.cgi?id=33898

-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Files with unknown copyright status on MediaWiki wiki

2012-01-03 Thread John Vandenberg
On Wed, Jan 4, 2012 at 10:51 AM, Platonides platoni...@gmail.com wrote:
 Thanks Roan. It's also interesting to learn about that clause.

I agree.  I asked whether WMF had something like this
http://lists.wikimedia.org/pipermail/foundation-l/2011-August/067223.html

And they do!?  very cool.

Is that clause in a publicly accessible document?

-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] special:search/foo redirect to category:foo

2011-10-15 Thread John Vandenberg
On Sat, Oct 15, 2011 at 4:33 AM, Daniel Barrett d...@vistaprint.com wrote:
Is it possible for the special:search logic to take us to [[category:foo]] if 
[[foo]] doesn't exist and [[category:foo]] does?

 This should be an easy wiki extension to write, using the hook 
 SpecialSearchGo($title, $term). Just check for the existence of the article 
 and the category, and modify the title correspondingly.

It looks like there is some code to check for an appropriate page,
with a bit of fuzzy logic.

e.g. there is no enwp page 1964 Brinks Hotel Bombing with a capital
B in Bombing

http://enwp.org/1964_Brinks_Hotel_Bombing

however Special:Search will go directly to the right page

http://enwp.org/Special:Search/1964_Brinks_Hotel_Bombing

I think it would be useful to have an array of namespaces to
automatically redirect to.

On Wikisource it would be nice if the 'Author' namespace was part of
this 'did you mean' automatic redirection.

e.g.

http://enws.org/Special:Search/Mark_Twain

could take the browser directly to

http://en.wikisource.org/wiki/Author:Samuel_Langhorne_Clemens

This would remove the need to have an author=yes parameter on enwp
{{Sister project links}}
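A toy model of that kind of namespace fallback (everything here is invented for illustration; the real logic would go through MediaWiki's Title and search machinery, including redirects like the Mark Twain example above):

```python
def search_go_target(term, existing_pages, namespaces=("", "Category", "Author")):
    """Return the first existing page for a search term, trying each
    namespace prefix in order; None means fall back to a full search."""
    for ns in namespaces:
        title = f"{ns}:{term}" if ns else term
        if title in existing_pages:
            return title
    return None
```

The namespaces tuple would be the per-wiki array suggested above: Wikisource could list 'Author' there, while a wiki that only wants the category fallback would list just 'Category'.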

--
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Giving up on LiquidThreads

2011-10-12 Thread John Vandenberg
The discussion is at

http://sv.wikisource.org/wiki/Wikisource:M%C3%B6tesplatsen#1.18-buggar

and the talk page

http://sv.wikisource.org/wiki/Wikisourcediskussion:M%C3%B6tesplatsen#1.18

I successfully created a LQ thread on my sv.ws talk page. :/

http://sv.wikisource.org/wiki/Anv%C3%A4ndardiskussion:John_Vandenberg

--
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Foundation-l] We need to make it easy to fork and leave

2011-08-13 Thread John Vandenberg
On Sat, Aug 13, 2011 at 4:53 AM, emijrp emi...@gmail.com wrote:
 Man, Gerard is thinking about new methods to fork (in an easy way) single
 articles, sets of articles or complete wikipedias, and people reply about
 setting up servers/mediawiki/importing_databases and other geeky weekend
 parties. That is why there are no successful forks. Forking Wikipedia is
 _hard_.

 People need a button to create a branch of an article or sets of articles,
 and be allowed to re-write and work in the way they want. Of course, the
 resulting articles can't be saved/shown close to the Wikipedia articles,
 but in a new platform. It would be an interesting experiment.

Something like this.. ?

http://wikimedia.org.au/wiki/Proposal:PersonalWikiTool

-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Shell requests backlog

2011-05-17 Thread John Vandenberg
On Tue, May 17, 2011 at 5:37 PM, Brion Vibber br...@wikimedia.org wrote:
 MZ, if you're aware of *any* specific bugs where this is an issue, please
 let me or Mark know and we'll take a look at them.

http://wikisource.org/wiki/WS:BUGS

-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] wiki/ prefix in URL

2011-05-17 Thread John Vandenberg
from foundation-l thread [Fwd: Re: Do WMF want enwp.org?]

On Tue, May 17, 2011 at 11:27 PM, Casey Brown li...@caseybrown.org wrote:
 On Tue, May 17, 2011 at 9:24 AM, Lodewijk lodew...@effeietsanders.org wrote:
 Of course
 unless someone finds a way to redirect en.wikipedia.org/Example to
 en.wikipedia.org/wiki/Example .

 Did you mean to type http://en.wikipedia.org/wiki/Example? You will
 be automatically redirected there in five seconds.

 :-)

 It already redirects there, though we don't want to advertise that we
 have a link shortener because the 404 page redirects.

Is there a good reason for us to always include the '/wiki/' in the URL?
Removing the prefix would save five characters, and I'm guessing that
it would also save a measurable amount of traffic serving 5KB 404
pages.

Is there something else on these virtual hosts other than a few
regexes which are extremely unlikely to be used as page names (i.e.
\/w\/.*\.php)?

-- 
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] How would you disrupt Wikipedia?

2010-12-30 Thread John Vandenberg
On Fri, Dec 31, 2010 at 1:07 AM, David Gerard dger...@gmail.com wrote:
 There is some discussion of how the community and ArbCom enable
 grossly antisocial behaviour on internal-l at present. Admin behaviour
 is enforced by the ArbCom, and the AC member on internal-l has mostly
 been evasive.

Wtf?
ArbCom members are expected to be responsive to discussions about
English Wikipedia occurring on internal-l?
Could you please clarify who you're obliquely attacking here?

--
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Creating a Media handler extension for molecular files ?

2010-11-22 Thread John Vandenberg
Hi Nicolas,

There is a bug open for CML support.

https://bugzilla.wikimedia.org/show_bug.cgi?id=16491

and a tracking bug for other chemistry related improvements

https://bugzilla.wikimedia.org/show_bug.cgi?id=17598

djbeets...@hotmail.com (user:Beetstra) has done some related extension
work and may be keen to help.

--
John Vandenberg

On Tue, Nov 23, 2010 at 8:03 AM, Nicolas Vervelle nverve...@gmail.com wrote:
 Hello,

 I am interested in developing an extension for handling molecular files
 (files containing informations about chemical molecules : atoms, bonds,
 ...).
 If I understand correctly it will enable me to display specific information
 in the File:... page, like what MediaWiki does for simple images.
 Something like existing extensions (FlvHandler, OggHandler,
 PagedTiffHandler, PNGHandler, TimedMediaHandler in SVN trunk for example).


 I have read several pages on MediaWiki about writing extensions, but they
 are not very detailed for media handler extensions.
 I have also written an extension to display and interact with
 molecules (http://wiki.jmol.org/index.php/Jmol_MediaWiki_Extension),
 but I still have several questions on how I can create a handler for
 molecular files in MediaWiki.
 Any help or links to some explanations will be appreciated.


 Molecular files exist in several formats : pdb, cif, mol, xyz, cml, ...
 Usually they are detected as simple MIME types (either text/plain or
 application/xml) by MediaWiki and not as more precise types (even if this
 types exist : chemical/x-pdb, chemical/x-xyz, ...).
 It seems that to register a Media handler, I have to add an entry to
 $wgMediaHandlers[]: $wgMediaHandlers['text/plain'] = 'MolecularHandler';
 Will it be a problem to use such a general MIME type to register the handler
 ? Especially for files of the same MIME type but that are not molecular
 files ?
 Are there some precautions to take into account ? (like letting an other
 handler deal with the file if it's not a molecular file, ...)


 I want to use the Jmol (http://www.jmol.org/) applet for displaying the
 molecule in 3d, and allowing the user to manipulate it.
 But the applet is about 1M in size, so it takes time to load the first time,
 then to start and load the molecular file.
 I would like to start showing a still image (generated on the server) and a
 button to let the user decide when loading the applet if interested in.
 Several questions for doing this with MediaWiki :

   - What hook / event should I use to be able to add this content in the
   File:... page ?
   - Is there a way to start displaying the File:... page, compute the still
   image in the background,and add it in the File:... page after ?
   - Are there any good practices for doing this kind of things ?


 Is it also possible to create thumbnails in articles if they include links
 to a molecular file (like [[File:example.pdb]]) ?
 What hook should I use ?
 Is it possible to compute the thumbnail in the background ?


 Any other advice for writing a media handler extension ?
 Or other possibilities that could enhance the extension ?
 Among the few handler extensions in SVN, which is the better example ?


 Thanks for any help
 Nico
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Skip local file description pages

2010-11-03 Thread John Vandenberg
On Wed, Nov 3, 2010 at 6:41 PM, Bryan Tong Minh
bryan.tongm...@gmail.com wrote:
 On Wed, Nov 3, 2010 at 1:58 AM, Lars Aronsson l...@aronsson.se wrote:
 Could we please just skip local file description pages
 for everybody? Perhaps a reversed gadget could enable
 them back, but the default should be for image links
 to go directly to Commons. Is this possible? Does any
 project already use this? What should I write in the
 site request in Bugzilla?

 There is no such specific function in MediaWiki as far as I know.

It should be possible to do this as a gadget or as part of Common.js,
as is done for 'secure' interwikis.

https://bugzilla.wikimedia.org/show_bug.cgi?id=5440

--
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Firesheep

2010-10-26 Thread John Vandenberg
On Tue, Oct 26, 2010 at 6:24 PM, George Herbert
george.herb...@gmail.com wrote:
..
 But I would prefer to move towards a logged-in user by default goes to
 secure connection model.  That would include making secure a
 multi-system, fully redundantly supported part of the environment, or
 alternately just making https work on all the front ends.

 Any login should be protected.  The casual eh attitude here is
 unprofessional, as it were.  The nature of the site means that this
 isn't something I would rush a crash program and redirect major
 resources to fix immediately, but it's not something to think of as
 desirable and continue propogating for more years.

I agree.  Even if we still do drop users back to http after
authentication, and the cookies can be sniffed, that is preferable to
having authentication over http.

People often use the same password for many sites.

Their password may not have much value on WMF projects ('at worst they
access admin functions'), but it could be used to access their gmail
or similar.

--
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Commons ZIP file upload for admins

2010-10-26 Thread John Vandenberg
On Tue, Oct 26, 2010 at 6:50 AM, Max Semenik maxsem.w...@gmail.com wrote:
 

 Instead of amassing social constructs around technical deficiency, I
 propose to fix bug 24230 [1] by implementing proper checking for JAR
 format. Also, we need to check all contents with antivirus and
 disallow certain types of files inside archives (such as .exe). Once
 we took all these precautions, I see no need to restrict ZIPs to any
 special group. Of course, this doesn't mean that we should allow all the
 safe ZIPs, just several open ZIP-based file formats.

If we only want zip's for several formats, we should check that they
are of the expected type, _and_ that they consist of open file formats
within the zip.

e.g. Office Open XML (the MS format) can include binary files for OLE
objects and fonts (I think)

see Table 2. Content types in a ZIP container

http://msdn.microsoft.com/en-us/library/aa338205(office.12).aspx

OOXML can also include any other mimetype, which are registered
_within_ the zip, and linked into the main content file.

afaics, allowing only safe zips to be uploaded isn't difficult.

Expand the zip, and reject any zip which contains files on
$wgFileBlacklist, and not on $wgFileExtensions + $wgZipFileExtensions.

$wgZipFileExtensions would consist of array('xml')

Then check the mimetypes of the files in the zip, against
$wgMimeTypeBlacklist (with 'application/zip' removed), again allowing
desired XML mimetypes through.

--
John Vandenberg

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Firesheep

2010-10-25 Thread John Vandenberg
On Tue, Oct 26, 2010 at 10:01 AM, Aryeh Gregor
simetrical+wikil...@gmail.com wrote:
 Gmail typically contains things like credit card numbers, passwords,
 maybe state secrets if you pick the right person
..
 at most you could compromise an admin account

.. or an account with checkuser or revdel-suppression, which provides
access to all sorts of goodies like the above.

but anyone with these tools should know to only use the secure site.
Yesterday was a problem: with secure down, we needed to log
into the insecure site in order to handle oversight requests.
I guess I need to change my password now ;-(

--
John Vandenberg



Re: [Wikitech-l] ResourceLoader Debug Mode

2010-09-30 Thread John Vandenberg
On Thu, Sep 30, 2010 at 8:27 AM, Trevor Parscal tpars...@wikimedia.org wrote:
 ...
 When debug mode is off:

    * Modules are requested in batches
    * Resources are combined into modules
    * Modules are combined into a response
    * The response is minified

 When debug mode is on:

    * Modules are requested individually
    * Resources are combined into modules

There are two features here.
  modules
  optimising traffic

It seems everyone realises that the serving of raw files has to remain.

Why is this old and remaining functionality going to be named 'debug mode'?

Why not give the new feature a name, put it in the prefs, and turn
that pref on by default after sufficient road testing, as happened
with the new skin?

There are going to be mediawiki implementers and wikimedia users who
want to disable this feature for reasons other than debugging.

(I've always considered the 'Misc' section of the prefs to be 'Performance')

--
John Vandenberg



Re: [Wikitech-l] [Toolserver-l] Static dump of German Wikipedia

2010-09-23 Thread John Vandenberg
On 9/24/10, Marcin Cieslak sa...@saper.info wrote:
 There are static dumps available here:

 http://download.wikimedia.org/dewiki/

 Is there any problem with using them?

I think they are from June 2008.

A fresh static dump would be good.

--
John Vandenberg



Re: [Wikitech-l] Community vs. centralized development

2010-09-02 Thread John Vandenberg
On Fri, Sep 3, 2010 at 9:05 AM, Aryeh Gregor
simetrical+wikil...@gmail.com wrote:
 ... I scrolled; but agree with the bits I read...
 I don't know how seriously these suggestions will be taken in practice
 by the powers that be, but I hope I've made a detailed and cogent
 enough case to make at least some impact.

There is one very, very simple solution.

All changes should either be:
a) a patch on bugzilla, reviewed by anyone, and approved _on_bugzilla_
by a tech lead
b) the landing of a branch, approved by a tech lead in some public forum.

From that, all else flows.

--
John Vandenberg



Re: [Wikitech-l] [Wikisource-l] Parallel text alignment (was: Push translation)

2010-08-08 Thread John Vandenberg
On Sun, Aug 8, 2010 at 2:10 PM, Lars Aronsson l...@aronsson.se wrote:
 ...
 Is there any good free software for aligning parallel texts and
 extracting translations? Looking around, I found NAtools,
 TagAligner, and Bitextor, but they require texts to be marked
 up already. Are these the best and most modern tools available?

There is a MediaWiki extension which is supposed to provide this:

http://wikisource.org/wiki/Wikisource:DoubleWiki_Extension

It is enabled on all wikisource subdomains.

http://en.wikisource.org/wiki/Crito?match=el

It doesn't work very well because our Wikisource projects have
different layouts, esp. templates such as the header on each page.

--
John Vandenberg



Re: [Wikitech-l] wikipedia is one of the slower sites on the web

2010-08-02 Thread John Vandenberg
On Mon, Aug 2, 2010 at 11:24 PM, Roan Kattouw roan.katt...@gmail.com wrote:
 The resourceloader branch contains work in progress on aggressively
 combining and minifying JavaScript and CSS. The mapping of one
 resource = one file will be preserved, but the mapping of one resource
 = one REQUEST will die: it'll be possible, and encouraged, to obtain
 multiple resources in one request.

Does that approach gain much over HTTP pipelining?

--
John Vandenberg



Re: [Wikitech-l] What is exactly needed for setting up a new language edition of WM project?

2010-08-02 Thread John Vandenberg
On Mon, Aug 2, 2010 at 11:45 PM, Milos Rancic mill...@gmail.com wrote:
 Please, tell me which data you need for setting up a project?
 According to my information, data are:

 * language
 * language code
 * type of project (Wikipedia, Wikinews etc.)
 * logo
 * the name of project in local language
 * the name of the project namespace
 * the name of the project talk namespace

 Anything else needed?

I think Wikisource is now at the stage that any new projects should be
using the Proofread Page extension.  This extension has two
namespaces, 'Index' and 'Page', and they are in the messages, so
translations for them (and the talk namespaces) are necessary.

http://translatewiki.net/w/i.php?title=Special%3ATranslate&task=view&group=ext-proofreadpage&language=fr&limit=100

--
John Vandenberg



Re: [Wikitech-l] wikipedia is one of the slower sites on the web

2010-07-30 Thread John Vandenberg
On Fri, Jul 30, 2010 at 6:23 AM, Aryeh Gregor
simetrical+wikil...@gmail.com wrote:
 On Thu, Jul 29, 2010 at 4:07 PM, Strainu strain...@gmail.com wrote:
 Could you please elaborate on that? Thanks.

 When pages are parsed, the parsed version is cached, since parsing can
 take a long time (sometimes  10 s).  Some preferences change how
 pages are parsed, so different copies need to be stored based on those
 preferences.  If these settings are all default for you, you'll be
 using the same parser cache copies as anonymous users, so you're
 extremely likely to get a parser cache hit.  If any of them is
 non-default, you'll only get a parser cache hit if someone with your
 exact parser-related preferences viewed the page since it was last
 changed; otherwise it will have to reparse the page just for you,
 which will take a long time.

 This is probably a bad thing.

Could we add a logged-in-reader mode, for people who are infrequent
contributors but wish to be logged in for the prefs?

They could be served a slightly old cached version of the page when
one is available for their prefs.  e.g. if the cached version is less
than a minute old.
The down side is that if they see an error, it may already be fixed.
OTOH, if the page is being revised frequently, the same is likely to
happen anyway.  The text could be stale before it hits the wire due to
parsing delay.
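The cache fragmentation described above can be sketched as a toy model
(not MediaWiki's actual parser cache; the preference names below are
made up for illustration):

```python
class ParserCache:
    """Toy model of a parser cache keyed by parser-affecting prefs.
    Users with all-default options share one slot per page; any
    non-default option puts them in a separate, rarely shared slot."""

    PREFS = ("stub_threshold", "date_format", "math_rendering")  # hypothetical names

    def __init__(self, parse):
        self.parse = parse        # the expensive parse function
        self.cache = {}
        self.parses = 0           # how many real parses happened

    def get(self, title, user_options):
        key = (title, tuple(user_options.get(p, "default")
                            for p in self.PREFS))
        if key not in self.cache:  # miss: reparse just for this pref set
            self.parses += 1
            self.cache[key] = self.parse(title)
        return self.cache[key]
```

Two all-default readers hit the same slot, while a reader with any
non-default pref forces a fresh parse of the same page.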

For pending changes, the pref 'Always show the latest accepted
revision (if there is one) of a page by default' could be enabled by
default.  Was there any discussion about the default setting for this
pref?

--
John Vandenberg



Re: [Wikitech-l] Project management and issue tracking; Bugzilla vs Redmine

2010-06-13 Thread John Vandenberg
On Mon, Jun 14, 2010 at 11:29 AM, Rob Lanphier ro...@wikimedia.org wrote:
 Hi everyone,

 Sorry for the email length here.  Here's my quick summary:
 *  Trying to figure out how to do good project management on Bugzilla,
 working with the Bugzilla community to juice it up
 *  Failing that, we'd like to migrate off of it.  Leading candidate: Redmine
 *  Wiki page: http://www.mediawiki.org/wiki/Tracker/PM_tool

fwiw, I added a lot of time/project-management functionality to my
Bugzilla when I was using it in a commercial project.  Some of the
desired functionality was already available as patches on
bugzilla.mozilla.org; I added more, and most of my patches will be
available on bugzilla.mozilla.org.  All of those patches are probably
bitrotten.  I haven't looked at how far the bugzilla codebase has
changed since 2002, but I would be surprised if it was difficult to
rebase these patches.

For example, on mw:Tracker/PM_tool, one dot point is the separation of
tasks, features, etc.  This is bug 9412.

https://bugzilla.mozilla.org/show_bug.cgi?id=9412

I'd be happy to work on a few of these in a few weeks time.

--
John Vandenberg



Re: [Wikitech-l] User-Agent:

2010-02-16 Thread John Vandenberg
On Wed, Feb 17, 2010 at 1:00 PM, Anthony wikim...@inbox.org wrote:
 On Wed, Feb 17, 2010 at 11:57 AM, Domas Mituzas midom.li...@gmail.com wrote:
 Probably everything looks easier from your armchair. I'd love to have that
 view! :)


 Then stop volunteering.

Did you miss the point?

The graphs provided in this thread clearly show that the solution had
a positive and desired effect.

A few negative side-effects have been put forward, such as preventing
browsing without a UA, but Domas has also indicated that other tech
team members can overturn the change if they don't like it.

--
John Vandenberg



Re: [Wikitech-l] User-Agent:

2010-02-16 Thread John Vandenberg
On Wed, Feb 17, 2010 at 2:51 PM, Jamie Morken jmor...@shaw.ca wrote:
 Don't forget some normal traffic was blocked from this unannounced change, 
 ie. Google's translate service?  How much of the traffic reduction was from 
 services like this?  Some of the cited reduced traffic proving the strategies 
 success is coming from valid services.  Be careful or soon you will be 
 saying: you are either with wikimedia or with the terrorists? :)

With this solution, it is now possible to determine how much of the
traffic was from valid services.  i.e. google translate and other
useful services will identify themselves, and the traffic graph will
rise accordingly.

Note that I am not in favour of the solution, as it breaks backwards
compatibility with tools.  Unannounced breakages are especially nasty.
 e.g. it could have been quite easy to shoot an email to Google asking
that they add a user-agent for the unidentified traffic.

I am even less in favour of Domas retiring to an armchair, and think
that anyone suggesting that is deluding themselves about Wikimedia's
need of Domas, and Domas' reason for volunteering.

--
John Vandenberg



Re: [Wikitech-l] Chrome frame support?

2010-02-03 Thread John Vandenberg
On Thu, Feb 4, 2010 at 8:17 AM, Nimish Gautam ngau...@wikimedia.org wrote:
 Hey all,

 I really want to turn on Google chrome frame support for the mediawiki
 projects:

 http://code.google.com/chrome/chromeframe/

 Basically, it would just involve us putting in a meta tag on our pages
 that would trigger an IE plugin Google wrote, assuming the IE user had
 that plugin installed. The plugin essentially causes IE to use google's
 HTML renderer and JS engine, which are much nicer to develop for than
 IE. This won't really solve IE development issues, but would be a good
 move in the right direction as far as I'm concerned.

 Any thoughts or compelling reasons why this might not be a good thing to do?

The issues log at the bottom of the developer's guide indicates this is
not ready for prime time; i.e. printing doesn't work.

http://code.google.com/chrome/chromeframe/developers_guide.html

More fundamentally, the _user_ should control when a different
renderer is used for websites; not the website!  Currently this can be
done with registry keys.

HKCU\Software\Google\ChromeFrame\OptInUrls

Hopefully they add a GUI to configure the OptIn list, and the ability
to import a whitelist like Adblock and IETab.

fwiw, Mozilla-in-IE was at a similar stage three years ago.

http://starkravingfinkle.org/blog/2006/12/xule-what-if/

--
John Vandenberg



Re: [Wikitech-l] Google phases out support for IE6

2010-02-03 Thread John Vandenberg
On Thu, Feb 4, 2010 at 9:11 AM, Trevor Parscal tpars...@wikimedia.org wrote:
..
 I think that's obvious. In all seriousness though, we may want to
 consider the possibility of letting users know about features they are
 missing out on. In the case of the UsabilityInitiative work I'm doing,
 IE6 hasn't really been a problem since we ditched it long ago. Users of
 IE6 won't even know how wonderful things might be on the other side of
 the upgrade.

IE 6.0 was the third most popular browser in November 2009, with 12%
of the market share.

http://stats.wikimedia.org/wikimedia/squids/SquidReportClients.htm

A fully patched Windows 2000 machine only upgrades as far as IE 6.x;
IE7 and IE8 are not available for W2K.  Of course, W2K users can
migrate to Firefox, so they do have options; however, IE6 is
sufficiently functional for wikitext editing and reading.

Is the UsabilityInitiative ignoring IE 6.0?

--
John Vandenberg



Re: [Wikitech-l] Google phases out support for IE6

2010-01-31 Thread John Vandenberg
On Sun, Jan 31, 2010 at 8:28 PM, Thomas Dalton thomas.dal...@gmail.com wrote:
 On 31 January 2010 05:07, John Vandenberg jay...@gmail.com wrote:
 An important distinction is that IE for Mac users on Mac OS (classic)
 don't have the copious upgrade options available to IE 5.5 users on
 Windows.

 I don't think we need to worry about Mac OS classic:
 http://stats.wikimedia.org/wikimedia/squids/SquidReportOperatingSystems.htm

 If I'm reading that right, only 0.02% of users are using Mac OS
 classic (although I suppose there could be some more that have been
 grouped into Other, but that won't be many).

That is still at least 1000 pageviews per month.  If it does not
degrade gracefully, we would be telling them to buy a new computer in
order to view Wikipedia.  How many contributors will be lost in the
process?  Only developers can tell us how many of those 1000 pageviews
were from logged-in users.

We create projects for languages that have only ~6000 speakers _in
total_ (e.g. the Creek language).

I don't see any IE 5.1 or 5.2 in this list:

http://stats.wikimedia.org/wikimedia/squids/SquidReportClients.htm

 Most IE5.2 users should have upgraded to iCab.  Do we have good iCab support?

 IE to iCab isn't really an upgrade... it's a completely different
 browser, isn't it?

iCab was a supported browser long after IE for Mac was dropped.

--
John Vandenberg



Re: [Wikitech-l] Google phases out support for IE6

2010-01-31 Thread John Vandenberg
On Mon, Feb 1, 2010 at 10:45 AM, Ryan Kaldari kald...@gmail.com wrote:
 This entire discussion about support for IE 5 Mac is pointless. IE for
 Mac used a completely different rendering engine than IE for Windows
 and was actually (pretty-much) standards compliant. We don't need to
 worry about what browser people on Mac Classic should migrate to for
 viewing Wikipedia. I'm reasonably confident that Wikipedia would
 render fine in IE 5 for Mac (or iCab, or Classilla).

In this thread, Chad suggested that the IE for Mac stylesheet could be
phased out.

--
John Vandenberg



Re: [Wikitech-l] Google phases out support for IE6

2010-01-31 Thread John Vandenberg
On Mon, Feb 1, 2010 at 2:37 PM, Ryan Kaldari kald...@gmail.com wrote:
 Who said anything about dropping support for basic use? I would be
 surprised if you could tell much difference on Mac IE between
 Wikipedia with the custom stylesheet and without. But perhaps you
 could give it a try and report back to us, rather than relying on my
 wild speculations :)

Is this file unused?

http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/skins/monobook/IEMacFixes.css?revision=11898&view=markup

It seems that the IE for Mac style is littered through:

http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/skins/monobook/main.css?revision=56937&view=markup

... with an interesting comment:

/*
** IE/Mac fixes, hope to find a validating way to move this
** to a separate stylesheet. This would work but doesn't validate:
** @import(IEMacFixes.css);
*/

Would it be acceptable to use a conditional include?

http://stopdesign.com/examples/ie5mac-bpf/

Alternatively, we could load the stylesheet at runtime with
JavaScript.  That approach is already being used for IE specific
JavaScript fixes.

--
John Vandenberg



Re: [Wikitech-l] 100% open source stack (was Re: Bugzilla Vs other trackers.)

2010-01-09 Thread John Vandenberg
On Sat, Jan 9, 2010 at 3:49 PM, Tim Starling tstarl...@wikimedia.org wrote:
 John Vandenberg wrote:
 On Sat, Jan 9, 2010 at 12:10 PM, Tim Starling tstarl...@wikimedia.org 
 wrote:
 Platonides wrote:
 What were the reasons for replacing lighttpd with Sun Java System Web
 Server ?
 Probably the same reason that the toolserver uses Confluence instead
 of MediaWiki.

 It only contains one page, which points to the MediaWiki wiki.

 https://confluence.toolserver.org/pages/listpages-dirview.action?key=main

 I count 65 pages.

 https://confluence.toolserver.org/pages/listpages-dirview.action?key=tech

 Maybe you were confused by the unfamiliar UI.

Thanks Tim.  I should know Confluence better; we use it at work.  sigh.

 Are there plans to make greater use of the Confluence wiki?

 Certainly not.

Good to hear.

--
John Vandenberg



Re: [Wikitech-l] 100% open source stack (was Re: Bugzilla Vs other trackers.)

2010-01-08 Thread John Vandenberg
On Sat, Jan 9, 2010 at 12:10 PM, Tim Starling tstarl...@wikimedia.org wrote:
 Platonides wrote:
 What were the reasons for replacing lighttpd with Sun Java System Web
 Server ?

 Probably the same reason that the toolserver uses Confluence instead
 of MediaWiki.

It only contains one page, which points to the MediaWiki wiki.

https://confluence.toolserver.org/pages/listpages-dirview.action?key=main

Are there plans to make greater use of the Confluence wiki?

https://wiki.toolserver.org/view/Domains#confluence.toolserver.org

--
John Vandenberg



Re: [Wikitech-l] Fwd: [Foundation-l] Security holes in Mediawiki

2009-09-15 Thread John Vandenberg
On Wed, Sep 16, 2009 at 9:26 AM, Anthony wikim...@inbox.org wrote:
 On Tue, Sep 15, 2009 at 7:17 PM, Andrew Garrett agarr...@wikimedia.org wrote:
 I think the appropriate expression here is put up or shut up.

 If you are aware of unpatched security vulnerabilities in MediaWiki,
 report them to secur...@wikimedia.org, and to this list if you don't
 receive a response, and they will be immediately patched.


 If you want to offer some sort of bounty program, then maybe.  Otherwise, no
 thanks.

If you don't believe in responsible disclosure of attacks without
being paid, perhaps you will at least publicly disclose the problem.

If you do neither, you're well on your way to being a bottom-feeder.

--
John Vandenberg



Re: [Wikitech-l] how to chang {{SITENAME}}

2009-09-01 Thread John Vandenberg
On Tue, Sep 1, 2009 at 7:06 PM, Roan Kattouw roan.katt...@gmail.com wrote:
 2009/9/1 Domas Mituzas midom.li...@gmail.com:
 Enwiki will be really really sad by not have pagetitle constantly
 translated by translatewiki :)
 That's not the point. If a message is customized by replacing
 {{SITENAME}} with Wikipedia, updates to *that message* (not to the
 site name) from TranslateWiki (or changes to the message in
 MessagesEn.php , in the case of English) will not get through because
 the local customization containing the old version overrides it.

The sysadmins are only doing this on wikis with high traffic, which
means that there is an active community who can ensure that the
messages are up to date.

Domas suggested the top 20 wikis, but I think the top 10 of these
should be the wikis where optimisation takes precedence over
translations.

http://meta.wikimedia.org/wiki/List_of_Wikipedias

--
John Vandenberg



Re: [Wikitech-l] how to chang {{SITENAME}}

2009-08-31 Thread John Vandenberg
2009/8/31 Domas Mituzas midom.li...@gmail.com:
 Hello,
 For example, [[MediaWiki:Aboutsite]] has the default message About
 {{SITENAME}}. But we have overwritten it with
 ウィキペディアについて (a translation of about Wikipedia).

 Performance-wise it is even better, if all main messages which have
 {{SITENAME}} get replacements with literals. Otherwise you're adding
 up 5ms of page load time to each page. :)

It would be good if that was documented ;-P

http://www.mediawiki.org/wiki/Manual:Interface/Aboutsite

I asked about this a while ago.

http://www.mediawiki.org/wiki/Project:Support_desk/Archives/Miscellaneous/007#tuning_MediaWiki:Aboutsite

--
John Vandenberg



Re: [Wikitech-l] Watchlistr.com, an outside site that asks for Wikimedia passwords

2009-07-22 Thread John Vandenberg
On Thu, Jul 23, 2009 at 9:57 AM, Aryeh
Gregor simetrical+wikil...@gmail.com wrote:
 On Wed, Jul 22, 2009 at 10:40 PM, Happy-melon happy-me...@live.com wrote:
 I have a Greasemonkey script that does this, IMO, very nicely. I'm not 100%
 sure how GM script distribution works, but can't a server put files in a
 particular directory to have them be automatically suggested for
 installation by Greasemonkey?

Greasemonkey will try to install any file whose name ends in .user.js
and contains the ==UserScript== metadata block.

Where is this script?  I couldn't find it on userscripts.org or here:

http://en.wikipedia.org/wiki/Wikipedia:Tools/Greasemonkey_user_scripts

 Greasemonkey is far from ideal.  It only works on the computer you
 install it on, and only works for Firefox users.

That depends on how complex the script is; it could be turned into a
bookmarklet, and many other browsers support user-scripts.

http://en.wikipedia.org/wiki/Greasemonkey

--
John Vandenberg



Re: [Wikitech-l] Dump status update

2009-03-09 Thread John Vandenberg
On Mon, Mar 9, 2009 at 6:09 PM, Andrew Garrett and...@werdn.us wrote:
 On Mon, Mar 9, 2009 at 2:36 PM, John Vandenberg jay...@gmail.com wrote:
 Has the dumper been tweaked to remove all hidden revisions, including
 hidden usernames recently fixed in bug 17792?

 That was a bug in contributions, not in deleted revisions. There is no
 reason to expect that that bug has any relevance to dumps.

I would like to see some affirmative statement that the dump routine
has been improved in light of the various items now being suppressed
with RevisionDelete, as it is being used to remove items which would
previously have been removed via Oversight.

I'm asking because we did have problems with this on the toolserver.

--
John Vandenberg



Re: [Wikitech-l] Dump status update

2009-03-08 Thread John Vandenberg
On Wed, Feb 25, 2009 at 3:58 AM, Brion Vibber br...@wikimedia.org wrote:
 Quick update on dump status:

 * Dumps are back up and running on srv31, the old dump batch host.

 Please note that unlike the wikis sites themselves, dump activity is
 *not* considered time-critical -- there is no emergency requirement to
 get them running as soon as possible.

 Getting dumps running again after a few days is nearly as good as
 getting them running again immediately. Yes, it sucks when it takes
 longer than we'd like. No, it's not the end of the world.


 * Dump runner redesign is in progress.

 I've chatted a bit with Tim in the past on rearranging the architecture
 of the dump system to allow for horizontal scaling, which will make the
 big history dumps much much faster by distributing the work across
 multiple CPUs or hosts where it's currently limited to a single thread
 per wiki.

 We seem to be in agreement on the basic arch, and Tomasz is now in
 charge of making this happen; he'll be poking at infrastructure for this
 over the next few days -- using his past experience with distributed
 index build systems at Amazon to guide his research -- and will report
 to y'all later this week with some more concrete details.

Has the dumper been tweaked to remove all hidden revisions, including
hidden usernames recently fixed in bug 17792?

--
John Vandenberg



Re: [Wikitech-l] Extension:Pdfhandler

2008-12-25 Thread John Vandenberg
On Fri, Dec 26, 2008 at 5:40 AM, Luiz Augusto lugu...@gmail.com wrote:
 On Thu, Dec 25, 2008 at 3:52 PM, Ilmari Karonen nos...@vyznev.net wrote:

 Luiz Augusto wrote:
 
  I'm asking it because I've approximately 30GB of public domain scans in
 .pdf
  format to upload on Commons on the next months (see
 
 http://en.wikisource.org/w/index.php?oldid=928004#Royal_Society_Digital_Archive_only_for_3_Months_FREE
  for further information on it) and because I fully agree to the reasons
  listed on https://bugzilla.wikimedia.org/show_bug.cgi?id=11215#c3

 Assuming that these are scanned documents that haven't been vectorized,
 have you considered converting them to DjVu format?  Not only does
 Wikimedia currently have better support for it than PDF, but you might
 realize some file size savings.  Apparently, there's software out there
 to more or less automate it.

Large batches of scans should be converted to DjVu, as it is a better
format.  PDF support will be useful for the small tasks where the
person already has a PDF (or it is already uploaded onto Commons) and
doesn't want to learn lots of tools before they start seeing results,
i.e. PDF support will make Wikisource more accessible.

 Someone asked it on en.wikisource and I've replied with this:
 http://en.wikisource.org/w/index.php?title=Wikisource:Scriptorium&diff=prev&oldid=928130

 DjVu (or at least all conversion tools/configuration options that I've tried
 in the past months, including the LizardTech Document Express Enterprise
 pdf2djvu and png2djvu options) is a lossy format. If I convert a .pdf
 downloaded from Google Book Search I will get a low quality file (70 dpi or
 150 dpi per page), but if I extract the images from the same .pdf file using
 Adobe Acrobat Pro 8 I will get a 600 dpi jpeg for each page (OCR
 software normally recommends using 300 dpi images).

My understanding is that the compression is optional, and the lossy
compression is much better than the equivalent lossy compression of
PDF.

I think it is the free PDF-to-image extraction tools that are causing
your problems.

 Of course, that doesn't in any way preclude or remove the need for
 _also_ improving our PDF support.


 Surely :)


 But PDF, as common and useful as it
 is, might not be the optimal format here.


 Well, all digitized works from all libraries that I known (from Europe,
 United States and Brazil) are avaiable only in .pdf file format. The
 Internet Archive is the only one to make avaiable both .pdf and .djvu for
 the same book (the .djvu version from IA is also a low quality file, but it
 at least is delivered with a high-quality OCR embedded at the .djvu file due
 to some closed-source and pay OCR software [Abbyy FineReader, I believe]).

I have found the djvu files from IA to be of an appropriate quality,
especially for transcription purposes.  The PDFs are usually much
larger, and not much better quality.

--
John Vandenberg
