Re: [Wikitech-l] Improving anti-vandalism tools (twinkle, huggle etc) - suspicious edits queue

2013-09-26 Thread Arcane 21
That idea sounds like something that could already be done by the FlaggedRevs
extension.

Given that many of those suspicious edits can be extremely subtle, such as minor
changes to mathematical equations or statistics, articles with lots of potential
for that type of subtle vandalism would probably be better handled by requiring
approval through FlaggedRevs and letting trusted editors (such as members of a
WikiProject based in those fields) review the edits.

I doubt Twinkle or Huggle would be ideal for such vandalism: it would be easy to
mistake legitimate edits for vandal edits, and automated vandalism
detection/reversion processes would generally have a poor margin of error on
edits that subtle.
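
For illustration, a rough LocalSettings.php sketch of the kind of setup I have
in mind; this is only a sketch, and the variable and group names below are my
recollection of the FlaggedRevs configuration, so check them against the
extension's documentation:

  // Load FlaggedRevs (2013-era include style).
  require_once "$IP/extensions/FlaggedRevs/FlaggedRevs.php";

  // Limit flagging to the namespaces where subtle vandalism is a concern.
  $wgFlaggedRevsNamespaces = array( NS_MAIN );

  // Let a group of trusted subject-matter editors review pending changes.
  $wgGroupPermissions['reviewer']['review'] = true;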

 Date: Thu, 26 Sep 2013 15:06:47 +0200
 From: benap...@gmail.com
 To: wikitech-l@lists.wikimedia.org
 Subject: [Wikitech-l] Improving anti-vandalism tools (twinkle, huggle etc) - 
 suspicious edits queue
 
 Hi,
 
 I noticed that there is a large number of suspicious edits that may be
 vandalism but were never reverted, because the people who were dealing with
 vandals (using some automated tool) at that moment weren't able to
 decide whether they were vandalism or not. For example: clever changes
 to statistical data, dates, football scores, changes that look odd
 but aren't clearly vandalism, etc. These edits should be reviewed by
 an expert on the topic, but at the moment they aren't collected
 anywhere.
 
 I think we should create a new service (on Tool Labs?) that would
 allow these tools to insert such edits into a queue (or database) of
 suspicious edits for later review. This categorized database/queue
 could then be browsed by people who are experts on the given topics,
 and the edits reviewed or reverted by them.
 
 The database would need to be periodically scanned and all changes
 that were reverted would need to be removed from it. The people who
 reviewed the edits could also flag them as ok.
 
 This way we could improve the effectiveness of anti-vandalism tools by
 catching the edits that are ignored or skipped these days.
 
 Any suggestions or ideas on how to implement such a feature?
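
A rough, hypothetical sketch of what such a queue could look like as a small
Tool Labs database; the schema, database name, credentials and sample values
below are all invented for illustration:

  <?php
  // Connect to a (made-up) tool database that holds the shared queue.
  $db = new PDO( 'mysql:host=tools-db;dbname=s12345__suspect_edits', 'tooluser', 'secret' );

  // One row per suspicious revision, tagged by topic so that subject-matter
  // experts can browse only their own area.
  $db->exec( 'CREATE TABLE IF NOT EXISTS suspicious_edits (
      rev_id     INT UNSIGNED NOT NULL PRIMARY KEY,
      page_title VARBINARY(255) NOT NULL,
      topic      VARBINARY(64) NOT NULL,
      reported   TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
      status     ENUM("open", "ok", "reverted") NOT NULL DEFAULT "open"
  )' );

  // An anti-vandalism tool files a revision it could not classify.
  $insert = $db->prepare(
      'INSERT IGNORE INTO suspicious_edits (rev_id, page_title, topic) VALUES (?, ?, ?)'
  );
  $insert->execute( array( 576104321, '2013-14 Premier League', 'football' ) );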
 

[Wikitech-l] A message for everyone on the Wikitech mailing list

2013-09-26 Thread Arcane 21
Sumana Harihareswara recently provided me with assistance on some code I wanted
to write and invited me to join the wikitech mailing list. She also suggested I
share my response to her with the rest of the list subscribers, which I have
reproduced (with some mild alterations) below:


Thanks for the mailing list link, Sumana; I just submitted a subscription request.

As for those links you provided, I have bookmarked them and will be studying
them extensively. Your assistance is greatly appreciated. :)

As for my wiki (https://mediawikitesters.orain.org/wiki/Main_Page), I like to
test gadgets, user scripts, and extensions, and the wiki is basically a place
for me and anyone else who is interested to display useful code, test it in a
live environment, and hopefully swap notes on how to improve it. The wiki is
still rather new, but anyone interested is welcome.

As for the gadgets section of my wiki, I find gadgets incredibly useful and am
trying to compile as many as I can that are universally useful and adaptable to
any MediaWiki installation. I prefer code that works out of the box and is as
compatible as possible, especially with the more recent versions of MediaWiki,
and my goal is to make the collection available to anyone for any wiki.

Most are from WMF projects or have been adapted from them to be useful on any
wiki.
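
For anyone who wants to reuse such gadgets on their own wiki, this is roughly
how a portable gadget gets wired up, assuming the standard Gadgets extension
(the gadget name and file names are placeholders):

  // LocalSettings.php: enable the Gadgets extension (2013-era include style).
  require_once "$IP/extensions/Gadgets/Gadgets.php";

  // The gadget itself is registered on-wiki rather than in PHP, by adding a
  // definition line to [[MediaWiki:Gadgets-definition]], for example:
  //   * ExampleGadget[ResourceLoader]|ExampleGadget.js|ExampleGadget.css
  // with the code living at [[MediaWiki:Gadget-ExampleGadget.js]] and
  // [[MediaWiki:Gadget-ExampleGadget.css]].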

Re: [Wikitech-l] Can we help Tor users make legitimate edits?

2013-09-27 Thread Arcane 21
I like this idea. Not every Tor user is a vandal or a troll, and assuming that
all of them are by default is not assuming good faith. Some people are just
really paranoid about their internet anonymity or live in restrictive countries
(both of which I sympathize with), so this idea would let them edit in good
faith while the vandal and troll edits get filtered out. It would also be a good
idea to apply this to certain IP ranges, such as government or office buildings,
for similar reasons.
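
On the quoted idea of granting the IP block exemption automatically after a
certain number of edits, a rough LocalSettings.php sketch of how the automatic
part could work; the group name is made up, the conditions are standard
MediaWiki autopromotion ones, and note that core only counts edits, not
unreverted edits:

  // Carry the ipblock-exempt right on a dedicated group...
  $wgGroupPermissions['torexempt']['ipblock-exempt'] = true;

  // ...and hand that group out automatically to established accounts.
  $wgAutopromote['torexempt'] = array( '&',
      array( APCOND_EDITCOUNT, 200 ),     // at least 200 edits
      array( APCOND_AGE, 30 * 86400 ),    // account at least 30 days old (seconds)
  );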

 Date: Fri, 27 Sep 2013 19:40:53 -0400
 From: suma...@wikimedia.org
 To: wikitech-l@lists.wikimedia.org; e...@dymaxion.org
 Subject: Re: [Wikitech-l] Can we help Tor users make legitimate edits?
 
 This is a quick followup to
 http://www.gossamer-threads.com/lists/wiki/wikitech/323006 and partly in
 keeping with the anti-vandalism discussion at
 http://www.gossamer-threads.com/lists/wiki/wikitech/392727 as well.
 
 On 12/27/2012 07:26 PM, Sumana Harihareswara wrote:
  TL;DR: A few ideas follow on how we could possibly help legit editors
  contribute from behind Tor proxies
 [snip]
  4) Allow more users the IP block exemption, possibly even automatically
  after a certain number of unreverted edits, but with some kind of
  FlaggedRevs integration; Tor users can edit but their changes have to be
  reviewed before going live.  We could combine this with (3); Nymble
  administrators or token-issuers could pledge to review edits coming from
  Tor. But that latter idea sounds like a lot of social infrastructure to
  set up and maintain.
 
 From talking to Eleanor Saitta: could we do FlaggedRevs by IP space,
 and/or by the intersection of IPs and topic space? Basically, let people
 edit from Tor IPs (and/or whitelist or blacklist categories) as long as
 those go through a FlaggedRevs-type process?  And we could also do
 FlaggedRevs on specific IP ranges, like blocks that are known to be
 certain government office buildings.
 
 -- 
 Sumana Harihareswara
 Engineering Community Manager
 Wikimedia Foundation
 

Re: [Wikitech-l] Advance notice: I'm taking a sabbatical October-December

2013-09-27 Thread Arcane 21
Good luck and have fun, Sumana! :)

 Date: Fri, 27 Sep 2013 20:31:37 -0400
 From: suma...@wikimedia.org
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] Advance notice: I'm taking a sabbatical 
 October-December
 
 Well, off I go. I'm about to unsubscribe from this list for the first
 time in years. Looking forward to returning in January with MORE POWERS.
 
 -Sumana
 P.S. I'm sure you can find my personal email address if you need to. :)
 
 On 09/04/2013 09:47 PM, Sumana Harihareswara wrote:
  Thanks, Oren! If I am able to work on pywikipediabot or other Wikimedia
  tech and I blog about it, I'll syndicate to
  http://en.planet.wikimedia.org . I may cross-post additional thoughts to
  http://geekfeminism.org/ . If people want a summary IRC office
  hour/video tech talk after I come back, I could do that too.
  -Sumana
  
  On 08/31/2013 07:57 AM, Oren Bochman wrote:
  Congratulations - I hope you get to make many new bugs and even do some
  cool coding outside the WMF!
  I'm wondering if you'll update us about this unique experience as time
  allows?
 
 
 
  On Wednesday, August 28, 2013, Sumana Harihareswara wrote:
 
  I've been accepted to Hacker School https://www.hackerschool.com, a
  writers' retreat for programmers in New York City. I will therefore be
  taking an unpaid personal leave of absence from the Wikimedia Foundation
  via our sabbatical program. My last workday before my leave will be
  Friday, September 27. I plan to be on leave all of October, November,
  and December, returning to WMF in January.
 
  During my absence, Quim Gil will be the temporary head of the
  Engineering Community Team. Thank you, Quim! I'll spend much of
  September turning over responsibilities to him. Over the next month I'll
  be saying no to a lot of requests so I can ensure I take care of all my
  commitments by September 27th, when I'll be turning off my wikimedia.org
  email.
 
  If there's anything else I can do to minimize inconvenience, please let
  me know. And -- I have to say this -- oh my gosh I'm so excited to be
  going to Hacker School in just a month! Going from advanced beginner
  to confident programmer! Learning face-to-face with other coders, 30-45%
  of them women, all teaching each other! Thank you, WMF, for the
  sabbatical program, and thanks to my team for supporting me on this. I
  couldn't do this without you.
 
  --
  Sumana Harihareswara
  Engineering Community Manager
  Wikimedia Foundation
 
 

Re: [Wikitech-l] Can we help Tor users make legitimate edits?

2013-09-27 Thread Arcane 21
Hmm, I can see your point. FlaggedRevs would be as much of a hindrance to
regular users as it would be to Tor users. I still think it should be
permissible for Tor editors to submit legitimate edits in some way, but your
points about the AGF policy and the purpose of FlaggedRevs are duly noted.

 Date: Fri, 27 Sep 2013 21:05:29 -0400
 From: risker...@gmail.com
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] Can we help Tor users make legitimate edits?
 
 On 27 September 2013 19:40, Sumana Harihareswara suma...@wikimedia.orgwrote:
 
  This is a quick followup to
  http://www.gossamer-threads.com/lists/wiki/wikitech/323006 and partly in
  keeping with the anti-vandalism discussion at
  http://www.gossamer-threads.com/lists/wiki/wikitech/392727 as well.
 
  On 12/27/2012 07:26 PM, Sumana Harihareswara wrote:
   TL;DR: A few ideas follow on how we could possibly help legit editors
   contribute from behind Tor proxies
  [snip]
   4) Allow more users the IP block exemption, possibly even automatically
   after a certain number of unreverted edits, but with some kind of
   FlaggedRevs integration; Tor users can edit but their changes have to be
   reviewed before going live.  We could combine this with (3); Nymble
   administrators or token-issuers could pledge to review edits coming from
   Tor. But that latter idea sounds like a lot of social infrastructure to
   set up and maintain.
 
  From talking to Eleanor Saitta: could we do FlaggedRevs by IP space,
  and/or by the intersection of IPs and topic space? Basically, let people
  edit from Tor IPs (and/or whitelist or blacklist categories) as long as
  those go through a FlaggedRevs-type process?  And we could also do
  FlaggedRevs on specific IP ranges, like blocks that are known to be
  certain government office buildings.
 
 
 I think perhaps there's a real disconnect between what Flagged Revisions
 does and its purpose, as well as how widespread its use is.  FR is not used
 on 95% of Wikimedia projects.  It is attached to specific pages (or entire
 namespaces); it is not attached to either anonymous (IP) or registered
 users.  You're looking for some other type of software, some form of user
 right if it is to be attached to specific users (either anonymous or
 registered) that would do what, exactly? Require that a project's
 editors review every single edit from those IPs but not block them, no
 matter how much junk they put in a project?
 
 And again, any such use would be specific to each project.  I would be very
 disturbed if the WMF was to take it upon itself to start telling projects
 they have to accept edits from IPs and ranges they've had extremely poor
 experience with.  AGF is not a suicide pact.
 
 Risker/Anne

Re: [Wikitech-l] Officially supported MediaWiki hosting service?

2013-10-02 Thread Arcane 21
I have a wiki there, and Orain is actually pretty decent as wiki farms go,
though they could probably use more regular staff members. It is non-profit for
the foreseeable future, and ads have been discussed only as an opt-in option for
those who want them.

 Date: Wed, 2 Oct 2013 21:33:35 -0400
 From: m...@nichework.com
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] Officially supported MediaWiki hosting service?
 
 On 10/02/2013 07:17 PM, phoebe ayers wrote:
  I have needed such a thing for wikis for small non-profits/library
  associations that I've been involved with, where I didn't want to host it
  myself (because I didn't want to take personal responsibility for the site
  of an organization that I might not stay involved with, and because I don't
  really have the chops to deal with security and spam issues); but also did
  not have a good hosting option with a larger organization or library, which
  I find are often not very familiar with mediawiki (e.g. I've been trying to
  get our library systems dept. to install some basic extensions for our
  internal mediawiki for a couple years now).
  
  There's not a lot of money in that particular use case, unfortunately, but
  I imagine I'm not alone in that need either.
 
 Did you see the Orain wikifarm? https://meta.orain.org/
 
 It seems to be targeted to the non-profit use case.
 
 Mark.
 
 -- 
 Mark A. Hershberger
 NicheWork LLC
 717-271-1084
 

Re: [Wikitech-l] Updating the RELEASE-NOTES format for 1.22 and 1.23

2013-11-07 Thread Arcane 21
Seems like a reasonable approach to me. The old release notes were a little 
hard to follow, but your suggestion should make them more straightforward from 
now on.

 Date: Thu, 7 Nov 2013 22:21:52 -0500
 From: m...@nichework.com
 To: wikitech-l@lists.wikimedia.org
 Subject: [Wikitech-l] Updating the RELEASE-NOTES format for 1.22 and 1.23
 
 I've already posted to mediawiki-l to try and get input from end users
 on these proposed changes [1], but I'd like to see if we can't change
 the RELEASE-NOTES format for the time that we're working on 1.23.
 
 The main thing I've done is a bit of reordering and collection:
 
 * New features
 * Breaking changes (collected in a single section)
 * Configuration changes
 * Bug fixes
 * API changes
 * Language updates
 * Misc changes
 
 This is an effort to put the things users most need to know about in a new
 release at the top of the file.
 
 This could even be done using git comments as others have suggested.
 
 Does this seem like a reasonable approach going forward?
 
 Thanks for any comments,
 
 Mark.
 
 [1] http://article.gmane.org/gmane.org.wikimedia.mediawiki/42443
 

Re: [Wikitech-l] Applying nofollow only to external links added in revisions that are still unpatrolled

2013-11-18 Thread Arcane 21
I agree. Spammers are so pathetic that they will do anything for page views, but
I have to admire (and detest) their ability to adapt in order to spread their
nonsense.

Anything that slows them down, even in the slightest degree, is something I 
support and recommend.
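
For reference, the behaviour being discussed is controlled by a core setting; a
minimal sketch of the relevant LocalSettings.php knobs (the exceptions shown are
examples, not recommendations):

  // Keep rel="nofollow" on external links (the default; the thread's argument
  // is that it should at least stay on for unpatrolled revisions).
  $wgNoFollowLinks = true;

  // Core already allows narrow exceptions by namespace or by domain.
  $wgNoFollowNsExceptions = array( NS_PROJECT );
  $wgNoFollowDomainExceptions = array( 'wikimedia.org' );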

 Date: Mon, 18 Nov 2013 09:24:54 -0500
 From: m...@uberbox.org
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] Applying nofollow only to external links added in 
 revisions that are still unpatrolled
 
 On 11/18/2013 04:39 AM, Happy Melon wrote:
  I'm sure spam directed at, say, enwiki, would get very subtle very quickly
  if spammers thought there was a real chance of it being able to use
  enwiki's pagerank weight.  Don't underestimate spammers' ability to learn
  and adapt.
 
 Also +1; pagerank is a valuable thing and Wikipedia has lots of it.
 Spammers would be quick to find ways to cheat, lie and manipulate their
 way into tapping into it.
 
 Right now, we are plagued with the spammers that are too desperate or
 stupid to care; if we turned nofollow off, they would all descend upon
 us like a plague of locusts.
 
 -- Marc
 
 

Re: [Wikitech-l] PHP new version(s) 5.5.7 and other versions. (upgrade, if you can)

2013-12-16 Thread Arcane 21
I've tried PHP 5.5.0, but had to go back to PHP 4.4.9 due to issues with
deleting and moving pages in MediaWiki.

 From: p858sn...@gmail.com
 Date: Mon, 16 Dec 2013 19:08:24 +1000
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] PHP new version(s) 5.5.7 and other versions. 
 (upgrade, if you can)
 
 Who? What? When? Where? Why?
 
 
 On Mon, Dec 16, 2013 at 5:05 PM, Thomas Gries m...@tgries.de wrote:
 
  PHP users are strongly advised to upgrade their PHP versions:
  PHP 4.0.6 - PHP 4.4.9
  PHP 5.0.x
  PHP 5.1.x
  PHP 5.2.x
  PHP 5.3.0 - PHP 5.3.27
  PHP 5.4.0 - PHP 5.4.22
  PHP 5.5.0 - PHP 5.5.6
  Vendor Status: Vendor has released PHP 5.5.7, PHP 5.4.23 and PHP 5.3.28.
 
 
  If you compile PHP from its sources, then the following configure
  works well:

  ./configure --prefix=/usr --datadir=/usr/share/php
  --mandir=/usr/share/man --bindir=/usr/bin --libdir=/usr/share
  --includedir=/usr/include --sysconfdir=/etc --with-libdir=lib64
  --with-config-file-path=/etc --with-apxs2=/usr/sbin/apxs2-prefork
  --with-openssl --with-bz2 --with-zlib --with-curl --with-ldap
  --with-mysql --with-mysqli=mysqlnd  --with-pdo-mysql --enable-mbstring
  --with-xsl --enable-calendar --with-gd --with-jpeg-dir=/usr/lib64
  --with-png-dir=/usr/lib64 --with-iconv --with-pspell --with-gmp
  --with-mcrypt --enable-zip --enable-bcmath
  make
  make install
 
 

Re: [Wikitech-l] PHP new version(s) 5.5.7 and other versions. (upgrade, if you can)

2013-12-16 Thread Arcane 21
Thanks for the suggestion, Tyler; I'll be sure to file a report.

 From: tylerro...@gmail.com
 Date: Mon, 16 Dec 2013 07:58:38 -0500
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] PHP new version(s) 5.5.7 and other versions. 
 (upgrade, if you can)
 
 On Mon, Dec 16, 2013 at 7:08 AM, Arcane 21 arc...@live.com wrote:
 
  I've tried PHP 5.5.0, had to go back to PHP 4.4.9 due to issues with
  deleting and moving pages in MediaWiki.
 
 
 Is there a bug filed for this? If not, please do so. MediaWiki should be
 compatible with newer PHP versions.
 
 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2016
 Major in Computer Science

Re: [Wikitech-l] PHP new version(s) 5.5.7 and other versions. (upgrade, if you can)

2013-12-16 Thread Arcane 21
I submitted a bug report describing the problem in more detail on the Wikimedia
Bugzilla, but the short version is that when deleting or moving multiple pages
at once, I experienced severe server lag and timeouts, and the job queue seemed
to process the requests very slowly under PHP 5.5.0. I rolled back to 5.4, and
it's working fine.

More details can be found in the bug report I submitted on Bugzilla.
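
Separately from which PHP version is at fault, one common way to keep mass
deletions and moves from causing lag in web requests is to take job execution
out of page views entirely; a sketch, assuming shell access to the server:

  // LocalSettings.php: don't run queued jobs during web requests...
  $wgJobRunRate = 0;

  // ...and process the queue from cron instead, e.g. every five minutes:
  //   */5 * * * * php /path/to/wiki/maintenance/runJobs.php --maxtime 300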

 Date: Mon, 16 Dec 2013 23:36:17 +0100
 From: m...@tgries.de
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] PHP new version(s) 5.5.7 and other versions. 
 (upgrade, if you can)
 
 On 16.12.2013 13:08, Arcane 21 wrote:
  I've tried PHP 5.5.0, had to go back to PHP 4.4.9 due to issues with 
  deleting and moving pages in MediaWiki.
 
 We need more details. Which version of MediaWiki do you use?
 I never encountered any problems with the latest PHP versions
 5.5.0...5.5.7 and latest MediaWiki core.
 
 
 If you compile PHP from its sources, then the following configure
 works well:
 ./configure --prefix=/usr --datadir=/usr/share/php
 --mandir=/usr/share/man --bindir=/usr/bin --libdir=/usr/share
 --includedir=/usr/include --sysconfdir=/etc --with-libdir=lib64
 --with-config-file-path=/etc --with-apxs2=/usr/sbin/apxs2-prefork
 --with-openssl --with-bz2 --with-zlib --with-curl --with-ldap
 --with-mysql --with-mysqli=mysqlnd  --with-pdo-mysql --enable-mbstring
 --with-xsl --enable-calendar --with-gd --with-jpeg-dir=/usr/lib64
 --with-png-dir=/usr/lib64 --with-iconv --with-pspell --with-gmp
 --with-mcrypt --enable-zip --enable-bcmath
 make
 make install
 
 
 
 

Re: [Wikitech-l] I'm back from Hacker School

2014-01-03 Thread Arcane 21
Welcome back, Sumana!

 From: suma...@wikimedia.org
 Date: Fri, 3 Jan 2014 22:48:22 -0500
 To: wikitech-l@lists.wikimedia.org
 Subject: [Wikitech-l] I'm back from Hacker School
 
 Hi! As of yesterday, I'm back after my three-month sabbatical at Hacker
 School. I'm in catchup mode so I haven't yet resubscribed to most lists,
 nor quite taken back over Engineering Community Team (Quim Gil is still in
 charge until sometime next week when I feel back up to speed).
 
 Thank you to WMF for the sabbatical program, thanks to my boss Rob Lanphier
 for his support, and thanks to my team. I was able to walk away worry-free
 for three great months because I knew that Andre Klapper, Quim Gil, and
 Guillaume Paumier had my back. :)
 
 Sumana Harihareswara
 Engineering Community Manager
 Wikimedia Foundation

Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-16 Thread Arcane 21
If I might weigh in here, I don't see the harm in including all the WMF wikis in
the interwiki map.

MediaWiki is closely tied to the WMF, so those links make logical sense, and in
my opinion it does no harm to include them.

 Date: Thu, 16 Jan 2014 22:40:37 -0500
 From: nathanlarson3...@gmail.com
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] Revamping interwiki prefixes
 
 On Thu, Jan 16, 2014 at 7:35 PM, This, that and the other 
 at.li...@live.com.au wrote:
 
  I can't say I care about people reading through the interwiki list. It's
  just that with the one interwiki map, we are projecting our internal
  interwikis, like strategy:, foundation:, sulutil:, wmch: onto external
  MediaWiki installations.  No-one needs these prefixes except WMF wikis, and
  having these in the global map makes MediaWiki look too WMF-centric.
 
 
 It's a WMF-centric wikisphere, though. Even the name of the software
 reflects its connection to Wikimedia. If we're going to have a
 super-inclusive interwiki list, then most of those Wikimedia interwikis
 will fit right in, because they meet the criteria of having non-spammy
 recent changes and significant content in AllPages. If you're saying that
 having them around makes MediaWiki look too WMF-centric, it sounds like
 you are concerned about people reading through the interwiki list and
 getting a certain impression, because how else would they even know about
 the presence of those interwiki prefixes in the global map?
 
 
  I don't see the need for instruction creep here.  I'm for an inclusive
  interwiki map.  Inactive wikis (e.g. RecentChanges shows only sporadic
  non-spam edits) and non-established wikis (e.g. AllPages shows little
  content) should be excluded.  So far, there have been no issues with using
  subjective criteria at meta:Talk:Interwiki map.
 
 
 I dunno about that. We have urbandict: but not dramatica:, both of which are
 unreliable sources but likely to be used on third-party wikis (at least
 the ones I edit). We have wikichristian: (~4,000 content pages,
 http://www.wikichristian.org/index.php?title=Special:Statistics) but not
 rationalwiki: (~6,000 content pages,
 http://rationalwiki.org/wiki/Special:Statistics). The latter was rejected
 (https://meta.wikimedia.org/w/index.php?title=Talk%3AInterwiki_map&diff=4573672&oldid=4572621)
 a while ago. Application of the subjective criteria seems to be hit-or-miss.
 
 If we're going to have a hyper-inclusionist system of canonical interwiki
 prefixes (https://www.mediawiki.org/wiki/Canonical_interwiki_prefixes), we
 might want to use WikiApiary and/or WikiIndex rather than MediaWiki.org as
 the venue. These wikis already have a page for every wiki, so they could add
 another field for the interwiki prefix to those templates and manage the
 interwiki prefixes by editing pages. Thingles said
 (https://wikiapiary.com/w/index.php?title=User_talk%3AThingles&diff=409395&oldid=408940)
 he'd be interested in WikiApiary's getting involved. The only downside is
 that WikiApiary doesn't have non-MediaWiki wikis. It sounded
 (http://wikiindex.org/index.php?title=User_talk:Leucosticte&diff=prev&oldid=144256)
 as though Mark Dilley might be interested in WikiIndex's playing some role
 in this too. But even WikiIndex has the problem of only containing wikis;
 the table will have to have other websites as well.

Re: [Wikitech-l] Drop support for PHP 5.3

2014-02-20 Thread Arcane 21
Being a firm believer in the LTS model, I support David's take on this issue.
Besides, LTS releases tend to be well tested and reliable and have a longer
support window by default, so it makes sense to support them in turn.

 From: dger...@gmail.com
 Date: Fri, 21 Feb 2014 01:04:39 +
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] Drop support for PHP 5.3
 
 On 21 February 2014 01:00, Techman224 techman...@techman224.ca wrote:
 
  Let me put this out there so there isn’t confusion. The regular 6 month 
  releases of Ubuntu are the stable releases. An LTS release is released every 
  two years on the same cycle as regular Ubuntu releases. An LTS release is 
  certainly more stable than regular releases, but not calling regular 
  releases stable is a bit misleading.
 
 
 For server purposes, I think we can stick to LTSes. Approximately
 nobody runs a non-LTS Ubuntu for their web hosting. (And even less now
 that non-LTSes are only getting a nine-month lifetime.)
 
 
 - d.
 

Re: [Wikitech-l] Drop support for PHP 5.3

2014-02-21 Thread Arcane 21
I support this. Adequately testing MediaWiki on PHP 5.4 before abandoning 5.3
would be a sensible prelude to any major decision on the matter.
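
Purely as an illustration of what "dropping support" means mechanically (this is
not MediaWiki's actual check), it mostly comes down to a guard like the
following at the entry point, plus removing 5.3-only workarounds from the code:

  <?php
  // Refuse to run on anything older than the new minimum supported version.
  if ( version_compare( PHP_VERSION, '5.4.0', '<' ) ) {
      die( 'This software requires PHP 5.4.0 or later; you are running PHP ' . PHP_VERSION . "\n" );
  }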

 Date: Fri, 21 Feb 2014 16:37:24 -1000
 From: canan...@wikimedia.org
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] Drop support for PHP 5.3
 
 On Feb 18, 2014 8:11 AM, Faidon Liambotis fai...@wikimedia.org wrote:
  Last time we were discussing PHP 5.4 it was quite a while ago but I
  remember hearing that we'd need to do some porting work for our
  extensions.
 
 Is this still the case? If so, it seems the first step would be for WMF to
 ensure that it can run on 5.4 (and transition prod to 5.4?), before
 dropping support for 5.3.
   --scott

Re: [Wikitech-l] Preview of the proposal for MediaWiki Homepage

2014-02-24 Thread Arcane 21
The picture at the top and most of the other icons need to be shrunk a little,
and the page needs to fit its items closer together (it looks rather spaced
out), but I do like the icon choices.

It would also benefit from some more frames to separate the page sections, IMO.

Otherwise, it looks great. :)

 From: monteirobr...@gmail.com
 Date: Mon, 24 Feb 2014 17:54:13 -0300
 To: wikitech-l@lists.wikimedia.org
 Subject: [Wikitech-l] Preview of the proposal for MediaWiki Homepage
 
 Hello everyone,
 
 I would like to invite you to take a look the preview [1] of MediaWiki
 Homepage. Your opinion, contribution and help are very welcome.
 
 Thank you.
 
 Best regards,
 
 [1] https://www.mediawiki.org/wiki/MediaWiki/Homepage_redesign/Preview
 
 Brena Monteiro
 +55 27 98109 0123
 @monteirobrena http://twitter.com/monteirobrena
 Reflexões Brenianas http://monteirobrena.wordpress.com

Re: [Wikitech-l] Text-to-speech extension?

2014-04-28 Thread Arcane 21
I'm not disabled myself, but I do see the benefits of such a venture, and
assuming there is a text-to-speech program with a MediaWiki-friendly API (or one
that can at least hook into MediaWiki fairly well) that could be used as a
backend, I would think this is not only possible but a great idea.
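
To make the "hook into MediaWiki" part concrete, here is a hypothetical sketch
of what such an extension's glue code could look like; the <tts> tag, the
service URL and its parameters are all invented for illustration:

  <?php
  // Register a <tts>...</tts> parser tag whose content is sent to an external
  // text-to-speech service and played back through an HTML5 <audio> element.
  $wgHooks['ParserFirstCallInit'][] = function ( Parser $parser ) {
      $parser->setHook( 'tts', function ( $text, array $args, Parser $parser ) {
          $url = 'https://tts.example.org/speak?text=' . urlencode( $text );
          return Html::element( 'audio', array( 'controls' => '', 'src' => $url ) );
      } );
      return true;
  };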

 Date: Tue, 29 Apr 2014 02:49:51 +0200
 From: ilias.k...@freemail.gr
 To: wikitech-l@lists.wikimedia.org
 Subject: [Wikitech-l] Text-to-speech extension?
 
 Hello, and thank you for your work,
 
 I was looking at the Librivox project today and I thought that it'd be
 great to have voice support in Mediawiki, to aid people with sight
 impairments or disabilities.
 So I was wondering whether there is a text-to-speech extension for
 Mediawiki, or if there could be some other means of offering wiki pages
 in audio form.
 
 Thanks for your time as well,
 Ilias K.
 

Re: [Wikitech-l] LiquidThreads - how do we kill it?

2014-06-06 Thread Arcane 21
As someone who likes LQT (autosigning and visible threading are excellent
features) and wishes Flow were basically that, but better, I would hope Flow
incorporates as many of LQT's advantages as possible and provides a way to
convert LQT threads to the Flow format as smoothly as possible, if Flow is to be
a true successor to LQT.

Personally, I'm not as crazy about VisualEditor (I find the classic WikiEditor
more comfortable for some reason), but I too see its appeal.

 From: dger...@gmail.com
 Date: Fri, 6 Jun 2014 20:48:03 +0100
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] LiquidThreads - how do we kill it?
 
 On 6 June 2014 20:12, Martijn Hoekstra martijnhoeks...@gmail.com wrote:
  On Fri, Jun 6, 2014 at 11:27 AM, David Gerard dger...@gmail.com wrote:
 
  YMMV. Wikipedia is pretty much enculturated, but RationalWiki gets
  n00bs *all the time* who object to something on a page. You know what
  the most frequent reply involves? Please learn to sign your
  comments.
 
  If auto signatures are the best thing LQT/Flow brings, I have a bad feeling
  about it.
 
 
 That would be a distortion of my point :-) I have mostly found LQT
 annoying as a user, but I can see the attraction of a discussion where
 the threading is clearly visible, and I'm presuming extensive user
 testing will be done - typical mind fallacy is a serious hazard, and
 not one Wikimedia can afford.[1] This is why software changes on WMF
 wikis are ultimately up to WMF, not the wiki communities.
 
 (Of course, debacles like the introduction of VE show why caution is
 still sensible. And I say that as a huge fan and advocate of VE.)
 
 
 - d.
 
 
 [1] http://wiki.lesswrong.com/wiki/Typical_mind_fallacy What works for
 you or me cannot be presumed to work for the world. Geeks are
 regularly *shocked* at what ordinary people make of things.
 

Re: [Wikitech-l] LiquidThreads - how do we kill it?

2014-06-06 Thread Arcane 21
I'm going to have to concur on this. Flow and VE would be great for attracting
new users, but leaving the foundation of the community in the dust in favor of
innovation strikes me as a bad idea.

I support keeping the old methods of editing and talk pages working when and
where possible, and making the transition to newer ideas as gentle as possible
for those who might otherwise be alienated by the changes.

 Date: Sat, 7 Jun 2014 02:39:17 +0200
 From: schneeschme...@googlemail.com
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] LiquidThreads - how do we kill it?
 
 2014-06-06 22:28 GMT+02:00 S Page sp...@wikimedia.org:
 
  The tone of your message made me want to cry, quit my job, and punch the
  wall in frustration  :(
 
 I am sorry, S, this is certainly not what I intended. I apologise for
 the tone of my last mail.
 
  But I appreciate you being open about your dislike
  and suspicion of Flow, and can only hope that over time Flow's growing
  feature set leads users to *want* it enabled on the talk pages they visit.
 
 The most important point about everything new the WMF is currently
 developing is that it really can frustrate and drive away even more
 old editors. The WMF only talks about new editors, but does not take
 into account that we need to keep the old ones in the first place.
 
 @Gerard: Losing old editors is the biggest cost of all those changes.
 This is why we have to keep these as small as possible and we have to
 avoid them wherever possible. There always has to be a switch for
 disabling everything new and to return to the old state.
 
 To give you some more background: Our community has suffered a lot of
 stress over the past years which results in frustration in many
 places. Many long-time editors have gone inactive because of that.
 Some of the stress is self-made, some came in from the German chapter,
 mostly last year (this will hopefully change now), and some of the
 stress comes from San Francisco. The latter stems mostly from new
 developments in technical terms. Remember that this is only a small
 community of only a few hundred regulars, and that we had to fight
 against visual editor, and when media viewer was announced the first
 question was whether it would still be possible to switch it off when
 it would not be beta any more. Many of us are exhausted and tired to
 fight against the Wikimedia bodies.
 
 That's why I interfere when there is talk about Flow. Both visual
 editor and flow have the potential of providing another and perhaps
 final blow to the still active old editor community. There is only one
 community. We don't have another one as substitutes to take over. If a
 wiki project is dead and broken once it will be broken forever.
 
 Regards,
 Jürgen.
 

Re: [Wikitech-l] Preventing MW from adding a page to watchlist

2014-06-10 Thread Arcane 21
This seems to be a MediaWiki bug. Ever since we updated our wiki to MW 1.23,
this has been happening to many of our users, myself included.

Apparently, the option to add edited pages to the watchlist is now checked by
default, which probably needs to be turned back off by default.

 Date: Tue, 10 Jun 2014 10:57:17 +0200
 From: benap...@gmail.com
 To: wikitech-l@lists.wikimedia.org
 Subject: [Wikitech-l] Preventing MW from adding a page to watchlist
 
 There is no mention of this feature in
 https://www.mediawiki.org/wiki/API:Rollback but according to Helder in
 this bug report https://bugzilla.wikimedia.org/show_bug.cgi?id=66273
 MediaWiki inserts every page on which the rollback API was used into the
 user's watchlist.
 
 Is there a way to disable it?
 

Re: [Wikitech-l] Preventing MW from adding a page to watchlist

2014-06-10 Thread Arcane 21
Thanks for the reminder.

However, I don't think all of those settings are necessarily good, and here's
why:

* (bug 45020) Make preferences Add pages I create and files I upload to my 
watchlist and pages and files I edit true by default.


This seems to be the default on Wikia, which I have also used, but not everyone
wants that. I personally only use a watchlist to keep track of select pages
rather than every page I edit, so I find this default irritating.

* (bug 45022) Make preference Email me when a page or file on my watchlist is 
changed true by default.


This is a good idea.

* (bug 49719) Watch user page and user talk page by default.
This will allow your new users to immediately start benefiting from the 
watchlist and email notification features, without needing to first read all 
the docs to find out that they're as useful as they are.

This is a good idea.

In short, the first setting strikes me as bad, but I think the other two are 
good ideas.
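
For third-party admins who feel the same way, the new defaults can be flipped
back in LocalSettings.php; a minimal sketch, assuming the standard preference
keys ('watchdefault' for pages you edit, 'watchcreations' for pages you create,
'enotifwatchlistpages' for watchlist e-mail):

  // Don't auto-watch pages the user edits or creates (the pre-1.23 behaviour)...
  $wgDefaultUserOptions['watchdefault'] = 0;
  $wgDefaultUserOptions['watchcreations'] = 0;

  // ...but keep e-mail notification for watched pages on by default.
  $wgEnotifWatchlist = true;   // watchlist e-mail must be enabled site-wide
  $wgDefaultUserOptions['enotifwatchlistpages'] = 1;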

 Date: Tue, 10 Jun 2014 13:22:26 +
 From: jer...@tuxmachine.com
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] Preventing MW from adding a page to watchlist
 
 On Jun 10, 2014 9:18 AM, MZMcBride z...@mzmcbride.com wrote:
  And
  sysadmins of larger wiki installations probably want to re-visit whether
  the user preferences defaults are appropriate for their communities.
 
 They also may wan to consider reading the release announcement emails…
 (e.g.
 http://lists.wikimedia.org/pipermail/mediawiki-announce/2014-June/000152.html
 )
 
 -Jeremy

Re: [Wikitech-l] new terms of use/paid contributions - do they apply to mediawiki

2014-06-16 Thread Arcane 21
In terms of applying the rule to things like articles written by someone for 
pay on behalf of someone else, that policy makes sense.

As for code, I agree, that policy is counterproductive.

 To: wikitech-l@lists.wikimedia.org
 Date: Mon, 16 Jun 2014 22:42:05 +0200
 From: matma@gmail.com
 Subject: Re: [Wikitech-l] new terms of use/paid contributions - do they apply 
 to mediawiki
 
 I agree. The new policy would just introduce pointless bureaucracy.
 
 -- 
 Matma Rex
 

Re: [Wikitech-l] Deploying a centralized questionnaire to all pages of a specific Category.

2014-08-02 Thread Arcane 21
You might wish to try this:

https://www.mediawiki.org/wiki/Extension:Quiz

 Date: Sat, 2 Aug 2014 18:12:51 -0400
 From: valing...@gmail.com
 To: wikitech-l@lists.wikimedia.org
 Subject: [Wikitech-l] Deploying a centralized questionnaire to all pages of   
 a specific Category.
 
 Hello,
 
 I'm trying to find if there's a way (extension or built in functionality)
 to create a list of questions on a specific page then deploy this set of
 questions on say every page of a certain Category.
 
 Also, when the questions are deployed to a specific category, i'd need to
 give users a way to answer those questions (a simple textbox would be
 enough). It doesn't need to keep track of who answered what, only the most
 recent update would be the answer.
 
 Thanks

Re: [Wikitech-l] MediaWiki Security and Maintenance Releases: 1.22.10 and 1.23.3

2014-08-28 Thread Arcane 21
Mostly a few bugfixes.

 To: mediawiki-annou...@lists.wikimedia.org; mediawik...@lists.wikimedia.org; 
 mediawiki-enterpr...@lists.wikimedia.org; wikitech-l@lists.wikimedia.org; 
 gla...@hallowelt.biz
 Date: Thu, 28 Aug 2014 15:10:09 +0200
 From: matma@gmail.com
 Subject: Re: [Wikitech-l] MediaWiki Security and Maintenance Releases: 
 1.22.10 and 1.23.3
 
 There weren't actually any security patches in these releases, were there?
 
 -- 
 Matma Rex
 

Re: [Wikitech-l] [BREAKING CHANGE] Plans to move Cite configuration from wikitext messages to CSS styles

2014-12-16 Thread Arcane 21
At the risk of sounding stupid, does this mean other wikis will need a Parsoid
instance to use this extension, or is this simply being tested in a Parsoid
environment and just using CSS instead of wikitext for the rendering? If the
former, I'm not a fan of the idea at all. If the latter, awesome; it sounds like
a great idea.

I'm still confused about the exact details of how this will work, so I'd
appreciate clarification.

 From: jforres...@wikimedia.org
 Date: Tue, 16 Dec 2014 18:10:46 -0800
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] [BREAKING CHANGE] Plans to move Cite configuration 
 from wikitext messages to CSS styles
 
 On 16 December 2014 at 04:45, Brad Jorsch (Anomie) bjor...@wikimedia.org
 wrote:
 
  On Tue, Dec 16, 2014 at 5:44 AM, Marc Ordinas i Llopis 
  marc...@wikimedia.org wrote:
 
   Due to how MediaWiki's messages system works, changes to the display
   styles need to be copied into each of the ~300 display languages for
  users,
   else those users with different languages will see different reference
   styles on the same page.
 
 
  That sounds like bug T33216, which was fixed a while ago. Does this
  actually occur now?
 
 
 ​No, this is talking about the problem of changing the rendering styles
 needing to be done in each of the customised languages manually​ through
 the translation system (and being totally unlike what the translation
 system on TranslateWiki.net generally uses and is suited for). It was not
 meant to be referring to https://phabricator.wikimedia.org/T33216, which I
 believe is still fixed, yes.
 
 J.
 -- 
 James D. Forrester
 Product Manager, Editing
 Wikimedia Foundation, Inc.
 
 jforres...@wikimedia.org | @jdforrester

Re: [Wikitech-l] [BREAKING CHANGE] Plans to move Cite configuration from wikitext messages to CSS styles

2014-12-17 Thread Arcane 21
Thanks for the clarification. 

I think the idea (as you explained it) sounds great; I hope you can get it
working. :)

 From: jforres...@wikimedia.org
 Date: Tue, 16 Dec 2014 22:06:11 -0800
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] [BREAKING CHANGE] Plans to move Cite configuration 
 from wikitext messages to CSS styles
 
 On 16 December 2014 at 21:07, Arcane 21 arc...@live.com wrote:
 
  At the risk of sounding stupid, does this mean other wikis will need a
  Parsoid instance to use this extension, or is this simply being tested in a
  Parsoid environment and it's just using CSS instead of wikitext for
  rendering? If the former, not a fan of the idea at all. If the latter,
  awesome, sounds like a great idea.
 
 
 ​The latter; this has become a pressing need for faithful replication of
 expected styles​ for pages rendered using Parsoid, but we'll do this in
 both the regular Cite extension and the current re-implementation of it
 inside Parsoid. They will share the styling so that it will just work.
 
 
 
  I'm still confused on the exact details of how this will work, so I'd
  appreciate clarification.
 
 
 ​Right now we've not yet confirmed that we can entirely replicate all the
 features and configurations that people are using in Cite via the existing
 messages in the new​ CSS system. This is mostly a please help and/or tell
 us if we're crazy e-mail. :-)
 
 ​J.
 -- 
 James D. Forrester
 Product Manager, Editing
 Wikimedia Foundation, Inc.
 
 jforres...@wikimedia.org | @jdforrester

Re: [Wikitech-l] Status of the Purge extension

2014-12-28 Thread Arcane 21
I agree. Purging should either be integrated into MediaWiki core, or the
extension should be made so that anyone can use it whenever needed without
requiring JavaScript.
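
For what it's worth, the purge itself is already reachable without JavaScript,
either as an index.php URL (?action=purge) or through the action API; a minimal
sketch of the API route (the wiki URL is a placeholder):

  <?php
  // POST action=purge to the API to refresh the cached rendering of a page.
  $ch = curl_init( 'https://wiki.example.org/w/api.php' );
  curl_setopt_array( $ch, array(
      CURLOPT_POST => true,
      CURLOPT_POSTFIELDS => http_build_query( array(
          'action' => 'purge',
          'titles' => 'Main Page',
          'format' => 'json',
      ) ),
      CURLOPT_RETURNTRANSFER => true,
  ) );
  echo curl_exec( $ch ), "\n";
  curl_close( $ch );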

 Date: Sun, 28 Dec 2014 09:38:26 -0800
 From: jdlrob...@gmail.com
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] Status of the Purge extension
 
 On 28 Dec 2014 08:16, MZMcBride z...@mzmcbride.com wrote:
 
  Ricordisamoa wrote:
  It looks like the Purge extension
  https://www.mediawiki.org/wiki/Extension:Purge for MediaWiki could be
  very useful to replace many similar gadgets on Wikimedia sites.
  New users often need to purge the server cache, and don't know how to do
  it. A simple link is way more helpful than visit the history page and
  replace 'history' with 'purge', etc. Also, a server-side extension has
  much better i18n support and works without JavaScript. Is anyone
  interested in getting it in?
 
  Hi.
 
  Rather than trying find ways to institutionalize the purge action, I'd
  strongly prefer that we examine the purge action's current use-cases and
  find ways to obviate them.
 
  Purging should be an internal implementation detail and any time a user
  feels that manually purging a page is required, MediaWiki has failed the
  user.
 
 Well said.

Re: [Wikitech-l] Parsoid's progress

2015-01-20 Thread Arcane 21
Given what I've seen so far, it might be best to aim for a gradual
reimplementation of Parsoid's features so that most of them work without needing
Parsoid, with the eventual goal of severing the dependency on Parsoid completely
if possible. At any rate, the less the parser has to outsource, the less
complicated things will be, correct?
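
For context on the api.php dependency discussed below, template and
parser-function expansion is the sort of thing Parsoid asks the PHP side to do;
roughly this call, shown here against the public API (parameters per the
documented expandtemplates module):

  <?php
  // Expand a template the way the PHP preprocessor would.
  $api = 'https://en.wikipedia.org/w/api.php?' . http_build_query( array(
      'action' => 'expandtemplates',
      'text'   => '{{convert|1|mi|km}}',
      'format' => 'json',
  ) );
  echo file_get_contents( $api ), "\n";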

 Date: Tue, 20 Jan 2015 11:02:10 -0500
 From: canan...@wikimedia.org
 To: wikitech-l@lists.wikimedia.org
 Subject: Re: [Wikitech-l] Parsoid's progress
 
 I believe Subbu will follow up with a more complete response, but I'll note
 that:
 
 1) no plan survives first encounter with the enemy.  Parsoid was going to
 be simpler than the PHP parser, Parsoid was going to be written in PHP,
 then C, then prototyped in JS for a later implementation in C, etc.  It has
 varied over time as we learned more about the problem.  It is currently
 written in node.js and probably is at least the same order of complexity as
 the existing PHP parser.  It is, however, built on slightly more solid
 foundations, so its behavior is more regular than the PHP parser in many
 places -- although I've been submitting patches to the core parser where
 necessary to try to bring them closer together.  (c.f.
 https://gerrit.wikimedia.org/r/180982 for the most recent of these.)  And,
 of course, Parsoid emits well-formed HTML which can be round-tripped.
 
 In many cases Parsoid could be greatly simplified if we didn't have to
 maintain compatibility with various strange corner cases in the PHP parser.
 
 2) Parsoid contains a partial implementation of the PHP expandtemplates
 module.  It was decided (I think wisely) that we didn't really gain
 anything by trying to reimplement this on the Parsoid side, though, and it
 was better to use the existing PHP code via api.php.  The alternative would
 be to basically reimplement quite a lot of mediawiki (lua embedding, the
 various parser functions extensions, etc) in node.js.  This *could* be done
 -- there is no technical reason why it cannot -- but nobody thinks it's a
 good idea to spend time on right now.
 
 But the expandtemplates stuff basically works.   As I said, it doesn't
 contain all the crazy extensions that we use on the main WMF sites, but it
 would be reasonable to turn it on for a smaller stock mediawiki instance.
 In that sense it *could* be a full replacement for the Parser.
 
 But note that even as a full parser replacement Parsoid depends on the PHP
 API in a large number of ways: imageinfo, siteinfo, language information,
 localized keywords for images, etc.  The idea of independence is somewhat
 vague.
   --scott
 
 
 On Mon, Jan 19, 2015 at 11:58 PM, MZMcBride z...@mzmcbride.com wrote:
 
  Matthew Flaschen wrote:
  On 01/19/2015 08:15 AM, MZMcBride wrote:
   And from this question flows another: why is Parsoid
   calling MediaWiki's api.php so regularly?
  
  I think it uses it for some aspects of templates and hooks.  I'm sure
  the Parsoid team could explain further.
 
  I've been discussing Parsoid a bit and there's apparently an important
  distinction between the preprocessor(s) and the parser. Though in practice
  I think parser is used pretty generically. Further notes follow.
 
  I'm told in Parsoid, ref and {{!}} are special-cased, while most other
  parser functions require using the expandtemplates module of MediaWiki's
  api.php. As I understand it, calling out to api.php is intended to be a
  permanent solution (I thought it might be a temporary shim).
 
  If the goal was to just add more verbose markup to parser output, couldn't
  we just have done that (in PHP)? Node.js was chosen over PHP due to
  speed/performance considerations and concerns, from what I now understand.
 
  The view that Parsoid is going to replace the PHP parser seems to be
  overly simplistic and goes back to the distinction between the parser and
  preprocessor. Full wikitext transformation seems to require a preprocessor.
 
  MZMcBride
 
 
 
 
 
 
 
 -- 
 (http://cscott.net)

Re: [Wikitech-l] Parsoid's progress

2015-01-19 Thread Arcane 21
If I might weigh in, I concur with MZMcBride. If Parsoid is absolutely needed
regardless, that's one thing; but if a VE editing interface can be set up that
doesn't need Parsoid, that would reduce the dependence on third-party software,
make installation easier for all parties concerned, and be less resource
intensive. Since optimizing resource use is always a plus in my opinion,
severing the dependency on Parsoid and attempting to do its current functions
purely in house seems like a good plan to pursue.

 Date: Mon, 19 Jan 2015 11:15:54 -0500
 From: z...@mzmcbride.com
 To: wikitech-l@lists.wikimedia.org
 Subject: [Wikitech-l] Parsoid's progress
 
 (Combining pieces of Jay's thread and pieces of the shared hosting thread.)
 
 Daniel Friesen wrote:
 Parsoid can do Parsoid DOM to WikiText conversions. So I believe the
 suggestion is that storage be switched entirely to the Parsoid DOM and
 WikiText in classic editing just becomes a method of editing the content
 that is stored as Parsoid DOM in the backend.
 
 Tim Starling wrote:
  Parsoid depends on the MediaWiki parser, it calls it via api.php. It's
 not a complete, standalone implementation of wikitext to HTML
 transformation.
 
  
  HTML storage would be a pretty simple feature, and would allow
 third-party users to use VE without Parsoid. It's not so simple to use
 Parsoid without the MediaWiki parser, especially if you want to support
 all existing extensions.
 
  
  So, as currently proposed, HTML storage is actually a way to reduce the
 dependency on services for non-WMF wikis, not to increase it.
 
  Based on recent comments from Gabriel and Subbu, my understanding is
 that there are no plans to drop the MediaWiki parser at the moment.
 
 Yeah... what is this all about? My understanding (and please correct me if
 I'm wrong) is that Parsoid is/was intended to be a standalone service
 capable of translating wikitext -- HTML. You seem to be stating that
 Parsoid is neither complete nor standalone. Why?
 
 Currently Parsoid is the largest client of the MediaWiki PHP parser, I'm
 told. If Parsoid is regularly calling and relying upon the MediaWiki PHP
 parser, what exactly is the point of Parsoid?
 
 How much parity is there between Parsoid without the use of the MediaWiki
 parser and the MediaWiki parser? That is, if you selected a random sample
 of pages from a Wikimedia wiki, how many of them could Parsoid correctly
 parse on its own? And from this question flows another: why is Parsoid
 calling MediaWiki's api.php so regularly?
 
 I'm also interested in Parsoid's development as it relates to the broader
 push for services. If Parsoid is going to be the model of future services
 development, I'd like a clearer evaluation of what kind of model it is.
 
 Again, please correct me if I'm wrong, mistaken, misinformed, etc., but
 from my place of limited knowledge, it sounds very unappealing to create
 large Node.js applications (services) that closely tie in and require(!)
 PHP counterparts. This seems like the opposite of moving toward a more
 flexible, modular architecture. From my perspective, it would seem to only
 saddle us with additional technical debt moving forward, as we double
 complexity indefinitely.
 
 MZMcBride
 
 
 