Re: [Wikitech-l] Preliminary git module splitup notes (MediaWiki core extensions)

2011-10-05 Thread Brion Vibber
On Wed, Oct 5, 2011 at 7:18 AM, Platonides platoni...@gmail.com wrote:

 There are 615 extensions in trunk/extensions.
 With the new setup, would someone who has a checkout of everything
 need to open 616 connections (network latency, ssh authentication, etc.)
 whenever he wants to pull?
 svn:externals already noticeably slows down checkouts, and there are only a
 bunch of them.


I'll do some tests; it may be possible to avoid having to do serial
connection setup/teardown by using SSH's ControlMaster[1] setting to
piggyback the 'new' connections on one that's already open.

[1] 
http://www.anchor.com.au/blog/2010/02/ssh-controlmaster-the-good-the-bad-the-ugly/
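
For reference, a minimal ~/.ssh/config stanza along these lines should be enough to
test with -- the host name and timeout here are just illustrative, not a tested
recommendation:

    # Share one SSH connection across many git operations to the same host
    Host gitorious.org
        ControlMaster auto
        ControlPath ~/.ssh/cm-%r@%h:%p
        ControlPersist 10m    # needs OpenSSH >= 5.6; keeps the master open briefly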


(Part of what makes svn:externals slow is that you're jumping over to
another site entirely -- that's an extra DNS lookup, possibly separate auth,
and who knows how well the other server performs.)

-- brion


Re: [Wikitech-l] Preliminary git module splitup notes (MediaWiki core extensions)

2011-10-05 Thread Brion Vibber
On Oct 5, 2011 1:03 PM, Platonides platoni...@gmail.com wrote:
 I know about ControlMaster (which only those of us with ssh+git can
 benefit from), but just launching a new process and waiting to see if
 there's something new will still slow things down. OTOH git skips the
 "recurse everything, locking all subfolders" step, so it may be equivalent.

 Maybe there's some way of fetching updates from aggregated repositories
 all at once and I'm just worrying about a problem that's already solved, though.

Submodules may actually work well for this, as long as something propagates
the extension updates to the composite repo. The checked-out commit id of each
submodule is stored in the containing repo's tree, so if no changes were seen
in the containing repo, it shouldn't have to pull anything from the submodule's
repo.

(Not yet tested)

-- brion




Re: [Wikitech-l] Preliminary git module splitup notes (MediaWiki core extensions)

2011-10-05 Thread Brion Vibber
On Wed, Oct 5, 2011 at 1:59 PM, Brion Vibber br...@pobox.com wrote:

 On Oct 5, 2011 1:03 PM, Platonides platoni...@gmail.com wrote:
  I know about ControlMaster (which only those of us with ssh+git can
  benefit from), but just launching a new process and waiting to see if
  there's something new will still slow things down. OTOH git skips the
  "recurse everything, locking all subfolders" step, so it may be equivalent.

  Maybe there's some way of fetching updates from aggregated repositories
  all at once and I'm just worrying about a problem that's already solved, though.

 Submodules may actually work well for this, as long as something propagates
 the extension updates to the composite repo. The checked-out commit id of each
 submodule is stored in the containing repo's tree, so if no changes were seen
 in the containing repo, it shouldn't have to pull anything from the submodule's
 repo.

 (Not yet tested)


Ok, did some quick tests fetching updates for 16 repos sitting on my
Gitorious account.

Ping round-trip from my office desktop to Gitorious's server is 173ms,
making the theoretical *absolute best* possible time, with one round-trip per
repo, about 2-3 seconds.


Running a simple loop of 'git fetch' over each repo (auth'ing with my ssh
key, passphrase already provided) takes 53 seconds (about 3 seconds per
repo). This does a separate ssh setup & poke into git for each repo.
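
For the curious, a loop of that shape looks roughly like this (the directory
layout is just how my test checkout happens to look, adjust to taste):

    # time a naive per-repo fetch; one ssh connection per repository
    time for repo in repos/*; do
        ( cd "$repo" && git fetch origin )
    done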

Clearly unacceptable for 600+ extensions. :)


Turning on ControlMaster and starting a long-running git clone in the
background, then running the same 'git fetch' loop took the time down to
about 10 seconds (1s per repo). ControlMaster lets those looped 'git
fetch's piggyback on the existing SSH connection, but still has to start up
git and run several round-trips.

Better, but still doesn't scale to hundreds of extensions: several minutes
for a null update is too frustrating!


Checking them out as submodules via 'git submodule add' and then issuing a
single 'git submodule update' command takes... 0.15 seconds. Nice!

Looks like it does indeed see that there are no changes, so nothing has to be
pulled from the upstream repos. Good!
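
For reference, the submodule setup is just the standard workflow, roughly like
this (repository URL and path are examples only):

    # in the containing 'composite' repo: record an extension as a submodule
    git submodule add git@gitorious.org:example/SomeExtension.git extensions/SomeExtension
    git commit -m 'Track SomeExtension as a submodule'

    # later, in a checkout: sync working copies to the recorded commit ids
    git submodule update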


The downside is that maintaining submodules means constantly pushing commits
to the containing repo so it knows there are updates. :(

Probably the most user-friendly way to handle this is with a wrapper script
that can do a single query to fetch the current branch head positions of a
bunch of repos, then does fetch/pull only on the ones that have changed.

This could still end up pulling from 600+ repos -- if there are actually
changes in them all! -- but should make typical cases a *lot* faster.
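
A very rough sketch of the shape of that check, assuming one 'origin' remote per
extension -- note that plain git ls-remote still costs one round-trip per repo,
so the real win would come from asking the server for all the branch heads in
one go, but the comparison itself would look something like:

    #!/bin/sh
    # Sketch only: fetch just the repos whose remote master has moved.
    for repo in repos/*; do
        remote_head=$( cd "$repo" && git ls-remote origin refs/heads/master | cut -f1 )
        local_head=$( cd "$repo" && git rev-parse refs/remotes/origin/master 2>/dev/null )
        if [ "$remote_head" != "$local_head" ]; then
            ( cd "$repo" && git fetch origin )
        fi
    done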

We should check in a little more detail how Android & other big projects
using multiple git repos are doing their helper tools to see if we can just
use something that already does this or if we have to build it ourselves. :)

-- brion


Re: [Wikitech-l] Help setting up mysql.cnf for local WIki mirror

2011-10-06 Thread Brion Vibber
On Thu, Oct 6, 2011 at 12:54 PM, Fred Zimmerman zimzaz@gmail.com wrote:

 Hi,

 I am hoping that someone here can help me - I realize there is an
 xmlwikidumps mailing list but it is pretty low volume and expertise
 relative to this one. There is a lot of conflicting advice on the
 mirroring Wikipedia page. I am setting up a local mirror of English
 Wikipedia pages-articles...xml and I have been stopped by repeated
 failures in MySQL configuration.

 I have downloaded the .gz data from the dump and run it through mwdumper
 to create an SQL file without problems, but things keep breaking down on
 the way into mysql. I have had a lot of agony with InnoDB log files, etc. and
 have learned how to make them bigger, but I'm still apparently missing some
 pieces.


What fails, exactly? Do you get error messages of some kind? Without knowing
what's going wrong, there's little advice that can be given.


 my target machine is an AWS instance 32 bit Ubuntu 1.7GB RAM, 1 Core, 200
 GB, which is only for this project.  I can make it bigger if necessary.

 Can someone take a look at this my.cnf file and tell me what I need to
 change to get this to work?

 this is what my.cnf file looks like:


The only requirement I can think of is making sure max_allowed_packet is
biggish so all the pages import; you appear to have set it to 128M, which
should be more than big enough.

Other settings I would assume should depend on your available memory,
workload, etc.
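
For example, an import-oriented my.cnf fragment might carry something like the
following -- the numbers are illustrative only, not a recommendation for your
particular box:

    [mysqld]
    max_allowed_packet      = 128M
    # changing innodb_log_file_size on an existing server needs care:
    # the old ib_logfile* files have to be dealt with first
    innodb_log_file_size    = 256M
    innodb_buffer_pool_size = 512M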

-- brion


Re: [Wikitech-l] Wikipedia Android app – Support Wikipedia version

2011-10-07 Thread Brion Vibber
2011/10/7 Gerard Meijssen gerard.meijs...@gmail.com

 Hoi.
 One distinct advantage of daily builds is that you can use these to distribute
 improved localisations... Will this app come to translatewiki.net? If so,
 it does make sense to build functionality similar to what is
 offered by the LocalisationUpdate extension.


We're working on getting the localization infrastructure set up:
https://bugzilla.wikimedia.org/show_bug.cgi?id=31467

Should end up on TWN soon. :)

-- brion


Re: [Wikitech-l] Wikipedia Android app – Support Wikipedia version

2011-10-07 Thread Brion Vibber
2011/10/7 church.of.emacs.ml church.of.emacs...@googlemail.com

 How about making a free version and an (identical!) "Support Wikipedia"
 version that has absolutely the same features, but costs $2 that go to
 WMF as a donation? :)
 Why we should do this: It's an extremely simple way for users to donate
 money. No entering of credit card or bank account information necessary,
 just select one app instead of another.


This would be pretty straightforward to do, I think.


 Very rough estimate: 20M total downloads, 0.5% download the Support
 Wikipedia version for $2, that's $200k.


Note that the app-store middlemen take a 30% transaction fee, so we'd get
$140k and Google or Amazon (or Apple if we do an iPhone version same way)
gets $60k. 30% sounds high but probably isn't much worse than individual $2
credit card payments. :)

-- brion


Re: [Wikitech-l] Wikipedia Android app – Support Wikipedia version

2011-10-07 Thread Brion Vibber
2011/10/7 Jay Ashworth j...@baylink.com

 On a separate topic, it would probably be useful if someone built nightlies
 of the repo, so that those of us who are good usability test types but not
 necessarily coders or set up to build can contribute too; the original
 announcement didn't sound like that was in the plans.  Is it practical?


I think that would be pretty easy to set up; stuck a note in BZ so we don't
forget: https://bugzilla.wikimedia.org/show_bug.cgi?id=31498

-- brion


Re: [Wikitech-l] Bug 31424 - Anecdotal evidence of IE 8 problems

2011-10-07 Thread Brion Vibber
Just to note: we deployed an upgrade to jQuery 1.6.4 earlier today, which
appears to have successfully stopped the IE 8 crashes for people who were
able to test with us.

There was some slight breakage in UploadWizard, which has been patched.

-- brion


Re: [Wikitech-l] WikiEditor dialogs and caching? (fwd)

2011-10-17 Thread Brion Vibber
On Mon, Oct 17, 2011 at 4:26 PM, Daniel Barrett d...@vistaprint.com wrote:

 (Reposted from mediawiki-l due to zero responses.)

 I've been adding custom tools to the WikiEditor toolbar (MW 1.17.0 version)
 and am running into difficulty with browser caching.

 When I do something simple, such as adding a button or changing text in a
 dialog, the change doesn't take effect in my browser due to caching. Even if
 I hard-refresh (ctrl-F5), the changes do not appear. I have to delete my
 browser's cached content (e.g., in Firefox, Tools / Options / Advanced /
 Network / Clear Now) to see the change. That seems extreme: a browser
 refresh or hard-refresh ought to be sufficient, right?  Or ideally, no
 refresh at all.


So where's your code being loaded from? Site or user JS pages? Some
literal JS files output from an extension? Something else? Can you see it
being loaded incorrectly from something cached? Did you check in Firebug or
WebKit's developer tools or IE's developer tools to see what the response
from the server is?

-- brion


Re: [Wikitech-l] File::getPath() refactoring

2011-10-20 Thread Brion Vibber
On Wed, Oct 19, 2011 at 12:55 PM, Happy Melon happy.melon.w...@gmail.com wrote:

 On 19 October 2011 20:41, Russell Nelson russnel...@gmail.com wrote:

  In order to support viewing deleted files (currently a blocker bug in
  SwiftMedia), I'm going to refactor File::getPath() into a new public
  function File::getLocalPath(), which will return an instance of a new
  class TempLocalPath, which will have two methods: getPath(), and
  close(). This class will own a local copy of the file. When it goes
  out of scope or its close() method is called (same thing), any
  resources held by the class will be freed.

 Is there a more standardised name than 'close'?  'dispose' is pretty common
 in compiled languages; do we have any sort of standard for that behaviour
 within MW?  If not, is this a good opportunity to create one?


We have free() on things like database result wrappers; I'd probably stick
with that. Generally though nobody has to call these explicitly as the
runtime's reference counting auto-destroys objects once they fall out of
use, and the destructor calls it for us.
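
To illustrate the proposal (purely a sketch of the idea being discussed, not
actual MediaWiki code), the close()/destructor pairing could look like:

    <?php
    // Sketch of the proposed TempLocalPath idea: owns a temporary local copy.
    class TempLocalPath {
        private $path;

        public function __construct( $path ) {
            $this->path = $path;
        }

        public function getPath() {
            return $this->path;
        }

        // Free the local copy; safe to call more than once.
        public function close() {
            if ( $this->path !== null ) {
                unlink( $this->path );
                $this->path = null;
            }
        }

        // Reference counting calls this once the object falls out of use.
        public function __destruct() {
            $this->close();
        }
    }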


As Tim noted, the particular case in Special:Undelete is probably an example
of old code using a lazy interface due to old assumptions; it should
actually be changed to use a proper File reference, with a method to
call on the File object (or via the FileRepo/FileBackend) that wraps
streaming output so it doesn't actually need to do a file checkout.

This didn't come up as much for ForeignAPIRepo because we do fewer direct
operations on files from non-local repositories: we leave metadata lookups
and thumbnailing to the foreign site, so don't have to pass the file into
MediaHandler's shell-outs. We don't stream restricted or deleted files out
because we don't manage their deletion/undeletion or security.


On Wed, Oct 19, 2011 at 3:06 PM, Platonides platoni...@gmail.com wrote:

 Why is this needed?
 A FileSwift class which needed to fetch the file from Swift could
 acquire it when calling getPath(), and have it deleted on FileSwift
 destruction (getPath() is usually called from several methods, so it
 isn't practical to acquire and destroy every time).


I would broadly warn against unnecessarily checking files out to a local
filesystem: while this is fairly cheap with tiny JPEGs and PNG diagrams, and
feels cheap on modern computers for several-megabyte photos, it's still
potentially very expensive for multi-hundred-meg and multi-gigabyte video
files.

Any time we are copying to a local file, we should be sure that we *really
mean it*.

A getPath() that implicitly copies files makes me uneasy... I might actually
recommend putting a deprecation warning on it to help find and reduce these.

A more explicit check-out / discard interface will let us keep existing code
like our thumbnailing that does need to deal with local files, without
hiding that it's a potentially expensive operation. And that can encourage
us to move things towards interfaces that don't need to check files out, but
can read them as streams or skip expensive original-file checks in the first
place when they're not really needed.

-- brion


Re: [Wikitech-l] Open positions with Wikimedia Germany

2011-10-24 Thread Brion Vibber
I just want to chime in with a "yee-haw!" on getting this project going!

-- brion


Re: [Wikitech-l] Welcome, Antoine Musso! (hashar)

2011-10-24 Thread Brion Vibber
On Fri, Oct 21, 2011 at 3:24 PM, Platonides platoni...@gmail.com wrote:

 Rob Lanphier wrote:
  He has been working in our community
  for quite some time, going by the name "Ashar Voultoiz" on mailing
  lists, and by "hashar" on IRC.  Now that he's working as a
  contractor for WMF, he's decided to let everyone know the Superman
  behind the Clark Kent alter ego :)

 That was my first concern, wasn't that supposed to be kept secret??


Nobody recognizes him when he takes those glasses off ;)

-- brion


[Wikitech-l] MediaWiki unit testing and non-MySQL databases

2011-10-24 Thread Brion Vibber
Hey all --

I've got a stack of issues discussed during the last couple months'
conferences and hackathons to raise, so there may be a few more manifestos
on the way. But don't worry, this one will be short!


I had a great chat at the New Orleans hackathon with D J Bauch who's been
working on maintaining MediaWiki's MS SQL Server support, and also had a
very useful email chat a couple weeks back with Freakalowsky who's put a lot
of work into MediaWiki's Oracle support.

Long story short: though traditionally we MediaWiki developers have put very
little of our own work into maintaining non-MySQL compatibility, there are
still lots of folks interested in running on those databases... AND THEY
ACTUALLY MOSTLY WORK!

At this point I think it's a bit crazy of us to keep on marginalizing that
code; some folks running their own instances will need or want or prefer (or
be forced by office IT) to run on some funny database, and we shouldn't
stand in their way. More importantly, keeping things working with multiple
DB backends helps us to keep our code cleaner, and reduces our own hard
dependencies on a particular product line.


There are two main impediments to keeping code working on non-MySQL
platforms:
* lazy code breakages -- us MySQL-natives accidentally use MySQL-isms that
break queries on other platforms
* lazy schema updates -- us MySQL-natives add new schema updates into the
system but only implement them for MySQL

The first could often be helped simply by having automated testing run to
make sure that relevant code paths get exercised. Often it's just a
MySQL-specific function being used, or lazy use of LIMIT/OFFSET, or a GROUP BY
in a form that other databases are pickier about. Just flagging these from the
testing can often be enough to make them easy to fix.
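
A classic example of the GROUP BY case (using MediaWiki's page table for
illustration): MySQL happily accepts the first query, while PostgreSQL, Oracle
and SQL Server reject it:

    -- accepted by MySQL, rejected by stricter databases:
    SELECT page_namespace, page_title, COUNT(*)
    FROM page
    GROUP BY page_namespace;

    -- portable: every non-aggregated selected column appears in GROUP BY
    SELECT page_namespace, COUNT(*)
    FROM page
    GROUP BY page_namespace;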

The second is a bit harder, but again testing can help flag that something
needs to be updated so it doesn't wait until too late for the next release,
or something.


I'd like for everybody who's working on MediaWiki on non-MySQL databases to
make sure that the phpunit test suite can run on your favorite platform, so
we can see about getting them set up in our regular reports on
http://integration.mediawiki.org/ci/
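
If it helps anyone getting started, the suite normally runs from the wrapper
script in the source tree -- something like the following, though the exact
invocation and flags may vary a bit by version:

    # from a MediaWiki core checkout
    cd tests/phpunit
    php phpunit.php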

Green bars are your friend! :)

-- brion


Re: [Wikitech-l] MediaWiki unit testing and non-MySQL databases

2011-10-24 Thread Brion Vibber
On Mon, Oct 24, 2011 at 2:08 PM, Fred Zimmerman zimzaz@gmail.com wrote:

 while we're on the subject, a question about MySQL: is it possible to make
 MW independent of MySQL InnoDB tables? InnoDB tables have a number of
 attributes that can make setup a pain (e.g. file and log sizes).


There should be nothing relying specifically on InnoDB that I know of; we
default to InnoDB for everything possible because it's the only sane table
type widely available on standard MySQL installs (and it should fall back to
whatever the server default is if you don't have InnoDB).

If you do choose to use non-InnoDB tables, I recommend using something else
that is designed to work with transactions and survive server crashes (eg,
DO NOT USE MyISAM)

As noted in other replies, you can tweak $wgDBTableOptions to change the
default table type used when creating new tables; you can change existing
ones with an ALTER TABLE at any time.
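
Concretely, that's just a LocalSettings.php line along these lines (the value
shown is an example; tweak engine/charset to taste):

    // in LocalSettings.php: options appended when new tables get created
    $wgDBTableOptions = 'ENGINE=InnoDB, DEFAULT CHARSET=binary';

and an existing table can be converted in place with a plain
"ALTER TABLE page ENGINE=InnoDB;" at the MySQL prompt.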

-- brion


Re: [Wikitech-l] MediaWiki unit testing and non-MySQL databases

2011-10-25 Thread Brion Vibber
On Tue, Oct 25, 2011 at 8:17 AM, Antoine Musso hashar+...@free.fr wrote:

 Looks like there is a free Oracle version named 'Express' which we might
 use [1].  They are providing a Debian package [2] so we can get it
 installed on the Jenkins host easily; we just need some configuration which can
 probably be handled with puppet [3].


 [1]

 http://www.oracle.com/technetwork/database/express-edition/overview/index.html
 [2] https://help.ubuntu.com/community/Oracle


That debian repo only has the previous version (10g) though; the current 11g
only seems to be available for x86_64 as an RPM package, which allegedly
works on RHEL and OpenSuSE. Sigh...

I added a couple of links to the current download & documentation yesterday, but
haven't had a chance to test anything yet:
https://www.mediawiki.org/wiki/Manual:PHP_unit_testing/Oracle

-- brion


Re: [Wikitech-l] Heads up: Online coding challenge

2011-10-25 Thread Brion Vibber
On Tue, Oct 25, 2011 at 10:40 AM, Platonides platoni...@gmail.com wrote:

 That's a lot! :)
 I wonder how many of them will proceed to code something useful and
 submit it. And how good it will be.

 An issue I see with the challenge is that it encourages the cathedral
 model, so we lose the opportunity of training them as they go (eg.
 pointing out how we do i18n).


Indeed, this is very different from how we traditionally get volunteer
developers involved (by letting them walk into our web of addictive code and
giving them feedback on their code until they can't imagine doing anything
else ;).

I think the theory is that some small percentage of contestants will be
interested enough to actually dive in and do research, write code, and ask
people for feedback directly, and those are the people we'd actually want to
get back to?

Depending on how many of those 2000+ signups actually submit anything, that
might be a *lot* of extra judging work to find them, or it might not be too
much.

-- brion


[Wikitech-l] Visual Editor / parser status update meeting notes 2011-10-26

2011-10-26 Thread Brion Vibber
I've copied the live etherpad notes from today's Visual Editor / New Parser
update meeting onto wikitext-l for those who may be interested:

http://lists.wikimedia.org/pipermail/wikitext-l/2011-October/000463.html

-- brion


Re: [Wikitech-l] Mobile site rewrite implementation

2011-05-13 Thread Brion Vibber
On Fri, May 13, 2011 at 2:40 PM, Patrick Reilly prei...@wikimedia.org wrote:

 The reason that I went the route of creating an extension vs a skin was
 that I wanted the most flexibility in adapting the content for mobile device
 rendering. There are a number of sections that need to be removed from the
 final output in order to render the content effectively on mobile devices.
 So, being able to use PHP output buffer handling is a nice feature. I also
 wanted the ability to use many of the features that are available when
 writing an extension to hook into core functionality.


To clarify -- a skin alone can't really do everything we want out of this
system -- it needs to be able to freely modify a large swath of output,
including rewriting content and styles and breaking long pages into shorter
pages.

A combination of a skin *and* other extension bits could probably be a good
future step for simply *adjusting the view a bit* for high-functioning
modern smartphones on the regular site, but it will need to be combined with
paying more attention to how MediaWiki's native user interface elements
display on a very small screen at native mobile size (as opposed to a
zoomable desktop-sized rendering, as when you view regular MediaWiki on an
iPhone or Android browser).

The 'extension that rewrites a bunch o' stuff on output' allows a more
direct port of the existing Ruby codebase, which'll get equivalent stuff
running and ready to hack sooner. IMO this is a plus!
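
For anyone who hasn't played with it, the output-buffer trick being leaned on
there is plain PHP -- here's a toy, standalone illustration (none of this is
the actual extension code):

    <?php
    // Everything echoed after ob_start() is passed through the callback
    // before being sent, so the callback can strip or rewrite markup.
    function rewriteForMobile( $html ) {
        // e.g. drop navboxes or other sections that don't suit small screens
        return preg_replace( '!<table class="navbox".*?</table>!s', '', $html );
    }

    ob_start( 'rewriteForMobile' );
    echo '<p>Hello</p><table class="navbox"><tr><td>big box</td></tr></table>';
    ob_end_flush();   // sends just '<p>Hello</p>'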

We *should* continue to look at the core MediaWiki interface as well, and
longer term also end up thinking about ways to supplement some forms &
special pages with small-screen-friendly variants. (Editing for instance
will not be very nice with a toolbar that's three times wider than your
screen, and will need a major redesign! :D)

-- brion


Re: [Wikitech-l] MediaWiki tech poster ideas?

2011-05-13 Thread Brion Vibber
Ooooh that would be AWESOME with an update for the Ashburn DC!

-- brion
On May 14, 2011 1:35 AM, Happy-melon happy-me...@live.com wrote:


[Wikitech-l] Hackathon parser session initial notes

2011-05-14 Thread Brion Vibber
We had a whole bunch of folks who've had their hands in the world of
MediaWiki parsing & rich text editing here at the Berlin Hackathon, and made
some great progress on setting out some ideas for how to start actually
working on it.

Tomorrow I'll distill our session notes into a clearer description of the
core ideas & next steps (dare I say... a manifesto? :)

In the meantime, if you're brave you can peek at the raw session notes:
http://etherpad.wikimedia.org/mwhack11Sat-Parser

We're reviving the wikitext-l mailing list for people interested in the
project; it's gotten some traffic about interesting projects but we'll be
making it an active working group. I'll also be making regular posts here on
wikitech-l, on the Wikimedia tech blog, and on the wikis -- but I don't want
to clutter wikitech-l *too* much with the nitty-gritty details. ;)

Project hub pages will go up tomorrow at
http://www.mediawiki.org/wiki/Future

-- brion vibber (brion @ wikimedia.org / brion @ pobox.com)


Re: [Wikitech-l] Hackathon parser session initial notes

2011-05-15 Thread Brion Vibber
On Sat, May 14, 2011 at 7:09 PM, Brion Vibber br...@wikimedia.org wrote:

 In the meantime, if you're brave you can peek at the raw session notes:
 http://etherpad.wikimedia.org/mwhack11Sat-Parser

 We're reviving the wikitext-l mailing list for people interested in the
 project; it's gotten some traffic about interesting projects but we'll be
 making it an active working group. I'll also be making regular posts here on
 wikitech-l, on the Wikimedia tech blog, and on the wikis -- but I don't want
 to clutter wikitech-l *too* much with the nitty-gritty details. ;)

 Project hub pages will go up tomorrow at
 http://www.mediawiki.org/wiki/Future


I've stubbed out a couple sections on:

http://www.mediawiki.org/wiki/Future/Parser_plan

More specific things to follow based on the notes previously posted.

-- brion


Re: [Wikitech-l] Hackathon parser session initial notes

2011-05-16 Thread Brion Vibber
On Sun, May 15, 2011 at 2:13 PM, Brion Vibber br...@wikimedia.org wrote:

 I've stubbed out a couple sections on:

 http://www.mediawiki.org/wiki/Future/Parser_plan

 More specific things to follow based on the notes previously posted.


Added a stub 'get involved' section on
http://www.mediawiki.org/wiki/Future#Core_projects and the other pages;
there'll be more exciting stuff to see within a couple weeks as Trevor &
Neil select testing tools & we all get started on prelim grammar
descriptions.

We can pretty well expect to discard these initial grammar steps that we're
doing as we get farther along, but we're going to need something to work
with for now. :)

Collecting parser test cases & helping to document the existing parse tree
formats in use by some other parser variants will be very useful while we're
getting those together -- please feel free to add some notes to the AST &
test case pages (linked to from the above) or post notes directly on
wikitext-l.

-- brion


Re: [Wikitech-l] Shell requests backlog

2011-05-17 Thread Brion Vibber
On Tue, May 17, 2011 at 3:56 AM, MZMcBride z...@mzmcbride.com wrote:

 Mark A. Hershberger wrote:
  There is hope for every bug, no matter how old.

 As long as people stop resolving old bugs as wontfix simply because
 they're old. I went through and re-opened quite a few bugs that were
 improperly marked as resolved over the weekend.

 People take time to register an account on Bugzilla, describe their issue,
 and then wait, often for _years_, for their issue to get looked at, much
 less worked on and resolved. I can't really stress enough how completely
 unacceptable it is to go around marking these bugs as wontfix without any
 explanation or simply because it's been a long time since the bug was
 filed.


MZ, if you're aware of *any* specific bugs where this is an issue, please
let me or Mark know and we'll take a look at them.

For old shell requests, one issue is that old discussions and consensuses
may no longer apply several years later -- have the conditions changed? Has
there been huge turnover in the community? Will people be pleased that
something is done or angry that something they've never heard of suddenly
changed without their having any voice in it?

In this case generally what we'll try to do is *ping the affected community*
to double-check if the old conditions and decisions stand, and then go ahead
and do it. These won't generally be closed as WONTFIX without actually
attempting to get that info -- or if they are it's as an intermediate stage
along with a request to reopen if more info comes through.

A 'NEEDINFO' resolution would probably be better than 'WONTFIX' for this
case, and we should see if we can enable that. Ensuring that devs know where
to go to ask for a status & consensus update in case the original filer &
CC'd people are unavailable would also help to make these go more
consistently.


When it comes to 'this doesn't work as I expected' bugs, if there's no
apparent way to reproduce the bug, and the original filer doesn't respond,
then closing out is not unreasonable.

Again, it would be better to have a 'NEEDINFO' resolution enabled for us to
use in this case.

I actually took this to mean sister projects (non-Wikipedias), not foreign
 language projects. If Wikisource has gotten 30 minutes of Wikimedia
 Foundation development time _this year_, I'd be shocked.


Wikisource could definitely use more attention; we're still thinking about
ways to keep some of the smaller projects more in the loop.

The primary method is by ensuring that folks can do site script & gadget
JavaScript development without bottlenecking through Wikimedia, which as a
matter of mission has prioritized keeping servers running and Wikipedia
functional.

Wiktionary and Commons have created a *lot* of additional UI functionality
this way, without that bottleneck, and that's something I'm very happy
about.

Wikisource has a number of extensions that were targeted specifically to it
(classic DPL, labeled section transclusion, PagedTiffHandler, ProofreadPage)
which at least get some review maintenance but are mostly created &
maintained by volunteers. Making sure that they're up to date, modernized,
user-friendly, and well-performing is something that we at Wikimedia should
at least be supporting -- and I hope we'll be able to assign a little more
explicit time to making that happen. To date, they generally get maintenance
updates from volunteers but deployment doesn't get pushed, so the old
versions stay in place until a major upgrade with our current deployment
system.

But for the meantime, I can only say that it's important to accentuate the
positive -- find specific projects that need some effort, and help round up
people to help with them.

-- brion vibber (brion @ wikimedia.org / brion @ pobox.com)


Re: [Wikitech-l] Interwikis

2011-05-17 Thread Brion Vibber
On Mon, May 16, 2011 at 9:35 PM, John McClure jmccl...@hypergrove.com wrote:

 I am wondering if interwikis can be more powerfully leveraged.

 For instance, I'd very much like to say ZZZ:my_ns:my_page to indicate a page
 in a particular subset of pages within a namespace in my wiki. Ultimately
 an ACL can be associated with each interwiki descriptor.


Interwiki links don't know the namespace structure of a foreign site's
pages, so all the system knows is the interwiki prefix (eg 'wb:') and the
rest (eg 'fr:Discussion:Foobar'). The target site then has the
responsibility for resolving the link target (which as above might actually
be yet another interwiki link pointing to a wiki in another language, which
*then* resolves the actual namespace and title).


 And although I haven't tested this, does anyone know if a 'local' interwiki
 link (one mapped to the local wiki) is *supposed to* resolve to the local
 wiki?


By definition yes, but you need to tell the system that by setting
$wgLocalInterwiki to that prefix, or it'll still act, format, etc as a
foreign interwiki link and can't detect a self-reference. (If you have
multiple aliases, right now I think you can only pick one as the local
form.)
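
i.e. a LocalSettings.php line along the lines of the following (the prefix is
whatever your own wiki uses for itself):

    // treat the 'mywiki' interwiki prefix as pointing back at this wiki
    $wgLocalInterwiki = 'mywiki';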

-- brion


Re: [Wikitech-l] Output as ePub or Mobi?

2011-05-17 Thread Brion Vibber
On Mon, May 16, 2011 at 1:44 PM, Tomasz Finc tf...@wikimedia.org wrote:

 I'm having some discussions about adding it to the Books collection
 extension (http://www.mediawiki.org/wiki/Extension:Collection).

 Will update as I have more info.


OP has filed a request for review of an EPub exporter extension found via
mediawiki.org: BZ entry -
https://bugzilla.wikimedia.org/show_bug.cgi?id=29023

I've taken a quick initial look and I think it should be feasible, with fixes,
for smaller third-party sites -- so it could probably be copied into our SVN
for maintenance & distribution if there are no license or other major issues.

For production Wikimedia use though I think we'd be happiest with
integrating with the Collection / mwlib stuff, so it fits with our other
export options and (most importantly) that method of building export sets.

-- brion


Re: [Wikitech-l] Feedback on online communications around the Berlin hackathon (including remote participants)

2011-05-23 Thread Brion Vibber
On Fri, May 20, 2011 at 2:59 PM, Sumana Harihareswara suma...@wikimedia.org
 wrote:

 At Nikerabbit's suggestion, an excerpt from a LWN article about Ubuntu
 Developer Summit describing how to thoroughly encourage participation
 from remote & local audiences:

  All of the UDS meetings are set up the same, with a fishbowl of
  half-a-dozen chairs in the center where the microphone is placed so
  that audio from the meeting can be streamed live. There are two
  projector screens in each room, one showing the IRC channel so that
  external participants can comment and ask questions; the other is
  generally tuned to the Etherpad notes for the session, though it can
  be showing the Launchpad blueprint or some other document of interest.
 
  The team that is running the meeting sits in the fishbowl, while the
  other attendees are seated just outside of it; sometimes all over the
  floor and spilling out into the hallway. Audience participation is
  clearly an important part of UDS sessions.


That's a great way to run certain kinds of planning or "present a cool idea &
brainstorm about it to find cool things to start working on" sessions.

It seemed a lot of our sessions this time around were kind of halfway
between that style and either an open-room presentation or a small intense
workgroup; I think with a little better room/group separation for some of
the break-out groups we could make more of them work like that and be more
inviting to remote participants.

Particularly if we can coordinate a little better with some of the
additional groups like the Language Committee & Wiki Loves Monuments people
-- as some folks said on-site, the langcom folks seemed to be a bit more
aggressive about coming over and grabbing devs for questions & comments (hi
GerardM! ;) than the WLM folks, and we'd probably benefit from a little
explicit session time with both groups. Scheduling a brief breakout session
& letting the remote folks have the chance to show up for it too can help
here over just the ad-hoc connections we make person-to-person.


Etherpad's a particularly nice medium for the group note-taking since you
tend to end up with two or three people each sort of half-covering the
session in notes, and they can fill in for each other as attentions wander
to and from specific parts of the conversation. It also gives remote
participants a *direct* way to interact -- what was THIS about? can you
clarify THAT? -- before the on-site participants lose their context and end
up unable to clarify the documents.


Anyway long story short -- super great meetings, and I think we're well on
our way to figuring out how to do a fun & productive hackathon. Thanks to
everybody at WMDE, WMF, and the Beta Haus who helped make it a reality this
year!

-- brion


[Wikitech-l] Any issues with pending jQuery 1.6.1 update on trunk?

2011-05-23 Thread Brion Vibber
Passing on an issue from IRC -- there's some talk of updating the core
jQuery copy from 1.5.2 to the recently-released 1.6.1, which among other
things fixes a bug mdale was encountering with certain JSON loads via
$.ajax() being evaluated in the incorrect way.

Krinkle & mdale are doing some quick testing with 1.6.1 to make sure it
doesn't seem to break anything, but please feel free to test & check your
own code!

Full release notes:
http://blog.jquery.com/2011/05/12/jquery-1-6-1-released/

A few specific things to watch out for:
* .attr() and .prop() have been split apart; some .attr() calls may behave
differently now (see the example below)
* .attr() behavior for boolean properties has changed
* .data() now pulls in HTML 5 data attributes automatically; make sure
there's no conflict?
* animations have changed a bit, which could conceivably affect things using
them
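
For the first two items, this checkbox sketch (element id is made up) shows the
sort of thing worth grepping for and retesting:

    // Before 1.6, .attr( 'checked' ) returned the boolean state of a checkbox.
    // From 1.6 on, .prop() reflects the live DOM property and .attr() the HTML attribute.
    var $box = $( '#my-checkbox' );
    if ( $box.prop( 'checked' ) ) {          // current state: use .prop()
        // ...
    }
    $box.prop( 'checked', true );            // set the live property
    $box.attr( 'title', 'Example tooltip' ); // ordinary attributes still use .attr()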

-- brion


Re: [Wikitech-l] Code review process (was: Status of more regular code deployments)

2011-05-31 Thread Brion Vibber
On Tue, May 31, 2011 at 12:09 PM, Russell N. Nelson - rnnelson 
rnnel...@clarkson.edu wrote:

 Robla writes:
 1.  We say that a commit has some fixed window (e.g. 72 hours) to get
 reviewed, or else it is subject to automatic reversion.  This will
 motivate committers to make sure they have a reviewer lined up, and
 make it clear that, if their code gets reverted, it's nothing
 personal...it's just our process.

 Worse than pre-commit review. What if you make a change, I see it, and make
 changes to my code using your changes. 72 hours later, they get reverted,
 which screws my code.


For the same reason, my position is that a 'minimum time before revert' is
not desirable -- if anything is ever found to be problematic, the correct
thing is to remove it immediately.

Robla counters this with the observation that when there are multiple people
handling code review, simply being *uncertain* about whether something needs
a revert or not is better grounds for *passing it on* to someone else to
look at than to immediately revert.

This should usually mean a few minutes on IRC, however, not just a 72-hour
silence without feedback!


What really worries me though isn't the "somebody else built a patch on this
within 72 hours, now it's slightly awkward to revert" case but the "other
people built 100 patches on this in the last 10 months, now it's effing
impossible to revert" case.


Ultimately, we need to remember what reverting a commit means:

** Reverting means we're playing it safe: we are taking a machine in unknown
condition out of the line of fire and replacing it with a machine in
known-good condition, so the unknown one can be reexamined safely in
isolation while the whole assembly line keeps moving. **

It's not a moral judgment against you when someone marks code as fixme or
reverts it; it's simple workplace safety.

-- brion


Re: [Wikitech-l] Code review process (was: Status of more regular code deployments)

2011-05-31 Thread Brion Vibber
On Tue, May 31, 2011 at 12:37 PM, Max Semenik maxsem.w...@gmail.com wrote:

 On 31.05.2011, 22:41 Rob wrote:

  So, there's a number of possible solutions to this problem.  These are
  independent suggestions, but any of these might help:
  1.  We say that a commit has some fixed window (e.g. 72 hours) to get
  reviewed, or else it is subject to automatic reversion.  This will
  motivate committers to make sure they have a reviewer lined up, and
  make it clear that, if their code gets reverted, it's nothing
  personal...it's just our process.
  2.  We encourage committers to identify who will be reviewing their
  code as part of their commit comment.  That way, we have an identified
  person who has license to revert if they don't understand the code.

 Social problems are hard to fix using technical means. What will
 happen if someone approves a revision that later turns out to be
 defective, off with their head?


This isn't a technical solution, but a social one to force us to make sure
that things that haven't been looked over GET looked at.

Consider it to already be our hypothetical model -- it's just not been
enforced as consistently as it's meant to be. Bad code -> fixed or reverted
ASAP. Unseen code -> assumed bad code, so look at it.


 OMG my commit will be wasted in 20
 minutes! Heeelponeone is not the best scenario for clear thinking.
 Time pressure might result in people reviewing stuff they're less
 familiar with, and reviews being done in a haste, so CR quality will
 suffer.


I'd say better to do that continuously and confirm that things are being
reviewed *and* run on automated tests *and* run against humans on
actual-production-use beta servers *and* in actual production use within a
reasonable time frame, than to let review slide for 10 months and rush it
all right before a giant release.

That IMO is what really hurts code review -- removing it from the context of
code *creation*, and removing the iterative loop between writing, debugging,
testing, debugging, deploying, debugging, fixing, and debugging code.



 Although I'm not a biggest fan of rushing towards DVCS, the
 distributed heavily-branched model where mainline is supported only
 through merges from committed-then-reviewed branches looks far more
 superior to me. This way, we will also be able to push stuff to WMF
 more often as there will be no unreviwewed or unfinished code in
 trunk.


I tend to agree that a more active 'pull good fixes & features in from
feeder branches' model can be helpful, for instance in allowing individual
modules, extensions, etc to iterate faster in a shared-development
environment without forcing the changes into trunk/mainline's production
deployment & release channels until they're ready.

But these need to be combined with really active, consistent management;
just like keeping the deployment & release branches clean, working, and up
to date consistently requires a lot of attention -- when attention gets
distracted, the conveyor belt falls apart and fixes & features stop reaching
the public until the next giant whole-release push, which turns into a
circus.

-- brion


Re: [Wikitech-l] Status of more regular code deployments

2011-05-31 Thread Brion Vibber
On Tue, May 31, 2011 at 2:50 PM, Alex Mr.Z-man mrzmanw...@gmail.com wrote:

 On 5/31/11, Trevor Parscal tpars...@wikimedia.org wrote:
  There's been tons of discussion about what is an ideal release schedule
 for
  us (probably literally if you printed it out at 12pt on standard copy
  paper). At this point I think we are balancing between fighting the bad
  habits that got us into the 1.17 rut, and trying to give our engineering
  team enough time to do significant work (rather than spending 1 day a
 week
  pushing code).

 One issue is that time spent deploying does not appear to be
 independent of time between deployments. If it always took a day to
 deploy new code, that would be a good argument in favor of more time
 between deployments. But in practice, the more time that passes
 without pushing code to the site, the longer it takes to complete the
 deployment. Spending a week deploying every other month doesn't really
 save much time over spending a day every week.


I'd actually consider it pretty ideal to spend a day every week deploying
updates, gathering feedback, and making fixes. Rather than an argument in
favor of longer cycles, I'd consider that an argument in favor of shorter
cycles!

-- brion


Re: [Wikitech-l] LiquidThreads - drag to new location does not work. Does it work for you?

2011-05-31 Thread Brion Vibber
On Sun, May 29, 2011 at 12:54 AM, Thomas Gries m...@tgries.de wrote:

 RE: http://www.mediawiki.org/wiki/Extension_talk:LiquidThreads

 Let me talk about talk pages with Lqt.
 The function to drag a LiquidThread to a new location is not working
 for me.


Looks like this got resolved now via
https://bugzilla.wikimedia.org/show_bug.cgi?id=28407 ?

-- brion


Re: [Wikitech-l] Status of more regular code deployments

2011-05-31 Thread Brion Vibber
On Tue, May 31, 2011 at 3:20 PM, MZMcBride z...@mzmcbride.com wrote:

 Yes, it's incredibly frustrating to the point that volunteer developers
 have
 walked away from MediaWiki code development. This is the wiki project;
 things are supposed to be fast! When someone makes a patch or a commit and
 the best we can say is it might get looked at this year with a fifty
 percent confidence interval, that's not okay.

 Code deployments to the live sites need to happen more often than
 quarterly.
 There are almost no benefits to such a long gap between deployments and the
 detriments are clear.

 Is there a problem with saying hell or high water, we won't go more than a
 month without a deployment? I realize Wikimedia is a bit different, but
 taking a page out of the rest of the business world's book, you set a
 deadline and then it just fucking gets met. No excuses, no questions. The
 problem seems to be finding anyone to lay down the damn law.


Sing it, brother! We're getting *some* stuff through quicker within the
deployment branches, but not nearly as much as we ought to.

Whatever the overall release branching frequency, there's also an update
cycle within that for getting fixes & independent features (usually
extensions) pushed out to the wild, and that's the cycle that's often most
important to users (because that's where their most immediate problems get
fixed) & developers (because users are where we get feedback).

Fixes need to be hitting production within days at longest, and more
features than just UploadWizard and ArticleFeedback should be able to make
it out on their own schedule without being forced to wait for a total
infrastructure update.

-- brion


Re: [Wikitech-l] Status of more regular code deployments

2011-05-31 Thread Brion Vibber
On Tue, May 31, 2011 at 4:31 PM, Happy-melon happy-me...@live.com wrote:

 Brion Vibber br...@pobox.com wrote in message
 news:BANLkTinZzef=orvuw9dwicvahybuwxn...@mail.gmail.com...
 
  Sing it, brother! We're getting *some* stuff through quicker within the
  deployment branches, but not nearly as much as we ought to.

 The fact that some things are *not* getting stuck in the CR quagmire is
 part
 of the *problem*, not the solution.  The upper levels of the developer
 hierarchy ***obviously know*** that the mainline CR process is
 substantially
 broken, BECAUSE ***THEY'RE NOT USING IT*** FOR THINGS THAT ARE IMPORTANT TO
 THEM.


That's not correct -- the same code review tools are being used to look over
and comment on those things, AND THEN THE DAMN THING ACTUALLY GETS MERGED TO
THE DEPLOYMENT BRANCH BY SOMEBODY AND PUSHED TO PRODUCTION BY SOMEBODY.

It's that step 2 that's missing for everything else until the accumulated
release pressure forces a system-wide update.

The unavoidable implication of that observation is that the work of
 the volunteer developers *DOESN'T* matter to them.  Whether or not that
 implication is correct (I have confidence that it's not) is irrelevant, the
 fact that it's there is what's doing horrible damage to the MediaWiki
 community.



-- brion


Re: [Wikitech-l] Code review process (was: Status of more regular code deployments)

2011-06-01 Thread Brion Vibber
On Wed, Jun 1, 2011 at 8:23 AM, Roan Kattouw roan.katt...@gmail.com wrote:

 This, for me, is the remaining problem with the 72-hour rule. If I
 happen to commit a SecurePoll change during a hackathon in Europe just
 as Tim leaves for the airport to go home, it's pretty much guaranteed
 that he won't be able to look at my commit within 72 hours. Or maybe
 I'm committing an UploadWizard change just after 5pm PDT on Friday,
 and the next Monday is a federal holiday like the 4th of July. Or
 maybe the one reviewer for a piece of code has just gone on vacation
 for a week and we're all screwed.


I think rather than us being all screwed, this is a situation where we
*don't actually have to* commit that change to trunk right at that exact
time. Unless the change needs to be deployed immediately -- in which case
you really should have someone else looking it over to review it IMMEDIATELY
-- it can probably wait until reviewers are available.

And if it does get reverted before the reviewer gets to it -- what's wrong
with that? It'll get reviewed and go back in later.


Part of the reason we went so hog-wild with handing out commit access for a
long time is because we've never *been* as good as we want about following
up with patches submitted through the bug trackers and mailing lists;
sometimes things get reviewed and pushed through, but often they just get
forgotten. Letting people commit directly meant that things didn't get
forgotten -- they were right there in the code now -- with the caveat that
if there turned out to be problems, we would have to go through and take
them back out.

It's a trade-off we chose to make at the time, and it's a trade-off we can
revisit if we change the underlying conditions.

-- brion


Re: [Wikitech-l] Code review process (was: Status of more regular code deployments)

2011-06-01 Thread Brion Vibber
On Wed, Jun 1, 2011 at 8:28 AM, Chad innocentkil...@gmail.com wrote:

 I *do* think we should enforce a 48hr revert if broken rule. If you
 can't be bothered to clean up your breakages in within 48 hours of
 putting your original patch in, it must not have been very important.


Officially speaking, since forever we have had an "it's acceptable to revert
on sight if broken" rule.

If you encounter something broken, correct responses include:
* revert it immediately to a known good state; send an explanation to the
author about what's wrong (usually a CR comment and reopening a 'fixed'
bugzilla entry)
* figure out the problem and fix it; send follow-up explanation to the
author (usually a CR comment and a Bugzilla comment if it's a followup on a
bug fix)
* take a stab at the problem and recommend a solution to someone else who
will fix it straight away; including a note on CR or Bugzilla summarizing
your IRC discussion is wise in case it ends up getting dropped
* tag it fixme and ask someone more familiar with the stuff than you to
handle it

-- brion


Re: [Wikitech-l] tests for branches (was: Code review process)

2011-06-01 Thread Brion Vibber
On Wed, Jun 1, 2011 at 10:28 AM, Max Semenik maxsem.w...@gmail.com wrote:

 On 01.06.2011, 20:56 Ashar wrote:
  I am planning to add the REL1_18 branch to the CruiseControl system as a
  new project. The prerequisite being to make it work for trunk :-)

 Would be great! Though even greater would be to ditch CruiseControl
 completely and find something that allows us to add/remove branches
 via a web interface so that every private branch can be easily tested
 *before* being merged into trunk.


That'll be especially handy in a future git world, which will be a lot
friendlier to private and shared work branches. (SVN branches are hard to
re-merge, which has led to us mainly using them for version branching --
which never has to merge back upstream -- and to our using so few work
branches.)

-- brion


Re: [Wikitech-l] Status of more regular code deployments

2011-06-01 Thread Brion Vibber
On Wed, Jun 1, 2011 at 7:58 AM, Roan Kattouw roan.katt...@gmail.com wrote:

 Alright, we've had fun with all caps for a while, so let's focus on
 solutions now.


Ok, I'm turning in my caps lock key. ;)


 So let me sketch how I see us getting there.

 1. Get 1.18 reviewed
 2a. Get 1.18 deployed and released
 2b. At the same time, continue efforts to have code review catch up
 with SVN HEAD
 3. Once we have reviewed everything up to HEAD (or up to a revision
 less than a week older than HEAD), deploy it to Wikimedia
 4. Keep reviewing so the CR backlog doesn't grow, and deploy HEAD
 again in 1-2 weeks. Rinse and repeat.

 There are a few caveats here, of course:
 * 2a and 2b largely compete for the time of the same developers


That's one of the reasons actually assigning time for it should be helpful:
currently we have zero staff dedicated full-time to getting code ready for
deployment and deployed. We have several staff & contractors with it on
their list of duties, but no actual requirement that it get done on a
regular basis. With no regular short-term review & deployment schedule (even
for fixes), we all find other things to do and get out of the habit.

And let's not forget that a big part of what review does is to help develop
the reviewee's coding skills! By pointing out things that need fixing and
things to watch out for, we help all our coders become better coders. And by
pushing review closer to the original code-writing, we'll make that feedback
loop tighter & more effective.



 * 3 (initial HEAD deployment) will probably be a bit hairy because
 it's a lot of code the first time. After that it's only 1-2 weeks'
 worth each time
 * For 4, we need to figure out a way to make CR actually happen in a
 regular and timely fashion


+1



 So a lot of it comes down to implementing a proper code review
 process, both for the one-off catch-up sprint (to 1.18, then to HEAD;
 we've done one of these before with 1.17) and for the day-to-day
 keep-up work. I think we may also need to make it clearer that
 reviewers (who, typically, are paid developers working on many other
 things) need to spend one-fifth of their time (i.e. a day per week for
 full-time employees) on code review. This is basically the "admins
 aren't watching Special:Unreviewedpages at all" problem that
 Happy-melon mentioned.


*nod*

-- brion


Re: [Wikitech-l] Code review process (was: Status of more regular code deployments)

2011-06-01 Thread Brion Vibber
On Wed, Jun 1, 2011 at 12:43 PM, Aryeh Gregor simetrical+wikil...@gmail.com
 wrote:
[snip]

 The point is, people's code should only be rejected with some specific
 reason that either a) lets them fix it and resubmit, or b) tells them
 definitively that it's not wanted and they shouldn't waste further
 effort or hope on the project.

[snip]

"This isn't ready for core yet -- the code looks a bit dense and we don't
understand some of what it's doing yet. Can you help explain why you coded
it this way and add some test cases?"

It's usually better if you can do this at a pre-commit stage: when reviewing
a patch on bugzilla, when looking over a merge request from a branch, etc.
But it's ok to pull something back out of trunk to be reworked, too.

-- brion


Re: [Wikitech-l] Code review process (was: Status of more regular code deployments)

2011-06-01 Thread Brion Vibber
On Wed, Jun 1, 2011 at 12:43 PM, Aryeh Gregor simetrical+wikil...@gmail.com
 wrote:

 So then what happens if volunteers' contributions aren't reviewed
 promptly?


Indeed, this is why we need to demonstrate that we can actually push code
through the system on a consistent basis... until we can, nobody seems
willing to trust pre-commit review.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Code review process (was: Status of more regular code deployments)

2011-06-01 Thread Brion Vibber
On Wed, Jun 1, 2011 at 1:53 PM, bawolff bawolff...@gmail.com wrote:

 As a volunteer person, I'm fine if code I commit is reverted based on
 it sucking, being too complicated, being too ugly, etc provided there
 is actually some objection to the code. However, I'd be rather
 offended if it was reverted on the basis of no one got around to
 looking at in the last 3 days, since that is something essentially out
 of my control.


When someone looks at your commit within ~72 hours and reverts it because
nobody's yet managed to figure out whether it works or not and it needs more
research and investigation... what was the reason for the revert?

Because 'no one reviewed it'? Or because someone looked at it and decided it
needed more careful review than they could give yet?

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Code review process (was: Status of more regular code deployments)

2011-06-01 Thread Brion Vibber
On Wed, Jun 1, 2011 at 4:41 PM, Happy-melon happy-me...@live.com wrote:

 +1.  Pre-commit-review, post-commit-lifetime, branching, testing, whatever;
 all of the suggestions I've seen so far are IMO doomed to fail because they
 do not fix the underlying problem that not enough experienced manhours are
 being dedicated to Code Review for the amount of work (not the 'number of
 commits', the amount of *energy* to make changes to code) in the system.  A
 pre-commit-review system doesn't reduce the amount of work needed to get a
 feature into deployment, it just changes the nature of the process.


This is one of the reasons I tend to advocate shorter
development/review/deployment cycles. By keeping the cycle short, we can
help build up regular habits: run through some reviews every couple days. Do
a deployment update *every* week. If you don't think your code will be
working within that time, either work on it outside of trunk or break it up
into pieces that won't interfere with other code.

With a long cycle, review gets pushed aside until it's no longer habit, and
gets lost.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Status of more regular code deployments

2011-06-01 Thread Brion Vibber
On Wed, Jun 1, 2011 at 5:57 PM, Aryeh Gregor
simetrical+wikil...@gmail.com wrote:

 On Wed, Jun 1, 2011 at 6:52 PM, MZMcBride z...@mzmcbride.com wrote:
  This topic has come up several times and each time, it seems like the
 people
  with any kind of power are conspicuously absent. Where is Danese? Is she
  subscribed to this list? Is she reading it?

 As I read it, EPMs are directly beneath Danese and presumably have
 some say in how tech resources are used (thus Manager):

 http://wikimediafoundation.org/wiki/Engineering_Program_Manager

 Software EPMs listed at
 http://wikimediafoundation.org/wiki/Staff#Technology (excluding
 Usability) are Alolita Sharma, Tomasz Finc, and Rob Lanphier.  Robla
 is the EPM in charge of General Engineering, which as I understand it
 is the group that includes most of the staff developers participating
 in this discussion.  He's consistently commented on all of these
 threads, and is clearly reading them.  So I'm pretty sure the issue
 isn't being ignored.


Indeed, this conversation is as active live in the office as it is on the
list, and the folks most directly in the line of fire are commenting.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] On the IE6 query string security check breakage (bug 28235)

2011-06-02 Thread Brion Vibber
Is there a way we can narrow down this security check so it doesn't keep
breaking API requests, action=raw requests, and ResourceLoader requests,
etc?

Having the last param in a query string end in, say, ".png" or ".svg" or
".jpg" or ".ogg" is very frequent when dealing with uploaded files and
file pages. In addition to the reported breakages with ResourceLoader, I've
seen this problem break classic-style action=raw site CSS page loads
(action=raw&title=MediaWiki:Filepage.css) and API requests for
MwEmbed+TimedMediaHandler's video player, for the OEmbed extension I'm
fiddling with, etc.

The impression I get is this is:

1) only exploitable on IE 6 (which is now a small minority and getting
smaller)
2) only exploitable if the path portion of the URL does not include an
unencoded period (eg 'api' or 'api%2Ephp' instead of 'api.php')
3) only exploitable if raw HTML fragments can be injected into the output,
eg a '<body>' tag or other markup that triggers IE's HTML detection


For 1) I'm honestly a bit willing to sacrifice a few IE 6 users at this
point; the vendor's dropped support, shipped three major versions, and is
actively campaigning to get the remaining users to upgrade. :) But I get
wanting to protect them, so if we can find a workaround that's ok.

For 2) ... if we can detect this it would be great as we could avoid
breaking *any* api.php, index.php, or load.php requests in most real-world
situations.

For 3) ... formatted XML output from the API should always be safe since,
even if it triggers XML/HTML detection, you can't slip in arbitrary script
bits. JSON output seems to be the problematic vector currently, as you can
manage to get arbitrary strings embedded in some places like error messages:

  {"warnings":{"siteinfo":{"*":"Unrecognized value for parameter
'siprop': <body onload=alert(1)>.html"}}}

On the other hand if our JSON output escaped '<' and '>' characters you'd
get this totally safe document:

  {"warnings":{"siteinfo":{"*":"Unrecognized value for parameter
'siprop': \u003Cbody onload=alert(1)\u003E.html"}}}


I tested this by slipping a couple lines into ApiFormatJson:

$this->printText(
    $prefix .
    str_replace( '<', '\\u003C',
        str_replace( '>', '\\u003E',
            FormatJson::encode( $this->getResultData(),
                $this->getIsHtml() ) ) ) .
    $suffix
);

and can confirm that IE 6 doesn't execute the script bit.
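
As a side note, plain json_encode() in PHP 5.3+ can produce the same escaping
natively via the JSON_HEX_TAG flag -- a minimal standalone sketch, not tied to
FormatJson or to this particular patch:

echo json_encode(
    array( '*' => "Unrecognized value for parameter 'siprop': <body onload=alert(1)>.html" ),
    JSON_HEX_TAG
);
// Prints {"*":"Unrecognized value for parameter 'siprop':
//         \u003Cbody onload=alert(1)\u003E.html"}  (wrapped here for readability)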

Are there any additional exploit vectors for API output other than HTML tags
mixed unescaped into JSON?

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] On the IE6 query string security check breakage (bug 28235)

2011-06-02 Thread Brion Vibber
On Thu, Jun 2, 2011 at 2:20 PM, Roan Kattouw roan.katt...@gmail.com wrote:

 On Thu, Jun 2, 2011 at 10:56 PM, Brion Vibber br...@pobox.com wrote:
  Is there a way we can narrow down this security check so it doesn't keep
  breaking API requests, action=raw requests, and ResourceLoader requests,
  etc?
 
 Tim had an idea about redirecting bad URLs to fixed ones. He ran it by
 me last night his time, and my guess is he'll probably implement it
 this morning his time. But I'll leave it up to him to elaborate on
 that.


I know this has already been brought up, but that doesn't work for POST, and
may not work for API clients that don't automatically follow redirects.
(Which it looks like includes MediaWiki's ForeignAPIRepo since our Http
class got redirection turned off by default a couple versions ago.)

Your ideas to secure api.php output against HTML abuse are
 interesting, but I don't think the txt and dbg formats can be fixed
 that way.


Why do we actually have these extra unparseable formats? If they're for
debug readability then we can probably just make them HTML-formatted, like
jsonfm/xmlfm/etc.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] On the IE6 query string security check breakage (bug 28235)

2011-06-02 Thread Brion Vibber
On Thu, Jun 2, 2011 at 2:44 PM, Roan Kattouw roan.katt...@gmail.com wrote:

  However, I don't think POST is that much
 of a problem. Problematic dots can only result from 'variable' pieces
 of data being put in your query string, and with POST you'd typically
 put the variable parts in the POST body. And even if you're not doing
 that, moving something from the query string to the POST body should
 be trivial.


Good point! That helps simplify things. :D

 Why do we actually have these extra unparseable formats? If they're for
  debug readability then we can probably just make them HTML-formatted,
 like
  jsonfm/xmlfm/etc.
 
 To be honest, I don't remember. I think they can die. I'll take a look
 at the revision history tomorrow to check why they were introduced in
 the first place.


Spiff. I'd be happy to kill 'em off entirely ;) but if there's a better way
that still keeps our functional output formats safe & working, that's
super.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] On the IE6 query string security check breakage (bug 28235)

2011-06-03 Thread Brion Vibber
On Thu, Jun 2, 2011 at 5:21 PM, Tim Starling tstarl...@wikimedia.org wrote:

 On 03/06/11 06:56, Brion Vibber wrote:
  For 1) I'm honestly a bit willing to sacrifice a few IE 6 users at this
  point; the vendor's dropped support, shipped three major versions, and is
  actively campaigning to get the remaining users to upgrade. :) But I get
  protecting, so if we can find a workaround that's ok.

 We can't really do this without sending "Vary: User-Agent", which
 would completely destroy our cache hit ratio. For people who use Squid
 with our X-Vary-Options patch, it would be possible to use a very long
 X-Vary-Options header to single out IE 6 requests, but not everyone
 has that patch.


I'm really thinking more along the lines of: if someone's an IE 6-or-below
user they have hundreds of other exploit vectors staring them in the face
too, and we can't protect them against many of them -- or ANY of them if
they're visiting other sites than just an up-to-date MediaWiki.

The cost of this fix has been immense; several versions of the fix with
varying levels of disruption on production sites, both for IE 6 users and
non-IE 6 users, and several weeks of delay on the 1.17.0 release.

I'd be willing to accept a few drive-by downloads for IE 6 users; it's not
ideal but it's something that their antivirus tools etc will already be
watching out for, that end-users already get trained to beware of, and that
will probably *still* be exploitable on other web sites that they visit
anyway.


The main issue here is that we don't have a wide variety of web servers set
 up for testing. We know that Apache lets you detect %2E versus dot via
 $_SERVER['REQUEST_URI'], but we don't know if any other web servers do
 that.

 Note that checking for %2E alone is not sufficient, a lot of
 installations (including Wikimedia) have an alias /wiki ->
 /w/index.php which can be used to exploit action=raw.


Well that should be fine; as long as we can see the /wiki?/foo.bat then we
can identify that it doesn't contain an unencoded dot in the path.

It sounds like simply checking REQUEST_URI when available would eliminate a
huge portion of our false positives that affect real-world situations.
Apache is still the default web server in most situations for most folks,
and of course runs our own production servers.
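
To make that concrete, here's a rough sketch (not the actual MediaWiki patch;
the function name is made up) of how a REQUEST_URI check could rule out most
false positives -- per point 2 earlier in the thread, the IE 6 vector only
exists when the path has *no* unencoded dot, so when one is present the strict
query-string check can be skipped:

// Sketch only; Apache (and most common setups) populate $_SERVER['REQUEST_URI'].
function requestPathHasUnencodedDot() {
    if ( !isset( $_SERVER['REQUEST_URI'] ) ) {
        return false; // Can't tell on this web server; keep the strict check.
    }
    // Look at the path only, not the query string.
    $path = strtok( $_SERVER['REQUEST_URI'], '?' );
    // An unencoded '.' in the path (e.g. /w/api.php) is what IE 6 keys on,
    // so such a request isn't exploitable via a query-string "extension".
    return strpos( $path, '.' ) !== false;
}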



  Are there any additional exploit vectors for API output other than HTML
 tags
  mixed unescaped into JSON?

 Yes, all other content types, as I said above.


Only as drive-by downloads, or as things that execute without interaction?


 I think the current solution in trunk, plus the redirect idea that
 I've been discussing with Roan, is our best bet for now, unless
 someone wants to investigate $_SERVER['REQUEST_URI'].


*nod* Checking REQUEST_URI is probably the first thing we should do when
it's available.


 If there is an actual problem with ForeignAPIRepo then we can look at
 server-side special cases for it. But r89248 should allow all API
 requests that have a dotless value in their last GET parameter, and a
 quick review of ForeignAPIRepo in 1.16 and trunk indicates that it
 always sends such requests.


Yay! That's one less thing to worry about. :D


 Since we're talking about discarded solutions for this, maybe it's
 worth noting that I also investigated using a Content-Disposition
 header. The vulnerability involves an incorrect cache filename, and
 it's possible to override the cache filename using a
 Content-Disposition filename parameter. The reason I gave up on it
 is because we already use Content-Disposition for wfStreamFile():

header( "Content-Disposition: inline;filename*=utf-8'$wgLanguageCode'" .
 urlencode( basename( $fname ) ) );

 IE 6 doesn't understand the charset specification, so it ignores the
 header and goes back to detecting the extension.


Good to know.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Code review process (was: Status of more regular code deployments)

2011-06-03 Thread Brion Vibber
On Fri, Jun 3, 2011 at 3:33 PM, Rob Lanphier ro...@wikimedia.org wrote:

 So, can we make it a goal to get to 1329 by this time next week?  That
 first 265 should be a lot easier than the last 265, so if we can't get
 through the first 265, then we don't really stand a chance.


I accept this challenge! My 20% is committed for next week... are y'all with
us? :D

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Foundation-l] YouTube and Creative Commons

2011-06-03 Thread Brion Vibber
(I'm not sure offhand if I'm set up to cross-post to Foundation-l; if this
doesn't make it, somebody please CC a mention if necessary. Thanks!)

On Fri, Jun 3, 2011 at 4:42 PM, aude aude.w...@gmail.com wrote:

 Aside from the very real privacy issue, YouTube videos can disappear at any
 time.  I would much rather we host them on Commons.

 A youtube2commons script is pretty easy to implement, but the big hindrance
 is the 100 MB upload limit on Commons.  What are the plans (if any) to
 increase this? or help facilitate us (e.g. via the api, from the
 toolserver,
 ...) in uploading video and files that exceed the limit?


There's been some ongoing work on TimedMediaHandler extension which will
replace the older OggHandler (this provides the nicer video player we've got
hacked in on Commons, and some automatic transcoding for different
resolutions in the free Theora & WebM formats). This still needs a little
more work, and probably some improvements on the actual transcoding
management and such, but this will help a bit in supporting larger files (eg
by transcoding high-res files to lower-res they're more easily viewable).
This isn't ready to go just yet though, and there's still the issue of
actually uploading the files.

Basic uploads by URL work in theory, but I'm not sure of the deployment status.
Background large-file downloads are currently disabled in the latest code
and need to be reimplemented if that's to be used.

For straight uploads, regular uploads of large files are a bit problematic
in general (they hit memory limits and such and have to make it through
caching proxies and whatnot), but there's also been some new work on
improved chunked uploads for FireFogg (and perhaps for general modern
browsers that can do fancier uploads). Michael Dale can probably give some
updates on this, but it'll be a bit yet before it's ready to go.
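
For third-party wikis bumping into similar limits, the main knobs look roughly
like the snippet below (example values only, and as far as I know PHP's own
upload_max_filesize / post_max_size still cap whatever MediaWiki is told):

// LocalSettings.php fragment -- illustrative values, not Commons' config.
$wgEnableUploads    = true;
$wgAllowCopyUploads = true;              // enables "upload by URL" on Special:Upload
$wgMaxUploadSize    = 500 * 1024 * 1024; // 500 MB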

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] I'm in ur trunk, reviewing pre-1.18 branch code

2011-06-07 Thread Brion Vibber
Currently working (mostly backwards) to fill in the Code Review holes from
before 1.18 branch point:

http://www.mediawiki.org/w/index.php?title=Special:Code/MediaWiki&offset=87519&path=%2Ftrunk%2Fphase3

Roadmap provisionally says July for 1.18 deployment:
http://www.mediawiki.org/wiki/MediaWiki_roadmap

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] I'm in ur trunk, reviewing pre-1.18 branch code

2011-06-07 Thread Brion Vibber
On Tue, Jun 7, 2011 at 12:58 PM, Roan Kattouw roan.katt...@gmail.com wrote:

 On Tue, Jun 7, 2011 at 8:56 PM, Brion Vibber br...@pobox.com wrote:
  Currently working (mostly backwards) to fill in the Code Review holes
 from
  before 1.18 branch point:
 
 
  http://www.mediawiki.org/w/index.php?title=Special:Code/MediaWiki&offset=87519&path=%2Ftrunk%2Fphase3
 
 Yay!

 Did you see the Etherpad in which Mark Hershberger assigned certain
 paths to certain people?


Etherpad's great for notes but less great as an ongoing assignment space
than, say, a wiki page that can be found by searching.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] I'm in ur trunk, reviewing pre-1.18 branch code

2011-06-08 Thread Brion Vibber
That reminds me I should merge docs to the roadmap page! For now those live
at http://www.mediawiki.org/wiki/Future

-- brion
On Jun 8, 2011 4:13 AM, Roan Kattouw roan.katt...@gmail.com wrote:
 On Wed, Jun 8, 2011 at 9:30 AM, Dmitriy Sintsov ques...@rambler.ru
wrote:
 I wish there was WYSIWYG editor in the roadmap. It can be anything:
 Wikia RTE, Magnus Manske visual editor, or anything else - however,
 supporting creating content from scratch (not just editing over already
 existing article), working with built-in parser out of box, and no
 glitches like FCKeditor has. And being an WMF extension with all the
 advantages of continuous integration (core / extension compatibility).
 Dmitriy

 There are people at WMF working on a visual editor. It's not on the
 roadmap page, but that doesn't mean it's not being worked on :)

 Roan Kattouw (Catrope)

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Extension bundling for 1.18

2011-06-08 Thread Brion Vibber
On Wed, Jun 8, 2011 at 1:30 PM, Roan Kattouw roan.katt...@gmail.com wrote:

 Why don't we first ask Tim how complicated it would be, and get
 someone else to do it if it's more than 2-3 hours of work? I'm also
 not sure the scripts Tim uses to create a tarball are even in SVN
 anywhere, maybe he'd be willing to share them if they're not public
 already.


They've been in SVN for some time (and luckily Tim rewrote them in Python
from my older horrid bash code ;)

http://svn.wikimedia.org/viewvc/mediawiki/trunk/tools/make-release/

Basically, we do a SVN export, chop out a few unneeded files, build a
tarball and a patch, and GPG-sign them. Also checking out some extensions
and dropping them in would not be very difficult to throw in.

Currently the installer's support for extensions is limited; some won't
actually set up right, and we don't handle dependencies well, but
self-contained stuff like ParserFunctions and Gadgets should be pretty
trivial.

It would I think be possible to stash a few things in for 1.18 release with
no problem (and no new code, just tossing the existing branched extensions
into the tarball) if we wanted to, though they wouldn't be automatically
activated at install unless the user actually selects them. Actually making
things default would need some more work, and either way we'd want to do a
little testing. :)

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Intended purpose of MediaWiki:Common.js anno MW 1.19/RL 2.0

2011-06-08 Thread Brion Vibber
On Wed, Jun 8, 2011 at 12:16 PM, Krinkle krinklem...@gmail.com wrote:

 Subject was: Update Gadgets extension on WMF wikis

 This part of the thread is hereby forked to discuss
 MediaWiki:Common.js. I do not believe MediaWiki:Common.js should be
 deleted (atleast not yet), I'm merely curious what usecases people
 have for it which may or may not be useful to be taken into account
 during the upcoming revision of ResourceLoader and Gadgets.


In general, modular gadgets will be a better way in the future to create,
maintain, share, and re-use client-side code components, as it will provide
better infrastructure for doing those things versus manually cut-and-pasting
a few bits of code.

At the moment, we're only partway there; sharing things from site to site is
still awkward, and there's not a good management interface for site admins
to work with.

Obviously MediaWiki:Common.js won't be removed in the short term (and may
stay around for compat for a long long time).

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Extension bundling for 1.18

2011-06-08 Thread Brion Vibber
On Wed, Jun 8, 2011 at 1:43 PM, Chad innocentkil...@gmail.com wrote:

 On Wed, Jun 8, 2011 at 4:41 PM, Brion Vibber br...@pobox.com wrote:
  Currently the installer's support for extensions is limited;

 Yes :(

  some won't actually set up right,

 Examples?


Whatever was listed in bugzilla on that one bug where something didn't run
its installer stages or something? I don't remember; the point is that we
know we don't hook all hooks etc.

 and we don't handle dependencies well,

 s/well/at all/


exactly. :)


  but
  self-contained stuff like ParserFunctions and Gadgets should be pretty
  trivial.
 

 I agree :)


\o/

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Cache Expiry

2011-06-08 Thread Brion Vibber
On Wed, Jun 8, 2011 at 1:50 PM, Robert Cummings rob...@interjinn.com wrote:

 I've looked around for methods to manipulate the cache duration for a
 given article. There are methods to facilitate the expiration of an
 article, and there are extensions that can disable caching for articles
 via convenient management interfaces. What seems to be lacking though,
 is a way to expire the article after some duration. For instance, I'm
 working with embedded RSS feeds...

[snip]

It looks like MediaWiki 1.17 and later support this already, by calling
updateCacheExpiry() on the parser cache output object.

The RSS extension uses this in its rendering hook:

if ( $wgRSSCacheCompare ) {
    $timeout = $wgRSSCacheCompare;
} else {
    $timeout = $wgRSSCacheAge;
}

$parser->getOutput()->updateCacheExpiry( $timeout );

In theory at least this should propagate the shorter expiry time out to both
the parser cache entry and the actual output page's HTTP headers, though I
haven't tested to double-check myself yet.
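
For anyone wanting to try the same mechanism outside the RSS extension, a
minimal tag-hook sketch (untested; the <shortcache> tag and function names are
invented for illustration) would look something like:

$wgHooks['ParserFirstCallInit'][] = 'wfShortCacheSetup';

function wfShortCacheSetup( $parser ) {
    $parser->setHook( 'shortcache', 'wfShortCacheRender' );
    return true;
}

function wfShortCacheRender( $input, array $args, Parser $parser ) {
    // Any page containing <shortcache/> should be cached for at most
    // 5 minutes, via the same updateCacheExpiry() call described above.
    $parser->getOutput()->updateCacheExpiry( 300 );
    return '';
}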

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Extension bundling for 1.18

2011-06-08 Thread Brion Vibber
On Wed, Jun 8, 2011 at 1:50 PM, Chad innocentkil...@gmail.com wrote:


  Whatever was listed in bugzilla on that one bug where something didn't
 run
  its installer stages or something? I don't remember; the point is that we
  know we don't hook all hooks etc.
 

 That would be bug 28983, which is already fixed in trunk.


Objection withdrawn then. :D

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Extension bundling for 1.18

2011-06-08 Thread Brion Vibber
On Wed, Jun 8, 2011 at 3:21 PM, Rob Lanphier ro...@wikimedia.org wrote:

 On Wed, Jun 8, 2011 at 1:30 PM, Roan Kattouw roan.katt...@gmail.com
 wrote:
  This argument completely misses the point. The (probably trivial)
  extra work is in the tarballing process and doesn't touch anything
  else. There's no way it can subtly break things.

 You're willing to say there are exactly *zero* fixes that would be
 needed to be done in trunk and merged into 1.18 as a result of making
 this change?


Trunk and 1.18 *and* 1.17 already have this ability; all that's required is
to drop some directories into the tarball, and they'll be available for
selection in the installer.

If any extension doesn't install  work successfully in this manner, don't
put it in the tarball.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Extension bundling for 1.18

2011-06-09 Thread Brion Vibber
On Wed, Jun 8, 2011 at 10:59 PM, K. Peachey p858sn...@gmail.com wrote:

 * WikiEditor
 Possibly I guess, although I would actually prefer it in core compared
 to an extension, I'm not a fan of how it takes a little longer to load
 and the screen jumps around.


Note that the jumping isn't because it's an extension, but rather because of
the specific way the toolbar and editor bits get injected and activated.

Merging to core or not would have no effect on that -- it can be fixed while
still an extension, or it can remain jumpy in core.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Extension bundling for 1.18

2011-06-09 Thread Brion Vibber
On Thu, Jun 9, 2011 at 11:18 AM, Roan Kattouw roan.katt...@gmail.com wrote:

 On Thu, Jun 9, 2011 at 8:14 PM, Brion Vibber br...@pobox.com wrote:
  Note that the jumping isn't because it's an extension, but rather because
 of
  the specific way the toolbar and editor bits get injected and activated.
 
  Merging to core or not would have no effect on that -- it can be fixed
 while
  still an extension, or it can remain jumpy in core.
 
 That's not entirely accurate. If it's in core, we can add some of the
 container divs in the HTML, reducing the jumpiness.


Simple matter of catching some hooks and outputting divs, and if there
aren't the right hooks already, add em.
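
A rough sketch of that hook-and-div approach (my best guess at the relevant
hook and append point, with a made-up placeholder id and height -- the real
height would need measuring):

$wgHooks['EditPage::showEditForm:initial'][] = 'wfReserveToolbarSpace';

function wfReserveToolbarSpace( $editPage ) {
    // Reserve vertical space in the server-rendered HTML so the page
    // doesn't jump when the JS toolbar finishes loading into it.
    $editPage->editFormTextTop .=
        '<div id="toolbar-placeholder" style="height: 66px;"></div>';
    return true;
}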

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Extension bundling for 1.18

2011-06-09 Thread Brion Vibber
On Thu, Jun 9, 2011 at 11:58 AM, Chad innocentkil...@gmail.com wrote:

 Well if I'm setting up an internal wiki for 10 people, saving space may be
 beneficial over supporting 250+ languages no one speaks.


At the moment the *entire* languages/ dir in trunk comes to... 43 megabytes.
(Not counting the complete copy of every file in the .svn subdir in a
subversion checkout; people in this situation apparently are working from
tarballs so that's not there.)

With English only, it comes to 1.2 megabytes, for a saving of ~42 MB.

The simple expedient of gzipping the message files would reduce it to 13
megabytes, for a saving of ~30MB with *no* loss in functionality.

Of course, your users could fill up 30-40 megabytes' worth of text and
images in just a few minutes, so even the savings of removing them all is
pretty piddly...


If there is a compelling reason to reduce the size, switching to a
gzip-friendly format would be simpler: almost as much savings as removing
the files, but without losing anything.
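
As a sketch of what a gzip-friendly format could mean in practice (purely
hypothetical -- MediaWiki has no such loader, and the .ser.gz naming is
invented), each MessagesXx array could be shipped serialized and gzipped, then
inflated on demand:

// Hypothetical loader; needs the zlib extension for compress.zlib://.
function wfLoadCompressedMessages( $code ) {
    global $IP;
    $file = "$IP/languages/messages/Messages" . ucfirst( $code ) . ".ser.gz";
    if ( !file_exists( $file ) ) {
        return array();
    }
    // The compress.zlib:// wrapper transparently gunzips on read.
    return unserialize( file_get_contents( "compress.zlib://$file" ) );
}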

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Extension bundling for 1.18

2011-06-09 Thread Brion Vibber
On Thu, Jun 9, 2011 at 12:11 PM, MZMcBride z...@mzmcbride.com wrote:

 I'm not sure if the same trick can work for the new toolbar, as the new
 toolbar doesn't have a consistent height. The height fluctuates depending
 on
 which sub-modules have been expanded previously. This could probably be
 fixed by loading the toolbar earlier, though?


I'm sure folks can figure it out -- but hopefully in another thread. :)

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Testing the new php mobile extension

2011-06-10 Thread Brion Vibber
On Fri, Jun 10, 2011 at 3:39 PM, MZMcBride z...@mzmcbride.com wrote:

 Tomasz Finc wrote:
  We're moving into our first testing stage for the new mobile extension
  and we'd love your help. You can find all the details on todays blog
  post at http://blog.wikimedia.org/2011/06/10/testing-mobile-prototype/

 Forgive me, but I don't understand the purpose of a prototype wiki that's
 not really a prototype: there are no images or templates. The home page
 (http://nomad.tesla.usability.wikimedia.org/index.php/Main_Page) seems
 to
 indicate that this is expected, but unless the goal is to have users test
 how well red links render on their phone, I'm not sure what testing can be
 done currently. Are the images and templates going to be imported?


There's way more than red links there: there's paragraph on paragraph of
text, with templates, images, category links, interlanguage links,
references, etc. Some bits are indeed missing and ought to be cleaned up for
further testing though -- and your feedback about that is valuable to the
people setting up the tests.

Just don't confuse giving feedback with blasting people into oblivion for
not being perfect on the first round.


 [snip]
 That's client-side resizing, which is painful enough on a full web browser,
 much less a mobile device.  The wikicode specifies |thumb|, which should
 resize the image server-side, surely. I'm not sure what's going on there.


Awesome -- you found a bug! Apparently the system does work. :)

We also have this nice bug that was reported on the RTL test, again
confirming that this testing is helping to identify bugs:
https://bugzilla.wikimedia.org/show_bug.cgi?id=29341


  Come help test, develop, or just ask about our mobile efforts on irc
  freenode #wikimedia-mobile.

 I don't think yet another IRC channel is a good idea. Surely this could go
 in #wikimedia-dev or #mediawiki.


That's a pre-existing channel that's been in use for a couple years.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] JSGantt inside wiki articles

2011-06-11 Thread Brion Vibber
On Sat, Jun 11, 2011 at 2:16 PM, Daniel Friesen
li...@nadir-seen-fire.com wrote:

 There's also the technique Raphael JS uses.


I'm quite fond of Raphael for interactive graphics -- it provides a nice
little JS API that maps things to native elements in either SVG or VML (for
older versions of IE which lack SVG). The downside for more static things is
that you won't necessarily get your graphics included in search engines or
other spidered destinations, and it's probably harder to adapt them to other
kinds of data export. (For instance to do a PDF export you'd either need to
be able to execute the JavaScript to produce SVG, then capture the SVG, or
add a second renderer in your export tool that produces suitable output.)

Of course if you're only ever going to use it in-browser for private sites,
no worries. :D

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] We hit 90k in subversion!

2011-06-13 Thread Brion Vibber
Ok, not as exciting a milestone as 100k will be, but it's still exciting!
Congratulations to MaxSem for winning the rev id lottery with r90000:

http://www.mediawiki.org/wiki/Special:Code/MediaWiki/90000
"avoid test being marked as incomplete due to lack of assertions"

Testing's always a win. :) Keep on truckin' y'all!

-- brion vibber (brion @ wikimedia.org / brion @ pobox.com)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] I'm in ur trunk, reviewing pre-1.18 branch code

2011-06-13 Thread Brion Vibber
On Mon, Jun 13, 2011 at 4:00 PM, Rob Lanphier ro...@wikimedia.org wrote:

 On Tue, Jun 7, 2011 at 11:57 PM, Ashar Voultoiz hashar+...@free.fr
 wrote:
  This is a linear progression, the revision become harder and harder to
  review since, with time, most of the easy stuff got reviewed :-) Make it
  a long tail maybe ? :-)

 Well, kinda moot at this point.  Unfortunately, we're not even beating
 a linear progression to a July 15 target:


There's been only one data point so far (last Tuesday)... so it might be
tough to extrapolate yet. :)

I'm not making a big fuss about this until 1.17 is done, since I know,
 for example, that Tim has been doing the last bits of detail work
 necessary for a 1.17 tarball release, so he hasn't been able to make
 as much of a dent as others have been available for.  Still, we're
 going to need to go faster than this to expect a deploy before August.
  Thoughts on how to accelerate?


When we were prepping 1.17 for deployment the theory seemed to be that it'd
be out the door and *done* within a couple weeks and we'd be able to
concentrate on 1.18 from there out and keep up with everything.

That was several months ago... we still haven't released 1.17, there seems
to be little or no commitment to a deployment or release schedule on 1.18,
and what review has happened between 1.17 and 1.18 has been totally
unorganized ad-hoc opt-in poking.

If we're working on the assumption that 1.17 being unreleased is slowing
down 1.18 work, then what are we doing sitting around?

Get that 1.17 tarball OUT and move on. It's been "any day now" for like
three months...

Give us a deployment date for 1.18 and make it a top priority for
engineering.

Give us a release date for 1.18 and make it a top priority for engineering.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Alternate/remote editor API?

2011-06-13 Thread Brion Vibber
On Fri, May 6, 2011 at 11:22 AM, Brion Vibber br...@pobox.com wrote:

 On Fri, May 6, 2011 at 11:20 AM, Trevor Parscal tpars...@wikimedia.org wrote:

 The way the WikiEditor works right now, the textbox can be replaced with
 anything that can support a few methods, such as getSelection,
 encapsulateSelection, etc. There are some modules that depend on specific
 edit box implementations, such as the current and only alternative to the
 textarea we called iframe since it's a contentEditable iframe.

 If you take a look at jquery.wikiEditor.iframe.js, you will see what I
 mean.
 It should be pretty straightforward to drop anything in there, and be able
 to take advantage of the toolbar. There are some things, like find and
 replace that may need to be reworked or just turned off, but even things
 like the link dialog should work just fine but just supporting a few
 methods.

 The API could be better documented, and re-factored a bit to be even more
 generic, but the basic structure is there, and can be reused without much
 hacking.


 Spiffy... I'll play with it for CodeEditor, see if I can make the
 special-char inserts for instance work on it (which would actually be useful
 for some JS!).


Finally got around to poking at this recently as practice for the rich
editor project.

CodeEditor is now implemented as an extension:
http://www.mediawiki.org/wiki/Extension:CodeEditor
and the gadget pulls in the JS from there -- so if you're using the gadget
on mediawiki.org, it should continue to work.

I've also got it now working with WikiEditor's formatting toolbar (mostly),
special characters list (works great!), and search & replace dialog,
implementing most of the same interfaces that WikiEditor's iframe mode does.

We'll probably want to extend that API a bit further, a few offhand notes:

* Our jquery.textSelection module which implements the
fetch-and-encapsulate-text stuff still has a few WikiEditor-specific
assumptions, and probably needs to be generalized a little more.

* Various bits of formatting & help text that are suitable for wikitext
pages are less useful when you're on a JS or CSS page. We may want to have a
concept of moving up from 'generic' editor (a few generic buttons) to having
a specific data format ('wiki' pages get the wikitext help; 'js' pages get a
MediaWiki JS API help page; 'css' pages get a list of common selectors and a
link to CSS documentation). Those should be independent of what actual
*editor mode* is being used as well, so we can show JS-relevant help on a JS
page even if you don't have some fancy syntax highlighting thing.

* For things that are 'fancy views of plain text' like the WikiEditor iframe
mode and CodeEditor, the formatting toolbars etc work fairly
straightforwardly; we just need to get at some selection of text, spit back
a modified bit of text, and fiddle around the selection or view. This
probably won't adapt so well for a rich visual editor; so we may need an
adaptor layer to let plain-text and enhanced-text editors fiddle with the
wikitext sample fragments, while a rich editor has its own adaptor that takes
high-level calls like 'wrap in a link' or 'make bold' and does the
appropriate AST & view modifications.

* A few of WikiEditor's experimental fields require the iframe mode and
force it to switch in; may need something to avoid ambiguity when we're
deliberately using a different mode.

* Probably would be good to add a specific notion of switching editor modes;
WikiEditor's preview tab opens up _surrounding_ the editor, but if we switch
between plaintext & syntax-highlighting, we probably want a toggle on the
toolbar which just swaps the guts around.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] BlockTest bugs resolved -- test board is clear!

2011-06-15 Thread Brion Vibber
We've been fighting some weird regressions in the test cases for the Block
class which handles IP & user blocks, some of which took us a while to even
reproduce consistently. After Chad & others cleaned up some sqlite-related
issues, we still had a remaining stubborn one which failed in the full test
suite, but not when run standalone.

Between me, Mark H, and Chad, we seem to have finally worked that one out
today, and made some fixes to both Block and BlockTest which have resolved
it.

Full story if you like. :D - http://etherpad.wikimedia.org/Block-test-bug

Short story:
http://ci.tesla.usability.wikimedia.org/cruisecontrol/buildresults/mw

Unit Tests: (1628)   All Tests Passed   

*break out the virtual champagne*

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] wikistream: displays wikipedia updates in realtime

2011-06-16 Thread Brion Vibber
On Wed, Jun 15, 2011 at 9:40 PM, Ed Summers e...@pobox.com wrote:

 I've been looking to experiment with node.js lately and created a
 little toy webapp that displays updates from the major language
 wikipedias in real time:

http://wikistream.inkdroid.org


That's pretty cool -- and having the source all ready to be shared on github
is *super* cool. :D Having good starting points for other people to build
real-time displays and visualizations on can be a big help in jumpstarting
some really cool innovation.

I'd love to see some more active visualizations that also include images, and
if it's possible to show interactions between multiple people working on the
same pages that could be super killer. :)

As noted in this thread already, use of flags to represent languages can be
problematic -- there's a nice overview of the issues here:
http://www.cs.tut.fi/~jkorpela/flags.html  On the other hand, flags can be a
fine way to represent 'country of origin' in visualizations/reports that
estimate where edits are being made from based on IP address etc -- so keep
em in mind!

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] statistics about skin usage

2011-06-17 Thread Brion Vibber
On Fri, Jun 17, 2011 at 10:50 AM, Chad innocentkil...@gmail.com wrote:

 On Fri, Jun 17, 2011 at 1:46 PM, Trevor Parscal tpars...@wikimedia.org
 wrote:
  We also used this to keep track of specific preferences, some interesting
  info there..
 
  http://en.wikipedia.org/wiki/Special:PrefStats
 

 Cool stuff, but doesn't quite get the big picture I'm going for. The
 most useful data here (as I said elsewhere) would be joining skin
 choice on user_touched as an indicator of activity, and aggregating
 those totals per-wiki--maybe in per-year buckets.


Beware of user_touched; it's also triggered when _other_ people do things,
like editing your talk page, or bulk operations like a batch change of some
other user preference.

user_touched could be current on an account that hasn't logged in in years.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Alternate/remote editor API?

2011-06-17 Thread Brion Vibber
(Cross-posting this update to wikitech-l & wikitext-l since this ties into
Visual Editor project)

I've made some more updates to the CodeEditor extension & gadget
demonstrating integration & adaptation of the WikiEditor editor & toolbar
framework.

The syntax-highlighting mode in CodeEditor can now be toggled on and off;
when it's off, any WikiEditor functions that had been patched for CodeEditor
fall automatically through to the defaults, and you get the regular textarea
view.

(If you just want to go poke at it, scroll down to the older quoted post and
follow the extension link ;)

There's a few more things I'll want to poke to make it suitable for dropping
the visual editor in:

* a concept of 'document format' that lets us specify that certain tools are
suitable for wikitext pages, while others are suitable for JS/CSS code --
this'll let things like the 'bold' and 'link' buttons hide themselves
automatically on regular wiki pages, while the JS/CSS pages can
automatically show tools like 'indent/deindent block', 'find declaration',
'syntax cleanup'. (Right now this can be done by manually removing/adding
tools in the JS & CSS modes, but we can integrate that better -- just like
the browser & RTL compatibility checks currently in some WikiEditor modules)

* An abstraction layer on data type / structure type? For the way tools like
text insertions, link dialogs, search & replace etc work we can in many ways
treat 'plain textarea', 'wikiEditor iframe with template folding', and 'Ace
syntax-highlighting editor' as equivalent: all are views on a plain text
document that can be addressed character by character, and all that needs to
be implemented are the pieces to get text in and out, do selection and
insert/delete, etc. For the visual editor, we'll have a document structure
that's very different, so something that 'makes a section bold' would work
differently: operating on a DOM-like model to move some nodes around, rather
than dropping bits of text in.

* cleaner implementation for toggle switches on the toolbar


In the meantime though, I should be able to get the ParserPlayground demos
more tightly integrated into the editor widget, and will try to hack up some
temporary handling for the bold/italic/link etc on the provisional dom
structure.

-- brion


On Mon, Jun 13, 2011 at 5:37 PM, Brion Vibber br...@pobox.com wrote:

 On Fri, May 6, 2011 at 11:22 AM, Brion Vibber br...@pobox.com wrote:

 On Fri, May 6, 2011 at 11:20 AM, Trevor Parscal 
 tpars...@wikimedia.org wrote:

 The way the WikiEditor works right now, the textbox can be replaced with
 anything that can support a few methods, such as getSelection,
 encapsulateSelection, etc. There are some modules that depend on specific
 edit box implementations, such as the current and only alternative to the
 textarea we called iframe since it's a contentEditable iframe.

 If you take a look at jquery.wikiEditor.iframe.js, you will see what I
 mean.
 It should be pretty straightforward to drop anything in there, and be
 able
 to take advantage of the toolbar. There are some things, like find and
 replace that may need to be reworked or just turned off, but even things
 like the link dialog should work just fine but just supporting a few
 methods.

 The API could be better documented, and re-factored a bit to be even more
 generic, but the basic structure is there, and can be reused without much
 hacking.


 Spiffy... I'll play with it for CodeEditor, see if I can make the
 special-char inserts for instance work on it (which would actually be useful
 for some JS!).


 Finally got around to poking at this recently as practice for the rich
 editor project.

 CodeEditor is now implemented as an extension:
 http://www.mediawiki.org/wiki/Extension:CodeEditor
 and the gadget pulls in the JS from there -- so if you're using the gadget
 on mediawiki.org, it should continue to work.

 I've also got it now working with WikiEditor's formatting toolbar (mostly),
 special characters list (works great!), and search & replace dialog,
 implementing most of the same interfaces that WikiEditor's iframe mode does.

 We'll probably want to extend that API a bit further, a few offhand notes:

 * Our jquery.textSelection module which implements the
 fetch-and-encapsulate-text stuff still has a few WikiEditor-specific
 assumptions, and probably needs to be generalized a little more.

 * Various bits of formatting & help text that are suitable for wikitext
 pages are less useful when you're on a JS or CSS page. We may want to have a
 concept of moving up from 'generic' editor (a few generic buttons) to having
 a specific data format ('wiki' pages get the wikitext help; 'js' pages get a
 MediaWiki JS API help page; 'css' pages get a list of common selectors and a
 link to CSS documentation). Those should be independent of what actual
 *editor mode* is being used as well, so we can show JS-relevant help on a JS
 page even if you don't have some fancy syntax highlighting thing.

 * For things

Re: [Wikitech-l] Enabling protocol relative URLs on Wikimedia sites week of July 18th

2011-06-17 Thread Brion Vibber
On Fri, Jun 17, 2011 at 2:41 PM, Ryan Lane rlan...@gmail.com wrote:

 In anticipation for proper HTTPS support on Wikimedia sites, we will
 be enabling protocol relative URLs. This means that things like logos,
 references to resources, and interwiki links will use links like:

  //en.wikipedia.org/wiki/Main_Page

 instead of links like:

  http://en.wikipedia.org/wiki/Main_Page

 Doing this ensures that we don't split our squid/varnish caches.

 We'll enable this for test.wikipedia.org some time before enabling it
 globally, so that there will be at least a short time to test
 beforehand.


*squee* :DD

I'm so looking forward to consistent https on the main domains! :DDD

Note there's a possibility that this URL tweak may affect some
JavaScript-side code or client-side spiders/bots that may expect
fully-qualified links or do relative link resolution incorrectly, so
everybody keep an eye out for potential issues in your code. Previous
testing has been pretty favorable about actual browser support.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Enabling protocol relative URLs on Wikimedia sites week of July 18th

2011-06-17 Thread Brion Vibber
On Fri, Jun 17, 2011 at 3:29 PM, Roan Kattouw roan.katt...@gmail.com wrote:

 On Fri, Jun 17, 2011 at 11:41 PM, Ryan Lane rlan...@gmail.com wrote:
  We'll enable this for test.wikipedia.org some time before enabling it
  globally, so that there will be at least a short time to test
  beforehand.
 
 For completeness: this, both enabling it on testwiki and enabling it
 sitewide, will be going down some time in early or mid-July, so that's
 almost a month away. Ryan and I are just planning and announcing this
 freakishly early :)


Is there a handy config snippet people can use to test things against their
local installs in the meantime?

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] ICANN expansion, relative URLs

2011-06-17 Thread Brion Vibber
On Fri, Jun 17, 2011 at 3:37 PM, Jay Ashworth j...@baylink.com wrote:

 While the topic of how Mediawiki handles URLs is on the table, let me
 point
 out today's Slashdot piece, which notes that ICANN is about to open up the
 gTLD namespace...

 *to everyone*, not just commercial registries.

 Contemplate, if you will:

  http://apple/

 How will MW handle a FQDN with no dots in it, when that becomes legal?



Those are already perfectly legal hostnames to have in URLs, and you see
single-part hostnames all the time on internal networks, either by eliding
the local domain part (since local DNS will resolve it) or by only using
single-part names to begin with.

For a common example: try linking to http://localhost/ -- it works just
fine. :)

I suppose in theory having "apple" available is no worse than "apple.com"
(since you *could* have an "apple.com.mylocaldomain" already and have to
worry about which takes precedence), but in practice that sounds like a
crappy thing to do. :)

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Enabling protocol relative URLs on Wikimedia sites week of July 18th

2011-06-17 Thread Brion Vibber
On Fri, Jun 17, 2011 at 3:42 PM, Roan Kattouw roan.katt...@gmail.com wrote:

 On Sat, Jun 18, 2011 at 12:32 AM, Brion Vibber br...@wikimedia.org
 wrote:
  Is there a handy config snippet people can use to test things against
 their
  local installs in the meantime?
 
 Afraid not. We have to change hardcoded http:// prefixes in a million
 different places, see
 http://wikitech.wikimedia.org/view/Https#Protocol-relative_URLs . The
 logos are largely monkey work (regex them, then get a team to verify
 100s of hopefully-not-broken logos), but other things are much more
 sensitive. It'd suck to break the setting that determines where the
 uploaded files on Commons live
 (http://upload.wikimedia.org/site/lang/etc) or where our static assets
 and RL things are served from (bits), especially if broken URLs get
 cached in Squid and/or memcached. This is why we're scheduling this
 thing this far out when Ryan and I will both be available to very
 carefully go about this.


Great that we have a list! :D

Do make sure that all of those individual settings get tested before
touching the production cluster; I'd be particularly worried about the
possibility of exposing '//domain/something' style URLs into output that
requires a fully-clickable link.

It looks like wfExpandUrl() *does* correctly expand protocol-relative URLs
against the current $wgServer setting, so a lot of things that already
expect to handle path-relative URLs should already be fine.

But some like the interwikis may be assuming fully-qualified URLs; for
instance on API results those should probably return fully-qualified URLs
but they're probably not running through wfExpandURL.

ApiQueryIWLinks for instance returns interwiki links retrieved via
$title-getFullURL(); for interwiki cases this takes whatever was returned
by $interwiki-getURL() (which would in this case presumably be the '//
fr.wikipedia.org/wiki/$1' form) and passes it back, without making any
attempt to expand it further.

This'll probably need refactoring so that getLocalURL() handles the regular
interwiki URLs, then in getFullURL() we go ahead and call that and pass the
result through wfExpandUrl().
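
A tiny sketch of the intended behavior (illustration only, not actual Title
code -- the real methods take more parameters):

// What the interwiki table might hold once prefixes go protocol-relative:
$localUrl = '//fr.wikipedia.org/wiki/Foo';
// getFullURL() would funnel that through wfExpandUrl(), yielding e.g.
// 'http://fr.wikipedia.org/wiki/Foo' or the https equivalent, depending
// on the current $wgServer.
$fullUrl = wfExpandUrl( $localUrl );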

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Enabling protocol relative URLs on Wikimedia sites week of July 18th

2011-06-17 Thread Brion Vibber
On Fri, Jun 17, 2011 at 4:02 PM, Roan Kattouw roan.katt...@gmail.com wrote:

 On Sat, Jun 18, 2011 at 12:56 AM, Brion Vibber br...@pobox.com wrote:
  Great that we have a list! :D
 
  Do make sure that all of those individual settings get tested before
  touching the production cluster;
 Well that's what the testwiki thing is for :)


IIRC testwiki shares a lot of infrastructure and common configuration with
the rest of the sites, so unless it's been given an isolated set of config
files & interwiki database files, still be careful. :)



  I'd be particularly worried about the
  possibility of exposing '//domain/something' style URLs into output that
  requires a fully-clickable link.
 
 I don't quite get this -- the example below refers to API output, and
 you're right we need to really output absolute URLs there (good
 catch!), but I don't see any other scenario where outputting
 protocol-relative URLs would be bad. Maybe RSS feeds, too, and non-UI
 things in general, but really everything in the UI ought to be fair
 game, right?


Almost everything in web UI should be pretty much fine yes -- I really mean
things where the URL ends up someplace *outside* a browser, directly or
indirectly, and then needs to be resolved and used without the browser's
context. Places you definitely want some full URLs include:

* api
* additional specialty APIs (say, movie embedding links)
* feeds (in at least some places)
* email
* export XML that contains URLs for files or pages

There may be other things that we just haven't thought of, which is why I
wanted to raise it.



  But some like the interwikis may be assuming fully-qualified URLs; for
  instance on API results those should probably return fully-qualified URLs
  but they're probably not running through wfExpandURL.
 
 Good point, but I'm more worried about the reverse TBH: what if
 there's some code that runs stuff through wfExpandUrl() even though it
 could, and maybe even should, be protocol-relative?


Also worth checking for! I'm less worried about those though since the worst
case result is status quo. :)

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Enabling protocol relative URLs on Wikimedia sites week of July 18th

2011-06-17 Thread Brion Vibber
On Fri, Jun 17, 2011 at 4:15 PM, Victor Vasiliev vasi...@gmail.com wrote:

 On Sat, Jun 18, 2011 at 3:11 AM, Brion Vibber br...@pobox.com wrote:
  * email

 What about user preference for cases like this? With four options:
 Prefer HTTP, Prefer HTTPS, Force HTTP, Force HTTPS.


The current case is that you get whatever was the current setting on the web
server at the time the email got formatted (which usually means either HTTP
or HTTPS depending on *someone else's usage*) or it defaults to HTTP for
something formatted from default.

Currently MediaWiki doesn't have a native notion of multiple
hostname/protocol availability and it's done by lightly patching
configuration at runtime when loaded over HTTPS. Unless something is added,
that'll continue in the same way.

My personal preference would be to run *all* logged-in activity over HTTPS,
so every mail link etc should be on SSL. But I think that's still a ways out
yet and will need better SSL acceleration; poor Ryan Lane will kill me if I
keep pushing on that too soon! ;)

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Enabling protocol relative URLs on Wikimedia sites week of July 18th

2011-06-17 Thread Brion Vibber
On Fri, Jun 17, 2011 at 4:20 PM, Brion Vibber br...@pobox.com wrote:

 [snip] or it defaults to HTTP for something formatted from default.


s/default/background job running on CLI/

:P

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Sections as first-class citizen

2011-06-18 Thread Brion Vibber
On Fri, Jun 17, 2011 at 5:24 PM, Misdre misdre+mediaw...@gmail.com wrote:

 The most cumbersome situation is when you want to watch only one
 section, especially on talk pages. You have no choice but to watch the
 entire page and that can be really painful.

 Is there an easy way to improve this situation?


My honest recommendations:

For talk pages: finish reworking or replacing LiquidThreads into a system
that's actually designed for discussion, with sensible notifications. This
probably should be a high priority for Wikimedia, but the way things
shuffled out under this year's budget it's only partially covered under the
'-1 to 100 edits' umbrella project covering UI and process improvements to
help integrate new editors. If folks can seriously rally on it though we
could get some really great stuff done!

For articles: break up giant articles into more reasonably-sized pieces.
This should improve search locality, rendering & load times, and
manageability of editing. This may benefit from more explicit ways to group
related pages, which is also desperately needed as a first-class feature for
wikibooks and other projects.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Store IP using varchar(40) is enough?

2011-06-21 Thread Brion Vibber
On Jun 21, 2011 9:29 AM, Ryan Chan ryanchan...@gmail.com wrote:

 According to this:
 http://www.mediawiki.org/wiki/Manual:User_newtalk_table, IP was stored
 using varchar(40) .

 But seems 45 should be the safe instead of 39?


http://stackoverflow.com/questions/4982701/best-way-to-store-ip-in-database

The 45-char version is for the dotted-quad variant of a mixed IPv6/IPv4
address. I think we're normalizing to the canonical grouped hex form with
elided zeroes, so that format wouldn't get stored.
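
To make the length arithmetic concrete (purely illustrative PHP, not anything
from core):

    // Fully expanded IPv6 with an embedded dotted-quad IPv4 tail -- the
    // 45-character worst case:
    echo strlen( '0000:0000:0000:0000:0000:ffff:255.255.255.255' ); // 45

    // Pure grouped hex never exceeds 8 groups of 4 digits plus 7 colons:
    echo strlen( 'ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff' ); // 39

So as long as the dotted-quad form gets normalized away before storage,
varchar(40) still has a character to spare.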

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Sortable tables broken in 1.18 trunk; needs tests fixes or revert

2011-06-22 Thread Brion Vibber
The sortable table system in MediaWiki was completely rewritten in r86088;
unfortunately this was done without benefit of any unit testing or
regression testing, and there seem to be a lot of new bugs introduced.

I've started adding test cases for table sorting in r90595; this adds a
qunit test suite that creates some short tables (listing planets Mercury
through Saturn and their radii) and tries setting up and triggering sorting,
then compares the resulting order to the expected value.

The ascending sort by name usually works, but if called twice on two tables,
the second table usually gets sorted *completely* incorrectly. This can be
easily confirmed by manual inspection by copying a page such as
http://test.wikipedia.org/wiki/Planets_of_doom to your trunk wiki and
sorting first one table, then the other.

Numeric sorting on the radius column is also incorrect; the data set
includes values with two different orders of magnitude (< 10k and > 10k),
and they don't sort correctly. You can confirm on the test.wikipedia page
that these sort as expected in 1.17.

These specific bugs will also need to have test cases added, all of which
are claimed to be fixed by r86088 or its follow-ups:
* https://bugzilla.wikimedia.org/show_bug.cgi?id=8028
* https://bugzilla.wikimedia.org/show_bug.cgi?id=8115
* https://bugzilla.wikimedia.org/show_bug.cgi?id=15406
* https://bugzilla.wikimedia.org/show_bug.cgi?id=17141
* https://bugzilla.wikimedia.org/show_bug.cgi?id=8732
* https://bugzilla.wikimedia.org/show_bug.cgi?id=28775

As a reminder, you can run the qunit tests from your browser by simply going
to the tests/qunit/ subdirectory of your wiki installation.

General info: http://www.mediawiki.org/wiki/Manual:JavaScript_unit_testing

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Sortable tables broken in 1.18 trunk; needs tests fixes or revert

2011-06-22 Thread Brion Vibber
On Wed, Jun 22, 2011 at 2:32 PM, Krinkle krinklem...@gmail.com wrote:

 Brion Vibber wrote:

  The sortable table system in MediaWiki was completely rewritten in
  r86088;
  unfortunately this was done without benefit of any unit testing or
  regression testing, and there seem to be a lot of new bugs introduced.

 The legacy script was removed from svn without deprecation or
 quarantine phase, that is in contrary to the descriptoin and
 expectation set here [1].


It's a reimplementation of an existing system that never gets manually
called; since there's no API for it that doesn't really apply -- there's
nothing to deprecate.

It needs to *actually work* and not introduce regressions for having been
refactored.



 I suggest restoring the legacy ts_makeSortable functions in
 wikibits.js, except for one thing, which is sortables_init, it should
 not be applied to any table on-load (like nothing in a module should
 be executed on load, modules should only contain the plugins themselves!)


To me this doesn't make sense; if it's not to be used, and there's no API
designed for using it, and its existing implementation has been replaced,
there's no need to keep unused code.



 The new jQuery plugin would be in jquery.tablesorter.js which, like
 any module, also doesn't do anything on-load. Instead it is called in
 a lazy-loader for tables on the wiki page, this means on a unit test
 page both can be used separately on individual tables since neither
 the legacy or the new one is called on load.


That's already how the jQuery module works, yes.


 This also makes sure our behaviour is in harmony with the expectation
 set by our javascript deprecation page [1] and will not break gadgets
 that are using (parts of) the legacy ts_* functions (which are global
 functions).


Those are internal functions (just poorly namespaced); anything using them
can be expected to fail at any time.

So summarized proposal:
 * Restore legacy script to comply with the deprecation guide (ie. not
 break gadgets that use (parts of) it)


Against -- no public API functions, doesn't apply.


 * Write unit tests for the legacy script


Only needed if it's included and used, which is only needed if the new
version is reverted.


 * Write unit tests for hte new script
 * Fix the new script


If it's reverted, these should be done during development of the new version
prior to commit; if not, do them now.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Sortable tables broken in 1.18 trunk; needs tests fixes or revert

2011-06-22 Thread Brion Vibber
On Wed, Jun 22, 2011 at 1:27 PM, Brion Vibber br...@wikimedia.org wrote:

 [snip]
 The ascending sort by name usually works, but if called twice on two
 tables, the second table usually gets sorted *completely* incorrectly. This
 can be easily confirmed by manual inspection by copying a page such as
 http://test.wikipedia.org/wiki/Planets_of_doom to your trunk wiki and
 sorting first one table, then the other.

 Numeric sorting on the radius column is also incorrect; the data set
 includes values with two different orders of magnitude (< 10k and > 10k),
 and they don't sort correctly. You can confirm on the test.wikipedia page
 that these sort as expected in 1.17.


Awesome -- diebuche tracked those bugs down to the merge sort implementation
and has just fixed it in r90612. The basic sort tests now run clean!

I've marked r90612 for needing merge to 1.18 so we'll be good on branch;
we'll still need more test cases to confirm fix of past known table sorting
bugs so everybody, do feel free to help add those. :)

Big thanks to everybody who's helped in getting the test systems running and
the table code improved and fixed!

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] should we join the Unicode Consortium?

2011-06-27 Thread Brion Vibber
On Mon, Jun 27, 2011 at 1:53 PM, Ryan Kaldari rkald...@wikimedia.org wrote:

 I'm not sure who would be in charge of this, but I think it would be
 useful if the WMF was a liaison member of the Unicode Consortium:
 http://unicode.org/consortium/memblogo.html

 This body makes all sorts of important decisions about the Unicode
 standard—decisions that affects many aspects of our projects. If an
 issue were to come up that adversely affected us, we would not have a
 formal way to object at the moment. Being a liaison member gives us
 official standing with the organization, allowing us to participate
 alongside Google, Apple, and Microsoft in any Unicode-related
 discussions that are important to us.

 Other open sources projects that are currently liaison members:
 The GNOME Foundation
 The Mozilla Project
 OpenOffice.org

 I'm not sure if there is any fee for becoming a liaison member. The
 instructions simply say to contact the Unicode Office for details.
 Would it be worth contacting them to find out?


I think that'd be worth doing; I had a great chat with Michael Everson at
our Berlin meetup (Michael's been involved in lots of Unicode
standardization stuff over the years and has many tales to tell :) and it
definitely sounds like it would be useful for us to be a little more on the
inside at times to help push on requirements for specific language support,
or making sure that concerns about labeling, character types, and
normalization (shudder) get looked at and addressed.


In comparison, we have a couple of folks who keep an eye on some of the W3C
/ WHATWG lists that are working on the HTML 5 specifications -- these are
mostly open lists which made it easier for particularly interested
individuals like Aryeh & Tim to pop on either for specific issues or just as
voracious list readers. ;)

Unicode's development has traditionally been a little more closed and
old-fashioned -- not a horrible thing for something as delicate as the
universal character set! -- but it does mean that if we want to be involved,
we need to work with the system to make sure that we're heard.

I'll see if we can figure out who gets jurisdiction internally on this. :)

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Google Chrome

2011-06-29 Thread Brion Vibber
On Wed, Jun 29, 2011 at 3:02 PM, James Heilman jmh...@gmail.com wrote:

 I am having some difficulties editing with Google Chome. The browser
 adds extra spaces with many edits and is unable to add lines to the
 middle of infoboxes (when you hit the enter key it just moves you to
 the bottom of the infobox code). I am not sure if this is a chrome
 problem or a wiki problem. I do wish to get a chrome book but unless
 this can be addressed it will just cause me frustration.


Where/how/when does this happen? Can you point to specific parts of specific
pages on specific wikis that exhibit this problem when you edit them? What
version of Chrome, on what operating system? Do you have any custom
settings, JS or gadgets set up that might change editing behavior? Does it
happen only when editing with your account, or also when anonymous or with a
fresh account?

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] RL dynamic parameters

2011-06-30 Thread Brion Vibber
On Thu, Jun 30, 2011 at 11:34 AM, Bryan Tong Minh
bryan.tongm...@gmail.com wrote:


 How do I pass parameters dynamically to an RL module? I need to pass a
 FileRepo object to the RL module when I call $wgOut->addModuleStyle().


I'm not sure this is really possible; all that happens when you call
$wgOut->addModuleStyle() is that the given module and its dependencies get
added to the list of modules' styles to load up via link rel tags pointing
at load.php.

The subsequent call to load.php will be handled by either an HTTP cache or
by another PHP request, so it won't have any parameters you may have given
to an object in that first process.

If it'll be the same file on every request, or tied to your user login
session, then you should be able to handle that the same way as
ResourceLoaderUserModule by pulling from some user or global state.
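
The shape of such a module is roughly this -- an untested sketch; the class
name is made up and the method signatures are from memory, so check
includes/resourceloader/ before copying anything:

    class ResourceLoaderMyRepoModule extends ResourceLoaderModule {
        public function getStyles( ResourceLoaderContext $context ) {
            // The load.php request won't have the FileRepo object you had
            // on the page-view side, so pull from site-wide state instead.
            global $wgLocalFileRepo;
            $css = '.my-repo-banner { background-image: url(' .
                $wgLocalFileRepo['url'] . '/banner.png); }';
            return array( 'all' => $css );
        }
    }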

If it's different on every page you're going to output it on, then you may
need to use a different technique.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] ResourceLoader JavaScript validation on trunk (bug 28626)

2011-07-06 Thread Brion Vibber
Some of you may have found that ResourceLoader's bundled & minified
JavaScript loads can be a bit frustrating when syntax errors creep into your
JavaScript code -- not only are the line numbers reported in your browser of
limited help, but a broken file can cause *all* JS modules loaded in the
same request to fail[1]. This can manifest as for instance a jquery-using
Gadget breaking the initial load of jquery itself because it gets bundled
together into the same request.

I've taken a copy of JSMin+ (MPL 1.1/GPL 2.0/LGPL 2.1 triple-license) to our
includes/libs -- it's a JS minification library that had originally gotten
knocked out of the running for merging due to being a bit slow, but has the
advantage of coming with an actual JavaScript parser [2].

Only the parser is being used right now, in two places:
- on the JavaScriptMinifier test cases to confirm that results are valid JS
(should be extended to a fuzz tester, probably)
- on each individual file loaded via ResourceLoaderFileModule or
ResourceLoaderWikiModule, so we can throw a JavaScript exception with
details of the parse error *with line numbers for the original input file*

This can be disabled by turning off $wgResourceLoaderValidateJs, but I'm
setting it on by default to aid testing.
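
If it does misbehave for you, switching it off is a one-liner in
LocalSettings.php:

    // Skip the JSMin+ validation pass on ResourceLoader JavaScript modules
    $wgResourceLoaderValidateJs = false;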

I'd like for folks to keep an eye out to make sure they don't get any false
positive parse errors in real-world modules, and to see if there are any
noticeable performance regressions. Like ResourceLoader's minification
itself the validation parses are cached so shouldn't cause too much ongoing
load, but it still takes some time.

[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=28626
[2] http://crisp.tweakblogs.net/blog/1856/jsmin+-version-13.html

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] ResourceLoader JavaScript validation on trunk (bug 28626)

2011-07-06 Thread Brion Vibber
On Wed, Jul 6, 2011 at 3:18 PM, K. Peachey p858sn...@gmail.com wrote:

 How is JSMin+ different to the plain JSMin that we had and was removed
 due to licensing conflicts?


It's a different program, written by different people, based on code from
another unrelated project, under a different license.

Open Source is a wonderful thing; we use it every day and in this case it
would be a waste if we were the only one to benefit from the hard work of
others. We dubbed our little project JSMin+ because essentially it acts as
Douglas Crockford's JSMin but is far less restrictive, and released it under
the same MPL/GPL/LGPL tri-license as the original Narcissus code.

http://crisp.tweakblogs.net/blog/1665/a-new-javascript-minifier-jsmin+.html

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Do we need Selenium for anything anymore? (was: [Selenium] IDE test for regressing bug)

2011-07-07 Thread Brion Vibber
On Thu, Jul 7, 2011 at 8:28 AM, Chad innocentkil...@gmail.com wrote:

 On Thu, Jul 7, 2011 at 11:23 AM, Sumana Harihareswara
 suma...@wikimedia.org wrote:
  Thanks for the test & test suite!  Just wanted to update you that it
 seems
  like fewer and fewer MediaWiki developers are interested in Selenium
 tests,
  and many are moving to other tools for automatic testing:
 

 That being said: is there any reason to leave the Selenium support in
 core anymore?


I believe work still needs to be done to provide a test harness that uses
QUnit for 'live-wiki' tests -- the current QUnit tests are standalone and
run from static files, so can't test interaction with server-side behavior.

With our current QUnit tests you can see if your JS code does something on
some input data or HTML you construct, but you can't, say, confirm that
issuing a click event on the watch tab submits an API request and comes back
with the expected result.

Interaction tests also tend to modify data -- by performing edits, creating
accounts, changing settings, deleting things, undeleting things, etc. So,
like the parser & phpunit tests that are potentially destructive, they
either need to run on dedicated testing sites where that's acceptable, or
the test harness needs to be able to set up / tear down a temporary wiki
copy. Unlike the parser & unit tests, that temporary copy has to be actually
exposed via the web so client-side tests can run.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Heads-up: WMF engineering process improvement meetings

2011-07-07 Thread Brion Vibber
On Thu, Jul 7, 2011 at 12:40 PM, MZMcBride z...@mzmcbride.com wrote:

 Thanks very much for the heads-up. Posting the minutes or any notes from
 the
 meeting on Meta-Wiki or mediawiki.org would be fantastic.


We're taking notes on an etherpad for now; some process flowcharts are being
done on a physical wall so once they're distilled onto the wiki w/ photos it
should all be accessible together.

I might say that one more point to focus on specifically is to how to
 leverage volunteer development (this is hinted at in some of your five
 points). There are _a lot_ of people who are capable of coding in PHP and
 who are willing to donate their time and talents, but Wikimedia/MediaWiki
 code development has chased them off, generally through neglect (patches
 sitting, review sitting, etc.). If there are ways to specifically look at
 that, it would be an enormous benefit to Wikimedia/MediaWiki, I think.


This is a big thing we want to solve -- current processes are very vague &
laggy for a lot of stuff and only keep a fast deployment cycle for a few
WMF-driven projects; this isn't because we're mean but because those are the
only things that have internal drivers within WMF pushing them strongly
enough; how to make sure other ongoing work gets the attention it needs too
is very much on our minds.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] testing mobile browsers?

2011-07-11 Thread Brion Vibber
On Sat, Jul 9, 2011 at 2:30 AM, Ashar Voultoiz hashar+...@free.fr wrote:

 * Safari, Opera & Mozilla for mobile : they are probably mostly the same
 as the desktop version. I have not found emulators for them.
 * Android : has an emulator. On my computer it is painfully slow and not
 usable for anything.
 * Blackberry : emulator is Windows only :-/

 Would be great to enhance our testswarm with more browsers. Maybe we
 could contact those mobiles developers to connect to our testswarm?


Note also that there'll be at least three distinct things we want to test on
the mobile browsers:

* JS  browser interactive behavior _of MediaWiki_
* JS  browser interactive behavior _of the alternate mobile view_
(MobileFrontend, on track to replace the current mobile. and m. gateways)
* default redirection to the appropriate view based on device

For the MediaWiki JS tests on Mobile Safari, Android default browser,
Blackberry default browser, Opera Mobile, etc we should only need to add
those browsers to the job submissions going into TestSwarm so it'll farm the
tests out to any connected clients -- currently Krinkle manages how that's
set up.

However for the most part, Mobile Safari, default Android browser, Opera
Mobile, and Firefox for Android should work _about the same_ as their
desktop equivalents, with the same modern JavaScript engines etc. There will
be some differences in JavaScript & HTML support (for instance, the Android
browser doesn't support SVG until 3.0) but this is unlikely to trip up much
in the core JS tests.

More relevant in most cases will probably be UI tests, like making sure that
various links and buttons are in visible, clickable areas of the screen; we
don't have any of these tests yet using QUnit.


Anything related to MobileFrontend will need to have a test suite created
for it; note that for less-capable devices, TestSwarm's client-side
JavaScript system probably won't actually work (certainly it won't for Opera
Mini!)

Redirection tests again might need another test harness infrastructure.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] testing mobile browsers?

2011-07-11 Thread Brion Vibber
On Mon, Jul 11, 2011 at 12:56 PM, Jay Ashworth j...@baylink.com wrote:

 - Original Message -
  From: Brion Vibber br...@pobox.com

  * default redirection to the appropriate view based on device

 Please remember that some people with high-function browsers *want* the
 low-function results... and some people with *what your code thinks are
 low-function browsers* want the standard results; a user-controllable
 per-browser persistent cookie to override is a very nice touch here.


That's why there's a per-browser persistent cookie to override it, yes. :)

(Not 100% all pretty yet, but I think it mostly works in the MobileFrontend
version now. The live versions still just have the one-time flip-to-desktop &
the permanent-disable, which are both annoyingly limited.)

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] ResourceLoader JavaScript validation on trunk (bug 28626)

2011-07-11 Thread Brion Vibber
On Wed, Jul 6, 2011 at 3:04 PM, Brion Vibber br...@pobox.com wrote:

 Only the parser is being used right now, in two places:
 - on the JavaScriptMinifier test cases to confirm that results are valid JS
 (should be extended to a fuzz tester, probably)
 - on each individual file loaded via ResourceLoaderFileModule or
 ResourceLoaderWikiModule, so we can throw a JavaScript exception with
 details of the parse error *with line numbers for the original input file*

 This can be disabled by turning off $wgResourceLoaderValidateJs, but I'm
 setting it on by default to aid testing.

 I'd like for folks to keep an eye out to make sure they don't get any false
 positive parse errors in real-world modules, and to see if there are any
 noticeable performance regressions. Like ResourceLoader's minification
 itself the validation parses are cached so shouldn't cause too much ongoing
 load, but it still takes some time.


Per feedback from TranslateWiki (yay testers!) I've disabled validation for
JS modules coming from on-disk files; large libs like jQuery can hit the PHP
memory limits if you're unlucky. This kills the load.php process, neatly
defeating the purpose of validating the code. ;)

Validation is still on by default for JS modules pulled from wiki pages --
which are editable and therefore what we really cared about to begin with.
:)

May still be nice to reduce the memory footprint of the syntax tree as it's
being built, as it's likely a very inefficient structure by default. Gadgets
that stuff big JS libraries into their pages are probably the most likely to
still be able to fail this way.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Minor API change, ApiUpload

2011-07-12 Thread Brion Vibber
On Tue, Jul 12, 2011 at 2:25 PM, Ian Baker iba...@wikimedia.org wrote:

 I've made a minor API change to ApiUpload, for stashed files.  It's not a
 commonly used API and the change shouldn't break anything external, but I
 figured I should post just in case.

 Temporarily stashed files now have their metadata stored in the database
 instead of the session.  This allows for some future feature expansion with
 regard to the image review and categorization process, and works around a
 memcached race condition that was preventing simultaneous uploads from
 working (see bug 26179
 https://bugzilla.wikimedia.org/show_bug.cgi?id=26179
 )

 The actual change to the API is pretty simple.  First, the 'sessionkey'
 parameter has been superseded by 'filekey', though 'sessionkey' remains for
 compatibility.


Seems to me like this doesn't need any client-visible changes; any objection
to just leaving it labeled as is? It's a key to an 'upload session', so I
wouldn't assume it has any specific relationship to HTTP session cookies...

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] improving bugzilla patch review: Splinter?

2011-07-12 Thread Brion Vibber
On Tue, Jul 12, 2011 at 4:31 PM, Sumana Harihareswara suma...@wikimedia.org
 wrote:

 We have about 150 MediaWiki patches in Bugzilla that await review.  To
 make reviewers' lives easier, we could install an interactive patch
 review extension called Splinter on our Bugzilla installation.

 A brief but old overview:

 http://blog.fishsoup.net/2009/09/23/splinter-patch-review/

 If you have a bugzilla.mozilla.org account you can try out the latest
 version here (random bug  patch chosen as an example):


 https://bugzilla.mozilla.org/page.cgi?id=splinter.html&bug=652345&attachment=528146


What I particularly like about Splinter is that it's not insanely esoteric
-- it saves regular comments that you can see in their regular place (which
means if we remove it later, we haven't lost that data!), but also lets you
see -- and make -- the comments in context.

I'd probably recommend a couple things to streamline it:
* some sort of visual feedback instead of just having to figure out that
double-clicking opens a review box
* navigation issues: it seems to default to 'overview' which shows
nothing. :)
* see if it's possible to make a prettier inline view so you don't have to
link out


 Brion wrote:
  Might also be worth adapting some ideas from it for CodeReview in MW
  (making cleaner annotations against bits of code would be nice)

 We're going to get patches via Bugzilla from noncommitters for the
 foreseeable future and this seems like a quick way to make that more
 painless.  If there's a good Bugzilla-integrated patch review tool
 that's better than Splinter, tell me.  The Splinter extension is running
 on bugzilla.mozilla.org, which is at 4.0.1+.  So it's maintained and
 would be reasonable to install on our Bugzilla.

 (This of course is all dependent on having sysadmin resources to check
 out alternatives and install Splinter or whatever's deemed best.)


It's definitely worth trying out! Being current, maintained, and in use on
mozilla's mama bugzilla are all good signs; being a fairly clean BZ plugin
that uses the BZ API to fetch its data, it _shouldn't_ be too hard to set up
either. *mwahahahah*

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Supporting Extension authors

2011-07-17 Thread Brion Vibber
On Sat, Jul 9, 2011 at 2:17 PM, Mark A. Hershberger 
mhershber...@wikimedia.org wrote:


 MediaWiki.org is great for extension authors, as far as it goes.  Today,
 though, someone asked on #mediawiki how to create development branches
 for their extension in their SVN repo.  I told him I didn't think it
 could be done — that he might have to use the SVN repo as a backend to
 push to from git or bzr — but I'm not sure that answer was correct.


Should be able to just use the usual 'svn copy' from
/trunk/extensions/whatev to /branches/mycoolext/whatev, unless permissions
are locked down on the base 'branches' dir.

But managing/merging branches in SVN is always awkward; personally I tend to
do temporary work branches in one-off git repositories.

Anyway, as I was writing about the UNIQ tracking bug, I thought of some
 documentation and support that we should try to get in place for
 extension developers.  Since Sumana is creating a lot of good
 documentation about testing lately, that is where I started:

  * What sort of things should they test?

  * Can they have tests that will continue to work against the current
parser and the next one?


For now, keep using the existing parser test suite -- if we have to
adapt/change things in the future we'll figure it out then. :)

phpunit, qunit tests will generally not depend on parser, so don't forget to
test stuff in those ways if possible as well.


   * How can they write parser tests and unit tests to try out their
code?


'Look at what other extensions do' is the best answer until somebody writes
up some details, which would be super nice. :D

'Cite' is an example that has some parser tests; basically you need to
provide a .txt file with the test cases in the same format as the core ones,
and have a ... hook? Or a config var? that adds your file to the list of
cases to run.

If your extension will require certain DB tables to be present to operate
correctly, be sure to also handle the hook for adding your tables to the
list to be copied. Cite doesn't have any tables, but there are other
extensions like say Math or CheckUser that do. (Even if you don't have
parser stuff, if your ext gets used during load/save operations etc you may
need your tables copied if you want parser tests to work while you're
enabled!)
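
From memory the wiring looks something like the following in your extension's
setup file -- the names below are placeholders and I haven't double-checked
the details, so compare against Cite.php and CheckUser before copying:

    // Register your parser test cases with the core test runner:
    $wgParserTestFiles[] = dirname( __FILE__ ) . '/myExtParserTests.txt';

    // If your extension needs its own tables, have them copied into the
    // temporary database that the parser tests run against:
    $wgHooks['ParserTestTables'][] = 'efMyExtParserTestTables';

    function efMyExtParserTestTables( &$tables ) {
        $tables[] = 'my_ext_table';
        return true;
    }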


  * How can they make sure that those tests are run on the test server?
(I think this actually requires some work on the test server, but…)


Whoever runs the test setup would need to add the extensions in -- no idea
who/how at the moment. :)

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Rebranching 1.18

2011-07-18 Thread Brion Vibber
On Mon, Jul 18, 2011 at 2:58 PM, Chad innocentkil...@gmail.com wrote:

 I've already said this before and I'm going to say it againPlease
 do not take this as
 an invitation to begin breaking trunk with huge refactorings. It makes
 backporting fixes
 a major PITA and pushes the 1.18 release back even further. Corporal
 punishment
 may be doled out if you cannot follow this simple instruction 3


CORPORAL PUNISHMENT REPORTING FOR DUTY!  WHO WANTS TO GET REEE-VERTED?!
:)

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] RFC: Should we drop the rendering preferences for math?

2011-07-18 Thread Brion Vibber
Bug 24207 requests switching the math rendering preference default from its
current setting (which usually produces a nice PNG and occasionally produces
some kinda ugly HTML) to the "always render PNG" setting.
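
Flipping just the default, as the bug asks, would presumably be a one-liner
along these lines (constant name from memory -- verify against Defines.php):

    // Default the math rendering preference to "always render PNG"
    $wgDefaultUserOptions['math'] = MW_MATH_PNG;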

I'd actually propose dropping the rendering options entirely...

* "HTML if simple" and "if possible" produce *horrible* ugly output that
nobody likes, so people use hacks to force PNG rendering. Why not just
render to PNG?
* "MathML" mode is even *MORE* limited than "HTML if simple", making it
entirely useless.
* nobody even knows what "Recommended for modern browsers" means, but it
seems to be somewhere in that "occasionally crappy HTML, usually PNG"
continuum.

So we're left with only two sane choices:

* Always render PNG
* Leave it as TeX (for text browsers)

Text browsers will show the alt text on the images, which is... the TeX
code. So even this isn't actually needed for its stated purpose. (Hi
Jidanni! :) lynx should show the tex source when using the PNG mode.)

It's conceivable that a few folks really honestly prefer to see the latex
source in their graphical browsers (should at least do a quick stat check to
see if anybody uses it on purpose), but I wouldn't mind removing that
either.


Fancier rendering like MathJax etc should be considered as a separate thing
(and implemented a bit differently to avoid parser cache fragmentation!), so
don't let future mode concerns worry y'all. Any thoughts on whether this
makes sense to do for 1.18 or 1.19?

https://bugzilla.wikimedia.org/show_bug.cgi?id=24207#c9

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Rendering preferences for math - PDFs

2011-07-19 Thread Brion Vibber
On Tue, Jul 19, 2011 at 1:42 PM, Carl (CBM) cbm.wikipe...@gmail.com wrote:

 A second issue that has been raised on the wiki is the poor quality of
 math in PDFs generated from articles. The math shows up as a
 low-quality bitmap which is very pixelated and noticeably different
 from the body text. I wanted to pass it along to the list.


There's an entry in Bugzilla for this, which was initially closed:
https://bugzilla.wikimedia.org/show_bug.cgi?id=27574

I've gone ahead and re-opened it as it would be nice to fix, but it'll
actually need fixing in PediaPress's mwlib renderer that's used as the
backend for the Collection extension and actually generates the PDFs.

There's some notes on the bug about possible implementation details.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Extensions needing a maintainer

2011-07-21 Thread Brion Vibber
That doesn't look like it would actually work -- anybody could still get at
those pages' source through:
* edit
* special:export
* action=raw
* api
* search result extracts

-- brion
On Jul 21, 2011 9:32 AM, Sumana Harihareswara suma...@wikimedia.org
wrote:
 On 07/21/2011 10:39 AM, Mark A. Hershberger wrote:
 During my poll of developers about their Bugzilla settings, I was told
 that the current maintainers of the following two extensions do not have
 time to take care of them any longer and they need someone to step up
 and maintain them:

 EasyTimeline
 APC

 For what it is worth, EasyTimeline is deployed on WMF wikis (See
 http://hexm.de/58 for an example from mlwiki).

 Thanks,

 Mark.

 One additional extension that needs someone to update for MW 1.17, put
 it on mediawiki.org, and do a little maintenance:


http://www.epistemographer.com/2005/05/04/page-by-page-authentication-in-mediawiki/

 Judging by the comments, people do want this functionality, so it would
 be used, even if not on WMF wikis.

 Thanks,
 Sumana

 --
 Sumana Harihareswara
 Volunteer Development Coordinator
 Wikimedia Foundation


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] RFC: Should we drop the rendering preferences for math?

2011-07-21 Thread Brion Vibber
+1 on that -- using the wiki is the right thing! The list is great for devs,
but it's a high bar for people who just want to give feedback.

My position for now is to recommend reducing the existing options to just
PNG and text, and after that looking more seriously at auto-enabling MathJax etc.
for the next major improvement.

Will whip up a proposal page shortly.

-- brion
On Jul 21, 2011 9:56 AM, Chad innocentkil...@gmail.com wrote:
 On Wed, Jul 20, 2011 at 8:13 PM, Michael Hardy drmichaelha...@gmail.com
wrote:
 The two discussions should not be separate.  The issues
 in this thread have been talked about frequently since
 February 2003 on that wikiproject talk page.  Various
 views exist, but there is general agreement about the
 less-than-perfect nature of existing system.


 I encourage more people to join this thread then, rather than having a
 discussion on enwiki. Perhaps we should also write up a more formal
 RfC on mw.org. Brion?

 -Chad

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

