Re: [Wikitech-l] Resource Loader problem

2010-11-10 Thread Trevor Parscal
Well, we basically just need a template parser. Michael has one that 
seems to be working for him, but it would need to be cleaned up and 
integrated, as it's currently spread across multiple files and methods.

Do you like writing parsers?

- Trevor

On 11/9/10 10:25 PM, Dmitriy Sintsov wrote:
 * Trevor Parscal tpars...@wikimedia.org [Tue, 09 Nov 2010 11:01:47
 -0800]:
 It's on the roadmap and in progress. Want to help?

 I found the following notes in the linked from Roadmap article:
 http://www.mediawiki.org/wiki/ResourceLoader/Status

 Port Michael's wikitext-in-JS code (for PLURAL and stuff)
 It seems that Michael (Dale?) already wrote the code for that.

 After looking at server-side Language classes, it seems that client-side
 plural either requires AJAX (which is really inefficient for messages)
 or, as a much better approach, a set of JS language classes similar to
 the PHP ones, which is a larger piece of work.

 After looking at ResourceLoader code, it seems that message classes are
 already there; for example, I've found
 resources/mediawiki.language/languages/ru.js, which implements
 mediaWiki.language.convertPlural. However, mediaWiki.msg currently
 ignores them.

 I can match a plural with a JS regexp and replace the callback value
 with that mediaWiki.language.convertPlural() call; however, I have the
 feeling that it's better to wait for the trunk merge instead of
 re-inventing something that will not be used.

 It seems that ru.js mediaWiki.language.convertPlural matches the logic
 of LanguageRu::convertPlural. I don't know any more languages.
 What can I do to help?
 Dmitriy
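
For reference: assuming the standard Russian plural rules, the logic that ru.js's mediaWiki.language.convertPlural and LanguageRu::convertPlural implement presumably boils down to something like this sketch (the standalone function name is hypothetical, not MediaWiki's actual code):

```javascript
// Sketch of Russian plural selection. Forms are
// [ singular, paucal (2-4), plural ], e.g. for
// "1 страница / 2 страницы / 5 страниц".
// Hypothetical name, not the real ru.js code.
function convertPluralRu( count, forms ) {
	if ( forms.length < 3 ) {
		// Fall back to the last available form
		return forms[ forms.length - 1 ] || '';
	}
	var mod10 = count % 10,
		mod100 = count % 100;
	if ( mod10 === 1 && mod100 !== 11 ) {
		return forms[ 0 ]; // 1, 21, 31, ... but not 11
	}
	if ( mod10 >= 2 && mod10 <= 4 && ( mod100 < 12 || mod100 > 14 ) ) {
		return forms[ 1 ]; // 2-4, 22-24, ... but not 12-14
	}
	return forms[ 2 ]; // everything else, including 5-20
}
```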

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] Resource Loader problem

2010-11-10 Thread Dmitriy Sintsov
* Trevor Parscal tpars...@wikimedia.org [Wed, 10 Nov 2010 00:16:27 
-0800]:
 Well, we basically just need a template parser. Michael has one that
 seems to be working for him, but it would need to be cleaned up and
 integrated, as it's currently spread across multiple files and 
methods.

 Do you like writing parsers?

Maybe my knowledge of MediaWiki is not good enough, but don't the local 
messages only provide basic syntax features like {{PLURAL:||}}, rather 
than a full Parser with template calls and substitutions? I never tried 
to put real template calls into messages. Rewriting the whole Parser in 
JavaScript would be a lot of work; many people have already failed to 
make alternative parsers fully compatible. And how would one call the 
server-side templates, via AJAX calls? That would be inefficient.

I am currently trying to improve my Extension:WikiSync, and I also plan 
to make my other extensions ResourceLoader-compatible.

You know, my country is a kind of third-world country (and I am not 
living in the most developed local area), so I am not rich, and I mostly 
code for customers who use MediaWiki at their sites. They are kind 
enough to allow me to publish my extensions for free (GPL) at the 
MediaWiki site and repository, so these might be useful to someone else 
(I hope so).

I cannot afford to do _a_lot_ of free work (I'd run out of money). I 
realize that there are free volunteers at Wikimedia, some really 
skillful. However, most of them come from more prosperous places, I 
think.

However, if most of the work has already been done, I can take a look, 
but I don't have the links to look at (branches, patches), and I just 
don't know how much time it would take. Sorry.
Dmitriy



Re: [Wikitech-l] Killing $wgStyleDirectory

2010-11-10 Thread Daniel Friesen
On 10-11-09 03:42 PM, Platonides wrote:
 Trevor Parscal wrote:

 In relation to: https://bugzilla.wikimedia.org/show_bug.cgi?id=25124 - I
 am considering removing $wgStyleDirectory rather than further depending
 on it.

 It seems that it's only partially used right now, and making the style
 directory variable but not other directories is inconsistent as well.

 There was some point about this being a feature that makes MediaWiki
 more flexible, but I'm unsure this is actually adding flexibility, it
 seems more like unnecessary complexity.

 I'm considering just removing it - any points to the contrary would help
 me make this decision.

 - Trevor

 Backwards compatibility.
 Some systems may not support symlinks
 $wgStylePath would be sad if you killed $wgStyleDirectory
 (we provide them in pairs)

 Is it that hard to support it?

That's a -1 here too, we should be trying to improve flexibility, not 
drop existing flexibility. If resourceloader isn't listening to 
$wgStyleDirectory then it's a bug that should be fixed, unless 
resourceloader has a damn good reason for ignoring existing configuration.

Frankly, for a while now I've wanted to fix MediaWiki's dependence on a 
single directory for skins. Currently, installing a new skin requires 
adding a file or two and a directory into skins/, whilst installing an 
extension, besides the LocalSettings.php tweak, only requires dropping 
in a single directory. IMHO extensions should be able to provide new skins, 
just as extensions can currently provide new Special Pages. And the path 
to where default skins are fetched should be left configurable for those 
wanting to share skins between installs.

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]



Re: [Wikitech-l] Considering mirroring mediawiki-1.16.0.tar.gz

2010-11-10 Thread Roan Kattouw
2010/11/10 Roan Kattouw roan.katt...@gmail.com:
 2010/11/10 Tei oscar.vi...@gmail.com:
 Hi,

 Just a suggestion.

 Downloading the latest version of MediaWiki seems to take ages ATM.
 Maybe the servers are overloaded. And no mirror is offered.

 $  wget http://download.wikimedia.org/mediawiki/1.16/mediawiki-1.16.0.tar.gz
 this takes ages

 This is because of the dumps server outage reported in this channel
 earlier. The tarballs are hosted on the same server.

I have now mirrored the 1.16 tarball at
http://noc.wikimedia.org/mediawiki-1.16.0.tar.gz

Roan Kattouw (Catrope)



Re: [Wikitech-l] Selenium Smoke Test Framework

2010-11-10 Thread Benedikt Kaempgen
Hi,

when I created tests for the Monobook skin and then switched to the 
Vector skin, some tests no longer worked, e.g. due to different naming 
schemes for input fields.

Although that would require only slight changes to the tests, it would 
increase redundancy and the risk of inconsistency. How do you deal with 
this issue?

Cheers!

Benedikt

--
Karlsruhe Institute of Technology (KIT)
Institute of Applied Informatics and Formal Description Methods (AIFB)

Benedikt Kämpgen
Research Associate

Kaiserstraße 12
Building 11.40
76131 Karlsruhe, Germany

Phone: +49 721 608-7946
Fax: +49 721 608-6580
Email: benedikt.kaemp...@kit.edu
Web: http://www.kit.edu/

KIT – University of the State of Baden-Wuerttemberg and
National Research Center of the Helmholtz Association

-Original Message-
From: wikitech-l-boun...@lists.wikimedia.org
[mailto:wikitech-l-boun...@lists.wikimedia.org] On Behalf Of K. Peachey
Sent: Friday, November 05, 2010 1:47 AM
To: Wikimedia developers
Subject: Re: [Wikitech-l] Selenium Smoke Test Framework

On Fri, Nov 5, 2010 at 10:33 AM, Rob Lanphier ro...@wikimedia.org wrote:
 If people have thoughts on the types of things that developers
inadvertently
 break that something like Selenium would be appropriate for trying to
 detect, your thoughts would be greatly appreciated here.
DB support for the other formats (aka, apart from just MySQL), if it
installs, and some core functions work.
-Peachey


[Wikitech-l] Considering mirroring mediawiki-1.16.0.tar.gz

2010-11-10 Thread Tei
Hi,

Just a suggestion.

Downloading the latest version of MediaWiki seems to take ages ATM.
Maybe the servers are overloaded. And no mirror is offered.

$  wget http://download.wikimedia.org/mediawiki/1.16/mediawiki-1.16.0.tar.gz
this takes ages

I have managed to download it with this:
$ svn checkout 
http://svn.wikimedia.org/svnroot/mediawiki/branches/REL1_16/phase3

But I suppose that is not the preferred method.

It may be a good idea to provide mirrors for the mediawiki-1.16.0.tar.gz file.

-- 
--
End of message.


Re: [Wikitech-l] Resource Loader problem

2010-11-10 Thread Marco Schuster
On Wed, Nov 10, 2010 at 10:56 AM, Roan Kattouw roan.katt...@gmail.com wrote:
 We're not looking for a full-blown parser, just one that has a few
 basic features that we care about. The current JS parser only
 supports expansion of message parameters ($1, $2, ...), and we want
 {{PLURAL}} support too. AFAIK that's pretty much all we're gonna need.
 Michael Dale's implementation has $1 expansion and {{PLURAL}}, AFAIK,
 and maybe a few other features.

Actually, PHP and JS are a bit similar: different function names and
slight syntax differences, but I think it is possible to take the
existing PHP parser, strip out the references to MW internals and
replace the database queries with appropriate API calls.
That would also enable a true WYSIWYG editor, or at least live preview,
since a JS parser would also allow the resulting DOM nodes to carry
some kind of reference attribute that can be inspected to find the
wikitext responsible for creating the node (and so enable inline
editing).
Actually, this seems just perfect for a GSoC project for next year:
port the MW parser to JavaScript, with a follow-up project to make a
WYSIWYG/inline editor based on it.
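
The reference-attribute idea can be sketched roughly like this: suppose a (hypothetical) JS parser emits, for each top-level node, the HTML plus the character range of the wikitext that produced it; the renderer then stamps that range onto the node. All names and the data attribute below are made up for illustration:

```javascript
// Stamp each emitted HTML node with the source range of the wikitext
// that produced it, so a click in the rendered preview can be traced
// back to the wikitext for inline editing. Input nodes are assumed to
// look like { html: '<p>...</p>', start: 0, end: 12 }.
function renderWithSourceRefs( nodes ) {
	return nodes.map( function ( n ) {
		// Inject data-wt-range="start-end" into the opening tag
		return n.html.replace( /^<(\w+)/,
			'<$1 data-wt-range="' + n.start + '-' + n.end + '"' );
	} ).join( '' );
}
```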

Marco
-- 
VMSoft GbR
Nabburger Str. 15
81737 München
Geschäftsführer: Marco Schuster, Volker Hemmert
http://vmsoft-gbr.de


Re: [Wikitech-l] Considering mirroring mediawiki-1.16.0.tar.gz

2010-11-10 Thread Roan Kattouw
2010/11/10 Tei oscar.vi...@gmail.com:
 Hi,

 Just a suggestion.

 Downloading the latest version of MediaWiki seems to take ages ATM.
 Maybe the servers are overloaded. And no mirror is offered.

 $  wget http://download.wikimedia.org/mediawiki/1.16/mediawiki-1.16.0.tar.gz
 this takes ages

This is because of the dumps server outage reported in this channel
earlier. The tarballs are hosted on the same server.

Roan Kattouw (Catrope)



Re: [Wikitech-l] Backward compatibility guidelines

2010-11-10 Thread Chad
On Tue, Nov 9, 2010 at 2:08 PM, Aryeh Gregor
simetrical+wikil...@gmail.com wrote:
 Extensions are branched along with MediaWiki, so trunk extensions
 only have to support current trunk MediaWiki. However, per #General
 principles, don't break support for old MediaWiki versions unless the
 compatibility code is causing actual quantifiable problems, because
 it's nice if users can get the extra features or bug fixes of trunk
 extensions even if they're still using old MediaWiki. Some extensions'
 maintainers deliberately maintain compatibility with old MediaWiki
 versions, and their wishes should be respected in all cases.


Another +1

-Chad



Re: [Wikitech-l] Backward compatibility guidelines

2010-11-10 Thread Markus Krötzsch
On 09/11/2010 19:08, Aryeh Gregor wrote:
 On Mon, Nov 8, 2010 at 7:17 AM, Markus Krötzsch
 mar...@semantic-mediawiki.org  wrote:
 The proposed backward compatibility page says that trunk extensions
 are only compatible with the current MW version. This implies that it is not
 possible to cleanly develop an extension that supports multiple MW versions
 in trunk at all: since branches are copies of an earlier state of the trunk,
 the proposed guidelines entail that branches are also compatible with only
 one version of MediaWiki.

 Actually, what it says is:

 Extensions are branched along with MediaWiki, so trunk extensions
 only have to support current trunk MediaWiki. However, per #General
 principles, don't break support for old MediaWiki versions unless the
 compatibility code is causing actual quantifiable problems, because
 it's nice if users can get the extra features or bug fixes of trunk
 extensions even if they're still using old MediaWiki. Some extensions'
 maintainers deliberately maintain compatibility with old MediaWiki
 versions, and their wishes should be respected in all cases.

 The last sentence answers your question.  The phrase have to in the
 first sentence means they don't have to support old releases, but they
 can if their maintainers want them to.

Great, I overlooked this last part. +1 from me.

What remains of my email is the question of how this wish is 
implemented technically. Either one has to generally give up potentially 
problematic updates on ./trunk/extensions, or there must be some record 
of each extension's compatibility wishes, to determine which extensions 
an update should be applied to.

-- Markus



Re: [Wikitech-l] Killing $wgStyleDirectory

2010-11-10 Thread Roan Kattouw
2010/11/10 Daniel Friesen li...@nadir-seen-fire.com:
 If resourceloader isn't listening to
 $wgStyleDirectory then it's a bug that should be fixed, unless
 resourceloader has a damn good reason for ignoring existing configuration.

It's a bug that should be fixed, and can be fixed relatively easily.
However, Trevor also said that other parts of MW already don't obey
$wgStyleDirectory. Maybe someone needs to try setting it and see if
things break.

Roan Kattouw (Catrope)



Re: [Wikitech-l] Resource Loader problem

2010-11-10 Thread Daniel Friesen
On 10-11-10 02:55 AM, Marco Schuster wrote:
 On Wed, Nov 10, 2010 at 10:56 AM, Roan Kattouw roan.katt...@gmail.com  wrote:

 We're not looking for a full-blown parser, just one that has a few
 basic features that we care about. The current JS parser only
 supports expansion of message parameters ($1, $2, ...), and we want
 {{PLURAL}} support too. AFAIK that's pretty much all we're gonna need.
 Michael Dale's implementation has $1 expansion and {{PLURAL}}, AFAIK,
 Michael Dale's implementation has $1 expansion and {{PLURAL}}, AFAIK,
 and maybe a few other features.

 slight syntax differences, but I think it is possible to take the
 existing PHP parser, strip out the references to MW internals and
 replace the database queries with appropriate API calls.
 That would also enable a true WYSIWYG editor or live preview at
 least, as having a JS parser will also allow that the resulting DOM
 nodes have some kind of reference attribute which can be looked at
 to find the wikitext responsible for the creation of the node (and so,
 enable inline editing).
 Actually, this seems just perfect for a GSoC project for next year:
 Port the MW parser to JavaScript, and a followup project to make a
 WYSIWYG/inline editor based on it.

 Marco

Sorry, but the differences between PHP and JS are more than you think.
Besides /x, you're going to run into a bit of pain wherever /s is used.
And you'll be rearranging code a bit to cache RegExps created through 
string concatenation.
And there's the potential to be tripped up by associative arrays, if 
the parser uses them.
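
Two of those pitfalls can be illustrated concretely. A sketch, assuming a JS engine of the time with no /s flag; the names here are made up:

```javascript
// 1) PHP's /s modifier lets '.' match newlines; these JS engines have
//    no /s, so a ported pattern must use [\s\S] instead of '.'.
//    PHP equivalent: /<tag>(.*?)<\/tag>/s
var phpDotAll = /<tag>([\s\S]*?)<\/tag>/;

// 2) RegExps built via string concatenation should be compiled once
//    and cached, rather than recompiled on every call.
var regexCache = {};
function compiled( pattern, flags ) {
	var key = pattern + '\u0000' + ( flags || '' );
	if ( !regexCache[ key ] ) {
		regexCache[ key ] = new RegExp( pattern, flags );
	}
	return regexCache[ key ];
}
```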

And as for WYSIWYG, parsing is quite different, at least if you're sane 
enough not to want to re-run an extremely expensive parser like the MW 
one for every few characters changed.
And then there are extensions...

The parser is heavy... Even taking into account how efficient JS 
engines have become, and their potential to execute the parser even 
faster than PHP does, you don't want a heavy, directly ported parser 
handling message parsing client-side.

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]




Re: [Wikitech-l] Resource Loader problem

2010-11-10 Thread Roan Kattouw
2010/11/10 Dmitriy Sintsov ques...@rambler.ru:
 * Trevor Parscal tpars...@wikimedia.org [Wed, 10 Nov 2010 00:16:27
 -0800]:
 Well, we basically just need a template parser. Michael has one that
 seems to be working for him, but it would need to be cleaned up and
 integrated, as it's currently spread across multiple files and
 methods.

 Do you like writing parsers?

 Maybe my knowledge of MediaWiki is not good enough, but aren't the local
 messages only provide the basic syntax features like {{PLURAL:||}}, not
 a full Parser with template calls and substitutions? I never tried to
 put real template calls into messages. Rewriting the whole Parser in
 Javascript would be a lot of work. Many people have already failed to
 make alternative parsers fully compatible. And how would one call the
 server-side templates, via AJAX calls? That would be inefficient.

We're not looking for a full-blown parser, just one that has a few
basic features that we care about. The current JS parser only
supports expansion of message parameters ($1, $2, ...), and we want
{{PLURAL}} support too. AFAIK that's pretty much all we're gonna need.
Michael Dale's implementation has $1 expansion and {{PLURAL}}, AFAIK,
and maybe a few other features.
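
A transformer with just those two features ($N expansion and {{PLURAL}}) can be quite small. A rough sketch, not MediaWiki's or Michael Dale's actual code; convertPlural here stands in for a per-language function such as mediaWiki.language.convertPlural:

```javascript
// Minimal message transformer: expand $1, $2, ... then resolve
// {{PLURAL:count|form1|form2|...}} via a language-specific callback.
function transformMessage( msg, args, convertPlural ) {
	// Expand $N first, so {{PLURAL:$1|...}} sees a concrete number
	var expanded = msg.replace( /\$(\d+)/g, function ( m, n ) {
		var i = parseInt( n, 10 ) - 1;
		return i in args ? String( args[ i ] ) : m;
	} );
	// Then pick the right plural form for each {{PLURAL:...}}
	return expanded.replace( /\{\{PLURAL:([^|}]+)\|([^}]*)\}\}/gi,
		function ( m, count, formList ) {
			return convertPlural( parseInt( count, 10 ),
				formList.split( '|' ) );
		}
	);
}
```

With an English-style convertPlural callback, transforming 'You have $1 {{PLURAL:$1|message|messages}}' with argument 3 yields "You have 3 messages".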

 I am currently trying to improve my Extension:WikiSync, also I have
 plans to make my another extensions ResourceLoader compatible.

I think {{PLURAL}} is an important feature for ResourceLoader, and if
no volunteer wants to implement it, I think a staff developer should.

 However, if the most of work has already been done, I can take a look,
 but I don't have the links to look at (branches, patches). I just don't
 know how much time would it take. Sorry.
I believe most of the work has already been done, yes, but I've never
seen Michael's code and I don't know where it is (maybe someone who
does can post a link?).

Roan Kattouw (Catrope)



Re: [Wikitech-l] Question about external links CSS

2010-11-10 Thread Aryeh Gregor
On Tue, Nov 9, 2010 at 4:57 PM, Roan Kattouw roan.katt...@gmail.com wrote:
 ResourceLoader does this based on the lang= parameter passed to it
 when grabbing the CSS. Static dumps wouldn't be affected as long as
 they use one language consistently and fetch the CSS through RL.

If they're running ResourceLoader, they're hardly static, are they?
The idea is that you shouldn't have to run PHP to get a working static
(possibly partial) copy of a wiki's content.  Before ResourceLoader,
the CSS and JS were a nonissue, since those were just static files, but
that's no longer the case.  Whatever script generates static dumps is
presumably just creating them with no working CSS at all anymore,
which is bad.

I don't think this should be too hard to fix, though, if someone cares
enough to do it.  Whatever script makes static dumps should just get
ResourceLoader to output one CSS file that contains everything that
could be needed, probably with no JS (unless there's some JS static
dumps will actually want that I can't think of), and include that
statically.



Re: [Wikitech-l] Backward compatibility guidelines

2010-11-10 Thread Aryeh Gregor
On Tue, Nov 9, 2010 at 4:52 PM, Roan Kattouw roan.katt...@gmail.com wrote:
 Sounds good.

On Tue, Nov 9, 2010 at 7:48 PM, Andrew Garrett agarr...@wikimedia.org wrote:
 I support this wording.

On Wed, Nov 10, 2010 at 8:06 AM, Chad innocentkil...@gmail.com wrote:
 Another +1

I'm glad everyone likes this particular paragraph.  Have you looked at
the rest of the page?

http://www.mediawiki.org/wiki/Backward_compatibility

Perhaps the link got lost in my usual wall of text.



Re: [Wikitech-l] Backward compatibility guidelines

2010-11-10 Thread Chad
On Wed, Nov 10, 2010 at 10:03 AM, Aryeh Gregor
simetrical+wikil...@gmail.com wrote:
 I'm glad everyone likes this particular paragraph.  Have you looked at
 the rest of the page?


Yes, and I pretty much agree with everything you've said. Especially this:

In an ideal world, anything that a third-party extension might call should
continue to work across versions. In practice this is impossible, since most
of our functions are public and we can't possibly keep all of them stable.

-Chad


Re: [Wikitech-l] [Xmldatadumps-l] XML dumps stopped, possible fs/disk issues on dump server under investigation

2010-11-10 Thread Ariel T. Glenn
The server refused to come up on reboot; raid errors.  The backplane is
suspect.  A ticket is being opened with the vendor.  The host will
remain offline until we have good information about how to resolve the
problem or we get a replacement part from the vendor; we don't want to
risk losing the data.  The backplane is suspect due to earlier errors
from April that seemed to be disk errors.

Future updates will be made available here: 

http://wikitech.wikimedia.org/view/Dataset1

Ariel

On 09-11-2010 (Tue), at 21:44 -0800, Ariel T. Glenn wrote:
 We noticed a kernel panic message and stack trace in the logs on the
 server that serves XML dumps.  The web server that provides access to
 these files is temporarily out of commission; we hope to have it back
 online in 12 hours or less.  Dumps themselves have been suspended while
 we investigate.  I hope to have an update on this tomorrow as well.
 
 Ariel
 
 
 
 ___
 Xmldatadumps-l mailing list
 xmldatadump...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/xmldatadumps-l




Re: [Wikitech-l] Resource Loader problem

2010-11-10 Thread Michael Dale
The code is not spread across many files.. it's a single mw.Parser.js
file. It's being used in my gadget and Neil's upload wizard.   I agree
the parser is not the ideal parser: it's not feature-complete, not
very optimised, and it was hacked together quickly. But it passes all
the tests and matches the output of PHP for all the messages across all
the languages. I should have time in the next few days to re-merge /
clean it up a bit if no one else is doing it.

It should be clear who is doing what.

The parser as is ... is more of a starting point than a finished
project. But it starts by passing all the tests... If that's useful we
can plop it in there.

An old version of the test file is here (I have a ported / slightly
cleaner version in a patch):
http://prototype.wikimedia.org/s-9/extensions/JS2Support/tests/testLang.html

It also includes a test file that confirms the transforms work across a
sample set of messages. It's not clear to me how the current test files /
system scale ... Mostly for Krinkle:  mediawiki.util.test.js seems
to always include itself when in debug mode. And why does
mediawiki.util.test.js not define an object named
mediawiki.util.test? It instead defines mediawiki.test.

also:

if ( wgCanonicalSpecialPageName == 'Blankpage' &&
        mw.util.getParamValue( 'action' ) === 'mwutiltest' ) {

Seems gadget-like.. this logic can be done on the PHP side, no? Why not
deliver specific test payloads for specific test entry points? If you
imagine we have dozens of complicated test systems with subcomponents,
debug mode will become overloaded with JS code that never runs.

--michael

On 11/10/2010 10:56 AM, Roan Kattouw wrote:
 2010/11/10 Dmitriy Sintsov ques...@rambler.ru:
 * Trevor Parscal tpars...@wikimedia.org [Wed, 10 Nov 2010 00:16:27
 -0800]:
 Well, we basically just need a template parser. Michael has one that
 seems to be working for him, but it would need to be cleaned up and
 integrated, as it's currently spread across multiple files and
 methods.
 Do you like writing parsers?

 Maybe my knowledge of MediaWiki is not good enough, but aren't the local
 messages only provide the basic syntax features like {{PLURAL:||}}, not
 a full Parser with template calls and substitutions? I never tried to
 put real template calls into messages. Rewriting the whole Parser in
 Javascript would be a lot of work. Many people have already failed to
 make alternative parsers fully compatible. And how would one call the
 server-side templates, via AJAX calls? That would be inefficient.

 We're not looking for a full-blown parser, just one that has a few
 basic features that we care about. The current JS parser only
 supports expansion of message parameters ($1, $2, ...), and we want
 {{PLURAL}} support too. AFAIK that's pretty much all we're gonna need.
 Michael Dale's implementation has $1 expansion and {{PLURAL}}, AFAIK,
 and maybe a few other features.

 I am currently trying to improve my Extension:WikiSync, also I have
 plans to make my another extensions ResourceLoader compatible.

 I think {{PLURAL}} is an important feature for ResourceLoader, and if
 no volunteer wants to implement it, I think a staff developer should.

 However, if the most of work has already been done, I can take a look,
 but I don't have the links to look at (branches, patches). I just don't
 know how much time would it take. Sorry.
 I believe most of the work has already been done, yes, but I've never
 seen Michael's code and I don't know where it is (maybe someone who
 does can post a link?).

 Roan Kattouw (Catrope)





Re: [Wikitech-l] Resource Loader problem

2010-11-10 Thread Dmitriy Sintsov
* Daniel Friesen li...@nadir-seen-fire.com [Wed, 10 Nov 2010 05:54:51 
-0800]:
 Sorry, but the differences between PHP and JS are more than you think.
 Besides /x you're going to run into a bit of pain where /s is used.
 And you'll be rearanging code a bit to cache RegExp's created through
 string concatenation.
 And there's the potential to be tripped up by associative arrays if 
the
 parser uses them.

 And as for WYSIWYG, parsing is quite different, at least if you're 
sane
 enough to not want to re-run an extremely expensive parser like the MW
 one for every few character changes.
 And then there are extensions...

 The parser is heavy... even if you take into account how efficient JS
 engines have become and the potential for them to be even faster at
 executing the parser than php is you don't want a heavy directly 
ported
 parser doing the work handling message parsing client side.

PHP will never come to browsers. However, there is a way to bring 
JavaScript to mod_php:
http://pecl.php.net/package/spidermonkey
It even has beta status, not alpha. Maybe it even has a chance to be 
included in the Parser? However, one should not expect to find it on 
crippled shared hosting; good dedicated hosting / co-location is 
probably required to compile / set it up yourself (though many MediaWiki 
installations do run on such hosting, not crippled ones).

The same language for server and client side would bring many 
advantages: you wouldn't have to re-implement something complex twice 
in both languages, only the parts that differ between server and client.
Dmitriy



Re: [Wikitech-l] Resource Loader problem

2010-11-10 Thread Dmitriy Sintsov
* Michael Dale md...@wikimedia.org [Wed, 10 Nov 2010 17:20:26 +0100]:
 The code is not spread across many files.. its a single mw.Parser.js
 file. Its being used in my gadget and Neil's upload wizard.   I agree
 the parser is not the ideal parser, its not feature complete, is not
 very optimised, and it was hacked together quickly. But it passes all
 the tests and matches the output of php for all the messages across 
all
 the languages. I should have time in the next few days to re merge /
 clean it up a bit if no one else is doing it.

As Roan has pointed out, local messages don't need a complete parser 
anyway. I hope you have the time to merge it; if so, can you announce 
it, please?
Dmitriy



Re: [Wikitech-l] Resource Loader problem

2010-11-10 Thread Trevor Parscal
On 11/10/10 8:20 AM, Michael Dale wrote:
 The code is not spread across many files..
I spent hours going over the code, and there were several points in the 
code that called out to other libraries. This was based on the patch you 
provided me.
 I agree the parser is not the ideal parser, its not feature complete, is not
 very optimised, and it was hacked together quickly.
Which is why I suggested it as a starting point.

Essentially, Michael has solved some of this problem already, we just 
need to integrate it.

- Trevor



Re: [Wikitech-l] Resource Loader problem

2010-11-10 Thread Trevor Parscal
On 11/10/10 1:56 AM, Roan Kattouw wrote:
 I think {{PLURAL}} is an important feature for ResourceLoader, and if
 no volunteer wants to implement it, I think a staff developer should.
Certainly staff members (me, for instance) are already assigned this 
work. I was responding to a query about "what can I do". In short, you 
can help :)

- Trevor



Re: [Wikitech-l] Killing $wgStyleDirectory

2010-11-10 Thread Trevor Parscal
Skin extensions exist. Look at extensions/skins.

- Trevor

On 11/10/10 12:32 AM, Daniel Friesen wrote:
 On 10-11-09 03:42 PM, Platonides wrote:
 Trevor Parscal wrote:

 In relation to: https://bugzilla.wikimedia.org/show_bug.cgi?id=25124 - I
 am considering removing $wgStyleDirectory rather than further depending
 on it.

 It seems that it's only partially used right now, and making the style
 directory variable but not other directories is inconsistent as well.

 There was some point about this being a feature that makes MediaWiki
 more flexible, but I'm unsure this is actually adding flexibility, it
 seems more like unnecessary complexity.

 I'm considering just removing it - any points to the contrary would help
 me make this decision.

 - Trevor

 Backwards compatibility.
 Some systems may not support symlinks
 $wgStylePath would be sad if you killed $wgStyleDirectory
 (we provide them in pairs)

 Is it that hard to support it?

 That's a -1 here too, we should be trying to improve flexibility, not
 drop existing flexibility. If resourceloader isn't listening to
 $wgStyleDirectory then it's a bug that should be fixed, unless
 resourceloader has a damn good reason for ignoring existing configuration.

 Frankly, for a while now I've wanted to fix MediaWiki's dependence on a
 single directory for skins. Currently, installing a new skin requires
 adding a file or two and a directory into skins/, whilst installing an
 extension, besides the LocalSettings.php tweak, only requires dropping in
 a single directory. IMHO extensions should be able to provide new skins,
 just as extensions can currently provide new Special Pages. And the path
 where default skins are fetched from should be left configurable for those
 wanting to share skins between installs.

 ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]



Re: [Wikitech-l] Selenium Smoke Test Framework

2010-11-10 Thread Trevor Parscal
These differences in field names aren't just down to Vector; they depend 
on how Vector is configured. If you set $wgVectorUseSimpleSearch = false; 
then the field names for search will be the same. The point is that the 
simple search is a very different control, and a script that hooks into 
it would fail because it's different. So what you are experiencing is 
not a bug; you just need to make your script work with one or the other.

- Trevor
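One way to keep skin-specific field names out of individual tests is a small per-skin selector map; the selector strings below are illustrative, not the real Monobook/Vector element IDs:

```javascript
// A per-skin selector map keeps skin-specific field names out of the
// tests themselves. The selectors below are illustrative, not the real
// Monobook/Vector element IDs.
var selectors = {
  monobook: { searchInput: '#searchInput', searchButton: '#searchGoButton' },
  vector:   { searchInput: '#simpleSearch input', searchButton: '#searchButton' }
};

function sel(skin, name) {
  var skinMap = selectors[skin];
  if (!skinMap || !(name in skinMap)) {
    throw new Error('No selector "' + name + '" for skin "' + skin + '"');
  }
  return skinMap[name];
}

// A test asks for sel(currentSkin, 'searchInput') instead of hard-coding
// a field name that differs between skins or configurations.
console.log(sel('monobook', 'searchInput')); // #searchInput
```

With this indirection, switching the wiki from Monobook to Vector means updating one map rather than every test.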

On 11/10/10 2:36 AM, Benedikt Kaempgen wrote:
 Hi,

 when I created tests for monobook and then switched to vector skin some
 tests did not work anymore, e.g., due to different naming schemes of input
 fields.

 Although that would require only slight changes to the tests, it would
 increase redundancy and risk of inconsistency. How do you deal with this
 issue?

 Cheers!

 Benedikt

 --
 Karlsruhe Institute of Technology (KIT)
 Institute of Applied Informatics and Formal Description Methods (AIFB)

 Benedikt Kämpgen
 Research Associate

 Kaiserstraße 12
 Building 11.40
 76131 Karlsruhe, Germany

 Phone: +49 721 608-7946
 Fax: +49 721 608-6580
 Email: benedikt.kaemp...@kit.edu
 Web: http://www.kit.edu/

 KIT -- University of the State of Baden-Wuerttemberg and
 National Research Center of the Helmholtz Association

 -Original Message-
 From: wikitech-l-boun...@lists.wikimedia.org
 [mailto:wikitech-l-boun...@lists.wikimedia.org] On Behalf Of K. Peachey
 Sent: Friday, November 05, 2010 1:47 AM
 To: Wikimedia developers
 Subject: Re: [Wikitech-l] Selenium Smoke Test Framework

 On Fri, Nov 5, 2010 at 10:33 AM, Rob Lanphierro...@wikimedia.org  wrote:
 If people have thoughts on the types of things that developers
 inadvertently
 break that something like Selenium would be appropriate for trying to
 detect, your thoughts would be greatly appreciated here.
 DB support for the other formats (aka, apart from just MySQL), if it
 installs, and some core functions work.
 -Peachey



Re: [Wikitech-l] Question about external links CSS

2010-11-10 Thread Trevor Parscal
On 11/10/10 6:52 AM, Aryeh Gregor wrote:
 Whatever script makes static dumps should just get ResourceLoader to
 output one CSS file that contains everything that could be needed,
 probably with no JS (unless there's some JS static dumps will actually
 want that I can't think of), and include that statically.
You will need to generate one for right-to-left and one for 
left-to-right. ResourceLoader has an API that supports generating these 
responses directly in PHP.

- Trevor



Re: [Wikitech-l] Question about external links CSS

2010-11-10 Thread Roan Kattouw
2010/11/10 Trevor Parscal tpars...@wikimedia.org:
 On 11/10/10 6:52 AM, Aryeh Gregor wrote:
 Whatever script makes static dumps should just get ResourceLoader to
 output one CSS file that contains everything that could be needed,
 probably with no JS (unless there's some JS static dumps will actually
 want that I can't think of), and include that statically.
 You will need to generate one for right-to-left and one for
 left-to-right. ResourceLoader has an API that supports generating these
 responses directly in PHP.

CSSJanus::transform( $css ) in includes/libs/CSSJanus.php is what
Trevor's talking about here.

I guess you could take a concatenated version of all style sheets and
generate an RTL version of that. What do the current static dumps do?

Roan Kattouw (Catrope)
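To give a feel for what CSSJanus::transform does, here is a toy JavaScript flip. The real CSSJanus also handles four-value shorthands, background positions, /* @noflip */ hints and much more; this sketch does not:

```javascript
// Toy left/right flip illustrating the kind of rewriting CSSJanus does.
// The real CSSJanus handles four-value shorthands, background positions,
// /* @noflip */ hints and more; this sketch does not.
function naiveFlip(css) {
  return css.replace(/\b(left|right)\b/g, function (match) {
    return match === 'left' ? 'right' : 'left';
  });
}

console.log(naiveFlip('div { float: left; margin-right: 1em; }'));
// div { float: right; margin-left: 1em; }
```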



Re: [Wikitech-l] Question about external links CSS

2010-11-10 Thread Roan Kattouw
2010/11/10 Roan Kattouw roan.katt...@gmail.com:
 I guess you could take a concatenated version of all style sheets and
 generate an RTL version of that.
Hmm, and remap image paths, now I think of it. It'd probably be easier
to pull the CSS for all modules from the resource loader this way:

load.php?modules=all|module|names|here&only=styles&lang=en

Which will do concatenation and path remapping for you, and will do
RTL transformation if lang is set to an RTL language.

Roan Kattouw (Catrope)
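A load.php URL of that shape can be assembled like this; the base path and module names are made up for the example:

```javascript
// Assemble a styles-only load.php URL as described above.
// The base path and module names are made up for the example.
function buildLoadUrl(base, modules, lang) {
  return base + '/load.php' +
    '?modules=' + modules.join('|') +
    '&only=styles' +
    '&lang=' + lang;
}

var url = buildLoadUrl(
  'http://example.org/w',
  ['mediawiki.legacy.shared', 'mediawiki.legacy.commonPrint'],
  'ar' // an RTL language code triggers the RTL transformation
);
console.log(url);
// http://example.org/w/load.php?modules=mediawiki.legacy.shared|mediawiki.legacy.commonPrint&only=styles&lang=ar
```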



[Wikitech-l] Is that download.wikimedia.org server is down?

2010-11-10 Thread Billy Chan
Hi all,

I am trying to download MediaWiki via the following link, but with no luck;
the server seems to be down:

http://download.wikimedia.org/mediawiki/1.16/mediawiki-1.16.0.tar.gz

Does anybody know where I should report this, and how the service could be
restored? Thanks.


Re: [Wikitech-l] Is that download.wikimedia.org server is down?

2010-11-10 Thread Robin Krahl
Hi Billy,

On 10.11.2010 19:48, Billy Chan wrote:
 I am trying to download mediawiki by the following link, but with no luck,
 the server seems down:

That’s a known problem (see tech channel or server admin log). You may
use the following file instead:
  http://noc.wikimedia.org/mediawiki-1.16.0.tar.gz

Regards,
Robin

-- 
Robin Krahl || ireas
  http://robin-krahl.de
  m...@robin-krahl.de




Re: [Wikitech-l] Resource Loader problem

2010-11-10 Thread Jeroen De Dauw
Hey,

 Can you link to the actual fail map instead of a screenshot so I/we
 can look at this in Firebug ourselves? I'm pretty sure /something/ is
 going wrong with image URL remapping here, but it's hard to figure out
what unless I can see it fail in my browser.

Sure, the issue can be seen here:
http://wiki.bn2vs.com/OpenLayers_map_with_RL

Cheers

--
Jeroen De Dauw
http://blog.bn2vs.com
Don't panic. Don't be evil.
--


Re: [Wikitech-l] Resource Loader problem

2010-11-10 Thread Trevor Parscal
Your problem has nothing to do with CSS. The images that are not loading 
are actual img tags, and they are trying to find images in the /img/ 
folder, which means http://wiki.bn2vs.com/img/ in your case.

- Trevor

On 11/10/10 1:40 PM, Jeroen De Dauw wrote:
 Hey,

 Can you link to the actual fail map instead of a screenshot so I/we
 can look at this in Firebug ourselves? I'm pretty sure /something/ is
 going wrong with image URL remapping here, but it's hard to figure out
 what unless I can see it fail in my browser.
 Sure, the issue can be seen here:
 http://wiki.bn2vs.com/OpenLayers_map_with_RL

 Cheers

 --
 Jeroen De Dauw
 http://blog.bn2vs.com
 Don't panic. Don't be evil.
 --


Re: [Wikitech-l] [Xmldatadumps-l] XML dumps stopped, possible fs/disk issues on dump server under investigation

2010-11-10 Thread emijrp
What data is at risk?

2010/11/10 Ariel T. Glenn ar...@wikimedia.org

 The server refused to come up on reboot; raid errors.  The backplane is
 suspect.  A ticket is being opened with the vendor.  The host will
 remain offline until we have good information about how to resolve the
 problem or we get a replacement part from the vendor; we don't want to
 risk losing the data.  The backplane is suspect due to earlier errors
 from April that seemed to be disk errors.

 Future updates will be made available here:

 http://wikitech.wikimedia.org/view/Dataset1

 Ariel

 On Tue, 09-11-2010 at 21:44 -0800, Ariel T. Glenn wrote:
  We noticed a kernel panic message and stack trace in the logs on the
  server that serves XML dumps.  The web server that provides access to
  these files is temporarily out of commission; we hope to have it back
  online in 12 hours or less.  Dumps themselves have been suspended while we
  investigate.  I hope to have an update on this tomorrow as well.
 
  Ariel
 
 
 
  ___
  Xmldatadumps-l mailing list
  xmldatadump...@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/xmldatadumps-l




Re: [Wikitech-l] Resource Loader problem

2010-11-10 Thread Jeroen De Dauw
Hey,

 ...are trying to find images in the /img/ folder...

So why is this, and how can it be fixed? It works when not using the RL...

Cheers

--
Jeroen De Dauw
http://blog.bn2vs.com
Don't panic. Don't be evil.
--


Re: [Wikitech-l] [Xmldatadumps-l] XML dumps stopped, possible fs/disk issues on dump server under investigation

2010-11-10 Thread K. Peachey
On Thu, Nov 11, 2010 at 7:49 AM, emijrp emi...@gmail.com wrote:
 What data is at risk?
Data from dumps.wm and download.wm
-Peachey



Re: [Wikitech-l] [Xmldatadumps-l] XML dumps stopped, possible fs/disk issues on dump server under investigation

2010-11-10 Thread Ariel T. Glenn
On Thu, 11-11-2010 at 09:16 +1000, K. Peachey wrote:
 On Thu, Nov 11, 2010 at 7:49 AM, emijrp emi...@gmail.com wrote:
   What data is at risk?
 Data from dumps.wm and download.wm
 -Peachey

The XML archives are at issue; other data was copied off because it
takes a lot less space.  We are proceeding very cautiously, as we do
want to preserve all data.

Ariel



Re: [Wikitech-l] Is that download.wikimedia.org server is down?

2010-11-10 Thread Billy Chan
Hi Robin,

Thanks for the link. Do you know where I can download the XML dumps now?
Thanks.

2010/11/11 Robin Krahl m...@robin-krahl.de

 Hi Billy,

 On 10.11.2010 19:48, Billy Chan wrote:
  I am trying to download mediawiki by the following link, but with no
 luck,
  the server seems down:

 That’s a known problem (see tech channel or server admin log). You may
 use the following file instead:
  http://noc.wikimedia.org/mediawiki-1.16.0.tar.gz

 Regards,
Robin

 --
 Robin Krahl || ireas
  http://robin-krahl.de
  m...@robin-krahl.de




Re: [Wikitech-l] Killing $wgStyleDirectory

2010-11-10 Thread Tim Starling
On 10/11/10 08:56, Trevor Parscal wrote:
 In relation to: https://bugzilla.wikimedia.org/show_bug.cgi?id=25124 - I 
 am considering removing $wgStyleDirectory rather than further depending 
 on it.
 
 It seems that it's only partially used right now, and making the style 
 directory variable but not other directories is inconsistent as well.
 
  There was some point about this being a feature that makes MediaWiki
  more flexible, but I'm unsure it actually adds flexibility; it
  seems more like unnecessary complexity.
 
 I'm considering just removing it - any points to the contrary would help 
 me make this decision.

I traced back the annotations of $wgStyleDirectory, thinking that it
must have been added for some frivolous purpose. It turns out to have
quite a long history.

It started off as $wgStyleSheetDirectory and was added by Lee. In the
V_PHASE3REORG branch, there was an installer script (install.php) that
copied files from a private source location to a web-accessible
install path ($IP). The contents of the ./stylesheets directory was
copied to $wgStyleSheetDirectory, which defaulted to $IP/style (i.e. a
different layout to the source). This was the only use of
$wgStyleSheetDirectory at the time.

Today, it doesn't really make sense to set $IP in LocalSettings.php,
but originally it was necessary to set it manually, and Lee imagined
that it would be some other directory than the place you unpacked the
source tarball. It's entirely possible that nobody ever used this
scheme other than Lee.

In 1.2, Brion introduced the web installer, which assumed a
web-accessible source directory, effectively obsoleting the source/$IP
distinction. The web installer set $wgStyleSheetDirectory to
$IP/stylesheets, matching the source directory layout. However it
wasn't actually used by anything, apart from the old installer.

The feature rotted until 1.7, when Rob Church resurrected it in r14449
and r14797, fixing bug 4033. Presumably it's been rotting again since
then, if it's broken like Trevor says.

My preference is to remove it, if it's only going to work for one
major version in every 10. I guess the point of it in today's
MediaWiki is to allow different wikis to use different sets of skins,
or different patched core skins with the same names. There are a number
of other ways to achieve the same thing, and I don't think any of them
would break so regularly.

Platonides wrote:
 Backwards compatibility.
 Some systems may not support symlinks
 $wgStylePath would be sad if you killed $wgStyleDirectory
 (we provide them in pairs)

If you're on Windows XP, you could just use completely separate copies
of MediaWiki, maybe with a deployment script to manage them. If you're
on Windows Vista or Server 2008, symlinks are supported and you can
use the same solutions as on Unix.

$wgStyle[Sheet]Path is a lot more useful than $wgStyleDirectory. We
used it during the 1.5 upgrade on Wikimedia, to allow MW 1.4 and 1.5
to run on the cluster simultaneously, with skins-1.4 mapped to the 1.4
skins directory, and skins-1.5 mapped to the 1.5 skins directory. And
we use it now to implement bits.wikimedia.org. It's not going to rot
while it's in use on Wikimedia.

-- Tim Starling

