[Wikitech-l] Patching, given diff

2009-04-22 Thread Kent Wang
I'm building an application that uses DifferenceEngine.php to generate
word level unified diffs. I've figured out how to do this but now need
to generate patches given the diff.

This is what I have written to generate the diff:

$orig = array('One Two Three', 'One Two Three');
$closing = array('One Two Three', 'One Two Four');

$diff = new WordLevelDiff($orig, $closing);
$formatter = new UnifiedDiffFormatter();
echo $formatter->format($diff);

Which returns:

@@ -5,3 +5,3 @@
  One
  Two
- Three
+ Four

So my application will store this, and when it comes time to patch that
diff, I need a function that will do that, i.e. given the diff string
above and $orig, it should generate $closing.

Does such a patch functionality exist in MediaWiki, or anywhere else?

I'm using PHP and am aware of the xdiff extension but it doesn't
support word-level-diff, only line-level. And I can't install it
anyway.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Patching, given diff

2009-04-22 Thread Ilmari Karonen
Roan Kattouw wrote:
 2009/4/22 Kent Wang kw...@kwang.org:
 I'm building an application that uses DifferenceEngine.php to generate
 word level unified diffs. I've figured out how to do this but now need
 to generate patches given the diff.

 It's not in MediaWiki, and I don't know if it's in PHP, but there's a
 very widespread command line program installed on virtually every
 UNIX/Linux system that can do this. Unsurprisingly, it's called
 patch.

The problem is that diff and patch do line-level diffs, and he wants to 
do it on the word level.

Of course, a possible workaround would be to reversibly transform the
files such that every word (or other token) ends up on a separate line.
Since the transformed version doesn't really have to be readable, you
could, say, URL-encode every token.  Then you'd just have to figure out
how to correspondingly transform your diff so that it can be applied to
the transformed files by patch.
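
For instance, a rough (untested) sketch of that transform in PHP -- the
function names are just for illustration:

<?php
// One URL-encoded token per line, so a line-based diff/patch effectively
// works per word.  Whitespace is kept as tokens of its own, which makes
// the round trip lossless.
function tokensToLines( $text ) {
    $tokens = preg_split( '/(\s+)/', $text, -1,
        PREG_SPLIT_DELIM_CAPTURE | PREG_SPLIT_NO_EMPTY );
    return implode( "\n", array_map( 'urlencode', $tokens ) ) . "\n";
}

function linesToTokens( $lines ) {
    $tokens = array_map( 'urldecode', explode( "\n", rtrim( $lines, "\n" ) ) );
    return implode( '', $tokens );
}

Since urlencode() escapes all whitespace, every token is guaranteed to
occupy exactly one line of the transformed file.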

Of course, it's not that hard to apply a patch by hand either: a diff is 
essentially just a list of straightforward instructions of the form 
"delete these lines/tokens, insert these in their place".  In general, 
you first tokenize the file you're patching, and then loop over the 
diff, applying the changes to the list of tokens.
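
Something like this (untested, and assuming the hunk headers are accurate):

<?php
// Apply a unified diff to an array of tokens.  No fuzzy matching: the
// hunk offsets in the @@ headers are trusted as-is.
function applyUnifiedDiff( array $tokens, $diff ) {
    $result = array();
    $pos = 0; // 0-based index into $tokens
    foreach ( explode( "\n", $diff ) as $line ) {
        if ( preg_match( '/^@@ -(\d+)/', $line, $m ) ) {
            // Copy everything up to the start of this hunk verbatim.
            while ( $pos < $m[1] - 1 ) {
                $result[] = $tokens[$pos++];
            }
        } elseif ( $line !== '' && $line[0] === '+' ) {
            $result[] = substr( $line, 1 );  // insertion
        } elseif ( $line !== '' && $line[0] === '-' ) {
            $pos++;                          // deletion: skip the original
        } elseif ( $line !== '' && $line[0] === ' ' ) {
            $result[] = $tokens[$pos++];     // context: copy unchanged
        }
    }
    while ( $pos < count( $tokens ) ) {      // tail after the last hunk
        $result[] = $tokens[$pos++];
    }
    return $result;
}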

This works just fine as long as the patch applies exactly.  Much of the 
complexity in the patch utility is involved in fuzzy matching, which 
allows it to apply patches even if the target file isn't quite identical 
to the one the diff was generated against, by using the context 
information in the diff to adjust the offsets.  For some purposes, this 
feature isn't particularly important or useful; for others, it's vital.

-- 
Ilmari Karonen

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] On extension SVN revisions in Special:Version

2009-04-22 Thread Mark Clements (HappyDog)
Sergey Chernyshev sergey.chernys...@gmail.com wrote in message 
news:9984a7a70903270737k8ae2e9dq9dbeb12d6dfe0...@mail.gmail.com...

 There is probably a solution that can be done within the code - something
 that makes all included PHP files override the version if it's later than
 in the main file - something like this in the main file:

[Code snipped]

This is basically what I do in my WikiDB extension [1].  There is a 
WikiDB_RegisterRevision() function, and all files in the extension call this 
function using WikiDB_RegisterRevision('$Rev: 114 $'), where the 114 will be 
automatically substituted using svn:keywords expansion.  Check the files [2] 
to see it in situ.

[1] http://www.kennel17.co.uk/testwiki/WikiDB
[2] http://www.kennel17.co.uk/testwiki/WikiDB/Files
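
In essence it boils down to something like this (a simplified sketch, not
the actual WikiDB code):

<?php
$wgWikiDBMaxRevision = 0;

// Each file passes its own svn:keywords-expanded '$Rev: ... $' string;
// the highest revision seen across all loaded files wins.
function WikiDB_RegisterRevision( $revString ) {
    global $wgWikiDBMaxRevision;
    if ( preg_match( '/(\d+)/', $revString, $matches ) ) {
        $wgWikiDBMaxRevision = max( $wgWikiDBMaxRevision, (int)$matches[1] );
    }
}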


 Still, it's only a convention and not a universal solution; moreover it'll
 only work if the revision updated at least one PHP file. Also if the
 extension doesn't include all its files or uses the AutoLoader, then it
 will not be reliable.

All true.  In my case, I don't use auto-loading, all files are loaded on 
startup and I do not have any non-PHP files, so it works for me.  However 
this is far from best practice and won't give terribly good performance on 
busy wikis, I suspect.

Perhaps we should add a GetCredits hook, to be called on Special:Version 
in order to get the credits info for the extension?  If the hook is not 
found, or returns false, then the info in $wgExtensionCredits for that 
extension is used; otherwise the array returned from the function (which is 
in the same format) will be used instead.  This would mean that the extension 
could use this function to include() all available files in order to get the 
revision number, but wouldn't need to include them on normal pages (thus 
avoiding the performance hit).  It wouldn't solve the problem of non-PHP 
files being updated, but would solve the rest.
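
For example (a sketch only - the hook doesn't exist, and the names are
illustrative):

<?php
// Hypothetical 'GetCredits' handler for some extension.  Returns false to
// fall back to $wgExtensionCredits, or a credits array in the same format.
function MyExtension_GetCredits() {
    $maxRev = 0;
    // Scan each file's expanded '$Rev: ... $' keyword; an extension could
    // equally include() the files if it needs their side effects.
    foreach ( glob( dirname( __FILE__ ) . '/*.php' ) as $file ) {
        if ( preg_match( '/\$Rev: (\d+) \$/', file_get_contents( $file ), $m ) ) {
            $maxRev = max( $maxRev, (int)$m[1] );
        }
    }
    return array(
        'name' => 'MyExtension',
        'version' => '1.0.r' . $maxRev,
    );
}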

- Mark Clements (HappyDog).



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] On extension SVN revisions in Special:Version

2009-04-22 Thread Chad
On Wed, Apr 22, 2009 at 7:12 AM, Mark Clements (HappyDog)
gm...@kennel17.co.uk wrote:
 Sergey Chernyshev sergey.chernys...@gmail.com wrote in message
 news:9984a7a70903270737k8ae2e9dq9dbeb12d6dfe0...@mail.gmail.com...

 There is probably a solution that can be done within the code - something
 that makes all included PHP files override the version if it's later than
 in the main file - something like this in the main file:

 [Code snipped]

 This is basically what I do in my WikiDB extension [1].  There is a
 WikiDB_RegisterRevision() function, and all files in the extension call this
 function using WikiDB_RegisterRevision('$Rev: 114 $'), where the 114 will be
 automatically substituted using svn:keyword expansion.  Check the files [2]
 to see it in situ.

 [1] http://www.kennel17.co.uk/testwiki/WikiDB
 [2] http://www.kennel17.co.uk/testwiki/WikiDB/Files


 Still, it's only a convention and not a universal solution; moreover it'll
 only work if the revision updated at least one PHP file. Also if the
 extension doesn't include all its files or uses the AutoLoader, then it
 will not be reliable.

 All true.  In my case, I don't use auto-loading, all files are loaded on
 startup and I do not have any non-PHP files, so it works for me.  However
 this is far from best practice and won't give terribly good performance on
 busy wikis, I suspect.

 Perhaps we should add a GetCredits hook, to be called on Special:Version
 in order to get the credits info for the extension?  If the hook is not
 found, or returns false, then the info in $wgExtensionCredits for that
 extension is used; otherwise the array returned from the function (which is
 in the same format) will be used instead.  This would mean that the extension
 could use this function to include() all available files in order to get the
 revision number, but wouldn't need to include them on normal pages (thus
 avoiding the performance hit).  It wouldn't solve the problem of non-PHP
 files being updated, but would solve the rest.

 - Mark Clements (HappyDog).



 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Not sure it's worth it :-\ What's wrong with just giving version numbers
that make sense, rather than relying on the revision number which isn't
indicative of anything?

-Chad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] On extension SVN revisions in Special:Version

2009-04-22 Thread Brion Vibber
On 4/22/09 5:54 AM, Chad wrote:
 Not sure it's worth it :-\ What's wrong with just giving version numbers
 that make sense, rather than relying on the revision number which isn't
 indicative of anything?

It's indicative of the running version of the code, as long as it also 
tells you which branch to pull from. :) And of course as long as it's a 
relevant number like the revision of the extension directory...

-- brion

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Google Summer of Code: accepted projects

2009-04-22 Thread Michael Dale
Aryeh Gregor wrote:
 I'm not clear on why we don't just make the daemon synchronously
 return a result the way ImageMagick effectively does.  Given the level
 of reuse of thumbnails, it seems unlikely that the latency is a
 significant concern -- virtually no requests will ever actually wait
 on it.
   
(I basically outlined these issues on the SoC page, but here they are 
again with a bit more clarity.)

I recommended that the image daemon run semi-synchronously, since the 
changes needed to maintain multiple states and return non-cached 
place-holder images (while managing updates and page purges for when the 
updated images become available within the Wikimedia server architecture) 
probably won't be completed in the Summer of Code time-line. But if the 
student is up for it, the concept would be useful for other components 
like video transformation / transcoding, sequence flattening, etc.

== per issues outlined in bug 4854 ==
I don't think it's a good idea to invest a lot of energy into a separate 
python-based image daemon. It won't avoid all the problems listed in bug 4854.

Shell-character-exploit issues should be guarded against anyway (since 
not everyone is going to install the daemon).

Other people using MediaWiki won't want to add a Python- or Java-based 
image resizer and resolve the dependent Python or Java component 
libraries. It won't be easier to install than ImageMagick or PHP GD, 
which are repository-hosted applications already present in shared 
hosting environments.

Once you start integrating other libs like (Java) Batik it becomes 
difficult to resolve dependencies (Java, Python, etc.), and to install you 
have to push out a new program that is not integrated into the 
application repository managers of the various distributions.

The potential to isolate CPU and memory usage should be considered in the 
core MediaWiki image resize support anyway, i.e. we don't want to crash 
other people's servers that run MediaWiki by not checking the upper 
bounds of image transforms. Instead we should make the core image 
transform smarter: maybe have a configuration var that /attempts/ to bound 
the upper memory for spawned processing, and take that into account 
before issuing the shell command for a given large image transformation 
with a given shell application.

== What the image resize efforts should probably focus on ==

(1) making the existing system more robust, and (2) taking better 
advantage of multi-threaded servers.

(1) Right now the system chokes on large images. We should deploy support 
for an in-place image resize, maybe something like vips 
(http://www.vips.ecs.soton.ac.uk/index.php?title=Speed_and_Memory_Use). 
The system should intelligently call vips to transform the image to a 
reasonable size at time of upload, then use those derivatives for 
just-in-time thumbs for articles. (If vips is unavailable we don't 
transform, and we don't crash the apache node.)

(2) Maybe spin out the image transform process early on in the parsing of 
the page, with a place-holder and callback, so that by the time all the 
templates and links have been looked up the image is ready for output. 
(Maybe another function, wfShellBackgroundExec($cmd, $callback_function), 
using pcntl_fork, then normal wfShellExec, then pcntl_waitpid, then the 
callback function, which sets some var in the parent process so that 
pageOutput knows it's good to go.)
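
A rough, untested sketch of that hypothetical function (and note the
pcntl_* functions aren't available in many Apache PHP setups, so this
could only ever be optional):

<?php
function wfShellBackgroundExec( $cmd ) {
    $pid = pcntl_fork();
    if ( $pid === -1 ) {
        return false;   // fork failed: fall back to plain wfShellExec()
    } elseif ( $pid === 0 ) {
        exec( $cmd );   // child: run the (already escaped) shell command
        exit( 0 );
    }
    return $pid;        // parent: keep parsing the page
}

function wfShellBackgroundWait( $pid, $callback ) {
    pcntl_waitpid( $pid, $status );  // block until the child finishes
    call_user_func( $callback, pcntl_wexitstatus( $status ) );
}

The parser would call wfShellBackgroundExec() when it hits the image, and
wfShellBackgroundWait() just before page output.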

If operationally the daemon should be on a separate server, we should 
still more or less run synchronously ... as mentioned above ... if 
possible the daemon should be PHP-based so we don't explode the 
dependencies for deploying robust image handling with MediaWiki.

peace,
--michael

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] tran-subst-antiation

2009-04-22 Thread William Allen Simpson
Aryeh Gregor wrote:
 https://bugzilla.wikimedia.org/show_bug.cgi?id=4484
 
 That should do what you want.

Yes, this might be the equivalent of {{<substonly>subst:</substonly>.

That RFE has been around for years, and I was aware of it. The problem is,
interactions between two different kinds of markup parsing (<> tags and
{{}} braces) are often unpredictable, and have changed.

In the beginning, {{subst:# didn't work.  It does now?

Having thought about such issues on more than one occasion, I'm proposing
clean markup that's easier and predictably parsable, and fits well with
current practices and syntax:

   {{#   function (existing)
   {{##  substitute-only subst:, otherwise is transclude {{
   {{### substitute-only subst: for function, otherwise is {{#

Obviously, a lot easier to document and for editors to type!

And I've given it a cute neologism.


 This is also worth considering:
 
 https://bugzilla.wikimedia.org/show_bug.cgi?id=5453
 
 If {{<includeonly>subst:</includeonly>foo}} worked as {{foo}} on
 transclusion and {{subst:foo}} on substitution, that would also
 accomplish what you want (eliminating the need for the subst=subst:).
 
But it doesn't (or didn't) and it's already used as a hack, for example:

{{#ifeq:{{NAMESPACE}}|{{<includeonly>subst:</includeonly>NAMESPACE}}
|
| {{error:not substituted|cfd}}
}}

To be honest, I'm not sure how that even works, but it does...

Obviously, <nosubst> would handle that more elegantly.

Should <substonly> and <nosubst> become standard, we could use them.

NB: I'd prefer <substituteonly> and <nosubstitute>, spelled out like <includeonly>.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Skin JS cleanup and jQuery

2009-04-22 Thread Brion Vibber
On 4/17/09 6:45 PM, Marco Schuster wrote:
 On Fri, Apr 17, 2009 at 11:42 PM, Brion Vibber br...@wikimedia.org wrote:
 * Background JavaScript worker threads

 You mean...stuff like bots written in Javascript, using the XML API?
 I could imagine also sending mails via Special:Emailuser in the background
 to reach multiple recipients - that's a PITA if you want send mails to
 multiple users.

Perhaps... but note that the i/o for XMLHTTPRequest is asynchronous to 
begin with -- it's really only if you're doing heavy client-side 
_processing_ that you're likely to benefit from a background worker thread.

(Background threads also cannot directly touch the DOM or any objects in 
your foreground thread. You need to pass stuff in and out through 
indirect messages.)


 * Geolocation services

 That sounds kinda interesting, even if the accuracy on non-GPS-enabled
 devices isn't that high... can this in any way be joined with the OSM
 integration?

It sure could! :)

-- brion

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] tran-subst-antiation

2009-04-22 Thread Aryeh Gregor
On Wed, Apr 22, 2009 at 12:18 PM, William Allen Simpson
william.allen.simp...@gmail.com wrote:
 Having thought about such issues on more than one occasion, I'm proposing
 clean markup that's easier and predictably parsable, and fits well with
 current practices and syntax:

   {{#   function (existing)
   {{##  substitute-only subst:, otherwise is transclude {{
   {{### substitute-only subst: for function, otherwise is {{#

Again, personally I don't like magic symbols.  Distinctive keywords
are much better for grepping/Googling/etc.

 NB: I'd prefer <substituteonly> and <nosubstitute>, spelled out like <includeonly>.

Since the existing keyword is subst: and not substitute:, I think
nosubst would be better.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Skin JS cleanup and jQuery

2009-04-22 Thread Bilal Abdul Kader
There is an issue with running a foreground JS thread that is super fast and
might send a lot of requests to the server. Heavy processing on the client
side would alleviate the load on the server (if possible), but it might
push another load onto the server (in the presented example, sending emails
to users).

I have worked on an AJAX application that sends email via JavaScript, and it
turned out that the server was denying the JS requests because they went
beyond the allowed limit of connections from a single host.

A better approach might be to start the task at the client side and save it
in a queue at the server side for another (server-side) process to take care
of it later on in FIFO mode.
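
A minimal sketch of the idea (assuming a PDO connection and a mail_queue
table; the names are just for illustration):

<?php
// Web request: only enqueue, return immediately.
function enqueueMail( PDO $db, $recipient, $subject, $body ) {
    $stmt = $db->prepare(
        'INSERT INTO mail_queue (recipient, subject, body) VALUES (?, ?, ?)' );
    $stmt->execute( array( $recipient, $subject, $body ) );
}

// Worker (cron or a long-running process): drain oldest-first = FIFO.
function drainMailQueue( PDO $db ) {
    $rows = $db->query(
        'SELECT id, recipient, subject, body FROM mail_queue ORDER BY id LIMIT 50' );
    foreach ( $rows as $row ) {
        mail( $row['recipient'], $row['subject'], $row['body'] );
        $db->exec( 'DELETE FROM mail_queue WHERE id = ' . (int)$row['id'] );
    }
}

The worker can then throttle itself to stay under the server's connection
and sending limits.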

On Wed, Apr 22, 2009 at 12:18 PM, Brion Vibber br...@wikimedia.org wrote:


 Perhaps... but note that the i/o for XMLHTTPRequest is asynchronous to
 begin with -- it's really only if you're doing heavy client-side
 _processing_ that you're likely to benefit from a background worker thread.


On 4/17/09 6:45 PM, Marco Schuster wrote:
 You mean...stuff like bots written in Javascript, using the XML API?
 I could imagine also sending mails via Special:Emailuser in the background
 to reach multiple recipients - that's a PITA if you want send mails to
 multiple users.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Skin JS cleanup and jQuery

2009-04-22 Thread Brian
Many extensions are now using the Yahoo User Interface library. It would be
nice if mediawiki included it by default.

On Wed, Apr 15, 2009 at 3:05 PM, Brion Vibber br...@wikimedia.org wrote:

 Just a heads-up --

 Michael Dale is working on some cleanup of how the various JavaScript
 bits are loaded by the skins to centralize some of the currently
 horridly spread-out code and make it easier to integrate in a
 centralized loader so we can serve more JS together in a single
 compressed request.

 Unless there's a strong objection I'd be very happy for this to also
 include loading up the jQuery core library as a standard component.

 The minified jQuery core is 19k gzipped, and can simplify other JS code
 significantly so we can likely chop down wikibits.js, mwsuggest.js, and
 the site-customized Monobook.js files by a large margin for a net savings.

 If you've done browser-side JavaScript development without jQuery and
 wanted to kill yourself, I highly recommend you try jQuery -- it's
 so nice. :)

 -- brion

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Google Summer of Code: accepted projects

2009-04-22 Thread Brion Vibber
Thanks for taking care of the announce mail, Roan! I spent all day 
yesterday at the dentists... whee :P

I've taken the liberty of reposting it on the tech blog: 
http://techblog.wikimedia.org/2009/04/google-summer-of-code-student-projects-accepted/

I'd love for us to get the students set up on the blog to keep track of 
their project progress and raise visibility... :D

-- brion

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Skin JS cleanup and jQuery

2009-04-22 Thread Brion Vibber
On 4/22/09 9:33 AM, Sergey Chernyshev wrote:
 Exactly because this is the kind of requests we're going to get, I think it
 makes sense not to have any library bundled by default, but have a
 centralized handling for libraries, e.g. one extension asks for latest
 jQuery and latest YUI and MW loads them, another extension asks for jQuery
 only and so on.

Considering we want core code to be able to use jQuery, I think the case 
for bundling it is pretty strong. :)

-- brion

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Skin JS cleanup and jQuery

2009-04-22 Thread Brion Vibber
On 4/22/09 9:28 AM, Brian wrote:
 Many extensions are now using the Yahoo User Interface library. It would be
 nice if mediawiki included it by default.

Rather than bundling multiple separate libraries we aren't going to use 
in core code, I'd rather just make sure we've got a consistent interface 
for loading them.

That might include, say, bundling up a YUI loader as an extension and 
having that marked as a dependency for automatic installation when you 
install your YUI-needing extensions.

-- brion

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Backups

2009-04-22 Thread Brion Vibber
On 4/19/09 7:10 AM, Eugene wrote:
 Hi everyone,

 Are there any updates regarding Wikipedia's backup systems? I've
 created a bug to track this at
 https://bugzilla.wikimedia.org/show_bug.cgi?id=18255.

We'll be doing an audit soon to get our backup list up to date and make 
sure anything that's been stalled or delayed gets back on track.

-- brion

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Skin JS cleanup and jQuery

2009-04-22 Thread Sergey Chernyshev
Yep, with jQuery in the core it's probably best to just bundle it.

There is another issue with the code loading and stuff - making JS libraries
call a callback function when they load and all the functionality to be
there, instead of relying on the browser to block everything until the
library is loaded. This is quite an advanced thing, considering that all the
code will have to be converted to this model, but it will allow for much
better performance when implemented. Still, it's probably a Phase 5 kind of
optimization, but it can bring really good results, considering JS is the
biggest blocker.

More on the topic is on Steve Souders' blog:
http://www.stevesouders.com/blog/2008/12/27/coupling-async-scripts/

Thank you,

Sergey


--
Sergey Chernyshev
http://www.sergeychernyshev.com/


On Wed, Apr 22, 2009 at 12:42 PM, Brion Vibber br...@wikimedia.org wrote:

 On 4/22/09 9:33 AM, Sergey Chernyshev wrote:
  Exactly because this is the kind of requests we're going to get, I think it
  makes sense not to have any library bundled by default, but have a
  centralized handling for libraries, e.g. one extension asks for latest
  jQuery and latest YUI and MW loads them, another extension asks for jQuery
  only and so on.

 Considering we want core code to be able to use jQuery, I think the case
 for bundling it is pretty strong. :)

 -- brion

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] tran-subst-antiation

2009-04-22 Thread William Allen Simpson
William Allen Simpson wrote:
 Aryeh Gregor wrote:
 https://bugzilla.wikimedia.org/show_bug.cgi?id=4484

 That should do what you want.
 
 Yes, this might be the equivalent of {{<substonly>subst:</substonly>.
 ...
 Should substonly and nosubst become standard, we could use them.
 
 NB:I'd prefer substituteonly and nosubstitute, spelled out like 
 include.
 
And looking back at that RFE, which had a fairly simple patch, Brion
decided he was "strongly inclined to WONTFIX this."

So, what do we have to do to change his mind?

And, who would re-code the patch for the significantly revised parser?

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Skin JS cleanup and jQuery

2009-04-22 Thread Bilal Abdul Kader
This would be a great idea as the library is always updated and has a lot of
features for the front end.

On Wed, Apr 22, 2009 at 12:28 PM, Brian brian.min...@colorado.edu wrote:

 Many extensions are now using the Yahoo User Interface library. It would be
 nice if mediawiki included it by default.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] tran-subst-antiation

2009-04-22 Thread Aryeh Gregor
On Wed, Apr 22, 2009 at 12:53 PM, William Allen Simpson
william.allen.simp...@gmail.com wrote:
 And looking back at that RFE, which had a fairly simple patch, Brion
 decided strongly inclined to WONTFIX this.

 So, what do we have to do to change his mind?

Well, I already tried and he never responded, so maybe he already has.

 And, who would re-code the patch for the significantly revised parser?

Ah, now that's a good question.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] tran-subst-antiation

2009-04-22 Thread William Allen Simpson
Aryeh Gregor wrote:
 Again, personally I don't like magic symbols.  Distinctive keywords
 are much better for grepping/Googling/etc.
 
That thinking would take us back to <b>, <bold>, etc.  But I'll often take
"rough consensus and running code" over purity...  At least sometime in the
past there was some running code.


 NB:I'd prefer substituteonly and nosubstitute, spelled out like include.
 
 Since the existing keyword is subst: and not substitute:, I think
 nosubst would be better.
 
Yeah, I'm not too happy that keywords were sometimes chosen because they
match the name of a function provided by the underlying software.

How about allowing synonyms, {{substitute: = {{subst: = {{::?

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] On extension SVN revisions in Special:Version

2009-04-22 Thread Sergey Chernyshev
I probably have an idea of how to implement this using a bot (or post-commit
hook if we want real-time data) and externals.

Essentially the bot script should check the version of the extension folder
and generate and check in an entry in another repository, in a form like
this:

http://extensionversioningrepo/trunk/OpenID/version.php

writing the "Last Changed Rev" from svn info on
http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/OpenID/ -
something like this:

<?php $wgExtensionRevisions['OpenID'] = 49664;

Then we'll use the externals trick to map this into a constant location within
http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/OpenID/ -
something like:

svn propset svn:externals 'version
http://extensionversioningrepo/trunk/OpenID/'
http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/OpenID/

Then the $wgExtensionCredits declaration in
http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/OpenID/OpenID.setup.php
will have something constant like this:

$wgExtensionCredits['other'][] = array(
    'name' => 'OpenID',
    'version' => '1.8.4.rev' . $wgExtensionRevisions['OpenID'],
    ...
);

This way every svn checkout and svn update will have the extension version
checked out from extensionversioningrepo.

An alternative, simpler approach is to just store the versions directly in
the main repository as
http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/OpenID/version.php
using a post-commit hook, but that will double the revision count in the
main repository. It can still be done using a bot; in that case it will only
add one revision per run, checking all updated versions in.
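
For concreteness, a rough sketch of the bot step (paths and output
locations are illustrative):

<?php
// Read "Last Changed Rev" from `svn info` and write a version.php entry
// for one extension; the result then gets committed to the versioning repo.
$ext = 'OpenID';
$url = "http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/$ext/";
$info = shell_exec( 'svn info ' . escapeshellarg( $url ) );

if ( preg_match( '/^Last Changed Rev: (\d+)$/m', $info, $m ) ) {
    file_put_contents( "versions/$ext/version.php",
        "<?php \$wgExtensionRevisions['$ext'] = {$m[1]};\n" );
}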

Hope this can be a solution.

Sergey



On Wed, Apr 22, 2009 at 12:14 PM, Brion Vibber br...@wikimedia.org wrote:

 On 4/22/09 5:54 AM, Chad wrote:
  Not sure it's worth it :-\ What's wrong with just giving version numbers
  that make sense, rather than relying on the revision number which isn't
  indicative of anything?

 It's indicative of the running version of the code, as long as it also
 tells you which branch to pull from. :) And of course as long as it's a
 relevant number like the revision of the extension directory...

 -- brion

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Skin JS cleanup and jQuery

2009-04-22 Thread Sergey Chernyshev
No, my link is about 3 ways of loading:

   1. Normal script tags (current style)
   2. Asynchronous Script Loading (loading scripts without blocking, but
   without waiting for onload)
   3. Lazyloading (loading script onload).

Number 2 might be usable as well.

In any case changing all MW and Extensions code to work for #2 or #3 might
be a hard thing.

Thank you,

Sergey


--
Sergey Chernyshev
http://www.sergeychernyshev.com/


On Wed, Apr 22, 2009 at 1:21 PM, Michael Dale md...@wikimedia.org wrote:

 The mv_embed.js includes a doLoad function that matches the autoLoadJS
 classes listed in mediaWiki php. So you can dynamically autoload
 arbitrary sets of classes (js-files in the mediaWiki software) in a
 single http request and then run something once they are loaded.
 It can also autoload sets of wiki-titles for user-space scripts again
 in a single request grouping, localizing, gziping and caching all the
 requested wiki-title js in a single request. This is nifty cuz say your
 script has localized msg. You can fill these in in user-space
 MediaWiki:myMsg then put them in the header of your user-script, then
 have localized msg in user-space javascript ;) .. When I get a chance I
 will better document this ;) But its basically outlined here:
 http://www.mediawiki.org/wiki/Extension:ScriptLoader

 The link you highlight appears to be about running stuff once the page
 is ready. jQuery includes a function $(document).ready(function(){
 //code to run now that the dom-state is ready }) so your enabled gadget
 could use that to make sure the dom is ready before executing some
 functions.

 (Depending on the type of js functionality you're adding, it /may/ be
 better to load on-demand once a new interface component is invoked
 rather than front-load everything. Look at the add-media-wizard
 gadget on testing.wikipedia.org for an idea of how this works.)

 peace,
 --michael

 Sergey Chernyshev wrote:
  Yep, with jQuery in the core it's probably best to just bundle it.
 
  There is another issue with the code loading and stuff - making JS
  libraries call a callback function when they load and all the
  functionality to be there, instead of relying on the browser to block
  everything until the library is loaded. This is quite an advanced thing,
  considering that all the code will have to be converted to this model,
  but it will allow for much better performance when implemented. Still,
  it's probably a Phase 5 kind of optimization, but it can bring really
  good results, considering JS is the biggest blocker.
 
  More on the topic is on Steve Souders' blog:
  http://www.stevesouders.com/blog/2008/12/27/coupling-async-scripts/
 
  Thank you,
 
  Sergey
 
  On Wed, Apr 22, 2009 at 12:42 PM, Brion Vibber br...@wikimedia.org wrote:
   On 4/22/09 9:33 AM, Sergey Chernyshev wrote:
    Exactly because this is the kind of requests we're going to get, I
    think it makes sense not to have any library bundled by default, but
    have a centralized handling for libraries, e.g. one extension asks
    for latest jQuery and latest YUI and MW loads them, another extension
    asks for jQuery only and so on.
  
   Considering we want core code to be able to use jQuery, I think the
   case for bundling it is pretty strong. :)
  
   -- brion


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Google Summer of Code: accepted projects

2009-04-22 Thread Magnus Manske
On Wed, Apr 22, 2009 at 12:54 AM, Marco Schuster
ma...@harddisk.is-a-geek.org wrote:
 On Wed, Apr 22, 2009 at 12:22 AM, Roan Kattouw roan.katt...@gmail.com wrote:

 * Zhe Wu, mentored by Aryeh Gregor (Simetrical), will be building a
 thumbnailing daemon, so image manipulation won't have to happen on the
 Apache servers any more


 Wow, I'm lookin' forward to this. Mighta be worth a try to give the uploader
 the ability to choose non-standard resizing filters or so... or full-fledged
 image manipulation, something like a wiki-style photoshop.

On a semi-related note: What's the status of the management routines
that handle throwaway things like math PNGs?
Is this a generic system, so it can be used e.g. for jmol PNGs in the future?
Is it integrated with the image thumbnail handling?
Should it be?

Magnus

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Skin JS cleanup and jQuery

2009-04-22 Thread Michael Dale
hmm right...
The idea of the scriptLoader is that we get all our #1-included javascript
in a single request, so we don't have round trips that would benefit as
much from lazy loading; there's no need to rewrite stuff that is already
included that way.

I don't think we are proposing to convert all scripts to #2 or #3
loading...  We already have the importScriptURI function, which scripts
use for loading when not using #1.

I do suggest we move away from importScriptURI to something like the 
doLoad function in mv_embed ... that way we can load multiple js files 
in a single request using the mediaWiki scriptServer (if it's enabled). 
Right now all the importScriptURI stuff works non-blocking, and included 
scripts need to include code to execute anything they want to run. To 
make things more maintainable and modular we should transition to 
objects/classes providing methods which can be extended and autoloaded, 
rather than lots of single files doing lots of actions on the page in a 
less structured fashion. But there is no rush to transition, as the 
scripts are working as is and the new infrastructure will work with 
the scripts as they are.

But the idea of the new infrastructure is to support that functionality 
in the future...

--michael

Sergey Chernyshev wrote:
 No, my link is about 3 ways of loading:

1. Normal script tags (current style)
2. Asynchronous Script Loading (loading scripts without blocking, but
without waiting for onload)
3. Lazyloading (loading script onload).

 Number 2 might be usable as well.

 In any case changing all MW and Extensions code to work for #2 or #3 might
 be a hard thing.

 Thank you,

 Sergey


 --
 Sergey Chernyshev
 http://www.sergeychernyshev.com/


 On Wed, Apr 22, 2009 at 1:21 PM, Michael Dale md...@wikimedia.org wrote:

   
 The mv_embed.js includes a doLoad function that matches the autoLoadJS
 classes listed in mediaWiki php. So you can dynamically autoload
 arbitrary sets of classes (js-files in the mediaWiki software) in a
 single http request and then run something once they are loaded.
 It can also autoload sets of wiki-titles for user-space scripts again
 in a single request grouping, localizing, gziping and caching all the
 requested wiki-title js in a single request. This is nifty cuz say your
 script has localized msg. You can fill these in in user-space
 MediaWiki:myMsg then put them in the header of your user-script, then
 have localized msg in user-space javascript ;) .. When I get a chance I
 will better document this ;) But its basically outlined here:
 http://www.mediawiki.org/wiki/Extension:ScriptLoader

 The link you highlight appears to be about running stuff once the page
 is ready. jQuery includes a function $(document).ready(function(){
 //code to run now that the dom-state is ready }) so your enabled gadget
 could use that to make sure the dom is ready before executing some
 functions.

 (Depending on the type of js functionality you're adding, it /may/ be
 better to load on-demand once a new interface component is invoked
 rather than front-load everything. Look at the add-media-wizard
 gadget on testing.wikipedia.org for an idea of how this works.)

 peace,
 --michael

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] On extension SVN revisions in Special:Version

2009-04-22 Thread Brion Vibber
On 4/22/09 11:01 AM, Sergey Chernyshev wrote:
 I probably have an idea of how to implement this using a bot (or post-commit
 hook if we want real-time data) and externals.

 Essentially the bot script should check the version of the extension folder
 and generate and check in an entry in another repository, in a form like
 this:

 http://extensionversioningrepo/trunk/OpenID/version.php

 writing the "Last Changed Rev" from svn info on
 http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/OpenID/ -
 something like this:

 <?php $wgExtensionRevisions['OpenID'] = 49664;

Hmm, could be maintained in-tree this way, but IMHO not worth the 
trouble -- it's more likely to trigger merge conflicts and other annoyances.

Probably better would be:

1) Look up the actual useful revision from the working directory when 
we're running from a checkout.

2) Also include that info into releases & ExtensionDistributor output, so 
people who aren't working from checkouts will still get the useful 
versions, and look that file up if the SVN info isn't there.
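
For (1), something along these lines should do (a sketch; the function
name is illustrative, and it assumes the flat-text .svn/entries format of
SVN 1.4-1.6, where the fourth line carries the directory revision):

<?php
function wfGetSvnRevision( $dir ) {
    $entries = $dir . '/.svn/entries';
    if ( !file_exists( $entries ) ) {
        return false;   // not a checkout: fall back to the shipped file
    }
    $lines = file( $entries );
    // First line of the flat-text format is a bare format number.
    if ( preg_match( '/^\d+$/', trim( $lines[0] ) ) && isset( $lines[3] ) ) {
        return intval( $lines[3] );
    }
    return false;       // older XML format (or unknown): punt
}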

-- brion

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l