thanks for the constructive response :) ... comments inline

Tim Starling wrote:
I agree we should move things into a global object, i.e. $j, and all our components / features should extend that object (like jQuery plugins). That is the direction we are already going.

I think it would be better if jQuery was called window.jQuery and
MediaWiki was called window.mw. Then we could share the jQuery
instance with JS code that's not aware of MediaWiki, and we wouldn't
need to worry about namespace conflicts between third-party jQuery
plugins and MediaWiki.
Right, but there are benefits to hooking into the jQuery plugin system that would not be as clean to wrap into our window.mw object. For example, $('#textbox').wikiEditor() uses jQuery selectors for the target, along with other jQuery plugin conventions like the local jQuery alias inside (function($){ ... })(jQuery);

Although if you're not designing your tool as a jQuery plugin then yea ;) ... but I think most of the tools should be designed as jQuery plugins.
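
For reference, the plugin convention I mean looks roughly like this (the wikiEditor internals here are just a placeholder, not the actual implementation):

    // Standard jQuery plugin wrapper: "$" is a local alias for jQuery,
    // so the plugin keeps working when $j / noConflict mode is in use.
    ( function( $ ) {
        $.fn.wikiEditor = function( options ) {
            // "this" is the jQuery selection the caller targeted,
            // e.g. $( '#textbox' ).wikiEditor()
            return this.each( function() {
                // ... set up the editor on each matched element ...
            } );
        };
    } )( jQuery );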

Dependency loading is not really beyond the scope... we are already supporting that. If you check out the mv_jqueryBindings function in mv_embed.js you will see loader calls integrated into the jQuery bindings. That integrates loading of the high-level application interfaces into their interface call.

Your so-called dependency functions (e.g. doLoadDepMode) just seemed
to be a batch load feature, there was no actual dependency handling.
Every caller was required to list the dependencies for the classes it
was loading.

I was referring to defining the dependencies in the module call ... i.e. $j('target').addMediaWiz( config ), and having the addMediaWiz module map out the dependencies in the JavaScript. doLoadDepMode just lets you get around an IE quirk: when inserting scripts via the DOM you have no guarantee that the scripts will execute in the order they were inserted. If you are concatenating your scripts, doLoadDepMode is not needed, since order is preserved in the concatenated file.
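
A sketch of the kind of workaround that implies (the function name and details here are mine, not the actual mv_embed code):

    // Insert scripts one at a time, only adding the next once the previous
    // has executed, so order is guaranteed even where the browser makes no
    // promise about it.
    function loadInOrder( urls, callback ) {
        if ( !urls.length ) {
            return callback();
        }
        var s = document.createElement( 'script' );
        s.src = urls[ 0 ];
        s.onload = s.onreadystatechange = function() {
            if ( !this.readyState || this.readyState == 'loaded' || this.readyState == 'complete' ) {
                this.onload = this.onreadystatechange = null; // avoid double firing in IE
                loadInOrder( urls.slice( 1 ), callback );
            }
        };
        document.getElementsByTagName( 'head' )[ 0 ].appendChild( s );
    }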

I like mapping out the dependencies in JavaScript at that module level since it makes it easier to do custom things like read the passed-in configuration and decide which dependencies you need to fulfill. Otherwise you have to define many dependency sets in PHP, or keep a much more detailed model of your JavaScript inside PHP.
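
To make that concrete, a hypothetical sketch (the dependency names, config key and callback signature are made up for illustration; addMediaWiz and mwLoad are the pieces discussed above):

    ( function( $ ) {
        $.fn.addMediaWiz = function( config ) {
            // The module decides its own dependency set from the passed-in
            // configuration, then asks the loader for exactly that set.
            var deps = [ 'ui.core', 'ui.dialog' ];        // always needed (illustrative)
            if ( config && config.enable_upload ) {
                deps.push( 'mw.UploadHandler' );          // only when uploads were requested
            }
            var target = this;
            mwLoad( deps, function() {
                // all requested classes are now available; build the interface on "target"
            } );
            return target;
        };
    } )( jQuery );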

But I do understand that it will eventually result in lots of extra JavaScript module definitions that a given installation may not want. So perhaps we generate that module definition via PHP configuration ... or we define the set of JavaScript files to include, which in turn define the various module loaders we want for a given configuration.

This is sort of the approach taken with the wikiEditor, which has a few thin JavaScript files that make calls to add modules (like add-sidebar) to a core component (wikiEditor). That way the feature set can be controlled by the PHP configuration while retaining runtime flexibility for dependency mapping.
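
Roughly, one of those thin files looks something like this (I'm writing the call shape and module name from memory, so treat it as illustrative rather than the exact wikiEditor API):

    // PHP configuration decides whether this file is served at all; the file
    // itself just registers one module with the core component.
    ( function( $ ) {
        $( document ).ready( function() {
            $( '#wpTextbox1' ).wikiEditor( 'addModule', {
                'sidebar': { /* ... module configuration ... */ }
            } );
        } );
    } )( jQuery );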

The idea is to move more and more of the structure of the application into that system. So right now mwLoad is a global function, but it should be refactored into the jQuery space and be called via $j.load();

That would work well until jQuery introduced its own script-loader
plugin with the same name and some extension needed to use it.



That is part of the idea of centrally hosting reusable client-side components: we control the jQuery version and plugin set, so a new version won't "come along" until it's been tested and integrated.

If the function does MediaWiki-specific script-loader stuff then yea, it should be called mwLoad or whatnot. If some other plugin or native jQuery piece comes along we can just have our plugin override it and/or store the native one as a parent (if it's of use) ... if that ever happens...
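
Something along these lines (the dispatch rule and the helper name are made up, just to show the "keep the native as a parent" shape):

    ( function( $ ) {
        var nativeLoad = $.load;                  // whatever was defined before us, if anything
        $.load = function( classes, callback ) {
            if ( nativeLoad && !mwIsRegisteredClass( classes ) ) {
                // not one of ours: delegate to the "parent" implementation
                return nativeLoad.apply( this, arguments );
            }
            // ... MediaWiki script-loader handling of "classes" goes here ...
        };
    } )( jQuery );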


We could add that convention directly into the script-loader function if desired, so that on a per-class level we include dependencies. Like mwLoad('ui.dialog') would know to load ui.core etc.

Yes, that is what real dependency handling would do.

Thinking about this more ... I think it's a bad idea to exclusively put the dependency mapping in PHP. It will be difficult to avoid re-including the same things in client-side loading chains. Say you have your suggest-search system: once the user starts typing, we load jquery.suggest, which (via the dependency mapping stored in PHP) is known to need jquery.ui, so the server sends both ui and suggest to the client. Now the user, in the same page instance, decides instead to edit a section. The editTool script-loader call gets made, and its dependencies also include jquery.ui. How will the dependency-resolving script server know that the client already has jquery.ui from the suggest tool?

In the end you need these dependencies mapped out in the JS so that the client can intelligently request the script set it needs. In that same example, if the dependencies were mapped out in JS we could avoid re-including jquery.ui.
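
A hypothetical sketch of what that client-side bookkeeping could look like (names follow the thread where they exist; everything else is illustrative):

    var mwLoadedClasses = {};                     // classes the client already has
    var mwDependencies = {                        // illustrative dependency map
        'jquery.suggest': [ 'jquery.ui' ],
        'editTool':       [ 'jquery.ui' ]
    };
    function mwLoad( classNames, callback ) {
        var missing = [];
        for ( var i = 0; i < classNames.length; i++ ) {
            // declared dependencies first, then the class itself
            var wanted = ( mwDependencies[ classNames[ i ] ] || [] ).concat( classNames[ i ] );
            for ( var j = 0; j < wanted.length; j++ ) {
                if ( !mwLoadedClasses[ wanted[ j ] ] ) {
                    missing.push( wanted[ j ] );
                    mwLoadedClasses[ wanted[ j ] ] = true;   // real code would mark this once the script runs
                }
            }
        }
        // One grouped script-loader request for just "missing":
        // mwLoad( [ 'jquery.suggest' ] ) pulls in jquery.ui + jquery.suggest;
        // a later mwLoad( [ 'editTool' ] ) then requests editTool alone.
        // requestScripts( missing, callback );   // hypothetical server call
    }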

Alternatively we can just put a crap load of JS at the bottom of the page to ensure PHP knew everything that could possibly be used for every possible chain of interface interactions... But the idea is that it will be better for page-display performance not to try and predict all of that ... so it's better to store the dependency mapping in JavaScript. I could give a few more examples if that would be helpful.

* The "class" abstraction as implemented in JS2 has very little value
to PHP callers. It's just as easy to use filenames.
The idea with the "class" abstraction is that you don't know what script set you have available at any given time. Maybe one script already included ui.resizable and ui.move, and now your script depends on ui.resizable, ui.move and ui.drag... your loader call will only include ui.drag (since the others are already defined).
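
The check that makes that possible could be as simple as resolving the class name and seeing whether it is already defined (the name-to-global mapping below is illustrative):

    function mwClassIsDefined( className ) {
        var parts = className.split( '.' );
        var obj = window;
        for ( var i = 0; i < parts.length; i++ ) {
            if ( typeof obj[ parts[ i ] ] === 'undefined' ) {
                return false;
            }
            obj = obj[ parts[ i ] ];
        }
        return true;
    }
    // e.g. mwClassIsDefined( '$j.fn.resizable' ) -> true, so skip it;
    //      mwClassIsDefined( '$j.fn.draggable' ) -> false, so request it.
    // Filenames alone can't tell you this, since the page may have gotten the
    // class from a concatenated or third-party script.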

I think you're missing the point. I'm saying it doesn't provide enough
features. I want to add more, not take away some.
You can remove duplicates by filename.

See the above example for why it will be difficult to remove duplicates by filename if your script includes carry dependency mappings that are not visible to the JS.
[...]
We want to move away from PHP code dependencies for each JavaScript module. JavaScript should just directly hit a single exposure point of the MediaWiki API. If we have PHP code generating bits and pieces of JavaScript everywhere it quickly gets complicated, is difficult to maintain, is much more resource-intensive, and requires a whole new framework to work right.

PHP's integration with the JavaScript should be minimal: PHP should supply configuration and package in the localized msgs.
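
In other words, the only JS the PHP side would emit is something like this (variable names and keys are illustrative, aside from the familiar wg* config values):

    var mwConfig = {
        'wgServer':      'http://localhost',
        'wgScriptPath':  '/w',
        'enable_upload': true
    };
    var mwMessages = {
        'edit':   'Edit',
        'cancel': 'Cancel'
    };
    // Everything else lives in static .js files that read from these objects.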

I don't think it will be too complicated or resource intensive. JS
generation in PHP is very flexible and you admit that there is a role
for it. I don't think there's a problem with adding a few more
features on the PHP side.

If necessary, we can split it back out to a non-MediaWiki standalone
mode by generating some static JS.

The nice thing about the way it's working right now is you can just turn off the script-loader and the system continues to work ... you can build a page that includes the JS directly and it "works".

Having an export mode, scripts doing transformations, and dependency-management output sounds complicated. I can imagine it ~sort of~ working... but it seems much easier to go the other way around.

What is your reason for saying this? Have you worked on some other
framework where integration of PHP and JavaScript has caused problems?

I am referring more to the PHP-JavaScript remoting-type systems that try to capture one language's functionality inside another language. There is inevitably leakage, and the complexity is rarely less than a simpler, clean separation of systems. (Not that you're suggesting we go to that extreme, i.e. defining most JavaScript classes and methods in PHP ... but trying to map dependencies in that space is a step in that direction, and it will get complicated for application interactions that go beyond the initial page display, without adding yet more complexity on the PHP side.)


There's a significant CPU cost to loading and parsing JS files on
every PHP request. I want to remove that behaviour. Instead, we can
list client-side files in PHP. Then from the PHP list, we can generate
static JS files in order to recover the standalone functionality.
As mentioned above, I think it would be easier to make the "export" thing work the other way around. I.e. instead of running a script to "export" static JavaScript, we write our JavaScript in a way that works stand-alone to begin with, and we "export" the information we want into the PHP.
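
Another view of the same hypothetical loader, this time focusing on the stand-alone path (mwScriptLoaderURL and mwClassPaths are illustrative names, not existing variables):

    var mwClassPaths = { 'jquery.suggest': 'js/jquery.suggest.js' };   // class -> file map
    function mwLoad( classes, callback ) {
        if ( typeof mwScriptLoaderURL !== 'undefined' ) {
            // MediaWiki mode: one grouped, gzipped script-loader request built from "classes"
        } else {
            // stand-alone mode: just append the individual script files
            for ( var i = 0; i < classes.length; i++ ) {
                var s = document.createElement( 'script' );
                s.src = mwClassPaths[ classes[ i ] ];
                document.getElementsByTagName( 'head' )[ 0 ].appendChild( s );
            }
        }
        // (ordering and callback handling omitted; see the doLoadDepMode discussion above)
    }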

I agree that the present system of parsing the top of the JavaScript file on every script-loader generation request is unoptimized. (The idea is that those script-loader generation calls happen rarely, but even so it should be cached at any number of levels: checking the file-modification timestamp, writing out a PHP or serialized file, or storing it in any of the other cache levels we have available: memcached, database, etc.)

[snip]

Have you looked at the profiling? On the Wikimedia app servers, even
the simplest MW request takes 23ms, and gen=js takes 46ms. A static
file like wikibits.js takes around 0.5ms. And that's with APC. You say
MW on small sites is OK, I think it's slow and resource-intensive.

That's not to say I'm sold on the idea of a static file cache, it
brings its own problems, which I listed.

Yea... but almost all script-loader requests will be cached. It does not need to check the DB or anything, it's just a keyed file lookup (since script-loader requests pass a request key, either it's there in the cache or it's not) ... it should be on par with the simplest MW request. Which is substantially shorter than the round-trip time of getting each script individually, not to mention gzipping, which can't otherwise be easily enabled for third-party installations.

[...]
The performance impact of refreshing a common file once every hour or
two is not large. Your code sets the expiry time to a year, and
changes the urid parameter regularly, which sounds great until you
accidentally cache some buggy JS into squid and you have no way to
reconstruct the URID parameters and thus purge the object. Then you'd
be stuck with the choice of either waiting a month for all the
referring HTML to expire, or clearing the entire squid cache.
...right... we would want to avoid lots of live hacks. But I think we want to avoid lots of live hacks anyway. A serious JavaScript bug would only affect the pages that were generated in the hours that the bug was present, not the 30 days that you're characterizing as the lag time of page generation.

Do you have stats on that? ... It's surprising to me that pages are regenerated that rarely... How do central notice campaigns work?

[...]
Security takes precedence over performance. There are better ways to
improve performance than to open up your code to systematic exploit by
malicious parties.

I see.... I guess I rarely run into displaying things that are not either A) your own input or B) run through the MediaWiki api ... But your general point is valid. In theory this could come up (even via api calls).

But... I think it would be just as easy, if not easier, to check for "escape( val )" as for ".text( val )", which would sit at the end of a long chain of jQuery calls. Or you could set variable values after DOM insertion via .val() or .text(), also avoiding non-native DOM construction.
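
The two styles in question, with "val" as untrusted text (escapeHTML below stands in for whatever escaping helper is used; it is not a specific existing function):

    // (a) tabbed HTML string with an explicit escape call the reviewer must spot:
    var item = '<div class="suggest-item">' +
                   escapeHTML( val ) +
               '</div>';
    $j( '#results' ).append( item );

    // (b) chained jQuery DOM construction; .text() escapes implicitly, and the
    //     value can also be set after the element is inserted:
    $j( '<div />' )
        .addClass( 'suggest-item' )
        .appendTo( '#results' )
        .text( val );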

I guess it really comes down to readability. I find tabbed HTML a bit more readable than a long chain of jQuery elements. But it may be that people find the latter more readable... If so I can start heading in that direction. Performance-wise I attached a quick test.. it seems pretty fast on my machine with a recent Firefox build .. but older browsers / machines might be slower... at any rate we should aim for speed, readability and easy "security review" ;)

--michael
