> -----Original Message-----
> From: Claude Schneegans [mailto:[EMAIL PROTECTED]
> Sent: Saturday, October 07, 2006 2:52 PM
> To: CF-Talk
> Subject: Re: Ajax and CF...*sigh*...again...
>
> >> I don't agree that XML is "too much of a PITA"
>
> Maybe not, but it is not a panacea either.
> Sometimes, other solutions are many times simpler.
Of course - but nothing is without issues.

> Example: instead of using <cfwddx to transmit data to Javascript, which
> may generate a 500k file, create the Javascript code to define the
> structure, which may be less than 5k.

That ratio seems a little off - it really depends on what dialect of XML you're using. And the size of the transmitted file really doesn't have anything to do with the complexity of the script. How do a large XML file, a small JSON file and a medium-sized block of JS code differ in developer complexity when they all result in the exact same output - a JS object? (Remember also that in CF this is exactly what "WDDX2JS" does in the WDDX tag: it generates client-side script.)

But in the end, generating JS is essentially what JSON does (JSON being essentially JavaScript literal notation). JSON attempts to eliminate some of the problems with that approach and to a great extent it succeeds. But generation of client-side code has its own problems as well.

Structural validation of transmitted JS is even harder than validation of plain JSON - while validation of XML is well-defined. Data validation of JS (data typing) is either non-existent or tends to increase complexity and file size. Like most JSON parsers, most people sending JS to the browser tend to muddy dates, strings and numbers.

Transmitted JS, unlike an abstracted transfer mechanism like XML or JSON, is designed to be "run" - the consumer environment matters. An abstracted service has the benefits of loose coupling. There are complexities to loose coupling, and most of its benefits are delayed, but they can be important and tend to simplify extended development - often offsetting the additional complexity of the abstraction layer.

Passing code blocks firstly means that a method has to be in place which can "run" the code. This adds some complexity to interactive applications (usually a frame or iframe is used to run the code to eliminate main-page refreshing).
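To illustrate the point about the end result being the same, here's a minimal sketch (the data and variable names are hypothetical, and the XML-parsing step is omitted for brevity) showing that all three transport styles leave the client holding an identical plain JS object:

```javascript
// 1. Server-generated client-side code (in spirit, what WDDX2JS emits):
//    the server writes executable JS that builds the object directly.
var fromCode = {};
fromCode.name = "widget";
fromCode.price = 9.95;

// 2. JSON: the server sends a string of JS literal notation and the
//    client parses it (with a parser library in 2006; JSON.parse today).
var fromJson = JSON.parse('{"name": "widget", "price": 9.95}');

// 3. XML: the server sends markup and the client walks the DOM to build
//    the same structure (DOM-walking omitted here; this is the result).
var fromXml = { name: "widget", price: 9.95 };

// Once built, the three objects are interchangeable:
console.log(fromCode.name === fromJson.name); // true
console.log(fromJson.price === fromXml.price); // true
```

The difference between the approaches isn't what the client ends up with - it's how much machinery (and trust) is required to get there.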
Secondly, you're running that code "blind" - whatever code is in there will run. This was a major concern in early JSON development, where the only "deserialization" was the eval() method. JSON addressed this by incorporating a stricter parser, but the problem remains for direct code generation.

Lastly, there are consistency issues. If you're going to standardize your JS deployment - make it generic - then you're just building another abstraction layer. In other words you essentially end up with an object on the server and an object on the client, and each side needs to know how to generate/consume them. If you don't make the transfer generic, then you're writing a lot of custom script (or probably a lot of semi-generic script for problem groups) that doesn't really gain you a whole heck of a lot.

In the end the main issue is transfer size. XML can be big - depending on the selected dialect it can be huge - but some dialects are briefer. When dealing with very large packets of regular data (recordsets, for example) YODEL adds only marginally to JSON and can even beat client-side code generation. But like I said, the end result on the client is the same: a JavaScript object ready to use.

Jim Davis

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~|
Introducing the Fusion Authority Quarterly Update. 80 pages of hard-hitting, up-to-date ColdFusion information by your peers, delivered to your door four times a year.
http://www.fusionauthority.com/quarterly

Archive: http://www.houseoffusion.com/groups/CF-Talk/message.cfm/messageid:255907
Subscription: http://www.houseoffusion.com/groups/CF-Talk/subscribe.cfm
Unsubscribe: http://www.houseoffusion.com/cf_lists/unsubscribe.cfm?user=89.70.4

