As Cameron points out, XML is slower, and I suspect many web 2.0 type projects are now using JSON exclusively.
The problem however is that there's no formalization of how to shape JSON. We're still stuck at the "syntax" instead of the "semantics". So in the switch to JSON we've lost inter-application and inter-vendor dialogue, except where vendors go out of their way to also offer traditional RDF/XML or GeoRSS or the like.

JDIL would permit users to mix and match content from different servers without application developers having to resort to RDF/XML, which makes it a good thing. It is also (like RDF) a "full" notation, able to express arbitrarily complex graphs, not just directed acyclic trees. You could for example define a geometry namespace, define a 50000-polygon house, and then instance it 50 times to describe, say, a city sub-division (I sketch what I mean at the bottom of this mail). Neither GML nor GeoRSS supports this as far as I can figure. And it's also JSON... which is beautiful and elegant compared to, say, XML. To support JDIL, an application developer's job would be to formalize whatever ad-hoc JSON format they are publishing, and possibly to offer proxy gateways to get around cross-domain RPC security issues.

What's especially interesting is that there's an incentive to use JDIL even if nobody else is. In one of my projects I publish JSON - for example: http://hook.org/note/json_query?path=/venice/chat - and there is no particular way that my object data is arranged except as an implicit agreement between my server-side Ruby on Rails database and my client-side Javascript application. In fact the only way that I myself, as the developer, can know or share what is being published is to send somebody a copy of the source code for the application; I have no schema definition at the transport layer. If I switch to JDIL instead of plain JSON, that self-imposed formalism makes my data structures more explicit to myself. It becomes a secondary benefit that other people can subscribe to the streams of content that I publish.

There was a bit of discussion about Chris's idea on #swig, and the following URL came up, which has some interesting thoughts about encoding RDF in JSON: http://www.thefigtrees.net/lee/blog/2007/01/using_rdf_on_the_web_a_vision.html

Some lessons from being a game developer and having to transport data between servers and clients (albeit more for 3d video games than for webby 2.0 social apps):

People defining new grammars for vector geometry may want to keep VRML and X3D in mind; these are grammars that express not just static content but also multiple instancing, separation of skinning from geometry, switch nodes, behavior nodes and constraints.

As well - and this is why JSON appeals - one definitely wants a grammar that is terse, highly expressive and that can be "unpacked" with a minimum of labour on the client side. Typically the client side is less powerful than the server side; it is easiest to just "play back" the content, not have to parse it. Javascript in particular is mind-numbingly slow at doing array inserts and the like; it isn't even just the XML parsing - you really don't want to do any transformation on the data once you get it, if at all possible.

Finally, rich client apps are usually going to have some kind of "read through" database, even for just a single session's activity. Javascript clients often have a caching read-through query layer, where queries only make it to the server if the client does not have the data cached already (also sketched below).
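To come back to the instancing point for a moment and make it concrete: I am not quoting the actual JDIL syntax here - the keys below ("namespaces", "id", "instanceOf") and the URLs are just my own placeholders - but the shape I have in mind is to give the heavy geometry a global id once and then refer to it cheaply everywhere else:

  // Hypothetical JDIL-flavoured JSON (key names are mine, not the spec).
  // The expensive mesh is declared once under a geometry namespace...
  var payload = {
    "namespaces": { "geo": "http://example.org/ns/geometry#" },
    "geo:mesh": {
      "id": "http://example.org/models#house-37",
      "geo:polygons": [ /* ...the 50000 polygons, listed once... */ ]
    },
    // ...and then instanced by reference, so a whole sub-division
    // costs one mesh definition plus a list of placements.
    "subdivision": [
      { "instanceOf": "http://example.org/models#house-37", "geo:position": [0, 0] },
      { "instanceOf": "http://example.org/models#house-37", "geo:position": [40, 0] },
      { "instanceOf": "http://example.org/models#house-37", "geo:position": [80, 0] }
      // ...and 47 more of these...
    ]
  };

The client unpacks 50 houses from one mesh definition instead of receiving the geometry 50 times.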
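And here, again only as a sketch, is the kind of caching read-through layer I mean on the Javascript side. The names are all mine, and loadFromServer stands in for whatever transport the app already uses (XMLHttpRequest, a proxy gateway, a script tag):

  // A minimal read-through cache: answer from the local pool when we
  // can, only go to the server on a miss, and merge whatever comes
  // back into the pool rather than replacing it.
  function ReadThroughCache(loadFromServer) {
    this.pool = {};             // every record the client has seen, keyed by id
    this.load = loadFromServer; // function (id, callback) -> callback({id: record, ...})
  }

  ReadThroughCache.prototype.query = function (id, onReady) {
    var cached = this.pool[id];
    if (cached) { onReady(cached); return; }  // cache hit: no server round trip
    var self = this;
    this.load(id, function (records) {        // cache miss: fetch, merge, answer
      self.merge(records);
      onReady(self.pool[id]);
    });
  };

  // New records and new fields are folded into what is already there.
  ReadThroughCache.prototype.merge = function (records) {
    for (var id in records) {
      var existing = this.pool[id] || {};
      var incoming = records[id];
      for (var field in incoming) existing[field] = incoming[field];
      this.pool[id] = existing;
    }
  };

  // Usage would look something like (the loader is whatever you already have):
  //   var cache = new ReadThroughCache(myJsonLoader);
  //   cache.query("/venice/chat#note-12", showNote);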
So the data transport model should allow one to easily add new database columns or objects to an already existing pool of data on the client (not just replace it).

- a
