Bandwidth of the server and the client, and memory of the browser. And
the time...
Well, XML files compress pretty well: my ~300 kilobyte file goes down to
~100 kilobytes when gzipped (which usually happens automatically if both
the browser and the web server support it). The document can even be
cached by the browser for a prolonged time, I believe, so there is no
need to download it again and again.
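To see for yourself why XML gzips so well, you can compress some repetitive markup; this little Python sketch (the sample content is made up, and real DocBook files won't shrink quite this dramatically, but the principle is the same) shows the idea:

```python
import gzip

# Hypothetical stand-in for a DocBook file: markup is highly
# repetitive, which is why a ~300 kB XML file can gzip to ~100 kB.
para = "<para>Some repetitive DocBook-style content.</para>\n"
xml = "<chapter><title>Example</title>\n" + para * 2000 + "</chapter>\n"

raw = xml.encode("utf-8")
packed = gzip.compress(raw)

print(f"raw: {len(raw)} bytes, gzipped: {len(packed)} bytes "
      f"({len(packed) / len(raw):.0%} of original)")
```

On the wire the browser never sees this step: it sends `Accept-Encoding: gzip` and the server (if configured for it) compresses the response transparently.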
Browsers love to eat big chunks of memory anyway, don't they? So an
extra couple of megabytes shouldn't be a problem. I just did a quick
test: my Firefox uses 33 megabytes when started, 35 when I load a simple
HTML file, 40 when I load
http://docbook-publishing.appspot.com/DocBookPublishing.html , and
around 47 when the table of contents is loaded (the XML file is loaded
into the XSLT processor at this point).
As for the time, I don't know. I still have a long way to go before
those stylesheets are ready to check; we'll see, hopefully.
After all, I think there are probably some use cases where this is
appropriate and plenty of others where it's not. And since it's just an
idea, we can't tell for sure yet. I find it useful to collect arguments
both against and in favour.
Yours,
Arpad
On 2010.03.12. 20:24, [email protected] wrote:
That is a very interesting idea. I am wondering if there is some sort of size
limit beyond which you would not want to do this, though. If you have a very
large document, is the time required to download and transform the entire thing
up front so long that it counterbalances the benefits of being able to browse
topics quickly? Consider the user that only needs to look at one or two of the
topics. For them the traditional method might be quicker.
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]