Re: [whatwg] Client-side includes proposal

2008-08-21 Thread Jonas Sicking

I have to say that I don't really agree with Hixie here either.

I think there is much value in letting HTML be a viable format for 
document distribution outside the web. I definitely don't think of it as 
a non-goal. Things like distributable cross-platform DVDs of wikipedia 
containing just a stack of HTML pages would be an awesome way of 
delivering part of the web to people that are offline (be that on an 
airplane, the Alaskan wilderness or stuck in a warzone).


That said, there is always a cost/benefit analysis to any new feature. 
And I think the benefit of a feature specifically targeted at non-web 
HTML pages is smaller, which means that we should accept only smaller costs.


In the case of client-side includes I'm unconvinced the benefit is worth the 
cost. Additionally, there already is a standard for this called XInclude, 
so I'm not really sure what the debate is.


If UAs want to support client-side includes they can implement XInclude. 
If they don't, why would we add it to HTML5?


/ Jonas


Re: [whatwg] Client-side includes proposal

2008-08-20 Thread timeless
On Wed, Aug 20, 2008 at 12:06 AM, Ian Hickson [EMAIL PROTECTED] wrote:
 When you control the software used to read the data, it doesn't matter
 what the data format is.

i kinda object to this. By this argument, video isn't necessary
because youtube controlled the software (flash) used to read their
videos.

the argument you've provided supports vendors shipping binaries on
their readonly media and forcing users to rely on those binaries in
order to access content.

alternatively, if what you really want is for such content providers
to die, you could say so.

but i'd rather they be able (and encouraged) to ship non-executable
content which can be safely read by an open user agent which conforms
to some limited set (it shouldn't have to implement every single
Office Document format which was marked as Open) of modern standards
[something like HTML5]

I've used a number of CDs which contained encyclopedias, dictionaries,
medical and legal references.
If those were shipped as html content w/ clever json indexes, then i
could add my own application later to read it. If it's some binary,
then I'm forced to trust the binary and have no access to the data.
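(As a rough illustration of the kind of index meant here -- the file name
and fields below are purely hypothetical:)

articles.json:
{
  "entries": [
    { "title": "Aardvark", "file": "a/aardvark.html", "keywords": ["animal", "mammal"] },
    { "title": "Abacus",   "file": "a/abacus.html",   "keywords": ["tool", "arithmetic"] }
  ]
}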

That said, I don't know that there's anything in this specific request
that should actually be needed beyond the inline iframe feature.


Re: [whatwg] Client-side includes proposal

2008-08-20 Thread Kristof Zelechovski
Publishers who publish commercial content on physical media are rarely
interested in having their content repackaged by someone else as they see
fit.  This is not very pleasant of course; however, you could possibly try
to solve your problem by asking the vendor for a license to repackage their
content.
Chris

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of timeless
Sent: Wednesday, August 20, 2008 12:16 PM
To: WHAT working group
Subject: Re: [whatwg] Client-side includes proposal

I've used a number of CDs which contained encyclopedias, dictionaries,
medical and legal references.
If those were shipped as html content w/ clever json indexes, then i
could add my own application later to read it. If it's some binary,
then I'm forced to trust the binary and have no access to the data.

That said, I don't know that there's anything in this specific request
that should actually be needed beyond the inline iframe feature.



Re: [whatwg] Client-side includes proposal

2008-08-20 Thread Kristof Zelechovski
I admit XSLT is heavy and it causes a significant rendering slowdown in the
browser.  This is not a problem here, though, because the XSLT processor runs
on the publisher's machine once each time new content gets published -
authoring that content would probably take much more time than publishing it
anyway.
I do not agree the problem is simple.  A document transformation is not
simple, and inclusion is only a simple special case of it; however, I am
afraid that providing a specially crafted solution for every simple case you
might need is feature creep.  Once you have simple document inclusion, you
would probably find that you would appreciate something more sophisticated.
BTW, it is also possible to use entities for document fragments in XHTML.
Chris
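(For illustration, a minimal sketch of the entity approach -- the file
names are examples only, and it relies on an XML parser that resolves
external entities:)

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html [
  <!ENTITY header SYSTEM "header.xhtml">
]>
<html xmlns="http://www.w3.org/1999/xhtml">
  <head><title>Entity include example</title></head>
  <body>
    <!-- &header; is replaced by the contents of header.xhtml at parse time -->
    &header;
  </body>
</html>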

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Zac Spitzer
Sent: Tuesday, August 19, 2008 3:38 PM
To: [EMAIL PROTECTED]
Cc: WHAT working group; Shannon; Ian Hickson
Subject: Re: [whatwg] Client-side includes proposal

XML or XSLT is too heavy; simple problem, simple solutions




Re: [whatwg] Client-side includes proposal

2008-08-20 Thread Elliotte Harold

Ian Hickson wrote:

On Tue, 19 Aug 2008, Elliotte Harold wrote:
In the case of non-Web content, the use of HTML is an academic point, 
since any format would work as well.

Really? Why? and how? That's certainly not self-evident.


When you control the software used to read the data, it doesn't matter 
what the data format is.




Who says you control the software used to read the data? I often want to 
send documents around via e-mail, network mounts, CDs, and other 
non-network means. I usually don't care and don't want to care what 
platform or software is used to read those documents. I certainly don't 
want to have to supply such software to people I'm communicating with. 
HTML works very nicely for me here. Software independence is a very good 
idea, and hardly unique to the Web.


Admittedly, there have been a lot of software dependent document formats 
over the last 20-30 years. That was a mistake that hopefully we are now 
recovering from.


--
Elliotte Rusty Harold  [EMAIL PROTECTED]
Refactoring HTML Just Published!
http://www.amazon.com/exec/obidos/ISBN=0321503635/ref=nosim/cafeaulaitA


Re: [whatwg] Client-side includes proposal

2008-08-19 Thread Martin Ellis
I still get the feeling that this is an element that is being invented for the 
purpose of being invented and doesn’t solve any existing problem, or resolve 
any difficult to implement problems that are not already solved in layers below 
and above HTML.

I worry slightly from this and similar proposals that certain parts of the spec 
will end up being deprecated in the next spec due to lack of use.

With regards to instant AJAX for dummies, I struggle to accept that no-one is 
able to download an already existing javascript library that provides this 
functionality in a few simple lines.
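(A rough sketch of the script-based approach -- the element id and file
name are illustrative:)

<div id="header"></div>
<script type="text/javascript">
  // Fetch a fragment over HTTP and inject it into the placeholder element.
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "header.html", true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById("header").innerHTML = xhr.responseText;
    }
  };
  xhr.send(null);
</script>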

Maybe I've missed something but is there a mandate which sets out the audience 
HTML5 is trying to satisfy?

Do we really need to reinvent the wheel? 

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Jonas Sicking
Sent: 19 August 2008 03:36
To: Greg Houston
Cc: whatwg@lists.whatwg.org; Bill Wallace; João Eiras
Subject: Re: [whatwg] Client-side includes proposal

Greg Houston wrote:
 On Mon, Aug 18, 2008 at 4:01 PM, João Eiras [EMAIL PROTECTED] wrote:
 <include src="static-header" />
 <include src="user-specific-data" />
 <include src="dynamic-daily-content" />
 This is something that would probably not be represented with a new element,
 because elements are  preserved in the DOM tree so they can be accessed and
 queried.
 So then you'd have a question: keep <include> in the DOM and allow access to
 children like an iframe, or replace <include> entirely?
 The answer could be intuitive, but that can open a can of worms. Then you
 can do all sorts of dynamic manipulation, which would have to be very well
 specified.
 I think the way to go would be a processing instruction.

 The idea is good though! But I think it could be better implemented with a
 CSS-template-like feature. I don't like the idea of a new element.

 Bye.
 
 This seems to be mostly useful for people creating small websites that
 are afraid of server side scripting languages like PHP, Python and
 Ruby. That being the case, if something like this is implemented the
 included content should definitely not be accessed like with an
 iframe. The elements included should be in the DOM tree of the parent
 just as if the includes were done server side. Accessing the DOM of an
 iframe from the parent and vice versa causes people a lot of
 confusion. I don't think we need to add that level of confusion to the
 group of users that would most likely use this feature.
 
 Also, a bonus of keeping <include src="some-content.html" /> in the
 DOM is that changing the source could reload the content of that
 element. You would have instant AJAX/XHR for dummies.

This is basically what I suggested in a thread some months back: 
basically an iframe, but one that renders as if the inner document were 
included inline.

This is far from easy to implement since you have to do layout across 
several documents. But I think it would be a pretty useful feature to 
simplify AJAXy pages.

If I ever get time I'm going to attempt an implementation in firefox 
which will hopefully provide experience to build a spec on.

/ Jonas




Re: [whatwg] Client-side includes proposal

2008-08-19 Thread Kristof Zelechovski
I have not bumped into any XSLT-related browser problems except for
converting result tree fragments to nodes, which is unportable but only
needed in advanced applications where you need recursion on document node
construction.  Anyway, my plan is to make the transformation a part of
the publishing process, not a part of the rendering process; that is, you
use your reliable XSLT driver to do the transformation and publish the
result to the server.  Clean, automatic and simple.
Chris

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Shannon
Sent: Monday, August 18, 2008 7:57 PM
To: Kristof Zelechovski
Cc: 'WHAT working group'
Subject: Re: [whatwg] Client-side includes proposal

Kristof Zelechovski wrote:
 Client-side includes make a document transformation, something which
 logically belongs to XSLT rather than HTML.  And what would the workaround
 for legacy browsers be?
 Chris
   

You make good points but last I checked general browser and editor 
support for XSLT was abysmal. Everyone is saying it's on their roadmaps 
though so maybe it will one day be reliable enough to use.





Re: [whatwg] Client-side includes proposal

2008-08-19 Thread Shannon
Ian, I have some final clarifications to make, then if you won't change 
your mind I'll leave this one alone or pursue it elsewhere. You've 
previously made it abundantly clear that nothing happens around here 
without your approval (the formal definition of editor, I suppose).


Ian Hickson wrote:


* Web applications and HTML documentation on the local filesystem.
* Autorun frontends on CDROM that are typically employed by magazine coverdiscs.
  


These aren't part of the use cases I am considering.
  


I think as editor of the HTML specification you need to consider all 
common uses of HTML - especially when you've also taken the stance that 
there are no unsolved use cases for this proposal. Aren't you also 
pushing for offline applications with your local storage proposal and 
whatnot? I could claim this is solved by server-side databases. You 
can't have it both ways. At any rate offline applications are certainly 
valid uses since they represent the most common environment where SSI is 
not even an option.


WYSIWYG editors are quite capable of implementing a model whereby pages 
are merged before being previewed, and are merged when published, removing 
the need for any post-publication inclusion mechanism.
  


Of course they can, and often they do. As I said they do it in 
proprietary, non-transparent and non-transferable ways. CSI would be 
open, standardised and independent of the authoring environment and this 
is a good thing.


I assume that by pre-generated you're referring to 
Dreamweaver/WebObjects templates rather than copy-and-paste.
I find these irritating as quite often you are given one format but 
require another (as a FOSS user I can't/won't use Dreamweaver).



I usually just roll my own using Perl, it's like one or two lines 
depending on what you want.
  


Of course you do, but you are far from the typical web developer. 
They'll use Dreamweaver templates unless a good reason exists not to. 
There is also the issue that the merge is a one-way process. If a web 
designer absconds with the original site development files (a quite 
common occurrence after disputes) then all you're left with on the 
server are the merged templates which have to be unmerged again. This 
may sound like a contrived example but I've been in this situation more 
than once.


Merging doesn't solve the issue of content being merged with live data 
from other sources (where keeping a page fresh requires some form of 
cron job or CGI that may not be possible for the given service). Yes 
iframes solve this too but then you're stuck with their limitations as well.


Merging generally has the side-effect of removing the reference to the 
include from the resulting file. This reference may have several 
benefits such as allowing user agents to block includes from certain 
domains and search engines to determine the authority or purpose of a 
page segment. There might be other advantages such as allowing users to 
swap out page segments for content of their choosing or other things 
nobody has thought of yet. Merging is basically a lossy conversion in so 
many ways.


Most content doesn't require blocking. Those that require blocking (like 
scripts) are a massive problem. Just ask any browser vendor. Safari even 
ends up running a second tokeniser in parallel just to try to help this.
  
External (blocking) JS is a disaster. We don't want to be adding more 
features like it. We're still trying to get out of the problems blocking 
scripts added.


This is just a loaded way of saying the implementation problems with 
out-of-order loading have already been solved by current vendors as best 
they can. The blocking nature of this proposal seems largely dependent 
on the method and quantity of scripts in the page and the amount of 
content already in local cache. As another respondent posted this is 
essentially an alternative to AJAX which has all the problems you 
mentioned and more. You of all people should realise that blocking and 
timing issues on external JS, cross-frame scripting, flash with script 
interaction and AJAX methods aren't going away so the larger problem and 
implementation requirements won't be solved or blocked by this proposal 
in the slightest.


Even when content doesn't block the results are far from ideal. Issues 
such as 'flash of unstyled content', excessive repainting and content 
jumping around all over the place are all artifacts of the real issue - 
which is (X)HTML is an imperfect streaming format. As a designer of 
primarily dynamic sites I rarely have the luxury of setting explicit box 
sizes so the blocking/reshuffling issue is something I've learnt to 
live with. I generally run scripts onload or at least using 'defer' so 
blocking is mostly a non-issue. As I said earlier if the browser decides 
not to paint my banner till the end then I really don't care; and if the 
content rendering is somehow blocked by the wait for banner code then 
I'd consider reordering my includes or 

Re: [whatwg] Client-side includes proposal

2008-08-19 Thread Ian Hickson
On Tue, 19 Aug 2008, Shannon wrote:

 Ian Hickson wrote:
 
   * Web applications and HTML documentation on the local filesystem.
   * Autorun frontends on CDROM that are typically employed by magazine
 coverdiscs.
 
  These aren't part of the use cases I am considering.
 
 I think as editor of the HTML specification you need to consider all 
 common uses of HTML - especially when you've also taken the stance that 
 there are no unsolved use cases for this proposal. Aren't you also 
 pushing for offline applications with your local storage proposal and 
 whatnot? I could claim this is solved by server-side databases. You 
 can't have it both ways. At any rate offline applications are certainly 
 valid uses since they represent the most common environment where SSI is 
 not even an option.

Server-based offline Web apps are applications that are served by a remote 
server and then cached locally; this is very different from non-Web cases 
such as documentation on a local filesystem or on CD-ROMs. In the case of 
non-Web content, the use of HTML is an academic point, since any format 
would work as well. The W in WHATWG stands for Web, and that is what we 
have to focus on here.

I agree entirely that there are people using Web technologies for non-Web 
problems like content on CD-ROMs, on Intranets, and so forth, but those 
are not part of the Web and are not in our scope.


  WYSIWYG editors are quite capable of implementing a model whereby 
  pages are merged before being previewed, and are merged when 
  published, removing the need for any post-publication inclusion 
  mechanism.
 
 Of course they can, and often they do. As I said they do it in 
 proprietary, non-transparent and non-transferable ways. CSI would be 
 open, standardised and independent of the authoring environment and this 
 is a good thing.

I agree that standardised mechanisms for author-side development can be 
useful; however, that is also outside of our scope.

I encourage you to look at XInclude, which appears to be exactly what you 
are looking for -- a standard inclusion mechanism that could be used with 
multiple independent implementations on the author side:

   http://www.w3.org/TR/xinclude/
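(For reference, a minimal XInclude sketch in XHTML -- the file names are
illustrative, and it requires an XInclude-aware processor rather than a
plain browser:)

<html xmlns="http://www.w3.org/1999/xhtml"
      xmlns:xi="http://www.w3.org/2001/XInclude">
  <head><title>XInclude example</title></head>
  <body>
    <xi:include href="header.xhtml"/>
    <xi:include href="content.xhtml">
      <xi:fallback><p>Content unavailable.</p></xi:fallback>
    </xi:include>
  </body>
</html>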


   I assume that by pre-generated you're referring to 
   Dreamweaver/WebObjects templates rather than copy-and-paste. I find 
   these irritating as quite often you are given one format but require 
   another (as a FOSS user I can't/won't use Dreamweaver).
  
  I usually just roll my own using Perl, it's like one or two lines 
  depending on what you want.
 
 Of course you do, but you are far from the typical web developer. 
 They'll use Dreamweaver templates unless a good reason exists not to.

Ok, well, then the use case is already handled.


 There is also the issue that the merge is a one-way process. If a web 
 designer absconds with the original site development files (a quite 
 common occurrence after disputes) then all you're left with on the 
 server are the merged templates which have to be unmerged again. This 
 may sound like a contrived example but I've been in this situation more 
 than once.

That's not a technical problem, it's a social problem, and engineering 
technical solutions for social problems is rarely a useful exercise.


 Merging doesn't solve the issue of content being merged with live data 
 from other sources (where keeping a page fresh requires some form of 
 cron job or CGI that may not be possible for the given service). Yes 
 iframes solve this too but then you're stuck with their limitations as 
 well.

It's true that not all services provide CGIs and cronjobs, but I don't 
think it is unreasonable to presume that people who are providing dynamic 
data of this nature can avail themselves of a hosting provider that 
supports at least SSI.


 Merging generally has the side-effect of removing the reference to the 
 include from the resulting file. This reference may have several 
 benefits such as allowing user agents to block includes from certain 
 domains and search engines to determine the authority or purpose of a 
 page segment.

It's not clear what problem these solutions are attempting to solve. I 
assure you that at least in the case of search engines, determining the 
authority or purpose of a page segment is not something that client-side 
includes would especially help us with.


 There might be other advantages such as allowing users to swap out page 
 segments for content of their choosing or other things nobody has 
 thought of yet.

CSS provides this even without CSIs.
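(For example, in a user style sheet -- the selectors are illustrative:)

/* Hide a site's banner segment entirely. */
#banner, .ad-frame { display: none !important; }

/* Or restyle a segment without touching the markup. */
#sidebar { width: 12em; background: #fff; }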


  Most content doesn't require blocking. Those that require blocking 
  (like scripts) are a massive problem. Just ask any browser vendor. 
  Safari even ends up running a second tokeniser in parallel just to try 
  to help this. External (blocking) JS is a disaster. We don't want to 
  be adding more features like it. We're still trying to get out of the 
  problems blocking scripts added.
 
 This is just a loaded 

Re: [whatwg] Client-side includes proposal

2008-08-19 Thread Elliotte Harold

XInclude?

--
Elliotte Rusty Harold  [EMAIL PROTECTED]
Refactoring HTML Just Published!
http://www.amazon.com/exec/obidos/ISBN=0321503635/ref=nosim/cafeaulaitA


Re: [whatwg] Client-side includes proposal

2008-08-19 Thread Elliotte Harold

Ian Hickson wrote:

Server-based offline Web apps are applications that are served by a remote 
server and then cached locally; this is very different from non-Web cases 
such as documentation on a local filesystem or on CD-ROMs. In the case of 
non-Web content, the use of HTML is an academic point, since any format 
would work as well. 


Really? Why? and how? That's certainly not self-evident.

Aside from embedded links, which can point into the file system and are 
usually relative anyway, there's very little web-specific about HTML. 
It's just one format that can be served over HTTP or read from a disk, 
just like PDF or text/plain or OpenDocument.


HTML has some nice characteristics like resolution independence, direct 
editability as text, and automatic reflow; but these are in no way 
limited to network transfers. For many use cases, especially 
cross-platform ones, HTML is the formatted text format of choice.


A properly designed HTML spec should not require, prohibit, or 
preference  a document being read from the network or from a local file 
system or via any other protocol. HTML 1 through 4 and XHTML 1 and 2 had 
this important characteristic. I hope HTML 5 does as well.


--
Elliotte Rusty Harold  [EMAIL PROTECTED]
Refactoring HTML Just Published!
http://www.amazon.com/exec/obidos/ISBN=0321503635/ref=nosim/cafeaulaitA


Re: [whatwg] Client-side includes proposal

2008-08-19 Thread Zac Spitzer
It sounds like a request for <object src="document.html"></object>

which would end up like an inline IFRAME
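(For what it's worth, the existing element spells this with data= rather
than src= -- a minimal sketch:)

<object data="document.html" type="text/html" width="600" height="200">
  Fallback text if the document cannot be embedded.
</object>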

This is such a common use case, and just because you can do it with
SSI means zilch; it presents challenges which, sure, you can work
around, but everyone will do it in a different way, whether via
javascript or a server-side approach.

When it's so common, it's a good case for being standardised into
something simple.

XML or XSLT is too heavy; simple problem, simple solutions

this also has potential for significant page speed improvements and
for reducing overall page sizes

z


On Tue, Aug 19, 2008 at 10:55 PM, Elliotte Harold
[EMAIL PROTECTED] wrote:
 Ian Hickson wrote:

 Server-based offline Web apps are applications that are served by a remote
 server and then cached locally; this is very different from non-Web cases
 such as documentation on a local filesystem or on CD-ROMs. In the case of
 non-Web content, the use of HTML is an academic point, since any format
 would work as well.

 Really? Why? and how? That's certainly not self-evident.

 Aside from embedded links, which can point into the file system and are
 usually relative anyway, there's very little web-specific about HTML. It's
 just one format that can be served over HTTP or read from a disk, just like
 PDF or text/plain or OpenDocument.

 HTML has some nice characteristics like resolution independence, direct
 editability as text, and automatic reflow; but these are in no way limited
 to network transfers. For many use cases, especially cross-platform ones,
 HTML is the formatted text format of choice.

 A properly designed HTML spec should not require, prohibit, or preference  a
 document being read from the network or from a local file system or via any
 other protocol. HTML 1 through 4 and XHTML 1 and 2 had this important
 characteristic. I hope HTML 5 does as well.

 --
 Elliotte Rusty Harold  [EMAIL PROTECTED]
 Refactoring HTML Just Published!
 http://www.amazon.com/exec/obidos/ISBN=0321503635/ref=nosim/cafeaulaitA




-- 
Zac Spitzer -
http://zacster.blogspot.com (My Blog)
+61 405 847 168


Re: [whatwg] Client-side includes proposal

2008-08-19 Thread Ian Hickson
On Tue, 19 Aug 2008, Elliotte Harold wrote:
 
  In the case of non-Web content, the use of HTML is an academic point, 
  since any format would work as well.
 
 Really? Why? and how? That's certainly not self-evident.

When you control the software used to read the data, it doesn't matter 
what the data format is.


 Aside from embedded links, which can point into the file system and are 
 usually relative anyway, there's very little web-specific about HTML. 
 It's just one format that can be served over HTTP or read from a disk, 
 just like PDF or text/plain or OpenDocument.

Exactly.


 HTML has some nice characteristics like resolution independence, direct 
 editability as text, and automatic reflow; but these are in no way 
 limited to network transfers.

It has characteristics that are independent of its use on the Web, yes.


 For many use cases, especially cross-platform ones, HTML is the 
 formatted text format of choice.

HTML isn't a formatted text format...


 A properly designed HTML spec should not require, prohibit, or 
 preference a document being read from the network or from a local file 
 system or via any other protocol. HTML 1 through 4 and XHTML 1 and 2 had 
 this important characteristic. I hope HTML 5 does as well.

I imagine it will. But it's not a requirement. If something comes up that 
would make HTML better for the Web and as a side-effect makes it not work 
for non-Web cases, then we should do it, not because of the side-effect, 
but because it makes HTML better for the Web. That HTML5 works for non-Web 
cases can be a happy accident, but it's not an intentional characteristic.

-- 
Ian Hickson   U+1047E)\._.,--,'``.fL
http://ln.hixie.ch/   U+263A/,   _.. \   _\  ;`._ ,.
Things that are impossible just take longer.   `._.-(,_..'--(,_..'`-.;.'


[whatwg] Client-side includes proposal

2008-08-18 Thread Shannon
The discussion on seamless iframes reminded me of something I've felt 
was missing from HTML - an equivalent client functionality to 
server-side includes as provided by PHP, Coldfusion and SSI. In 
server-side includes the document generated from parts appears as a 
single entity rather than nested frames. In other words the source code 
seen by the UA is indistinguishable from a non-frames HTML page in every way.
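(For comparison, the Apache SSI version of such a page is typically
something like this -- the paths are illustrative:)

<!--#include virtual="/includes/header.html" -->
<p>Unique page content goes here.</p>
<!--#include virtual="/includes/footer.html" -->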


iframes are good for some things but they can be really messy when 
you're trying to build a single seamless page with shared styles and 
scripts from multiple files. It makes code reuse a real pain without 
relying on a server-side dynamic language. The seamless iframes proposal 
doesn't really address this well because you'll have more than one HTML 
and BODY element causing strange behaviour or complex exceptions with 
seamless CSS.


The other issue with iframes is that for many page snippets the concept 
of a title, meta tags and other headers don't make sense or simply 
repeat what was in the main document. More often than not the head 
section is meaningless yet must still be included for the frame to be 
well-formed or indexed by spiders.


The proposal would work like this:

--- Master Document ---
<html>
   <head>
      <title>Include Example</title>
      <meta name="includes" content="allow">
      <include src="global_head.ihtml">
   </head>
   <body>
      <include src="header.ihtml">
      <include src="http://www.pagelets.com/foo.ihtml">
      <include src="footer.ihtml">
   </body>
</html>

--- Header.html ---
<div id="header">
   <h1>Header</h1>
</div>


With this proposal seamless CSS would work perfectly because child 
selectors won't see an intervening body element between sections.


Includes should allow any HTML segments except the initial doctype and 
head (for reasons explained below) and should allow start and end tags 
to be split across includes. Only tags themselves may not contain an 
include (eg, <body <include src="body_attributes.ihtml">>). Many 
server-side include systems allow this but it breaks the syntax of HTML/XML.


Includes must respect their own HTTP headers but inherit all other 
properties, styles and scripts from the surrounding page. If an include 
is not set to expire immediately the browser should reuse it from 
memory; otherwise it should retrieve it once for each include. Each 
behaviour has its own merits depending on the application.


The standard would recommend (but not require) includes to use an .ihtml 
extension. This will make it easier for authors, UAs and logging systems 
to distinguish partial and complete pages (ie, not count includes 
towards page views in a stats package).


UAs or UA extensions like the Mozilla-based Web Developer should allow 
the user to view the actual source and the final source (with all 
includes substituted).


HTTP 1.1 pipelining should remove any performance concerns that includes 
would have over traditional SSI since the retrieval process only 
requires the sending of a few more bytes of request and response 
headers. In some ways it is actually better because UAs and proxies can 
cache the static includes and only fetch the dynamic parts.


The only real issue with this proposal is security for untrusted content 
like myspace profiles. Traditional sanitisers would be unfamiliar with 
include and may allow it through, providing a backdoor for malicious 
code. For this reason it is necessary that includes be opt-in. The 
simplest mechanism is to use a meta tag in the head of the master document:


<meta name="includes" content="allow">

I would consider any content system that allowed untrusted users to 
write their own head tags to be incurably insecure; however, this 
requirement should ensure that the majority do not suddenly experience a 
wave of new exploits in HTML5 browsers.


Shannon


Re: [whatwg] Client-side includes proposal

2008-08-18 Thread Ian Hickson
On Mon, 18 Aug 2008, Shannon wrote:

 The discussion on seamless iframes reminded me of something I've felt 
 was missing from HTML - an equivalent client functionality to 
 server-side includes as provided by PHP, Coldfusion and SSI. In 
 server-side includes the document generated from parts appears as a 
 single entity rather than nested frames. In other words the source code 
 seen by the UA is indistinguishable from a non-frames HTML page in every 
 way.

What advantage does this have over server-side includes?

The iframe doc= seamless idea has the advantage of not blocking 
rendering, which a client-side parsing-level include would. I don't really 
see what the advantage of a client-side parsing-level include would be.
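(As sketched in that earlier thread -- the exact attribute names were
still under discussion, so this is only indicative:)

<iframe seamless src="comments.html"></iframe>
<!-- the doc="" variant would carry the fragment's markup inline in the
     attribute instead of fetching a separate resource -->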


 The other issue with iframes is that for many page snippets the concept 
 of a title, meta tags and other headers don't make sense or simply 
 repeat what was in the main document. More often than not the head 
 section is meaningless yet must still be included for the frame to be 
 well-formed or indexed by spiders.

Yeah, I've been considering making the title element optional for 
documents that aren't meant to be navigated to directly, like includes.


 HTTP 1.1 pipelining should remove any performance concerns that includes 
 would have over traditional SSI since the retrieval process only 
 requires the sending of a few more bytes of request and response 
 headers.

A TCP round-trip is very expensive. A client-side parsing-level include 
would mean that the parser would have to stop while a complete round-trip 
is performed. There's really no way to get around that short of making it 
a higher-level construct like iframe seamless.

-- 
Ian Hickson   U+1047E)\._.,--,'``.fL
http://ln.hixie.ch/   U+263A/,   _.. \   _\  ;`._ ,.
Things that are impossible just take longer.   `._.-(,_..'--(,_..'`-.;.'


Re: [whatwg] Client-side includes proposal

2008-08-18 Thread Kristof Zelechovski
Client-side includes make a document transformation, something which
logically belongs to XSLT rather than HTML.  And what would the workaround for
legacy browsers be?
Chris

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Shannon
Sent: Monday, August 18, 2008 8:34 AM
To: WHAT working group
Subject: [whatwg] Client-side includes proposal

The proposal would work like this:

--- Master Document ---
<html>
<head>
   <title>Include Example</title>
   <meta name="includes" content="allow">
   <include src="global_head.ihtml">
</head>
<body>
   <include src="header.ihtml">
   <include src="http://www.pagelets.com/foo.ihtml">
   <include src="footer.ihtml">
</body>
</html>

--- Header.html ---
<div id="header">
   <h1>Header</h1>
</div>






Re: [whatwg] Client-side includes proposal

2008-08-18 Thread Shannon

Ian Hickson wrote:

On Mon, 18 Aug 2008, Shannon wrote:
  
The discussion on seamless iframes reminded me of something I've felt 
was missing from HTML - an equivalent client functionality to 
server-side includes as provided by PHP, Coldfusion and SSI.



What advantage does this have over server-side includes?

The iframe doc= seamless idea has the advantage of not blocking 
rendering, which a client-side parsing-level include would. I don't really 
see what the advantage of a client-side parsing-level include would be.
  
Cost. SSI of any description generally puts you on a business hosting 
plan with a cost increase of up to 10x. Client-side includes only 
require static page serving which can also be beneficial where the 
server is a device (like a router interface).


Availability. As far as I can tell SSI is only available on  Apache and 
later versions of IIS. This may be a market majority but when you 
consider the number of devices and home servers coming onto the market 
with basic web interfaces the actual availability of SSI may be lower 
than you'd expect.


Security. Availability is reduced even further by ISP and organisations 
that flat-out refuse to support SSI due to security and support concerns.


Reuse. SSI follows no agreed set of rules and therefore may require code 
changes when changing hosting provider. Some systems provide extensions 
that authors may assume are part of a standard, but aren't. We have an 
opportunity to define a simpler and more reliable interface that is 
independent of any server configuration.


Speed. Concerns about speed are generally only valid for the first page 
on the first visit to a site. Subsequent pages can be much faster than 
SSI and even static html since common banners and footers can be cached 
separately - requiring only a unique content download. This is less 
trivial than it sounds since global drop-down menus, ad frames, tracking 
code, overlays and embedded JS and CSS often account for a vast majority 
of the source code.


Flexibility. It's hard to tell because the seamless proposal is vague 
but from what I can tell there are a lot of things a seamless iframe 
is not seamless about. For instance can absolutely positioned items be 
positioned relative to the enclosing document? Do child and adjacent 
selectors work across the iframe boundary? Will IE give up its behaviour 
of placing iframes above the document regardless of z-index? Includes 
don't have any of these issues because they are treated as part of the 
document, not as a special case.


Even with these advantages I do not believe it is an either/or case. 
seamless iframes serve a purpose too and I do not want to see them 
dropped from the spec. I would however like to see more clarity in the 
spec as to how they interact with scripts and styles (especially 
adjacency selectors)  in the master document and neighbouring seamless 
frames.



HTTP 1.1 pipelining should remove any performance concerns that includes
would have over traditional SSI since the retrieval process only 
requires the sending of a few more bytes of request and response 
headers.



A TCP round-trip is very expensive. A client-side parsing-level include 
would mean that the parser would have to stop while a complete round-trip 
is performed. There's really no way to get around that short of making it 
a higher-level construct like iframe seamless.


  
There is actually an easy solution for this, though it is less flexible 
than my original proposal. The solution is to require each include to be 
balanced (equal number of open and close tags) so the surrounding block 
is guaranteed to be a single node. Anything left open is forcefully 
closed (as when reaching /body with open blocks). In other words:


<div id="content" style="min-height:500px">
   <include src="content.ihtml">
</div><!-- always closes content -->

Since we know content is a closed block we can draw in a transparent 
placeholder and continue rendering the outer document as we do with img, 
video, iframes and embed. Once the true dimensions are known the 
renderer can repaint as it does with table cells and other auto 
sizing. This will often improve the readability of documents on slower 
connections since the top third of source code is usually concerned with 
banners, menus, search-bars and other cruft not directly relevant to the 
content the user wants to view and this is exactly the stuff you would 
want to put in an include to begin with. If it renders last then all the 
better.


Shannon


Re: [whatwg] Client-side includes proposal

2008-08-18 Thread Shannon

Kristof Zelechovski wrote:

Client-side includes make a document transformation, something which
logically belongs to XSLT rather than HTML.  And what would the workaround for
legacy browsers be?
Chris
  


You make good points but last I checked general browser and editor 
support for XSLT was abysmal. Everyone is saying it's on their roadmaps 
though so maybe it will one day be reliable enough to use.


You could go:

<include src="banner.ihtml">
   <h1>Banner</h1>
</include>

But this just seems wasteful, pointless and open to abuse. I think a 
better workaround is that people with legacy browsers download each 
include file separately and paste them together in DOS or AmigaOS or 
whatever system it is that keeps them from installing a modern browser.


Of course XSLT has the same legacy issues as do many parts of HTML5. I 
know the reasoning but at some point the web will have to leave 
unmaintained software behind or face the same grim reality OpenGL is 
facing now (can't move forward because a minority want legacy support 
for 10 year old CAD applications, can't go back because competing 
protocols are in front on features).


I'd like to see the option made available and its use determined by the 
market as we have always done. If a developer wants to piss-off users by 
writing a Flash or Silverlight-only website then the ONLY thing we can 
do is provide an equivalent feature and hope that it does less harm (by 
virtue of being a truly open standard). The average web developer's 
mentality is very different from the majority of this list; they won't 
compromise to do the right thing. If I can do client-side includes in 
Flash and Silverlight (and I can) then why should HTML be left behind?


Anyway, I don't mean for an open discussion on this as I'm sure it's 
been debated endlessly. I'm just stating my case for going ahead with 
this feature.


Shannon


Re: [whatwg] Client-side includes proposal

2008-08-18 Thread Martin Ellis
I can't speak for non-Windows/Linux users, but for Windows users IIS
comes supplied and supports SSI, ASP.NET, PHP (via a download), etc., and
with Linux you can download Apache and a slew of other HTTP daemons. I
see no reason for any HTML page to require the client to do the inline
including of content; as stated previously in this thread the TCP
overhead is huge and this would only make it worse in my opinion.

M

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Shannon
Sent: 18 August 2008 18:57
To: Kristof Zelechovski
Cc: 'WHAT working group'
Subject: Re: [whatwg] Client-side includes proposal

Kristof Zelechovski wrote:
 Client-side includes make a document transformation, something which
 logically belongs to XSLT rather than HTML.  And what would the
workaround for
 legacy browsers be?
 Chris
   

You make good points but last I checked general browser and editor 
support for XSLT was abysmal. Everyone is saying it's on their roadmaps

though so maybe it will one day be reliable enough to use.

You could go:

<include src="banner.ihtml">
<h1>Banner</h1>
</include>

But this just seems wasteful, pointless and open to abuse. I think a 
better workaround is that people with legacy browsers download each 
include file separately and paste them together in DOS or AmigaOS or 
whatever system it is that keeps them from installing a modern browser.

Of course XSLT has the same legacy issues as do many parts of HTML5. I 
know the reasoning but at some point the web will have to leave 
unmaintained software behind or face the same grim reality OpenGL is 
facing now (can't move forward because a minority want legacy support 
for 10 year old CAD applications, can't go back because competing 
protocols are in front on features).

I'd like to see the option made available and its use determined by the 
market as we have always done. If a developer wants to piss-off users by

writing a Flash or Silverlight-only website then the ONLY thing we can 
do is provide an equivalent feature and hope that it does less harm (by 
virtue of being a truly open standard). The average web developer's 
mentality is very different from the majority of this list; they won't 
compromise to do the right thing. If I can do client-side includes in 
Flash and Silverlight (and I can) then why should HTML be left behind?

Anyway, I don't mean for an open discussion on this as I'm sure it's 
been debated endlessly. I'm just stating my case for going ahead with 
this feature.

Shannon




Re: [whatwg] Client-side includes proposal

2008-08-18 Thread Bill Wallace
It is only expensive if the client needs to do the client side include each
time - however, if the page re-uses parts all the time, then parts of the
page can be cached either in a proxy or directly client side, and that can
make it faster as well as reducing server load.
Consider including 3-4 components:

<include src="static-header" />
<include src="user-specific-data" />
<include src="dynamic-daily-content" />

This type of page needs to be read at least a couple of times for each
client for it to be faster, but there are lots of clients like that.

HOWEVER - how about just using the XLink design and making it part of the
HTML standard (allowing HTML fragments to be included, not just XML)?  That
has client-side includes, and also supports client-side replacement of
components based on clicking a URL/button/object.  Supporting XLink in HTML
consistently would allow most of the current JavaScript-based applications
to be replaced with JavaScript-free applications, and that would make many
of them much safer, and is reasonably efficient.
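(For reference, the XLink spelling of an embedding link looks roughly like
this -- note that no mainstream browser applies XLink semantics to generic
elements, so it is purely illustrative:)

<div xmlns:xlink="http://www.w3.org/1999/xlink"
     xlink:type="simple"
     xlink:href="header.html"
     xlink:show="embed"
     xlink:actuate="onLoad"/>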

2008/8/18 Martin Ellis [EMAIL PROTECTED]

 I can't speak for non-Windows/Linux users, but for Windows users IIS
 comes supplied and supports SSI, ASP.NET, PHP (via a download), etc., and
 with Linux you can download Apache and a slew of other HTTP daemons. I
 see no reason for any HTML page to require the client to do the inline
 including of content; as stated previously in this thread the TCP
 overhead is huge and this would only make it worse in my opinion.

 M

 -Original Message-
 From: [EMAIL PROTECTED]
 [mailto:[EMAIL PROTECTED] On Behalf Of Shannon
 Sent: 18 August 2008 18:57
 To: Kristof Zelechovski
 Cc: 'WHAT working group'
 Subject: Re: [whatwg] Client-side includes proposal

 Kristof Zelechovski wrote:
  Client-side includes make a document transformation, something which
  logically belongs to XSLT rather than HTML.  And what would the
 workaround for
  legacy browsers be?
  Chris
 

 You make good points but last I checked general browser and editor
 support for XSLT was abysmal. Everyone is saying it's on their roadmaps

 though so maybe it will one day be reliable enough to use.

 You could go:

 <include src="banner.ihtml">
    <h1>Banner</h1>
 </include>

 But this just seems wasteful, pointless and open to abuse. I think a
 better workaround is that people with legacy browsers download each
 include file separately and paste them together in DOS or AmigaOS or
 whatever system it is that keeps them from installing a modern browser.

 Of course XSLT has the same legacy issues as do many parts of HTML5. I
 know the reasoning but at some point the web will have to leave
 unmaintained software behind or face the same grim reality OpenGL is
 facing now (can't move forward because a minority want legacy support
 for 10 year old CAD applications, can't go back because competing
 protocols are in front on features).

 I'd like to see the option made available and its use determined by the
 market as we have always done. If a developer wants to piss-off users by

 writing a Flash or Silverlight-only website then the ONLY thing we can
 do is provide an equivalent feature and hope that it does less harm (by
 virtue of being a truly open standard). The average web developer's
 mentality is very different from the majority of this list; they won't
 compromise to do the right thing. If I can do client-side includes in
 Flash and Silverlight (and I can) then why should HTML be left behind?

 Anyway, I don't mean for an open discussion on this as I'm sure it's
 been debated endlessly. I'm just stating my case for going ahead with
 this feature.

 Shannon





Re: [whatwg] Client-side includes proposal

2008-08-18 Thread João Eiras

<include src="static-header" />
<include src="user-specific-data" />
<include src="dynamic-daily-content" />


This is something that would probably not be represented with a new
element, because elements are preserved in the DOM tree so they can be
accessed and queried.
So then you'd have a question: keep <include> in the DOM and allow access
to children like an iframe, or replace <include> entirely?
The answer could be intuitive, but that can open a can of worms. Then you
can do all sorts of dynamic manipulation, which would have to be very well
specified.

I think the way to go would be a processing instruction.

The idea is good though! But I think it could be better implemented with a
CSS-template-like feature. I don't like the idea of a new element.


Bye.


Re: [whatwg] Client-side includes proposal

2008-08-18 Thread Greg Houston
On Mon, Aug 18, 2008 at 4:01 PM, João Eiras [EMAIL PROTECTED] wrote:
 <include src="static-header" />
 <include src="user-specific-data" />
 <include src="dynamic-daily-content" />

 This is something that would probably not be represented with a new element,
 because elements are  preserved in the DOM tree so they can be accessed and
 queried.
 So then you'd have a question: keep <include> in the DOM and allow access to
 children like an iframe, or replace <include> entirely?
 The answer could be intuitive, but that can open a can of worms. Then you
 can do all sorts of dynamic manipulation, which would have to be very well
 specified.
 I think the way to go would be a processing instruction.

 The idea is good though! But I think it could be better implemented with a
 CSS-template-like feature. I don't like the idea of a new element.

 Bye.

This seems to be mostly useful for people creating small websites that
are afraid of server side scripting languages like PHP, Python and
Ruby. That being the case, if something like this is implemented the
included content should definitely not be accessed like with an
iframe. The elements included should be in the DOM tree of the parent
just as if the includes were done server side. Accessing the DOM of an
iframe from the parent and vice versa causes people a lot of
confusion. I don't think we need to add that level of confusion to the
group of users that would most likely use this feature.

Also, a bonus of keeping <include src="some-content.html" /> in the
DOM is that changing the source could reload the content of that
element. You would have instant AJAX/XHR for dummies.
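(A rough sketch of what that could look like if such an element existed --
entirely hypothetical, since no browser implements it:)

<include id="news" src="monday.html"></include>
<script type="text/javascript">
  // Hypothetically, assigning a new src would re-fetch and re-render
  // the fragment in place, with the result staying in the parent DOM.
  document.getElementById("news").src = "tuesday.html";
</script>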

- Greg


Re: [whatwg] Client-side includes proposal

2008-08-18 Thread Ian Hickson
On Tue, 19 Aug 2008, Shannon wrote:
  
  What advantage does this have over server-side includes?

 Cost. SSI of any description generally puts you on a business hosting 
 plan with a cost increase of up to 10x. Client-side includes only 
 require static page serving which can also be beneficial where the 
 server is a device (like a router interface).

 Availability. As far as I can tell SSI is only available on Apache and 
 later versions of IIS. This may be a market majority but when you 
 consider the number of devices and home servers coming onto the market 
 with basic web interfaces the actual availability of SSI may be lower 
 than you'd expect.
 
 Security. Availability is reduced even further by ISP and organisations 
 that flat-out refuse to support SSI due to security and support 
 concerns.

All three of the above basically boil down to the same thing -- there are 
hosting providers that don't provide the simplest of features. That's 
certainly true, but there are also hosting providers that provide these 
features for very cheap (e.g. $120/year or less) that provide all this and 
more, so I don't buy this argument. If you're having trouble finding one, 
contact me privately and I can give you a coupon code for one.


 Reuse. SSI follows no agreed set of rules and therefore may require code 
 changes when changing hosting provider. Some systems provide extensions 
 that authors may assume are part of a standard, but aren't. We have an 
 opportunity to define a simpler and more reliable interface that is 
 independent of any server configuration.

If it's just a static site, then you can just pre-generate the pages and 
upload the completed pages, so there's no dependency on the server. This, 
incidentally, also works when the server-side hosting provider doesn't 
support SSIs at all.

If it's not a static site, then the SSIs are going to be the least of your 
problems when you change to a different server.


 Speed. Concerns about speed are generally only valid for the first page 
 on the first visit to a site. Subsequent pages can be much faster than 
 SSI and even static html since common banners and footers can be cached 
 separately - requiring only a unique content download. This is less 
 trivial than it sounds since global drop-down menus, ad frames, tracking 
 code, overlays and embedded JS and CSS often account for a vast majority 
 of the source code.

We're talking about such a small amount of data here that the latency far 
outweighs the bandwidth cost on most connections. Given that you still 
have to do an If-Modified-Since check, you don't really gain anything 
here. If we did want to optimise for small parts of the content being 
common over multiple pages, we should investigate dictionary-based 
compression with site-specific dictionaries. That would get us much, much 
better performance than cached CSIs.


 Flexibility. It's hard to tell because the seamless proposal is vague 
 but from what I can tell there are a lot of things a seamless iframe 
 is not seamless about. For instance can absolutely positioned items be 
 positioned relative to the enclosing document? Do child and adjacent 
 selectors work across the iframe boundary? Will IE give up its behaviour 
 of placing iframes above the document regardless of z-index? Includes 
 don't have any of these issues because they are treated as part of the 
 document, not as a special case.

This isn't an argument over SSIs. I agree that for inclusions, the iframe 
seamless feature isn't optimal. It was not designed for that, it was 
meant for including sandboxed blog comments and the like.


  A TCP round-trip is very expensive. A client-side parsing-level 
  include would mean that the parser would have to stop while a complete 
  round-trip is performed. There's really no way to get around that 
  short of making it a higher-level construct like iframe seamless.

 There is actually an easy solution for this, though it is less flexible 
 than my original proposal. The solution is to require each include to be 
 balanced (equal number of open and close tags) so the surrounding block 
 is guaranteed to be a single node. Anything left open is forcefully 
 closed (as when reaching /body with open blocks). In other words:
 
 <div id="content" style="min-height:500px">
    <include src="content.ihtml">
 </div><!-- always closes content -->

What do you do when the CSIed page includes script that manipulates 
content after the include? Now you have a race condition. This is just as 
bad as blocking, if not worse, since it's now unpredictable.


Anyway in conclusion I don't understand what CSIs give us that is actually 
worth the massive amounts of effort they require. Just generate your pages 
server-side or upload them to your server pre-generated.

-- 
Ian Hickson   U+1047E)\._.,--,'``.fL
http://ln.hixie.ch/   U+263A/,   _.. \   _\  ;`._ ,.
Things that are impossible just take longer.   

Re: [whatwg] Client-side includes proposal

2008-08-18 Thread Ian Hickson
On Mon, 18 Aug 2008, Bill Wallace wrote:

 It is only expensive if the client needs to do the client side include 
 each time - however, if the page re-uses parts all the time, then parts 
 of the page can be cached either in a proxy or directly client side, and 
 that can make it faster as well as reducing server load. Consider 
 including 3-4 components:
 
 <include src="static-header" />
 <include src="user-specific-data" />
 <include src="dynamic-daily-content" />
 
 This type of page needs to be read at least a couple of times for each 
 client for it to be faster, but there are lots of clients like that.

This still ends up being expensive due to the round-trip cost of checking 
if the resource has changed or not, even if the resource itself isn't 
transferred. It's cheaper to just include the content. If we wanted to 
optimise this case, we should look into dictionary-based compression.


 HOWEVER - how about just using the xlink design and making it part of 
 the html standard (allowing HTML fragments to be included, not just 
 XML).  That has client side includes, and also supports client side 
 replacement of components based on clicking a URL/button/object.  
 Supporting xlink in html consistently would allow most of the current 
 JavaScript based applications to be replaced with JavaScript free 
 applications, and that would make many of them much safer, and is 
 reasonably efficient.

I assume you mean XInclude, not XLink (XLink is a disaster). Certainly if 
people want to use XInclude with XHTML there's nothing in the HTML5 spec 
that stops it.

-- 
Ian Hickson   U+1047E)\._.,--,'``.fL
http://ln.hixie.ch/   U+263A/,   _.. \   _\  ;`._ ,.
Things that are impossible just take longer.   `._.-(,_..'--(,_..'`-.;.'


Re: [whatwg] Client-side includes proposal

2008-08-18 Thread Shannon

Ian Hickson wrote:
All three of the above basically boil down to the same thing -- there are 
hosting providers that don't provide the simplest of features. That's 
certainly true, but there are also hosting providers that provide these 
features for very cheap (e.g. $120/year or less) that provide all this and 
more, so I don't buy this argument. If you're having trouble finding one, 
contact me privately and I can give you a coupon code for one.


  
Thank you but I have my own servers. You make a pretty big assumption 
here about the usage model of the Internet being corporate + ISP. You 
ignore:


* Web applications and HTML documentation on the local filesystem.
* Local testing in WYSIWYG editors.
* Autorun frontends on CDROM that are typically employed by magazine 
coverdiscs.
* Embedded servers in storage devices, media centers, routers and other 
gadgets.
* Minimalist HTTP servers for simple usage, websocket tunnels or high 
load services.
* Users taking advantage of free hosting services with limited features 
(like Sourceforge or Geocities).
* Charities and OSS groups with better things to spend their money on 
than hosting package upgrades.
* Companies like many in the building and equipment hire industries that 
spent $500 on a website and call it expensive.


Far from being a small minority these groups and applications possibly 
make up the majority of HTML content after blogs.


If it's just a static site, then you can just pre-generate the pages and 
upload the completed pages, so there's no dependency on the server. This, 
incidentally, also works when the server-side hosting provider doesn't 
support SSIs at all.


If it's not a static site, then the SSIs are going to be the least of your 
problems when you change to a different server.


  
I assume that by pre-generated you're referring to 
Dreamweaver/WebObjects templates rather than copy-and-paste. I find 
these irritating as quite often you are given one format but require 
another (as a FOSS user I can't/won't use Dreamweaver). Some of the best 
editors don't have any kind of template support and when they do it's 
typically native to the application. Sometimes they're even native to an 
installation (you can't export your templates). These things are bad for 
the web without some kind of accepted open specification. CSI could be 
that specification.


Speed. Concerns about speed are generally only valid for the first page 
on the first visit to a site. Subsequent pages can be much faster than 
SSI and even static HTML, since common banners and footers can be cached 
separately - requiring only the page-specific content to be downloaded. 
This is a less trivial saving than it sounds, since global drop-down 
menus, ad frames, tracking code, overlays and embedded JS and CSS often 
account for the vast majority of the source code.



We're talking about such a small amount of data here that the latency far 
outweighs the bandwidth cost on most connections. Given that you still 
have to do an If-Modified-Since check, you don't really gain anything 
here.


  
I'm not sure where you get your statistics but these claims don't match 
my direct experience. In the typical case a web page is heavily 
dependent on included content such as CSS files, images, plugins and 
other non-embedded elements. Even geek sites like oreilly.com and 
Slashdot are not immune. According to websiteoptimization.com the 
oreilly.com home page has 60 external objects. Many common sites are even 
worse. Your claim that adding HTML includes will have any noticeable 
effect on overall page loading times needs more analysis.


oreilly.com:
Total HTML: 5
Total HTML Images:  31
Total CSS Images:   14
Total Images:   45
Total Scripts:  7
Total CSS imports:  1
Total Frames:   0
Total Iframes:  4


report: 
http://www.websiteoptimization.com/services/analyze/wso.php?url=http://oreilly.com/


If we did want to optimise for small parts of the content being 
common over multiple pages, we should investigate dictionary-based 
compression with site-specific dictionaries. That would get us much, much 
better performance than cached CSIs.

I like this idea, but it isn't an alternative to CSI from a designer's 
perspective, nor is it likely to offer significant gains over current gzip 
implementations.


This isn't an argument over SSIs. I agree that for inclusions, the iframe 
seamless feature isn't optimal. It was not designed for that; it was 
meant for including sandboxed blog comments and the like.
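
The feature being referred to looks roughly like this in the drafts of the 
time (the exact attribute set here is illustrative, not normative):

   <iframe seamless sandbox src="comments.html"></iframe>

seamless asks the UA to render the framed document as though it were part 
of the parent page, while sandbox restricts what the framed content is 
allowed to do.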


  
I absolutely agree that iframes aren't an alternative to CSI, nor 
vice-versa.
A TCP round-trip is very expensive. A client-side parsing-level 
include would mean that the parser would have to stop while a complete 
round-trip is performed. There's really no way to get around that 
short of making it a higher-level construct like iframe seamless.
  
There is actually an easy solution for this, though it is less flexible 
than my original proposal. The solution is to require each include 

Re: [whatwg] Client-side includes proposal

2008-08-18 Thread Ian Hickson
On Tue, 19 Aug 2008, Shannon wrote:

 You make a pretty big assumption here about the usage model of the 
 Internet being corporate + ISP. You ignore:
 
 * Local testing in WYSIWYG editors.

WYSIWYG editors are quite capable of implementing a model whereby pages 
are merged before being previewed, and are merged when published, removing 
the need for any post-publication inclusion mechanism.


 * Embedded servers in storage devices, media centers, routers and other
   gadgets.
 * Minimalist HTTP servers for simple usage, websocket tunnels or high load
   services.
 * Users taking advantage of free hosting services with limited features (like
   Sourceforge or Geocities).
 * Charities and OSS groups with better things to spend their money on than
   hosting package upgrades.
 * Companies like many in the building and equipment hire industries that spent
   $500 on a website and call it expensive.

In all of these cases, it is quite feasible to just pre-generate the 
documents with the includes directly in the content.


 * Web applications and HTML documentation on the local filesystem.
 * Autorun frontends on CDROM that are typically employed by magazine
   coverdiscs.

These aren't part of the use cases I am considering.


 I assume that by pre-generated you're referring to 
 Dreamweaver/WebObjects templates rather than copy-and-paste.

I was really thinking of C++ preprocessor includes, but any mechanism 
would be fine, sure.


 I find these irritating as quite often you are given one format but 
 require another (as a FOSS user I can't/won't use Dreamweaver).

I usually just roll my own using Perl; it's like one or two lines, 
depending on what you want.
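
Not Ian's actual script, just a minimal sketch of the kind of thing he 
means, assuming a made-up <include src="..."> directive and fragment files 
sitting next to the page:

   #!/usr/bin/perl
   # Expand <include src="..."> directives at build time, before upload.
   use strict;
   use warnings;

   sub slurp {
       my ($name) = @_;
       open my $fh, '<', $name or die "cannot open $name: $!";
       local $/;                      # read the whole fragment at once
       return scalar <$fh>;
   }

   local $/;
   my $page = <STDIN>;
   $page =~ s{<include\s+src="([^"]+)"\s*/?>}{ slurp($1) }ge;
   print $page;

Run it over each template (perl expand.pl < about.tmpl > about.html) and 
the published pages need no include support at all, server-side or 
client-side.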


  We're talking about such a small amount of data here that the latency 
  far outweighs the bandwidth cost on most connections. Given that you 
  still have to do an If-Modified-Since check, you don't really gain 
  anything here.

 I'm not sure where you get your statistics but these claims don't match 
 my direct experience. In the typical case a web page is heavily 
 dependent on included content such as CSS files, images, plugins and 
 other non-embedded elements. Even geek sites like oreilly.com and 
 Slashdot are not immune. According to websiteoptimization.com the 
 oreilly.com home page has 60 external objects. Many common sites are even 
 worse. Your claim that adding HTML includes will have any noticeable 
 effect on overall page loading times needs more analysis.

Most content doesn't require blocking. The kinds that do require blocking 
(like scripts) are a massive problem. Just ask any browser vendor. Safari 
even ends up running a second tokeniser in parallel just to try to help 
with this.
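
For contrast, an illustrative example (not from the thread): a plain 
external script stalls the parser in exactly the way a blocking include 
would, whereas one marked defer does not:

   <!-- the parser stops until this is fetched and executed -->
   <script src="menu.js"></script>

   <!-- fetched in parallel, executed after parsing, where supported -->
   <script defer src="menu.js"></script>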


   <div id="content" style="min-height:500px">
  <include src="content.ihtml">
   </div> <!-- always closes content -->
  
  What do you do when the CSIed page includes script that manipulates 
  content after the include? Now you have a race condition. This is just 
  as bad as blocking, if not worse, since it's now unpredictable.
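
A minimal sketch of the race being described, using the hypothetical 
include element from the example above:

   <div id="content"><include src="content.ihtml"></div>
   <script>
     // If the include is filled in asynchronously, whether this sees the
     // included children depends entirely on network timing.
     alert(document.getElementById('content').childNodes.length);
   </script>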
 
 You do the same thing you always have when external JS or inter-page 
 requests raise the same issue.

External (blocking) JS is a disaster. We don't want to be adding more 
features like it. We're still trying to get out of the problems blocking 
scripts added.


  Anyway in conclusion I don't understand what CSIs give us that is 
  actually worth the massive amounts of effort they require. Just 
  generate your pages server-side or upload them to your server 
  pre-generated.

 As a developer I tell you this is not really a good option, and I 
 disagree with your claim of massive effort. It is a fairly 
 straightforward feature as features go. Embedded SQL is a massive 
 effort, WebWorkers are a massive effort; client-side includes are quite 
 trivial, relatively speaking. Certainly worth further investigation in 
 light of their obvious benefits.

Providing client-side includes in an efficient manner is on the same kind 
of scale as the other features you mention, but it doesn't have anywhere 
near the same level of benefit. IMHO.

-- 
Ian Hickson   U+1047E)\._.,--,'``.fL
http://ln.hixie.ch/   U+263A/,   _.. \   _\  ;`._ ,.
Things that are impossible just take longer.   `._.-(,_..'--(,_..'`-.;.'