Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-22 Thread Mike Belshe
On Sat, Nov 21, 2009 at 3:00 PM, Steve Souders st...@souders.org wrote:

  Here's my understanding of how this would work: In addition to the
 resource package LINK and the not-packaged stylesheet LINK, you still need
 LINKs for the other stylesheets. So the page could look like this:
 <link rel="resource-package" href="pkg.zip">
 <link rel="stylesheet" href="in-package-A.css">
 <link rel="stylesheet" href="in-package-B.css">
 <link rel="stylesheet" href="NOT-in-package-C.css">

 or this:
 <link rel="resource-package" href="pkg.zip">
 <link rel="stylesheet" href="NOT-in-package-C.css">
 <link rel="stylesheet" href="in-package-A.css">
 <link rel="stylesheet" href="in-package-B.css">

 Browsers probably shouldn't download any other resources until they've
 gotten the manifest.txt. In the first case, there isn't an extra RT
 (assuming in-package-A.css is the first file in the package), and the page
 should render faster, especially in IE < 7 (if all the resources are on the
 same domain). In the second case, there is an extra RT delay for painting.
 Presumably, core stylesheets are packaged and come first, and
 page-specific stylesheets aren't packaged and come last, so the first
 situation is more typical.


CSS and JS can't be declared in arbitrary order.  So while your argument
about when the extra RTT exists is sound, in practice the ideal ordering is
not always an option.  If there are three scripts, two of which can't be
bundled and one which can, then you may or may not suffer the extra RT,
depending on where the bundled one has to appear.
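
For concreteness, one possible arrangement (file names hypothetical): if the
bundled script depends on the two unbundled ones, it has to come last, so the
unbundled scripts end up right after the package declaration and pay the extra
round trip for the manifest:

<link rel="resource-package" href="core.zip">  <!-- contains app.js -->
<script src="jquery.js"></script>   <!-- can't be bundled; must run first -->
<script src="plugin.js"></script>   <!-- can't be bundled; depends on jquery.js -->
<script src="app.js"></script>      <!-- bundled, but forced to come last -->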

This is really subtle stuff: web designers could think they are speeding
up their pages when they're actually slowing them down.  The tools need to
prevent that; it can't be a manual process.

Mike





 -Steve



 Mike Belshe wrote:

 Alexander - when you do the testing on this, one case I'd really like to
 see results on is this:

  Page contains a resource bundle, and the bundle contains a bunch of
 stylesheets, JS and other, but DOES NOT include one of the CSS files.
  Immediately following the link resource bundle, put a reference to the
 style sheet not included in the bundle.

  When the browser sees the link to the CSS, which is critical to the page
 download, does it wait for the resource bundle to load (I realize that
 technically it only needs to get the manifest)?  If not, it might download
 it twice (since it doesn't know the status of the bundle yet).

  Now simulate over a 200ms RTT link.  I believe you've just added a full
 RT to get the CSS, which was critical for layout.  Overall PLT won't suffer
 the full RTT, but time-to-first-paint will.

  Mike


 On Wed, Nov 18, 2009 at 3:57 PM, Peter Kasting pkast...@google.com wrote:

  On Wed, Nov 18, 2009 at 3:54 PM, Dirk Pranke dpra...@chromium.org wrote:

 Another caching-related issue involves versioning of the archives. If
 version 2 of a zip contains only a few files modified since version 1,
 and I have version 1 cached, is there some way to take advantage of
 that?


  This is a specific case of my more general question: "One of your stated
 goals is to avoid downloading resources you already have, but even with
 manifests, I see no way to do this, since the client can't actually tell the
 server 'only send items x, y, and z'."  This was the one point Alexander
 didn't copy in his reply mail.

  PK





Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-21 Thread Alexander Limi
Yeah, that sounds about right. Remember that I'm a UI designer, so I'll have
to defer to some of the other Mozilla people on the benchmark/testing side,
but we'll definitely take stuff like this into account.

I'm also looking for a good syntax for how to specify the manifest content
in the actual HTML document; suggestions welcome. This way, you wouldn't
even have to wait for the manifest file.
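
One hypothetical direction, purely as a sketch (the contents attribute below
is invented for illustration and is not part of the current proposal): list
the packaged files directly on the link element, so the parser knows what is
in the package before any bytes of the zip arrive:

<link rel="resource-package" href="pkg.zip"
      contents="css/core.css js/core.js images/logo.png">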

-- 
Alexander Limi · Firefox User Experience · http://limi.net



On Sat, Nov 21, 2009 at 3:00 PM, Steve Souders st...@souders.org wrote:

  Here's my understanding of how this would work: In addition to the
 resource package LINK and the not-packaged stylesheet LINK, you still need
 LINKs for the other stylesheets. So the page could look like this:
 <link rel="resource-package" href="pkg.zip">
 <link rel="stylesheet" href="in-package-A.css">
 <link rel="stylesheet" href="in-package-B.css">
 <link rel="stylesheet" href="NOT-in-package-C.css">

 or this:
 <link rel="resource-package" href="pkg.zip">
 <link rel="stylesheet" href="NOT-in-package-C.css">
 <link rel="stylesheet" href="in-package-A.css">
 <link rel="stylesheet" href="in-package-B.css">

 Browsers probably shouldn't download any other resources until they've
 gotten the manifest.txt. In the first case, there isn't an extra RT
 (assuming in-package-A.css is the first file in the package), and the page
 should render faster, especially in IE < 7 (if all the resources are on the
 same domain). In the second case, there is an extra RT delay for painting.
 Presumably, core stylesheets are packaged and come first, and
 page-specific stylesheets aren't packaged and come last, so the first
 situation is more typical.

 -Steve



 Mike Belshe wrote:

 Alexander - when you do the testing on this, one case I'd really like to
 see results on is this:

  Page contains a resource bundle, and the bundle contains a bunch of
 stylesheets, JS and other, but DOES NOT include one of the CSS files.
  Immediately following the link resource bundle, put a reference to the
 style sheet not included in the bundle.

  When the browser sees the link to the CSS, which is critical to the page
 download, does it wait for the resource bundle to load (I realize that
 technically it only needs to get the manifest)?  If not, it might download
 it twice (since it doesn't know the status of the bundle yet).

  Now simulate over a 200ms RTT link.  I believe you've just added a full
 RT to get the CSS, which was critical for layout.  Overall PLT won't suffer
 the full RTT, but time-to-first-paint will.

  Mike


 On Wed, Nov 18, 2009 at 3:57 PM, Peter Kasting pkast...@google.com wrote:

  On Wed, Nov 18, 2009 at 3:54 PM, Dirk Pranke dpra...@chromium.org wrote:

 Another caching-related issue involves versioning of the archives. If
 version 2 of a zip contains only a few files modified since version 1,
 and I have version 1 cached, is there some way to take advantage of
 that?


  This is a specific case of my more general question: "One of your stated
 goals is to avoid downloading resources you already have, but even with
 manifests, I see no way to do this, since the client can't actually tell the
 server 'only send items x, y, and z'."  This was the one point Alexander
 didn't copy in his reply mail.

  PK





Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-21 Thread Alexander Limi
 On Sat, Nov 21, 2009 at 3:43 PM, Peter Kasting pkast...@google.com wrote:

 On Sat, Nov 21, 2009 at 3:12 PM, Alexander Limi l...@mozilla.com wrote:

 I'm also looking for a good syntax for how to specify the manifest content
 in the actual HTML document, suggestions welcome. This way, you wouldn't
 even have to wait for the manifest file.


 The WHATWG list may be a better place to obtain that kind of feedback.


Yeah, $DEITY forbid I should actually ask people who implement browsers. ;)

-- 
Alexander Limi · Firefox User Experience · http://limi.net


Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-18 Thread Peter Kasting
On Tue, Nov 17, 2009 at 9:58 PM, Steve Souders st...@souders.org wrote:

  I like the option of putting the manifest in the HTML. That was the main
 suggestion I was going to make. You don't *have* to do it, but if you really
 care about performance you could choose to do it.

 James mentions: "The page-specific resources end up getting blocked behind
 all of the manifest downloads."

 I would expect that if I have:
 <script src="a.js"></script>
 <script src="b.js"></script>
 <link rel="resource-package" type="application/zip"
 href="site-resources.zip" />

 The browser should start downloading a.js and b.js before
 site-resources.zip. Therefore, as a developer, if I have page-specific
 resources, I have some ability to get those downloading before the
 manifest-blocking behavior of resource packages kicks in.


I'm not totally clear on how this works today, so this might be groundless,
but doesn't this present a potential problem?  a.js gets included early in
the page, the browser finishes loading it, it starts getting used, and then
the browser encounters a resource bundle that contains a different a.js.

I can also imagine scripts doing a document.write that adds a link to a
resource bundle, causing similar potential issues with various
already-loaded resources.
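
For illustration (the script and package names are hypothetical), the second
scenario might look like the snippet below; anything fetched before this runs
could conflict with copies inside the late-declared package:

<script>
  // Injected after other resources have already started loading.
  document.write('<link rel="resource-package" href="late-bundle.zip">');
</script>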

PK


Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-18 Thread Patrick Mueller

Alexander Limi wrote:

Good people of Webkit!

We'd all like for the web to be faster, and therefore I'd love your feedback
on my proposal — it would be great to see support for this in additional
browsers, not just Firefox:

http://limi.net/articles/resource-packages/

Summary:
What if there was a backwards compatible way to transfer all of the
resources that are used on every single page in your site — CSS, JS, images,
anything else — in a single HTTP request at the start of the first visit to
the page? This is what Resource Package support in browsers will let you do.

Looking forward to hearing your thoughts on this.


I happened to open a bug on this, early this morning:

   https://bugs.webkit.org/show_bug.cgi?id=31621

WebKit already supports something similar - webarchives.  The main
difference is that webarchives contain the main resource as well as
the sub-resources.  Perhaps some of the same machinery can be reused
there, though.


Though ... makes me wonder ... why not have a mode of supporting the 
main resource as well?  Go from two downloads down to one.  You'd just 
need a convention for specifying the main resource in the .zip file ...
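
As a rough illustration only (the main: marker is invented here and is not
part of the proposal or of webarchives), the manifest could carry such a
convention alongside the usual file listing:

main: index.html
styles/core.css
js/core.js
images/logo.png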


--
Patrick Mueller - http://muellerware.org



Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-18 Thread Alexander Limi
On Tue, Nov 17, 2009 at 5:56 PM, Alexander Limi l...@mozilla.com wrote:

 On Tue, Nov 17, 2009 at 5:53 PM, James Robinson jam...@google.com wrote:

 Yes, actual numbers would be nice to have.


 Steve Souders just emailed me some preliminary numbers from a bunch of
 major web sites, so that should be on his blog shortly.


Numbers are up:
http://www.stevesouders.com/blog/2009/11/18/fewer-requests-through-resource-packages/

-- 
Alexander Limi · Firefox User Experience · http://limi.net


Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-18 Thread Mike Belshe
Overall, I think the general idea.

I'm concerned about the head-of-line blocking that it introduces.  If an
administrator poorly constructs the bundle, he could significantly hurt
perf.  Instead of using gzip, you could use a framer which chunked items
before gzipping.  This might be more trouble than it is worth.

Inside the browser, the caching is going to be kind of annoying.  Example:
 Say foo.zip contains foo.gif and baz.gif, and foo.zip expires in one week.
  When the browser downloads the manifest, it needs to unfold it and store
foo.gif and baz.gif in the cache.  Then, a week later, if the browser tries
to use foo.gif, it will be expired; does the browser fetch foo.zip?  or just
foo.gif?  Obviously, either will work.  But now you've got an inconsistent
cache.  If you hit another page which references foo.zip next, you'll
download the whole zip file when all you needed was baz.gif.  This is
probably a minor problem - I can't see this being very significant in
practice.  Did you consider having the resources for a bundle be addressed
such as:  http://www.foo.com/bundle.zip/foo.gif  ?  This would eliminate the
problem of two names for the same resource.  Maybe this was your intent -
the spec was unclear about the identity (URL) of the bundled resources.
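
For illustration (the markup framing is added here; foo.com, foo.zip and
foo.gif come from the example above), the two addressing schemes look like
this; the second gives each bundled file a single canonical URL scoped to its
package:

<!-- two names for the same bytes: foo.gif on its own, and foo.gif inside foo.zip -->
<link rel="resource-package" href="http://www.foo.com/foo.zip">
<img src="http://www.foo.com/foo.gif">

<!-- package-scoped addressing: one name only -->
<img src="http://www.foo.com/foo.zip/foo.gif">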

I think it is a good enough idea to warrant an implementation.  Once we have
data about performance, it will be clear whether this should be made
official or not.

Mike


On Wed, Nov 18, 2009 at 11:56 AM, Alexander Limi l...@mozilla.com wrote:

 On Tue, Nov 17, 2009 at 5:56 PM, Alexander Limi l...@mozilla.com wrote:

 On Tue, Nov 17, 2009 at 5:53 PM, James Robinson jam...@google.com wrote:

 Yes, actual numbers would be nice to have.


 Steve Souders just emailed me some preliminary numbers from a bunch of
 major web sites, so that should be on his blog shortly.


 Numbers are up:

 http://www.stevesouders.com/blog/2009/11/18/fewer-requests-through-resource-packages/


 --
 Alexander Limi · Firefox User Experience · http://limi.net







Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-18 Thread Mike Belshe
On Wed, Nov 18, 2009 at 2:47 PM, Mike Belshe m...@belshe.com wrote:

 Overall, I think the general idea.


I meant to say "Overall, I like the general idea".




 I'm concerned about the head-of-line blocking that it introduces.  If an
 administrator poorly constructs the bundle, he could significantly hurt
 perf.  Instead of using gzip, you could use a framer which chunked items
 before gzipping.  This might be more trouble than it is worth.

 Inside the browser, the caching is going to be kind of annoying.  Example:
  Say foo.zip contains foo.gif and baz.gif, and foo.zip expires in one week.
   When the browser downloads the manifest, it needs to unfold it and store
 foo.gif and baz.gif in the cache.  Then, a week later, if the browser tries
 to use foo.gif, it will be expired; does the browser fetch foo.zip?  or just
 foo.gif?  Obviously, either will work.  But now you've got an inconsistent
 cache.  If you hit another page which references foo.zip next, you'll
 download the whole zip file when all you needed was baz.gif.  This is
 probably a minor problem - I can't see this being very significant in
 practice.  Did you consider having the resources for a bundle be addressed
 such as:  http://www.foo.com/bundle.zip/foo.gif  ?  This would eliminate
 the problem of two names for the same resource.  Maybe this was your intent
 - the spec was unclear about the identity (URL) of the bundled resources.

 I think it is a good enough idea to warrant an implementation.  Once we
 have data about performance, it will be clear whether this should be made
 official or not.

 Mike


 On Wed, Nov 18, 2009 at 11:56 AM, Alexander Limi l...@mozilla.com wrote:

 On Tue, Nov 17, 2009 at 5:56 PM, Alexander Limi l...@mozilla.com wrote:

 On Tue, Nov 17, 2009 at 5:53 PM, James Robinson jam...@google.com wrote:

 Yes, actual numbers would be nice to have.


 Steve Souders just emailed me some preliminary numbers from a bunch of
 major web sites, so that should be on his blog shortly.


 Numbers are up:

 http://www.stevesouders.com/blog/2009/11/18/fewer-requests-through-resource-packages/


 --
 Alexander Limi · Firefox User Experience · http://limi.net








Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-18 Thread Alexander Limi
Cool, thanks for the feedback. It does seem like most people (well, outside
of this list ;) like the direction a lot.

The one issue that I agree with, and would like to find an elegant solution
to, is how to specify the manifest in the HTML instead of in the zip file to
reduce blocking and start parsing earlier.

I can't say that I know enough about HTML specifics to have an immediately
useful answer here; I assume the link tag can't have content inside of it
that could serve as the manifest?
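
(For reference, link is a void element in HTML, so it can't carry content.
One hypothetical alternative, sketched here and not part of any proposal, is
a data block the parser can read inline, e.g. a script element with a
non-executable type; the type and for attribute below are invented:)

<link rel="resource-package" href="pkg.zip">
<script type="text/x-resource-manifest" for="pkg.zip">
css/core.css
js/core.js
images/logo.png
</script>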

-- 
Alexander Limi · Firefox User Experience · http://limi.net



On Wed, Nov 18, 2009 at 2:47 PM, Mike Belshe m...@belshe.com wrote:

 Overall, I think the general idea.

 I'm concerned about the head-of-line blocking that it introduces.  If an
 administrator poorly constructs the bundle, he could significantly hurt
 perf.  Instead of using gzip, you could use a framer which chunked items
 before gzipping.  This might be more trouble than it is worth.

 Inside the browser, the caching is going to be kind of annoying.  Example:
  Say foo.zip contains foo.gif and baz.gif, and foo.zip expires in one week.
   When the browser downloads the manifest, it needs to unfold it and store
 foo.gif and baz.gif in the cache.  Then, a week later, if the browser tries
 to use foo.gif, it will be expired; does the browser fetch foo.zip?  or just
 foo.gif?  Obviously, either will work.  But now you've got an inconsistent
 cache.  If you hit another page which references foo.zip next, you'll
 download the whole zip file when all you needed was baz.gif.  This is
 probably a minor problem - I can't see this being very significant in
 practice.  Did you consider having the resources for a bundle be addressed
 such as:  http://www.foo.com/bundle.zip/foo.gif  ?  This would eliminate
 the problem of two names for the same resource.  Maybe this was your intent
 - the spec was unclear about the identity (URL) of the bundled resources.

 I think it is a good enough idea to warrant an implementation.  Once we
 have data about performance, it will be clear whether this should be made
 official or not.

 Mike


 On Wed, Nov 18, 2009 at 11:56 AM, Alexander Limi l...@mozilla.com wrote:

 On Tue, Nov 17, 2009 at 5:56 PM, Alexander Limi l...@mozilla.com wrote:

 On Tue, Nov 17, 2009 at 5:53 PM, James Robinson jam...@google.com wrote:

 Yes, actual numbers would be nice to have.


 Steve Souders just emailed me some preliminary numbers from a bunch of
 major web sites, so that should be on his blog shortly.


 Numbers are up:

 http://www.stevesouders.com/blog/2009/11/18/fewer-requests-through-resource-packages/


 --
 Alexander Limi · Firefox User Experience · http://limi.net








Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-18 Thread Steve Tickle
Hi All,

As a game developer with interests in mobile, I'd like to say overall, I
think the idea sucks.

Incorrectly implemented clients will end up downloading large resources
multiple times, and web masters are sure to start adding every resource to
pages that don't need them.

As I see it, the load time issue only occurs on first load anyway, as
properly implemented clients and servers should use cache directives to
avoid unnecessary connections.

If the first load case is a major concern (as in mobile), web masters should
optimise their pages so that resources are amalgamated wherever possible.

If you truly want better interactive performance from websites, add a
streaming connectionless facility atop UDP, rather than complicating matters
with an ad-hoc application level HTML hack.

Regards,

Steve

2009/11/18 Mike Belshe m...@belshe.com



 On Wed, Nov 18, 2009 at 2:47 PM, Mike Belshe m...@belshe.com wrote:

 Overall, I think the general idea.


 I meant to say "Overall, I like the general idea".




 I'm concerned about the head-of-line blocking that it introduces.  If an
 administrator poorly constructs the bundle, he could significantly hurt
 perf.  Instead of using gzip, you could use a framer which chunked items
 before gzipping.  This might be more trouble than it is worth.

 Inside the browser, the caching is going to be kind of annoying.  Example:
  Say foo.zip contains foo.gif and baz.gif, and foo.zip expires in one week.
   When the browser downloads the manifest, it needs to unfold it and store
 foo.gif and baz.gif in the cache.  Then, a week later, if the browser tries
 to use foo.gif, it will be expired; does the browser fetch foo.zip?  or just
 foo.gif?  Obviously, either will work.  But now you've got an inconsistent
 cache.  If you hit another page which references foo.zip next, you'll
 download the whole zip file when all you needed was baz.gif.  This is
 probably a minor problem - I can't see this being very significant in
 practice.  Did you consider having the resources for a bundle be addressed
 such as:  http://www.foo.com/bundle.zip/foo.gif  ?  This would eliminate
 the problem of two names for the same resource.  Maybe this was your intent
 - the spec was unclear about the identity (URL) of the bundled resources.

 I think it is a good enough idea to warrant an implementation.  Once we
 have data about performance, it will be clear whether this should be made
 official or not.

 Mike


 On Wed, Nov 18, 2009 at 11:56 AM, Alexander Limi l...@mozilla.com wrote:

 On Tue, Nov 17, 2009 at 5:56 PM, Alexander Limi l...@mozilla.com wrote:

 On Tue, Nov 17, 2009 at 5:53 PM, James Robinson jam...@google.com wrote:

 Yes, actual numbers would be nice to have.


 Steve Souders just emailed me some preliminary numbers from a bunch of
 major web sites, so that should be on his blog shortly.


 Numbers are up:

 http://www.stevesouders.com/blog/2009/11/18/fewer-requests-through-resource-packages/


 --
 Alexander Limi · Firefox User Experience · http://limi.net











-- 
Steve Tickle
sixteenk - Refined software development for mobile and web

t: +44 151 324 2816
m: +44 7950 052 976
w: sixteenk.com


Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-18 Thread Peter Kasting
On Wed, Nov 18, 2009 at 3:54 PM, Dirk Pranke dpra...@chromium.org wrote:

 Another caching-related issue involves versioning of the archives. If
 version 2 of a zip contains only a few files modified since version 1,
 and I have version 1 cached, is there some way to take advantage of
 that?


This is a specific case of my more general question: "One of your stated
goals is to avoid downloading resources you already have, but even with
manifests, I see no way to do this, since the client can't actually tell the
server 'only send items x, y, and z'."  This was the one point Alexander
didn't copy in his reply mail.

PK


[webkit-dev] Making browsers faster: Resource Packages

2009-11-17 Thread Alexander Limi
Good people of Webkit!

We'd all like for the web to be faster, and therefore I'd love your feedback
on my proposal — it would be great to see support for this in additional
browsers, not just Firefox:

http://limi.net/articles/resource-packages/

Summary:
What if there was a backwards compatible way to transfer all of the
resources that are used on every single page in your site — CSS, JS, images,
anything else — in a single HTTP request at the start of the first visit to
the page? This is what Resource Package support in browsers will let you do.

Looking forward to hearing your thoughts on this.

Thanks!

-- 
Alexander Limi · Firefox User Experience · http://limi.net


Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-17 Thread David Hyatt

I have many of the same concerns mentioned here:

http://ajaxian.com/archives/resource-packages-making-a-faster-web-via-packaging

dave
(hy...@apple.com)

On Nov 17, 2009, at 4:19 PM, Alexander Limi wrote:


Good people of Webkit!

We'd all like for the web to be faster, and therefore I'd love your  
feedback on my proposal — it would be great to see support for this  
in additional browsers, not just Firefox:


http://limi.net/articles/resource-packages/

Summary:
What if there was a backwards compatible way to transfer all of the  
resources that are used on every single page in your site — CSS, JS,  
images, anything else — in a single HTTP request at the start of the  
first visit to the page? This is what Resource Package support in  
browsers will let you do.


Looking forward to hearing your thoughts on this.

Thanks!

--
Alexander Limi · Firefox User Experience · http://limi.net





Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-17 Thread Alexander Limi
Could you be more specific? Most of the comments seem to be a result of
people not actually reading the spec.

As for the comments in the article itself:

   - You still do parallel downloads, and you can have multiple resource
   packages.
   - The zip can have expiry headers, and can be invalidated using ETags.

If you can pull out the specific questions, I'm happy to answer them.

-- 
Alexander Limi · Firefox User Experience · http://limi.net



On Tue, Nov 17, 2009 at 2:30 PM, David Hyatt hy...@apple.com wrote:

 I have many of the same concerns mentioned here:


 http://ajaxian.com/archives/resource-packages-making-a-faster-web-via-packaging

 dave
 (hy...@apple.com)

 On Nov 17, 2009, at 4:19 PM, Alexander Limi wrote:

 Good people of Webkit!

 We'd all like for the web to be faster, and therefore I'd love your
 feedback on my proposal — it would be great to see support for this in
 additional browsers, not just Firefox:

 http://limi.net/articles/resource-packages/

 Summary:
 What if there was a backwards compatible way to transfer all of the
 resources that are used on every single page in your site — CSS, JS, images,
 anything else — in a single HTTP request at the start of the first visit to
 the page? This is what Resource Package support in browsers will let you do.

 Looking forward to hearing your thoughts on this.

 Thanks!

 --
 Alexander Limi · Firefox User Experience · http://limi.net






Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-17 Thread Peter Kasting
On Tue, Nov 17, 2009 at 2:19 PM, Alexander Limi l...@mozilla.com wrote:

 We'd all like for the web to be faster, and therefore I'd love your
 feedback on my proposal


I have read the whole document, but I read it quickly, so please do point
out places where I've overlooked an obvious response.

Reduced parallelism is a big concern of mine.  Lots of sites make heavy use
of resource sharding across many hostnames to take advantage of multiple
connections, which this defeats.  You say in this thread that you still do
parallel downloads, but it seems to me that you either download this zip in
parallel with anything not in the zip (meaning you run out of parallelism
faster the more the author makes use of this technique), or else you
potentially download in parallel multiple copies of the same resource (one
in the zip, one outside), neither of which is good.

I am concerned about the instruction to prefer the packaged resources to any
separate resources.  This seems to increase the maintenance burden since you
can never incrementally override the contents of a package, but always have
to repackage.

One of your stated goals is to avoid downloading resources you already have,
but even with manifests, I see no way to do this, since the client can't
actually tell the server "only send items x, y, and z."

If an author has resources only used on some pages, then he can either make
multiple packages (more maintenance burden and exacerbates problem above),
or include everything in one package (may result in downloading excessive
resources for pages where clients don't need them).

You note that SPDY has to be implemented by both UAs and web servers, but
conversely this proposal needs to be implemented by UAs and _authors_.  I
would rather burden the guys writing Apache than the guys making webpages,
and I think if a technique is extremely useful, it's easier to get support
into Apache than into, say, 50% of the webpages out there.

PK


Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-17 Thread James Robinson
On Tue, Nov 17, 2009 at 2:19 PM, Alexander Limi l...@mozilla.com wrote:

 Good people of Webkit!

 We'd all like for the web to be faster, and therefore I'd love your feedback 
 on my proposal — it would be great to see support for this in additional 
 browsers, not just Firefox:

 http://limi.net/articles/resource-packages/

 Summary:
 What if there was a backwards compatible way to transfer all of the resources 
 that are used on every single page in your site — CSS, JS, images, anything 
 else — in a single HTTP request at the start of the first visit to the page? 
 This is what Resource Package support in browsers will let you do.

 Looking forward to hearing your thoughts on this.

It seems like a browser will have to essentially stop rendering until
it has finished downloading the entire .zip and examined it.  This
will most likely slow down the time taken to render parts of the page
as they arrive. From the blog post:

"A given browser will probably block downloading any resources until
the lists of files that are available in resource packages have been
accounted for — or there may be a way to do opportunistic requests or
similar, we leave this up to the browser vendor unless there’s a
compelling reason to specify how this should work."

This also means that a browser would have to stop tokenizing the HTML
when it hits the next <script src=...> tag, since it would be unable to
know whether the JavaScript was in the bundled zip or not.  This seems to
go against the goal of rendering as much of the page as fast as
possible.
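
A minimal sketch of that stall (file names hypothetical): until site.zip's
manifest has been read, the parser can't tell whether analytics.js should
come from the package or from the network:

<link rel="resource-package" href="site.zip">  <!-- manifest not yet available -->
<script src="analytics.js"></script>           <!-- in the zip or not? must wait -->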

- James


 Thanks!

 --
 Alexander Limi · Firefox User Experience · http://limi.net





Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-17 Thread Peter Kasting
On Tue, Nov 17, 2009 at 3:00 PM, James Robinson jam...@google.com wrote:

 It seems like a browser will have to essentially stop rendering until
 it has finished downloading the entire .zip and examined it.


I think mitigating this is why there are optional manifests.  I agree that
if there's no manifest, this is really, really painful.  I think manifests
should be made mandatory.

PK


Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-17 Thread Simon Fraser

On Nov 17, 2009, at 3:02 PM, Peter Kasting wrote:

On Tue, Nov 17, 2009 at 3:00 PM, James Robinson jam...@google.com  
wrote:

It seems like a browser will have to essentially stop rendering until
it has finished downloading the entire .zip and examined it.

I think mitigating this is why there are optional manifests.  I  
agree that if there's no manifest, this is really, really painful.   
I think manifests should be made mandatory.


If you require a manifest, why not pick an archive format where  
there's a TOC which is guaranteed to be at the head of the file, which  
the browser can parse without having to wait for the entire file to  
download?


In my not very extensive reading of pages on the zip format, it  
appears that the Central Directory is always at the end of the file,  
which is not very friendly to progressive download of these files.


Simon



Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-17 Thread James Robinson
On Tue, Nov 17, 2009 at 3:02 PM, Peter Kasting pkast...@google.com wrote:
 On Tue, Nov 17, 2009 at 3:00 PM, James Robinson jam...@google.com wrote:

 It seems like a browser will have to essentially stop rendering until
 it has finished downloading the entire .zip and examined it.

 I think mitigating this is why there are optional manifests.  I agree that
 if there's no manifest, this is really, really painful.  I think manifests
 should be made mandatory.
 PK

Do you mean external manifests?  Either way, the browser cannot start
a download for any external resource until it downloads and parses out
the manifest.txt for every resource bundle seen in the page so far.
Whether it is pulling the manifest out of a .zip file or as a .txt by
itself doesn't matter much; it's still an extra HTTP round-trip before
any content can be downloaded (including content that is not bundled
at all).

- James


Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-17 Thread Peter Kasting
On Tue, Nov 17, 2009 at 3:14 PM, James Robinson jam...@google.com wrote:

 On Tue, Nov 17, 2009 at 3:02 PM, Peter Kasting pkast...@google.com
 wrote:
  On Tue, Nov 17, 2009 at 3:00 PM, James Robinson jam...@google.com
 wrote:
  It seems like a browser will have to essentially stop rendering until
  it has finished downloading the entire .zip and examined it.
 
  I think mitigating this is why there are optional manifests.  I agree
 that
  if there's no manifest, this is really, really painful.  I think
 manifests
  should be made mandatory.
  PK

 Do you mean external manifests?  Either way, the browser cannot start
 a download for any external resource until it downloads and parses out
 the manifest.txt for every resource bundle seen in the page so far.


Well, it can start the downloads, but it might have to throw them away, and
it definitely can't actually make use of their contents.  In this way it's a
bit like the current implementation of optimistic parallel script fetching.


 Whether it is pulling the manifest out of a .zip file or as a .txt by
 itself doesn't matter much, it's still an extra HTTP round-trip before
 any content can be downloaded (including content that is not bundled
 at all).


True.  It's just not a delay until the .zip file has completed its download.

I agree that even with a manifest, this is suboptimal.

I think the decision to incorporate this in Fx 3.7 may be premature.

PK


Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-17 Thread Anthony RICAUD
I don't see much value in this proposal.

For CSS and JS, people will need to run a script to generate the ZIP
file. That's what they already do when combining files. And they get
more value by combining since they get performance improvements for
all browsers, not just the one supporting resource-packages.

For CSS sprites, the W3C is working on making things better with
http://www.w3.org/TR/css3-images/. Maybe you should put some effort into
that spec. And again, with the current approach, the code may be ugly, but
you get performance benefits in all browsers.

The only remaining benefit would be for inline images. But you can't
generate a static ZIP file for content that changes frequently, so I don't
see many people taking advantage of this proposal for that kind of
content. Thinking out loud, maybe something like <img src="file.png"
from="file.zip"> would be more appropriate for that particular kind of
content. This way, browsers would know up front whether a file can be found
in a package.
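
A sketch of that idea in context (the from attribute is the hypothetical
suggestion above, not part of the proposal; file names invented): the
attribute tells the parser exactly which references are packaged, so
everything else can be fetched without waiting on a manifest:

<link rel="resource-package" href="thumbs.zip">
<img src="thumb-001.png" from="thumbs.zip">
<img src="thumb-002.png" from="thumbs.zip">
<img src="banner.png">  <!-- no from attribute: fetched normally -->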

I sure like the idea of trying to reduce the number of HTTP requests,
but I don't think it's the right solution.

On Tue, Nov 17, 2009 at 11:19 PM, Alexander Limi l...@mozilla.com wrote:
 Good people of Webkit!

 We'd all like for the web to be faster, and therefore I'd love your feedback
 on my proposal — it would be great to see support for this in additional
 browsers, not just Firefox:

 http://limi.net/articles/resource-packages/

 Summary:
 What if there was a backwards compatible way to transfer all of the
 resources that are used on every single page in your site — CSS, JS, images,
 anything else — in a single HTTP request at the start of the first visit to
 the page? This is what Resource Package support in browsers will let you do.

 Looking forward to hearing your thoughts on this.

 Thanks!

 --
 Alexander Limi · Firefox User Experience · http://limi.net






Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-17 Thread Alexander Limi
 (Adding in some of the people involved with Resource Packages earlier to
this thread, so they can help me out — I'm just a lowly UI designer, so some
of these questions have to be answered by people that know how browsers
work. I'm just the messenger. Hope you don't mind, guys, and remember that
webkit-dev requires you to sign up before you can post.)

On Tue, Nov 17, 2009 at 2:44 PM, Peter Kasting pkast...@google.com wrote:

 I have read the whole document, but I read it quickly, so please do point
 out places where I've overlooked an obvious response.


This is what everyone does, so no worries, happy to clarify. 95% of the
"this is why this won't work" statements are actually answered by the
article in some way. But I guess I shouldn't be surprised. :)


 Reduced parallelism is a big concern of mine.  Lots of sites make heavy use
 of resource sharding across many hostnames to take advantage of multiple
 connections, which this defeats.


If you package up everything in a single zip file, yes. Realistically, if
you have a lot of resources, you'd want to spread them out over several
packages to increase parallelism. Also, there are usually resources that are
page-specific (e.g. they belong to the article being rendered). As with
everything, there are ways to use this feature badly, and packaging up
everything in one zip file will definitely hurt parallelism. Don't do that.
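
A sketch of what that kind of split might look like (hostnames and package
names are hypothetical; exactly how packages map onto individual resource
URLs is left open here, since the spec's scoping rules are still being
discussed):

<link rel="resource-package" href="http://static1.example.com/core-styles.zip">
<link rel="resource-package" href="http://static2.example.com/core-scripts.zip">
<link rel="stylesheet" href="http://static1.example.com/css/core.css">
<script src="http://static2.example.com/js/core.js"></script>
<link rel="stylesheet" href="/article-42.css">  <!-- page-specific, not packaged -->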

I am concerned about the instruction to prefer the packaged resources to any
 separate resources.  This seems to increase the maintenance burden since you
 can never incrementally override the contents of a package, but always have
 to repackage.


This is something we could look at, of course. There are easy ways to
invalidate the zip using ETags etc.
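
For example (this is just standard HTTP caching, nothing package-specific;
the path and ETag value below are invented): a conditional request lets the
browser revalidate a cached package cheaply, and the server only resends the
zip when it has actually changed:

GET /static/site-resources.zip HTTP/1.1
Host: www.example.com
If-None-Match: "pkg-v42"

HTTP/1.1 304 Not Modified
ETag: "pkg-v42"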


 If an author has resources only used on some pages, then he can either make
 multiple packages (more maintenance burden and exacerbates problem above),
 or include everything in one package (may result in downloading excessive
 resources for pages where clients don't need them).


I don't think it's unreasonable to expect most big sites to have a standard
core of resources they use everywhere. It's important not to try to put
*everything* in resource packages, just the stuff that should be present
everywhere (and the specialized thumbnail search result case I mentioned).


 You note that SPDY has to be implemented by both UAs and web servers, but
 conversely this proposal needs to be implemented by UAs and _authors_.  I
 would rather burden the guys writing Apache than the guys making webpages,
 and I think if a technique is extremely useful, it's easier to get support
 into Apache than into, say, 50% of the webpages out there.


There's no damage if you *don't* do this as a web author. If you care enough
to do CSS spriting and CSS/JS combining, this gives you a more maintainable,
easier, faster solution.

On Tue, Nov 17, 2009 at 3:00 PM, James Robinson jam...@google.com wrote:

 It seems like a browser will have to essentially stop rendering until
  it has finished downloading the entire .zip and examined it.


No. That's why the manifest is there, since it can be read early on, so the
browser doesn't have to block.

I see a lot of "I don't think this will work" or "I don't think this will be
any faster" here. I guess I should get someone to help me create some
reasonable benchmarks and show what the difference would be. Maybe Steve
Souders or someone else that is better at this stuff than me can help out
with some data.

On Tue, Nov 17, 2009 at 3:02 PM, Peter Kasting pkast...@google.com wrote:

 I think mitigating this is why there are optional manifests.  I agree that
 if there's no manifest, this is really, really painful.  I think manifests
 should be made mandatory.


The manifests *are* mandatory. Without a manifest, the package is ignored
(i.e. the browser just loads the resources as usual), since waiting on a
package without a manifest would block page loads, which is not an option.


On Tue, Nov 17, 2009 at 3:12 PM, Simon Fraser simon.fra...@apple.com wrote:

 If you require a manifest, why not pick an archive format where there's a
 TOC which is guaranteed to be at the head of the file, which the browser can
 parse without having to wait for the entire file to download?


If there are other formats that can a) be streamed and unpacked in partial
state, and b) are common enough that people will actually be able to use them,
let me know.

The tar format is sequential, and (I think) has the header first, but
doesn't do compression. If you add gzip to that, you can't partially unpack,
which will block page downloads. You could of course argue that using only
tar (without gzip) could work, and I think we're open to supporting that, if
those assumptions are correct — I haven't looked at the details for that
yet.

-- 
Alexander Limi · Firefox User Experience · http://limi.net


Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-17 Thread Alex Russell
On Tue, Nov 17, 2009 at 3:00 PM, James Robinson jam...@google.com wrote:
 On Tue, Nov 17, 2009 at 2:19 PM, Alexander Limi l...@mozilla.com wrote:

 Good people of Webkit!

 We'd all like for the web to be faster, and therefore I'd love your feedback 
 on my proposal — it would be great to see support for this in additional 
 browsers, not just Firefox:

 http://limi.net/articles/resource-packages/

 Summary:
 What if there was a backwards compatible way to transfer all of the 
 resources that are used on every single page in your site — CSS, JS, images, 
 anything else — in a single HTTP request at the start of the first visit to 
 the page? This is what Resource Package support in browsers will let you do.

 Looking forward to hearing your thoughts on this.

 It seems like a browser will have to essentially stop rendering until
 it has finished downloading the entire .zip and examined it.

I think that's not entirely true. In zip archives the manifest comes
first and can be examined while the rest of the body is still
downloading.

  This
 will most likely slow down the time taken to render parts of the page
 as they arrive. From the blog post:

 A given browser will probably block downloading any resources until
 the lists of files that are available in resource packages have been
 accounted for — or there may be a way to do opportunistic requests or
 similar, we leave this up to the browser vendor unless there’s a
 compelling reason to specify how this should work.

 This also means that a browser would have to stop tokenizing the HTML
 when it hits the next script src= tag, since it would be unable to
 know if the javascript was in the bundled zip or not.  This seems to
 go against the idea that as much of the page be rendered as fast as
 possible.

 - James


 Thanks!


Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-17 Thread James Robinson
On Tue, Nov 17, 2009 at 5:36 PM, Alexander Limi l...@mozilla.com wrote:
 (Adding in some of the people involved with Resource Packages earlier to
 this thread, so they can help me out — I'm just a lowly UI designer, so
some
 of these questions have to be answered by people that know how browsers
 work. I'm just the messenger. Hope you don't mind, guys, and remember that
 webkit-dev requires you to sign up before you can post.)

 On Tue, Nov 17, 2009 at 2:44 PM, Peter Kasting pkast...@google.com
wrote:

 I have read the whole document, but I read it quickly, so please do point
 out places where I've overlooked an obvious response.

 This is what everyone does, so no worries, happy to clarify. 95% of the
 this is why this won't work statements are actually answered by the
 article in some way. But I guess I shouldn't be surprised. :)


 Reduced parallelism is a big concern of mine.  Lots of sites make heavy
 use of resource sharding across many hostnames to take advantage of
multiple
 connections, which this defeats.

 If you package up everything in a single zip file, yes. Realistically, if
 you have a lot of resources, you'd want to spread them out over several
 files to increase parallelism. Also, there's usually resources that are
 page-specific (e.g. belong to the article being rendered). As with
 everything, there are possibilities to use this the wrong way, and
packaging
 up everything in one zip file will definitely affect parallelism. Don't do
 that.

If the contents are spread across N zip files then the browser still has to
download (at least part of) N files in order to see all the manifests before
it can start fetching other resources.  The page-specific resources end up
getting blocked behind all of the manifest downloads.  If resource bundles
are allowed to include other resource bundles (and I see nothing in the spec
about this), then each of the N downloads would have to be made serially
since the browser would have to check the manifest of each bundle to see if
it includes any of the remaining ones.

These concerns would be smaller if the author could declare the contents
of the manifest in the HTML itself to avoid an extra download, or could give
some sort of explicit signal to the browser that a given resource is not in
any resource bundle.  The downside of this is that it increases the HTML's
size even more, which is a big loss on browsers that do not support this.


 I am concerned about the instruction to prefer the packaged resources to
 any separate resources.  This seems to increase the maintenance burden
since
 you can never incrementally override the contents of a package, but
always
 have to repackage.

 This is something we could look at, of course. There are easy ways to
 invalidate the zip using ETags etc.


 If an author has resources only used on some pages, then he can either
 make multiple packages (more maintenance burden and exacerbates problem
 above), or include everything in one package (may result in downloading
 excessive resources for pages where clients don't need them).

 I don't think it's unreasonable to expect most big sites to have a
standard
 core of resources they use everywhere. It's important not to try to put
 *everything* in resource packages, just the stuff that should be present
 everywhere (and the specialized thumbnail search result case I mentioned).


 You note that SPDY has to be implemented by both UAs and web servers, but
 conversely this proposal needs to be implemented by UAs and _authors_.  I
 would rather burden the guys writing Apache than the guys making
webpages,
 and I think if a technique is extremely useful, it's easier to get
support
 into Apache than into, say, 50% of the webpages out there.

 There's no damage if you don't do this as a web author. If you care enough
 to do CSS spriting and CSS/JS combining, this gives you a more
maintainable,
 easier, faster solution.

 On Tue, Nov 17, 2009 at 3:00 PM, James Robinson jam...@google.com
wrote:

 It seems like a browser will have to essentially stop rendering until
 it has finished downloading the entire .zip and examined it.

 No. That's why the manifest is there, since it can be read early on, so
the
 browser doesn't have to block.

 I see a lot of I don't think this will work or I don't think this will
be
 any faster here. I guess I should get someone to help me create some
 reasonable benchmarks and show what the difference would be. Maybe Steve
 Souders or someone else that is better at this stuff than me can help out
 with some data.

Yes, actual numbers would be nice to have.

- James


 On Tue, Nov 17, 2009 at 3:02 PM, Peter Kasting pkast...@google.com
wrote:

 I think mitigating this is why there are optional manifests.  I agree
that
 if there's no manifest, this is really, really painful.  I think
manifests
 should be made mandatory.

 The manifests *are* mandatory. Without a manifest, it won't do anything
(ie.
 proceed to load the resources as usual), since that 

Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-17 Thread Alexander Limi
 On Tue, Nov 17, 2009 at 5:53 PM, James Robinson jam...@google.com wrote:

 Yes, actual numbers would be nice to have.


Steve Souders just emailed me some preliminary numbers from a bunch of major
web sites, so that should be on his blog shortly.

-- 
Alexander Limi · Firefox User Experience · http://limi.net


Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-17 Thread Peter Kasting
On Tue, Nov 17, 2009 at 5:36 PM, Alexander Limi l...@mozilla.com wrote:

 On Tue, Nov 17, 2009 at 2:44 PM, Peter Kasting pkast...@google.com
  wrote:

 Reduced parallelism is a big concern of mine.  Lots of sites make heavy
 use of resource sharding across many hostnames to take advantage of multiple
 connections, which this defeats.


 If you package up everything in a single zip file, yes. Realistically, if
 you have a lot of resources, you'd want to spread them out over several
 files to increase parallelism. Also, there's usually resources that are
 page-specific (e.g. belong to the article being rendered). As with
 everything, there are possibilities to use this the wrong way, and packaging
 up everything in one zip file will definitely affect parallelism. Don't do
 that.


But at this point it's not clear what the site author should do.  *Any*
packaging reduces parallelism somewhat.  How much should you reduce it?
 Best practices vary dramatically depending on the details of the user's
connection. I realize some of these issues already exist when sites try to
determine how to shard their resource servers, but if you want to split your
resources among several packages, you can already put all the images in one
file, all the scripts in one file, etc., today, and this proposal doesn't
buy terribly much over that.  (Plus it has costs; see below.)

You note that SPDY has to be implemented by both UAs and web servers, but
 conversely this proposal needs to be implemented by UAs and _authors_.  I
 would rather burden the guys writing Apache than the guys making webpages,
 and I think if a technique is extremely useful, it's easier to get support
 into Apache than into, say, 50% of the webpages out there.


 There's no damage if you *don't* do this as a web author. If you care
 enough to do CSS spriting and CSS/JS combining, this gives you a more
 maintainable, easier, faster solution.


Neither proposal does harm when people don't implement it.  What I am saying
is that there's much more of a burden to try and get this to happen on a
per-site-author basis than a per-web-server-codebase basis.  And with a
technique that needs expertise not to backfire, I'm definitely interested in
not forcing each site author to make individual decisions about how to use
it.

I think mitigating this is why there are optional manifests.  I agree that
 if there's no manifest, this is really, really painful.  I think manifests
 should be made mandatory.


 The manifests *are* mandatory. Without a manifest, it won't do anything
 (ie. proceed to load the resources as usual), since that would block page
 loads, which is not an option.


Your doc explicitly says manifests are optional: "To give the browser the
ability to know up front what files are in the zip file without reading the
entire file first, we support an *optional* manifest file that can contain
this information." (emphasis mine)

As I noted, even with a manifest, you're introducing extra overhead before
the browser knows how to handle other referenced resources, although it is
only the overhead of contacting the web server and obtaining the manifest,
rather than the overhead of obtaining the entire bundle.  James Robinson
does a good job in his latest message of covering some of the issues here in
more detail.

In the end, the initial proposal comes across a bit like "just bundle
everything up in one archive!", but as you note, doing that will _harm_ page
load speeds in many cases.  The actual usage of this feature needs to be
carefully considered by site authors, and ends up providing capabilities
very similar to spriting and combining script files, except with the
additional problem that the browser has to obtain manifests before it knows
how to process any resources referenced in the document.  This just doesn't
feel like a very good tradeoff to me.

I agree with everyone who would like to see numbers.  Of course, good
measurements here are extremely hard (as we've found while working on SPDY),
so I suspect providing meaningful, reliable ones that cover all the relevant
cases might take quite a bit of doing.  I will be interested to see what
Steve Souders has come up with, and especially what sort of conditions lead
to the numbers he has.

PK


Re: [webkit-dev] Making browsers faster: Resource Packages

2009-11-17 Thread Alexander Limi
 On Tue, Nov 17, 2009 at 6:01 PM, Peter Kasting pkast...@google.com wrote:

 Your doc explicitly says manifests are optional: "To give the browser the
 ability to know up front what files are in the zip file without reading the
 entire file first, we support an *optional* manifest file that can contain
 this information." (emphasis mine)


That's a leftover from the old proposal. Fixed.


-- 
Alexander Limi · Firefox User Experience · http://limi.net