> You're not saying that gzipping and wise pre-fetching and parallel download
> of scripts don't improve page load times. Or are you?

Virtually all servers already serve gzipped versions of text files. You don't need a ZIP for that.
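To see why gzip alone already removes most of the transfer weight, here is a small self-contained sketch (the payload is a made-up repetitive script standing in for a typical JS/CSS file; real servers do the same thing transparently via `Content-Encoding: gzip`):

```python
import gzip

# Illustrative payload: a repetitive script, as text assets usually are.
text = ("function add(a, b) { return a + b; }\n" * 200).encode("utf-8")

compressed = gzip.compress(text)
ratio = len(compressed) / len(text)

print(f"raw: {len(text)} bytes, gzipped: {len(compressed)} bytes "
      f"({ratio:.1%} of original)")
```

Repetitive text like this typically compresses to a few percent of its original size, which is the same saving a ZIP would give you, without the extra packaging step.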

HTTP/2 can send you multiple files in parallel on the same connection: that
way you don't pay, per extra connection, (1) the TCP slow-start cost, (2) the
TLS handshake, and (3) the repeated cookie/user-agent/... header cost.

Under HTTP/2, you can also request the next files while you are still
receiving the current one (or send the requests in batches), which reduces the
extra RTT cost to zero. The server can also decide to push files you didn't
request (à la ZIP), making it totally unnecessary for your site to ask for the
files just to preload them.
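The latency argument above can be sketched with some back-of-the-envelope arithmetic. The numbers here (RTT, file count, connection limit) are illustrative assumptions, not measurements, and transfer time is ignored to isolate the round-trip cost:

```python
import math

RTT_MS = 100   # assumed round-trip time (illustrative)
N_FILES = 20   # resources the page needs
CONNS = 6      # typical HTTP/1.1 per-host connection limit

# HTTP/1.1: one request in flight per connection, so requests queue up
# and each "wave" of 6 requests costs one round trip.
h1_ms = math.ceil(N_FILES / CONNS) * RTT_MS

# HTTP/2: all 20 requests are multiplexed on one connection at once,
# so (ignoring transfer time) everything arrives after one round trip.
h2_ms = 1 * RTT_MS

# Server push: the resources ride along with the first response,
# so they add no extra round trips at all.
push_ms = 0

print(h1_ms, h2_ms, push_ms)  # 400 100 0
```

Even with these toy numbers, the queueing cost of the six-connection model is a 4x difference in round trips, which is exactly the overhead that bundling into a ZIP tries to hide.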

Download priority is negotiated between the browser and the server, instead of
depending on which of the client's 6 connections happens to be free.

The big advantage of the HTTP/2 solution over a ZIP is that your site can
already start loading once only the most important files have arrived, while
with a ZIP you have to wait until the whole archive has been downloaded. From
a performance point of view, that is a real issue. Also, since you can only
start parsing the resources at that point, you concentrate all the CPU work in
one burst. If you can process the files one by one as they arrive, you spread
the load over a much longer time.
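The "spread the CPU load" point is essentially about streaming decompression: instead of waiting for the full archive and decompressing it in one burst, you can feed bytes to the decompressor as they arrive. A minimal sketch with Python's `zlib` (the 256-byte chunks stand in for network packets; real browsers do this internally):

```python
import gzip
import zlib

# Illustrative payload: a script served gzip-compressed.
payload = b"console.log('hi');\n" * 500
blob = gzip.compress(payload)

# Incremental decompression: work is done per chunk as bytes arrive,
# rather than in one burst after the whole download completes.
d = zlib.decompressobj(wbits=31)  # 31 = expect a gzip header
out = bytearray()
for i in range(0, len(blob), 256):
    out.extend(d.decompress(blob[i:i + 256]))
out.extend(d.flush())

assert bytes(out) == payload
```

With a ZIP bundle you lose exactly this property: the central directory sits at the end of the archive, so nothing can be handed to the parser until the last byte is in.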



> In the equation you paint above something important is missing: the fact that
> there's a round-trip delay per request (even with http2.0), and that the only
> way to avoid it is to bundle things, as in .zip bundling, to minimize the
> (number of requests and thus the) impact of latencies.

Go find an HTTP/2 presentation, you'll learn things ;-)
_______________________________________________
es-discuss mailing list
[email protected]
https://mail.mozilla.org/listinfo/es-discuss