Re: [webkit-dev] Question regarding priorities of subresource content retrieval

2011-02-08 Thread Jerry Seeger
My argument is less "it's the Web developer's fault" than "the Web developer 
should have control." I am hardly a sophisticated Web developer, but I have 
javascript from a different domain that must be loaded first, and I have 
Google Analytics, which I should load after the rest of the page (though to be 
honest I'm not sure I do after my redesign... hm). While I would love it if 
there were standardized rules for which scripts would be loaded synchronously 
and which wouldn't, I would hate it if one browser required me to move my 
scripts to a different domain.

Having said all that, I hate it when I have to wait for a resource outside 
of my control, so I'd love to see a solution to this. If there were a more 
reliable way than simple domain checking to prioritize content, that would be 
fantastic. I think ideally this is something for the standards board - perhaps 
an extension of the script and link tags to specify a priority, or something 
like that.
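
To sketch what I mean (this priority attribute is purely hypothetical, not 
part of any standard):

    <!-- hypothetical "priority" attribute, illustrative only -->
    <link rel="stylesheet" href="/main.css" priority="high">
    <script src="/menu.js" priority="high"></script>
    <script src="http://ads.example.com/banner.js" priority="low"></script>

The browser could then block on the high-priority resources and quietly 
defer the low-priority ones.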

Jerry


On Feb 8, 2011, at 2:23 AM, Silvio Ventres wrote:

 This argument - that the web developer is to blame for choosing a slow
 ad/tracking/etc. server - is incorrect.
 Web developers in general do not have any control over the ad provider
 or, frankly, any other type of external functionality provider.
 Google Analytics is a good case in point: you would not want most
 of the world's web pages to suddenly hang if something happens inside
 Google.
 
 The web browser should clearly prioritize developer-controllable
 resources over ones that are beyond the web developer's control.
 Also, as an application run by the user and not by the developer, the
 browser should arguably prioritize actual content over
 pseudo-content whose purpose is functionality that is not visible to
 the actual user, such as ad/tracker scripts. Based on current trends,
 this actual content is more likely to be important when it is sourced
 from the domain or a subdomain of the webpage itself.
 
 A domain check is a reasonable approximation that fits both purposes.
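 
 To make the heuristic concrete, it amounts to roughly this (an
 illustrative sketch only, not actual browser code):
 
   // Hypothetical helper: treat a subresource as developer-controlled
   // (high priority) only when its host is the page's host or a
   // subdomain of it.
   function isDeveloperControlled(resourceUrl, pageUrl) {
     var resourceHost = new URL(resourceUrl).hostname;
     var pageHost = new URL(pageUrl).hostname;
     return resourceHost === pageHost ||
            resourceHost.endsWith('.' + pageHost);
   }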
 
 --
 silvio
 
 
 On Tue, Feb 8, 2011 at 5:13 AM, Jerry Seeger vikin...@mac.com wrote:
 I'm reasonably sure that javascript in the header must be loaded 
 synchronously, as it might affect the rest of the load. This is why tools 
 like YSlow advise Web designers to defer javascript loads that are not 
 needed for rendering until after the rest of the page loads.
 
 Blocking on loading the CSS is less clear-cut, as in some cases it could 
 mean several seconds of ugly page. I don't know if it's right or wrong, but 
 a lot of pages out there rely on the CSS being loaded before the page starts 
 to render, to avoid terrible layout and the appearance of items meant to be 
 hidden during the seconds it takes the CSS to load.
 
 In general, while things could certainly be improved, it's up to the owner 
 of the page not to rely on a slow ad server, or to build the page so the ads 
 load after the primary content.
 
 Jerry Seeger
 
 
 On Feb 7, 2011, at 5:47 PM, Silvio Ventres wrote:
 
 IE/Opera delay only for 4 seconds, same as Mobile Safari.
 The reason looks to be the URL for the script/css.
 If the same URL appears twice, Chrome/Firefox serialize the requests,
 while IE/Opera/Mobile Safari launch both requests simultaneously.
 
 Of course, requesting simultaneously doesn't fix anything, as you can
 see by trying a link-stuffed version at
 http://solid.eqoppa.com/testlag2.html
 
 This one has 45 CSS and 38 javascript links. It hangs all browsers nicely.
 The main point here is that this might be acceptable if the content were
 coming from the webpage's own domain.
 But the links are coming from a completely different place.
 
 This is exactly what makes browsing pages with any third-party
 analytics, tracking or ad add-ons so slow and frustrating.
 Fixing priorities in subresource download should make the experience
 considerably more interactive and fun.
 
 --
 silvio
 
 

___
webkit-dev mailing list
webkit-dev@lists.webkit.org
http://lists.webkit.org/mailman/listinfo.cgi/webkit-dev


Re: [webkit-dev] Question regarding priorities of subresource content retrieval

2011-02-08 Thread Jerry Seeger
I'm still fiddling with the scripts on muddledramblings.com after a redesign, 
but I intend to move static resources to a cookieless domain to improve 
performance. This is a pretty common tactic - sort of a poor man's CDN. The key 
is that I can decide to do this. (Yes, I could rearrange my site and use 
www.muddledramblings.com and cookieless.muddledramblings.com, but you're making 
me do things a different way to support one Web browser.)
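
For example (hypothetical hostnames), the markup would pull static assets from 
a second, cookie-free host:

    <!-- assets from a host that never sets cookies, keeping requests small -->
    <link rel="stylesheet" href="http://static.example.com/css/site.css">
    <script src="http://static.example.com/js/site.js"></script>
    <img src="http://static.example.com/images/header.jpg" alt="">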

(On a side note, muddledramblings.com's biggest performance problem right now 
is the host. Don't use iPage. /rant)

Keep in mind that scripts not executing when expected can totally break a site, 
not just make it less pleasant. A script that generates content must be 
executed in a predictable fashion no matter where it came from. Long ago I had 
a moon phase widget that generated content, and raised hell on browsers that 
did not block correctly when the script loaded. (I once had a widget with a 
script that generated a script. The results were... inconsistent.) These days 
all browsers block correctly and the Web is a better place for it.

I can't see telling Web designers, "If your script uses document.write, it must 
come from the same domain or a known whitelist." (And let's hope the latency of 
the whitelist server is really low.) I can't see telling Joe Blogger why the 
visitor counter in his sidebar now writes the number at the bottom of the page.
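
To make that concrete, picture a hypothetical counter widget embedded inline:

    <p>Visitors so far:
    <script src="http://widgets.example.com/counter.js"></script>
    </p>

where counter.js does something like:

    // visitorCount is supplied elsewhere by the widget; document.write
    // emits its output at the point where the script tag appears, so
    // this is only correct if the script executes where encountered
    document.write('<strong>' + visitorCount + '</strong>');

If a browser quietly deferred that script because it comes from another 
domain, the number would be written wherever the script finally ran.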

The WordPress plugin W3 Super Cache includes features to automate moving static 
content (including scripts) to a separate, cookieless domain. A lot of people 
use the plugin, but I can't speak to how many use the pseudo-CDN feature. My 
guess is not that many, but the ones who do will expect their scripts to 
execute where encountered, before the rest of the page loads, as mandated by 
the standards.

The Web designer can already cause scripts to load after the rest of the 
page (the above plugin automates this as well). Were I to run ads, you can bet 
that those scripts would not be loaded in the header (well, if I weren't lazy 
you could bet it). If I'm not already loading Google Analytics late, it's 
because I haven't yet finalized my script strategy.
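
For what it's worth, the late-loading technique is just dynamic script 
injection - roughly what Google's asynchronous Analytics snippet does (a 
sketch from memory, not the verbatim snippet):

    // Create the script element from script, so its download never
    // blocks parsing or rendering of the rest of the page.
    var ga = document.createElement('script');
    ga.src = 'http://www.google-analytics.com/ga.js';
    ga.async = true;
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);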

While I would certainly like to see an automated mechanism for setting external 
resource priority - letting me continue in my lazy ways without paying a 
performance price, and making the Web more responsive in general, since most of 
us are lazy - a simple domain check is not adequate when it comes to scripts. I 
wish I could think of an automated way to augment the domain check, but all my 
ideas require knowing what's in the script ahead of time (scripts that only 
define event handlers, for instance).
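
(The existing defer attribute already covers that narrow case - a script the 
developer knows generates no inline content can be marked to run after parsing:

    <!-- safe only because handlers.js merely defines event handlers -->
    <script src="http://other.example.com/handlers.js" defer></script>

but that still depends on the developer knowing and declaring it, which is 
exactly the kind of control I'm arguing for.)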

Jerry Seeger

On Feb 8, 2011, at 9:24 AM, Silvio Ventres wrote:

 Do you have any example of scripts or CSS that are externally sourced,
 where the developer cares to reasonably optimize the web page?
 The main use case for such external scripts currently is ads and
 statistics gatherers for analysis. That, arguably, is not critical
 content that the user is interested in.
 
 If your argument is indeed "the Web developer should have control," then,
 when you have no choice but to include external scripts (ads, for example),
 you would probably hate for those to ruin the latency of your website.
 If you are talking about the http://muddledramblings.com/ website, for
 example, you can clearly see that most scripts there are
 domain-internal.
 Do you deem your user experience more or less important than Google
 Analytics capability? If Google Analytics hangs for 4 seconds, for
 example, would you like the user to wait, or start reading while it
 loads?
 
 A change to the HTML standard might be a good idea, though the problem
 here is that there are millions of pages on the 'net already, and the
 developers won't suddenly start changing them.
 
 This heuristic will allow users to view 90% of the current Web
 more interactively.
 Keep in mind that at least 38% of all statistics is pulled out of thin
 air :), but, really, please, show at least two pages on which this
 heuristic will NOT work.
 
 --
 silvio
 
 On Tue, Feb 8, 2011 at 6:52 PM, Jerry Seeger vikin...@mac.com wrote:
 My argument is less "it's the Web developer's fault" than "the Web 
 developer should have control." I am hardly a sophisticated Web developer, 
 but I have javascript from a different domain that must be loaded first and 
 I have Google Analytics, which I should load after the rest of the page 
 (though to be honest I'm not sure I do after my redesign... hm). While I 
 would love it if there were standardized rules for which scripts would be 
 loaded synchronously and which wouldn't, I would hate it if one browser 
 required me to move my scripts to a different domain.
 
 Having said all that, I hate it when I have to wait for a resource 
 outside of my control, so I'd love to see a solution to this. If there were 
 a more reliable way than simple domain checking to prioritize content

Re: [webkit-dev] Question regarding priorities of subresource content retrieval

2011-02-08 Thread Jerry Seeger
Sorry - the WordPress plugin is W3 Total Cache, not W3 Super Cache. I always 
get those names scrambled.

Jerry

On Feb 8, 2011, at 10:40 AM, Jerry Seeger wrote:

 I'm still fiddling with the scripts on muddledramblings.com after a redesign, 
 but I intend to move static resources to a cookieless domain to improve 
 performance. This is a pretty common tactic - sort of a poor man's CDN. The 
 key is that I can decide to do this. (Yes, I could rearrange my site and use 
 www.muddledramblings.com and cookieless.muddledramblings.com, but you're 
 making me do things a different way to support one Web browser.)
 
 (On a side note, muddledramblings.com's biggest performance problem right now 
 is the host. Don't use iPage. /rant)
 
 Keep in mind that scripts not executing when expected can totally break a 
 site, not just make it less pleasant. A script that generates content must be 
 executed in a predictable fashion no matter where it came from. Long ago I 
 had a moon phase widget that generated content, and raised hell on browsers 
 that did not block correctly when the script loaded. (I once had a widget 
 with a script that generated a script. The results were... inconsistent.) 
 These days all browsers block correctly and the Web is a better place for it.
 
 I can't see telling Web designers, "If your script uses document.write, it 
 must come from the same domain or a known whitelist." (And let's hope the 
 latency of the whitelist server is really low.) I can't see telling Joe 
 Blogger why the visitor counter in his sidebar now writes the number at the 
 bottom of the page.
 
 The WordPress plugin W3 Super Cache includes features to automate moving 
 static content (including scripts) to a separate, cookieless domain. A lot of 
 people use the plugin, but I can't speak to how many use the pseudo-CDN 
 feature. My guess is not that many, but the ones who do will expect their 
 scripts to execute where encountered, before the rest of the page loads, as 
 mandated by the standards.
 
 The Web designer can already cause scripts to load after the rest of the 
 page (the above plugin automates this as well). Were I to run ads, you can 
 bet that those scripts would not be loaded in the header (well, if I weren't 
 lazy you could bet it). If I'm not already loading Google Analytics late, 
 it's because I haven't yet finalized my script strategy.
 
 While I would certainly like to see an automated mechanism for setting 
 external resource priority - letting me continue in my lazy ways without 
 paying a performance price, and making the Web more responsive in general, 
 since most of us are lazy - a simple domain check is not adequate when it 
 comes to scripts. I wish I could think of an automated way to augment the 
 domain check, but all my ideas require knowing what's in the script ahead of 
 time (scripts that only define event handlers, for instance).
 
 Jerry Seeger
 
 On Feb 8, 2011, at 9:24 AM, Silvio Ventres wrote:
 
 Do you have any example of scripts or CSS that are externally sourced,
 where the developer cares to reasonably optimize the web page?
 The main use case for such external scripts currently is ads and
 statistics gatherers for analysis. That, arguably, is not critical
 content that the user is interested in.
 
 If your argument is indeed "the Web developer should have control," then,
 when you have no choice but to include external scripts (ads, for example),
 you would probably hate for those to ruin the latency of your website.
 If you are talking about the http://muddledramblings.com/ website, for
 example, you can clearly see that most scripts there are
 domain-internal.
 Do you deem your user experience more or less important than Google
 Analytics capability? If Google Analytics hangs for 4 seconds, for
 example, would you like the user to wait, or start reading while it
 loads?
 
 A change to the HTML standard might be a good idea, though the problem
 here is that there are millions of pages on the 'net already, and the
 developers won't suddenly start changing them.
 
 This heuristic will allow users to view 90% of the current Web
 more interactively.
 Keep in mind that at least 38% of all statistics is pulled out of thin
 air :), but, really, please, show at least two pages on which this
 heuristic will NOT work.
 
 --
 silvio
 
 On Tue, Feb 8, 2011 at 6:52 PM, Jerry Seeger vikin...@mac.com wrote:
 My argument is less "it's the Web developer's fault" than "the Web 
 developer should have control." I am hardly a sophisticated Web developer, 
 but I have javascript from a different domain that must be loaded first 
 and I have Google Analytics, which I should load after the rest of the page 
 (though to be honest I'm not sure I do after my redesign... hm). While I 
 would love it if there were standardized rules for which scripts would be 
 loaded synchronously and which wouldn't, I would hate it if one browser 
 required me to move my scripts to a different domain.
 
 Having said

Re: [webkit-dev] Question regarding priorities of subresource content retrieval

2011-02-07 Thread Jerry Seeger
I'm reasonably sure that javascript in the header must be loaded synchronously, 
as it might affect the rest of the load. This is why tools like YSlow advise 
Web designers to defer javascript loads that are not needed for rendering until 
after the rest of the page loads.
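
The usual fix is simply to move such scripts from the head to the end of the 
body (illustrative layout; script name hypothetical):

    <body>
      <!-- page content parses and renders first -->
      ...
      <!-- non-rendering scripts moved here, per YSlow's advice -->
      <script src="analytics.js"></script>
    </body>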

Blocking on loading the CSS is less clear-cut, as in some cases it could mean 
several seconds of ugly page. I don't know if it's right or wrong, but a lot of 
pages out there rely on the CSS being loaded before the page starts to render, 
to avoid terrible layout and the appearance of items meant to be hidden during 
the seconds it takes the CSS to load.
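
A trivial illustration of the problem (hypothetical markup): a menu meant to 
start hidden,

    <div id="dropdown-menu">...</div>

with the rule that hides it still in flight in the stylesheet:

    #dropdown-menu { display: none; }

stays fully visible for however long the stylesheet takes to arrive if the 
browser renders without blocking on the CSS.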

In general, while things could certainly be improved, it's up to the owner of 
the page not to rely on a slow ad server, or to build the page so the ads load 
after the primary content.

Jerry Seeger


On Feb 7, 2011, at 5:47 PM, Silvio Ventres wrote:

 IE/Opera delay only for 4 seconds, same as Mobile Safari.
 The reason looks to be the URL for the script/css.
 If the same URL appears twice, Chrome/Firefox serialize the requests,
 while IE/Opera/Mobile Safari launch both requests simultaneously.
 
 Of course, requesting simultaneously doesn't fix anything, as you can
 see by trying a link-stuffed version at
 http://solid.eqoppa.com/testlag2.html
 
 This one has 45 CSS and 38 javascript links. It hangs all browsers nicely.
 The main point here is that this might be acceptable if the content were
 coming from the webpage's own domain.
 But the links are coming from a completely different place.
 
 This is exactly what makes browsing pages with any third-party
 analytics, tracking or ad add-ons so slow and frustrating.
 Fixing priorities in subresource download should make the experience
 considerably more interactive and fun.
 
 --
 silvio

___
webkit-dev mailing list
webkit-dev@lists.webkit.org
http://lists.webkit.org/mailman/listinfo.cgi/webkit-dev