My argument is less "it's the Web developer's fault" than it is "the Web 
developer should have control." I am hardly a sophisticated Web developer, but I 
have JavaScript from a different domain that must be loaded first, and I have 
Google Analytics, which I should load after the rest of the page (though to be 
honest I'm not sure I do after my redesign... hm). While I would love it if 
there were standardized rules for which scripts would be loaded synchronously 
and which wouldn't, I would hate it if one browser required me to move my 
scripts to a different domain.
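
For what it's worth, part of the control I mean already exists via the `defer` and `async` attributes on script tags (browser support varies; the file names and domains below are made up for illustration):

```html
<!-- Must run before anything else: leave it blocking, in the head. -->
<script src="https://other-domain.example/required-first.js"></script>

<!-- Not needed for rendering: defer keeps execution order but waits
     until the document has been parsed. -->
<script defer src="/js/enhancements.js"></script>

<!-- Fully independent (e.g. analytics): async downloads in parallel
     and runs whenever it arrives, without blocking parsing. -->
<script async src="https://analytics.example/track.js"></script>
```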

Having said all that, I hate it when I have to wait for a resource outside 
of my control, so I'd love to see a solution to this. If there were a more 
reliable way than simple domain checking to prioritize content, that would be 
fantastic. I think ideally this is something for the standards board - perhaps 
an extension of the script and link tags to specify a priority, or something 
like that. 
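
To sketch what such an extension might look like (this `priority` attribute is purely hypothetical, not part of any standard):

```html
<!-- Hypothetical priority attribute on script and link tags, letting
     the page rank third-party resources below its own content. -->
<link rel="stylesheet" href="/css/site.css" priority="high">
<script src="/js/app.js" priority="high"></script>
<script src="https://ads.example/ad.js" priority="low"></script>
```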


On Feb 8, 2011, at 2:23 AM, Silvio Ventres wrote:

> This argument - "the web developer is to blame for choosing a slow
> ad/tracking/etc server" - is incorrect.
> Web developers in general do not have any control over the ad provider
> or, frankly, any other type of external functionality provider.
> Google Analytics being a good case in point: you would not want most
> of the world's web pages to suddenly hang if something happens inside
> Google.
> The web browser should clearly prioritize developer-controllable
> resources over ones that are beyond the web developer's control.
> Also, as an application run by the user and not by the developer, the
> browser should arguably prioritize actual content over
> pseudo-content whose purpose is functionality that is not visible to
> the actual user, such as ad/tracker scripts. Actual content is
> more likely to be important when sourced from the
> domain/subdomain of the webpage itself, based on current trends.
> A domain check is a reasonable approximation that fits both purposes.
> --
> silvio
> On Tue, Feb 8, 2011 at 5:13 AM, Jerry Seeger <> wrote:
>> I'm reasonably sure that javascript in the header must be loaded 
>> synchronously, as it might affect the rest of the load. This is why tools 
>> like YSlow advise Web designers to defer JavaScript loads that are not 
>> needed for rendering until after the rest of the page loads.
>> Blocking on loading the CSS is less clear-cut, as in some cases it could 
>> mean several seconds of ugly page. I don't know if it's right or wrong, but 
>> a lot of pages out there rely on the CSS being loaded before the page starts 
>> to render, to avoid terrible layout and the appearance of items meant to be 
>> hidden during the seconds it takes the CSS to load.
>> In general, while things could certainly be improved, it's up to the owner 
>> of the page not to rely on a slow ad server, or to build the page so the ads 
>> load after the primary content.
>> Jerry Seeger
>> On Feb 7, 2011, at 5:47 PM, Silvio Ventres wrote:
>>> IE/Opera delay only for 4 seconds, same as Mobile Safari.
>>> The reason looks to be the URL for the script/css.
>>> If the URL is the same twice, Chrome/Firefox serialize the requests,
>>> while IE/Opera/Mobile Safari launch both requests simultaneously.
>>> Of course, requesting simultaneously doesn't fix anything, as you can
>>> see by trying a link-stuffed version at
>>> This one has 45 CSS and 38 JavaScript links. It hangs all browsers nicely.
>>> The main point here is that it might be acceptable if it's coming from
>>> the webpage domain itself.
>>> But the links are coming from a completely different place.
>>> This is exactly what makes browsing pages with any third-party
>>> analytics, tracking or ad addons so slow and frustrating.
>>> Fixing priorities in subresource download should make the experience
>>> considerably more interactive and fun.
>>> --
>>> silvio

webkit-dev mailing list
