2009/9/16 skaiuoquer <skaiuoq...@gmail.com>:
> disccomp, Richard, Alex; thanks for your replies.
> The problem with minified compressed versions is that they will never
> get cached on the client side.
> I mean, they will, but you will still be sending each one of the
> different "combinations"--so instead of loading prototype.js once, the
> user will be loading it (albeit compressed) once for every different
> combination.
> This is a problem, since, like I said, this is a web application that
> is meant to be used several times throughout the day, and taking
> advantage of browser caching is quite important for us.
> Those ideas are part of the advice that YSlow proposes--and they are
> quite good, thanks =)
> I will look into those solutions, but, I'm not fully convinced yet
> that they are entirely adequate for this particular problem.
> (Once again, thanks for your responses, guys.)
> On Sep 16, 2:31 pm, "Alex McAuley" <webmas...@thecarmarketplace.com>
>> I wouldn't know; I did this about 6 months ago and have never heard of
>> Alex McAuley
>> http://www.thevacancymarket.com
>> ----- Original Message -----
>> From: "Jarkko Laine" <jarks...@gmail.com>
>> To: <firstname.lastname@example.org>
>> Sent: Wednesday, September 16, 2009 6:24 PM
>> > On 16.9.2009, at 20.20, Alex McAuley wrote:
>> >> Well....
>> >> The page downloads linearly (line by line), so it depends...
>> >> If you reference your JS via src="/path/to/file.js" and have many of
>> >> these files, then you will be stunted by the browser stopping its
>> >> parallel downloads while each script loads.
>> >> However... if you do some smart JS thinking (like I did!) you can
>> >> make things very fast indeed...
>> >> Basically, what I did was:
>> >> Created a file that minifies my JS files on the fly and outputs one
>> >> long, whitespace-stripped minified string back to the browser, which
>> >> includes all of them.
>> > Like http://getsprockets.com/?
>> > //jarkko
>> > --
>> > Jarkko Laine
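The combine-and-strip approach Alex describes can be sketched roughly as below. This is a minimal illustration in Python, not his actual PHP code; the file names and the comment-stripping rules are assumptions, and a real deployment would use a proper minifier (e.g. YUI Compressor or Sprockets) rather than regexes.

```python
# Rough sketch of "concatenate several JS files into one minified string".
# The regex stripping here is deliberately crude and illustrative only.
import re

def combine_js(paths):
    """Concatenate JS files into one string, removing block comments,
    full-line comments, and blank lines (a stand-in for a real minifier)."""
    out = []
    for path in paths:
        with open(path) as f:
            src = f.read()
        src = re.sub(r"/\*.*?\*/", "", src, flags=re.S)   # block comments
        # Only strip comments that occupy a whole line, so URLs like
        # "http://..." inside code are left untouched.
        src = re.sub(r"^\s*//.*$", "", src, flags=re.M)
        src = re.sub(r"\n\s*\n", "\n", src)               # blank lines
        out.append(src.strip())
    return "\n".join(out)
```

Serving the result of one such call (instead of many `<script src>` tags) is what avoids the parallel-download stalls mentioned above.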
If I only need to download the entire set of JS once, then EVERY
single page will benefit.
No matter how much of the library I use.
Even if I don't use any.
But, assuming that I do have different sets (with a large amount of
redundancy between them), I would only need to download each set once.
Caching would look after that for me.
The PHP based combinator I use works very well for me.
Each new visitor will get a handful of pre-gzipped CSS and JS files as
they move around the various sites on our intranet.
Once they've visited a page, they never download the CSS or JS again.
That's the point.
You can't magically provide the data without them actually downloading it.
But cache what you can and make the rest serve as fast as you can.
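The "cache what you can" half of that can be sketched as the headers a combinator sends with each pre-gzipped asset. The helper below is an illustration, not the PHP combinator mentioned above; the header names are standard HTTP, but the one-year max-age and MD5 ETag are assumptions about a typical setup.

```python
# Sketch: build a gzipped body plus headers that let returning visitors
# hit their browser cache (Cache-Control) or revalidate cheaply (ETag).
import gzip
import hashlib
from email.utils import formatdate

def cache_headers(body: bytes, max_age: int = 31536000):
    """Return (gzipped body, headers) for a long-cached static asset."""
    gz = gzip.compress(body)
    headers = {
        "Content-Encoding": "gzip",
        "Content-Length": str(len(gz)),
        "Cache-Control": f"public, max-age={max_age}",  # one year, assumed
        "ETag": '"' + hashlib.md5(body).hexdigest() + '"',
        "Last-Modified": formatdate(usegmt=True),
    }
    return gz, headers
```

With headers like these, the first visit downloads the combined file once and every later page view is served from the browser cache, which is exactly the behaviour described above.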
"Standing on the shoulders of some very clever giants!"
EE : http://www.experts-exchange.com/M_248814.html
Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498&r=213474731
ZOPA : http://uk.zopa.com/member/RQuadling
You received this message because you are subscribed to the Google Groups
"Prototype & script.aculo.us" group.