How many people regularly use compression (either from the web server or manually, using something like cfx_gzip) when serving up pages?
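For what it's worth, the negotiation logic amounts to checking the client's Accept-Encoding request header and only gzipping when the client advertises support. Here's a minimal sketch of that idea in Python rather than CFML (the function name is hypothetical, purely for illustration):

```python
import gzip

def negotiate_body(accept_encoding: str, body: bytes):
    """Return (body, headers), gzipping only when the client's
    Accept-Encoding header advertises gzip support."""
    # Tokenise the header value; quality values (";q=...") are
    # ignored here for brevity.
    encodings = {token.split(";")[0].strip().lower()
                 for token in accept_encoding.split(",") if token.strip()}
    if "gzip" in encodings:
        return gzip.compress(body), {"Content-Encoding": "gzip",
                                     "Vary": "Accept-Encoding"}
    # Fall back to the uncompressed body for clients (or spiders)
    # that don't say they accept gzip.
    return body, {"Vary": "Accept-Encoding"}
```

Whichever way you do it, sending "Vary: Accept-Encoding" matters: it stops intermediate caches handing a gzipped body to a later client (or crawler) that never asked for one.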
Also, does this adversely affect spiders (e.g. Googlebot)? I used cfx_gzip on an intranet in the past to great effect, but I remember having trouble with different user agents once the technique was opened up to the web populace as a whole. I'm now looking at checking cgi.http_accept_encoding and serving up compressed or uncompressed content accordingly, but I just wanted to canvass people's opinions and experiences with this first... ta.

-------------------------------------------------------
Rich Wild
Senior Web Developer
-------------------------------------------------------
e-mango                    Tel: 01202 755 300
Gild House                 Fax: 01202 755 301
74 Norwich Avenue West     Mailto:[EMAIL PROTECTED]
Bournemouth                http://www.e-mango.com
BH2 6AW, UK
-------------------------------------------------------

--
** Archive: http://www.mail-archive.com/dev%40lists.cfdeveloper.co.uk/
