On Wed, Sep 02, 2015 at 03:51:12PM +0930, Glen Turner wrote:
> There's not much uncompressed data by volume on the Internet. Sizeable
> data has already been compressed, often by an algorithm which performs
> better than a general-purpose compression algorithm -- JPEG, GIF, PNG,
> MP3, H.264 video, software packages.
>
> In-flight compression of the remaining data is getting less and less
> effective. Consider the movement to serve all HTML over TLS. This data
> can't be compressed in-flight, as an encrypted stream should appear
> random.
also, most web-servers can be (and usually are) configured to compress data
on request from the client, and most browsers request compressed data. ditto
for other kinds of servers -- pretty much anything that routinely transfers
bulk quantities of data has the capability, or at least the option, of
compressing it with algorithms like gzip or lz4.

rsync, for example, typically transfers data to/from remote hosts using ssh,
which encrypts the data stream and can also compress it. and, as you say,
application-level compression like this tends to be better than
transit-level compression.

and trying to re-compress already-compressed data like pictures or videos
usually just wastes CPU power, and sometimes results in negative compression
ratios (i.e. the quantity of data transferred is greater than if you hadn't
recompressed it). the small python sketches below illustrate both points.

craig

--
craig sanders <[email protected]>
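fwiw, here's a minimal python sketch of the kind of content negotiation a
web server does when a browser says it accepts gzip. the HTTP header names
are real, but maybe_compress() and the example body are just made up for
illustration:

    import gzip

    def maybe_compress(body, accept_encoding):
        """compress the response body only if the client asked for gzip."""
        if "gzip" in accept_encoding.lower():
            compressed = gzip.compress(body)
            # only serve the compressed form if it's actually smaller
            if len(compressed) < len(body):
                return compressed, {"Content-Encoding": "gzip"}
        return body, {}

    html = b"<html>" + b"<p>hello world</p>" * 1000 + b"</html>"
    body, headers = maybe_compress(html, "gzip, deflate")
    print(len(html), "->", len(body), headers)   # big win for repetitive HTML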

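and a second sketch showing the negative-compression effect: gzipping data
that is already compressed (or random, like an encrypted TLS stream) makes
it slightly larger while still burning CPU:

    import gzip, os

    text = b"the quick brown fox jumps over the lazy dog\n" * 10000

    once  = gzip.compress(text)    # normal case: big win
    twice = gzip.compress(once)    # re-compressing already-compressed data

    print("plain text:   ", len(text))
    print("gzipped once: ", len(once))
    print("gzipped twice:", len(twice))   # larger than 'once'

    # random bytes behave like compressed media or an encrypted stream
    blob = os.urandom(100000)
    print("random blob:  ", len(blob), "->", len(gzip.compress(blob)))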