* Ian Clarke <ian at revver.com> [2006-08-14 21:59:30]:
>
> On 14 Aug 2006, at 14:10, Matthew Toseland wrote:
>
> >On Sat, Aug 12, 2006 at 03:55:06PM -0700, Ian Clarke wrote:
> >>On 12 Aug 2006, at 12:36, Matthew Toseland wrote:
> >>>Whatever happened to maximizing usability?
> >>
> >>Imposing a compression format that may be ill suited to what is being
> >>inserted doesn't maximize usability, it simply means that the node is
> >>meddling in a client issue in which it lacks the competence to
> >>meddle. I can't think of any other data transmission software that
> >>takes it upon itself to compress data, about which it knows nothing,
> >>by default (BitTorrent doesn't, Secure Copy doesn't, LimeWire
> >>doesn't, Kazaa didn't, even Apache doesn't).
Well, you're wrong here: Apache provides compression support through a
plugin, and the default distribution bundles and enables it.

> >
> >Transfer-Encoding: gzip / Content-Transfer-Encoding: gzip is pretty
> >standard. Very few web pages are put up on the web as .html.gz, which
> >is the approach you are advocating. What actually happens is that the
> >server and the browser transparently compress it. Which is what I
> >propose for Freenet.
>
> Firstly, last time I checked, no web server compresses files by
> default. Furthermore, your proposal is to transparently compress all
> file formats; how many non-HTML file formats do web servers
> transparently compress?

Everything, as long as the module is loaded... see
http://www.mozilla.org/projects/apache/gzip/

> >
> >>I'm not arguing against the value of compression, I'm just arguing
> >>against the node doing it. Different types of compression are
> >>appropriate for different file types, it doesn't make sense for us to
> >>impose one type of compression, that happens to be optimized for text
> >>and some binary formats, on all data types. The client has the
> >>expertise to decide on the appropriate format in which to insert
> >>content, we don't.

Sure, but the point is: "if we can insert it faster by spending some CPU
cycles compressing the data, is it worth it?" I do think that, from a
usability PoV, it is.

> >
> >Files which should be compressed lossily already are. We are required
> >to transport the data intact; we do so.
>
> I never claimed that compressing files by default violated any kind
> of obligation, except the obligation to make good design decisions.
>
> >When I have tried to insert video etc files, they have usually been
> >shrunk by the node. As I said, 1% is worth it.
>
> If so, then let the client or inserter make that determination, it
> isn't the node's place to make it for them.
>
Let's take the Apache metaphor: the browser advertises its capabilities,
and if compression is supported, it's up to the SERVER to decide whether
to compress or not (both this and the "is it worth it?" question are
sketched at the end of this mail).

> >>
> >>>since FCPv2 is a deliberately high level API
> >>>designed to do as much as possible of the work for the client
> >>>author.
> >>
> >>Yes, but not such that it imposes a uniform solution across all
> >>clients where different solutions are needed for different clients.
> >
> >Different mutually incompatible clients? Lossy compression is done at
> >the file format level, but *lossless* compression can be done anywhere
> >along the chain. It is for example often used in PPP on low level
> >internet links. It is supported transparently in HTTP as well.
>
> It is not done transparently in HTTP *BY DEFAULT*, and it is not done
> in HTTP on all file types. It is doing it by default that I object to.

You're wrong :) With pipelining, EVERYTHING gets compressed.

> >>
> >>>Doing it on the client will just lead to a mass of incompatible
> >>>standards, and more (duplicated) work for client authors.
> >>
> >>If by "incompatible" you mean using jpeg to compress an image rather
> >>than gzip, or mpeg to compress a video rather than bzip2, then
> >>sometimes incompatibility is justified.
> >
> >JPEG and MPEG are lossy. This is why they can never be applied
> >transparently. Another recent poster gave an example where he
> >compressed an MPEG file 23% with gzip (27% with lzma).
>
> If people want to further compress these file types, then that is
> their decision to make, not ours to make for them.
>
That's the arguable point: again, from a usability point of view, it
will be "faster".

> >>
> >>Now I have presented an argument for why we shouldn't support in-node
> >>compression of data at all, but I'm willing to compromise and agree
> >>to it provided that it isn't enabled by default. This way, client
> >>authors can use it if they deem it appropriate, but are much less
> >>likely to use it where it isn't.

Everyone knows that only the default value will be used :) Hence you
want to change the default :p

> >
> >I don't understand your objection. It is usually beneficial even with
> >AVIs, and transparent compression is widely used in other protocols.
>
> No protocols that I am familiar with do it regardless of file type as
> you are proposing; further, most I am familiar with don't do it by
> default (e.g. Apache doesn't, SCP doesn't).

Again, Apache does it if the browser supports it... Yes, SCP doesn't.

NextGen$
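
P.S. To make the "browser advertises, server decides" point concrete, here is
roughly all the negotiation an HTTP client does before mod_gzip/mod_deflate
kicks in. This is only a plain-Java sketch against a placeholder URL, nothing
Freenet-specific; whether the body actually comes back gzipped is entirely the
server's call.

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.zip.GZIPInputStream;

// Minimal demo: the client merely says "I can take gzip";
// the server decides whether to actually use it.
public class AcceptEncodingDemo {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://www.example.org/");        // placeholder: any gzip-capable server
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Accept-Encoding", "gzip");  // the only thing the client states
        InputStream in = conn.getInputStream();
        String encoding = conn.getHeaderField("Content-Encoding");
        if ("gzip".equalsIgnoreCase(encoding))
            in = new GZIPInputStream(in);                    // transparently undo what the server chose
        System.out.println("Server chose Content-Encoding: " + encoding);
        in.close();
    }
}

The browser never tells the server which files to compress; it only declares
what it can decode. That is exactly the split I'd like between FCP clients and
the node.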

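P.P.S. On the "1% is worth it" question: what I mean by letting the node decide
is nothing smarter than "try it, keep it only if it actually shrinks". A rough,
hypothetical sketch (invented class and method names, not the actual Freenet
code), assuming the client can still opt out entirely:

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPOutputStream;

// Hypothetical sketch: the client only says whether compression is allowed;
// the node gzips speculatively and keeps the result only if it is smaller.
public class CompressionDecision {

    static byte[] maybeCompress(byte[] raw, boolean clientAllowsCompression) throws IOException {
        if (!clientAllowsCompression)
            return raw;                       // the client keeps its veto: never touch the data
        ByteArrayOutputStream buf = new ByteArrayOutputStream(raw.length);
        GZIPOutputStream gz = new GZIPOutputStream(buf);
        gz.write(raw);
        gz.close();                           // flush the gzip trailer before measuring
        byte[] compressed = buf.toByteArray();
        // Even a ~1% saving is worth keeping; if gzip gains nothing, insert the raw bytes.
        return (compressed.length < raw.length) ? compressed : raw;
    }
}

Worst case we burn a few CPU cycles and insert the raw bytes anyway; best case
every fetcher gets the data a little faster.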