On Saturday 05 June 2010 04:45:55 Dennis Nezic wrote:
> On Fri, 4 Jun 2010 11:54:42 -0400, Dennis Nezic wrote:
> > On Fri, 04 Jun 2010 19:33:09 +0400, VolodyA! V Anarhist wrote:
> > > Dennis Nezic wrote:
> > > > On Thu, 3 Jun 2010 23:56:46 -0400, Dennis Nezic wrote:
> > > >> Is it possible to explicitly state the compression used with
> > > >> GETCHK or GETCHKFILE or GETCHKDDIR from telnet? (I don't think
> > > >> these commands are even possible in fproxy -- getting chk keys
> > > >> without inserting?)
> > > >>
> > > >> When inserting files via fproxy, I think you have to explicitly
> > > >> decide whether to compress or not, but that would easily lead to
> > > >> a different chk key for the same file, if the GETCHK* commands
> > > >> don't do the same thing.
> > > > 
> > > > Oh, why do we have arbitrary compression anyways, btw? :)
> > > > (Arbitrary because there is no explicit standard in the specs, as
> > > > far as I know, which can easily lead to completely different CHKs
> > > > for the same file across different versions, if the settings are
> > > > even slightly changed (i.e. a slightly different compression
> > > > algorithm/level, or threshold for using it, or explicit user
> > > > choice, etc.)) Is the massive CPU and time overhead really
> > > > necessary to reduce filesizes by 1%? (I assume jpeg, zip, mpeg4,
> > > > etc. compression algorithms are already good enough? And why the
> > > > heck is all this massive overhead done THREE times? Are gzip,
> > > > bzip2, and lzma really all /that/ different??)
> > > 
> > > This question is being asked over and over and over again, mostly by
> > > people who don't bother to look for the answer (in the future,
> > > please at least say that you didn't look for it).
> > > 
> > > Think about the implications of 1% in a network that does not do
> > > path folding. This is 1% on every download by every person, and
> > > every request goes out over multiple hops. So you will be
> > > downloading 1% more, but your node
> > > will also have to carry 1% more from all the traffic that comes
> > > through it. It would also amount to a constant garbage-flood
> > > attack on the network, equivalent to 1% of all the data that is
> > > currently being inserted, pushing more content off the network,
> > > causing people to retry, and then reinsert more often (with the
> > > effects discussed above).
> > > 
> > > In addition to all that, the truth of the matter is that CPU time
> > > is very cheap compared to network latency.
> > 
> > I suppose I can accept that logic -- one end user (the author) suffers
> > while everyone else benefits. But, actually, I think more often than
> > not all that intense CPU work is simply thrown away, since none of
> > those general-purpose algorithms can do better than the
> > special-purpose jpeg/zip/mpeg4/etc. (Even if a few bytes could be
> > compressed, the metadata overhead negates it.)
> > 
> > > 
> > > And as for why there's no standard so far, it's probably because
> > > things are still being tweaked.
> > 
> > That's probably my biggest complaint. If it were standardized and
> > completely transparent, I might grudgingly accept having to wait an
> > hour to get a CHK key without inserting. But as it currently stands,
> > how you insert (i.e. via telnet, or fproxy with or without
> > compression, etc.) will result in differing CHKs.
> > 
> > Personally I'd trash all the compression stuff -- that is not the
> > node's responsibility, IMHO. Like you said in your other email, people
> > already compress their archives, and probably at even higher
> > compression levels?
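
For what it's worth, that's easy to see by running a general-purpose
compressor over data that is already compressed: there is essentially
nothing left to squeeze. A tiny illustration of my own (not from
Freenet's code; the random bytes just stand in for mp3/jpeg-style
payloads):

  import os, zlib

  # Random bytes behave like already-compressed data: essentially incompressible.
  data = os.urandom(1 << 20)    # 1 MiB stand-in for an mp3/jpeg chunk
  out = zlib.compress(data, 9)
  print(len(data), len(out))    # the "compressed" output is usually slightly larger

On that kind of input the gzip/bzip2/lzma passes can only break even or
lose, the result gets thrown away, and the CPU time was spent for
nothing.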
> 
> Upon further reflection, my biggest complaint is actually that
> GETCHKFILE doesn't work!? I've tried it on a couple of files, and
> it never finishes. For example:
> 
> TMCI> getchkfile:/pub/speeches/Noam Chomsky - Iraq.mp3
> Started compression attempt with GZIP
> Started compression attempt with BZIP2
> Started compression attempt with LZMA
> Compressed data: codec=1, origSize=7576241, compressedSize=7317992
> Completed 0% 0/448 (failed 0, fatally 0, total 448) 
> Completed 0% 0/448 (failed 0, fatally 0, total 448) 
> 
> And it will hang there (after spending too much time compressing!) for
> hours (forever). Does it work for you guys?
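
On the earlier question of getting a CHK without inserting, and of
pinning the compression down so the key is reproducible: if I
understand FCP correctly, a ClientPut with GetCHKOnly=true (plus
DontCompress=true to skip the gzip/bzip2/lzma attempts entirely) should
do it, independently of the TMCI console. A rough, untested sketch; the
port is the usual FCP default and the file path is only a placeholder:

  import socket

  FCP_HOST = "127.0.0.1"
  FCP_PORT = 9481                       # default FCP port; adjust if your node differs
  PATH = "/pub/speeches/example.mp3"    # placeholder path

  def send(sock, lines):
      sock.sendall(("\n".join(lines) + "\n").encode("utf-8"))

  data = open(PATH, "rb").read()

  s = socket.create_connection((FCP_HOST, FCP_PORT))
  send(s, ["ClientHello", "Name=chk-only-sketch",
           "ExpectedVersion=2.0", "EndMessage"])

  # Ask for the key only, with compression disabled; hand the bytes over
  # directly so no direct-disk-access negotiation is needed.
  send(s, ["ClientPut",
           "URI=CHK@",
           "Identifier=chk-only-sketch",
           "GetCHKOnly=true",
           "DontCompress=true",
           "UploadFrom=direct",
           "DataLength=%d" % len(data),
           "Data"])
  s.sendall(data)

  # Read replies until the node names the key (URIGenerated) or fails.
  buf = b""
  while not any(m in buf for m in (b"URIGenerated", b"PutFailed", b"ProtocolError")):
      chunk = s.recv(4096)
      if not chunk:
          break
      buf += chunk
  print(buf.decode("utf-8", "replace"))
  s.close()

If you later insert with the same DontCompress setting, you should get
the same CHK back, which would at least sidestep the "different key
depending on how you insert" problem (assuming nothing else in the key
derivation changes between versions).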

As I understand it, Chomsky defends his copyright quite vigorously (and earns a 
substantial amount of money from his work). Are you absolutely sure you have 
the right to distribute that file?
