Re: [freenet-support] getchk getchkfile with or without compression

2010-06-08 Thread Matthew Toseland
On Monday 07 June 2010 19:09:43 VolodyA! V Anarhist wrote:
 Matthew Toseland wrote:
  As I understand it, Chomsky defends his copyright quite vigorously (and
  earns a substantial amount of money from his work). Are you absolutely sure
  you have the right to distribute that file?
 
 Dear Matthew,
 
 I understand (while not 100% agreeing with) your reasons for disallowing
 people from talking about anything that is copyrighted and is transferred to
 another person without permission. However, what you have said there is untrue.
 
 In fact you are safe only because Noam Chomsky does not use libel laws to
 protect himself, even when he would easily win (such as when he is accused of
 antisemitism, Holocaust denial, etc.). You have already spoken to me about
 Noam Chomsky and copyright, and I recall quite well that I told you that he
 has never used copyright law to take any individual to court. I then informed
 you that there has been a torrent file-sharing site specifically geared to
 exchanging interviews, books, and other media made by Chomsky. I believe your
 argument then was that some media company that owns the copyright to his
 interviews or writing may nonetheless take people to court.
 
 The reason I am writing this is that this list is public and easily found in
 search engines, so when you say things like that on this list it creates an
 urban legend and defames Professor Noam Chomsky. I can already see some
 Zionist article:
 Noam Chomsky, the self-proclaimed anarchist, has been terrorising people by
 using the laws that he claims to oppose; in fact even the radical Freenet
 administrators, who allow links to child pornography and terrorism, are
 afraid to even allow a mention of the audio files with his name in them
 (Matthew Toseland, 2010).
 
 Whether you like it or not, toad, the public that does not know enough about
 the way Freenet operates will see you as one of the people in charge, and as
 such, when you say things on a public forum about a specific person, they can
 be used to show Freenet policy. Please note, I am not arguing that copyright
 infringement should be allowed to be advertised on the support lists, far
 from it; but please do not make statements about individual people's character.

Well, can you get a definitive answer on the simple question: is the file in
question copyrighted and not legally redistributable? I don't care whether it
is Chomsky who owns the copyright or his publisher.


___
Support mailing list
Support@freenetproject.org
http://news.gmane.org/gmane.network.freenet.support
Unsubscribe at http://emu.freenetproject.org/cgi-bin/mailman/listinfo/support
Or mailto:support-requ...@freenetproject.org?subject=unsubscribe

Re: [freenet-support] getchk getchkfile with or without compression

2010-06-07 Thread Matthew Toseland
On Saturday 05 June 2010 04:45:55 Dennis Nezic wrote:
 On Fri, 4 Jun 2010 11:54:42 -0400, Dennis Nezic wrote:
  On Fri, 04 Jun 2010 19:33:09 +0400, VolodyA! V Anarhist wrote:
   Dennis Nezic wrote:
On Thu, 3 Jun 2010 23:56:46 -0400, Dennis Nezic wrote:
Is it possible to explicitly state the compression used with
GETCHK or GETCHKFILE or GETCHKDDIR from telnet? (I don't think
these commands are even possible in fproxy -- getting chk keys
without inserting?)
   
When inserting files via fproxy, I think you have to explicitly
decide whether to compress or not, but that would easily lead to
a different chk key for the same file, if the GETCHK* commands
don't do the same thing.

Oh, why do we have arbitrary compression anyway, btw? :)
(Arbitrary because there is no explicit standard in the specs, as
far as I know, which can easily lead to completely different CHKs
for the same file across different versions, if the settings are
even slightly changed (i.e. a slightly different compression
algorithm/level, or threshold for using it, or explicit user
choice, etc.).) Is the massive computational and time overhead really
necessary to reduce file sizes by 1%? (I assume jpeg and zip and
mpeg4 etc. compression algorithms are already good enough? And why
the heck is all this massive overhead done THREE times? Are gzip,
bzip and lzma really all /that/ different??)
   
   This question is being asked over and over and over again, mostly by
   people who don't bother to look for the answer (in the future
   please at least say that you didn't look for it).
   
   Think about the implications of 1% in a network that does not do
   path folding. This is 1% on every download by every person, carried
   over multiple hops. So you will be downloading 1% more, but your node
   will also have to carry 1% more of all the traffic that passes
   through it. It would also amount to a constant garbage-flood
   attack on the network equivalent to 1% of all the data that is
   currently being inserted, pushing more content off the network,
   causing people to retry, and then reinsert more often (with the
   effects discussed above).

   In addition to all that, the truth of the matter is that CPU
   time is very cheap compared to network latency.
  
  I suppose I can accept that logic -- one end user (the author) suffers
  while everyone else benefits. But, actually, I think more often than
  not all that intense CPU work is completely wasted, since none of
  those general-purpose algorithms can do better than the
  special-purpose jpeg/zip/mpeg4/etc. (Even if a few bytes could be
  compressed, the metadata overhead negates it.)
  
   
   And as for the reason why there's no standard so far, it's probably
   because things are still being tweaked.
  
  That's probably my biggest complaint. If it were standardized and
  completely transparent, I might grudgingly accept having to wait an
  hour to get a chk key without inserting. But as it currently exists,
  depending on how you insert (i.e. via telnet, or fproxy with or without
  compression, etc.), you will get differing CHKs.
  
  Personally I'd trash all the compression stuff -- that is not the
  node's responsibility, IMHO. Like you said in your other email, people
  already compress their archives, and probably at even higher
  compression levels?
 
 Upon further reflection, my biggest complaint is actually that
 GETCHKFILE doesn't work!? I've tried doing it on a couple of files, and
 it never finishes. E.g.:
 
 TMCI getchkfile:/pub/speeches/Noam Chomsky - Iraq.mp3
 Started compression attempt with GZIP
 Started compression attempt with BZIP2
 Started compression attempt with LZMA
 Compressed data: codec=1, origSize=7576241, compressedSize=7317992
 Completed 0% 0/448 (failed 0, fatally 0, total 448) 
 Completed 0% 0/448 (failed 0, fatally 0, total 448) 
 
 And it will hang there (after spending too much time compressing!) for
 hours (forever). Does it work for you guys?

As I understand it, Chomsky defends his copyright quite vigorously (and earns a
substantial amount of money from his work). Are you absolutely sure you have
the right to distribute that file?



Re: [freenet-support] getchk getchkfile with or without compression

2010-06-07 Thread VolodyA! V Anarhist

Matthew Toseland wrote:

As I understand it, Chomsky defends his copyright quite vigorously (and earns a
substantial amount of money from his work). Are you absolutely sure you have
the right to distribute that file?


Dear Matthew,

I understand (while not 100% agreeing with) your reasons for disallowing people
from talking about anything that is copyrighted and is transferred to another
person without permission. However, what you have said there is untrue.


In fact you are safe only because Noam Chomsky does not use libel laws to
protect himself, even when he would easily win (such as when he is accused of
antisemitism, Holocaust denial, etc.). You have already spoken to me about
Noam Chomsky and copyright, and I recall quite well that I told you that he has
never used copyright law to take any individual to court. I then informed you
that there has been a torrent file-sharing site specifically geared to
exchanging interviews, books, and other media made by Chomsky. I believe your
argument then was that some media company that owns the copyright to his
interviews or writing may nonetheless take people to court.


The reason I am writing this is that this list is public and easily found in
search engines, so when you say things like that on this list it creates an
urban legend and defames Professor Noam Chomsky. I can already see some
Zionist article:
Noam Chomsky, the self-proclaimed anarchist, has been terrorising people by
using the laws that he claims to oppose; in fact even the radical Freenet
administrators, who allow links to child pornography and terrorism, are afraid
to even allow a mention of the audio files with his name in them (Matthew
Toseland, 2010).


Whether you like it or not, toad, the public that does not know enough about
the way Freenet operates will see you as one of the people in charge, and as
such, when you say things on a public forum about a specific person, they can
be used to show Freenet policy. Please note, I am not arguing that copyright
infringement should be allowed to be advertised on the support lists, far from
it; but please do not make statements about individual people's character.


   - Volodya

--
http://freedom.libsyn.com/ Echo of Freedom, Radical Podcast

 None of us are free until all of us are free.~ Mihail Bakunin


Re: [freenet-support] getchk getchkfile with or without compression

2010-06-04 Thread Dennis Nezic
On Thu, 3 Jun 2010 23:56:46 -0400, Dennis Nezic wrote:
 Is it possible to explicitly state the compression used with GETCHK or
 GETCHKFILE or GETCHKDDIR from telnet? (I don't think these commands
 are even possible in fproxy -- getting chk keys without inserting?)
 
 When inserting files via fproxy, I think you have to explicitly decide
 whether to compress or not, but that would easily lead to a different
 chk key for the same file, if the GETCHK* commands don't do the same
 thing.

Oh, why do we have arbitrary compression anyway, btw? :) (Arbitrary
because there is no explicit standard in the specs, as far as I know,
which can easily lead to completely different CHKs for the same file
across different versions, if the settings are even slightly changed
(i.e. a slightly different compression algorithm/level, or threshold for
using it, or explicit user choice, etc.).) Is the massive computational and
time overhead really necessary to reduce file sizes by 1%? (I assume
jpeg and zip and mpeg4 etc. compression algorithms are already good
enough? And why the heck is all this massive overhead done THREE
times? Are gzip, bzip and lzma really all /that/ different??)
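
A minimal Python sketch of what "trying all three" amounts to: compress with
gzip, bzip2 and LZMA, keep the smallest result, and fall back to the original
if nothing beats it. This is only an illustration of the trade-off being
discussed; the file name and the selection rule are assumptions for the
example, not the node's actual implementation.

import bz2
import gzip
import lzma

def best_compression(data: bytes):
    # Try several general-purpose codecs and keep the smallest output.
    # Returns (codec_name, blob); codec_name is None when nothing beat
    # the original size (typical for mp3/jpeg/zip input).
    candidates = {
        "GZIP": gzip.compress(data),
        "BZIP2": bz2.compress(data),
        "LZMA": lzma.compress(data),
    }
    name, blob = min(candidates.items(), key=lambda kv: len(kv[1]))
    if len(blob) >= len(data):
        return None, data
    return name, blob

if __name__ == "__main__":
    with open("speech.mp3", "rb") as f:  # hypothetical already-compressed input
        raw = f.read()
    codec, blob = best_compression(raw)
    print(codec, len(raw), len(blob))

On an already-compressed file such as an mp3, all three candidates usually
come out larger than the input, which is exactly the wasted work complained
about above.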


Re: [freenet-support] getchk getchkfile with or without compression

2010-06-04 Thread VolodyA! V Anarhist

Dennis Nezic wrote:

On Thu, 3 Jun 2010 23:56:46 -0400, Dennis Nezic wrote:

Is it possible to explicitly state the compression used with GETCHK or
GETCHKFILE or GETCHKDDIR from telnet? (I don't think these commands
are even possible in fproxy -- getting chk keys without inserting?)

When inserting files via fproxy, I think you have to explicitly decide
whether to compress or not, but that would easily lead to a different
chk key for the same file, if the GETCHK* commands don't do the same
thing.


Oh, why do we have arbitrary compression anyway, btw? :) (Arbitrary
because there is no explicit standard in the specs, as far as I know,
which can easily lead to completely different CHKs for the same file
across different versions, if the settings are even slightly changed
(i.e. a slightly different compression algorithm/level, or threshold for
using it, or explicit user choice, etc.).) Is the massive computational and
time overhead really necessary to reduce file sizes by 1%? (I assume
jpeg and zip and mpeg4 etc. compression algorithms are already good
enough? And why the heck is all this massive overhead done THREE
times? Are gzip, bzip and lzma really all /that/ different??)


This question is being asked over and over and over again, mostly by people
who don't bother to look for the answer (in the future please at least say
that you didn't look for it).


Think about the implications of 1% in a network that does not do path folding.
This is 1% on every download by every person, carried over multiple hops. So
you will be downloading 1% more, but your node will also have to carry 1% more
of all the traffic that passes through it. It would also amount to a constant
garbage-flood attack on the network equivalent to 1% of all the data that is
currently being inserted, pushing more content off the network, causing people
to retry, and then reinsert more often (with the effects discussed above).
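
As a back-of-the-envelope illustration of that multiplication, in Python (the
file size and hop count below are invented for the example, not measurements):

# Why "only 1%" multiplies up in a store-and-forward network: the waste is
# paid on every hop of every request, not once at the endpoints.
file_mib = 100   # assumed size of one download (made up)
hops = 10        # assumed average request path length (made up)
overhead = 0.01  # files 1% larger because compression was skipped

extra_end_to_end = file_mib * overhead        # what the requester pays
extra_network_wide = extra_end_to_end * hops  # what the whole path pays
print(f"requester downloads an extra {extra_end_to_end:.1f} MiB")
print(f"the network moves an extra {extra_network_wide:.1f} MiB for that one request")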


In addition to all that, the truth of the matter is that CPU time is very
cheap compared to network latency.


And as for the reason why there's no standard so far, it's probably because 
things are still being tweaked.


 - Volodya

--
http://freedom.libsyn.com/ Echo of Freedom, Radical Podcast

 None of us are free until all of us are free.~ Mihail Bakunin


Re: [freenet-support] getchk getchkfile with or without compression

2010-06-04 Thread VolodyA! V Anarhist

Dennis Nezic wrote:

On Thu, 3 Jun 2010 23:56:46 -0400, Dennis Nezic wrote:

Is it possible to explicitly state the compression used with GETCHK or
GETCHKFILE or GETCHKDDIR from telnet? (I don't think these commands
are even possible in fproxy -- getting chk keys without inserting?)

When inserting files via fproxy, I think you have to explicitly decide
whether to compress or not, but that would easily lead to a different
chk key for the same file, if the GETCHK* commands don't do the same
thing.


Oh, why do we have arbitrary compression anyway, btw? :) (Arbitrary
because there is no explicit standard in the specs, as far as I know,
which can easily lead to completely different CHKs for the same file
across different versions, if the settings are even slightly changed
(i.e. a slightly different compression algorithm/level, or threshold for
using it, or explicit user choice, etc.).) Is the massive computational and
time overhead really necessary to reduce file sizes by 1%? (I assume
jpeg and zip and mpeg4 etc. compression algorithms are already good
enough? And why the heck is all this massive overhead done THREE
times? Are gzip, bzip and lzma really all /that/ different??)


You did hit one good point here, however, and perhaps it should be put in the
wiki. There is almost no reason to insert zip, bzip, or other compressed
archive types into Freenet (unless the format is important for some reason);
just insert the tar, and the node will compress it as best it can.
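
For example, a plain uncompressed tar can be built like this in Python (the
file names are hypothetical), and that .tar is what you would insert, leaving
the compression to the node:

import tarfile

# Build an *uncompressed* archive: mode "w" rather than "w:gz" or "w:bz2".
# The node's own compressor then works on highly redundant input instead of
# on an already-compressed .zip or .tar.gz it cannot shrink further.
with tarfile.open("speeches.tar", "w") as tar:
    tar.add("speeches/", recursive=True)  # hypothetical directory to insert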


- Volodya


--
http://freedom.libsyn.com/ Echo of Freedom, Radical Podcast

 None of us are free until all of us are free.~ Mihail Bakunin


Re: [freenet-support] getchk getchkfile with or without compression

2010-06-04 Thread Dennis Nezic
On Fri, 04 Jun 2010 19:33:09 +0400, VolodyA! V Anarhist wrote:
 Dennis Nezic wrote:
  On Thu, 3 Jun 2010 23:56:46 -0400, Dennis Nezic wrote:
  Is it possible to explicitly state the compression used with
  GETCHK or GETCHKFILE or GETCHKDDIR from telnet? (I don't think
  these commands are even possible in fproxy -- getting chk keys
  without inserting?)
 
  When inserting files via fproxy, I think you have to explicitly
  decide whether to compress or not, but that would easily lead to a
  different chk key for the same file, if the GETCHK* commands don't
  do the same thing.
  
  Oh, why do we have arbitrary compression anyway, btw? :) (Arbitrary
  because there is no explicit standard in the specs, as far as I
  know, which can easily lead to completely different CHKs for the
  same file across different versions, if the settings are even
  slightly changed (i.e. a slightly different compression
  algorithm/level, or threshold for using it, or explicit user
  choice, etc.).) Is the massive computational and time overhead really
  necessary to reduce file sizes by 1%? (I assume jpeg and zip and
  mpeg4 etc. compression algorithms are already good enough? And why
  the heck is all this massive overhead done THREE times? Are gzip,
  bzip and lzma really all /that/ different??)
 
 This question is being asked over and over and over again, mostly by
 people who don't bother to look for the answer (in the future please
 at least say that you didn't look for it).
 
 Think about the implications of 1% in a network that does not do
 path folding. This is 1% on every download by every person, carried
 over multiple hops. So you will be downloading 1% more, but your node
 will also have to carry 1% more of all the traffic that passes
 through it. It would also amount to a constant garbage-flood
 attack on the network equivalent to 1% of all the data that is
 currently being inserted, pushing more content off the network,
 causing people to retry, and then reinsert more often (with the
 effects discussed above).
 
 In addition to all that, the truth of the matter is that the CPU time
 is very cheap when it is compared to the network latency.

I suppose I can accept that logic -- one end user (the author) suffers
while everyone else benefits. But, actually, I think more often than
not all that intense CPU work is completely wasted, since none of those
general-purpose algorithms can do better than the special-purpose
jpeg/zip/mpeg4/etc. (Even if a few bytes could be compressed, the
metadata overhead negates it.)

 
 And as for the reason why there's no standard so far, it's probably
 because things are still being tweaked.

That's probably my biggest complaint. If it were standardized and
completely transparent, I might grudgingly accept having to wait an
hour to get a chk key without inserting. But as it currently exists,
depending on how you insert (i.e. via telnet, or fproxy with or without
compression, etc.), you will get differing CHKs.

Personally I'd trash all the compression stuff -- that is not the
node's responsibility, IMHO. Like you said in your other email, people
already compress their archives, and probably at even higher
compression levels?


Re: [freenet-support] getchk getchkfile with or without compression

2010-06-04 Thread Dennis Nezic
On Fri, 4 Jun 2010 11:54:42 -0400, Dennis Nezic wrote:
 On Fri, 04 Jun 2010 19:33:09 +0400, VolodyA! V Anarhist wrote:
  Dennis Nezic wrote:
   On Thu, 3 Jun 2010 23:56:46 -0400, Dennis Nezic wrote:
   Is it possible to explicitly state the compression used with
   GETCHK or GETCHKFILE or GETCHKDDIR from telnet? (I don't think
   these commands are even possible in fproxy -- getting chk keys
   without inserting?)
  
   When inserting files via fproxy, I think you have to explicitly
   decide whether to compress or not, but that would easily lead to
   a different chk key for the same file, if the GETCHK* commands
   don't do the same thing.
   
   Oh, why do we have arbitrary compression anyway, btw? :)
   (Arbitrary because there is no explicit standard in the specs, as
   far as I know, which can easily lead to completely different CHKs
   for the same file across different versions, if the settings are
   even slightly changed (i.e. a slightly different compression
   algorithm/level, or threshold for using it, or explicit user
   choice, etc.).) Is the massive computational and time overhead really
   necessary to reduce file sizes by 1%? (I assume jpeg and zip and
   mpeg4 etc. compression algorithms are already good enough? And why
   the heck is all this massive overhead done THREE times? Are gzip,
   bzip and lzma really all /that/ different??)
  
  This question is being asked over and over and over again, mostly by
  people who don't bother to look for the answer (in the future
  please at least say that you didn't look for it).
  
  Think about the implications of 1% in a network that does not do
  path folding. This is 1% on every download by every person, carried
  over multiple hops. So you will be downloading 1% more, but your node
  will also have to carry 1% more of all the traffic that passes
  through it. It would also amount to a constant garbage-flood
  attack on the network equivalent to 1% of all the data that is
  currently being inserted, pushing more content off the network,
  causing people to retry, and then reinsert more often (with the
  effects discussed above).
  
  In addition to all that, the truth of the matter is that the CPU
  time is very cheap when it is compared to the network latency.
 
 I suppose I can accept that logic -- one end user (the author) suffers
 while everyone else benefits. But, actually, I think more often than
 not all that intense CPU work is completely wasted, since none of
 those general-purpose algorithms can do better than the
 special-purpose jpeg/zip/mpeg4/etc. (Even if a few bytes could be
 compressed, the metadata overhead negates it.)
 
  
  And as for the reason why there's no standard so far, it's probably
  because things are still being tweaked.
 
 That's probably my biggest complaint. If it were standardized and
 completely transparent, I might grudgingly accept having to wait an
 hour to get a chk key without inserting. But as it currently exists,
 depending on how you insert (i.e. via telnet, or fproxy with or without
 compression, etc.), you will get differing CHKs.
 
 Personally I'd trash all the compression stuff -- that is not the
 node's responsibility, IMHO. Like you said in your other email, people
 already compress their archives, and probably at even higher
 compression levels?

Upon further reflection, my biggest complaint is actually that
GETCHKFILE doesn't work!? I've tried doing it on a couple of files, and
it never finishes. E.g.:

TMCI getchkfile:/pub/speeches/Noam Chomsky - Iraq.mp3
Started compression attempt with GZIP
Started compression attempt with BZIP2
Started compression attempt with LZMA
Compressed data: codec=1, origSize=7576241, compressedSize=7317992
Completed 0% 0/448 (failed 0, fatally 0, total 448) 
Completed 0% 0/448 (failed 0, fatally 0, total 448) 

And it will hang there (after spending too much time compressing!) for
hours (forever). Does it work for you guys?


[freenet-support] getchk getchkfile with or without compression

2010-06-03 Thread Dennis Nezic
Is it possible to explicitly state the compression used with GETCHK or
GETCHKFILE or GETCHKDDIR from telnet? (I don't think these commands are
even possible in fproxy -- getting chk keys without inserting?)

When inserting files via fproxy, I think you have to explicitly decide
whether to compress or not, but that would easily lead to a different
chk key for the same file, if the GETCHK* commands don't do the same
thing.
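
For anyone who prefers scripting this rather than the telnet console, FCP's
ClientPut may be an alternative. The sketch below is a rough, untested Python
example and makes assumptions: that FCP is enabled on the default port 9481,
and that the node's FCP version accepts the GetCHKOnly and DontCompress fields
on ClientPut (so the key is computed without actually inserting, and without
any compression attempt):

import socket

HOST, PORT = "127.0.0.1", 9481  # default FCP port, assuming FCP is enabled

def send(sock, lines, data=b""):
    sock.sendall(("\n".join(lines) + "\n").encode() + data)

with open("example.mp3", "rb") as f:  # hypothetical file to compute a key for
    payload = f.read()

with socket.create_connection((HOST, PORT)) as s:
    send(s, ["ClientHello", "Name=chk-probe", "ExpectedVersion=2.0", "EndMessage"])
    send(s, [
        "ClientPut",
        "URI=CHK@",
        "Identifier=chk-only-1",
        "GetCHKOnly=true",    # compute the key only, do not actually insert
        "DontCompress=true",  # skip the gzip/bzip2/LZMA attempts entirely
        "UploadFrom=direct",
        f"DataLength={len(payload)}",
        "Data",
    ], payload)
    # Very naive: read until the node reports the generated URI or an error.
    buf = b""
    while b"URIGenerated" not in buf and b"ProtocolError" not in buf:
        chunk = s.recv(4096)
        if not chunk:
            break
        buf += chunk
    print(buf.decode(errors="replace"))

With DontCompress the resulting key should at least be independent of whatever
compression settings the node would otherwise pick; whether it matches an
fproxy insert of the same file depends on the other insert parameters.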