> All the data on freenet is inserted after going through a round of
> encryption, and thus having the protocol do any compression is a waste of
> resources because compression of the encrypted stream in most cases won't
> compress it.
>
> -Mathew

er... yeah... I knew the data was encrypted. I didn't realize that the 
encryption rendered the data mostly random. I suppose that makes sense; 
recognizable patterns can't be good for security. I guess the only way to 
do it then would be to compress the data prior to encryption. This would 
have the added benefit of reducing the size of files in datastores as well.
  Of course, this would not be a backwards-compatible change. It would still 
be beneficial to have freenet nodes automagically zlib-compress files 
prior to insertion. The other problem is that the algorithm couldn't be 
changed in the future without breaking freenet.
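To make the "encrypted data doesn't compress" point concrete, here is a 
quick demonstration. (os.urandom stands in for a ciphertext here, since a 
well-encrypted stream has the same statistics as random bytes; this is an 
illustration, not Freenet code.)

```python
import os
import zlib

# Repetitive plaintext compresses very well.
text = b"freenet freenet freenet " * 1000
compressed_text = zlib.compress(text)

# Random bytes (a stand-in for an encrypted stream) do not compress;
# zlib's framing overhead actually makes the output slightly larger.
random_bytes = os.urandom(len(text))
compressed_random = zlib.compress(random_bytes)

print(len(text), len(compressed_text))            # big reduction
print(len(random_bytes), len(compressed_random))  # no reduction at all
```

So compression has to happen before encryption to do any good.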

What if the first part of the unencrypted file were:
Compression=zlib:
followed by the compressed data? That way "none" could be chosen for 
already-compressed files, and new algorithms could be added in the future. 
If the file does not start with Compression=<algorithm>:, it could be 
assumed that the file was inserted prior to the compression feature, 
allowing for backwards compatibility.
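The scheme above could be sketched like this. Everything here is 
hypothetical (the wrap/unwrap names and the exact header bytes are my 
assumptions, not anything in Freenet); it just shows how the prefix makes 
the format self-describing and backwards compatible.

```python
import zlib

def wrap(data: bytes, algorithm: str = "zlib") -> bytes:
    """Prepend the proposed Compression=<algorithm>: header."""
    if algorithm == "zlib":
        return b"Compression=zlib:" + zlib.compress(data)
    # "none" for files that are already compressed (jpeg, mp3, ...)
    return b"Compression=none:" + data

def unwrap(blob: bytes) -> bytes:
    """Strip the header and decompress as needed."""
    if blob.startswith(b"Compression=zlib:"):
        return zlib.decompress(blob[len(b"Compression=zlib:"):])
    if blob.startswith(b"Compression=none:"):
        return blob[len(b"Compression=none:"):]
    # No header: assume the file predates the compression feature.
    return blob
```

A legacy file simply passes through unwrap() untouched, and a new 
algorithm only needs a new header value.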

l8r
Aaron

_______________________________________________
Devl mailing list
[EMAIL PROTECTED]
http://lists.freenetproject.org/mailman/listinfo/devl
