* andrey mirtchovski <[EMAIL PROTECTED]> wrote:

> i think what you fail to take into consideration is the fact, that
> even if the chance of a collision may be relatively high by your
> standards, the chance that the colliding blocks have data of any
> significance is very, very low. 

Okay, valid point. For my personal things (e.g. large media 
collections) this would be perfectly fine, since the data isn't 
that important and a few broken data blocks aren't that harmful
(losing a frame in a movie is hardly even noticeable).

But for HA applications, we still need some additional redundancy
or at least some error diagnostics at the application level. Well,
we'll most likely need this anyway, e.g. to detect human error
or code bugs.

My current idea is to use two separate hash functions in parallel
(as many sw distros already do). But I've got no idea whether this
really helps, or whether collisions in SHA-1 will often coincide
with collisions in the second hash (e.g. MD5).
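A minimal sketch of that dual-hash idea (my own illustration, not
anyone's existing implementation): treat two blocks as identical only
when both digests agree, so an accidental SHA-1 collision would also
have to collide in MD5 before bad data slips through.

```python
import hashlib

def dual_digest(data: bytes) -> tuple:
    """Return (sha1, md5) hex digests of a block."""
    return (hashlib.sha1(data).hexdigest(),
            hashlib.md5(data).hexdigest())

def blocks_identical(a: bytes, b: bytes) -> bool:
    # Two distinct blocks are wrongly deduplicated only if they
    # collide in BOTH hash functions at once.
    return dual_digest(a) == dual_digest(b)
```

Whether the combined scheme buys much depends on how independent the
two functions' collision behaviour really is, which is exactly the
open question above.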


cu
-- 
---------------------------------------------------------------------
 Enrico Weigelt    ==   metux IT service - http://www.metux.de/
---------------------------------------------------------------------
 Please visit the OpenSource QM Taskforce:
        http://wiki.metux.de/public/OpenSource_QM_Taskforce
 Patches / Fixes for dozens of packages in dozens of versions:
        http://patches.metux.de/
---------------------------------------------------------------------
