I wrote:
>> a) if the hash function happens to have a property I call "no
>> wasted entropy" then the whitening stage is superfluous (and
>> you may decide to classify the hash as "non-simple");
David Honig responded:
> Not wasting entropy does not mean that a function's output
> is "white" ie uniformly distributed.  E.g., a function
> which doubles every bit: 0->00, 1->11 preserves total
> entropy but does not produce uniformly distributed output.
> (It also reduces entropy/symbol.)

That's a red-herring tangent.  I'm not talking about any old function
that doesn't waste entropy; I'm talking about a !!hash!! function that
doesn't waste entropy.  The hash function has a hard constraint on the
word-size of its output.  If it starts doubling bits, or otherwise
putting out redundancies, then it is wasting entropy.  See the
discussion of the BADHASH-2 function in the paper:
  http://www.monmouth.com/~jsd/turbid/

And remember: in addition to having a non-entropy-wasting hash
function, we are also required to saturate its input.  Then we can
conclude that the output is white to a very high degree, as
quantitatively discussed in the paper.

> The simple-hash's --let's call it a digest-- function is to increase
> the entropy/symbol by *reducing* the number of symbols while
> preserving *total* entropy.

Total entropy is preserved in the non-saturated regime.  This is
documented in the upper rows of the table:
  http://www.monmouth.com/~jsd/turbid/paper/turbid.htm#tab-saturation

In the saturated regime, some entropy is necessarily lost.  This is
documented in the lower rows of the table.  It is only a small
percentage, but it is mandatory.  I don't consider this to be "wasted"
entropy; I consider it entropy well spent.  That is, these are
necessary hash collisions, as opposed to unnecessary ones.

> A function like xor(bit N, bit N+1) which halves the number of bits
> can do this.  While its output is whiter than its input, its output
> will not be perfectly white unless its input was.

See the discussion of BADHASH-1 in the paper.

> Two scenarios where SHA is contraindicated came up:
>  1. hardware
>  2. interrupt handlers
>
> Sometimes a hardware RNG will feed raw data into a crypto-WEAK
> analogue of SHA, ie an LFSR which does the bit-reduction and mixing
> functions, but doesn't mix as well as SHA.  (LFSRs are
> hardware-friendly.)  Separating the bit-reduction from the mixing
> can be useful for analyzing what's really going on.

Well, I tend to agree that systems that separate the bit-reduction
from the mixing are easier to analyze, in the sense that it is easier
to find flaws in them.  But that's because they're so flawed!

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
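[Aside: the xor(bit N, bit N+1) point is easy to check numerically.  The
following is my own minimal Python sketch, not anything from the paper:
it feeds a biased bit stream (each bit is 1 with probability 0.3, an
assumed value for illustration) through adjacent-pair XOR folding.  By
the piling-up lemma the output ones-density is 2p(1-p) = 0.42 -- whiter
than the input, but not perfectly white unless the input was.]

```python
import random

def xor_fold(bits):
    """Halve the bit count: out[i] = bits[2*i] ^ bits[2*i + 1]."""
    return [bits[i] ^ bits[i + 1] for i in range(0, len(bits) - 1, 2)]

random.seed(1)
p = 0.3  # assumed bias of the raw source, for illustration only
raw = [1 if random.random() < p else 0 for _ in range(200_000)]
folded = xor_fold(raw)

bias_in = sum(raw) / len(raw)         # close to 0.30
bias_out = sum(folded) / len(folded)  # close to 2*p*(1-p) = 0.42:
                                      # whiter than the input, still not 0.5
```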

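[Aside: the bit-doubling example can be made concrete the same way.
Another sketch of my own, assuming a fair-coin source and Shannon
entropy measured in bits: the map 0->00, 1->11 is injective, so total
entropy is preserved, but it is spread over twice as many symbols, so
entropy/symbol is halved -- and the output alphabet never contains the
blocks 01 or 10, so it is visibly non-uniform.]

```python
import math

def binary_entropy(p):
    """Shannon entropy (bits) of a {0,1} symbol that is 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def double_bits(bits):
    """Honig's example map: 0 -> 00, 1 -> 11 (injective, so no entropy lost)."""
    return [b for bit in bits for b in (bit, bit)]

per_symbol_in = binary_entropy(0.5)   # fair source: 1.0 bit per symbol
# Doubling is invertible, so total entropy is unchanged, but the output
# has twice as many symbols, so the entropy density is halved:
per_symbol_out = per_symbol_in / 2    # 0.5 bit per symbol
# Non-uniformity: only the blocks 00 and 11 can ever appear.
blocks = {tuple(double_bits([x])) for x in (0, 1)}
```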