So the funny thing about, say, SHA-1 is that if you give it less than 160 bits of data, it expands the input into 160 bits of output, and if you give it more than 160 bits of data, it contracts the input into 160 bits of output. This works, of course, for any input data, entropic or not. Hash saturation? Isn't every modern hash saturated with as much entropy as it can assume came from the input data (i.e., flipping any input bit changes each output bit with 50% probability)?
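A minimal sketch of both halves of that claim, using Python's hashlib (the ~80-bit figure is the expected avalanche behavior on average, not a guarantee for any particular pair of inputs):

```python
import hashlib

# Less than 160 bits in, or more than 160 bits in: always 160 bits out.
assert len(hashlib.sha1(b"x").digest()) == 20           # 8 bits in
assert len(hashlib.sha1(b"x" * 10_000).digest()) == 20  # 80,000 bits in

# Saturation/avalanche: flip one input bit ('e' -> 'd' differ in one bit)
# and count how many of the 160 output bits change. About half (~80) do.
a = hashlib.sha1(b"example input").digest()
b = hashlib.sha1(b"dxample input").digest()
diff = sum(bin(x ^ y).count("1") for x, y in zip(a, b))
print(diff, "of 160 output bits changed")
```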
Incidentally, that's a more than mild assumption that it's pure noise coming off the sound card. It's not, necessarily -- not even at the high frequencies. Consider for a moment the Sound Blaster Live's E10K chip, internally hard-clocked to 48 kHz. This chip uses a fairly simple algorithm to upsample or downsample all audio streams to 48,000 samples per second. It's well known that scaling algorithms exhibit noticeable properties -- this fact has been used to detect photoshopped works, for instance. Take a look at how noise centered around 15 kHz gets represented in a 48 kHz averaged domain. Would your system detect this fault? Of course not. No extant system can yet detect the difference between a quantum entropy generator and an AES or 3DES stream. (RC4's another story.) You can't externally calculate entropy levels; you can only assume.

--Dan

John Denker wrote:
> On 07/01/05 13:08, Charles M. Hannum wrote:
>
>> Most implementations of /dev/random (or so-called "entropy gathering
>> daemons") rely on disk I/O timings as a primary source of randomness.
>
>> ... I believe it is readily apparent that such exploits could be
>> written.
>
> So don't do it that way.
>
> Vastly better methods are available:
> http://www.av8n.com/turbid/
>
> ABSTRACT: We discuss the principles of a High-Entropy Symbol Generator
> (also called a Random Number Generator) that is suitable for a wide
> range of applications, including cryptography and other high-stakes
> adversarial applications. It harvests entropy from physical processes,
> and uses that entropy efficiently. The hash saturation principle is used
> to distill the data, resulting in virtually 100% entropy density. This
> is calculated, *not* statistically estimated, and is provably correct
> under mild assumptions. In contrast to a Pseudo-Random Number Generator,
> it has no internal state to worry about, and does not depend on
> unprovable assumptions about ``one-way functions''.
> We also describe a
> low-cost high-performance implementation, using the computer's audio I/O
> system.

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
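P.S. The resampling point above is easy to demonstrate. Here is a minimal sketch using linear interpolation as a stand-in for the resampler (an assumption -- the E10K's actual algorithm is more elaborate, so take this as illustrating the general class of scaling algorithm, not that chip): a naive upsampler leaves a statistical fingerprint on even perfectly white noise, because the samples it manufactures have half the variance of the real ones.

```python
import random
import statistics

random.seed(0)  # fixed seed so the sketch is reproducible
noise = [random.gauss(0.0, 1.0) for _ in range(100_000)]  # ideal white noise

# Upsample 2x by linear interpolation: keep the originals and manufacture
# a midpoint between each adjacent pair of samples.
mid = [(x + y) / 2.0 for x, y in zip(noise, noise[1:])]

v_noise = statistics.variance(noise)  # ~1.0 for the real samples
v_mid = statistics.variance(mid)      # ~0.5 for the invented samples
print(v_noise, v_mid)
```

So a downstream observer who knows the output rate can tell which samples the resampler invented -- exactly the kind of structure an entropy estimator that treats the stream as raw noise will silently miscount.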
