# Re: entropy depletion

```
| >
| > .... random number generator this way. ....  Just what *is*
| > good enough?
|
| That's a good question.  I think there is a good answer.  It
| sheds light on the distinction of pseudorandomness versus
| entropy:
|
|   A long string produced by a good PRNG is conditionally
|   compressible in the sense that we know there exists a shorter
|   representation, but at the same time we believe it to be
|   conditionally incompressible in the sense that the adversaries
|   have no feasible way of finding a shorter representation.
|
| In contrast,
|
|   A long string produced by a HESG is unconditionally, absolutely
|   incompressible. There does not exist a shorter representation.
|   There cannot possibly exist a shorter representation.
```

You're using Kolmogorov/Chaitin complexity here, but unfortunately that's a
bit hard to apply.  *Any* string has a shorter representation - if I get to
specify the model *after* you choose the string.  K/C complexity is robust
when you talk about sets of possible strings, because playing around with
the machine can only shorten a fixed number of them.  Turning that into a
notion useful for cryptography is a bit tougher - and in any case, if you
want to get at PRNG's, you need to get at K/C complexity "with trapdoor
information":  The bit stream may *look* incompressible, but given the
internal state, it is easily produced.
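
To make the "trapdoor information" point concrete, here is a minimal sketch (mine, not anything from the thread) of a counter-mode hash generator: the output stream has no *apparent* shorter representation, yet the 16-byte seed plus the algorithm regenerates all of it.

```python
import hashlib

def prng_stream(seed: bytes, n_blocks: int) -> bytes:
    """Counter-mode hash generator: a long output determined by a short seed."""
    blocks = []
    for i in range(n_blocks):
        blocks.append(hashlib.sha256(seed + i.to_bytes(8, "big")).digest())
    return b"".join(blocks)

# A 16-byte seed expands to 32 KB of output that *looks* incompressible...
seed = b"0123456789abcdef"
stream = prng_stream(seed, 1024)
assert len(stream) == 32768

# ...yet given the trapdoor (the internal state, here just the seed),
# the whole stream has a 16-byte "shorter representation":
assert prng_stream(seed, 1024) == stream
```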
More generally:  We're talking about "stretching" entropy here.  There are
certainly theoretical results in that direction, the one usually mentioned
being the BBS bit generator, which takes k bits of entropy and gives you
p(k) (for some polynomial p) bits that are polynomially-indistinguishable
from random bits.  But you (a) need some significant work to set this up and
prove it; (b) BBS generators are slow.
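
For reference, the BBS generator is repeated squaring modulo a Blum integer N = pq (p, q prime, both congruent to 3 mod 4), emitting the least-significant bit of each state. A toy sketch with deliberately insecure parameters - real use needs large random Blum primes:

```python
# Toy Blum-Blum-Shub generator. The default primes below are far too
# small to be secure; they only illustrate the mechanics.
def bbs_bits(seed: int, n: int, p: int = 499, q: int = 547) -> list:
    assert p % 4 == 3 and q % 4 == 3   # Blum primes
    N = p * q
    x = seed * seed % N                # square once so x is a quadratic residue
    bits = []
    for _ in range(n):
        x = x * x % N
        bits.append(x & 1)             # emit the least-significant bit
    return bits

bits = bbs_bits(seed=159201, n=16)
assert len(bits) == 16 and set(bits) <= {0, 1}
```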

A simpler approach that does work (in some sense) is:  Choose a random
starting value R, and a number k.  Compute SHA^i(R) for i from 0 to k.  Emit
these values *backwards*.  Then given the first k-1 outputs, an attacker
cannot determine the next one (under the standard hypotheses about SHA).
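
A sketch of that backwards hash chain, with SHA-256 standing in for "SHA":

```python
import hashlib

def sha(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def reversed_chain(R: bytes, k: int) -> list:
    """Compute SHA^i(R) for i = 0..k, then emit the values backwards."""
    chain = [R]
    for _ in range(k):
        chain.append(sha(chain[-1]))
    return chain[::-1]        # first output is SHA^k(R), last is R itself

outputs = reversed_chain(b"random starting value", k=10)

# Each emitted value is a preimage of the one before it: knowing the
# outputs so far tells you nothing about the next one unless you can
# invert SHA.
for earlier, later in zip(outputs, outputs[1:]):
    assert sha(later) == earlier
```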

Unfortunately, this is useless as, say, a key generator:  If you send me
the k-1'st output for use as a key, I can't determine what your *next* key
will be - but I can trivially read your preceding k-2 sessions.
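
That failure is easy to demonstrate: hash the revealed value forward, and every output emitted before it falls out. A sketch (SHA-256 standing in for "SHA", toy parameters):

```python
import hashlib

def sha(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

# Rebuild the backwards chain (hypothetical session keys).
chain = [b"random starting value"]
for _ in range(10):
    chain.append(sha(chain[-1]))
outputs = chain[::-1]                 # outputs[i] was used as session key i

# Suppose output number j is revealed (sent as a key). Hashing it
# forward recovers every *earlier* session key, so the preceding
# traffic is exposed.
j = 7
recovered = outputs[j]
for i in range(j - 1, -1, -1):
    recovered = sha(recovered)
    assert recovered == outputs[i]    # session key i, trivially recomputed
```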

The idea that revealing just the hash of random bits "doesn't reduce the
effective entropy" sounds great, but it's naive.  It's like the argument
that if H is a good hash function, then H(K || message) is a good MAC.  Not
quite.
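
The reason H(K || message) fails for Merkle-Damgard hashes like SHA-256 is length extension: from H(K || m), an attacker can compute a valid tag for m plus a chosen suffix without ever knowing K. HMAC's nested construction is the standard fix; a sketch of the distinction:

```python
import hashlib
import hmac

key, msg = b"secret key", b"pay 100 to Alice"

# The naive MAC: vulnerable to length extension for Merkle-Damgard
# hashes -- H(K || m || padding || suffix) is computable from H(K || m).
naive_mac = hashlib.sha256(key + msg).hexdigest()

# HMAC's nested construction, H((K ^ opad) || H((K ^ ipad) || m)),
# closes that hole.
real_mac = hmac.new(key, msg, hashlib.sha256).hexdigest()

assert naive_mac != real_mac
assert hmac.compare_digest(real_mac,
                           hmac.new(key, msg, hashlib.sha256).hexdigest())
```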
-- Jerry

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]