Enzo Michelangeli wrote:
> This "entropy depletion" issue keeps coming up every now and then, but I
> still don't understand how it is supposed to happen.
Then you're not paying attention.
> If the PRNG uses a
> really non-invertible algorithm (or one invertible only with intractable
> complexity), its output gives no insight whatsoever on its internal state.
That is an invalid argument. The output is not the only source of insight
into the internal state. As discussed at
attacks against PRNGs can be classified as follows:
1. Improper seeding, i.e. internal state never properly initialized.
2. Leakage of the internal state over time. This rarely involves
direct cryptanalytic attack on the one-way function, leading to
leakage through the PRNG's output channel. More commonly it
involves side channels.
3. Improper stretching of limited entropy supplies, i.e. improper
reseeding of the PRNG, and other re-use of things that ought not
be re-used.
4. Bad side effects.
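As a toy illustration of class 1 above (every name and number here is
hypothetical, not taken from any real system): if the seed comes from a
low-entropy source such as a one-second-resolution clock, an attacker can
simply brute-force the seed space, no matter how strong the generator's
output function is.

```python
import random

def weak_token(seed):
    # The generator itself may be cryptographically fine; the problem
    # is that the seed is drawn from a tiny, guessable space.
    return random.Random(seed).getrandbits(64)

# Victim seeds from a clock known to lie within a one-hour window
# (hypothetical timestamp, chosen for illustration only):
victim_seed = 1_073_000_000 + 1234
token = weak_token(victim_seed)

# Attacker tries all 3600 candidate seeds and recovers the state:
recovered = next(s for s in range(1_073_000_000, 1_073_003_600)
                 if weak_token(s) == token)
assert recovered == victim_seed
```

The point is that the entropy of the whole construction is bounded by the
entropy of the seed, here about 12 bits, not by the width of the output.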
There is a long record of successful attacks against PRNGs (op cit.).
I'm not saying that the problems cannot be overcome, but the cost and bother
of overcoming them may be such that you decide it's easier (and better!) to
implement an industrial-strength high-entropy symbol generator.
> As entropy is a measure of the information we don't have about the
> internal state of a system,
That is the correct definition of entropy ... but it must be correctly
interpreted and correctly applied; see below.
> it seems to me that in a good PRNGD its value
> cannot be reduced just by extracting output bits. If there is an entropy
> estimator based on the number of bits extracted, that estimator must be
> flawed.
You're letting your intuition about "usable randomness" run roughshod over
the formal definition of entropy. Taking bits out of the PRNG *does*
reduce its entropy. This may not (and in many applications does not)
reduce its ability to produce useful randomness.
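To see this with the formal definition: the observer's entropy about the
internal state is the log of the number of states consistent with what has
been observed. A toy sketch (12-bit state and a made-up hash-based output
function, chosen only so brute force is cheap) makes the bookkeeping
explicit:

```python
import hashlib
from math import log2

STATE_BITS = 12  # toy-sized hidden state so enumeration is tractable

def out_bit(state, i):
    # i-th output bit: a one-way function of the hidden state.
    # (A stand-in for a hash-based PRNG output; not any real design.)
    data = state.to_bytes(2, "big") + i.to_bytes(2, "big")
    return hashlib.sha256(data).digest()[0] & 1

secret = 0x3A7                              # the PRNG's hidden state
candidates = set(range(2 ** STATE_BITS))    # observer's uncertainty: 12 bits

for i in range(8):
    bit = out_bit(secret, i)                # observer sees one output bit
    # Keep only the states consistent with everything observed so far.
    candidates = {s for s in candidates if out_bit(s, i) == bit}

# Each observed bit cuts the candidate set roughly in half, so the
# observer's entropy about the state falls by about one bit per output
# bit, even though the one-way function is never inverted.
remaining_entropy = log2(len(candidates))
```

None of this lets the observer predict future outputs in practice, which
is exactly the distinction between entropy and "usable randomness".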
The Cryptography Mailing List