Von Neumann counseled Shannon to call it entropy because "no one
really knows what entropy is". ;-)

I wanted to say that it's inherently problematic to "gather" "entropy"
from things like the "randomness" in the interarrival times of interrupts
and similar events -- Ted touched on this with his comment about
solid-state disk drives, etc.
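The worry can be made concrete with a toy estimate (a sketch only -- the sample deltas are invented, and real kernels use far more careful estimators): a naive Shannon-entropy count over observed interarrival deltas shows that a low-jitter source, such as an SSD's very regular response times, yields far less entropy per event than its raw bit width suggests.

```python
import math
from collections import Counter

def shannon_entropy_per_sample(samples):
    """Naive Shannon entropy estimate (bits per sample) from observed frequencies."""
    counts = Counter(samples)
    total = len(samples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical timing deltas: a mechanical disk jitters a lot,
# a solid-state drive barely does.
jittery_deltas = [8, 13, 9, 21, 7, 15, 11, 19, 10, 14, 12, 18, 8, 16, 9, 20]
flat_deltas    = [10, 10, 10, 11, 10, 10, 10, 11, 10, 10, 10, 11, 10, 10, 10, 11]

print(shannon_entropy_per_sample(jittery_deltas))  # several bits per event
print(shannon_entropy_per_sample(flat_deltas))     # well under 1 bit per event
```

A frequency count like this actually overestimates usable entropy (an attacker may model the timing distribution), which only strengthens the point.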

A bit stream may appear to have 1 bit of entropy per bit of message
(i.e. an apparent entropy rate of 1), and therefore be incompressible --
perhaps what Schwartz thinks he means when he says "truly random" --
and yet be entirely predictable.
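A seeded PRNG illustrates the gap: its output looks statistically flat and resists compression, yet anyone holding the seed reproduces the stream bit for bit (a minimal sketch using Python's Mersenne Twister, which is explicitly not cryptographic).

```python
import random

# Two Mersenne Twister instances with the same seed emit identical streams:
# the output passes casual statistical tests, yet an observer who knows the
# seed -- or recovers the internal state -- predicts every single bit.
a = random.Random(1234)
b = random.Random(1234)

stream = [a.getrandbits(1) for _ in range(64)]
prediction = [b.getrandbits(1) for _ in range(64)]

print(stream == prediction)  # True: "random-looking" yet entirely predictable
```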

We want nicely-distributed "random" numbers without apparent bias,
with an apparent entropy of one, so that 10 bits of key material
is really 10 bits of key; what makes such numbers cryptographically
useful is that no amount of collected material enables us to
predict bits anywhere else in the stream (past or future, if viewed
temporally).
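In practice that means drawing key material from the OS's CSPRNG rather than any seeded generator -- a sketch using Python's `secrets` module, which wraps the platform's cryptographic randomness source:

```python
import secrets

# Draw key material from the OS CSPRNG. Unlike the seeded-PRNG case above,
# observing any amount of its output gives no handle for predicting bits
# elsewhere in the stream, past or future.
key_bits = 10
key = secrets.randbits(key_bits)  # 10 bits of key that really carry 10 bits

print(format(key, f"0{key_bits}b"))
```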

I have given Herr Schwartz enough of my grandmotherly kindness
for one day.  Ppppt.

- M

______________________________________________________________________
OpenSSL Project                                 http://www.openssl.org
User Support Mailing List                    openssl-users@openssl.org
Automated List Manager                           [EMAIL PROTECTED]
