Von Neumann counseled Shannon to call it entropy because no one
really knows what entropy is. ;-)
I wanted to say that it is inherently problematic to gather entropy
from things like the randomness in the interarrival times of events
such as interrupts -- Ted has touched on this with his
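To make the interarrival-time idea concrete, here is a minimal user-space sketch. It is an illustration only, not anything from this thread: it samples jitter between successive timer reads as a stand-in for interrupt timings (the function name, sample count, and mask are all my own choices), and it shows why the raw samples cannot be used directly -- they are biased and correlated, so they get whitened with a hash, and even then the true entropy content is hard to bound.

```python
import hashlib
import time

def sample_jitter(n_samples=1024):
    """Collect low-order bits of interarrival times between timer reads.

    A user-space stand-in for interrupt interarrival times; the approach
    and all names here are illustrative assumptions, not the poster's code.
    """
    deltas = []
    prev = time.perf_counter_ns()
    for _ in range(n_samples):
        now = time.perf_counter_ns()
        deltas.append((now - prev) & 0xFF)  # keep only the noisiest low bits
        prev = now
    # Raw deltas are biased and correlated -- that is exactly the problem
    # being discussed. Whiten them with a hash before use; note that
    # hashing does not magically add entropy that was never there.
    return hashlib.sha256(bytes(deltas)).digest()

seed = sample_jitter()
print(len(seed))  # 32 bytes of hashed output (not necessarily 256 bits of entropy!)
```

The hash output is always 32 bytes regardless of how much entropy the timing samples actually contained, which is the crux of the objection.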
Michael Sierchio wrote:
A bit stream may appear to carry 1 bit of entropy per bit of message
(i.e. an entropy rate of 1), and therefore be incompressible -- perhaps
what Schwartz thinks he means when he says truly random -- and yet be
entirely predictable.
In case this isn't obvious, apply Von Neumann's