Hal Finney wrote:
...
This is true; in fact, it is sometimes called the universal distribution
or universal measure. In more detail, it is a distribution over all
finite-length strings. The measure for a particular string X is defined
as the sum, over all programs that output X, of 1/2^{L_i}, where L_i is
the length of the i-th such program.
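A toy sketch of that summation, for concreteness. The miniature
"machine" below is invented purely for illustration (its instruction
set and halting rule are my own, not a real universal machine); only
the shape of the sum m(X) = sum over programs p of 2^(-|p|) matters.

    from collections import defaultdict
    from itertools import product

    def toy_machine(bits):
        # Invented toy interpreter: reads bit pairs, where
        # 00 emits '0', 01 emits '1', and 1x halts. A program is
        # valid only if it halts on exactly its last pair, which
        # makes the set of valid programs prefix-free (so by the
        # Kraft inequality the total measure is at most 1).
        out, i = [], 0
        while i + 1 < len(bits):
            pair, i = bits[i:i+2], i + 2
            if pair == '00':
                out.append('0')
            elif pair == '01':
                out.append('1')
            else:  # '10' or '11' = halt
                return ''.join(out) if i == len(bits) else None
        return None  # ran off the end without halting

    measure = defaultdict(float)
    for length in range(1, 13):  # enumerate all programs up to 12 bits
        for prog in product('01', repeat=length):
            x = toy_machine(''.join(prog))
            if x is not None:
                measure[x] += 2.0 ** -length

    # Shorter outputs carry exponentially more weight (the Occam bias):
    for x, m in sorted(measure.items(), key=lambda kv: -kv[1])[:5]:
        print(repr(x), m)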
On Thu, Mar 23, 2006 at 08:15:50PM -, Dave Korn wrote:
As we all know, when you pay with a credit or debit card at a store, it's
important to take the receipt with you
[..]
So what they've been doing at my local branch of Marks & Spencer for the
past few weeks is, at the end of the
Shannon entropy is the one most people know, but it's all
wrong for deciding how many samples you need to derive a key.
The kind of classic illustration of this is the probability
distribution:
0 occurs with probability 1/2
each other number from 1 to 2^{160}+1 happens with probability 2^{-161}
| Min-entropy of a probability distribution is
|
| -lg ( P[max] ),
|
| minus the base-two log of the maximum probability.
|
| The nice thing about min-entropy in the PRNG world is that it leads to
| a really clean relationship between how many bits of entropy we need
| to seed the PRNG, and
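To make the contrast concrete, here is a quick computation for the
distribution quoted above. The numbers follow directly from the two
definitions; note that the "2^{160}+1" in the post reads like an
off-by-one, since with exactly 2^160 equiprobable non-zero values the
probabilities sum to 1.

    from math import log2

    # Distribution from the post: 0 with probability 1/2, and 2^160
    # other values, each with probability 2^-161.
    p_zero = 0.5
    n_rest = 2 ** 160
    rest_mass = n_rest * 2.0 ** -161     # total mass of the rest = 0.5

    # Each rare value contributes -lg(p) = 161 bits exactly.
    shannon = p_zero * 1 + rest_mass * 161   # = 0.5 + 80.5 = 81 bits
    min_entropy = -log2(p_zero)              # = 1 bit

    print("Shannon entropy: %.1f bits" % shannon)   # looks like plenty
    print("min-entropy:     %.1f bit" % min_entropy)
    # An attacker who just guesses 0 succeeds half the time, so for
    # key derivation this source is worth 1 bit, not 81.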
Erik Zenner [EMAIL PROTECTED] writes:
Shannon entropy is the one most people know, but it's all
wrong for deciding how many samples you need to derive a key.
The kind of classic illustration of this is the probability
distribution:
0 occurs with probability 1/2
each other number from
Following Travis' message, let me first describe the main results of the
paper.
The paper provides a concise algorithmic description of the Linux random
number generator (LRNG), which is quite complex and is based on shift
registers and several SHA-1 operations. Identifying the
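For readers who want the flavor of that pool-and-extract construction,
here is a deliberately simplified sketch. The mixing step, pool size,
and output width are invented stand-ins, not the actual LRNG algorithm
the paper analyzes.

    import hashlib
    import time

    class ToyPool:
        """Invented toy, loosely in the spirit of the LRNG design:
        a byte pool that events are mixed into, with SHA-1 used both
        to extract output and to stir the pool state forward."""

        def __init__(self, size=64):
            self.pool = bytearray(size)
            self.pos = 0

        def mix(self, event: bytes):
            # The real LRNG mixes with a twisted LFSR over pool words;
            # a rotating XOR is a crude stand-in for illustration.
            for b in event:
                self.pool[self.pos] ^= b
                self.pos = (self.pos + 1) % len(self.pool)

        def extract(self) -> bytes:
            digest = hashlib.sha1(bytes(self.pool)).digest()
            self.mix(digest)       # feed the hash back so state advances
            return digest[:10]     # emit only part of the hash

    pool = ToyPool()
    pool.mix(time.time_ns().to_bytes(8, 'little'))  # e.g. a timing event
    print(pool.extract().hex())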
Someone mentioned Physics in this discussion, and that motivated
me to point out something that has been overlooked by Shannon,
Kolmogorov, and Chaitin, and in this thread as well.
Even though Shannon's data entropy formula looks like an
absolute measure (there is no reference included), the often
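The point can be made concrete with a small computation of my own (not
from the post): Shannon's H = -sum p_i lg p_i is always entropy with
respect to a model, and the same data scores differently under
different models.

    from collections import Counter, defaultdict
    from math import log2

    data = b"abababababababab"
    n = len(data)

    # Model 1: i.i.d. bytes, probabilities estimated from frequencies.
    freq = Counter(data)
    h_iid = -sum(c / n * log2(c / n) for c in freq.values())

    # Model 2: first-order Markov model, H(next byte | current byte).
    trans = defaultdict(Counter)
    for a, b in zip(data, data[1:]):
        trans[a][b] += 1
    total = n - 1
    h_markov = 0.0
    for a, nexts in trans.items():
        n_a = sum(nexts.values())
        h_markov += (n_a / total) * -sum(
            c / n_a * log2(c / n_a) for c in nexts.values())

    print(h_iid)     # 1.0 bit/byte under the i.i.d. model
    print(h_markov)  # 0.0 bits/byte: each byte is determined by the last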
J. Bruce Fields wrote:
On Thu, Mar 23, 2006 at 08:15:50PM -, Dave Korn wrote:
So what they've been doing at my local branch of Marks & Spencer
for the past few weeks is, at the end of the transaction after the
(now always chip'n'pin-based) card reader finishes authorizing your
| If all that information's printed on the outside of the card, then
| isn't this battle kind of lost the moment you hand the card to them?
|
| 1- I don't hand it to them. I put it in the chip-and-pin card reader
| myself. In any case, even if I hand it to a cashier, it is within my
| sight
On Fri, Mar 24, 2006 at 06:47:07PM -, Dave Korn wrote:
J. Bruce Fields wrote:
If all that information's printed on the outside of the card, then
isn't this battle kind of lost the moment you hand the card to them?
1- I don't hand it to them. I put it in the chip-and-pin card reader
Ed Gerck wrote:
In Physics, Thermodynamics, entropy is a potential [1].
That's true in classical (19th-century) thermodynamics, but not
true in modern physics, including statistical mechanics. The
existence of superconductors and superfluids removes all doubt
about the absolute zero of