On Sat, Mar 25, 2006 at 07:26:51PM -0500, John Denker wrote:
Executive summary: Small samples do not always exhibit average behavior.
That's not the whole problem - you have to be looking at the right
average too.
For the long-run encodability of a set of IID symbols produced with
probability ...
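As a minimal sketch of "the right average" (my own, not from the thread; it uses the distribution quoted below, but with 2^160 nonzero values so the probabilities sum to exactly 1): the long-run cost of encoding IID draws converges to the Shannon entropy, about 81 bits per symbol, which is the average that matters for encodability.

import random

random.seed(0)

def draw():
    # 0 with probability 1/2; otherwise a uniformly random nonzero 160-bit
    # value, each with probability 2^-161.
    if random.getrandbits(1):
        return 0
    return 1 + random.getrandbits(160)

def ideal_codelength(x):
    # -log2(probability of x): 1 bit for the value 0, 161 bits for the rest.
    return 1.0 if x == 0 else 161.0

n = 100000
avg = sum(ideal_codelength(draw()) for _ in range(n)) / n
shannon = 0.5 * 1 + 0.5 * 161   # expected ideal codelength = Shannon entropy = 81
print("average bits/symbol over %d draws: %.2f" % (n, avg))
print("Shannon entropy of the source:     %.2f" % shannon)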
From: John Denker [EMAIL PROTECTED]
Sent: Mar 24, 2006 11:57 AM
To: Erik Zenner [EMAIL PROTECTED], cryptography@metzdowd.com
Subject: Re: Entropy Definition (was Re: passphrases with more than 160 bits
of entropy)
Erik Zenner wrote:
In the context of ...
0 occurs with probability 1/2
each other number from 1 to 2^{160}+1 happens with
probability 2^{-161}.
I wrote:
This ... serves to illustrate, in an exaggerated way, the necessity
of not assuming that the raw data words are IID (independent and identically
distributed).
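A hypothetical illustration of that warning (my own construction, not Denker's): when the raw words are strongly correlated, an entropy estimate that treats them as IID can be wildly optimistic.

import math, random
from collections import Counter

# A made-up non-IID source: each raw word is the previous word plus one
# random bit, so consecutive words are nearly identical.
random.seed(1)
words = [0]
for _ in range(9999):
    words.append(words[-1] + random.getrandbits(1))

# Naive estimate: treat the low byte of each word as an IID sample, take the
# empirical per-sample Shannon entropy, and multiply by the number of samples.
low = [w & 0xFF for w in words]
n = len(low)
counts = Counter(low)
per_sample = -sum((c / n) * math.log2(c / n) for c in counts.values())
print("naive IID estimate: about %.0f bits" % (per_sample * n))

# But the entire run is determined by the 9999 step bits, so if those bits
# were the only genuinely random input it holds at most 9999 bits of entropy.
print("actual upper bound:       9999 bits")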
From: Erik Zenner [EMAIL PROTECTED]
Sent: Mar 24, 2006 4:14 AM
To: cryptography@metzdowd.com
Subject: RE: Entropy Definition (was Re: passphrases with more than 160 bits
of entropy)
...
[I wrote:]
0 occurs with probability 1/2
each other number from 1 to 2^{160}+1 happens with
probability 2^{-161}.
Hal Finney wrote:
...
This is true; in fact, it is sometimes called the universal distribution
or universal measure. In more detail, it is a distribution over all
finite-length strings. The measure for a particular string X is defined
as the sum, over all programs that output X, of 1/2^L_i, where L_i is
the length of the i-th program.
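In the standard textbook form (the notation U is mine, denoting a fixed universal prefix machine), the measure is

    m(X) = \sum_{i : U(P_i) = X} 2^{-L_i}

Because the valid programs form a prefix-free set, the Kraft inequality keeps the total mass at or below 1, so after dividing by that total m is a genuine, if uncomputable, probability distribution over finite strings.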
Shannon entropy is the one most people know, but it's all
wrong for deciding how many samples you need to derive a key.
The classic illustration of this is the probability distribution:
0 occurs with probability 1/2
each other number from 1 to 2^{160}+1 happens with
probability 2^{-161}.
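To put numbers on that, here is a minimal Python sketch (mine, using 2^160 nonzero values so the probabilities sum to exactly 1) computing both the Shannon entropy and the min-entropy of that distribution analytically:

import math

# 0 with probability 1/2; each of the 2^160 other values with probability
# 2^-161.  The support is far too large to enumerate, so compute analytically.
p_zero = 0.5
n_other = 2 ** 160

# Shannon entropy: -sum p*log2(p) = 0.5*1 + 2^160 * 2^-161 * 161
shannon = p_zero * 1 + n_other * 2.0 ** -161 * 161
# Min-entropy: -log2 of the single most likely outcome
min_entropy = -math.log2(p_zero)

print("Shannon entropy: %.1f bits" % shannon)      # about 81 bits
print("min-entropy:     %.1f bits" % min_entropy)  # 1 bit

Half the time the sample is simply 0, so an attacker guessing that value succeeds with probability 1/2; that is what the 1-bit min-entropy captures, and why the roughly 81-bit Shannon figure is the wrong number to size a key by.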
Someone mentioned physics in this discussion, which motivated me to
point out something that has been overlooked by Shannon, Kolmogorov,
and Chaitin, and so far in this thread as well.
Even though Shannon's data entropy formula looks like an
absolute measure (there is no reference included), the often ...
Ed Gerck wrote:
In physics (thermodynamics), entropy is a potential [1].
That's true in classical (19th-century) thermodynamics, but not
true in modern physics, including statistical mechanics. The
existence of superconductors and superfluids removes all doubt
about the absolute zero of entropy.
Aram Perez wrote:
* How do you measure entropy? I was under the (false) impression that
Shannon gave a formula that measured the entropy of a message (or
information stream).
Entropy is defined in terms of probability. It is a measure of
how much you don't know about the situation. If by ...
On Wed, Mar 22, 2006 at 03:29:07PM -0800, Aram Perez wrote:
* How do you measure entropy? I was under the (false) impression that
Shannon gave a formula that measured the entropy of a message (or
information stream).
He did give a formula for the entropy of a source; however, the ...
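For reference, the formula in question is the standard Shannon entropy of a source that emits symbol i with probability p_i,

    H = -\sum_i p_i \log_2 p_i

measured in bits per symbol; it is a property of the source's probability distribution, not of any particular message the source happens to emit.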
Aram Perez [EMAIL PROTECTED] wrote:
So, if you folks care to educate me, I have several questions related
to entropy and information security (apologies to any physicists):
I'll answer the easier questions. I'll leave the harder ones for someone
with a better grounding in information theory.
From: Jack Lloyd [EMAIL PROTECTED]
Sent: Mar 22, 2006 11:30 PM
To: cryptography@metzdowd.com
Subject: Re: Entropy Definition (was Re: passphrases with more than 160 bits
of entropy)
...
As an aside, this whole discussion is confused by the fact that there
are a bunch of different domains ...
John Kelsey wrote:
As an aside, this whole discussion is confused by the fact that there
are a bunch of different domains in which entropy is defined. The
algorithmic information theory sense of entropy (how long is the
shortest program that produces this sequence?) is miles away from the ...
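A small Python sketch of that gap (my own illustration): a few lines of deterministic code can emit a megabyte whose empirical per-byte entropy is essentially the 8-bit maximum, even though its algorithmic (Kolmogorov-Chaitin) complexity is roughly bounded by the size of the program that produced it.

import hashlib, math
from collections import Counter

# A tiny deterministic program generates 1 MiB of "random-looking" data, so
# the sequence's shortest-program description is about as short as this code.
data = b"".join(hashlib.sha256(i.to_bytes(8, "big")).digest()
                for i in range(32768))            # 32768 * 32 bytes = 1 MiB

# Yet its empirical per-byte Shannon entropy is very close to 8 bits.
n = len(data)
counts = Counter(data)
h = -sum((c / n) * math.log2(c / n) for c in counts.values())
print("%d bytes, empirical entropy %.4f bits/byte" % (n, h))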
At 22:09 -0500 2006/03/22, John Denker wrote:
Aram Perez wrote:
* Can you add or increase entropy?
Shuffling a deck of cards increases the entropy of the deck.
As a minor nit, shuffling *in an unpredictable manner* adds entropy,
because there is extra randomness being brought into the ...
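For scale (a standard calculation, not from the post): a 52-card deck has 52! possible orderings, so a deck about which an observer knows nothing carries log2(52!) bits, roughly 225.6. Unpredictable shuffles push the deck's entropy toward that ceiling, while a perfectly predictable shuffle is just a fixed permutation and adds none.

import math

# Maximum entropy of a 52-card deck: log2(52!), reached when every ordering
# is equally likely from the observer's point of view.
print("log2(52!) = %.1f bits" % math.log2(math.factorial(52)))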
From: John Denker [EMAIL PROTECTED]
Sent: Mar 23, 2006 1:44 PM
To: John Kelsey [EMAIL PROTECTED], cryptography@metzdowd.com
Subject: Re: Entropy Definition (was Re: passphrases with more than 160 bits
of entropy)
...
With some slight fiddling to get the normalization right, 1/2
raised ...
I wrote:
With some slight fiddling to get the normalization right, 1/2
raised to the power of (program length) defines a probability
measure. This may not be the probability you want, but it
is a probability, and you can plug it into the entropy definition.
John Kelsey wrote:
No, this isn't ...
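For what it's worth, the "slight fiddling" is presumably the Kraft inequality: if programs are written in a prefix-free encoding, then

    \sum_p 2^{-|p|} \le 1

where |p| is the length of program p in bits, so dividing each weight by that finite, nonzero total yields a genuine probability distribution over programs, which can then be plugged into H = -\sum q \log_2 q. Without the prefix-free condition the sum diverges (the 2^n strings of length n contribute 1 between them for every n), which is exactly where the fiddling is needed.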
This is getting pretty far afield from cryptography, but it is a topic
I find very interesting, so I can't resist jumping in.
John Denker writes:
OK, in a moment we will have gone through four plies of no-it-isn't
yes-it-is no-it-isn't yes-it-is. Let's get serious. The axiomatic
definition of ...
On Mar 22, 2006, at 2:05 PM, Perry E. Metzger wrote:
Victor Duchovni [EMAIL PROTECTED] writes:
Actually calculating the entropy for real-world functions and
generators
may be intractable...
It is, in fact, generally intractable.
1) Kolmogorov-Chaitin entropy is just plain intractable ...