On Sat, Mar 25, 2006 at 07:26:51PM -0500, John Denker wrote:
Executive summary: Small samples do not always exhibit average behavior.
That's not the whole problem - you have to be looking at the right
average too.
For the long run encodability of a set of IID symbols produced with
probability
From: John Denker [EMAIL PROTECTED]
Sent: Mar 24, 2006 11:57 AM
To: Erik Zenner [EMAIL PROTECTED], cryptography@metzdowd.com
Subject: Re: Entropy Definition (was Re: passphrases with more than 160 bits
of entropy)
Erik Zenner wrote:
0 occurs with probability 1/2
each other number from 1 to 2
In the context of
0 occurs with probability 1/2
each other number from 1 to 2^{160}+1 happens with
probability 2^{-161}.
I wrote:
This ... serves to illustrate, in an exaggerated way, the necessity
of not assuming that the raw data words are IID (independent and identically
distributed).
From: Erik Zenner [EMAIL PROTECTED]
Sent: Mar 24, 2006 4:14 AM
To: cryptography@metzdowd.com
Subject: RE: Entropy Definition (was Re: passphrases with more than 160 bits
of entropy)
...
[I wrote:]
0 occurs with probability 1/2
each other number from 1 to 2^{160}+1 happens with
probability 2
Hal Finney wrote:
...
This is true, in fact it is sometimes called the universal distribution
or universal measure. In more detail, it is a distribution over all
finite-length strings. The measure for a particular string X is defined
as the sum over all programs that output X of 1/2^L_i, where
Shannon entropy is the one most people know, but it's all
wrong for deciding how many samples you need to derive a key.
The kind of classic illustration of this is the probability
distirbution:
0 occurs with probability 1/2
each other number from 1 to 2^{160}+1 happens with
Someone mentioned Physics in this discussion and this
was for me a motivation to point out something that
has been forgotten by Shannon, Kolmogorov, Chaitin
and in this thread.
Even though Shannon's data entropy formula looks like an
absolute measure (there is no reference included), the often
Ed Gerck wrote:
In Physics, Thermodynamics, entropy is a potential [1].
That's true in classical (19th-century) thermodynamics, but not
true in modern physics, including statistical mechanics. The
existence of superconductors and superfluids removes all doubt
about the absolute zero of
considerable entropy density.
* (Apologies to the original poster) When the original poster requested
passphrases with more than 160 bits of entropy, what was he requesting?
When you apply a mathematical function to an ensemble of inputs, it
is common to find that the ensemble of outputs has less
On Wed, Mar 22, 2006 at 03:29:07PM -0800, Aram Perez wrote:
* How do you measure entropy? I was under the (false) impression that
Shannon gave a formula that measured the entropy of a message (or
information stream).
He did give a formula for the entropy of a source; however the
Aram Perez [EMAIL PROTECTED] wrote:
So, if you folks care to educate me, I have several questions related
to entropy and information security (apologies to any physicists):
I'll answer the easier questions. I'll leave the harder ones for someone
with a better grounding in information theory.
From: Jack Lloyd [EMAIL PROTECTED]
Sent: Mar 22, 2006 11:30 PM
To: cryptography@metzdowd.com
Subject: Re: Entropy Definition (was Re: passphrases with more than 160 bits
of entropy)
...
As an aside, this whole discussion is confused by the fact that there
are a bunch of different domains
John Kelsey wrote:
As an aside, this whole discussion is confused by the fact that there
are a bunch of different domains in which entropy is defined. The
algorithmic information theory sense of entropy (how long is the
shortest program that produces this sequence?) is miles away from the
At 22:09 -0500 2006/03/22, John Denker wrote:
Aram Perez wrote:
* Can you add or increase entropy?
Shuffling a deck of cards increases the entropy of the deck.
As a minor nit, shuffling *in an unpredictable manner* adds entropy,
because there is extra randomness being brought into the
From: John Denker [EMAIL PROTECTED]
Sent: Mar 23, 2006 1:44 PM
To: John Kelsey [EMAIL PROTECTED], cryptography@metzdowd.com
Subject: Re: Entropy Definition (was Re: passphrases with more than 160 bits
of entropy)
...
With some slight fiddling to get the normalization right, 1/2
raised
I wrote:
With some slight fiddling to get the normalization right, 1/2
raised to the power of (program length) defines a probability
measure. This may not be the probability you want, but it
is a probability, and you can plug it into the entropy definition.
John Kelsey wrote:
No, this isn't
This is getting pretty far afield from cryptography but it is a topic
I find very interesting so I can't resist jumping in.
John Denker writes:
OK, in a moment we will have gone through four plies of no-it-isn't
yes-it-is no-it-isn't yes-it-is. Let's get serious. The axiomatic
definition of
On 3/21/06, Travis H. [EMAIL PROTECTED] wrote:
Does anyone have a good idea on how to OWF passphrases without
reducing them to lower entropy counts?
I've frequently seen constructs of this type:
H(P), H(0 || P), H(0 || 0 || P), ...
If entropy(P) entropy(H), the entries will be independent,
- Original Message -
From: Travis H. [EMAIL PROTECTED]
Subject: passphrases with more than 160 bits of entropy
I was thinking that one could hash the first block, copy the
intermediate state, finalize it, then continue the intermediate result
with the next block, and finalize
Does anyone have a good idea on how to OWF passphrases without
reducing them to lower entropy counts? That is, I've seen systems
which hash the passphrase then use a PRF to expand the result --- I
don't want to do that. I want to have more than 160 bits of entropy
involved.
What kind of
On Mar 22, 2006, at 4:28 AM, Thierry Moreau wrote:
Travis H. wrote:
Hi,
Does anyone have a good idea on how to OWF passphrases without
reducing them to lower entropy counts? That is, I've seen systems
which hash the passphrase then use a PRF to expand the result --- I
don't want to do that.
On Mar 22, 2006, at 9:04 AM, Perry E. Metzger wrote:
Aram Perez [EMAIL PROTECTED] writes:
Entropy is a highly discussed unit of measure.
And very often confused.
Apparently.
While you do want maximum entropy, maximum
entropy is not sufficient. The sequence of the consecutive numbers 0
-
| Let me rephrase my sequence. Create a sequence of 256 consecutive
| bytes, with the first byte having the value of 0, the second byte the
| value of 1, ... and the last byte the value of 255. If you measure
| the entropy (according to Shannon) of that sequence of 256 bytes, you
| have
Aram Perez [EMAIL PROTECTED] writes:
On Mar 22, 2006, at 9:04 AM, Perry E. Metzger wrote:
Aram Perez [EMAIL PROTECTED] writes:
Entropy is a highly discussed unit of measure.
And very often confused.
Apparently.
While you do want maximum entropy, maximum
entropy is not sufficient. The
[EMAIL PROTECTED] writes:
| Let me rephrase my sequence. Create a sequence of 256 consecutive
| bytes, with the first byte having the value of 0, the second byte the
| value of 1, ... and the last byte the value of 255. If you measure
| the entropy (according to Shannon) of that
Let me rephrase my sequence. Create a sequence of 256 consecutive
bytes, with the first byte having the value of 0, the second byte
the value of 1, ... and the last byte the value of 255. If you
measure the entropy (according to Shannon) of that sequence of 256
bytes, you have maximum
Victor Duchovni [EMAIL PROTECTED] writes:
Actually calculating the entropy for real-world functions and generators
may be intractable...
It is, in fact, generally intractable.
1) Kolmogorov-Chaitin entropy is just plain intractable -- finding the
smallest possible Turing machine to
requested passphrases with more than 160 bits of entropy, what was
he requesting?
* Does processing an 8 character password with a process similar to
PKCS#5 increase the entropy of the password?
* Can you add or increase entropy?
Thanks in advance,
Aram Perez
Matt Crawford wrote:
I so often get irritated when non-physicists discuss entropy. The word
is almost always misused.
Yes, the term entropy is often misused ... and we have seen some
remarkably wacky misusage in this thread already. However, physicists
do not have a monopoly on correct
Hi,
Does anyone have a good idea on how to OWF passphrases without
reducing them to lower entropy counts? That is, I've seen systems
which hash the passphrase then use a PRF to expand the result --- I
don't want to do that. I want to have more than 160 bits of entropy
involved.
I was thinking
30 matches
Mail list logo