On Sat, Mar 25, 2006 at 07:26:51PM -0500, John Denker wrote:
> Executive summary: Small samples do not always exhibit "average" behavior.
That's not the whole problem - you have to be looking at the right
"average" too.
For the long run encodability of a set of IID symbols produced with
probabil
In the context of
>> 0 occurs with probability 1/2
>> each other number from 1 to 2^{160}+1 happens with
>> probability 2^{-161}.
I wrote:
> This ... serves to illustrate, in an exaggerated way, the necessity
> of not assuming that the raw data words are IID (independent and identically
> distr
>From: John Denker <[EMAIL PROTECTED]>
>Sent: Mar 24, 2006 11:57 AM
>To: Erik Zenner <[EMAIL PROTECTED]>, cryptography@metzdowd.com
>Subject: Re: Entropy Definition (was Re: passphrases with more than 160 bits
>of entropy)
>Erik Zenner wrote:
>
>>>0
>From: Erik Zenner <[EMAIL PROTECTED]>
>Sent: Mar 24, 2006 4:14 AM
>To: cryptography@metzdowd.com
>Subject: RE: Entropy Definition (was Re: passphrases with more than 160 bits
>of entropy)
...
>> [I wrote:]
>> 0 occurs with probability 1/2
>> each other
Ed Gerck wrote:
In Physics, Thermodynamics, entropy is a potential [1].
That's true in classical (19th-century) thermodynamics, but not
true in modern physics, including statistical mechanics. The
existence of superconductors and superfluids removes all doubt
about the absolute zero of entrop
Someone mentioned Physics in this discussion and this
was for me a motivation to point out something that
has been forgotten by Shannon, Kolmogorov, Chaitin
and in this thread.
Even though Shannon's data entropy formula looks like an
absolute measure (there is no reference included), the often
co
Erik Zenner wrote:
0 occurs with probability 1/2
each other number from 1 to 2^{160}+1 happens with
probability 2^{-161}.
Is anyone aware of whether (and where) this was
discussed in the literature, or what other approaches are taken?
This particular problem is contrived or at least exagg
> Shannon entropy is the one most people know, but it's all
> wrong for deciding how many samples you need to derive a key.
> The kind of classic illustration of this is the probability
> distirbution:
>
> 0 occurs with probability 1/2
> each other number from 1 to 2^{160}+1 happens with
> p
Hal Finney wrote:
...
This is true, in fact it is sometimes called the universal distribution
or universal measure. In more detail, it is a distribution over all
finite-length strings. The measure for a particular string X is defined
as the sum over all programs that output X of 1/2^L_i, where
This is getting pretty far afield from cryptography but it is a topic
I find very interesting so I can't resist jumping in.
John Denker writes:
> OK, in a moment we will have gone through four plies of no-it-isn't
> yes-it-is no-it-isn't yes-it-is. Let's get serious. The axiomatic
> definition o
I wrote:
>>With some slight fiddling to get the normalization right, 1/2
>>raised to the power of (program length) defines a probability
>>measure. This may not be "the" probability you want, but it
>>is "a" probability, and you can plug it into the entropy definition.
John Kelsey wrote:
No,
>From: John Denker <[EMAIL PROTECTED]>
>Sent: Mar 23, 2006 1:44 PM
>To: John Kelsey <[EMAIL PROTECTED]>, cryptography@metzdowd.com
>Subject: Re: Entropy Definition (was Re: passphrases with more than 160 bits
>of entropy)
...
>With some slight fiddling to get the no
At 22:09 -0500 2006/03/22, John Denker wrote:
Aram Perez wrote:
* Can you add or increase entropy?
Shuffling a deck of cards increases the entropy of the deck.
As a minor nit, shuffling *in an unpredictable manner* adds entropy,
because there is extra randomness being brought into the pro
John Kelsey wrote:
As an aside, this whole discussion is confused by the fact that there
are a bunch of different domains in which entropy is defined. The
algorithmic information theory sense of entropy (how long is the
shortest program that produces this sequence?) is miles away from the
infor
On Mar 22, 2006, at 20:11, John Denker wrote:
But if you apply thoughtfully to a single fixed sequence, you
correctly get the answer zero.
I agree with all that, except for the "But". Shannon well knew that
the entropy was zero in such a situation.
Sure. The "but" was to someone who th
>From: Jack Lloyd <[EMAIL PROTECTED]>
>Sent: Mar 22, 2006 11:30 PM
>To: cryptography@metzdowd.com
>Subject: Re: Entropy Definition (was Re: passphrases with more than 160 bits
>of entropy)
...
As an aside, this whole discussion is confused by the fact that there
are a bunch
Aram Perez <[EMAIL PROTECTED]> wrote:
> So, if you folks care to educate me, I have several questions related
> to entropy and information security (apologies to any physicists):
>
I'll answer the easier questions. I'll leave the harder ones for someone
with a better grounding in information theor
On Wed, Mar 22, 2006 at 03:29:07PM -0800, Aram Perez wrote:
> * How do you measure entropy? I was under the (false) impression that
> Shannon gave a formula that measured the entropy of a message (or
> information stream).
He did give a formula for the entropy of a source; however the caculat
Aram Perez wrote:
* How do you measure entropy? I was under the (false) impression that
Shannon gave a formula that measured the entropy of a message (or
information stream).
Entropy is defined in terms of probability. It is a measure of
how much you don't know about the situation. If by
Matt Crawford wrote:
I so often get irritated when non-physicists discuss entropy. The word
is almost always misused.
Yes, the term "entropy" is often misused ... and we have seen some
remarkably wacky misusage in this thread already. However, physicists
do not have a monopoly on correct u
On Mar 22, 2006, at 2:05 PM, Perry E. Metzger wrote:
Victor Duchovni <[EMAIL PROTECTED]> writes:
Actually calculating the entropy for real-world functions and
generators
may be intractable...
It is, in fact, generally intractable.
1) Kolmogorov-Chaitin entropy is just plain intractable --
Victor Duchovni <[EMAIL PROTECTED]> writes:
> Actually calculating the entropy for real-world functions and generators
> may be intractable...
It is, in fact, generally intractable.
1) Kolmogorov-Chaitin entropy is just plain intractable -- finding the
smallest possible Turing machine to gene
On Wed, Mar 22, 2006 at 01:58:26PM -0600, Matt Crawford wrote:
> If you have a generator of 8-bit random numbers and every sample is
> independent and uniformly distributed, and you ran this for a
> gazillion iterations and wrote to the list one day saying the special
> sequence { 0, 1, 2, .
Let me rephrase my sequence. Create a sequence of 256 consecutive
bytes, with the first byte having the value of 0, the second byte
the value of 1, ... and the last byte the value of 255. If you
measure the entropy (according to Shannon) of that sequence of 256
bytes, you have maximum entro
[EMAIL PROTECTED] writes:
> | Let me rephrase my sequence. Create a sequence of 256 consecutive
> | bytes, with the first byte having the value of 0, the second byte the
> | value of 1, ... and the last byte the value of 255. If you measure
> | the entropy (according to Shannon) of that sequ
Aram Perez <[EMAIL PROTECTED]> writes:
> On Mar 22, 2006, at 9:04 AM, Perry E. Metzger wrote:
>
>>
>> Aram Perez <[EMAIL PROTECTED]> writes:
Entropy is a highly discussed unit of measure.
>>>
>>> And very often confused.
>>
>> Apparently.
>>
>>> While you do want maximum entropy, maximum
>>>
| Let me rephrase my sequence. Create a sequence of 256 consecutive
| bytes, with the first byte having the value of 0, the second byte the
| value of 1, ... and the last byte the value of 255. If you measure
| the entropy (according to Shannon) of that sequence of 256 bytes, you
| have max
On Mar 22, 2006, at 9:04 AM, Perry E. Metzger wrote:
Aram Perez <[EMAIL PROTECTED]> writes:
Entropy is a highly discussed unit of measure.
And very often confused.
Apparently.
While you do want maximum entropy, maximum
entropy is not sufficient. The sequence of the consecutive numbers 0
Aram Perez <[EMAIL PROTECTED]> writes:
>> Entropy is a highly discussed unit of measure.
>
> And very often confused.
Apparently.
> While you do want maximum entropy, maximum
> entropy is not sufficient. The sequence of the consecutive numbers 0
> - 255 have maximum entropy but have no randomnes
On Mar 22, 2006, at 4:28 AM, Thierry Moreau wrote:
Travis H. wrote:
Hi,
Does anyone have a good idea on how to OWF passphrases without
reducing them to lower entropy counts? That is, I've seen systems
which hash the passphrase then use a PRF to expand the result --- I
don't want to do that. I
> BTW, with respect to entropy reduction is there any explanation why
> PBKDFs from PKCS5 hash
>
> password || seed || counter
>
> instead of
>
> counter || seed || password
>
> and thus reduce all the entropy of the password to the size of the
> internal state.
In theory it's more efficient
On Tue, 21 Mar 2006, Travis H. wrote:
> Does anyone have a good idea on how to OWF passphrases without
> reducing them to lower entropy counts? That is, I've seen systems
> which hash the passphrase then use a PRF to expand the result --- I
> don't want to do that. I want to have more than 160 bi
Travis H. wrote:
Hi,
Does anyone have a good idea on how to OWF passphrases without
reducing them to lower entropy counts? That is, I've seen systems
which hash the passphrase then use a PRF to expand the result --- I
don't want to do that. I want to have more than 160 bits of entropy
involv
> Does anyone have a good idea on how to OWF passphrases without
> reducing them to lower entropy counts? That is, I've seen systems
> which hash the passphrase then use a PRF to expand the result --- I
> don't want to do that. I want to have more than 160 bits of entropy
> involved.
What kind o
- Original Message -
From: "Travis H." <[EMAIL PROTECTED]>
Subject: passphrases with more than 160 bits of entropy
I was thinking that one could hash the first block, copy the
intermediate state, finalize it, then continue the intermediate result
with the next block, and finalize that.
On 3/21/06, Travis H. <[EMAIL PROTECTED]> wrote:
> Does anyone have a good idea on how to OWF passphrases without
> reducing them to lower entropy counts?
I've frequently seen constructs of this type:
H(P), H(0 || P), H(0 || 0 || P), ...
If entropy(P) > entropy(H), the entries will be independen
36 matches
Mail list logo