Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-27 Thread David Malone
On Sat, Mar 25, 2006 at 07:26:51PM -0500, John Denker wrote: "Executive summary: Small samples do not always exhibit average behavior." That's not the whole problem - you have to be looking at the right average too. For the long-run encodability of a set of IID symbols produced with probability
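The average that governs long-run encodability is the expectation of -log2 p(x). Shannon's source coding theorem for an IID source states that the optimal expected code length L per symbol satisfies

\[
H(X) \;\le\; L \;<\; H(X) + 1, \qquad H(X) = -\sum_x p(x)\,\log_2 p(x),
\]

and per-symbol compressed length approaches H(X) as block length grows. The average relevant to guessing a single sample is a different one, the min-entropy -log2 max_x p(x), which is the crux of the disagreement in this thread.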

Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-26 Thread John Kelsey
From: John Denker [EMAIL PROTECTED] Sent: Mar 24, 2006 11:57 AM To: Erik Zenner [EMAIL PROTECTED], cryptography@metzdowd.com Subject: Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy) Erik Zenner wrote: 0 occurs with probability 1/2; each other number from 1 to 2

Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-26 Thread John Denker
In the context of "0 occurs with probability 1/2; each other number from 1 to 2^{160}+1 happens with probability 2^{-161}," I wrote: "This ... serves to illustrate, in an exaggerated way, the necessity of not assuming that the raw data words are IID (independent and identically distributed)."
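For reference, the arithmetic for that distribution (taking 2^{160} tail values so the probabilities sum to exactly 1; as quoted, with 2^{160}+1 values, they sum to 1 + 2^{-161}):

\[
H = \tfrac{1}{2}\log_2 2 \;+\; 2^{160}\cdot 2^{-161}\cdot 161 \;=\; \tfrac{1}{2} + \tfrac{161}{2} \;=\; 81 \text{ bits},
\qquad
H_\infty = -\log_2 \tfrac{1}{2} = 1 \text{ bit}.
\]

Shannon entropy says 81 bits; min-entropy says an attacker's first guess succeeds half the time.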

RE: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-25 Thread John Kelsey
From: Erik Zenner [EMAIL PROTECTED] Sent: Mar 24, 2006 4:14 AM To: cryptography@metzdowd.com Subject: RE: Entropy Definition (was Re: passphrases with more than 160 bits of entropy) ... [I wrote:] 0 occurs with probability 1/2; each other number from 1 to 2^{160}+1 happens with probability 2

Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-24 Thread John Denker
Hal Finney wrote: ... This is true; in fact, it is sometimes called the universal distribution or universal measure. In more detail, it is a distribution over all finite-length strings. The measure for a particular string X is defined as the sum, over all programs that output X, of 1/2^L_i, where
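Filling in the standard construction behind the truncated sentence (this is the usual Solomonoff-Levin definition, not something stated in the preview): for a prefix-free universal machine U,

\[
m(X) \;=\; \sum_{p \,:\, U(p)=X} 2^{-L(p)},
\]

where L(p) is the bit-length of program p (the L_i above). Kraft's inequality for prefix-free codes gives \(\sum_p 2^{-L(p)} \le 1\), so m is a semimeasure (it sums to less than 1 because some programs never halt) and can be normalized to a probability measure.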

RE: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-24 Thread Erik Zenner
Shannon entropy is the one most people know, but it's all wrong for deciding how many samples you need to derive a key. The classic illustration of this is the probability distribution: 0 occurs with probability 1/2; each other number from 1 to 2^{160}+1 happens with
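A minimal Python sketch of the comparison being made here, assuming 2^{160} tail values so the probabilities sum to exactly 1:

    from math import log2

    # Classic illustration: 0 with probability 1/2; each of 2^160 other
    # values with probability 2^-161.
    p_zero = 0.5
    n_tail = 2 ** 160
    p_tail = 2.0 ** -161

    shannon = -p_zero * log2(p_zero) - n_tail * p_tail * log2(p_tail)
    min_entropy = -log2(max(p_zero, p_tail))

    print(shannon)      # 81.0 bits -- looks like plenty for a key
    print(min_entropy)  # 1.0 bit   -- first guess wins half the time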

Re: Entropy Definition

2006-03-24 Thread Perry E. Metzger
Erik Zenner [EMAIL PROTECTED] writes: Shannon entropy is the one most people know, but it's all wrong for deciding how many samples you need to derive a key. The classic illustration of this is the probability distribution: 0 occurs with probability 1/2; each other number from

Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-24 Thread Ed Gerck
Someone mentioned Physics in this discussion, which motivated me to point out something that has been forgotten by Shannon, Kolmogorov, and Chaitin, and overlooked in this thread. Even though Shannon's data entropy formula looks like an absolute measure (there is no reference included), the often

Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-24 Thread John Denker
Ed Gerck wrote: "In Physics, Thermodynamics, entropy is a potential [1]." That's true in classical (19th-century) thermodynamics, but not true in modern physics, including statistical mechanics. The existence of superconductors and superfluids removes all doubt about the absolute zero of

Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-23 Thread John Denker
Aram Perez wrote: "* How do you measure entropy? I was under the (false) impression that Shannon gave a formula that measured the entropy of a message (or information stream)." Entropy is defined in terms of probability. It is a measure of how much you don't know about the situation. If by
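The definition being referred to: for a discrete random variable taking value x_i with probability p_i,

\[
H = -\sum_i p_i \log_2 p_i \ \text{bits},
\]

which is zero when one outcome is certain and maximal (log2 n for n equally likely outcomes) when you know nothing. It measures uncertainty about the source, not any property of a particular message.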

Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-23 Thread Jack Lloyd
On Wed, Mar 22, 2006 at 03:29:07PM -0800, Aram Perez wrote: "* How do you measure entropy? I was under the (false) impression that Shannon gave a formula that measured the entropy of a message (or information stream)." He did give a formula for the entropy of a source; however, the
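The source/message distinction can be made precise: for a stationary source, the per-symbol entropy rate is

\[
H \;=\; \lim_{n\to\infty} \frac{1}{n}\, H(X_1, \ldots, X_n),
\]

which collapses to \(-\sum_i p_i \log_2 p_i\) when the symbols are IID. A fixed message has no Shannon entropy of its own; the entropy belongs to the ensemble it was drawn from.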

Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-23 Thread Sandy Harris
Aram Perez [EMAIL PROTECTED] wrote: "So, if you folks care to educate me, I have several questions related to entropy and information security (apologies to any physicists):" I'll answer the easier questions. I'll leave the harder ones for someone with a better grounding in information theory.

Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-23 Thread John Kelsey
From: Jack Lloyd [EMAIL PROTECTED] Sent: Mar 22, 2006 11:30 PM To: cryptography@metzdowd.com Subject: Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy) ... As an aside, this whole discussion is confused by the fact that there are a bunch of different domains

Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-23 Thread John Denker
... but it is a probability, and you can plug it into the entropy definition. The problem is almost never understanding the definition of entropy. The problem is almost always ascertaining which probability measure is applicable to the problem at hand. Don't get me started about universal

Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-23 Thread Greg Rose
At 22:09 -0500 2006/03/22, John Denker wrote: Aram Perez wrote: "* Can you add or increase entropy?" "Shuffling a deck of cards increases the entropy of the deck." As a minor nit, shuffling *in an unpredictable manner* adds entropy, because there is extra randomness being brought into the
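Two numbers that make the nit concrete (the 32-bit seed below is an illustrative assumption, not something from the thread):

    from math import factorial, log2

    # Entropy ceiling for a 52-card deck: log2(52!) bits, reached only
    # when every permutation is equally likely.
    print(log2(factorial(52)))  # ~225.58 bits

    # A shuffle driven by a PRNG seeded with 32 bits (illustrative
    # assumption) can produce at most 2^32 distinct orderings, however
    # well it mixes, so it adds at most 32 bits of entropy to the deck.
    print(log2(2.0 ** 32))      # 32.0 bits

A predictable shuffle only permutes the existing uncertainty; it is the unpredictability of the shuffler that brings entropy in.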

Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-23 Thread John Kelsey
From: John Denker [EMAIL PROTECTED] Sent: Mar 23, 2006 1:44 PM To: John Kelsey [EMAIL PROTECTED], cryptography@metzdowd.com Subject: Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy) ... With some slight fiddling to get the normalization right, 1/2 raised

Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-23 Thread John Denker
I wrote: "With some slight fiddling to get the normalization right, 1/2 raised to the power of (program length) defines a probability measure. This may not be the probability you want, but it is a probability, and you can plug it into the entropy definition." John Kelsey wrote: No, this isn't
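The "slight fiddling" is presumably Kraft normalization: restrict to a prefix-free set of programs P, so that \(\sum_{p \in P} 2^{-L(p)} \le 1\) by the Kraft inequality, and divide through:

\[
\Pr(p) \;=\; \frac{2^{-L(p)}}{\Omega}, \qquad \Omega = \sum_{p \in P} 2^{-L(p)}.
\]

(When P is the set of halting programs, \Omega is Chaitin's halting probability.)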

Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-23 Thread Hal Finney
This is getting pretty far afield from cryptography, but it is a topic I find very interesting, so I can't resist jumping in. John Denker writes: OK, in a moment we will have gone through four plies of no-it-isn't yes-it-is no-it-isn't yes-it-is. Let's get serious. The axiomatic definition of

Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

2006-03-22 Thread Aram Perez
On Mar 22, 2006, at 2:05 PM, Perry E. Metzger wrote: Victor Duchovni [EMAIL PROTECTED] writes: "Actually calculating the entropy for real-world functions and generators may be intractable..." It is, in fact, generally intractable. 1) Kolmogorov-Chaitin entropy is just plain intractable --