----- Original Message ----- From: "Alex Pankratov" <[EMAIL PROTECTED]>
To: <[email protected]>
Sent: Thursday, October 25, 2007 9:16 PM
Subject: Password vs data entropy


Say, we have a random value of 4 kilobits that someone wants
to keep secret by the means of protecting it with a password.

The empirical entropy estimate for English text is about
1.3 bits of randomness per character, IIRC.

Assuming the password is an English word or a phrase, and the
secret is truly random, does it mean that the password needs
to be 3100+ characters in size in order to provide a "proper"
degree of protection to the value?
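For concreteness, the arithmetic behind that 3100+ figure can be
checked directly (a sketch; the 1.3 bits/char value is just the
empirical estimate quoted above):

```python
import math

secret_bits = 4096        # 4 kilobits of truly random data
bits_per_char = 1.3       # empirical entropy of English text

# Characters of English needed to match the secret's entropy
chars_needed = math.ceil(secret_bits / bits_per_char)
print(chars_needed)       # 3151, i.e. the "3100+" characters
```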

Or, rephrasing, what should the entropy of the password be
compared to the entropy of the value being protected (under
whatever keying/encryption scheme)?

I realize that this is a rather, err, open-ended question,
and it depends on what one means by "protected", but I'm sure
you can see the gist of it. How would one deem a password
random enough to be fit for protecting the equivalent of
N bits of random data? Is it a 1-to-1 ratio?


There are two answers:

1) The attacker has no oracle to tell him that a guess is correct, and information known only with some probability is useless. Then it is only necessary that entropy(plaintext) + entropy(password) >= sizeof(ciphertext). As a result the password needs only 1 bit of entropy, and the combiner that generates the ciphertext MUST decrypt to at least two different plaintexts depending on the password.
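A minimal sketch of such a combiner, using a one-time-pad-style
XOR (my illustration, not a scheme specified above): with two
candidate plaintexts of equal length, a single password bit
selects which key is handed over, and each key decrypts the same
ciphertext to a different message, so the ciphertext alone
cannot reveal which one is "real".

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Two equal-length candidate plaintexts (hypothetical examples)
p0 = b"attack at dawn!!"
p1 = b"retreat at noon!"

pad = secrets.token_bytes(len(p0))   # truly random pad
ciphertext = xor(p0, pad)

# The 1-bit password selects one of two keys:
key = {0: pad, 1: xor(pad, xor(p0, p1))}

assert xor(ciphertext, key[0]) == p0   # password bit 0 -> p0
assert xor(ciphertext, key[1]) == p1   # password bit 1 -> p1
```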

2) An oracle is available, or probability matters.
Then it is necessary that entropy(password) exceed the computational effort available to the attacker. This is more difficult to pin down because it changes continually, but 128 bits of entropy is a "good" estimate. It is always at least difficult, and possibly impossible, to prove a useful limit here.
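Putting the 128-bit figure back in terms of the original
question (a back-of-the-envelope sketch, assuming the same
1.3 bits/char estimate and a hypothetical attacker testing
10^12 guesses per second):

```python
import math

bits_per_char = 1.3      # empirical entropy of English text
target_bits = 128        # the "good" estimate above

# English characters needed for 128 bits of password entropy
print(math.ceil(target_bits / bits_per_char))   # 99

# Years to exhaust 2^128 guesses at 10^12 guesses/second
guesses_per_sec = 1e12   # hypothetical attacker speed
years = 2**target_bits / guesses_per_sec / (3600 * 24 * 365)
print(f"{years:.2e}")    # on the order of 1e19 years
```

So a 128-bit password corresponds to roughly a hundred characters
of English prose, not thousands, regardless of the size of the
secret it protects.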
           Joe

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
