On 3/21/06, Travis H. [EMAIL PROTECTED] wrote:
Does anyone have a good idea on how to OWF passphrases without
reducing them to lower entropy counts?
I've frequently seen constructs of this type:
H(P), H(0 || P), H(0 || 0 || P), ...
If entropy(P) > entropy(H), the entries will be independent,
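A minimal sketch of that construct in Python, assuming SHA-1 as H (160-bit output) and one zero byte per prefix step; the function name is illustrative, not from the thread:

```python
import hashlib

def expand_passphrase(passphrase: bytes, blocks: int) -> bytes:
    """H(P) || H(0 || P) || H(0 || 0 || P) || ... -- each block
    prepends one more zero byte to P before hashing, so every
    block sees the full entropy of P."""
    out = b""
    for i in range(blocks):
        out += hashlib.sha1(b"\x00" * i + passphrase).digest()
    return out

# Two blocks give 320 bits of output derived from one passphrase.
key = expand_passphrase(b"example passphrase", 2)
```

Each 20-byte block is an independent invocation of H over the whole passphrase, which is the point of the construction: the output length is no longer capped by a single 160-bit digest.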
- Original Message -
From: Travis H. [EMAIL PROTECTED]
Subject: passphrases with more than 160 bits of entropy
I was thinking that one could hash the first block, copy the
intermediate state, finalize it, then continue the intermediate result
with the next block, and finalize that.
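In Python's hashlib this intermediate-state idea maps onto `copy()`: snapshot the running hash after each block, finalize the snapshot, and keep feeding the original. A sketch under that assumption (SHA-1, 64-byte blocks; names are mine):

```python
import hashlib

def per_block_digests(data: bytes, block_size: int = 64):
    """Hash the input block by block; after each block, copy the
    intermediate state and finalize the copy, then continue with
    the original state on the next block."""
    h = hashlib.sha1()
    digests = []
    for i in range(0, len(data), block_size):
        h.update(data[i:i + block_size])
        digests.append(h.copy().digest())  # finalize a copy, keep going
    return digests
```

Each entry is then the SHA-1 of the prefix of the input up to and including that block, so the final entry equals the digest of the whole input.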
Does anyone have a good idea on how to OWF passphrases without
reducing them to lower entropy counts? That is, I've seen systems
which hash the passphrase then use a PRF to expand the result --- I
don't want to do that. I want to have more than 160 bits of entropy
involved.
What kind of
On Mar 22, 2006, at 4:28 AM, Thierry Moreau wrote:
Travis H. wrote:
Hi,
Does anyone have a good idea on how to OWF passphrases without
reducing them to lower entropy counts? That is, I've seen systems
which hash the passphrase then use a PRF to expand the result --- I
don't want to do that.
On Mar 22, 2006, at 9:04 AM, Perry E. Metzger wrote:
Aram Perez [EMAIL PROTECTED] writes:
Entropy is a highly discussed unit of measure.
And very often confused.
Apparently.
While you do want maximum entropy, maximum
entropy is not sufficient. The sequence of the consecutive numbers 0 -
| Let me rephrase my sequence. Create a sequence of 256 consecutive
| bytes, with the first byte having the value of 0, the second byte the
| value of 1, ... and the last byte the value of 255. If you measure
| the entropy (according to Shannon) of that sequence of 256 bytes, you
| have
Aram Perez [EMAIL PROTECTED] writes:
On Mar 22, 2006, at 9:04 AM, Perry E. Metzger wrote:
Aram Perez [EMAIL PROTECTED] writes:
Entropy is a highly discussed unit of measure.
And very often confused.
Apparently.
While you do want maximum entropy, maximum
entropy is not sufficient. The
[EMAIL PROTECTED] writes:
| Let me rephrase my sequence. Create a sequence of 256 consecutive
| bytes, with the first byte having the value of 0, the second byte the
| value of 1, ... and the last byte the value of 255. If you measure
| the entropy (according to Shannon) of that
Let me rephrase my sequence. Create a sequence of 256 consecutive
bytes, with the first byte having the value of 0, the second byte
the value of 1, ... and the last byte the value of 255. If you
measure the entropy (according to Shannon) of that sequence of 256
bytes, you have maximum
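The "maximum" here is the per-symbol Shannon entropy of the empirical byte-frequency distribution, which says nothing about predictability; a short illustration (the function name is mine):

```python
from collections import Counter
from math import log2

def entropy_per_byte(data: bytes) -> float:
    """Shannon entropy, in bits per byte, of the empirical
    byte-value distribution of the input."""
    n = len(data)
    return -sum(c / n * log2(c / n) for c in Counter(data).values())

# The perfectly predictable sequence 0, 1, ..., 255 still scores
# the maximum 8 bits per byte, because every value occurs equally often.
print(entropy_per_byte(bytes(range(256))))  # 8.0
```

This is exactly the confusion the thread is circling: a frequency-based entropy estimate of a known, deterministic sequence is maximal, yet an attacker can guess the sequence with certainty.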
PayPad (www.paypad.com) is an initiative that seems to have JPMorgan
Chase behind it to provide an alternative method for paying for
transactions online. You buy a PayPad device, a small card reader with
an integrated keypad. It connects to your PC using USB. To pay using PayPad at
a merchant
Victor Duchovni [EMAIL PROTECTED] writes:
Actually calculating the entropy for real-world functions and generators
may be intractable...
It is, in fact, generally intractable.
1) Kolmogorov-Chaitin entropy is just plain intractable -- finding the
smallest possible Turing machine to
On 3/21/06, [EMAIL PROTECTED] (Heyman, Michael) wrote:
Gutterman, Pinkas, and Reinman have produced a nice as-built-specification and
analysis of the Linux
random number generator.
From http://eprint.iacr.org/2006/086.pdf:
...
Since randomness is often consumed in a multi-user environment,
On Mar 22, 2006, at 2:05 PM, Perry E. Metzger wrote:
Victor Duchovni [EMAIL PROTECTED] writes:
Actually calculating the entropy for real-world functions and generators
may be intractable...
It is, in fact, generally intractable.
1) Kolmogorov-Chaitin entropy is just plain intractable --
On Wed, Mar 22, 2006 at 02:31:37PM -0800, Bill Frantz wrote:
One of my pet peeves: The idea that the user is the proper atom of
protection in an OS.
My threat model includes different programs run by one (human) user. If
a Trojan, running as part of my userID, can learn something about the
Matt Crawford wrote:
I so often get irritated when non-physicists discuss entropy. The word
is almost always misused.
Yes, the term entropy is often misused ... and we have seen some
remarkably wacky misusage in this thread already. However, physicists
do not have a monopoly on correct