>From: William Allen Simpson <[EMAIL PROTECTED]>
>Sent: Jan 11, 2005 1:48 PM
>To: cryptography@metzdowd.com
>Subject: Re: entropy depletion
>Ben Laurie wrote:
>> Surely observation of /dev/urandom's output also gives away information?
>>
>ummm, no, not
>From: "Steven M. Bellovin" <[EMAIL PROTECTED]>
>Sent: Jan 11, 2005 10:58 AM
>To: cryptography@metzdowd.com
>Subject: Re: entropy depletion
>Let me raise a different issue: a PRNG might be better *in practice*
>because of higher assurance that it's actu
On Tue, Jan 11, 2005 at 03:48:32PM -0500, William Allen Simpson wrote:
> >2. set the contract in the read() call such that
> >the bits returned may be internally entangled, but
> >must not be entangled with any other read(). This
> >can trivially be met by locking the device for
> >single read ac
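The single-reader contract quoted above can be sketched as a mutex around the device read. This is an illustrative sketch only (with `os.urandom` standing in for the entropy device), not the poster's actual proposal:

```python
import os
import threading

_device_lock = threading.Lock()

def read_random(nbytes: int) -> bytes:
    """Serialize access so no two concurrent read() calls can
    interleave output drawn from the same device state -- the
    'locking the device for single read access' contract."""
    with _device_lock:
        return os.urandom(nbytes)  # stand-in for the entropy device
```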
William Allen Simpson wrote:
> Ian G wrote:
> > The *requirement* is that the generator not leak
> > information.
> > This requirement applies equally well to an entropy
> > collector as to a PRNG.
> Now here we disagree. It was long my understanding
> that the reason the entropy device (/dev/random)
> could be used for
William Allen Simpson wrote:
> Ben Laurie wrote:
> > William Allen Simpson wrote:
> > > > Why then restrict it to non-communications usages?
> > > Because we are starting from the postulate that observation of the
> > > output could (however remotely) give away information about the
> > > underlying state of the entropy generato
Ian G wrote:
> The *requirement* is that the generator not leak
> information.
> This requirement applies equally well to an entropy
> collector as to a PRNG.
Now here we disagree. It was long my understanding
that the reason the entropy device (/dev/random)
could be used for both output and input, and bl
Ben Laurie wrote:
> William Allen Simpson wrote:
> > > Why then restrict it to non-communications usages?
> > Because we are starting from the postulate that observation of the
> > output could (however remotely) give away information about the
> > underlying state of the entropy generator(s).
> Surely observation of /d
Ben Laurie wrote:
> William Allen Simpson wrote:
> > > Why then restrict it to non-communications usages?
> > Because we are starting from the postulate that observation of the
> > output could (however remotely) give away information about the
> > underlying state of the entropy generator(s).
> Surely observation of
Ben Laurie wrote:
> William Allen Simpson wrote:
> > > Why then restrict it to non-communications usages?
> > Because we are starting from the postulate that observation of the
> > output could (however remotely) give away information about the
> > underlying state of the entropy generator(s).
> Surely observation of
Let me raise a different issue: a PRNG might be better *in practice*
because of higher assurance that it's actually working as designed at
any given time.
Hardware random number generators are subject to all sorts of
environmental issues, including stuck bits, independent oscillators
that aren
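The point about environmental failures in hardware sources suggests continuous runtime health checks. Below is a simplified repetition-count test in the spirit of NIST SP 800-90B; the cutoff value here is illustrative, not computed from a claimed entropy rate:

```python
def repetition_count_test(samples, cutoff=34):
    """Fail if any value repeats `cutoff` times in a row -- a crude
    detector for a stuck hardware source (simplified from the
    SP 800-90B repetition count test; cutoff is illustrative)."""
    run_value, run_len = None, 0
    for s in samples:
        if s == run_value:
            run_len += 1
            if run_len >= cutoff:
                return False  # stuck-at failure detected
        else:
            run_value, run_len = s, 1
    return True
```

A deterministic software PRNG would pass such a test vacuously, which is one way to read the "higher assurance in practice" argument: its failure modes are design bugs, not drifting oscillators.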
William Allen Simpson wrote:
> > Why then restrict it to non-communications usages?
> Because we are starting from the postulate that observation of the
> output could (however remotely) give away information about the
> underlying state of the entropy generator(s).
Surely observation of /dev/urandom's outp
Ian G wrote:
> > (4A) Programs must be audited to ensure that they do not use
> > /dev/random improperly.
> > (4B) Accesses to /dev/random should be logged.
> I'm confused by this aggressive containment of the
> entropy/random device. I'm assuming here that
> /dev/random is the entropy device (better renamed
> as /dev
On Sat, Jan 08, 2005 at 10:46:17AM +0800, Enzo Michelangeli wrote:
> But that was precisely my initial position: that the insight on the
> internal state (which I saw, by definition, as the loss of entropy by the
> generator) that we gain from one bit of output is much smaller than one
> full bit.
William Allen Simpson wrote:
> There are already other worthy comments in the thread(s).
This is a great post. One can't stress enough
that programmers need programming guidance,
not arcane information theoretic concepts.
We are using computational devices, and therefore computational infeasibility
Wondering how in the world we got into this endless debate, I went back
and re-read the entire thread(s). I think that early comments were
predictive, where Ian Grigg wrote:
> ... Crypto is
> such a small part of security that most all crypto people
> move acros
Zooko O'Whielacronx wrote:
> I would love to have an information-theoretic argument for the security
> of my PRNG, but that's not what we have,
Yes, and I'd like my goldfish to ride a bicycle, but he can't.
The P in PRNG is for Pseudo, and means the PRNG is relying
on computational intractability, not
I would love to have an information-theoretic argument for the security
of my PRNG, but that's not what we have, and I don't think reducing the
entropy_count by one bit per output bit gets us any closer to such an
argument.
For starters, the entropy_count value before you output the bit is
obv
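The bookkeeping being debated, debiting the pool by one bit per output bit and blocking at zero, can be modeled in a few lines. This is a toy sketch of /dev/random-style accounting (class name and structure are mine; a real pool mixes input and extracts output through a hash or CSPRNG):

```python
import os

class EntropyPool:
    """Toy model of blocking-device accounting: credit entropy on
    input, debit one bit per output bit, refuse to read at zero.
    (Grossly simplified; illustrates only the counter semantics.)"""
    def __init__(self):
        self.entropy_count = 0  # estimated entropy, in bits

    def credit(self, bits: int):
        self.entropy_count += bits

    def read(self, nbytes: int) -> bytes:
        need = 8 * nbytes
        if self.entropy_count < need:
            raise BlockingIOError("would block: pool depleted")
        self.entropy_count -= need  # the contested 1-bit-per-bit debit
        return os.urandom(nbytes)   # stand-in for the pool's extractor
```

Whether that debit reflects any real loss of unpredictability, given a one-way output function, is exactly what the thread disputes.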
- Original Message -
From: <[EMAIL PROTECTED]>
To:
Sent: Friday, January 07, 2005 9:30 AM
Subject: Re: entropy depletion (was: SSL/TLS passive sniffing)
> > From: [EMAIL PROTECTED]
> > [mailto:[EMAIL PROTECTED] On Behalf Of Enzo
> > Michelangeli
> > Sent:
I wrote:
> > A long string produced by a good PRNG is conditionally
> > compressible in the sense that we know there exists a
> > shorter representation, but at the same time we believe it
> > to be conditionally incompressible in the sense that the
> > adversaries have no feasible way of finding a shorter
> > represe
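The distinction is easy to observe empirically: output expanded from a short seed certainly has a shorter description (the seed plus the algorithm), yet a general-purpose compressor finds nothing to squeeze. A sketch, using SHA-256 in counter mode as a stand-in PRNG:

```python
import hashlib
import zlib

def prng_stream(seed: bytes, nbytes: int) -> bytes:
    """SHA-256 in counter mode: a short seed expands into a long
    string, so a much shorter representation provably exists."""
    out = bytearray()
    counter = 0
    while len(out) < nbytes:
        out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:nbytes])

data = prng_stream(b"seed", 1 << 16)   # 64 KiB from a 4-byte seed
compressed = zlib.compress(data, 9)
# zlib finds no exploitable structure, even though (seed, length)
# is a description of only a dozen or so bytes.
```

Finding that short description from the output alone would amount to breaking the PRNG, which is the "conditionally incompressible" half of the claim.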
>From: John Denker <[EMAIL PROTECTED]>
>Sent: Jan 5, 2005 2:06 PM
>To: Enzo Michelangeli <[EMAIL PROTECTED]>
>Cc: cryptography@metzdowd.com
>Subject: Re: entropy depletion (was: SSL/TLS passive sniffing)
...
>You're letting your intuition about "usable ra
| >
| > random number generator this way. Just what *is*
| > good enough?
|
| That's a good question. I think there is a good answer. It
| sheds light on the distinction of pseudorandomness versus
| entropy:
|
| A long string produced by a good PRNG is conditionally
| compressib
Jerrold Leichter asked:
> random number generator this way. Just what *is*
> good enough?
That's a good question. I think there is a good answer. It
sheds light on the distinction of pseudorandomness versus
entropy:
A long string produced by a good PRNG is conditionally
compressible in
| > You're letting your intuition about "usable randomness" run roughshod
| > over the formal definition of entropy. Taking bits out of the PRNG
| > *does* reduce its entropy.
|
| By how much exactly? I'd say, _under the hypothesis that the one-way
| function can't be broken and other attacks fai
On Thu, Jan 06, 2005 at 04:35:05PM +0800, Enzo Michelangeli wrote:
> By how much exactly? I'd say, _under the hypothesis that the one-way
> function can't be broken and other attacks fail_, exactly zero; in the
> real world, maybe a little more.
Unfortunately for your analysis, *entropy* assumes t
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] On Behalf Of Enzo
> Michelangeli
> Sent: Tuesday, January 04, 2005 7:50 PM
>
> This "entropy depletion" issue keeps coming up every now and
> then, but I still don't understand how it is supposed to
> happen. If the PRNG uses a really non-i
I wrote:
> > Taking bits out of the PRNG *does* reduce its entropy.
Enzo Michelangeli wrote:
> By how much exactly?
By one bit per bit.
> I'd say, _under the hypothesis that the one-way
> function can't be broken and other attacks fail_, exactly zero; in the
> real world, maybe a little more.
If you said
- Original Message -
From: "John Denker" <[EMAIL PROTECTED]>
Sent: Thursday, January 06, 2005 3:06 AM
> Enzo Michelangeli wrote:
[...]
> > If the PRNG uses a
> > really non-invertible algorithm (or one invertible only
> > with intractable complexity), its output gives no insight
> > w
Enzo Michelangeli wrote:
>
> This "entropy depletion" issue keeps coming up every now and then, but I
> still don't understand how it is supposed to happen.
Then you're not paying attention.
> If the PRNG uses a
> really non-invertible algorithm (or one invertible only with intractable
> complexity
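Enzo's "non-invertible algorithm" can be made concrete: if each output block is a one-way hash of hidden state, learning the state from outputs is as hard as finding a SHA-256 preimage. A hedged sketch (illustrative only, not a vetted generator design):

```python
import hashlib

class HashPRNG:
    """Sketch of a PRNG whose output reveals its state only through
    a one-way function: recovering `state` from an output block
    would require a SHA-256 preimage."""
    def __init__(self, seed: bytes):
        self.state = hashlib.sha256(b"init" + seed).digest()

    def next_block(self) -> bytes:
        out = hashlib.sha256(b"out" + self.state).digest()
        # Ratchet the state through a separate one-way step so earlier
        # states stay unrecoverable even if `state` later leaks.
        self.state = hashlib.sha256(b"step" + self.state).digest()
        return out
```

Under this model the output gives an observer no computationally usable insight into the state, which is the premise the reply disputes on information-theoretic (rather than computational) grounds.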
28 matches