Re: Entropy and PRNGs

2005-01-26 Thread John Denker
Ed Gerck wrote:
Let me comment, John, that thermal noise is not random
When did you figure that out?  If you'd been paying attention,
you'd know that I figured that out a long time ago.
First of all, the phrase "not random" is ambiguous.  I said:
Some people think "random" should denote 100% entropy density,
and anything less than that is non-random, even if it
contains a great deal of unpredictability.  Other folks think
that "random" refers to anything with an entropy density
greater than zero, and "non-random" means completely
predictable.
Reference:
  http://www.av8n.com/turbid/paper/turbid.htm#sec-s-density
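For concreteness, here is a quick sketch in Python (illustrative
only, using a toy biased-bit model) of a source that is unpredictable
yet has an entropy density strictly between zero and 100%:

from math import log2

def shannon_entropy(probs):
    # Shannon entropy, in bits, of a discrete distribution.
    return -sum(p * log2(p) for p in probs if p > 0)

# A biased bit: unpredictable, yet not "100% random".
p_one = 0.3
density = shannon_entropy([p_one, 1 - p_one])
print(f"entropy density: {density:.3f} bits per bit")  # ~0.881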
Thermal noise, as it comes off the hardware, has an entropy
density greater than zero and less than 100%, as I said at
  http://www.av8n.com/turbid/paper/turbid.htm#sec-hesg
and elsewhere.
There are several quantities that can be estimated in thermal
noise, reducing its entropy according to what you seem to expect
today. See photon bunching, as an example that is usually ignored.
Another, even though trivial, example is due to the observation that
thermal noise is not white noise. Yet another observation is that no
noise is really white, because of causality (in other words, its
duration must be finite). The noise that is due to photon fluctuations
in thermal background radiation, for another example, depends
also on the number of detectors used to measure it, as well as
single- or multiple-mode illumination, and both internal and external
noise sources.
Stop wasting our time.  All that doesn't change the fact that
thermal noise, as it comes off the hardware, has a nonzero
entropy density.  And it is easy to arrange situations where
I can calculate a very useful lower bound on the entropy density.
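For instance (a sketch; the simulated samples below stand in for a
real digitized noise source), one can estimate a conservative bound
via min-entropy, which never exceeds the Shannon entropy:

import random
from collections import Counter
from math import log2

def min_entropy_estimate(samples):
    # -log2 of the most frequent symbol's relative frequency.
    # An empirical estimate; in practice one would add margin
    # for sampling error.
    counts = Counter(samples)
    p_max = max(counts.values()) / len(samples)
    return -log2(p_max)

# Simulated 8-bit ADC readings of amplified noise (an assumption;
# substitute samples from real hardware).
samples = [min(255, max(0, round(random.gauss(128, 20))))
           for _ in range(100_000)]
print(f"min-entropy estimate: {min_entropy_estimate(samples):.2f} "
      f"bits per 8-bit sample")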
Yes, it's entirely possible that someone in the future will know
more about your entropy source than you do today! Even thermal
noise.
That's tantamount to saying the second law of thermodynamics will
be repealed.  By that standard, it's entirely possible that the
sun will rise in the west tomorrow morning.  But I wouldn't bet
on it.
OTOH, why are nuclear decay processes considered safe as a source
of entropy? Because the range of energies precludes knowing or
tampering with the internal state. These processes are, however,
not free from correlations either.
Nuclear decay processes are not in any practical sense safer than
thermal noise.  As I discuss at
  http://www.av8n.com/turbid/paper/turbid.htm#sec-hesg-attack
nuclear is in the same category as thermal:  entropy density
greater than zero and less than 100%.


Re: Entropy and PRNGs

2005-01-11 Thread Ed Gerck
John Denker wrote:
 For the sources of entropy that I consider
"real entropy", such as thermal noise, for a modest payoff I'd
be willing to bet my life -- and also the lives of millions
of innocent people -- on the proposition that no adversary,
no matter how far in the future and no matter how resourceful,
will ever find in my data less entropy than I say there is.
Let me comment, John, that thermal noise is not random and is
not "real entropy" (btw, is there a "fake entropy" in your
view?).
There are several quantities that can be estimated in thermal
noise, reducing its entropy according to what you seem to expect
today. See photon bunching, as an example that is usually ignored.
Another, even though trivial, example is due to the observation that
thermal noise is not white noise. Yet another observation is that no
noise is really white, because of causality (in other words, its
duration must be finite). The noise that is due to photon fluctuations
in thermal background radiation, for another example, depends
also on the number of detectors used to measure it, as well as
single- or multiple-mode illumination, and both internal and external
noise sources.
Yes, it's entirely possible that someone in the future will know
more about your entropy source than you do today! Even thermal
noise.
OTOH, why are nuclear decay processes considered safe as a source
of entropy? Because the range of energies precludes knowing or
tampering with the internal state. These processes are, however,
not free from correlations either.
Cheers,
Ed Gerck



Re: Entropy and PRNGs

2005-01-10 Thread John Denker
Referring to http://www.apache-ssl.org/randomness.pdf
I wrote:
I just took a look at the first couple of pages.
IMHO it has much room for improvement.
David Wagner responded:
I guess I have to take exception.  I disagree.  I think Ben Laurie's
paper is quite good.  I thought your criticisms missed some of the points
he was trying to make (these points are subtle, so this is completely
understandable).  Presumably his paper could be criticized as not clear
enough, since it seems it did not convey those points adequately, but
I don't think his paper is inaccurate. 
Or perhaps it was my critique that was not clear enough.
I am not motivated to look for subtle shades of meaning in a
paper that claims the system clock is a source of entropy.
First of all, you can't throw around the term "conditional
entropy" without saying what it's conditioned on.

 Conditioned on everything known to the attacker, of course.
Well, of course indeed!  That notion of entropy -- the entropy
in the adversary's frame of reference -- is precisely the
notion that is appropriate to any adversarial situation, as I
have consistently and clearly stated in my writings;  see
e.g. the end of the definitions section, i.e.
  http://www.av8n.com/turbid/paper/turbid.htm#sec-defs
If it has no unpredictability, it has no entropy,
according to any reasonable definition of entropy, including the
somewhat loose definition on page 1.
Actually, I think Ben got it right.  Entropy depends on context.
The attacker might have extra context that allows him to narrow down
the possible values of the randomness samples.
Heed your own "of course" statement.  It is hard to imagine a
situation where my adversary has more context about my
generator than I have myself.
For instance, imagine if we use packet inter-arrival times (measured down
to the nanosecond) as our randomness source. 
I am unable to imagine myself being so silly.
This is the difference between unconditional and conditional entropy that
Ben was trying to introduce.  In information-theoretic notation, H(X)
vs H(X|Y).  Let X = packet inter-arrival time, and Y = everything seen by
a local eavesdropper, and you will see that H(X|Y) can be much smaller
than H(X).  Indeed, we can have H(X|Y) = 0 even if H(X) is very large.
This is Ben's point, and it is a good one.
No.  There is only one entropy that matters in an adversarial
situation.  The so-called unconditional entropy H(X) is
merely a wild overestimate of the only thing that matters.
There is no glory in outstripping donkeys.  (Martial)
There is no honor in introducing H(X) since it is irrelevant
in any adversarial situation.
A counter is fine as long as there is only one machine in the universe
that will ever assign UUIDs.  However, if you want to do distributed
generation of UUIDs, then counters are insufficient because there is no
way to prevent overlap of two machines' counter spaces.
I imagine a smart person such as DAW should be able to come
up with five schemes in five minutes whereby UUID generation
can be delegated to virtually any machine that wants it.
MAC(eth0) /concat/ local counter will do for scheme #1.
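Here is what such a scheme might look like in Python (a sketch, not
an RFC 4122-conformant UUID; a real implementation would persist the
counter across reboots, and uuid.getnode() can fall back to a random
value when no hardware address is available):

import itertools
import uuid

_counter = itertools.count()

def mac_counter_uuid():
    # 48-bit hardware address in the top bits, 80-bit local
    # counter below: globally unique without any entropy.
    mac = uuid.getnode()
    ctr = next(_counter) & (2**80 - 1)
    return uuid.UUID(int=(mac << 80) | ctr)

print(mac_counter_uuid())
print(mac_counter_uuid())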
Perhaps what Ben should have said is that:
* Unconditional entropy is sufficient for UUIDs;
  conditional entropy is not needed.
* For centrally-assigned UUIDs, even unconditional entropy is unnecessary;
  a centrally-managed counter is fine.
* For distributed, unsynchronized assignment of UUIDs, unconditional
  entropy appears to be necessary and sufficient.
Horsefeathers.  For generating UUIDs, _zero_ entropy is
sufficient, and no positive amount of entropy (unconditional
or otherwise) can be called necessary.
I am not interested in ways of obtaining wild overestimates
of zero.
If you want people to believe that unconditional entropy is a
worthwhile concept, you'll have to come up with a nontrivial
application for it.


Re: Entropy and PRNGs

2005-01-10 Thread John Denker
Ben Laurie wrote:
The point I am trying to make is that predictability is in the eye of 
the beholder. I think it is unpredictable; my attacker does not.
I still cannot see how that can happen to anyone unless
they're being willfully stupid.  It's like something out
of Mad Magazine: White Spy accepts a cigar from Black Spy,
lights it, and is surprised when it explodes.  That's
funny when it happens to somebody else, but as for me,
I'm not going to accept alleged entropy from any source
such that my adversary might know more about it than I do.
I'm just not.
By your argument, no PRNG ever has any entropy, since the inputs are 
clearly known (at least to the PRNG).
I *almost* agree with that, but my real argument is somewhat
more nuanced:
a) Certainly there is a very wide class of PRNGs that
have no entropy whatsoever, including many that Mr. Laurie
seems willing to attribute entropy to.
b) It is also possible, as I have repeatedly explained,
for an ordinary PRNG to have a modest amount of entropy
residing in its internal state.  This entropy must
have been obtained from elsewhere, from something
other than a PRNG, not produced _de novo_ by any PRNG.
Categories (a) and (b) share the property of having
no nonzero lower bound on the entropy _density_ of
the output stream;  the entropy density is either
strictly zero (case a) or asymptotically zero (case b).
c) At the opposite extreme, there exist things that
produce 100% entropy density.  These must not be
called PRNGs.  I like the name HESG -- High Entropy
Symbol Generator.
  http://www.av8n.com/turbid/
d) Also as I have repeatedly explained, there exist
intermediate cases, where something that works like
a PRNG is coupled to something else that provides
real entropy.  I recommend calling this a SRSG,
i.e. Stretched Random Symbol Generator, since it
isn't just a PRNG and it isn't just a HESG either.
  http://www.av8n.com/turbid/paper/turbid.htm#sec-srandom
Linux /dev/urandom was an early and unsatisfactory
attempt at an SRSG.  Yarrow, coupled to a good HESG,
is vastly better, and that's what I implemented.
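To make the structure of case (d) concrete, here is a minimal
SRSG-style sketch in Python.  This is emphatically not Yarrow and
not turbid's code; os.urandom merely stands in for a HESG, and
SHA-256 in counter mode does the stretching:

import hashlib
import os

class TinySRSG:
    def __init__(self):
        self.reseed()

    def reseed(self):
        # Pull a fresh high-entropy seed (os.urandom is a
        # stand-in for a real HESG here).
        self.key = os.urandom(32)
        self.counter = 0

    def read(self, nbytes):
        # Stretch the seed: hash(key || counter), counter mode.
        out = b""
        while len(out) < nbytes:
            out += hashlib.sha256(
                self.key + self.counter.to_bytes(8, "big")).digest()
            self.counter += 1
        return out[:nbytes]

g = TinySRSG()
print(g.read(16).hex())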


Re: Entropy and PRNGs

2005-01-10 Thread John Kelsey
From: John Denker [EMAIL PROTECTED]
Sent: Jan 10, 2005 12:21 AM
To: David Wagner [EMAIL PROTECTED]
Cc: cryptography@metzdowd.com
Subject: Re: Entropy and PRNGs

 Conditioned on everything known to the attacker, of course.

Well, of course indeed!  That notion of entropy -- the entropy
in the adversary's frame of reference -- is precisely the
notion that is appropriate to any adversarial situation, as I
have consistently and clearly stated in my writings;  see
e.g. the end of the definitions section, i.e.
   http://www.av8n.com/turbid/paper/turbid.htm#sec-defs

... 
 Actually, I think Ben got it right.  Entropy depends on context.
 The attacker might have extra context that allows him to narrow down
 the possible values of the randomness samples.

Heed your own "of course" statement.  It is hard to imagine a
situation where my adversary has more context about my
generator than I have myself.

Well, the broader problem isn't the context, it's the model.  If your attacker 
(who lives sometime in the future, and may have a large budget besides) comes 
up with a better model to describe the process you're using as a source of 
noise, you could be out of luck.   The thing that matters is H(X| all 
information available to the attacker), which is based on P(X | all information 
available to the attacker), which includes a model that may be better than 
yours.

But I think it's of practical value to consider the different attackers whose 
information might not include some information you use for seeding a PRNG.  
Some sources of entropy, such as packet arrival times, are not worth much for 
attackers on your local network who are attacking you in real time, but are 
quite valuable against attackers who attack you later.  Other sources of 
entropy, such as the hash of the contents of your Windows registry, or a full 
directory tree from your hard drive, are worthwhile against real-time attackers 
without access to your machine, but worthless against  attackers with your 
machine in their hands.  Using cheaply-available sources of each kind in 
seeding a PRNG decreases the set of attackers that will be able to attack you,  
while not preventing you from also using some source of entropy you believe to 
be good against all attackers.  
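A sketch of that seeding strategy in Python (the particular sources
below are hypothetical stand-ins; the point is only that each costs
a different class of attacker something):

import hashlib
import os
import time

def gather_seed():
    h = hashlib.sha256()
    # Believed good against all attackers:
    h.update(os.urandom(32))
    # Timing: weak against real-time local-network attackers:
    h.update(str(time.perf_counter_ns()).encode())
    # Machine state: weak against attackers holding the box:
    h.update(repr(sorted(os.listdir("."))).encode())
    return h.digest()

print(gather_seed().hex())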

...
If you want people to believe that unconditional entropy is a
worthwhile concept, you'll have to come up with a nontrivial
application for it.

Differentiate between measures of entropy.  Collision entropy (Renyi entropy of 
order two) is very useful in determining how many samples you can take before 
expecting a collision, and it's not conditioned on an attacker's information.  
And collision probabilities do matter, in both obvious and subtle ways, for 
PRNG security.  
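For example (a sketch on a toy distribution), collision entropy
directly gives the birthday bound on the expected number of samples
before a repeat:

from math import log2, pi, sqrt

def collision_entropy(probs):
    # Renyi entropy of order two: -log2 of the collision probability.
    return -log2(sum(p * p for p in probs))

probs = [0.5, 0.25, 0.125, 0.125]  # toy distribution
h2 = collision_entropy(probs)
print(f"H2 = {h2:.2f} bits")
# Birthday bound: roughly sqrt(pi/2 * 2^H2) samples to a collision.
print(f"expect a collision after ~{sqrt(pi / 2 * 2**h2):.1f} samples")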

--John



Re: Entropy and PRNGs

2005-01-10 Thread John Denker
John Kelsey wrote:
If your attacker (who lives sometime in the future, and may
have a large budget besides) comes up with a better model to
describe the process you're using as a source of noise, you
could be out of luck.   The thing that matters is H(X| all
information available to the attacker), which is based on P(X
| all information available to the attacker), which includes a
model that may be better than yours.
I disagree.  For the sources of entropy that I consider
"real entropy", such as thermal noise, for a modest payoff I'd
be willing to bet my life -- and also the lives of millions
of innocent people -- on the proposition that no adversary,
no matter how far in the future and no matter how resourceful,
will ever find in my data less entropy than I say there is.
(For instance, under suitable conditions I might say that
there's 159.98 bits of entropy in a 160-bit word.)
Of course I'd want to make appropriate checks for implementation
errors, but in principle the entropy is there and the adversary
can't make it go away.  Some guy named Shannon had something to
say about this.
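(To see where a figure like 159.98 bits in a 160-bit word could come
from: if each raw bit carries a small bias, the per-bit entropy dips
just below one.  The bias below is illustrative only.)

from math import log2

def binary_entropy(p):
    # Shannon entropy of a bit that is 1 with probability p.
    return -p * log2(p) - (1 - p) * log2(1 - p)

p = 0.5066  # illustrative small bias
print(f"per-bit entropy:  {binary_entropy(p):.6f} bits")
print(f"160 raw bits give {160 * binary_entropy(p):.2f} bits")  # ~159.98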
But I think it's of practical value to consider the different
attackers whose information might not include some information
you use for seeding a PRNG.  Some sources of entropy, such as
packet arrival times, are not worth much for attackers on your
local network who are attacking you in real time, but are
quite valuable against attackers who attack you later.  Other
sources of entropy, such as the hash of the contents of your
Windows registry, or a full directory tree from your hard
drive, are worthwhile against real-time attackers without
access to your machine, but worthless against  attackers with
your machine in their hands.  
Can you please exhibit a nonzero lower bound on the entropy
content of the Windows registry?  If not, please don't call
it entropy.
In particular, I ask you to consider the case of mass-produced
network appliances as mentioned at
  http://www.av8n.com/turbid/paper/turbid.htm#sec-prng-attack
and consider whether, registry or no registry, the boxes will
be waaay too much alike.
In ordinary situations, the Windows registry is constructed
by strictly deterministic processes.  No entropy.  If your
adversaries are so lame that they cannot figure out how the
registry is constructed, you're living in some kind of paradise.
Differentiate between measures of entropy.  Collision entropy ...
Let's stay on-topic.
There is only one measure of entropy appropriate to a random
symbol generator.  If I am unable in principle to predict
the output of my HESG, then under mild assumptions that's
what I call entropy.
  By mild assumptions I mean things like assuming my
  machine has not been taken over by the attacker.  This
  assumption is of course common to *all* discussions
  of security algorithms and principles.  I mention it
  only to fend off nit-picks.  There's not much point in
  discussing algorithm A if you're going to turn around
  and tell me your box might be implementing some other
  algorithm.


Entropy and PRNGs

2005-01-10 Thread David Wagner
John Denker writes:
Well, of course indeed!  That notion of entropy -- the entropy
in the adversary's frame of reference -- is precisely the
notion that is appropriate to any adversarial situation, as I
have consistently and clearly stated in my writings;
[...]
There is only one entropy that matters in an adversarial
situation.  The so-called unconditional entropy H(X) is
merely a wild overestimate of the only thing that matters.

Ok.  I see that you were already well aware of the point Ben Laurie
was making, and indeed it was obvious to you.  Great.

But I have seen people for whom this was definitely not obvious, and
who failed to recognize the distinction between the two concepts or
the need to use conditional entropy until it was pointed out to them.
I guess Ben's paper is going to be useful for them, but not for you.

I imagine a smart person such as DAW should be able to come
up with five schemes in five minutes whereby UUID generation
can be delegated to virtually any machine that wants it.
MAC(eth0) /concat/ local counter will do for scheme #1.
[...]
Horsefeathers.  For generating UUIDs, _zero_ entropy is
sufficient, and no positive amount of entropy (unconditional
or otherwise) can be called necessary.

You're right.  I take it back.  I accept your point about UUIDs.
There are schemes that avoid the need for randomness (entropy).
Thank you.



Entropy and PRNGs

2005-01-09 Thread David Wagner
John Denker writes:
Ben Laurie wrote:
 http://www.apache-ssl.org/randomness.pdf

I just took a look at the first couple of pages.
IMHO it has much room for improvement.

I guess I have to take exception.  I disagree.  I think Ben Laurie's
paper is quite good.  I thought your criticisms missed some of the points
he was trying to make (these points are subtle, so this is completely
understandable).  Presumably his paper could be criticized as not clear
enough, since it seems it did not convey those points adequately, but
I don't think his paper is inaccurate.  I'll respond point-by-point.

*) For instance, on page 2 it says

 I can have a great source of entropy, but if an attacker knows
 all about that source, then it gives me no unpredictability at
 all.

That's absurd.  If it has no unpredictability, it has no entropy,
according to any reasonable definition of entropy, including the
somewhat loose definition on page 1.

Actually, I think Ben got it right.  Entropy depends on context.
The attacker might have extra context that allows him to narrow down
the possible values of the randomness samples.

For instance, imagine if we use packet inter-arrival times (measured down
to the nanosecond) as our randomness source.  From the point of view of
an outsider, there might be a lot of entropy in these times, perhaps tens
of bits.  However, from the point of view of an attacker who can eavesdrop
on our local area network, there might be very little or no entropy.

This is the difference between unconditional and conditional entropy that
Ben was trying to introduce.  In information-theoretic notation, H(X)
vs H(X|Y).  Let X = packet inter-arrival time, and Y = everything seen by
a local eavesdropper, and you will see that H(X|Y) can be much smaller
than H(X).  Indeed, we can have H(X|Y) = 0 even if H(X) is very large.
This is Ben's point, and it is a good one.
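To see the gap numerically, here is a toy calculation in Python
(illustrative only) in which the eavesdropper's observation Y
determines X, so H(X|Y) = 0 even though H(X) = 4 bits:

from math import log2

def H(dist):
    # Shannon entropy of {outcome: probability}.
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Joint distribution p(x, y): the eavesdropper sees y = x exactly.
joint = {(x, x): 1 / 16 for x in range(16)}

marg_x, marg_y = {}, {}
for (x, y), p in joint.items():
    marg_x[x] = marg_x.get(x, 0) + p
    marg_y[y] = marg_y.get(y, 0) + p

# Chain rule: H(X|Y) = H(X,Y) - H(Y).
print(f"H(X)   = {H(marg_x):.1f} bits")            # 4.0
print(f"H(X|Y) = {H(joint) - H(marg_y):.1f} bits") # 0.0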

*) Later on page 2 it says:

 Cryptographers want conditional entropy, but for UUIDs, and
 other applications, what is needed is unconditional entropy.

First of all, you can't throw around the term "conditional
entropy" without saying what it's conditioned on.

Conditioned on everything known to the attacker, of course.

Also importantly, for UUIDs no entropy is required at all.
A globally-accessible counter has no entropy whatsoever, and
suffices to solve the UUID problem.

A counter is fine as long as there is only one machine in the universe
that will ever assign UUIDs.  However, if you want to do distributed
generation of UUIDs, then counters are insufficient because there is no
way to prevent overlap of two machines' counter spaces.

Perhaps what Ben should have said is that:
* Unconditional entropy is sufficient for UUIDs;
  conditional entropy is not needed.
* For centrally-assigned UUIDs, even unconditional entropy is unnecessary;
  a centrally-managed counter is fine.
* For distributed, unsynchronized assignment of UUIDs, unconditional
  entropy appears to be necessary and sufficient.

*) At the bottom of page 2 it says:

 Well, for many purposes, the system time has plenty of
 unconditional entropy, because it is usually different when
 used to generate different UUIDs.

No, the system time has no entropy whatsoever, not by any
reasonable definition of entropy.

Ok, this seems like a fair criticism.

*) On page 4, I find the name "trueprng" to be an oxymoron.
The P in PRNG stands for "pseudo", which for thousands of
years has meant "false", i.e. the opposite of "true".

Another reasonable point.  Perhaps "truerng" would be a better name, then?



Entropy and PRNGs

2005-01-07 Thread Ben Laurie
Given recent discussion, this is perhaps a good moment to point at a 
paper I wrote a while back on PRNGs for Dr. Dobb's, which, I'll bet,
most of you didn't read.

http://www.apache-ssl.org/randomness.pdf
One day, I plan to implement the API I describe there.
Cheers,
Ben.
--
http://www.apache-ssl.org/ben.html   http://www.thebunker.net/
There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit. - Robert Woodruff