On Dec 30, 2008, at 2:11 PM, Jerry Leichter wrote:
On Dec 30, 2008, at 4:40 PM, Jon Callas wrote:
We don't have a formal definition of what we mean by random. My
definition is that it needs to be unguessable: if I have a random
number, the work factor for you to guess it is, more or less, its
randomness. It's a Shannonesque way of looking at things, but not
precisely information-theoretic.
I don't think this quite captures the situation. It's easy to give
a formal definition of randomness; it's just not clear that such a
thing can ever be realized.
And that is, pretty much, the point. A formal definition that gives no
guidance to people trying to build things is mathematics as art. Not
that art is a bad thing -- I adore mathematics as art, and my time as
a mathematician was all in the art department. It's just not useful to
the part of me that's an engineer. Pretty has worth, just different
worth than useful.
Here's one definition: A random bitstream generator is an
"isolated" source of an infinite stream of bits, numbered starting
at zero, with the property that a Turing machine, given as input
anything about the universe except the internal state of the
"isolated" source, and bits 0 to n generated by the source, has no
better than a 50% chance of correctly guessing bit n+1. The
difficulty is entirely in that quoted "isolated". It's not so much
that we can't define it as that given any definition that captures
the intended meaning, there are no known systems that we can
definitely say are "isolated" in that sense. (Well, there's kind of
an exception: Quantum mechanics tells us that the outcome of
certain experiments is "random", and Bell's Theorem gives us some
kind of notion of "isolation" by saying there are no hidden
variables - but this is very technically complex and doesn't really
say anything nearly so simple.)
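For concreteness, here is a small Python sketch of that next-bit test.
The WeakSource below is a toy generator invented purely for
illustration (it is not any real RNG); because its internal state is
only 8 bits, a predictor that brute-forces the state does far better
than 50%, while against a strong source it cannot.

import secrets

class WeakSource:
    """Toy 8-bit linear congruential generator; emits its top state bit."""
    def __init__(self, seed):
        self.state = seed & 0xFF

    def next_bit(self):
        self.state = (self.state * 37 + 11) & 0xFF
        return (self.state >> 7) & 1

def predict_next(prefix):
    """Brute-force all 256 possible seeds, keep those consistent with
    the observed prefix, and let them vote on the next bit."""
    votes = [0, 0]
    for seed in range(256):
        g = WeakSource(seed)
        if all(g.next_bit() == b for b in prefix):
            votes[g.next_bit()] += 1
    if votes == [0, 0]:
        return secrets.randbits(1)  # no consistent seed: guess at random
    return 1 if votes[1] >= votes[0] else 0

# Against the weak source the predictor does far better than chance...
src = WeakSource(secrets.randbits(8))
weak = [src.next_bit() for _ in range(64)]
hits = sum(predict_next(weak[:n]) == weak[n] for n in range(16, 64))
print("weak source:   %d/48 bits predicted" % hits)   # nearly all 48

# ...while against an (assumed) strong source it hovers around half.
strong = [secrets.randbits(1) for _ in range(64)]
hits = sum(predict_next(strong[:n]) == strong[n] for n in range(16, 64))
print("strong source: %d/48 bits predicted" % hits)   # around 24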
A while back, I wrote to this list about some work toward a stronger
notion of "computable in principle", based on results in quantum
mechanics that limit the amount of computation - in the very basic
sense of bit flips - that can be done in a given volume of space-
time. The argument is that a computation that needs more than this
many bit flips can't reasonably be defined as possible "in
principle" just because we can describe what such a computation
would look like, if the universe permitted it! One might produce a
notion of "strongly computationally random" based on such a theory.
Curiously, as I remarked in that message, somewhere between 128 and
256 bits of key, a brute-force search transitions from "impractical
for the foreseeable future" to "not computable in principle". So at
least against brute-force attacks, we're actually at the limits
already. Perhaps it might actually be possible to construct such a
"random against any computation that's possible in principle" source.
A deterministic but chaotic system that is sufficiently opaque
gets pretty close to random. Let's just suppose that the model they
give of photons bouncing in their laser is Newtonian. If there's
enough going on in there, we can't model it effectively and it can
be considered random because we can't know its outputs.
I don't like the notion of "opaqueness" in this context. That just
means we can't see any order that might be in there. There's a
classic experiment - I think Scientific American had pictures of
this maybe 10 years back - in which you take a pair of concentric
cylinders, fill the gap between them with a viscous fluid, and draw
a line of dye in the fluid parallel to the cylinders' common axis.
Now slowly
turn the inner cylinder, dragging the dye along. This is a highly
chaotic process, and after a short time, you see a completely random-
looking dispersion of dye through the liquid. Present this to
someone and any likely test will say this is quite random. But ...
if you slowly turn the inner cylinder backwards - how "slowly", for
both directions of turn, depends on details of the system - the
original line of dye miraculously reappears.
That's why it's not enough to have chaos, not enough to have
opaqueness. The last thing you want to say is "this system is so
complicated that I can't model it, so my opponent can't model it
either, so it's random". To the contrary, you *want* a model that
tells you something about *why* this system is hard to predict!
Exactly. You've described a chaotic but easily reversible system, and
that makes it unsuitable as a randomness source by my "unguessability"
metric. There exists an algorithm that's faster than guessing, and
therefore it isn't random.
However, on top of that, there's a problem that hardware people
(especially physicists) just don't get about useful randomness,
especially cryptographic random variables. Dylan said that to live
outside the law, you must be honest. A cryptographic random
variable has to look a certain way; it has to be honest. It's got
to be squeaky clean in many ways. A true random variable does not.
A true random variable can decide that it'll be evenly distributed
today, normal tomorrow, or perhaps Poisson -- the way we decide
what restaurant to go to. No, no, not Italian; I had Italian for
lunch.
That's why we cryptographers always run things through a lot of
software. It's also why we want to see our hardware randomness, so
we can correct for the freedom of the physical process. Imagine a
die that is marked with a 1, four 4s, and a 5. This die is crap to
play craps with, but we can still feed an RNG with it. We just need
to know that it's not what it seems.
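As a concrete sketch of what that software does, here is one way
(purely my own illustration, using SHA-256 as the conditioner) to turn
that lopsided die into usable seed material, charging it only for its
worst-case entropy:

import hashlib
import math
import random

FACES = [1, 4, 4, 4, 4, 5]   # the die from the example

def roll():
    """Stand-in for the physical die; random.choice is only a simulation."""
    return random.choice(FACES)

# The most likely face comes up 4 times in 6, so each roll is worth only
# about -log2(4/6) ~= 0.58 bits to someone who knows how the die is loaded.
min_entropy_per_roll = -math.log2(4 / 6)

# Gather comfortably more raw entropy than the 256 bits we want (a 2x
# safety margin here) and hash the whole transcript of rolls together.
rolls_needed = math.ceil(2 * 256 / min_entropy_per_roll)
raw = bytes(roll() for _ in range(rolls_needed))
seed = hashlib.sha256(raw).digest()

print("rolls used:", rolls_needed)   # 876 rolls for 256 conditioned bits
print("seed:", seed.hex())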
This simply says that *known* bias and randomness are completely
separate notions. I can always get rid of any *known* bias. Bias
that's unknown/unmodeled to me as the *user* of the system, on the
other hand, is very dangerous if an attacker might conceivably know
more about the bias than I do.
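One classical example of getting rid of a simple, modeled bias is von
Neumann's trick: take independent flips in pairs, output 0 for (0,1)
and 1 for (1,0), and discard the rest. A small sketch (the 90% bias
below is just an illustrative assumption) -- note that it removes
exactly the bias it models, and nothing more:

import random

def biased_flip(p_one=0.9):
    """Simulated badly biased coin; p_one is an illustrative bias."""
    return 1 if random.random() < p_one else 0

def von_neumann(flips):
    """Debias a stream of independent, identically biased bits."""
    it = iter(flips)
    for a, b in zip(it, it):   # consume the stream two bits at a time
        if a != b:
            yield a            # (0,1) -> 0, (1,0) -> 1

raw = [biased_flip() for _ in range(100_000)]
clean = list(von_neumann(raw))
print("kept %d of %d bits, fraction of ones: %.3f"
      % (len(clean), len(raw), sum(clean) / len(clean)))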
Yes. Again, we're in agreement on this.
If I think some system is unguessable to 256 bits, and it's really
unguessable to only 10 bits, then someone who knows it's really ten
bits can easily search a state space that I think is unsearchable.
(At least in principle -- ten bits' worth of calculations that take a
century each is a different thing entirely, but that's only because
centuries are
longer than nanoseconds.)
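A tiny sketch of that gap, with a made-up key derivation (SHA-256 over
a 10-bit seed) standing in for whatever weak seeding a real system
might have:

import hashlib
import secrets

def derive_key(seed):
    """Hypothetical flawed derivation: 256-bit key from a 10-bit seed."""
    return hashlib.sha256(seed.to_bytes(2, "big")).digest()

# The defender believes this 32-byte key is unsearchable...
victim_key = derive_key(secrets.randbits(10))

# ...but an attacker who knows the real state space tries all 2^10 seeds.
recovered = next(s for s in range(1024) if derive_key(s) == victim_key)
print("recovered seed:", recovered)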
Jon
---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [email protected]