
On Fri, Aug 16, 2013 at 09:32:02AM -0400, shawn wilson wrote:
> I thought that decent crypto programs (openssh, openssl, tls suites)
> should read from random so they stay secure and don't start generating
> /insecure/ data when entropy runs low. The only way I could see this as
> being a smart thing to do is if these programs also looked at how much
> entropy the kernel had and stopped when it got ~50 or so. Is this the way
> things are done when these programs use urandom or what?

It's not that the data is "insecure". It's that it becomes theoretically
"predictable". The key word being "theoretically". BIG difference.

The /dev/urandom device in the Linux kernel falls back to a cryptographic
pseudorandom number generator once the entropy pool has been exhausted.
(This generator is often said to be Yarrow, but Linux's is actually built
around SHA-1; Yarrow is the one FreeBSD uses.) Its internal state is 160
bits, and a generator with a 160-bit state can pass through at most 2^160
distinct states before revisiting one, so after at most 2^160 outputs the
sequence must cycle. If you were generating images with a generator whose
cycle you had exhausted, the resulting image would show a pattern. See the
bitmap analysis at http://www.random.org/analysis/, where PHP rand() on
Microsoft Windows produces exactly that kind of visibly patterned image.
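
To make the finite-state argument concrete, here's a toy sketch in Python
(my own illustration, nothing from the kernel): a generator with only 8
bits of state, whose cycle is short enough to watch repeat.

    # Toy illustration: any generator with n bits of state must cycle
    # within 2**n outputs. With n = 8 the cycle is short enough to see.
    def toy_prng(state):
        while True:
            # Arbitrary 8-bit linear congruential step (illustrative only).
            state = (37 * state + 11) % 256
            yield state

    gen = toy_prng(1)
    first = [next(gen) for _ in range(256)]
    again = [next(gen) for _ in range(256)]
    print(first == again)  # True: the sequence has already repeated

The same pigeonhole argument applies to a 160-bit state; the cycle just
becomes astronomically long.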

2^160 ~= 1.5x10^48. To put this number in perspective: 1 TB ~= 1.1x10^12
bytes, 1 PB ~= 1.1x10^15, 1 EB ~= 1.2x10^18, and 1 ZB ~= 1.2x10^21. One
zettabyte is 1,024 exabytes = 1,048,576 petabytes = 1,073,741,824
terabytes, and even that is nowhere close to 2^160.
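
If you want to check the arithmetic yourself, a couple of lines of Python
will do it (a quick sanity check, nothing more):

    # Back-of-the-envelope check of the figures above, in exact integers.
    state_space = 2 ** 160            # upper bound on the cycle length
    print(f"{state_space:.2e}")       # ~1.46e+48

    ZB_bits = 8 * 2 ** 70             # bits in one zettabyte
    print(f"{state_space / ZB_bits:.2e} zettabytes before a forced repeat")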

So, in sheer terms of size, you would need to generate immense amounts of
data before the output repeats. And repetition is not the same as
prediction: to predict the output you would have to recover the 160-bit
internal state from what the generator emits, which is exactly what the
cryptographic hash at its core is designed to prevent. If the state were
somehow recovered, then yes, you could theoretically reproduce the same
data. Again, the keyword is "theoretically".

Cryptographers don't like the idea that it's possible, even if it's
excessively remote and highly improbable. This is why you see suggestions
to use /dev/random for long-term SSH, SSL and OpenPGP keys. If there is
even a 0.0000000000000000000000000000000000000000001% chance that the data
could be predicted, you're better off relying on chaotic events rather
than pseudorandom ones.
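
The practical difference between the two devices is easy to see from
userspace. This read (standard Linux device paths) may stall on an idle
machine until the kernel's entropy estimate recovers, while the same read
from /dev/urandom returns immediately:

    # Reading 32 bytes of key material from /dev/random. On an idle
    # machine this can block; /dev/urandom never does.
    with open("/dev/random", "rb") as dev:
        key = dev.read(32)
    print(key.hex())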

It turns out, getting good, high-quality, truly random, chaotic data into
your kernel isn't really all that difficult. All you need to do is rely on
quantum or chaotic physical processes, which are the only true sources of
randomness, as much as anything can be random. Some things people have
done (the feeding step itself is sketched just after this list):

    * Tuned their radio to atmospheric noise and fed it into their kernel
      through their sound card.
    * Built reverse-biased P-N junctions and timed the electron jumps
      (avalanche noise).
    * Timed the radioactive decay of Americium-241, common in everyday
      household smoke detectors.
    * Opened up the CCD on a web camera fully inside a completely dark box.
    * Sampled thermal noise from resistors.
    * Measured clock drift between quartz-based clocks, and power
      fluctuations.
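
As promised, here's what the feeding step might look like. The
RNDADDENTROPY ioctl and the rand_pool_info layout are standard Linux, but
the noise-source path and the bits-per-byte estimate below are made-up
placeholders, and crediting entropy this way requires root. (Simply
writing bytes to /dev/random mixes them into the pool but credits no
entropy; the ioctl does both.)

    import fcntl
    import struct

    RNDADDENTROPY = 0x40085203   # _IOW('R', 0x03, int[2]) on Linux

    def credit_entropy(noise, bits):
        # struct rand_pool_info { int entropy_count; int buf_size;
        #                         __u32 buf[]; }
        payload = struct.pack("ii", bits, len(noise)) + noise
        with open("/dev/random", "wb") as rnd:
            fcntl.ioctl(rnd, RNDADDENTROPY, payload)

    # Hypothetical source: 512 bytes sampled from a sound card tuned to
    # atmospheric static; credit a conservative 4 bits per byte.
    with open("/dev/hypothetical_noise", "rb") as src:
        sample = src.read(512)
    credit_entropy(sample, 4 * len(sample))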

In any event, using /dev/urandom is perfectly secure in practice: the
kernel's generator has withstood years of scrutiny without a practical
attack. So, let's dispel the myth that using /dev/urandom is insecure. :)
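
In application code you usually don't even open the device yourself; for
example, Python's os.urandom reads from the same kernel source:

    import os

    # 32 bytes (256 bits) of kernel-sourced randomness, suitable for
    # keys, nonces and salts. This never blocks.
    key = os.urandom(32)
    print(key.hex())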

-- 
. o .   o . o   . . o   o . .   . o .
. . o   . o o   o . o   . o o   . . o
o o o   . o .   . o o   o o .   o o o
