I'd like to expand on a point I made a little while ago about the
"just throw everything at it, and hope the good sources drown out the
bad ones" entropy collection strategy.

The biggest problem in security systems isn't whether you're using
128-bit or 256-bit AES keys or similar trivia. The biggest problem is the
limited ability of the human mind to understand a design. This leads
to design bugs and implementation bugs. Design and implementation
flaws are the biggest failure mode for security systems, not whether
it will take all the energy in our galaxy vs. the entire visible
universe to brute force a key.

So, if you're designing any security system, the biggest thing on your
mind has to be how to validate that the system is secure. That
requires ways to know your design was correct, and ways to know you
actually implemented your design correctly.

"Just throw the kitchen sink at it" impedes both. As just one trivial
example, say that your code is buggy and instead of throwing in eight
entropy sources, you're really throwing out #1-#7 and only using #8,
perhaps because of some overwriting that you thought was an xor or
some similar stupidity. How are you going to notice this on the other
end of your SHA-256 pool mashing? Will you look at the output by hand
and magically see that the numbers aren't as random as they should be?
Probably not. Let's then say #8 turned out to be a less-than-random
source.
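A minimal Python sketch of this failure mode (the function names and
source values are hypothetical, purely for illustration): an assignment
that was meant to be an XOR silently discards sources #1-#7, and the
buggy and correct versions both produce equally plausible-looking
SHA-256 output.

```python
import hashlib

# Eight hypothetical 32-byte entropy sources (deterministic here,
# purely for illustration).
sources = [hashlib.sha256(bytes([i])).digest() for i in range(8)]

def mix_buggy(sources):
    pool = bytes(32)
    for s in sources:
        pool = s  # BUG: meant to be an XOR, but overwrites the pool
    return hashlib.sha256(pool).hexdigest()

def mix_correct(sources):
    pool = bytes(32)
    for s in sources:
        pool = bytes(a ^ b for a, b in zip(pool, s))  # XOR each source in
    return hashlib.sha256(pool).hexdigest()

# The buggy pool depends only on source #8...
assert mix_buggy(sources) == hashlib.sha256(sources[-1]).hexdigest()
# ...yet both outputs are 64 hex digits of perfectly plausible "noise".
print(mix_buggy(sources))
print(mix_correct(sources))
```

Eyeballing either hex string tells you nothing; the bug is visible only
in the code, or to a test that exercises each source individually.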

Is this sort of stupid move theoretical? Ask the guys at Debian how
theoretical it is. (A Debian patch to OpenSSL's random number
generator, discovered in 2008, had removed the entropy mixing, leaving
the process ID as essentially the only seed material, so keys
generated on Debian systems for nearly two years came from a tiny,
enumerable space.)

At the very least, you need to have a very good set of tests attached
to your software that get run every time any sort of change (no matter
how trivial) is made to it, and even that is not necessarily enough.

If you're doing all of this properly (and few people do), you need a
good way of knowing that your system works right. That is hard at the
best of times; when the product of your algorithm is supposed to look
like nonsense, it becomes especially important that you have a well
understood design that has been turned into a well validated
implementation. "Throw the kitchen sink at it" often ends up kicking
up enough dust that you can't distinguish something good from
something bad. That can be a problem.
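To make the "dust" concrete, here is a hedged Python sketch: a stream
with zero entropy (SHA-256 of a plain counter) sails through a crude
monobit check exactly as real randomness would, so inspecting output
statistics cannot tell a worthless pool from a good one.

```python
import hashlib

# Hash a predictable counter -- zero entropy, fully reproducible.
stream = b"".join(hashlib.sha256(i.to_bytes(4, "big")).digest()
                  for i in range(1000))

# Crude monobit check: fraction of 1 bits in the stream.
ones = sum(bin(byte).count("1") for byte in stream)
frac = ones / (len(stream) * 8)

# Hovers near 0.5, just as it would for genuine entropy.
assert abs(frac - 0.5) < 0.01
```

More sophisticated statistical batteries fare no better here: any
decent hash makes predictable input look statistically random, which
is exactly why validation has to happen at the design and code level.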

Always bear in mind that security systems fail because of people, and
that as a security system designer or implementor, you are the first
and most likely point of failure in the entire process.

Perry
-- 
Perry E. Metzger                pe...@piermont.com

---------------------------------------------------------------------
The Cryptography Mailing List