On 02/21/2012 08:31 PM, Kevin W. Wall wrote:
Apologies for this being a bit OT as far as the charter of this list goes, and perhaps a bit self-serving as well. I hope you will bear with me.
Meh. I think I've seen worse. :-)
To a degree, I think it is more ignorance than outright incompetence. Overall, developers are generally much better than the general public when it comes to analytic and reasoning abilities, and I think the Dunning-Kruger effect you mention is a good explanation. But this phenomenon goes *way* beyond developers' ignorance of cryptography. It even goes way beyond a general ignorance of information security.
Most developers are under a great deal of pressure to complete tasks. They get lots of tasks done -> they get a raise. They get so many tasks done that they inspire (or frighten) others into completing tasks -> they get a promotion. Doubly so for new and inexperienced developers. Now, think of the mindset required to code securely. It involves digging deeper, asking a lot of dumb-sounding questions, generally being more cautious, and constantly brainstorming ways to break things and reasons *not* to ship the functionality. In short, it is an absolutely toxic mindset for a new developer to have at the vast majority of entry-level developer jobs.
A great example of this: time and time again, I encounter _web_ application developers who have absolutely no clue how HTTP works as a protocol. That just seems so counterintuitive to me. Yet at least among younger web developers, it seems to be the rule rather than the exception.
For many of them, not only is it their first exposure to a protocol, it's their first programming project. Protocols are *hard*, even simple ones. Tim Berners-Lee. Brilliant guy. Working at CERN. HTTP 0.9. Need I say more? ;-)
Some of this can be "blamed" on the fact that web developers deal with higher and higher levels of abstraction, until eventually, they really don't need to understand what a Set-Cookie response header looks like. All of us do this to some extent, but I think it is becoming more common and therefore more noticeable because 1) technology moves at an ever increasing pace and 2) IT management still hasn't figured out that developers can't wear all hats and that there is no substitute for expertise. IT management still thinks that all members of technical staff are completely interchangeable. What does this have to do with the Dunning-Kruger effect? Well, I think that it encourages developers, especially younger ones, to fake it. Back when I started (now over 30 yrs ago!), it was OK to admit your ignorance, at least at Bell Labs.
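For what it's worth, the Set-Cookie case makes a nice litmus test: a developer who has only ever touched a framework's session API may never have looked at the raw exchange. A made-up response (all header values here are purely illustrative) looks something like:

```
HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
Set-Cookie: JSESSIONID=abc123; Path=/; Secure; HttpOnly
Content-Length: 42
```

If the Secure and HttpOnly attributes on that one line mean nothing to a web developer, that is exactly the abstraction gap being described.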
From everything I've heard, that was a very special place.
And you could always find someone to mentor you if you wanted to learn something new. Not so today. Most people are too busy and I haven't seen any _formal_ mentorship programs in any company for at least the past 25 years.
Are companies still complaining "colleges aren't producing enough graduates proficient in language X on platform Y"?
So, let's bring this back to cryptography. I'm going to assume that virtually all of you are somewhat altruistic and are not in this game just to make a boatload of money by keeping all the crypto knowledge within the secret priesthood, thereby driving your own salaries up.
Hmmm... is there anything I need to sign? :-)
For starters, I would urge those of you who are not involved in the open source movement to step up and help out with things like OpenSSL, OpenSSH, cryptographic libraries (in languages *other* than C/C++), etc. Personally, I would *more* than welcome someone here stepping forward and volunteering to head up the crypto effort in OWASP ESAPI.
I think I looked at it briefly a year or two ago and, frankly, where I got hung up was that it was written in Java. I hate to be a purist, but I just feel uncomfortable with crypto code written in a language that doesn't have guaranteed constant-time operations (e.g. string comparisons) or secure memory overwrite functions. Could be worse, I suppose. Some days it seems that Javascript crypto is inevitable.
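To make the constant-time point concrete, here's a minimal Java sketch of the kind of comparison String.equals does NOT give you: equals() returns at the first mismatching character, so the time it takes leaks how much of a secret (say, a MAC) an attacker has guessed correctly.

```java
import java.nio.charset.StandardCharsets;

public class ConstantTimeCompare {
    // XOR-accumulate every byte so the loop's duration does not
    // depend on where the first mismatch occurs.
    static boolean constantTimeEquals(byte[] a, byte[] b) {
        if (a.length != b.length) return false;
        int diff = 0;
        for (int i = 0; i < a.length; i++) {
            diff |= a[i] ^ b[i];
        }
        return diff == 0;
    }

    public static void main(String[] args) {
        byte[] mac1 = "deadbeef".getBytes(StandardCharsets.UTF_8);
        byte[] mac2 = "deadbeef".getBytes(StandardCharsets.UTF_8);
        byte[] mac3 = "deadbXef".getBytes(StandardCharsets.UTF_8);
        System.out.println(constantTimeEquals(mac1, mac2)); // true
        System.out.println(constantTimeEquals(mac1, mac3)); // false
    }
}
```

And of course nothing in the Java language spec guarantees the JIT won't optimize even this into something variable-time, which is exactly the source of the discomfort.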
Even though some people from the NSA have reviewed it, I'm paranoid enough to think that it's what they are NOT telling me is wrong that worries me.
I look at it this way. The US government has more information systems than anybody, and ostensibly part of the NSA's job is to secure them, or at least put some parameters on the level of exposure. Are they eating the dog food themselves? If so, take it as the highest endorsement.
I know many of you have already contributed (I won't attempt to name names because I'd probably unintentionally leave a few of you out and offend them), but not nearly enough.
Most of you who regularly post to this mailing list have commented on how you've seen some of the same beginner crypto failures over and over, so how about starting with just a simple crypto HowTo FAQ, maybe an OWASP crypto cheat sheet.
Something like the opposite of this maybe:

52 Things People Should Know To Do Cryptography
http://www.cs.bris.ac.uk/Research/CryptographySecurity/knowledge.html

"If at the end of the first year you know the answers to ninety percent of the things we list then you should find that you will get more out of the conferences and talks you attend in your second year."

The implication here being that there are maybe only a handful of people in the world who should "Do Cryptography".

"In addition it will be easier to talk to cryptographers (who may be future employers) from other institutions since you will be able to converse with them using a common language."

The implication here being that cryptographers belong in institutions because they are unable to communicate with the rest of us.

Now, look at their curriculum. Don't get me wrong, I think it is awesome, but man are these dudes ever obsessed with number theory! With the notable exceptions of 17, 18, 43, and some side channel stuff, almost all the rest are about RSA, ECC, and discrete log problems. Number 49 under "Advanced protocols and constructions" is "Describe the basic ideas behind IPsec and TLS." Well, these are the most commonly used crypto protocols in existence; why should their "basic ideas" be so "advanced"?

I think these are clearly mathematicians selecting the topics that they find the most amenable to their own analytic tools. Again, good for them, somebody needs to do it, and we all benefit from it. But the rest of the world has a lot of software engineers who are now expected to "do cryptography" and data security as part of their ordinary job. To me this goes a long way towards explaining how exploits like Duong & Rizzo's BEAST can be demonstrated for the first time so many years (and publication cycles) after the basic ideas of the attack were published by academics. (Although some of that round of academics deserve credit for rolling up their sleeves and fixing the problem in IPsec.)
Consider this... if *you* don't help, then the crypto will have to be left up to non-experts like me to work on. And the only *major* difference between myself and complete crypto newbs is that I know that I don't know (and don't hesitate to squeal for help). Others don't know what they don't know, so they don't ask, and we've all seen the result. Contributions to the community can come in many forms, whether something simple, like a FAQ or a single crypto course on YouTube, or something much more complex, like a book aimed at beginner / intermediate developers. From where I sit, I see the following things that the development community in general is lacking when it comes to things crypto: 1) They think that key size is the paramount thing; the bigger the better.
I think we have NSA to thank for that one what with all those 40-bit "export grade" keys and such.
2) They have no clue as to what cipher modes are. It's ECB by default. 3) More importantly, they don't know how to choose a cipher mode (not surprising, given #2). They need to understand the trade-offs.
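The "ECB by default" complaint is literal, at least in Java: ask the JCE for plain "AES" and (with the SunJCE provider) you get ECB with PKCS5 padding. A small sketch of why that matters -- under ECB, identical plaintext blocks always encrypt to identical ciphertext blocks, so the structure of your data leaks straight through:

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import java.util.Arrays;

public class EcbLeak {
    // Encrypt two identical 16-byte plaintext blocks and report whether
    // the two ciphertext blocks come out identical (under ECB, they do).
    static boolean ecbLeaksStructure() throws Exception {
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();
        // Cipher.getInstance("AES") falls back to the provider default,
        // which in SunJCE is exactly this: ECB with PKCS5 padding.
        Cipher ecb = Cipher.getInstance("AES/ECB/PKCS5Padding");
        ecb.init(Cipher.ENCRYPT_MODE, key);
        byte[] twoIdenticalBlocks = new byte[32]; // two all-zero blocks
        byte[] ct = ecb.doFinal(twoIdenticalBlocks);
        return Arrays.equals(Arrays.copyOfRange(ct, 0, 16),
                             Arrays.copyOfRange(ct, 16, 32));
    }

    public static void main(String[] args) throws Exception {
        System.out.println("identical ciphertext blocks: " + ecbLeaksStructure());
    }
}
```

A chained or counter mode (CBC with a random IV, or better, an authenticated mode) breaks that equality, which is the first trade-off a developer needs to understand.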
4) They have no idea how to generate keys, derived keys, or IVs.
PBKDF2 really is an excessively well-kept secret.
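And it's an odd secret to keep, since PBKDF2 has shipped with the JDK since Java 6 -- it's one SecretKeyFactory lookup away. A minimal sketch (the iteration count and salt size here are illustrative, not a recommendation):

```java
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;
import java.security.SecureRandom;

public class Pbkdf2Demo {
    // Derive keyBits of key material from a password with PBKDF2-HMAC-SHA1,
    // the variant bundled with the JDK (Java 6+).
    static byte[] derive(char[] password, byte[] salt, int iterations, int keyBits)
            throws Exception {
        PBEKeySpec spec = new PBEKeySpec(password, salt, iterations, keyBits);
        return SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1")
                               .generateSecret(spec).getEncoded();
    }

    public static void main(String[] args) throws Exception {
        byte[] salt = new byte[16];
        new SecureRandom().nextBytes(salt); // random per-user salt
        // Iteration count is illustrative; tune it to your hardware budget.
        byte[] key = derive("correct horse battery staple".toCharArray(),
                            salt, 10_000, 256);
        System.out.println("derived key bytes: " + key.length); // 32
    }
}
```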
5) They don't know what padding is, or when/why to use it.
But any competent software engineer could easily re-invent it as needed.
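For the record, here is roughly what PKCS#7 padding amounts to. The scheme really is simple enough that people do re-invent it -- and re-invent the padding-oracle bugs in the unpadding-and-error-handling side along with it:

```java
public class Pkcs7Pad {
    // PKCS#7: append N bytes, each with value N, where N brings the
    // length up to the next multiple of blockSize (always at least one
    // byte, so an exact-multiple input gains a whole block of padding).
    static byte[] pad(byte[] input, int blockSize) {
        int padLen = blockSize - (input.length % blockSize);
        byte[] out = new byte[input.length + padLen];
        System.arraycopy(input, 0, out, 0, input.length);
        for (int i = input.length; i < out.length; i++) {
            out[i] = (byte) padLen;
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] padded = pad(new byte[]{1, 2, 3, 4, 5}, 8);
        System.out.println(padded.length); // 8
        System.out.println(padded[7]);     // 3
    }
}
```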
6) They have a very naive concept of entropy... where/when to use it, and where and how to obtain it.
To be fair, that's a deep topic that borders on the philosophical. This great blog post came across my Twitter feed today:
http://blog.cryptographyengineering.com/2012/02/random-number-generation-illustrated.html
But if I could make one minor criticism, it is that it talks a lot about 'entropy' without giving a useful intuitive definition of it. It talks about entropy exclusively in terms of 'unpredictability', which I think misses the essential point necessary for thinking about actual systems: entropy is a measure of uncertainty experienced by a specific attacker. It is an attempt to quantify the absence of information that he perceives when he stares into the black hole of our cryptosystem. It is how much he learns, in bits, if the lid of our black box were opened and its secret contents revealed to him.

Entropy is likely different for every attacker. An attacker who can access a recent snapshot of our VM likely perceives a different amount of entropy than one who can observe packets on your network interface. It's not even that exactly. In the real world, our own knowledge of the attacker's information is itself degraded by uncertainty, so at best we can hope to accurately estimate some lower bounds on the entropy experienced by the attacker.

In theory, the defender is presumed to possess the private key outright and thus perceives the system as having zero bits of entropy. But in the real world, we as defenders find ourselves in a constant state of uncertainty and unpredictability about our own systems. How often when we write a program can we be certain what kind of processor it will be executed on? We might have an idea of a class of CPUs our binary targets without recompilation, but even then one or more levels of emulation and virtualization are increasingly common in deployment. Today when we deploy networking gear it's often closed-source software running on closed-source hardware manufactured in the same geographic location as the IP addresses which are later attacking it. How do we know how secure it is? We know from repeated experience that the security of even the top name brands is "unpredictable".
Addressing the difficulties of operating in such an environment seems likely to be the defining characteristic of the upcoming era of applied computer science. Engineers will sit down and think "well, I don't know how to predict it, therefore it satisfies the unpredictability requirement, therefore it has sufficient entropy". (cf. Schneier's Law) If you've ever worked with state-certified licensed engineers still paying off their student loans, you'll know what I'm talking about. Couple that with the task-oriented eager mindset and we clearly have a recipe for disaster.
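That fallacy has a classic concrete form: seeding a non-cryptographic PRNG with the clock. The output looks unpredictable to its author, but to an attacker who can bound the timestamp, the perceived entropy is a handful of bits. A sketch (the timestamp and six-digit token scheme are made up for illustration):

```java
import java.util.Random;

public class WeakSeed {
    // The "random" token a developer might generate from a clock seed.
    static int tokenFor(long clockSeed) {
        return new Random(clockSeed).nextInt(1_000_000);
    }

    // The attacker doesn't need the exact seed -- scanning a plausible
    // one-second window of millisecond timestamps is enough.
    static boolean attackerRecovers(long actualSeed) {
        int token = tokenFor(actualSeed);
        for (long guess = actualSeed - 500; guess <= actualSeed + 500; guess++) {
            if (tokenFor(guess) == token) {
                return true; // the seed (or an equivalent one) is found
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // Hypothetical server clock value, purely for illustration.
        System.out.println("recovered: " + attackerRecovers(1329868270000L));
    }
}
```

To the developer the token was "unpredictable"; to this attacker it had at most ten bits of entropy. That is the attacker-relative definition in action.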
Fill in your own favorites. These are just the ones that immediately popped to mind.
"I'm sorry. I'm thinking about [entropy] again."
http://www.youtube.com/watch?v=sP4NMoJcFd4

But I'll leave it with this question. It sounds a bit obvious and familiar in writing it, but maybe good questions are like that. To what extent are we taking the unpredictability that we perceive in our own systems and projecting it onto the attacker? Possibly to the point of causing us to overestimate his effective perceived entropy?

- Marsh

_______________________________________________
cryptography mailing list
[email protected]
http://lists.randombit.net/mailman/listinfo/cryptography
