On 2011-07-04, Jon Callas wrote:
> Let me be blunt here. The state of software security is so immature that worrying about crypto security or protocol security is like debating the options between hardened steel and titanium, when the thing holding the chain of links to the actual user interaction is a twisted wire bread tie.
Agreed: human factors are the number one problem, with script-kiddie-level bugs coming a close second. However, the few real-life problems caused by defective protocols or algorithms have the potential for very wide impact, including on high-value institutions which have already nailed the usual problems, and to be very costly to repair once the algorithms have been set in silicon.
> Yeah, it's hard to get the crypto right, but that's why they pay us.
That's one thing No Options also helps with: it should be the paid cryptographers who make the hard choices, not the end user.
> That's just puritanism, the belief that if you just make a few absolute rules, everything will be alright forever.
I'd rather think of myself as an empiricist: now that we have a long track record of complex crypto protocols going wrong more often than simple ones or the primitives they employ, we should perhaps fix at least this corner of the overall problem.
> I'm smiling as I say this -- puritanism: just say no.
OTOH, Puritanism: The haunting fear that someone, somewhere, may be happy. -- H. L. Mencken
> Meh. My answer to your first question is that you can't. If you want an interesting protocol, it can't resist protocol attacks.
So the corollary of what I'm talking about is that protocols should not be interesting. "May you live in a time with interesting protocols" should perhaps be a cryptographers' curse?
> As for X.509, want to hear something *really* depressing? It isn't a total mess. It actually works very well, even though all the mess about it is quite well documented. Moreover, the more that X.509 gets used, the more elegant its uses are. There are some damned fine protocols using it and just drop it in.
Well, why don't they then just pick each of those elegant uses, codify it into a maximally restricted, formal grammar, and supersede X.509 with the combined results? That could potentially make everybody happy at the same time. And I mean, as you say, that's been the general direction, e.g. within the RFC series. Not to mention many other polymorphic formats -- nowadays it's pretty rare that even ISO pushes out anything too complicated without also defining much-simplified profiles of it.
> Yeah, yeah, having more than one encoding rule is madness, but to make that make you run screaming is to be squeamish.
I'm perfectly happy dealing with complications like these. But the trouble is, not everybody can hack it, and since even a perfectly good implementation can be weakened simply by interoperating with a bad one, this sort of stuff can impact the entire ecosystem.
> However, the problems with PKI have nothing to do [...]
I'm not so sure PKI is completely innocent. I mean, it aims at being a silver bullet which solves any and every authentication-related problem within a single framework and, usually, by reusing the same protocols or formats. To me that seems like a prime reason for high polymorphism and open-ended design.
> OpenPGP is a trivially simple protocol at its purest structure. It's just tag, length, binary blob.
TLV encodings are conceptually rather simple, yes. But in practice once you allow nesting, mix in length fields outside of the block structure, allow indefinite length blocks and reuse of globally defined tag values in different contexts, allow mixing of free form binary and block content, and so on, there's suddenly ample room for error.
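To make the point concrete, here is a minimal sketch of a TLV reader for a *hypothetical* format (not OpenPGP's or BER's actual encoding): a one-byte tag, a one-byte length where 0xFF means "indefinite, terminated by a 0x00 0x00 sentinel", and tag 0x30 marking a nested block. Even this toy already exhibits the ambiguity described above.

```python
# Sketch of a minimal TLV (tag-length-value) reader for a made-up format,
# illustrating where the "ample room for error" comes from once you allow
# nesting and indefinite lengths. All tag/length conventions here are
# hypothetical, chosen only to echo real formats like BER.

def read_tlv(buf, pos=0, depth=0):
    items = []
    while pos < len(buf):
        if depth and buf[pos:pos + 2] == b"\x00\x00":  # end-of-contents sentinel
            return items, pos + 2
        tag, length = buf[pos], buf[pos + 1]
        pos += 2
        if length == 0xFF:
            # Indefinite length: scan for the sentinel. But 0x00 0x00 inside
            # free-form binary content is indistinguishable from the real
            # terminator -- exactly the kind of ambiguity noted above.
            end = buf.index(b"\x00\x00", pos)
            value, pos = buf[pos:end], end + 2
        else:
            value, pos = buf[pos:pos + length], pos + length
        if tag == 0x30:                                # nested block: recurse
            value, _ = read_tlv(value, 0, depth + 1)
        items.append((tag, value))
    return items, pos

# A single definite-length item parses cleanly...
items, _ = read_tlv(bytes([0x04, 0x02, 0xAA, 0xBB]))

# ...and so does an indefinite-length one -- until its payload happens to
# contain the sentinel bytes, at which point two decoders can disagree.
items2, _ = read_tlv(bytes([0x04, 0xFF, 0x01, 0x00, 0x00]))
```

Note that nothing in the reader itself is wrong; the trouble is that the grammar permits byte strings with more than one plausible reading.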
I mean, I can understand why we want extensible protocols, that is, protocols which let the receiver be lax in what it is willing to accept. It's just that crypto doesn't seem to be one of the applications where this sort of polymorphism is too desirable or even useful.
> You know where the convolutedness comes from? A lack of options. That and over-optimization, which is actually a form of unneeded complexity.
Do you happen to have a particular example in mind?
> If you create a system with truly no options, you create brittleness and inflexibility. It will fail the first time an underlying component fails and you can't revise it.
That's why you probably need some minimum form of versioning and/or tagging. But, say, embedding the choice of crypto primitives to be used together in the protocol, letting key lengths vary willy-nilly and that sort of general compositionality, it just doesn't seem too useful to me. Neither does a tagging structure which lets you embed whatever kinds of generic packet types into whatever context -- which is why I've actually become a big fan of ASN.1's implicit tags as opposed to the universal ones.
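The implicit-vs-universal distinction can be shown with a few hand-encoded DER bytes (standard X.690 encodings, no library needed): the value INTEGER 5 under a context-specific tag [0], once with explicit and once with implicit tagging.

```python
# Hand-encoded DER for INTEGER 5 under context-specific tag [0], showing why
# implicit tagging pins content to its context while explicit tagging leaves
# a self-describing universal packet floating inside.

universal = bytes([0x02, 0x01, 0x05])        # INTEGER 5, universal tag 0x02
explicit  = bytes([0xA0, 0x03]) + universal  # [0] EXPLICIT: wraps the whole TLV
implicit  = bytes([0x80, 0x01, 0x05])        # [0] IMPLICIT: 0x02 replaced by 0x80

# With explicit tagging, the inner bytes are still a complete, self-describing
# INTEGER that a lax decoder could lift out and reinterpret in any context.
assert explicit[2:] == universal

# With implicit tagging there is no freestanding universal TLV inside: the
# bytes only mean "INTEGER" in a context whose schema says [0] is one.
assert universal not in implicit
```

That is the appeal alluded to above: an implicitly tagged blob cannot be carried off and decoded as a generic packet somewhere else, because its meaning exists only relative to its defining schema.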
> I think that crypto people are scared of options because options are hard to get right, but one doesn't get away from options by not having them.
I don't think it's just that. It's that principled analysis and (especially automated) testing (if not outright verification) gets a lot more difficult when we move to polymorphism. I don't think anybody's too afraid of the complexity, because simply by being in the business of crypto/security you've already accepted that you'll be dealing with complex systems. It's just that polymorphism and compositionality lead to *open-ended* complexity, which means that you can no longer accurately pin down what you're dealing with in the first place.
> Options are hard, but you only get paid to solve hard problems.
Personally, I'm not in the business of cryptography, so I look at this from the opposite angle: I don't want to be paying anybody for solving hard problems. I just want the protocols I use to be safe and efficient, and to apply to the (usually rather simple) needs I have. I think you can appreciate that from this perspective, polymorphic protocols sometimes seem more like a make-work initiative or an instance of intellectual wanking-off than a sound engineering decision -- after all, the latter eventually solve the problem, making the engineer and his salary unnecessary.
> Let's face it, if your system is as expressive as arithmetic, then you *can't* verify it.
Isn't that precisely why my system should *not* be as expressive as arithmetic?
> The important point is that if you design a secure protocol, formally verify it, and then implement it, how do you know that the implementation didn't accidentally bring in some feature that to the right clever person is a security flaw?
I do understand your point. In the end you can't really be sure. But it's a helluva lot better to be able to apply one or two general formalisms which give you extra assurance, and which for example catch common mistakes automatically or give you clear rules of thumb on how to prevent them.
The reason the Sassaman & Patterson reasoning about formal grammars strikes a chord with me is that it's one such framework, and I've been forced to deal with a number of nasty syntaxes in the past. There's a whole heap of easy-to-understand design and validation principles which flow directly from that kind of reasoning, and which seem to be consistently broken when people come up with new stuff. Starting with "either you validate your input, or you escape it religiously so that it doesn't break your infrastructure, or both". Or, "you don't deal with ambiguous grammars, period". And yet this stuff is routinely botched up.
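As a sketch of the validate-before-acting principle, here is a hypothetical wire format ("user=<name>;cmd=<verb>", both tokens alphanumeric) parsed two ways: against an unambiguous grammar, and "liberally". The names and format are invented for illustration.

```python
import re

# Strict recognition: the whole message must match the grammar before any
# field is acted on. \A and \Z anchor the match to the entire input.
STRICT = re.compile(r"\Auser=([A-Za-z0-9]+);cmd=([A-Za-z0-9]+)\Z")

def parse_strict(msg):
    m = STRICT.match(msg)
    if not m:
        raise ValueError("message does not match the grammar")
    return m.group(1), m.group(2)

def parse_lax(msg):
    # The "liberal in what you accept" approach: scrape fields out of
    # whatever arrives. Duplicate keys silently overwrite each other.
    fields = dict(part.split("=", 1) for part in msg.split(";"))
    return fields["user"], fields["cmd"]

# Both agree on well-formed input...
assert parse_strict("user=alice;cmd=read") == parse_lax("user=alice;cmd=read")

# ...but the lax parser quietly accepts an injected duplicate field
# (last occurrence wins here; another lax implementation might keep the
# first -- a classic parser differential), while the strict one refuses.
assert parse_lax("user=alice;cmd=read;cmd=delete") == ("alice", "delete")
```

The strict parser raises on the injected message instead of guessing, which is precisely the "you don't deal with ambiguous grammars" rule in miniature.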
--
Sampo Syreeni, aka decoy - [email protected], http://decoy.iki.fi/front
+358-50-5756111, 025E D175 ABE5 027C 9494 EEB0 E090 8BA9 0509 85C2
_______________________________________________
cryptography mailing list
[email protected]
http://lists.randombit.net/mailman/listinfo/cryptography
