Hi Ian! (This was originally a private reply, but I'm copying the list because of your "anyone who's better with SSL..." comment.)
On 4/24/05, Ian Grigg <[EMAIL PROTECTED]> wrote:
> Hi Kyle,
>
> > On 4/23/05, Ian Grigg <[EMAIL PROTECTED]> wrote:
> > >> If none of that makes sense, then think of it this
> > >> way: The market is too small, and it is too small
> > >> because it is inefficient.
> > >>
> > >> iang
> >
> > It's inefficient partly due to the difficulty of getting a certificate
> > in the first place -- it's "geek" territory, in the UI, and there
> > haven't been any serious UI studies done on how to make it better.
>
> There have been studies, and they've brought
> out a lot of stuff, but that stuff is rarely
> adopted. The browser world isn't set up to
> adopt ideas from outside, for differing reasons
> (for example, techies don't read academic papers
> coz their boring).

I'd much rather read an academic paper about something that will help
the users of my software get what they need out of it than program
something that won't. I realize that it's impossible to get things
right on the first try, but the security interface in
Netscape/Mozilla/Firefox has been essentially unchanged since Netscape 0.94.

> > It's inefficient partly due to the difficulty of getting a certificate
> > to work in the second place. And for end-entities, it's inefficient
> > partly due to the policies of (say) webmail providers who want to tack
> > something onto the end of any message that the customer sends -- as an
> > example, look at Hotmail and its breakage of digital signatures.
>
> Right. However, your causality is back to
> front; all those say that it is an inefficient
> market, the question I was answering was why
> the market is inefficient. Nothing changes
> until those core *business* factors are addressed,
> which is why Frank's project is by definition the
> most important project in the browser world today
> (because it effects the business factors).

Actually, I think it's nearing time that CAs were taken out of the
hands of businesses entirely...
at least until business can create a coherent idea of what it needs to
accomplish, and design a CA infrastructure that meets those needs.
Commercial operations would run their own CAs, or outsource to others
to run their CA... and each outsourcing would carry a specific
contractual obligation setting the blame for fraud at the cost of the
fraud. (Insurance companies would then put pressure on the outsourced
CAs to reduce the risk of fraud.)

But there's a conceptual issue here: a Certifying Authority can only
certify identities within its own domain, using information that it
has from within its own domain. Verisign et al. are issuing
certificates based on 'legal identity' [what everyone else calls
'real identity', though I balk at that notion as well], but the
problem is that they are /not/ the keepers of the domain of legal
identities. Which means they're taking on a pretension that they
really shouldn't be allowed to have.

I hate to say it, but since government is what makes the legal
identity concept work, it needs to be the various governments that
actually run the 'legal identity' CAs. (Or award contracts to run
such.) And those governments would not be at the US federal level...
instead, the US state level. (I don't know how things are arranged in
other countries, so I can't comment.)

> > Second: CAs need to no longer have such a high barrier to entry.
> > (Even OpenCA is too difficult for most system administrators to set up
> > -- and MS's CA is too difficult to administer.)
>
> Yes. Unwinding the "big expensive CA" model is
> taking a long time. Unfortunately, what people
> don't realise is that model is one of the very things
> that killed the big CAs, they needed the little
> guys to create the food chain, it wasn't possible
> to construct a shark and ban all the other
> competitors ... because the competitors *are* the
> food chain.
> (That's what comes of letting sales
> people design your PKI, they deliberately created
> the big model thinking if they could get there
> first it would all work out.)

Sales people, versus... economists?

> > Third: The user interface to obtaining certificates must be improved.
> >
> > Fourth: A means of carrying certificates from one place to another
> > must be created.
> >
> > Fifth: The users have to want certificates. (This can be accomplished
> > by creating a "killer app" that leverages all of this.)
> >
> > Sixth: The users must be able to see that the certificates actually work.
>
> Sure, all these things are nice. Actually, the
> killer app is sort of sitting in there as we speak,
> it is thunderbird. But, what has to happen is the
> CA-first principle needs to be turned into CA-last,
> and email accounts should create their self-signed
> certs on the fly and distro them in some fashion,
> again on the fly. But for that to happen, we have
> to understand that people email other people without
> really caring that they don't have some other party
> telling them who it is.

Indeed. (Email is based on reputation, more than anything -- either
the person is known in real life, in which case you're merely creating
a transition between domains of trust... or the person is unknown, in
which case you're creating a reputation based on what they have to say
and what they present. For example, '[EMAIL PROTECTED]' has
established a reputation with me as a critical thinker about
cryptography and security systems. As has '[EMAIL PROTECTED]'. And
various other identities, based on email address more than whatever
'name' the person has voluntarily associated with that address.)
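The "self-signed certs created and distributed on the fly" model
implies trust-on-first-use, the same way SSH remembers host keys.
A minimal sketch of that idea in Python -- the class, the addresses,
and the byte strings are all illustrative; a real mail client would
hash the peer's actual DER-encoded certificate:

```python
import hashlib

class TofuStore:
    """Trust-on-first-use: remember the first certificate fingerprint
    seen for each email address, and flag any later change."""

    def __init__(self):
        self.known = {}  # email address -> hex fingerprint

    def fingerprint(self, cert_der: bytes) -> str:
        # SHA-256 over the DER-encoded certificate, as in modern pinning.
        return hashlib.sha256(cert_der).hexdigest()

    def check(self, address: str, cert_der: bytes) -> str:
        fp = self.fingerprint(cert_der)
        if address not in self.known:
            self.known[address] = fp   # first contact: pin it
            return "pinned"
        if self.known[address] == fp:
            return "match"
        return "MISMATCH"              # cert changed: warn the user

store = TofuStore()
print(store.check("[email protected]", b"fake-der-bytes"))   # first contact -> pinned
print(store.check("[email protected]", b"fake-der-bytes"))   # same cert -> match
print(store.check("[email protected]", b"other-der-bytes"))  # changed cert -> MISMATCH
```

No third party tells you who the correspondent is; the continuity of
the key *is* the identity, which matches the reputation-based model
above.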
> > Once that occurs, websites will begin using cryptographic
> > authentication instead of username and password pairs, because it will
> > be very cheap for them to do so (meaning, the cost of setting up a
> > certificate-based authentication mechanism will be offset by the fact
> > that users will be able to log in automatically). Especially if they
> > run their own low-barrier-to-entry CA, or outsource it to someone
> > else.
>
> Right, in principle. Unfortunately, the way the
> user certs are placed makes them only useful for
> all-or-nothing scenarios, and that makes deployment
> difficult. There would need to be changes to the
> SSL user cert protocol to make that fly (only the
> server can request a cert, in which case it is a
> demand, what is needed is for the client to suggest
> certs on the fly). (Those who know the SSL protocol
> should jump in here and correct me if I'm wrong...)

Correction: the way the user certs are /currently/ placed. But there's
not much choice about the concept -- otherwise you have the SSH
key-management problem.

As a side note: I find it absolutely deplorable that usernames and
passwords are still with us. We've had cryptographically strong
authentication for a long while -- why don't we use it more often?
(Aside from the aforementioned 'barriers to entry' to running a CA,
that is.)

(And I do know the SSL protocol. The server may or may not
authenticate itself. If the server doesn't authenticate itself, the
client may close the connection. Iff the server authenticates itself,
it may ask for a certificate, and it provides a list of CAs that it
trusts in order to make it possible to figure out which end-entity
certificate is needed. If it does not get a valid certificate, it may
close the connection.)
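That "demand" model is still visible in today's TLS APIs: the server
side alone decides whether a client certificate is requested or
required, and the client can only pre-load a cert to be sent *if*
asked. A sketch using Python's stdlib ssl module (the .pem paths in
the comments are hypothetical, so those calls are left commented out):

```python
import ssl

# Server side: whether a client cert is requested, and whether it is
# mandatory, is entirely the server's choice -- the "demand" model.
server_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)

# CERT_NONE (default): never ask the client for a certificate.
# CERT_OPTIONAL:       send a CertificateRequest, but accept an empty reply.
# CERT_REQUIRED:       send a CertificateRequest; abort without a valid cert.
server_ctx.verify_mode = ssl.CERT_REQUIRED

# A real server would also load the CAs it trusts; their names become
# the CA list sent in the CertificateRequest:
#   server_ctx.load_verify_locations("client-ca-bundle.pem")  # hypothetical path

# Client side: there is no "suggest a cert" call. The client can only
# stage a cert that will be offered if -- and only if -- the server asks:
client_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
#   client_ctx.load_cert_chain("my-cert.pem", "my-key.pem")  # hypothetical paths

print(server_ctx.verify_mode == ssl.CERT_REQUIRED)  # prints True
```

Note that the all-or-nothing flavour lives in that one `verify_mode`
setting: there is no per-resource or user-initiated middle ground in
the handshake itself.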
> > You would think that with the failure of X.500 (why they used it as
> > the basis for LDAP I'll never know), they would recognize that BER and
> > DER have serious flaws, that ASN.1 isn't all it's cracked up to be,
> > and that X.509 is a horrible idea to base certificates on. (Flaws in
> > MS's ASN.1 implementation were found, giving SYSTEM access on WinXP
> > and below and Network Service access on WinServ2k3. I'd bet there are
> > other ASN.1/BER/DER parsers that also have or had problems.)
>
> The core cert layout is well known to be dross. I'm
> not entirely convinced that OpenPGP is any better, it
> has for example 5 different ways of expressing an
> integer, and you "just have to know" so many things.

So, get a working group together and do the IETF geek semi-annual
party thing... define a new format, maybe XML-based. (Though the way I
see it, X.509v3 could easily be converted to XML and back again --
only the DER form could have the signature verified, but XML would
make it a lot easier to see what's actually /there/.)

> > But, as you said, the market is inefficient.
> >
> > And now I'm ranting again... I hope you find something of use within
> > this message, as I don't like sending a huge batch of noise with no
> > signal.
>
> :-)
>
> The thing is there is a wealth of knowledge out
> and about on how to improve the security in the
> browser. It is the browser world's challenge to
> adopt that knowledge. (I say that explicitly
> including others like Microsoft who have just
> as much of a challenge coping with security as
> does Mozilla.)

There's also a wealth of knowledge on how to improve the security of
the user's online experience, which is not necessarily the same
thing...

> So the more forward looking thought we can get
> together the better. At some point we might have
> some programmers who are ready to start branching
> off the main distro and start working on the
> security.
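Part of why ASN.1/BER/DER parsers keep having problems is the
tag-length-value encoding itself: a length field can be short-form or
long-form, and a careless long-form decoder is a classic source of
overflows. A minimal sketch of DER length decoding in Python, with
the bounds checks that historical parsers got wrong (the function name
and the 4-byte cap are my own choices, not from any spec):

```python
def read_der_length(buf: bytes, offset: int) -> tuple[int, int]:
    """Decode a DER length field starting at buf[offset].

    Returns (length, new_offset). DER allows two encodings:
      short form: one byte 0x00-0x7F, which is the length itself;
      long form:  0x80|n, followed by n big-endian length bytes.
    """
    first = buf[offset]
    if first < 0x80:                        # short form
        return first, offset + 1
    n = first & 0x7F
    if n == 0 or n > 4:
        # 0x80 alone is BER's indefinite length (illegal in DER); the
        # cap on n avoids unchecked multi-byte reads of attacker data.
        raise ValueError("illegal or oversized DER length")
    if offset + 1 + n > len(buf):
        raise ValueError("truncated length field")
    length = int.from_bytes(buf[offset + 1:offset + 1 + n], "big")
    if length < 0x80:
        raise ValueError("non-minimal length encoding (BER, not DER)")
    return length, offset + 1 + n

# A SEQUENCE (tag 0x30) of 3 bytes, short form:
print(read_der_length(b"\x30\x03\x02\x01\x05", 1))  # -> (3, 2)
# Long form: 0x82 means "two length bytes follow"; 0x01F4 = 500:
print(read_der_length(b"\x30\x82\x01\xf4", 1))      # -> (500, 4)
```

Every one of those `raise` branches corresponds to a shortcut that a
real-world BER/DER parser has taken at some point; an XML form would
at least make the structure visible, even if only the DER form can
carry the signature.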
I am, unfortunately, not conversant with the layout of the Firefox
source code -- and there's no "Firefox internals" book like there is
for the Linux kernel. Is there a breakout of the API anywhere? (I.e.,
a breakout of the hooks that fire when an https connection is made and
the certificate is validated, and of how to change various aspects of
the window, etc.?)

Cordially,
Kyle Hamilton

_______________________________________________
mozilla-crypto mailing list
[email protected]
http://mail.mozilla.org/listinfo/mozilla-crypto
