This space is of particular interest to me. I implemented just one of these and published the protocol (rather than pimp my blog: if anyone wants to read the protocol description, feel free to email me and I'll send you a link).
The system itself was built around a fairly simple PKI which then allowed people to build end-to-end channels. You hit the nail on the head, though: control of the keys. If you can game the PKI you can replace someone's public key and execute a MITM attack. The approach I took was that the PKI publishes people's public keys but also allows other users to verify your public key. A MITM attack is possible, but as soon as your public key is rotated this is detected and the client asks if you'd like to verify it out of band (this was for mobile devices, so it lends itself to having other channels for checking keys, like phoning your friend and asking them).

The much more likely attack is a MITM against just a particular user, but as the channels are tunnelled end to end the attacker essentially needs to ask the PKI to publish two duff keys, one in each direction: Alice's key as far as Bob is concerned and Bob's key as far as Alice is concerned. In turn, the two people whose traffic the attacker is trying to obtain can ask someone else to double-check the keys for them. It means the attacker needs to publish an entirely fake PKI directory to just those two users. The idea was that alarm bells go off when it transpires that every person you ask for a proxy verification of a public key has 'all of a sudden' changed their public key too.

It's a hybrid model: a PKI to make life easy for users to bootstrap, but one which uses a web of trust to detect when the PKI (or your local directory) has been attacked. Relationships become 'public' knowledge, at least in so far as you ask others in your address book to verify people's public keys (albeit via UUIDs; you could still find out that your mate Bill had 'John's' public key in his address book because he'd asked you to verify it for him). So for those who want to protect conversational metadata, this scheme is orthogonal to that goal.
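To make the detection idea concrete, here's a rough Python sketch of how a client might combine its local key cache with proxy verifications from contacts. This is an illustration only, not the actual protocol: the function names, the short fingerprint scheme, and the simple vote over contact reports are all my own inventions for the example.

```python
import hashlib

def fingerprint(pubkey_bytes):
    """Short fingerprint suitable for out-of-band comparison
    (e.g. read over the phone to a friend)."""
    return hashlib.sha256(pubkey_bytes).hexdigest()[:16]

def check_key(uuid, fetched_key, local_cache, contact_reports):
    """
    Decide whether a key fetched from the PKI directory looks trustworthy.

    local_cache      maps uuid -> fingerprint we verified previously
    contact_reports  fingerprints other contacts hold for this uuid
    Returns 'ok', 'verify-out-of-band', or 'alarm'.
    """
    fp = fingerprint(fetched_key)
    if local_cache.get(uuid) == fp:
        return "ok"  # unchanged since we last verified it
    # Key has rotated (or we're being MITMed): consult contacts.
    agree = sum(1 for r in contact_reports if r == fp)
    if contact_reports and agree == 0:
        # Every proxy verifier holds a different key: the directory
        # we're seeing may be a fake published just for us.
        return "alarm"
    # Plausible genuine rotation; confirm out of band anyway.
    return "verify-out-of-band"
```

The sketch papers over the hard part, of course: in the full attack the contacts' reports travel over channels keyed by the same directory, which is exactly why the attacker ends up having to fake everyone's keys at once.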
Group chat semantics are quite feasible in that all users are peers, but you run into difficulty when it comes to signing your own messages; not that you can't sign them, but it's computationally expensive and eats battery life. Again, you are right though: what do you want to achieve? I certainly built a protocol that answered the main questions I was asking!

As for multiple devices, the trick was always usability. How do you securely move an identity token of some description from one node to another? I settled on every device having its own key pair, but you still need an 'owning' identity and a way to 'enrol' a new key pair, because if that got broken the attacker would just enrol their own 'device' surreptitiously. You then get into the realms of deriving keys from passwords through salted hashing algorithms, but then you're back to the security of a password being brute-forced. If you were really paranoid I proposed a smart card mechanism, but I've yet to implement that (how closed a world are smart cards with decent protection specifications?! But that's another conversation), the idea being that you decrypt your device key pair using the smart card and ditch the smart card if need be, through a typical office shredder.

Silent Circle was one of the most analogous systems, but I'm an amateur compared to those chaps. As interesting as it was to build, it kept boiling down to one thing: assuming I'd done a good job, all I had done was shift the target from the protocol to the device. If I really wanted to get the data I'd attack the on-screen software keyboard and leave everything else alone.

Max

On Sun, Sep 8, 2013 at 7:50 PM, Jerry Leichter <leich...@lrw.com> wrote:
> On Sep 7, 2013, at 11:16 PM, Marcus D. Leech wrote:
>
>> Jeff Schiller pointed out a little while ago that the crypto-engineering
>> community have largely failed to make end-to-end encryption easy to use.
>> There are reasons for that, some technical, some political, but it is
>> absolutely true that end-to-end encryption, for those cases where "end to
>> end" is the obvious and natural model, has not significantly materialized
>> on the Internet. Relatively speaking, a handful of crypto-nerds use
>> end-to-end schemes for e-mail and chat clients, and so on, but the vast
>> majority of the Internet user-space? Not so much.
>
> I agree, but the situation is complicated. Consider chat. If it's
> one-to-one, end-to-end encryption is pretty simple and could be made simple
> to use; but people also want chat rooms, which are a much more complicated
> key management problem - unless you let the server do the encryption. Do
> you enable it only for one-to-one conversations? Provide different
> interfaces for one-to-one and chat room discussions?
>
> Even for one-to-one discussions, these days, people want transparent
> movement across their hardware. If I'm in a chat session on my laptop and
> leave the house, I'd like to be able to continue on my phone. How do I
> hand off the conversation - and the keys? (What this actually shows is the
> complexity of defining "the endpoint". From the protocol's point of view,
> the endpoint is first my laptop, then my phone. From the user's point of
> view, the endpoint is the user! How do we reconcile these points of view?
> Or does the difference go away if we assume the endpoint is always the
> phone, since it's always with me anyway?)
>
> The same kinds of questions arise for other communications modalities, but
> are often more complex. One-to-one voice? Sure, we could easily
> end-to-end encrypt that. But these days everyone expects to do conference
> calls. Handling those is quite a bit more complex.
>
> There does appear to be some consumer interest here.
> Apple found it
> worthwhile to advertise that iMessage - which is used in a completely
> transparent way, you don't even have to opt in for it to replace SMS for
> iOS to iOS messages - is end-to-end encrypted. (And, it appears that it
> *is* end-to-end encrypted - but unfortunately key establishment protocols
> leave Apple with the keys - which allows them to provide useful services,
> like making your chat logs visible on brand new hardware, but also leaves
> holes of course.) Silent Circle, among others, makes their living off of
> selling end-to-end encrypted chat sessions, but they've got a tiny, tiny
> fraction of the customer base Apple has.
>
> I think you first need to decide *exactly* what services you're going to
> provide in a secure fashion, and then what customers are willing to do
> without (multi-party support, easy movement to new devices, backwards
> compatibility perhaps) before you can begin to design something new with
> any chance of success.
>
> -- Jerry
>
> _______________________________________________
> The cryptography mailing list
> email@example.com
> http://www.metzdowd.com/mailman/listinfo/cryptography
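P.S. In case it helps make the device-enrolment discussion above more concrete, a rough Python sketch follows. This is not my actual protocol: the function names are made up, HMAC under an owner secret stands in for a real asymmetric signature by the 'owning' identity, and PBKDF2 illustrates the salted password-hashing step that protects a device key pair at rest.

```python
import hashlib
import hmac

def derive_wrapping_key(password: str, salt: bytes) -> bytes:
    """Stretch a password into a key that encrypts the device key pair
    at rest; an attacker who steals the device must still brute-force
    the password (the residual weakness noted above)."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

def enrol_device(owner_secret: bytes, device_pubkey: bytes) -> bytes:
    """'Enrolment': the owning identity authenticates the new device's
    public key, so a stranger can't surreptitiously enrol a 'device'.
    A real system would use an asymmetric signature here."""
    return hmac.new(owner_secret, device_pubkey, hashlib.sha256).digest()

def verify_enrolment(owner_secret: bytes, device_pubkey: bytes,
                     tag: bytes) -> bool:
    """Check an enrolment tag in constant time."""
    expected = hmac.new(owner_secret, device_pubkey, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)
```

The point of the sketch is just the shape of the trust chain: per-device key pairs, an owning identity that vouches for each of them, and a password-derived key as the (weakest) link protecting the lot.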