On Sep 8, 2013, at 3:51 PM, Perry E. Metzger wrote:
>
>> Even for one-to-one discussions, these days, people want
>> transparent movement across their hardware. If I'm in a chat
>> session on my laptop and leave the house, I'd like to be able to
>> continue on my phone. How do I hand off the conversation - and the
>> keys?
>
> I wrote about this a couple of weeks ago, see:
>
> http://www.metzdowd.com/pipermail/cryptography/2013-August/016872.html
>
> In summary, it would appear that the most viable solution is to make
> the end-to-end encryption endpoint a piece of hardware the user owns
> (say the oft mentioned $50 Raspberry Pi class machine on their home
> net) and let the user interact with it over an encrypted connection
> (say running a normal protocol like Jabber client to server
> protocol over TLS, or IMAP over TLS, or https: and a web client.)
>
> It is a compromise, but one that fits with the usage pattern almost
> everyone has gotten used to. It cannot be done with the existing
> cloud model, though -- the user needs to own the box or we can't
> simultaneously maintain current protocols (and thus current clients)
> and current usage patterns.

I don't see how it's possible to make any real progress within the existing
cloud model, so I'm with you 100% here. (I've said the same earlier.)
What's hard is making this so simple and transparent that anyone can do it
without thinking about it. Again, think of the iMessage model: If Apple
hadn't larded it up with extra features (that, granted, most of its users
probably want), we would today have tens of millions of people exchanging
end-to-end, private messages without doing anything special or even thinking
about it. (Yes, Apple could have been forced to weaken it after the fact - but
it would have had to be by sending an update that broke the software.)

Apple has built some surprisingly well-protected stuff (along with some really
broken stuff). There's an analysis somewhere out there of how iOS device
backups work. Apple gives you a choice of an "encrypted" or an "unencrypted"
backup. Bizarrely, the "unencrypted" one actually has some of the most
sensitive data encrypted using secret information *locked into the device
itself* - where it would take significant hardware hacking (as far as anyone
knows) to get at it; in an encrypted backup, this information is decrypted by
the device, then encrypted with the backup key. So in some ways, it's
*stronger* - an "unencrypted" backup can only be restored to the iOS device
that created it, while if you know the password, an "encrypted" backup can be
restored to any device - which is the point. (Actually, you can restore an
"unencrypted" backup to a new device, too, but the most sensitive items - e.g.,
stored passwords - are lost, as the information needed to access them exists
only on the old device.) You'd never really know any of this from Apple's
extremely sparse documentation, mind you - it took someone hacking at the
implementation to figure it out.
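The reported arrangement amounts to two key-wrapping modes, and can be sketched
in a few lines. This is a toy model only - the fused device key, the
password-to-key derivation, and the stream cipher here are all illustrative
stand-ins, not Apple's actual scheme or APIs:

```python
# Toy model of the reported iOS backup behaviour -- NOT Apple's implementation.
# "Unencrypted" backup: the most sensitive items stay wrapped under a key
# locked into the device, so only that device can ever recover them.
# "Encrypted" backup: the device decrypts them and re-wraps everything under
# a key derived from the user's backup password, so the backup is portable.

import hashlib
import os

def keystream(key, length):
    """Toy stream cipher (SHA-256 in counter mode) -- illustration only."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key, data):
    """Symmetric: the same call encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

class Device:
    def __init__(self):
        # Hypothetical stand-in for a key fused into the device hardware.
        self.hardware_key = os.urandom(32)

    def backup(self, items, password=None):
        if password is None:
            # "Unencrypted" backup: sensitive data remains device-locked.
            return {name: ("device-locked", xor_crypt(self.hardware_key, data))
                    for name, data in items.items()}
        # "Encrypted" backup: re-wrapped under a password-derived key, so
        # any device that knows the password can restore it.
        backup_key = hashlib.sha256(password.encode()).digest()
        return {name: ("password", xor_crypt(backup_key, data))
                for name, data in items.items()}
```

The point the model makes concrete: a "password"-wrapped item is recoverable on
any hardware given the password, while a "device-locked" item is recoverable
only with the originating device's hardware key - hence the "unencrypted"
backup being, in that one respect, the stronger of the two.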

I don't agree, at all, with the claim that users are not interested in privacy
or security. But (a) they often don't know how exposed they are - something
the Snowden papers are educating many about; (b) they don't know how to judge
what's secure and what isn't (gee, can any of us, post-Snowden?); (c)
especially given the previous two items, but even without them, there's a limit
to how much crap they'll put up with. The bar for perceived quality and
simplicity of interface these days is - thanks mainly to Apple - incredibly
high. Techies may bitch and moan that this is all just surface glitz, that
what's important is underneath - but if you want to reach beyond a small
coterie of hackers, you have to get that stuff right.

-- Jerry
_______________________________________________
The cryptography mailing list
[email protected]
http://www.metzdowd.com/mailman/listinfo/cryptography