Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-29 Thread Jon Callas
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1


On Mar 28, 2013, at 10:27 PM, Jeffrey Goldberg jeff...@goldmark.org wrote:

 There are a couple interesting lessons from LocationGate. 

[...]

 The second lesson has to do with the status of iOS protection classes 
 that can leave things unencrypted even when the phone is locked. There are 
 things that we want our phones to do before they are unlocked with a 
 passcode. 

[...]

 
 The trick is how to communicate this to the people...

[...]

Very well put in all of those.

 What's the line? Never attribute to malice what can be explained by 
 incompetence.

That is the line. And also that stupidity is the second most common 
element in the universe, after hydrogen. (And variants on that.)

 
 At the same time we are in the business of designing system that will protect 
 people and their data under the assumption that the world is full of hostile 
 agents. As I like to put it, I lock my car not because I think everyone is a 
 crook, but because I know that car thieves do exist.

And in many cases a cheap lock will work because it deters and deflects, not 
because it actually prevents. This doesn't apply so much with information 
security, but I think it does in places.

For example, I think that the most important thing about a password is that it 
not be a dictionary word. If it is one, length doesn't matter. If it isn't, 
length only matters a little, because most attackers just want someone's 
password, not yours. If they do want yours, either spearphishing or malware 
like Zeus is a better bang for the buck. They won't actually bother cracking 
it, they'll go around it.
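A toy sketch (mine, not Jon's) makes the point concrete: the attacker's first move is a plain wordlist scan, and length never enters into it. The wordlist below is a tiny stand-in for the multi-million-entry lists real crackers use.

```python
# Toy dictionary-scan check. WORDLIST is a placeholder; real attack
# wordlists run to millions of entries, and set membership is still
# effectively instant at that size.
WORDLIST = {"password", "dragon", "sunshine", "establishment"}

def falls_to_wordlist_scan(password: str) -> bool:
    """True if a plain dictionary scan would find this password."""
    return password.lower() in WORDLIST

# "establishment" is 13 characters long and falls instantly; a short
# non-word at least pushes the attacker toward brute force -- or, as
# Jon says, toward going around the password entirely.
```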

Jon


-BEGIN PGP SIGNATURE-
Version: PGP Universal 3.2.0 (Build 1672)
Charset: us-ascii

wj8DBQFRVTsEsTedWZOD3gYRAhDeAKDYJOTTA9mBBebl4ccMbAbqZQzg9ACdG7A7
XRwwSV8OBtA8JufBO4YsAJ0=
=/Olb
-END PGP SIGNATURE-
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-29 Thread ianG

On 29/03/13 08:27 AM, Jeffrey Goldberg wrote:
...

The scare story spread quickly, with the more hyperbolic accounts getting the 
most attention. The corrective analysis probably didn't penetrate as widely.



The issue that I see is that because Apple runs a secret shop, they are 
more vulnerable to PR-style disasters.


I actually agree with Apple doing it.  I think they win more by closing 
off the information.  But doing this exposes them to the risk of the 
media pushing bad conspiracy stories and leaving them unchallenged.


Also, secrecy means that one's own people do not know what is going on. 
So, stupidity is easier because there are fewer eyes on the topic, and 
when it does happen, fewer eyes are there to warn about it as the event 
rolls on towards its own disaster.


How to deal with this?  Google has (or had, by repute) a policy of 
secrecy to the outside, but inside is open.  Google people can talk 
about anything to other insiders.  This sounds like a good solution to 
me, or at least, one way, conceptually.


(Perhaps google people can confirm/deny more about this meta-secret if 
they so desire.  Actually, forget the confirm/deny, just tell us whether 
it works and/or what is better :)


Also, I think we have to be a little bit humble and realise that all 
security is risk based.  S**t happens.  Sometimes the statistics roll 
around, it's our turn, and mistakes become disasters.


Take it on the chin, and get back to work.  Don't overreact by promising 
ludicrous things like "it'll never happen again" or lopping off heads 
just to satiate the media.


...

The trick is how to communicate this to the people, most of whom do not wish to be 
overwhelmed with information.



Stick to your business.  People's loyalty to the brand will overcome these 
sorts of disasters, as long as you're sticking to your own values and 
your values are aligned with your customers'.


It's when you don't stick to your own values, or when your values are 
confused and contradictory, that the real failure happens.  Arguably, 
this happened in the 2000s with Microsoft -- Bill Gates' famous memo -- 
which couldn't articulate their values in security, *and* had a major 
security PR issue to contend with.  Hence, their efforts to improve 
their security in a post-1990s-benign world fell to little effect, even 
though their efforts were strong.


Apple are very good at this.  Microsoft are lousy.  Google sort of 
muddle along in between.


...

What's the line? Never attribute to malice what can be explained by 
incompetence.



Yeah.  But in a lot of cases, what looks like incompetence after the 
fact was just innocence of the future, before the fact, and a simple lack 
of understanding of the wider world.  As Jon described, if you think 
about the location & wifi issues, how likely is it that you yourself 
wouldn't be caught up in something like that?


There but for the grace of our own personal deity, we go.

Personally, when I think of what Google were doing with their street 
vans, listening to the wifi and all, to have properly scrubbed that 
data would be to ascribe deity status to the geeks involved.  They are 
employed and paid to do cool things with tech, not to understand PR 
disasters, or trawl through the arcana & complexity of privacy 
regulation, or read pop management mags like HBR.


Even the fact that heads rolled over that disturbs me.  I hope that was 
not done without a lot of thought.


...

At the same time we are in the business of designing systems that will protect 
people and their data under the assumption that the world is full of hostile 
agents. As I like to put it, I lock my car not because I think everyone is a 
crook, but because I know that car thieves do exist.



:)  Security is risk-based.  Locks raise the bar, a little, which is 
generally enough to move the dishonest thief to an easier target, and 
make the honest thief think twice.  We should think more like that in 
our field, it would help us a lot.  Conceptually, this is to say we need 
algorithms that fail 0.1% of the time, not 0.01%.


However, a consequence of risk is that some people get hit.  We need to 
learn to love our occasional failures.  As Dan Geer put it, "If the 
number of failures is zero, there is no way to disambiguate good luck 
from spending too much."


http://financialcryptography.com/mt/archives/001255.html

That's two good reasons to defend, not attack, the locationgates.  We are 
all vulnerable in a risk-oriented world, and each failure is a learning 
opportunity.




iang


Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-29 Thread ianG

On 29/03/13 06:42 AM, Jon Callas wrote:
...


I don't believe that it is in the interests of a company to shaft its 
customers. ...



Right, this is why I like Apple.  This is the same secret as in the 
gambling industry.  The house always wins -- so why bother cheating? 
Better to actually go the other way, be ultra honest, and work to your 
customers' needs.  Indeed, in casinos, they hand out free money at the 
slightest complaint; it's called "comping".  Why?  Coz 99 out of 100 
customers will simply play the money back into the house.


...

Let me ask again -- what could an LE or GOV offer that would be better than 
being cool?


Be More Cool?

The LEAs might be stupid (a better term might be "behind") but the 
spooks definitely aren't.  The latter have been working on how to 
breach organisations for about a century (in the USA; several centuries 
elsewhere); they wrote the manual on it many times over.  It's their 
job, so why do we subconsciously think we can defeat them at it because 
we're smarter or cooler?


There are ways to seduce such an organisation.  Share work on a cool 
secret project [0].  Once inside, loyalties can be shifted, rich 
contracts can be loaded with conditions, products can be shifted.


A second way is to offer cool people, who have primary loyalties that 
one doesn't notice.  If we recall 20 year pensions, this is surprisingly 
easy to arrange.  Ask your HR department how they'd feel about employing 
someone with 20 years of experience in secret spook technology. 
Recently retired, looking for a new challenge!


HR are going to be highly positive about this person.  They will see 
talk of potential conflicts of interest as blather from over imaginative 
geeks living in conspiracy la-la land.  Once a high-tech business model 
gets rolling, the demand for good techs is insatiable.




iang



[0]  Apropos other thread on DES and 56 bit keys:
http://en.wikipedia.org/wiki/Data_Encryption_Standard#NSA.27s_involvement_in_the_design

"NSA worked closely with IBM to strengthen the algorithm against all 
except brute force attacks and to strengthen substitution tables, called 
S-boxes. Conversely, NSA tried to convince IBM to reduce the length of 
the key from 64 to 48 bits. Ultimately they compromised on a 56-bit key." [9]


Apparently in their own words, the NSA manipulated a cool project by 
being cooler.  As another pointer, Ross Anderson once posted about an 
article in Foreign Policy journal (memory may trick me here) which 
outlined how they manipulated the South African crypto industry.



Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-29 Thread ianG

On 29/03/13 06:42 AM, Jon Callas wrote:


From being there, Apple's culture and practices are such that everything they 
do is focused on making cool things for the customers.



In a world of secrecy, media, spin, security complexity and so forth, 
personal testimony from the inside as to bona fides is as good as it gets.


Thanks!

iang



Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-29 Thread dan

Jon Callas writes, in part:
-+-
 | Let me ask again -- what could an LE or GOV offer that would be
 | better than being cool? Being a snitch, being a sell-out isn't cool.
 | Lots of people don't get that. To them, money is more important
 | than being cool. And all that means is they aren't cool. Some of
 | those people are rich, which is good for them, but money can't buy
 | cool.


Conspiracy theories are irresistible labor-saving devices in
the face of complexity.
-- Henry Louis "Skip" Gates



Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-29 Thread Jeffrey Walton
On Thu, Mar 28, 2013 at 11:42 PM, Jon Callas j...@callas.org wrote:

 On Mar 28, 2013, at 6:59 PM, Jeffrey Walton noloa...@gmail.com wrote:

 ...
 Apple designed the hardware and hold the platform keys. So I'm clear
 and I'm not letting my imagination run too far ahead:

 ...
 There are no means to recover a secret from the hardware, such as a
 JTAG interface or a datapath tap. Just because I can't do it, it does
 not mean Apple, a University with EE program, Harris Corporation,
 Cryptography Research, NSA, GCHQ, et al cannot do it.

 I alluded to that before. Prying secrets out of hardware is known technology. 
 If you're willing to destroy the device, there's a lot you can do, from 
 decapping the chip, to just x-raying it, etc.
Using JTAG interfaces and headless pinouts is hardly destructive testing :)

 ...
 These are some of the goodies I would expect a manufacturer to provide
 to select customers, such as LE and GOV. I would expect that the
 information would be held close to the corporate chest, so folks could
 not discuss it even if they wanted to.

 Really? Why?

 I don't believe that it is in the interests of a company to shaft its 
 customers
It appears Apple, Google, Microsoft, et al are doing it. From the
article (unless I am reading it wrong): "... and if law enforcement
can't crack a seized iPhone, officers will in some cases mail the
phone to Apple, who extract the data and return it stored on a DVD
along with the locked phone."

 Let me ask again -- what could an LE or GOV offer that would be better than 
 being cool? ...
If there is nothing to be gained, then why do LE and GOV go to the
manufacturers in the first place?
usable data on the DVD mentioned above. For some reason, I don't have
the feeling that the DVD is empty or the data returned is unusable.

For what it's worth, I agree with what you are saying. But from the
article, it sounds diametrically opposed to what you are telling me.

Jeff


Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-28 Thread Jeffrey Goldberg
[Reply-To set to cryptopolitics]

On 2013-03-28, at 12:37 AM, Jeffrey Walton noloa...@gmail.com wrote:

 On Wed, Mar 27, 2013 at 11:37 PM, Jeffrey Goldberg jeff...@goldmark.org 
 wrote:

 ... In the other cases, the phones did have a passcode lock, but
 with 10,000 possible four-digit codes it takes about 40 minutes to run
 through them all, given how Apple has calibrated PBKDF2 on these (4 trials per
 second).

 Does rooting and Jailbreaking invalidate evidence collection?

That is the kind of thing that would have to be settled by case law; I don't
know if evidence gathered this way has ever been offered as evidence in
a trial. (Note that a lot can be used against a suspect during an investigation
without ever having to be presented as evidence at trial.)

 Do hardware manufacturers and OS vendors have alternate methods? For
 example, what if LE wanted/needed iOS 4's hardware key?

You seem to be talking about a single iOS 4 hardware key. But each device
has its own. We don't know if Apple actually has retained copies of that.

 I suspect Apple has the methods/processes to provide it.

I have no more evidence than you do, but my guess is that they don't, for
the simple reason that if they did that fact would leak out. Secret
conspiracies (and that's what it would take) grow less plausible
as a function of the number of people who have to be in on it.
(Furthermore I suspect that implausibility rises super-linearly with
the number of people in on a conspiracy.)
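A toy model (my illustration, not Jeffrey's math) makes the first half of that intuition concrete: if each insider independently keeps the secret with probability q per year, the odds the secret survives are q^n, which decays exponentially even before any super-linear effect.

```python
# Toy secrecy model: n insiders, each independently keeping the secret
# with probability q (per year, say). The 0.99 figure is an illustrative
# assumption, not a measured value.
def secrecy_survival(n_insiders: int, q_per_person: float = 0.99) -> float:
    """Probability that no insider leaks, assuming independence."""
    return q_per_person ** n_insiders

# Ten insiders: ~90% chance the secret holds for the period.
# Five hundred insiders: under 1% -- the conspiracy is all but
# guaranteed to leak, even with 99%-reliable people.
```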

 I think there's much more to it than a simple brute force.

We know that those brute force techniques exist (there are several vendors
of forensic recovery tools), and we've got very good reasons to believe
that only a small portion of users go beyond the default 4-digit passcode.
In the case of LEAs, they can easily hold on to the phones for the 20 minutes
(on average) it takes to brute force them.
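The arithmetic behind those figures is easy to sanity-check. A quick sketch (the trial rates are the ones quoted in this thread, not anything I've measured):

```python
# Back-of-envelope passcode brute-force times. The trial rates come
# from this thread (~4/sec per the XRY article, ~10/sec per Jon's
# on-device figure); neither is my own measurement.
def worst_case_seconds(digits: int, trials_per_second: float) -> float:
    """Time to try every numeric passcode of the given length."""
    return 10 ** digits / trials_per_second

# 10,000 four-digit codes at 4 trials/sec: 2,500 s (~40 min) worst
# case, so ~20 minutes on average -- matching the numbers above.
# A six-digit code at the same rate already needs ~70 hours worst case.
```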

So I don't see why you suspect that there is some other way that only
Apple (or other relevant vendor) and the police know about.

Cheers,

-j


Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-28 Thread shawn wilson
On Mar 27, 2013 11:38 PM, Jeffrey Goldberg jeff...@goldmark.org wrote:




http://blog.agilebits.com/2012/03/30/the-abcs-of-xry-not-so-simple-passcodes/


Days? Not sure about the algorithm but both ocl and jtr can be run in
parallel and idk why you'd try to crack a password on an arm device anyway
(there's a jtr page that compares platforms and arm is god awful slow).


Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-28 Thread Jon Callas
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1

[Not replied-to cryptopolitics as I'm not on that list -- jdcc]

On Mar 28, 2013, at 3:23 PM, Jeffrey Goldberg jeff...@goldmark.org wrote:

 Do hardware manufacturers and OS vendors have alternate methods? For
 example, what if LE wanted/needed iOS 4's hardware key?
 
 You seem to be talking about a single iOS 4 hardware key. But each device
 has its own. We don't know if Apple actually has retained copies of that.

I've been involved in these sorts of questions in various companies that I've 
worked at. Let's look at it coolly and rationally.

If you make a bunch of devices with keys burned in them, if you *wanted* to 
retain the keys, you'd have to keep them in some database, protect them, create 
access controls and procedures so that only the good guys (by your definition) 
got them, and so on. It's expensive.

You're also setting yourself up for a target of blackmail. Once some bad guy 
learns that they have such a thing, they can blackmail you for the keys they 
want lest they reveal that the keys even exist. Those bad guys include 
governments of countries you operate or have suppliers in, mafiosi, etc. Heck, 
once some good guy knows about it, the temptation to break protocol on who gets 
keys when will be too great to resist, and blackmail will happen.

Eventually, so many people know about the keys that it's not a secret. Your 
company loses its reputation, even among the sort of law-and-order types who 
think that it's good for *their* country's LEAs to have those keys because they 
don't want other countries having those keys. Sales plummet. Profits drop. 
There are civil suits, shareholder suits, and most likely criminal charges in 
lots of countries (because while it's not a crime to give keys to their LEAs, 
it's a crime to give them to that other bad country's LEAs). Remember, the only 
difference between lawful access and espionage is whose jurisdiction it is.

On the other hand, if you don't retain the keys it doesn't cost you any money 
and you get to brag about how secure your device is, selling it to customers in 
and out of governments the world over.

Make the mental calculation. Which would a sane company do?

 
 I suspect Apple has the methods/processes to provide it.
 
 I have no more evidence than you do, but my guess is that they don't, for
 the simple reason that if they did that fact would leak out. Secret
 conspiracies (and that's what it would take) grow less plausible
 as a function of the number of people who have to be in on it.
 (Furthermore I suspect that implausibility rises super-linearly with
 the number of people in on a conspiracy.)

And that's just what I described above. I just wanted to put a sharper point on 
it. I don't worry about it because truth will out. Or as Dr. Franklin put it, 
"three people can keep a secret if two of them are dead."

 
 I think there's much more to it than a simple brute force.
 
 We know that those brute force techniques exist (there are several vendors
 of forensic recovery tools), and we've got very good reasons to believe
 that only a small portion of users go beyond the default 4 digit passcode.
 In case of LEAs, they can easily hold on to the phones for the 20 minutes
 (on average) it takes to brute force them.

The unlocking feature on iOS uses the hardware to spin crypto operations on 
your passcode, so you have to do it on the device (the hardware key is involved 
-- you can't just image the flash) and you get about 10 brute force checks per 
second. For a four-digit code, that's about 1,000 seconds.
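For what "calibrating" the derivation means in practice, here's a minimal standard-library sketch (mine; Apple's actual scheme entangles a per-device hardware key, which is exactly why it can't be run anywhere but on the device): pick an iteration count so one derivation costs a target amount of time on the target hardware.

```python
# Minimal PBKDF2 calibration sketch using only the standard library.
# The salt, passcode, and 100 ms target are illustrative assumptions,
# not Apple's parameters.
import hashlib
import time

def calibrate_pbkdf2(target_seconds: float = 0.1) -> int:
    """Double the iteration count until one derivation takes long enough."""
    iters = 1000
    while True:
        start = time.perf_counter()
        hashlib.pbkdf2_hmac("sha256", b"0000", b"per-device-salt", iters)
        if time.perf_counter() - start >= target_seconds:
            return iters
        iters *= 2
```

At ~0.1 s per derivation the attacker gets about 10 guesses per second on the calibrated hardware, which is the rate quoted above.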

See http://images.apple.com/ipad/business/docs/iOS_Security_May12.pdf for 
many details on what's in iOS specifically.

Also, surprisingly often, if the authorities ask someone to unlock the phone, 
people comply. 

 
 So I don't see why you suspect that there is some other way that only
 Apple (or other relevant vendor) and the police know about.

Yeah, me either. We know that there are countries that have special national 
features in devices made by hardware makers that are owned by that country's 
government, but they're very careful to keep them within their own borders, for 
all the obvious reasons. It just looks bad and could lead to losing contracts 
in other countries.

Jon
-BEGIN PGP SIGNATURE-
Version: PGP Universal 3.2.0 (Build 1672)
Charset: us-ascii

wj8DBQFRVNHisTedWZOD3gYRAnLPAKCA3BW64XmpIlJJL8vMIwEZ9qBQzwCcDQiJ
OvnvTSUXUdELynnYxnT0lEA=
=JuD+
-END PGP SIGNATURE-


Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-28 Thread Jon Callas

On Mar 28, 2013, at 4:07 PM, shawn wilson ag4ve...@gmail.com wrote:

 
 On Mar 27, 2013 11:38 PM, Jeffrey Goldberg jeff...@goldmark.org wrote:
 
 
 
  http://blog.agilebits.com/2012/03/30/the-abcs-of-xry-not-so-simple-passcodes/
 
 
 Days? Not sure about the algorithm but both ocl and jtr can be run in 
 parallel and idk why you'd try to crack a password on an arm device anyway 
 (there's a jtr page that compares platforms and arm is god awful slow)
 
 

You have to run the password cracker on the device, because it involves mixing 
the hardware key in with the passcode, and that's done in the security chip. 
You can't parallelize it unless you pry the chip apart. I'm not saying it's 
impossible, but it is risky. If you screw that up, you lose totally, as then 
breaking the passcode is breaking AES-256. And if you have about 2^90 memory, 
it's easier than breaking AES-128!

Jon






Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-28 Thread Kevin W. Wall
On Thu, Mar 28, 2013 at 7:27 PM, Jon Callas j...@callas.org wrote:
 -BEGIN PGP SIGNED MESSAGE-
 Hash: SHA1

 [Not replied-to cryptopolitics as I'm not on that list -- jdcc]

Ditto.

 On Mar 28, 2013, at 3:23 PM, Jeffrey Goldberg jeff...@goldmark.org wrote:

 Do hardware manufacturers and OS vendors have alternate methods? For
 example, what if LE wanted/needed iOS 4's hardware key?

 You seem to be talking about a single iOS 4 hardware key. But each device
 has its own. We don't know if Apple actually has retained copies of that.

 I've been involved in these sorts of questions in various companies that I've 
 worked. Let's look at it coolly and rationally.

 If you make a bunch of devices with keys burned in them, if you *wanted* to 
 retain the keys, you'd have to keep them in some database, protect them, 
 create access  controls and procedures so that only the good guys (to your 
 definition) got them, and so on. It's expensive.

 You're also setting yourself up for a target of blackmail. Once some bad guy 
 learns that they have such a thing, they can blackmail you for the keys they 
 want lest they reveal that the keys even exist. Those bad guys include 
 governments of countries you operate or have suppliers in, mafiosi, etc. 
 Heck, once some good guy knows about it, the temptation to break protocol on 
 who gets keys when will be too great to resist, and blackmail will happen.

 Eventually, so many people know about the keys that it's not a secret. Your 
 company loses its reputation, even among the sort of law-and-order types who 
 think that it's good for *their* country's LEAs to have those keys because 
 they don't want other countries having those keys. Sales plummet. Profits 
 drop. There are civil suits, shareholder suits, and most likely criminal 
 charges in lots of countries (because while it's not a crime to give keys to 
 their LEAs, it's a crime to give them to that other bad country's LEAs). 
 Remember, the only difference between lawful access and espionage is whose 
 jurisdiction it is.

 On the other hand, if you don't retain the keys it doesn't cost you any money 
 and you get to brag about how secure your device is, selling it to customers 
 in and out of governments the world over.

 Make the mental calculation. Which would a sane company do?


All excellent, well articulated points. I guess that means that
RSA Security is an insane company then since that's
pretty much what they did with the SecurID seeds. Inevitably,
it cost them a boatload too. We can only hope that Apple
and others learn from these mistakes.

OTOH, if Apple thought they could make a hefty profit by
selling to LEAs or friendly governments, that might change
the equation enough to tempt them. That's doubtful, of course,
but stranger things have happened.

-kevin
-- 
Blog: http://off-the-wall-security.blogspot.com/
The most likely way for the world to be destroyed, most experts agree,
is by accident. That's where we come in; we're computer professionals.
We *cause* accidents.-- Nathaniel Borenstein


Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-28 Thread Nico Williams
On Thu, Mar 28, 2013 at 7:24 PM, Kevin W. Wall kevin.w.w...@gmail.com wrote:
 On Thu, Mar 28, 2013 at 7:27 PM, Jon Callas j...@callas.org wrote:
 [Rational response elided.]

 All excellent, well articulated points. I guess that means that
 RSA Security is an insane company then since that's
 pretty much what they did with the SecurID seeds. Inevitably,
 it cost them a boatload too. We can only hope that Apple
 and others learn from these mistakes.

RSA did it for plausible, reasonable (if wrong) ostensible reasons not
related to LEA.

 OTOH, if Apple thought they could make a hefty profit by

There is zero chance Apple would be backdooring anything for profit
considering the enormity of the risk they would be taking.  If they do
it at all it's because they've been given no choice (ditto their
competitors).

 selling to LEAs or friendly governments, that might change
 the equation enough to tempt them. Of course that's doubtful
 though, but stranger things have happened.

This is the tin-foil response.  But note that the more examples of
bad-idea backdoors, the less confidence we can have in the rational
argument, and the more the tin-foil argument becomes the rational one.
 In the worst case scenario we can't trust much of anything and we
can't open-code everything either.  But in the worst case scenario
we're also mightily vulnerable to attack from bad guys.  Let us hope
that there are enough rational people at or alongside LEAs to temper
the would-be arm-twisters that surely must exist within those LEAs.

Nico
--


Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-28 Thread Jon Callas
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1

On Mar 28, 2013, at 5:24 PM, Kevin W. Wall kevin.w.w...@gmail.com wrote:

 
 All excellent, well articulated points. I guess that means that
 RSA Security is an insane company then since that's
 pretty much what they did with the SecurID seeds. Inevitably,
 it cost them a boatload too. We can only hope that Apple
 and others learn from these mistakes.

No, RSA was careless and stupid. It's not the same thing at all.

SecurID seeds are shared secrets and the authenticators need them. They did 
nothing like what we were talking about -- handing them out so the security of 
the device could be compromised. They kept their own crown jewels on some PC on 
their internal network and they were hacked for them.

 
 OTOH, if Apple thought they could make a hefty profit by
 selling to LEAs or friendly governments, that might change
 the equation enough to tempt them. Of course that's doubtful
 though, but stranger things have happened.

Excuse me, but Apple in particular is making annual income in the same ballpark 
as the GDP of Ireland, the Czech Republic, or Israel. They could bail out 
Cyprus with pocket change.

If you want to go all tinfoil hat, you shouldn't be thinking about friendly 
governments buying them off, you should be thinking about *them* buying their 
own country.

Jon
-BEGIN PGP SIGNATURE-
Version: PGP Universal 3.2.0 (Build 1672)
Charset: iso-8859-1

wj8DBQFRVPGKsTedWZOD3gYRAmKzAKDkD8/myOnUQjpSQzohZ7i3OqC6QwCeJ69T
e81n4nVL+KTK7g72TLMeHow=
=JqMQ
-END PGP SIGNATURE-


Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-28 Thread Jeffrey Walton
On Thu, Mar 28, 2013 at 7:27 PM, Jon Callas j...@callas.org wrote:
 -BEGIN PGP SIGNED MESSAGE-
 Hash: SHA1

 [Not replied-to cryptopolitics as I'm not on that list -- jdcc]

 On Mar 28, 2013, at 3:23 PM, Jeffrey Goldberg jeff...@goldmark.org wrote:

 Do hardware manufacturers and OS vendors have alternate methods? For
 example, what if LE wanted/needed iOS 4's hardware key?

 You seem to be talking about a single iOS 4 hardware key. But each device
 has its own. We don't know if Apple actually has retained copies of that.

 I've been involved in these sorts of questions in various companies that I've 
 worked.
Somewhat related: are you bound to some sort of non-disclosure with
Apple? Can you discuss all aspects of the security architecture, or is
it [loosely] limited to Apple's public positions?

 If you make a bunch of devices with keys burned in them, if you *wanted* to 
 retain the keys, you'd have to keep them in some database, protect them, 
 create access  controls and procedures so that only the good guys (to your 
 definition) got them, and so on. It's expensive.
Agreed.

 You're also setting yourself up for a target of blackmail
 Eventually, so many people know about the keys that it's not a secret. Your 
 company loses its reputation.
Agreed.

 On the other hand, if you don't retain the keys it doesn't cost you any money 
 and you get to brag about how secure your device is, selling it to customers 
 in and out of governments the world over.
Agreed.

I regard these as the positive talking points. There's no sleight of
hand in your arguments, and I believe they are truthful. I expect them
to be in the marketing literature.

 I suspect Apple has the methods/processes to provide it.
 I have no more evidence than you do, but my guess is that they don't, for
 the simple reason that if they did that fact would leak out. ...
 And that's just what I described above. I just wanted to put a sharper point 
 on it.
 I don't worry about it because truth will out. ...
A corporate mantra appears to be 'catch me if you can', 'deny deny
deny', and then 'turn it over to marketing for a spin'.

We've seen it in the past with, for example, Apple and location data,
carriers and location data, and Google and wifi spying. No one was
doing it until they got caught.

Please forgive my naivety or my ignorance if I'm seeing things in a
different light (or shadow).

 I think there's much more to it than a simple brute force.
 We know that those brute force techniques exist (there are several vendors
 of forensic recovery tools), 
 The unlocking feature on iOS uses the hardware to spin crypto operations on 
 your passcode...

Apple designed the hardware and holds the platform keys. So I'm clear
and I'm not letting my imagination run too far ahead:

Apple does not have or use, for example, custom boot loaders signed by
the platform keys used in diagnostics, for data extraction, etc.

There are no means to recover a secret from the hardware, such as a
JTAG interface or a datapath tap. Just because I can't do it, it does
not mean Apple, a university with an EE program, Harris Corporation,
Cryptography Research, NSA, GCHQ, et al cannot do it.

A naturally random event is used to select the hardware keys, and not
a deterministic event such as hashing a serial number and date of
manufacture.
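As a purely illustrative sketch of the distinction above (Python; nothing here reflects Apple's actual key-generation process, and the serial number and date inputs are made up):

```python
import hashlib
import os

def deterministic_key(serial: str, mfg_date: str) -> bytes:
    """Weak approach: recomputable by anyone who knows the recipe plus
    the (often public or enumerable) serial number and build date."""
    return hashlib.sha256(f"{serial}|{mfg_date}".encode()).digest()

def random_key() -> bytes:
    """Strong approach: drawn from the OS entropy pool, not
    reproducible from any device metadata."""
    return os.urandom(32)

# The deterministic scheme gives the manufacturer a standing shortcut:
k1 = deterministic_key("F17K3XYZ0001", "2013-03-01")
k2 = deterministic_key("F17K3XYZ0001", "2013-03-01")
assert k1 == k2                      # same inputs, same key, forever
assert random_key() != random_key()  # fresh entropy each time
```

In real silicon the random key would be fused into the device at manufacture; the point is only that a hashed-serial scheme leaves the vendor (and anyone who learns the recipe) able to regenerate every device's key.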

These are some of the goodies I would expect a manufacturer to provide
to select customers, such as LE and GOV. I would expect that the
information would be held close to the corporate chest, so folks could
not discuss it even if they wanted to.

jeff
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-28 Thread Jon Callas
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1


On Mar 28, 2013, at 6:59 PM, Jeffrey Walton noloa...@gmail.com wrote:

 On Thu, Mar 28, 2013 at 7:27 PM, Jon Callas j...@callas.org wrote:
 -BEGIN PGP SIGNED MESSAGE-
 Hash: SHA1
 
 [Not replied-to cryptopolitics as I'm not on that list -- jdcc]
 
 On Mar 28, 2013, at 3:23 PM, Jeffrey Goldberg jeff...@goldmark.org wrote:
 
 Do hardware manufacturers and OS vendors have alternate methods? For
 example, what if LE wanted/needed iOS 4's hardware key?
 
 You seem to be talking about a single iOS 4 hardware key. But each device
 has its own. We don't know if Apple actually has retained copies of that.
 
 I've been involved in these sorts of questions in various companies that 
 I've worked.
 Somewhat related: are you bound to some sort of non-disclosure with
 Apple? Can you discuss all aspects of the security architecture, or is
 it [loosely] limited to Apple's public positions?

- From being there, Apple's culture and practices are such that everything they 
do is focused on making cool things for the customers. Apple fights for the 
users. The users' belief and faith in Apple saved it from near death. 
Everything there focuses on how it's good for the users. Also remember that 
there are many axes of good for the users. User experience, cost, reliability, 
etc. are part of the total equation along with security. People like you and me 
are not the target; it's more the proverbial My Mom sort of user.

Moreover, they're not in it for the money. They're in it for the cool. 
Obviously, one has to be profitable, and obviously high margins are better than 
low ones, but the motivator is the user, and being cool. Ultimately, they do it 
for the person in the mirror, not for the cash.

I believe that Apple is too closed-mouthed about a lot of very, very cool 
things that they do security-wise. But that's their choice, and as a gentleman, 
I don't discuss things that aren't public because I don't blab. NDA or no NDA, 
I just don't blab.


 I regard these as the positive talking points. There's no sleight of
 hand in your arguments, and I believe they are truthful. I expect them
 to be in the marketing literature.
 
 I suspect Apple has the methods/processes to provide it.
 I have no more evidence than you do, but my guess is that they don't, for
 the simple reason that if they did that fact would leak out. ...
 And that's just what I described above. I just wanted to put a sharper point 
 on it.
 I don't worry about it because truth will out. ...
 A corporate mantra appears to be 'catch me if you can', 'deny deny
 deny', and then 'turn it over to marketing for a spin'.
 
 We've seen it in the past with, for example, Apple and location data,
 carriers and location data, and Google and wifi spying. No one was
 doing it until they got caught.
 
 Please forgive my naiveté or my ignorance if I'm seeing things in a
 different light (or shadow).

Well, with locationgate at Apple, that was a series of stupid and unfortunate 
bugs and misfeatures. Heads rolled over it.

- From what I have read of the Google wifi thing, it was also stupid and 
unfortunate. The person who coded it up was a pioneer of wardriving. People 
realized they could do cool things and did them without thinking it through. 
Thinking it through means that there are things to do that are cool if you are 
just a hacker, but not if you are a company. If that had been written up here, 
or submitted at a hacker con, everyone would have cheered -- and basically did, 
since arguably a pre-alpha of that hack was a staple of DefCon contests. The 
superiors of the brilliant hackers didn't know or didn't grok what was going on.

In neither of those cases was anyone trying to spy. In each case, in different 
ways, people were building cool features, and some combination of bugs and 
failure to think it through led to each of them. It doesn't excuse mistakes, 
but it does explain them. Not every bad thing in the world happens by intent. 
In fact, most of them don't.

 
 Apple designed the hardware and holds the platform keys. So that I'm clear
 and I'm not letting my imagination run too far ahead:
 
 Apple does not have or use, for example, custom boot loaders signed by
 the platform keys used in diagnostics, for data extraction, etc.
 
 There are no means to recover a secret from the hardware, such as a
 JTAG interface or a datapath tap. Just because I can't do it does
 not mean Apple, a university with an EE program, Harris Corporation,
 Cryptography Research, NSA, GCHQ, et al. cannot do it.

I alluded to that before. Prying secrets out of hardware is known technology. 
If you're willing to destroy the device, there's a lot you can do, from 
decapping the chip, to just x-raying it, etc.

 
 A naturally random event is used to select the hardware keys, and not
 a deterministic event such as hashing a serial number and date of
 manufacture.
 
 These are some of the goodies I would expect a manufacturer to provide
 to select 

Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-28 Thread James A. Donald

On 2013-03-29 8:23 AM, Jeffrey Goldberg wrote:

I suspect Apple has the methods/processes to provide it.

I have no more evidence than you do, but my guess is that they don't, for
the simple reason that if they did that fact would leak out. Secret
conspiracies (and that's what it would take) grow less plausible
as a function of the number of people who have to be in on it.


Real secret conspiracy: small enough to fit around a coffee table.

Semi-secret conspiracy: big enough to exercise substantial power, 
powerful enough to say "ha ha, you must be crazy, also racist, pawn of 
big oil, Nazi" whenever anything leaks out.


Looking back at the early twentieth century, we find an ample supply of 
secret conspiracies that must have had hundreds of thousands of people in 
the know.


I could mention two rather famous ones, but this would divert the list 
off topic, because three people with ten sock puppets each would then 
post a bunch of messages saying "ha ha, you must be crazy."


(Furthermore I suspect that implausibility rises super-linearly with
the number of people in on a conspiracy.)
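One toy way to make that super-linear intuition concrete (my own illustration, not from the thread, with an arbitrary per-channel probability): the number of insider-to-insider channels that must all stay quiet grows quadratically with the group size, so even a tiny exposure probability per channel compounds quickly:

```python
def channels(n: int) -> int:
    """Pairwise communication channels among n insiders."""
    return n * (n - 1) // 2

def leak_probability(n: int, p_per_channel: float = 0.001) -> float:
    """Chance that at least one channel is exposed, assuming each
    channel leaks independently with probability p_per_channel."""
    return 1.0 - (1.0 - p_per_channel) ** channels(n)

# Leak odds rise far faster than the headcount does.
for n in (5, 20, 100):
    print(n, channels(n), round(leak_probability(n), 3))
```

With these made-up numbers, going from 5 to 100 insiders takes the conspiracy from near-certain secrecy to near-certain exposure, because the channel count goes from 10 to 4,950.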


I think there's much more to it than a simple brute force.

We know that those brute force techniques exist (there are several vendors
of forensic recovery tools), and we've got very good reasons to believe
that only a small portion of users go beyond the default 4 digit passcode.
In the case of LEAs, they can easily hold on to the phones for the 20 minutes
(on average) it takes to brute-force them.

So I don't see why you suspect that there is some other way that only
Apple (or other relevant vendor) and the police know about.

Cheers,

-j
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography





Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-28 Thread James A. Donald

On 2013-03-29 10:47 AM, Nico Williams wrote:

   There is zero chance Apple would be backdooring anything for profit


They might, however, be backdooring everything to avoid getting their 
faces broken in with rifle butts; very likely they are.



___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-28 Thread Jeffrey Goldberg
On 2013-03-28, at 10:42 PM, Jon Callas j...@callas.org wrote:

 On Mar 28, 2013, at 6:59 PM, Jeffrey Walton noloa...@gmail.com wrote:
 
 We've seen it in the past with for example, Apple and location data,

 Well, with locationgate at Apple, that was a series of stupid and unfortunate 
 bugs and misfeatures. Heads rolled over it.

There are a couple of interesting lessons from LocationGate. The scary 
demonstrations were out and circulating before the press and public realized 
that what was cached were the locations of cell towers, not the phone's actual 
location, and that there was a good reason for caching that data. But I suspect 
that the large majority of people who remember it are still under the 
impression that Apple was arbitrarily storing the actual locations of the 
phone for no good reason.

The scare story spread quickly, with the more hyperbolic accounts getting the 
most attention. The corrective analysis probably didn't penetrate as widely.

The second lesson has to do with the status of iOS protection classes that can 
leave things unencrypted even when the phone is locked. There are things that 
we want our phones to do before they are unlocked with a passcode. We'd like 
them to know which local WiFi networks they can join, and we'd like them to 
precompute our location so that it is up and ready as soon as we do unlock the 
phone. As a consequence, things like WiFi passwords are not (or at least were 
not) stored in a way that is protected by the device key. The data protection 
classes NSFileProtectionNone and 
NSFileProtectionCompleteUntilFirstUserAuthentication have legitimate uses, but 
they do lead to cases where people may think that some data is protected when 
their device is off or locked when in fact it isn't.
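A toy model of those semantics in Python (the class names are real iOS constants, but the availability table is my simplification of Apple's documented behavior, not an authoritative statement of it):

```python
# Simplified availability of file data by protection class and device state.
# States: "unlocked", "locked" (locked, but unlocked at least once since
# boot), and "before_first_unlock" (powered on, never yet unlocked).
AVAILABILITY = {
    "NSFileProtectionNone": {
        "unlocked", "locked", "before_first_unlock"},
    "NSFileProtectionCompleteUntilFirstUserAuthentication": {
        "unlocked", "locked"},
    "NSFileProtectionComplete": {
        "unlocked"},
}

def readable(protection_class: str, device_state: str) -> bool:
    """Can data in this class be decrypted in this device state?"""
    return device_state in AVAILABILITY[protection_class]

# A WiFi password kept CompleteUntilFirstUserAuthentication is readable
# while the phone is merely locked -- exactly the surprise described above.
assert readable("NSFileProtectionCompleteUntilFirstUserAuthentication", "locked")
assert not readable("NSFileProtectionComplete", "locked")
```

The design trade-off is visible in the table: anything the phone must use while locked has to sit in one of the weaker classes.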

The trick is how to communicate this to people, most of whom do not wish to be 
overwhelmed with information. There are lots of other things like this 
(encrypted backups and thisDeviceOnly, the 10 seconds after lock before keys 
are erased, etc.) that people really ought to know. The information about these 
isn't secret; Apple publishes it. But it takes some level of sophistication to 
understand, and mostly what it takes is interest.

 In neither of those cases was anyone trying to spy. In each differently, 
 people were building cool features and some combination of bugs and failure 
 to think it through led to each of them. It doesn't excuse mistakes, but it 
 does explain them. Not every bad thing in the world happens by intent. In 
 fact, most of them don't.

What's the line? Never attribute to malice what can be explained by 
incompetence.

At the same time, we are in the business of designing systems that will protect 
people and their data under the assumption that the world is full of hostile 
agents. As I like to put it, I lock my car not because I think everyone is a 
crook, but because I know that car thieves do exist.

Cheers,

-j
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-27 Thread Jeffrey Goldberg
On Mar 24, 2013, at 5:30 PM, Jeffrey Walton noloa...@gmail.com wrote:

 I wonder how they are doing it when other tools fail.

The article explained how they do it. In the case described, the phone had no 
passcode lock, so the data on it would not have been encrypted. In the other 
cases, the phones did have a passcode lock, but with 10,000 possible four-digit 
codes it takes about 40 minutes to run through them all, given how Apple has 
calibrated PBKDF2 on these devices (about 4 trials per second).

I've been recommending that people turn off simple passcode on iOS devices 
and move to at least six digits. If your non-simple passcode is all digits, you 
still get the numeric keypad.
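The arithmetic behind that advice, as a quick Python sketch (the roughly 4 guesses per second is the calibration figure quoted above; real rates vary by device generation):

```python
def brute_force_hours(digits: int, guesses_per_second: float = 4.0) -> float:
    """Worst-case time to exhaust an all-digit passcode keyspace."""
    keyspace = 10 ** digits
    return keyspace / guesses_per_second / 3600.0

print(f"4 digits: {brute_force_hours(4):.2f} h")  # ~0.69 h, about 42 minutes
print(f"6 digits: {brute_force_hours(6):.1f} h")  # ~69.4 h, about 3 days
```

Two extra digits multiply the keyspace, and hence the worst-case search time, by 100, which is what turns a lunch-break attack into a multi-day one.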

I've written about all that here

http://blog.agilebits.com/2012/03/30/the-abcs-of-xry-not-so-simple-passcodes/

when there were some hyperbolic claims about breaking into iPhones.

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-27 Thread Jeffrey Walton
On Wed, Mar 27, 2013 at 11:37 PM, Jeffrey Goldberg jeff...@goldmark.org wrote:
 On Mar 24, 2013, at 5:30 PM, Jeffrey Walton noloa...@gmail.com wrote:

 I wonder how they are doing it when other tools fail.

 ... In the other cases, the phones did have a passcode lock, but
 with 10,000 possible four-digit codes it takes about 40 minutes to run
 through them all, given how Apple has calibrated PBKDF2 on these devices
 (about 4 trials per second).
Do rooting and jailbreaking invalidate evidence collection? Do
hardware manufacturers and OS vendors have alternate methods? For
example, what if LE wanted/needed iOS 4's hardware key? I suspect
Apple has the methods/processes to provide it.

I think there's much more to it than a simple brute force.

 I've been recommending that people turn off simple passcode on iOS devices
 and move to at least six digits. If your non-simple passcode is all digits,
 you still get the numeric keypad.
Yes good advice. The platform's data protection on hardware encryption
keys is a good start.

Jeff
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


[cryptography] Here's What Law Enforcement Can Recover From A Seized iPhone

2013-03-24 Thread Jeffrey Walton
The interesting point below is that OS vendors are extracting data for law
enforcement. I wonder how they are doing it when other tools fail.
(Thanks to JM on another list for the link.)

http://www.forbes.com/sites/andygreenberg/2013/02/26/heres-what-law-enforcement-can-recover-from-a-seized-iphone/

You may think of your iPhone as a friendly personal assistant. But
once it’s alone in a room full of law enforcement officials, you might
be surprised at the revealing things it will say about you.

On Tuesday the American Civil Liberties Union published a report it
obtained from a drug investigation by the Immigration and Customs
Enforcement (ICE) agency, documenting the seizure and search of a
suspect’s iPhone from her bedroom. While it’s no surprise that a phone
carries plenty of secrets, the document presents in stark detail a
list of that personal information, including call logs, photos,
videos, text messages, Web history, eight different passwords for
various services, and perhaps most importantly, 659 previous locations
of the phone invisibly gathered from Wifi networks and cell towers.

“We know the police have started using tools that can do this. We’ve
known the iPhone retains records of the cell towers it contacts. But
we’ve never before seen the huge amount of data police can obtain,”
says ACLU technology lead Chris Soghoian, who found the report in a
court filing. “It shouldn’t be shocking. But it’s one thing to know
that they’re using it. It’s another to see exactly what they get.”

In this case, ICE was able to extract the iPhone’s details with the
help of the forensics firm Cellebrite. The suspect doesn’t seem to
have enabled a PIN or passcode. But even when those login safeguards
are set up in other cases, law enforcement have still often been able
to use tools to bypass or brute-force a phone’s security measures.
Google in some cases helps law enforcement to get past Android phones’
lockscreens, and if law enforcement can’t crack a seized iPhone,
officers will in some cases mail the phone to Apple, who extract the
data and return it stored on a DVD along with the locked phone.

The phone search and seizure described in the documented case required
a warrant. But the legality of warrantless phone searches remains an
open issue. At U.S. borders or when arresting a suspect, for instance,
police and government officials have argued that no such warrant is
required.

Failing legal protections, the ACLU’s Soghoian says those who’d like
to keep prying eyes away from their handsets’ data should use long,
complex passcodes and encrypt their phone’s storage disk. “While the
law does not sufficiently protect the private data on smartphones,
technology can at least provide some protection,” Soghoian writes.

Here’s the full court document detailing the iPhone’s forensic search.
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography