Re: Difference between TCPA-Hardware and other forms of trust

2003-12-20 Thread Seth David Schoen
> a system that *you* can trust, but no one else has
> any reason to trust.  However, the capability to do that can be easily
> leveraged to produce a system that *others* can trust as well.  There are
> so many potential applications for the latter type of system that, as soon
> as systems of the former type are fielded, the pressure to convert them to
> the latter type will be overwhelming.

I find this puzzling because I don't see how the leveraging happens.
I'm puzzled as a pure matter of cryptography, because if my computer
doesn't come with any tokens that a third party can use to establish
trust in it, I don't see how it can prevent me from doing a
man-in-the-middle attack whenever someone tries to exchange a key with
it.
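The cryptographic point in the paragraph above can be made concrete with a toy sketch (my own illustration, not anything from the thread): model "trust in a platform key" as membership in a verifier's set of preloaded trust anchors. With an empty anchor set, a key the attacker minted a moment ago is indistinguishable from an honest one, which is exactly the man-in-the-middle opening.

```python
import os
import hashlib

def make_key():
    # Stand-in for TPM key generation: a (secret, public) pair.
    secret = os.urandom(16)
    return secret, hashlib.sha256(secret).hexdigest()

def accept(public, anchors):
    # The verifier's only meaningful check is whether the signing key is
    # one it was told to trust ahead of time.
    return public in anchors

honest_secret, honest_pub = make_key()
mitm_secret, mitm_pub = make_key()   # the attacker mints a key of his own

# No preloaded tokens: the anchor set is empty, so the verifier has no
# basis to prefer the honest key over the MITM key.
assert accept(honest_pub, set()) == accept(mitm_pub, set())

# A preloaded, vendor-certified key changes that: the MITM key is rejected.
assert accept(honest_pub, {honest_pub})
assert not accept(mitm_pub, {honest_pub})
```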

In the TCG model, there are already keys shipped with the PC, inside
the TPM.  These keys make signatures that have a particular
significance.  People who are concerned about the effects of attestation
have proposed changes in which the meaning of these signatures is changed
slightly, or in which the keys are disclosed to the computer owner, or
in which the keys are not preloaded at all.  These changes are
proposed specifically in order to preserve an aspect of the status
quo: that other people can't trust your computer without also trusting
its owner.

So if these proposals don't have that effect, I'd be glad to hear why
not.

-- 
Seth David Schoen <[EMAIL PROTECTED]> | Very frankly, I am opposed to people
 http://www.loyalty.org/~schoen/      | being programmed by others.
 http://vitanuova.loyalty.org/        | -- Fred Rogers (1928-2003),
                                      |    464 U.S. 417, 445 (1984)

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Difference between TCPA-Hardware and a smart card (was: example: secure computing kernel needed)

2003-12-22 Thread Seth David Schoen
> If so, the challenger, in this case the
> service provider, can then deliver the content.

Some people may have read things like this and mistakenly thought that
this would not be an opt-in process.  (There is some language about
how the user's platform takes various actions and then "responds" to
challenges, and perhaps people reasoned that it was responding
autonomously, rather than under its user's direction.)

But it's clear from the context why the computer user is opting in:
because it's a condition of access to the service.  If you don't
attest at all, that's treated as giving an unacceptable answer.

There might seem to be a certain circularity here (you can only get
people to give attestations if you can deny them access to the service
if they refuse, and you can only deny them access to the service for
refusing if people are generally willing to give attestations).  But I
think it's mainly a question of network effects.

Your desire to "attest about the state of the software to myself, the
machine owner" could be met in various ways without increasing other
people's potential leverage over what software you use.  I have
suggested that you could have a TPM that allows you deliberately to attest
to a software environment that is different from your real software
environment.  There are other possibilities.  Ka-Ping Yee suggested
that, when you buy a device with a TPM, you should get a copy of all the
secret keys that were preloaded in your TPM; another alternative would be
not pre-loading any keys at all.  In these models, not only can you
get an attestation through some local UI, as you suggest, but you can
also give an attestation to a machine or service that you operate --
or that someone else operates -- whenever you have reason to believe
that the attestation will be used in a way that will benefit you.



Re: example: secure computing kernel needed

2003-12-26 Thread Seth David Schoen
There is a distinction between services where users would be required to
give attestation (typically to their detriment, by way of taking away
their choice of software) and on-line services where users would be able
to ask for attestation (typically
to their benefit, by way of showing that the service had certain
desirable properties, at least in some future TC technology where the
cost to the service provider of making a false attestation could be
made substantial, which it is not now).  I'm not sure exactly what
things separate these.



Re: Difference between TCPA-Hardware and a smart card (was: example: secure computing kernel needed)

2003-12-28 Thread Seth David Schoen
Antonomasia writes:

> From: "Carl Ellison" <[EMAIL PROTECTED]>
> 
> > Some TPM-machines will be owned by people who decide to do what I
> > suggested: install a personal firewall that prevents remote attestation.
> 
> How confident are you this will be possible?  Why do you think the
> remote attestation traffic won't be passed in a widespread service
> like HTTP - or even be steganographic?

The main answer is that the TPM will let you disable attestation, so
you don't even have to use a firewall (although if you have a LAN, you
could have a border firewall that prevented anybody on the LAN from
using attestation within protocols that the firewall was sufficiently
familiar with).

When attestation is used, it likely will be passed in a service like
HTTP, but in a documented way (for example, using a protocol based on
XML-RPC).  There isn't really any security benefit obtained by hiding
the content of the attestation _from the party providing it_!

Certainly attestation can be used as part of a key exchange so that
subsequent communications between local software and a third party are
hidden from the computer owner, but because the attestation must
happen before that key exchange is concluded, you can still examine
and destroy the attestation fields.
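The claim above -- that attestation travels in documented form before the key exchange concludes, so the owner can examine and strip it -- can be sketched as a border-firewall filter. This is a toy sketch of mine: the XML layout and the field name "attestation" are my invention (loosely XML-RPC-shaped), not any real protocol.

```python
import xml.etree.ElementTree as ET

# Hypothetical request in which the attestation rides as a documented,
# cleartext field (field name and layout are invented for illustration).
REQUEST = """<methodCall>
  <methodName>service.login</methodName>
  <params>
    <param><name>attestation</name><value>sign(FOO, PCRs/nonce)</value></param>
    <param><name>user</name><value>alice</value></param>
  </params>
</methodCall>"""

def strip_attestation(xml_text):
    # A border firewall that understands the protocol can see the
    # attestation field and drop it before it ever leaves the LAN.
    root = ET.fromstring(xml_text)
    params = root.find("params")
    for p in list(params):
        if p.findtext("name") == "attestation":
            params.remove(p)
    return ET.tostring(root, encoding="unicode")

filtered = strip_attestation(REQUEST)
assert "attestation" not in filtered and "alice" in filtered
```

Nothing here is hidden from the party providing the attestation, which is the point: secrecy from the owner buys the service no security before the key exchange completes.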

One problem is that a client could use HTTPS to establish a session
key for a session within which an attestation would be presented.
That might defeat your ability to use the border firewall to block
the attestation, but you can still disable attestation in the TPM on
that machine if you control the machine.

The steganographic thing is implausible because the TPM is a passive
device which can't control other components in order to get them to
signal information.



Re: Difference between TCPA-Hardware and a smart card (was: example: secure computing kernel needed)

2003-12-31 Thread Seth David Schoen
Suppose my TPM FOO produces an attestation with PCR values that show
the results of Mallory's tampering.  But Mallory does _not_ pass this
attestation along to you.  Instead, Mallory uses Owner Override or
Owner Gets Key
to generate a new attestation reflecting the original set of PCR
values that FOO had _before Mallory tampered with my software
environment_.  He then generates an attestation by BAR falsely
reflecting that BAR presently has those PCR values.  (Instead, BAR's
PCR values actually reflect that Mallory is running some custom-built
MITM attack software.  But the attestation Mallory generates conceals
that fact.)

Mallory then passes his false attestation along to you in place of my
real attestation.  On the basis of his attestation, you believe that
my computer has not been tampered with and you exchange a session key
with Mallory (believing that you are exchanging it with me).  Mallory
now exchanges a session key with me, and proxies the remainder of the
encrypted connection, which he can then observe or alter.  You falsely
believe that the session key is held by my original, unmodified
software environment, when it is really held by Mallory.

I think the lesson of this example is that believing attestations
requires some out-of-band mechanism to establish trust in TPM signing
keys.  That mechanism _could be_ vendor signatures in the present TCG
scheme, or it could be some completely different mechanism in an Owner
Override or Owner Gets Key system.  (For instance, in a corporate
environment, an IT manager can generate all the keys directly, and
then knows their values.  The IT manager does not need to rely on TPM
manufacturers to establish the legitimacy of TPM signing keys.)  The
MITM attack works because the TPM manufacturer's signature is no
longer a sufficient basis to establish trust in a TPM, if the TPM
might have an Owner Override feature.

So in my example, I need to find an out-of-band way to tell you FOO's
key so that you know it and can trust it (and distinguish it from
BAR's key).  If Owner Override exists and I've never told you precisely
which TPM I'm using, all you can tell is that you got an attestation
from _some_ TPM, but that might be an attacker's TPM and not my TPM!
In the corporate environment, the IT manager knows which TPM is in
each machine and can therefore easily tell who really generated a
particular attestation.  (I'm oversimplifying in various ways, but I
think this point is right.)

This extra work is not necessarily a bad thing -- it can be seen as a
legitimate cost of making TPMs that can never be used against their
owners -- but I'm not sure it can be avoided, and it might make
attestation less useful for some applications.  In all the cases where
the application is "technically sophisticated people want to receive
attestations from computers they own and configure", it's probably
still fine, but there may be some challenges for other purposes.


Here's an ASCII art diagram of that attack (eliding many details of
key exchange protocols that don't seem relevant):

     me               Mallory              you
  ________           _________           ________
 | EvilOS | ------- | MITM-OS | ------- | BankOS |
  --------           ---------           --------
  TPM "FOO"          TPM "BAR"

Me (to Mallory): Hi, Bank, it's me!
Mallory (to you): Hi, Bank, it's me!
You (to Mallory): Hi, sign your PCRs with nonce "ahroowah" to prevent replay.
Mallory (to me): Hi, sign your PCRs with nonce "ahroowah" to prevent replay.
Me (to Mallory): Ok.  sign(FOO, "EvilOS/ahroowah")
Mallory (to you): Ok.  sign(BAR, "NormalOS/ahroowah")
You (to Mallory): Great, you're NormalOS.  session key: encrypt(BAR, sugheequ)
Mallory (to me): Great, you're NormalOS.  session key: encrypt(FOO, heepheig)

(Again, I can't exchange a session key with you that Mallory can't
intercept because I'm running EvilOS, which deliberately fails to
notice when Mallory's SSL certificate gets substituted for the bank's,
and the bank can't benefit from this attestation because it doesn't
know whether my TPM is FOO or BAR.  If I had previously had an
out-of-band way to tell the bank that my TPM was FOO, Mallory would not
be able to carry out this attack.  Mallory still can't sign things as
FOO, and without a sufficiently clever social engineering attack,
can't get me to sign things as FOO for him.)
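The transcript above can be run as a toy simulation (mine, with "signatures" modeled as plain records and Owner Override modeled as Mallory's freedom to have BAR sign any PCR string he likes):

```python
def attest(tpm_name, pcrs, nonce):
    # Toy attestation record; a real one would be a signature by the TPM.
    return {"tpm": tpm_name, "signed": f"{pcrs}/{nonce}"}

def verify(att, nonce, trusted_tpms):
    # The bank checks for the expected PCRs, a fresh nonce, and a TPM
    # whose key it trusts.
    return (att["signed"] == f"NormalOS/{nonce}"
            and att["tpm"] in trusted_tpms)

nonce = "ahroowah"

# Mallory intercepts sign(FOO, "EvilOS/ahroowah") and, using Owner
# Override on his own TPM, forges sign(BAR, "NormalOS/ahroowah"):
forged = attest("BAR", "NormalOS", nonce)

# If the bank trusts any vendor-made TPM, the forgery passes:
assert verify(forged, nonce, trusted_tpms={"FOO", "BAR"})

# If I told the bank out of band that my TPM is FOO, the forgery fails,
# because Mallory still can't sign as FOO:
assert not verify(forged, nonce, trusted_tpms={"FOO"})
```

The last two lines are the whole argument: with Owner Override, "signed by some TPM" proves nothing, while "signed by the specific TPM I told you about out of band" still does.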



Re: Difference between TCPA-Hardware and a smart card (was: example: secure computing kernel needed)

2004-01-04 Thread Seth David Schoen
David Wagner writes:

> To see why, let's go back to the beginning, and look at the threat
> model.  If multiple people are doing shared development on a central
> machine, that machine must have an owner -- let's call him Linus.  Now
> ask yourself: Do those developers trust Linus?
> 
> If the developers don't trust Linus, they're screwed.  It doesn't matter
> how much attestation you throw at the problem, Linus can always violate their
> security model.  As always, you've got to trust "root" (the system
> administrator); nothing new here.
> 
> Consequently, it seems to me we only need to consider a threat model
> where the developers trust Linus.  (Linus need not be infallible, but the
> developers should believe Linus won't intentionally try to violate their
> security goals.)  In this case, owner-directed attestation suffices.
> Do you see why?  Linus's machine will produce an attestation, signed
> by Linus's key, of what software is running.  Since the developers trust
> Linus, they can then verify this attestation.  Note that the developers
> don't need to trust each other, but they do need to trust the owner/admin
> of the shared box.  So, it seems to me we can get by without third-party
> attestation.

You could conceivably have a PC where the developers don't trust
Linus, but instead trust the PC manufacturer.  The PC manufacturer
could have made it extremely expensive for Linus to tamper with the PC
in order to "violate [the developers'] security model".  (It isn't
logically impossible, it's just extremely expensive.  Perhaps it costs
millions of dollars, or something.)

There are computers like that today.  At least, there are devices that can
run software, that are highly tamper-resistant, and that can do attestations.
(Now there is an important question about what the cost to do a hardware
attack against those devices would be.)  It seems to me to be a good thing
that the ordinary PC is not such a device.  (Ryan Lackey, in a talk
about security for colocated machines, described using devices like
these for colocation where it's not appropriate or desirable to rely on
the physical security of the colocated machine.  Of course, strictly
speaking, all security always relies on physical security.)

I don't know how the key management works in these devices.  If the
keys used to sign attestations are loaded by (or known to) the device
owner, it wouldn't help with the case where the device owner is
untrusted.  If the keys are loaded by the manufacturer, it might
support a model where the owner is untrusted and the manufacturer is
trusted.
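The key-management question above reduces to a one-line trust rule (my own schematic, not anything specified by these devices): an attestation only helps a verifier who trusts everyone who controls the signing key.

```python
def supports_untrusted_owner(key_known_to):
    # An attestation convinces a third party only if no party that the
    # verifier distrusts (here, the device owner) knows or loaded the
    # signing key.
    return "owner" not in key_known_to

# Keys loaded by (or disclosed to) the owner: no help when the owner is
# the untrusted party.
assert not supports_untrusted_owner({"owner"})

# Keys loaded by the manufacturer alone: can support a model where the
# owner is untrusted but the manufacturer is trusted.
assert supports_untrusted_owner({"manufacturer"})
```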
