Re: Difference between TCPA-Hardware and other forms of trust

2003-12-22 Thread Ian Grigg
Bill Frantz wrote:

> [I always considered the biggest contribution from Mondex was the idea of
> deposit-only purses, which might reduce the incentive to rob late-night
> business.]

This was more than just a side effect; it was
also the genesis of the earliest successes with
smart card money.

The first smart card money system in the Netherlands
was a service-station system for selling fuel to
truck drivers.  As security costs kept on rising,
due to constant hold-ups, the smart card system
was put in to create stations that had no money
on hand, so no need for guards or even tellers.

This absence of night time staff created a great
cost saving, and the programme was a big success.
Unfortunately, the early lessons were lost as time
went on, and attention switched from single-purpose
to multi-purpose applications.

iang



Re: Difference between TCPA-Hardware and other forms of trust

2003-12-22 Thread Ben Laurie
bear wrote:
> I really don't care if anyone *else* trusts my system; as
> far as I'm concerned, their secrets should not be on my
> system in the first place, any more than my secrets should
> be on theirs.
The problem is that their secrets are Snow White, or the latest Oasis 
album. You want them on your box, and they want them not to leave your box.

Cheers,

Ben.

--
http://www.apache-ssl.org/ben.html   http://www.thebunker.net/
There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit. - Robert Woodruff


Re: Difference between TCPA-Hardware and other forms of trust

2003-12-22 Thread Ben Laurie
Bill Frantz wrote:
> One should note that TCPA is designed to store its data (encrypted) in the
> standard file system, so standard backup and restore techniques can be
> used.
Only if your box doesn't die.

Cheers,

Ben.

--
http://www.apache-ssl.org/ben.html   http://www.thebunker.net/
There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit. - Robert Woodruff


Re: Difference between TCPA-Hardware and other forms of trust

2003-12-22 Thread bear


On Sat, 20 Dec 2003, Ian Grigg wrote:

> Bill Frantz wrote:
>
> > [I always considered the biggest contribution from Mondex was the idea of
> > deposit-only purses, which might reduce the incentive to rob late-night
> > business.]
>
> ...
>
> The first smart card money system in the Netherlands
> was a service-station system for selling fuel to
> truck drivers.  As security costs kept on rising,
> due to constant hold-ups, the smart card system
> was put in to create stations that had no money
> on hand, so no need for guards or even tellers.
>
> This absence of night time staff created a great
> cost saving, and the programme was a big success.
> Unfortunately, the early lessons were lost as time
> went on, and attention switched from single-purpose
> to multi-purpose applications.

This underscores an important point.  In security
applications limitations are often a feature rather
than a bug.  We are accustomed to making things better
by making them able to do more; but in some spaces
it's actually better to use a solution that can do
very little.

Much of the current security/cryptography angst can
be summed up as "small, limited, simple systems work,
but big, complex, general systems are very hard to
get right or have unintended drawbacks."  Often the
very generality of such systems is a barrier to their
wide adoption.

I would say that if you want to make any money in
cryptography and security (and make it honestly) you
should pick one business application, with one threat
model and one business model, and nail it.  Add no
features, and leave no room in your design for anything
that doesn't directly address *that* problem.  When
you are able to present people with a solution to
one problem, which has no requirement of further
involvement than solving that one problem and introduces
no risks or interactions other than those flatly necessary
to solve that one problem, then they'll pay for it.

But when we start talking about multi-function cards,
it becomes a tradeoff where I can't get anything I want
without getting things I don't want or risking network
effects that will lead to markets dominated by business
models I don't want to deal with.  It makes the buy
decision complicated and fraught with risk.

Bear




Re: Difference between TCPA-Hardware and other forms of trust

2003-12-20 Thread bear


On Wed, 17 Dec 2003, Jerrold Leichter wrote:

> Given this setup, a music company will sell you a program that you must
> install with a given set of access rights.  The program itself will check
> (a) that it wasn't modified; (b) that a trusted report indicates that it
> has been given exactly the rights specified.  Among the things it will check
> in the report is that no one has the right to change the rights!  And, of
> course, the program won't grant generic rights to any music file - it will
> specifically control what you can do with the files.  Copying will, of course,
> not be one of those things.

I think that if the music company wants that much control
(which is, btw, in clear violation of the First Sale Doctrine),
then the only legal way for them to achieve it is to provide
a player specifically for the music which they own, in exactly
the same way that banks retain ownership of the credit cards
and smartcards we use.  As long as the player is not their
property, they can't do this.

The main reason I want a trusted kernel is that I don't
want to trust binaries provided by closed-source software
houses.  I want my trusted kernel to tell me exactly what
privileges they're asking for, and I want to tell it exactly
what privileges it's allowed to provide them.  I want it to
be able to tell me exactly when every executable file appeared,
and as a result of running which other executable file (all
the way back to whichever command *I* gave that resulted in
its being there).  I want it to tell me exactly how the daemon
listening on any TCP port got installed and what privileges
it has.  I want my trusted kernel to keep tamper-proof logs;
in fact I'd go so far as to want to use write-once media
for logfiles just to make absolutely sure.
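
You can get most of the tamper-evidence, if not the
tamper-resistance, without write-once hardware by hash-chaining
the log entries, so that any later edit or deletion breaks the
chain.  A rough sketch in Python -- the file name and record
layout here are made up for illustration, not taken from any
TCPA document:

    # Hash-chained audit log: each entry commits to the previous entry's
    # hash, so altering or deleting an old entry breaks verification.
    import hashlib, json, time

    LOGFILE = "kernel-audit.log"   # illustrative name only

    def _entry_hash(prev_hash, record):
        data = prev_hash + json.dumps(record, sort_keys=True)
        return hashlib.sha256(data.encode()).hexdigest()

    def append(record, logfile=LOGFILE):
        try:
            prev = open(logfile).readlines()[-1].split("\t")[0]
        except (OSError, IndexError):
            prev = "0" * 64        # genesis value for an empty log
        record = dict(record, ts=time.time())
        with open(logfile, "a") as f:
            f.write(_entry_hash(prev, record) + "\t" +
                    json.dumps(record, sort_keys=True) + "\n")

    def verify(logfile=LOGFILE):
        prev = "0" * 64
        for line in open(logfile):
            digest, payload = line.rstrip("\n").split("\t", 1)
            if _entry_hash(prev, json.loads(payload)) != digest:
                return False       # chain broken: the log was altered
            prev = digest
        return True

Periodically copying the latest hash somewhere an attacker can't
quietly rewrite (write-once media, or another machine) is what
keeps the whole chain from being forged wholesale.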

A trusted kernel should absolutely know when any program
is reading screen memory it didn't write, or capturing
keyboard keystrokes that it then passes on as input to another
program, and it should be possible for me to set up instant
notification so that it alerts me whenever any program does so.

A trusted kernel should monitor outgoing network packets and
sound an alarm when any of them contains personal information
like PINs, passwords, keys, Social Security Number, Drivers
License, Credit Card numbers, Address, etc.  It should even
be possible to have a terminate-with-prejudice policy that
drops any such packets before sending and terminates and
uninstalls any unauthorized application that attempts to send
such packets.
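
The policy decision itself is simple; the hard part is hooking the
network stack from a position the application can't subvert, and
coping with encodings and encryption.  A toy user-space sketch in
Python, just to make the terminate-with-prejudice policy concrete
(the watch list and patterns are invented for illustration):

    # Toy egress check: scan one outgoing payload for personal data and
    # decide whether to send it, drop it, or kill the sending program.
    import re

    # Illustrative watch list -- in practice the owner would supply this.
    PERSONAL = {
        "ssn":      re.compile(rb"\b\d{3}-\d{2}-\d{4}\b"),
        "card":     re.compile(rb"\b(?:\d[ -]?){13,16}\b"),
        "password": re.compile(rb"open-sesame"),   # stand-in for a real secret
    }

    def check_outbound(payload, terminate_with_prejudice=False):
        """Return 'send', 'drop', or 'kill-sender' for one payload."""
        hits = sorted(name for name, pat in PERSONAL.items()
                      if pat.search(payload))
        if not hits:
            return "send"
        print("ALARM: outgoing data contains " + ", ".join(hits))
        return "kill-sender" if terminate_with_prejudice else "drop"

    # A leaky program trying to phone home with a card number:
    print(check_outbound(b"GET /t?cc=4111 1111 1111 1111 HTTP/1.0", True))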

I really don't care if anyone *else* trusts my system; as
far as I'm concerned, their secrets should not be on my
system in the first place, any more than my secrets should
be on theirs.  The fact is I'm building a system out of
pieces and parts from hundreds of sources and I don't know
all the sources; with an appropriate trusted kernel I
wouldn't have to extend nearly as much black box trust
to all the different places software comes from.


> Yes, you can construct a system that *you* can trust, but no one else has
> any reason to trust.  However, the capability to do that can be easily
> leveraged to produce a system that *others* can trust as well.  There are
> so many potential applications for the latter type of system that, as soon
> as systems of the former type are fielded, the pressure to convert them to
> the latter type will be overwhelming.

I do not think so.  People want to retain ownership of their
computer systems and personal information, and a system that
is made for *others* to trust would be used to take that
ownership and information.

> Ultimately, TCPA or no, you will be faced with a stark choice:  Join the
> broad trust community, or live in the woods.

No.  Lots of bands release music and encourage sharing, as promo
for their main revenue source (concert tours).  I see those bands
getting a leg up as their released music becomes popular while
music only available with onerous conditions languishes.  Lots of
other artists do graphic or animation work just for the chance to
be seen, and some of them are quite good.

You may consider it living in the woods to listen to stuff that
isn't the top 20; but I think lots of people will find that the
woods is a friendlier and more trustworthy place than a world
full of weasels who want to control their systems.

Bear



Re: Difference between TCPA-Hardware and other forms of trust

2003-12-20 Thread Seth David Schoen
Jerrold Leichter writes:

> Given this setup, a music company will sell you a program that you must
> install with a given set of access rights.  The program itself will check
> (a) that it wasn't modified; (b) that a trusted report indicates that it
> has been given exactly the rights specified.  Among the things it will check
> in the report is that no one has the right to change the rights!  And, of
> course, the program won't grant generic rights to any music file - it will
> specifically control what you can do with the files.  Copying will, of course,
> not be one of those things.
>
> Now, what you'll say is that you want a way to override the trusted system,
> and grant yourself whatever rights you like.  Well, then you no longer have
> a system *anyone else* can trust, because *you* have become the real security
> kernel.  And in a trusted system, you could certainly create a subject that
> would automatically be granted all access rights to everything.  Of course,
> since the system is trusted, it would include information about the
> override account in any reports.  The music company would refuse to do
> business with you.
>
> More to the point, many other businesses would refuse to do business with you.

There's the rub.

The prevalence of systems that can be trusted by third parties who do
not trust their owners affects what applications are possible, and it
affects the balance of power between computer owners and others.

If very few such systems are deployed, it would be absurd to say that
the music company would refuse to do business with you -- because the
music company has to do business to keep its doors open.  Businesses
that refuse to serve customers will not last very long.

In the entertainment case, the antagonism is already clearly expressed.
("Open war is upon you," as I recall Theoden being told in that movie
last night.)  The more so-called legacy systems do not do DRM or do
not do it very well or easily, the more difficult it is for publishers
to apply DRM systems and succeed in the market.  The more systems do
DRM natively, or easily or cheaply, the easier it is to be successful
publishing things restricted with DRM.  In either case, the publishers
still have to publish; before the creation of DVD-A and SACD,
publishers of audio CDs couldn't very well say "CD-DA, we hates it!
Nasty, tricksy format! (sorry, um) We are going to stop publishing
in the CD-DA format because it isn't encrypted."  Even today, they
would be hard-pressed to do so, because DVD-A and SACD players are
extraordinarily rare compared to audio CD players.

The question of whether the supposed added profit that comes with being
able to enforce DRM terms provides an important creative incentive
comparable to that provided by copyright law goes back to the era
immediately before the adoption of the DMCA, when the Bruce Lehman
White Paper argued that it did (that copyright law's incentive was
becoming inadequate and an additional control-of-the-public and
control-of-technology incentive would be required).  Indeed, the group
that pushed for the DMCA was called the Creative Incentives Coalition,
and it said that thus restricting customers was really all a matter of
preserving and expanding creative incentives.

http://www.uspto.gov/web/offices/com/doc/ipnii/

I think Bruce Lehman was wrong then and is wrong now.  On the other
hand, the _structure_ of the argument that the prospect of restricting
customers provides an incentive to do something that one would not
otherwise do is not incoherent on its face.  The interesting question
about remote attestation is whether there are (as some people have
suggested) interesting and important new applications that customers
would really value that are infeasible today.

For example, it has been argued by Unlimited Freedom that there would
be incentives to invest in useful things we don't have now (and things
we would benefit from) only if attestation could be used to control
what software we used to interact with those things.

In the entertainment case, though, there is already a large
entertainment industry that has to sell into a base of actually
deployed platforms (unless it wants to bundle players with
entertainment works) -- and its ability to refuse to do business with
you is constrained by what it can learn about you as a basis for
making that decision.  It's also constrained if its rationale for
refusing to sell to you would also imply that it needs to refuse to
sell to millions of other people.  Only if enormous numbers of people
in the future can preserve the benefit of creating uncertainty about
their software environment's identity will entertainment publishers
and others lack the ability to discriminate against people who use
disfavored software.

> Yes, you can construct a system that *you* can trust, but no one else has
> any reason to trust.  However, the capability to do that can be easily
> leveraged to produce a system that *others* can trust as well.  There are
> so many 

Re: Difference between TCPA-Hardware and other forms of trust

2003-12-20 Thread Peter Gutmann
John Gilmore [EMAIL PROTECTED] writes:

> They eventually censored out all the sample application scenarios like DRM'd
> online music, and ramped up the level of jargon significantly, so that nobody
> reading it can tell what it's for any more.  Now all the documents available
> at that site go on for pages and pages saying things like "FIA_UAU.1 Timing of
> authentication. Hierarchical to: No other components. FIA_UAU.1.1 The TSF
> shall allow access to data and keys where entity owner has given the 'world'
> access based on the value of TCPA_AUTH_DATA_USAGE; access to the following
> commands: TPM_SelfTestFull, TPM_ContinueSelfTest, TPM_GetTestResult,
> TPM_PcrRead, TPM_DirRead, and TPM_EvictKey on behalf of the user to be
> performed before the user is authenticated."

That gobbledygook sounds like Common Criteria-speak.  So it's not deliberate;
it's a side-effect of making it CC-friendly.

> nobody reading it can tell what it's for any more

Yup, that's definitely Common Criteria.

Peter.



Re: Difference between TCPA-Hardware and other forms of trust

2003-12-20 Thread Bill Frantz
At 7:30 AM -0800 12/17/03, Jerrold Leichter wrote:

> ...
>
> If the system were really trusted, it could store things like your credit
> balance:  A vendor would trust your system's word about the contents, because
> even you would not be able to modify the value.  This is what smart cards
> attempt to offer - and, again, it would be really nice if you didn't have to
> have a whole bunch of them.  The bank records stored on your system could
> be trusted:  By the bank, by you - and, perhaps quite useful to you, by a
> court if you claimed that the bank's records had been altered.

One should note that TCPA is designed to store its data (encrypted) in the
standard file system, so standard backup and restore techniques can be
used.  However, being able to back up my bank balance, buy a bunch of neat
stuff, and then restore the previous balance is not really what a banking
application wants.  Smart cards address this situation by storing the data
on the card, which is designed to be difficult to duplicate.
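
The reason the backup-and-restore trick works against purely
file-backed state is that nothing in the file distinguishes the
current state from an older copy of it.  The usual fix is to bind
the state to a monotonic counter held in the tamper-resistant part
(a smart card does this implicitly, and a hardware monotonic counter
serves the same purpose).  A toy sketch, with the "hardware" faked
by a couple of Python globals:

    # Toy rollback detection: the balance lives in an ordinary blob, but
    # it is MACed together with a counter that only the "hardware" can
    # advance, so a restored backup shows up as stale.
    import hashlib, hmac, json

    DEVICE_KEY = b"key-inside-the-tamper-resistant-part"   # illustrative
    device_counter = 0     # stands in for a hardware monotonic counter

    def seal(balance):
        global device_counter
        device_counter += 1                   # bump on every state update
        blob = json.dumps({"balance": balance, "ctr": device_counter},
                          sort_keys=True).encode()
        tag = hmac.new(DEVICE_KEY, blob, hashlib.sha256).hexdigest()
        return {"blob": blob, "tag": tag}     # safe to keep in the file system

    def unseal(stored):
        expected = hmac.new(DEVICE_KEY, stored["blob"],
                            hashlib.sha256).hexdigest()
        if not hmac.compare_digest(stored["tag"], expected):
            raise ValueError("state has been tampered with")
        state = json.loads(stored["blob"])
        if state["ctr"] != device_counter:
            raise ValueError("stale state -- looks like a restored backup")
        return state["balance"]

    backup = seal(100)        # squirrel away a copy while the balance is 100
    current = seal(5)         # spend 95; the counter advances past the backup
    print(unseal(current))    # 5 -- the genuine current state verifies
    try:
        unseal(backup)
    except ValueError as e:
        print("restore rejected:", e)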

[I always considered the biggest contribution from Mondex was the idea of
deposit-only purses, which might reduce the incentive to rob late-night
business.]

Cheers - Bill



-
Bill Frantz        | There's nothing so clear as a  | Periwinkle
(408)356-8506      | vague idea you haven't written | 16345 Englewood Ave
www.pwpconsult.com | down yet. -- Dean Tribble      | Los Gatos, CA 95032

