RE: Ross's TCPA paper

2002-06-26 Thread bear



On Wed, 26 Jun 2002, Scott Guthery wrote:

  Privacy abuse is first and foremost the failure
  of a digital rights management system.  A broken
  safe is not evidence that banks shouldn't use
  safes.  It is only an argument that they shouldn't
  use the safe that was broken.

  I'm hard pressed to imagine what privacy without
  DRM looks like.  Perhaps somebody can describe
  a non-DRM privacy management system.  On the other
  hand, I easily can imagine how I'd use DRM
  technology to manage my privacy.

You are fundamentally confusing the problem of
privacy (controlling unpublished information and
not being compelled to publish it) with the
problem of DRM (attempting to control published
information and compelling others to refrain
from sharing it).  Privacy does not require
anyone to be compelled against their will to
do anything.  DRM does.

As I see it, we can get either privacy or DRM,
but there is no way on Earth to get both.
Privacy can happen only among citizens who are
free to manage their information and DRM can
happen only among subjects who may be compelled
to disclose or abandon information against
their will.

Privacy without DRM is when you don't need anyone's
permission to run any software on your computer.

Privacy without DRM is when you are absolutely free
to do anything you want with any bits in your
possession, but people can keep you from *getting*
bits private to them into your possession.

Privacy without DRM means being able to legally
keep stuff you don't want published to yourself,
even if that means using pseudonymous or anonymous
transactions for non-fraudulent purposes.

Privacy without DRM means being able to simply,
instantly, and arbitrarily change legal identities
to get out from under extant privacy infringements,
and not have the new identity easily linkable to
the old.

Privacy without DRM means people being able to
create keys for cryptosystems and use them in
complete confidence that no one else has a key
that will decrypt the communication -- this is
fundamental to keeping private information
private.
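The point about keys can be made concrete with a short sketch (my illustration, not Bear's).  A one-time pad is used here only because it is self-contained and information-theoretically secure; any modern cipher with a locally generated key makes the same point -- the key exists only on your own machine, so nobody else holds anything that decrypts the message:

```python
# Sketch: a key generated locally, never escrowed, is the only thing
# that can decrypt.  One-time pad = XOR with a random key of equal length.
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    # XOR each message byte with the corresponding key byte; the same
    # operation encrypts and decrypts.
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"private information stays private"
key = secrets.token_bytes(len(message))   # generated on your own machine
ciphertext = otp_xor(message, key)

assert otp_xor(ciphertext, key) == message            # the right key decrypts
wrong_key = bytearray(key); wrong_key[0] ^= 1
assert otp_xor(ciphertext, bytes(wrong_key)) != message  # any other key fails
```

With a fresh random key per message, the ciphertext reveals nothing; the only route to the plaintext is the key holder.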

Privacy without DRM means no restrictions whatsoever
on usable crypto in the hands of citizens.  It may
be a crime to withhold any stored keys when under a
subpoena, but that subpoena should issue only when
there is probable cause to believe that you have
committed a crime or are withholding information
about one, and you should *ALWAYS* be notified of the
issue within 30 days.  It also means that keys which
are in your head rather than stored somewhere are
not subject to subpoena -- on Fifth Amendment grounds
(in the USA) if the record doesn't exist outside
your head, then you cannot be coerced to produce
it.

Privacy without DRM means being able to keep and
do whatever you want with the records your business
creates -- but not being able to force someone to
use their real name or linkable identity information
to do business with you if that person wants that
information to remain private.

Bear






-
The Cryptography Mailing List
Unsubscribe by sending unsubscribe cryptography to [EMAIL PROTECTED]



Re: Ross's TCPA paper

2002-06-26 Thread pasward

I'm slightly confused about this.  My understanding of contract law is
that five things are required to form a valid contract: offer and
acceptance, mutual intent, consideration, capacity, and lawful
intent.  It seems to me that a click-through agreement is likely to
fail on at least one, and possibly two of these requirements.  First,
it is doubtful that there is mutual intent.  The average user doesn't
even read the agreement, so there is hardly mutual intent.  However,
even if I accept mutual intent, it would be easy to argue that there
is no capacity.  I have four children under the age of seven.  None of
them have the legal capacity to form a contract.  Three of them have
the physical capacity to click a button.  A corporation would
therefore have to demonstrate that I and not they clicked on the
agreement for the contract to be valid.

As a side note, it seems that a corporation would actually have to
demonstrate that I had seen and agreed to the thing and clicked
acceptance.  Prior to that point, I could reverse engineer, since
there is no statement that I cannot reverse engineer agreed to.  So
what would happen if I reverse engineered the installation so that the
agreement that was displayed stated that I could do what I liked with
the software?  Ok, so there would be no mutual intent, but on the
other hand, there would also be no agreement on the click-through
agreement either.

Paul

Peter D. Junger writes:
  Pete Chown writes:
  
  : Anonymous wrote:
  : 
  :  Furthermore, inherent to the TCPA concept is that the chip can in
  :  effect be turned off.  No one proposes to forbid you from booting a
  :  non-compliant OS or including non-compliant drivers.
  : 
  : Good point.  At least I hope they don't. :-)
  : 
  :  There is not even social opprobrium; look at how eager
  :  everyone was to look the other way on the question of whether the DeCSS
  :  reverse engineering violated the click-through agreement.
  : 
  : Perhaps it did, but the licence agreement was unenforceable.  It's
  : clearly reverse engineering for interoperability (between Linux and DVD
  : players) so the legal exemption applies.  You can't escape the exemption
  : by contract.  Now, you might say that morally he should obey the
  : agreement he made.  My view is that there is a reason why this type of
  : contract is unenforceable; you might as well take advantage of the
  : exemption.
  
  That isn't the reason why a click-through agreement isn't 
  enforceable---the agreement could, were it enforceable, validly
  forbid reverse engineering for any reason and that clause would
  in most cases be upheld.  But, unless you buy your software from
  the copyright owner, you own your copy of the software and
  clicking on a so called agreement with the copyright owner
  that you won't do certain things with your software is---or,
  at least should be---as unenforceable as a promise to your doctor
  that you won't smoke another cigarette.  The important point
  is not, however, that click-through agreements are probably
  unenforceable; the important point is that people---at least
  those people who think that they own their own computers and
  the software copies that they have purchased---generally
  believe that they should be unenforceable.  (And in the
  actual case involving Linux and DVD players there was no
  agreement not to circumvent the technological control measures
  in DVD's; the case was based on the theory that the circumvention
  violated the Digital Millennium Copyright Act.)
   
  : The prosecution was on some nonsense charge that amounted to him
  : burgling his own house.  A statute that was meant to penalise computer
  : break-ins was used against someone who owned the computer that he broke
  : into.
  : 
  :  The TCPA allows you to do something that you can't do today: run your
  :  system in a way which convinces the other guy that you will honor your
  :  promises, that you will guard his content as he requires in exchange for
  :  his providing it to you.
  : 
  : Right, but it has an odd effect too.  No legal system gives people
  : complete freedom to contract.  Suppose you really, really want to exempt
  : a shop from liability if your new toaster explodes.  You can't do it;
  : the legal system does not give you the freedom to contract in that way.
  : 
  : DRM, however, gives people complete freedom to make contracts about how
  : they will deal with digital content.  Under EU single market rules, a
  : contract term to the effect that you could pass on your content to
  : someone in the UK but not the rest of the EU is unenforceable.  No
  : problem for DRM though...
  
  I don't think that one should confuse contract limitations, or 
  limitations on enforceable contract limitations, with technological
  limitations.  There is nothing, for example, in any legal system that
  forbids one from violating the law of gravity.
  
  One of the many problems with the use of the Digital Millennium 
  

Re: privacy digital rights management

2002-06-26 Thread John S. Denker

I wrote:
  Perhaps we are using
  wildly divergent notions of privacy 

Donald Eastlake 3rd wrote:

 You are confusing privacy with secrecy 

That's not a helpful remark.  My first contribution to
this thread called attention to the possibility of
wildly divergent notions of privacy.

Also please note that according to the US Office of
Technology Assessment, such terms do not possess a single
clear definition, and theorists argue variously ... the
same, completely distinct, or in some cases overlapping.

Please let's avoid adversarial wrangling over terminology.
If there is an important conceptual distinction, please
explain the concepts using unambiguous multi-word descriptions
so that we may have a collegial discussion.

 The spectrum from 2 people knowing something to 2 billion knowing
 something is pretty smooth and continuous. 

That is quite true, but quite irrelevant to the point I was making.
Pick an intermediate number, say 100 people.  Distributing
knowledge to a group of 100 people who share a vested interest in not 
divulging it outside the group is starkly different from distributing 
it to 100 people who have nothing to lose and something to gain by
divulging it.

Rights Management isn't even directly connected to knowledge.  Suppose
I know by heart the lyrics and music to _The Producers_ --- that doesn't 
mean I'm free to rent a hall and put on a performance.

 Both DRM and privacy have to
 do with controlling material after you have released it to someone who
 might wish to pass it on further against your wishes. There is little
 *technical* difference between your doctors records being passed on to
 assorted insurance companies, your boss, and/or tabloid newspapers and
 the latest Disney movies being passed on from a country where it has
 been released to people/theaters in a country where it has not been
 released.

That's partly true (although overstated).  In any case it supports
my point that fixating on the *technical* issues misses some
crucial aspects of the problem.

 The only case where all holders of information always have a common
 interest is where the number of holders is one.

Colorful language is no substitute for a logical argument.
Exaggerated remarks (... ALWAYS have ...) tend to drive the
discussion away from reasonable paths.  In the real world,
there is a great deal of information held by N people where
N > 1 and N < infinity.




Re: privacy digital rights management

2002-06-26 Thread RL 'Bob' Morgan


On Wed, 26 Jun 2002, Donald Eastlake 3rd wrote:

 Privacy, according to the usual definitions, involves controlling the
 spread of information by persons authorized to have it. Contrast with
 secrecy which primarily has to do with stopping the spread of
 information through the actions of those not authorized to have it.

  We have thousands of years of experience with military crypto, where
  the parties at both ends of the conversation are highly motivated to
  restrict the flow of private information.  The current state of this
  technology is very robust.

 That's secrecy technology, not privacy technology.

I have seen private and secret defined in exactly the opposite fashion
as regards keys:  a private key is private because you never ever share
it with anyone, whereas a secret (symmetric) key is a secret because
you've told someone else and you expect them to not share it (in the sense
of can you keep a secret?).

Clearly there's not a common understanding of these simple words.  Seems
to me that Dan's mini-rant was referring to privacy in the sense you
define it above (controlling spread of info already held by others).

 - RL Bob






Re: Ross's TCPA paper

2002-06-26 Thread Adam Back

On Wed, Jun 26, 2002 at 10:01:00AM -0700, bear wrote:
 As I see it, we can get either privacy or DRM,
 but there is no way on Earth to get both.
 [...]

Hear, hear!  First post on this long thread that got it right.

Not sure what the rest of the usually clueful posters were thinking!

DRM systems are the enemy of privacy.  Think about it... strong DRM
requires legal enforcement because strong technical DRM is not
possible: all bit streams can be re-encoded from one digital form to
another (CD to MP3, DVD to DivX), encrypted content streams out to the
monitor / speakers can be subjected to scrutiny by hardware hackers to
recover the digital content, or analog output can be re-converted back
to digital in high fidelity.

So I agree with Bear, and re-iterate the prediction I make
periodically: the ultimate conclusion of the direction of the DRM laws
being pursued by the media cartels will be an attempt to get
legislation directly attacking privacy.

This is because strong privacy (cryptographically protected privacy)
allows people to exchange bit-strings with limited chance of being
identified.  As the arms race between the media cartels and the DRM
cohorts continues, file sharing will start to offer privacy as a form
of protection for end-users (eg. freenet has some privacy related
features, several others involve encryption already).

Donald Eastlake wrote:

| There is little *technical* difference between your doctors records
| being passed on to assorted insurance companies, your boss, and/or
| tabloid newspapers and the latest Disney movies being passed on from a
| country where it has been released to people/theaters in a country
| where it has not been released.

There is lots of technical difference.  When was the last time you saw
your doctor use cryptolopes, watermarks, etc. to remind himself of his
obligations of privacy?

The point is that with privacy there is an explicit or implied
agreement between the parties about the handling of information.  The
agreement can not be technically *enforced* to any stringent degree.

However privacy-policy-aware applications can help a company avoid
unintentionally breaching its own agreed policy.  Clearly if the
company is hostile they can, at absolute minimum, copy the information
down off the screen.  Information fidelity is hardly a criterion with
private information such as health care records, so watermarks, copy
protect marks and the rest of the DRM schtick are hardly likely to
help!

Privacy applications can be successful in helping companies avoid
accidental privacy policy breaches.  But DRM can not succeed, because
DRM systems are inherently insecure.  You give the data and the keys
to millions of people, some large proportion of whom are hostile to
the controls the keys are supposedly enforcing.  Given the volume of
people, and the lack of social stigma attached to wide-spread flouting
of copy protection restrictions, there is an ample supply of people to
break any scheme, hardware or software, that has been developed so far
or is likely to be developed or is constructible.

I think content providers can still make lots of money where the
convenience and/or enhanced fidelity of bought copies means that
people would rather buy than obtain content on the net.

But I don't think DRM is significantly helping them; they are wasting
their money on it.  All current DRM systems aren't even a speed bump
on the way to unauthorised Net re-distribution of content.

Where the media cartels are being somewhat effective, and where we're
already starting to see evidence of the prediction I mentioned above
about DRM leading to a clash with privacy, is in the area of
criminalization of reverse engineering, with the Sklyarov case, Ed
Felten's case, etc.  Already a number of interesting breaks of DRM
systems are starting to be released anonymously.  As things heat up we
may start to see incentives for the users of file-sharing for
unauthorised re-distribution to also _use_ the software anonymously.

Really I think the copyright protections being exploited by the media
cartels need to be substantially modified, to reduce or remove the
existing protections rather than award further restrictions and
powers to the media cartels.

Adam




TCPA / Palladium FAQ (was: Re: Ross's TCPA paper)

2002-06-26 Thread Ross Anderson


http://www.cl.cam.ac.uk/~rja14/tcpa-faq.html

Ross




DRMs vs internet privacy (Re: Ross's TCPA paper)

2002-06-26 Thread Adam Back

On Wed, Jun 26, 2002 at 03:57:15PM -0400, C Wegrzyn wrote:
 If a DRM system is based on X.509, according to Brand I thought you could
 get anonymity in the transaction. Wouldn't this accomplish the same thing?

I don't mean that you would necessarily have to correlate your viewing
habits with your TrueName in DRM systems.  Though that is mostly
(exclusively?) the case for currently deployed (or at least
implemented with a view to attempted commercial deployment) copy-mark
(fingerprint) systems, there are a number of approaches which have
been suggested, or could be used, to provide viewing privacy.

Brands credentials are one example of a technology that allows
trap-door privacy (privacy until you reveal more copies than you are
allowed to -- eg more than once for ecash).  Conceivably this could be
used in a somewhat-online system, or in combination with a
tamper-resistant observer chip in lieu of an online copy-protection
system, to restrict someone, for example, to a limited number of
viewings.
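The trap-door property can be sketched in a few lines (my toy illustration, not Brands' actual protocol; real schemes such as Brands' or Chaum-Fiat-Naor achieve it with blind signatures, but the underlying secret-sharing trick is the same): the user's identity is split into two XOR shares, each spend answers a random challenge with one share, so a single spend reveals nothing while a double spend exposes both shares and hence the identity.

```python
# Toy sketch of trap-door privacy via double-spend identity revelation.
import secrets

class Coin:
    def __init__(self, identity: bytes):
        # Split the identity into two XOR shares: r and identity^r.
        r = secrets.token_bytes(len(identity))
        self._shares = (r, bytes(a ^ b for a, b in zip(identity, r)))

    def spend(self, challenge: int) -> bytes:
        # The payee picks challenge in {0, 1}; the coin answers with the
        # corresponding share.  One share alone is a uniformly random
        # string and reveals nothing about the identity.
        return self._shares[challenge]

identity = b"alice"
coin = Coin(identity)

s0 = coin.spend(0)            # first spend: identity still hidden
s1 = coin.spend(1)            # second spend, different challenge...
recovered = bytes(a ^ b for a, b in zip(s0, s1))
assert recovered == identity  # ...XOR of the shares exposes the double-spender
```

Privacy holds exactly as long as the user stays within the allowed number of uses, which is the property the paragraph above describes.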

Another is the public key fingerprinting (public key copy-marking)
schemes by Birgit Pfitzmann and others.  This addresses the issue of
proof, such that the user of the marked object and the verifier (eg a
court) of a claim of unauthorised copying can be assured that the
copy-marker did not frame the user.
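The no-framing property can also be sketched (again my toy illustration, not Pfitzmann's actual scheme, and the embedding step is a stand-in: in the real protocol the merchant embeds the mark obliviously, without ever learning it).  The buyer commits to a secret; only the secret ends up in the buyer's copy; a surfaced copy containing the secret is something the merchant could not have fabricated:

```python
# Toy sketch of asymmetric fingerprinting's no-framing property.
import hashlib
import secrets

def commit(secret: bytes, nonce: bytes) -> bytes:
    # Hash commitment: binding (secret can't be changed) and hiding.
    return hashlib.sha256(secret + nonce).digest()

def embed(content: bytes, mark: bytes) -> bytes:
    # Stand-in for the oblivious embedding step; in the real scheme the
    # merchant embeds the mark without ever seeing it.
    return content + mark

# Buyer: pick a secret and commit; the merchant records only the commitment.
secret = secrets.token_bytes(16)
nonce = secrets.token_bytes(16)
commitment = commit(secret, nonce)

content = b"the movie"
marked_copy = embed(content, secret)   # buyer's personalised copy

# Later an unauthorised copy surfaces; the mark is extracted and checked
# against the commitment (the buyer reveals the nonce to the court):
extracted = marked_copy[len(content):]
assert commit(extracted, nonce) == commitment  # the copy traces to the buyer
# The merchant, knowing only the commitment, cannot produce a matching
# (secret, nonce) pair to frame the buyer -- that would mean inverting
# the hash.
```

The point the paragraph makes is exactly this asymmetry: the evidence that convicts the buyer is something only the buyer could have put there.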

Perhaps schemes which combine both aspects (viewer privacy and
avoidance of need to trust at face value claims of the copy-marker)
can be built and deployed.

(With the caveat that though they can be built, they are largely
irrelevant as they will no doubt also be easily removable, and anyway
do not prevent the copying of the marked object under the real or
feigned claim of theft from the user whose identity is marked in the
object).


But anyway, my predictions about the impending collision between
privacy and the DRM and copy-protection legislative power-grabs stem
from the relationship of privacy to later redistribution, and the
observations that:

1) clearly copy protection doesn't and can't a priori prevent copying
and conversion into non-DRM formats (eg into MP3, DIVX);

2) once 1) happens, the media cartels have an interest to track
general file trading on the internet;

3) _but_ strong encryption and cryptographically enforced privacy mean
that the media cartels will ultimately be unsuccessful in this
endeavour.

4) _therefore_ they will try to outlaw privacy, impose escrowed
identity, internet passports, etc., and try to get cryptographically
assured privacy outlawed.  (Similar to the earlier push for key
escrow on encryption, but for media cartel interests instead of
signals intelligence special interests; and the media cartels are
also a powerful adversary.)

Also I note a slip in my earlier post [of Bear's post]:

| First post on this long thread that got it right.

Ross Anderson's comments were also right on the money (as always).

Adam
