Re: economics of DRM, was Re: Ross's TCPA paper
On Sat, Jul 13, 2002 at 10:59:23AM -0700, Eric Murray wrote: Microsoft does not do things simply because they enjoy being evil. They are not so worried about Linux (with its small share of the market) that they will spend mega-bucks now on a very long term project that might possibly let them keep it off some PCs in the far future. They _are_ concerned with getting paid for the 50% of their software that isn't paid for. There's a shitload of money there, and if getting at some of it costs a little, well, it's still more profit than they would have gotten otherwise. Isn't it much simpler for them to just write into their OS the ability to snitch on what M$ software was on the user's machine every time they go online? In fact, I've been assuming that everything from w98 on did exactly that. And wouldn't it be trivial for them to check for cracked serial numbers, or duplicate serial numbers? -- Harmon Seaver CyberShamanix http://www.cybershamanix.com
Re: Ross's TCPA paper
Eric Murray [EMAIL PROTECTED] writes: On Fri, Jul 12, 2002 at 07:14:55PM +1200, Peter Gutmann wrote: From a purely economic perspective, I can't see how this will fly. I'll pull a random figure of $5 out of thin air (well, I saw it mentioned somewhere but can't remember the source) as the additional manufacturing cost for the TCPA hardware components. Motherboard manufacturers go through redesigns in order to save cents in manufacturing costs, and they're expected to add $5 to their manufacturing cost just to help Microsoft manage its piracy problem? Motherboard makers don't pay for it. Microsoft pays for it. Hmm, I can just see it now, Windows 2005 ships as three CDs, a 400-page EULA, a fine-tip soldering iron, a magnifying glass, an EMBASSY chip, and a copy of SMD Soldering for Dummies. Peter.
Re: Ross's TCPA paper
On Fri, 5 Jul 2002, AARG!Anonymous wrote: ... / Right, and you can boot untrusted OS's as well. Recently there was discussion here of HP making a trusted form of Linux that would work with the TCPA hardware. So you will have options in both the closed source and open source worlds to boot trusted OS's, or you can boot untrusted ones, like old versions of Windows. The user will have more choice, not less. ... / Nonsense. Let us remember what Palladium is: Palladium is a system designed to enable a few large corporations and governments to run source-secret, indeed, well-encrypted, code on home users' machines in such a way that the home user cannot see, modify, or control the running code. The Orwellian, strictly Animal Farmish, claim runs: Why, it is all just perfectly OK, because anyone can run source-secret, well-encrypted, code in an uncontrolled manner on anyone's machine at will! We are all equal, it is just that some, that is, We the Englobulators, will in practice get to run source-secret, well-encrypted, code on hundreds of millions of users' machines, while you, you will never run such code on anybody else's machine except at a hobbyists' fair, precisely to demonstrate we are all equal. There are other advantages to Palladium: No free kernel will ever freely boot on a Palladium machine. And there is more. If Palladium is instituted: Microsoft will support the most vicious interpretation of the DMCA and press for passage of the SSSCA, in order that the first crack does not prove to the world that Palladium cannot prevent all copyright infringement. Microsoft will be able to say: See, it is these GNU/BSD/XFree/Sendmail/Apache/CLISP folk who are causing all this dreadful copyright infringement. Why, owning a non-Palladium machine should be declared, no, not illegal, we are not monsters after all, but probative evidence that the owner is an infringer, and more, a general infringer and a member of the Copyright Infringement Conspiracy. 
Why, some of them even write such code as the well known, and in CIC circles widely used, tool of infringement called 'cp'. Senator, I know you will be as shocked as I was when I learned what 'cp' stands for. It stands for 'copy'. And I do not mean safe Englobulator-Certified Fair Use Copying, such as is provided by the Triple X Box, which, for a reasonable license fee, allows up to six copy-protected copies to be made before settling of accounts and re-certification of the Box over the net. No, I mean raw, completely promiscuous copying of any file on the machine, as many times as the infringer wishes. Without record, without payment to the artist, without restraint. Senator, I prefer to call cp 'The Boston Strangler', because that is exactly what it is. And every single non-Palladium operating system in the world comes with cp already loaded, loaded and running. oo--JS.
Re: Ross's TCPA paper
At 09:43 PM 06/28/2002 +0200, Thomas Tydal wrote: Well, first I want to say that I don't like the way it is today. I want things to get better. I can't read e-books on my pocket computer, for example, which is sad since I actually would be able to enjoy e-books if I only could load them onto my small computer that follows me everywhere. You may not be able to read an Adobe(TM) Brand E-Book(TM), but that just means you'll need to buy electronic books from publishers that don't use that data format - whether it's raw ascii text or Palm-formatted text or PalmOS DRMware that you can also view on your PC using an emulator in glorious 160x160-pixel format :-) Of course, if your PC's home country of Nauru has Software Police implementing some local equivalent of the DMCA, that emulator that you need for debugging may be illegal. ... How good is Winamp if it can't play any music recorded in 2004 or later? Given that Windows Media Player can play all your tunes and it takes a reboot to switch to Winamp, who wouldn't stick with WMP?
Re: Ross's TCPA paper
-- On 5 Jul 2002 at 14:45, AARG! Anonymous wrote: Right, and you can boot untrusted OS's as well. Recently there was discussion here of HP making a trusted form of Linux that would work with the TCPA hardware. So you will have options in both the closed source and open source worlds to boot trusted OS's, or you can boot untrusted ones, like old versions of Windows. The user will have more choice, not less. Yes he will, but the big expansion of choice is for the seller of content and software, who will have more choices as to how he can cripple what he sells you. For example he can sell you music that will only play on a particular music player on your particular machine. But that is not enough to give the content industry what it wants, for someone can still break it on one machine, perhaps by intercepting the bitstream to the DA, and having broken it on one machine, can run it on all machines all over the internet. Break once, run everywhere. Microsoft has also been talking out of both sides of its mouth, by saying that this will also protect against break once, run everywhere. The only way that this can protect against break-once-run-everywhere is to reduce user choice, to make it mandatory that the user can only run government trusted software, and to reduce seller choice, prohibiting sellers from providing unacceptable software, such as Napster-like software. --digsig James A. Donald 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG XQJ33SB0W84Cm4Mw0+3lnN4nsUtaB4B6cIa1dP/2 2s67UXEL+Y5FHrr52MYArwzRuptDlBNVQIJOj/n/8
Re: Ross's TCPA paper
Hadmut Danisch writes: You won't be able to enter a simple shell script through the keyboard. If so, you could simply print protected files as a hexdump or use the screen (or maybe the sound device or any LED) as a serial interface. Since you could use the keyboard to enter a non-certified program, the keyboard is to be considered a nontrusted device. This means that you either * have to use a certified keyboard which doesn't let you enter bad programs * don't have a keyboard at all * or are not able to use shell scripts (at least not in trusted context). This means a strict separation between certified software and data. The latter is closest to what's intended in Palladium. Individual programs using Palladium features are able to prevent one another from reading their executing or stored state. You can write your own programs, but somebody else can also write programs which can process data in a way that your programs can't interact with. The Palladium security model and features are different from Unix, but you can imagine by rough analogy a Unix implementation on a system with protected memory. Every process can have its own virtual memory space, read and write files, interact with the user, etc. But normally a program can't read another program's memory without the other program's permission. The analogy starts to break down, though: in Unix a process running as the superuser or code running in kernel mode may be able to ignore memory protection and monitor or control an arbitrary process. In Palladium, if a system is started in a trusted mode, not even the OS kernel will have access to all system resources. That limitation doesn't stop you from writing your own application software or scripts. Interestingly, Palladium and TCPA both allow you to modify any part of the software installed on your system (though not your hardware). 
The worst thing which can happen to you as a result is that the system will know that it is no longer trusted, or will otherwise be able to recognize or take account of the changes you made. In principle, there's nothing wrong with running untrusted; particular applications or services which relied on a trusted feature, including sealed storage (see below), may fail to operate. Palladium and TCPA both allow an application to make use of hardware-based encryption and decryption in a scheme called sealed storage which uses a hash of the running system's software as part of the key. One result of this is that, if you change relevant parts of the software, the hardware will no longer be able to perform the decryption step. To oversimplify slightly, you could imagine that the hardware uses the currently-running OS kernel's hash as part of this key. Then, if you change the kernel in any way (which you're permitted to do), applications running under it will find that they're no longer able to decrypt sealed files which were created under the original kernel. Rebooting with the original kernel will restore the ability to decrypt, because the hash will again match the original kernel's hash. (I've been reading TCPA specs and recently met with some Microsoft Palladium team members. But I'm still learning about both systems and may well have made some mistakes in my description.) -- Seth Schoen Staff Technologist [EMAIL PROTECTED] Electronic Frontier Foundation http://www.eff.org/ 454 Shotwell Street, San Francisco, CA 94110 +1 415 436 9333 x107
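Seth's sealed-storage description can be sketched in miniature. This is a toy model only, not the TCPA or Palladium API: the chip secret, the XOR keystream cipher, and the function names are all invented for illustration; a real TPM keeps its storage root key on-chip and measures software into PCR registers rather than exposing anything like this interface.

```python
import hashlib
import hmac

# Toy model of sealed storage: the decryption key is derived from both a
# per-chip secret and a hash ("measurement") of the running kernel, so
# changing the kernel silently changes the key.
CHIP_SECRET = b"per-chip secret that never leaves the hardware"  # stand-in

def seal(data: bytes, kernel_image: bytes) -> bytes:
    """Encrypt data under a key bound to the current kernel's hash."""
    measurement = hashlib.sha256(kernel_image).digest()
    key = hmac.new(CHIP_SECRET, measurement, hashlib.sha256).digest()
    return bytes(d ^ k for d, k in zip(data, key))  # toy cipher: XOR keystream

def unseal(ciphertext: bytes, kernel_image: bytes) -> bytes:
    """Yields the plaintext only if the same kernel is running."""
    measurement = hashlib.sha256(kernel_image).digest()
    key = hmac.new(CHIP_SECRET, measurement, hashlib.sha256).digest()
    return bytes(c ^ k for c, k in zip(ciphertext, key))

original_kernel = b"kernel v1"
modified_kernel = b"kernel v1 (patched)"
sealed = seal(b"secret data", original_kernel)

print(unseal(sealed, original_kernel))   # recovers the plaintext
print(unseal(sealed, modified_kernel))   # garbage: the hash no longer matches
```

Note how rebooting the original kernel "restores" decryption for free: the key is recomputed from the measurement each time, so nothing needs to be migrated as long as the measurement comes out the same.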
RE: Ross's TCPA paper
Hadmut Danisch wrote: On Wed, Jul 03, 2002 at 10:54:43PM -0700, Bill Stewart wrote: At 12:59 AM 06/27/2002 -0700, Lucky Green wrote: I fully agree that the TCPA's efforts offer potentially beneficial effects. Assuming the TPM has not been compromised, the TPM should enable you to detect if interested parties have replaced your NIC with the rarer, but not unheard of, variant that ships out the contents of your operating RAM via DMA and IP padding outside the abilities of your OS to detect. It can? I thought that DMA was there to let you avoid bothering the CPU. The Alternate NIC card would need to have a CPU of its own to do a good job of this, but that's not hard. I don't think so. As far as I understood, the bus system (PCI,...) will be encrypted as well. You'll have to use a NIC which is certified and can decrypt the information on the bus. Obviously, you won't get a certification for such a network card. You won't and Bill won't. But those who employ such NICs will have no difficulty obtaining certification. But this implies other problems: You won't be able to enter a simple shell script through the keyboard. If so, you could simply print protected files as a hexdump or use the screen (or maybe the sound device or any LED) as a serial interface. Since you could use the keyboard to enter a non-certified program, the keyboard is to be considered a nontrusted device. This means that you either * have to use a certified keyboard which doesn't let you enter bad programs * don't have a keyboard at all * or are not able to use shell scripts (at least not in trusted context). This means a strict separation between certified software and data. Sure you can use shell scripts. Though I don't understand how a shell script will help you in obtaining a dump of the protected data, since your script has insufficient privileges to read the data. Nor can you give the shell script those privileges, since you don't have supervisor mode access to the CPU. 
How does your shell script plan to get past the memory protection? What am I missing? --Lucky
Re: Ross's TCPA paper
On Thu, Jul 04, 2002 at 10:54:34PM -0700, Lucky Green wrote: Sure you can use shell scripts. Though I don't understand how a shell script will help you in obtaining a dump of the protected data since your script has insufficient privileges to read the data. Nor can you give the shell script those privileges since you don't have supervisor mode access to the CPU. How does your shell script plan to get past the memory protection? That's why I was talking about a shell script (or take any other program to be interpreted). What needs to be certified: the shell or the shell script? The CPU doesn't recognize the shell script as a program; this is just some plain data entered through the keyboard, like writing a letter. A shell script is not a program, it is data entered at a program's runtime. This moves one step forward: The hardware (palladium chip, memory management, etc.) can check the binary program to be loaded. So you won't be able to run a compiled program and to access protected information. But once certified software is running, it takes input (reading mouse, keyboard, files, asking DNS, connecting servers,...). This input might cause (by interpretation, by bug, or however) the certified software to do certain things which do not comply with DRM requirements. At this stage, the running binary software itself is the instance to provide the DRM security, not the palladium memory management anymore. I agree that this is not yet an open sesame, but it shows that the game does not play out on the binary/memory management layer only. But who controls runtime input? History shows that M$ software is anything but able to deal with malicious input. That's why the world is using virus filters. That's nothing else than an external filter to keep malicious input from an attacker away from the running software. By analogy, Palladium might require the same: an input filter between attacker and running software. 
Since the attacker is sitting in front of the computer this time, this filter has to be applied to the user interface, keyboard and mouse. Maybe they'll install a filter between the keyboard and the software, thus building a certified keyboard which filters out any malicious key sequences. And maybe you can use your keyboard only if you have downloaded the latest patterns (like your daily virus filter update). I agree that this depends on the assumption that the certified software is not perfect and can't deal with arbitrary input. But that's reality. Hadmut
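Hadmut's interpreter argument fits in a few lines of code. This is a sketch under invented, hypothetical names (there is no such Palladium API): the load-time certification check constrains *which binary* runs, but says nothing about what that binary then does with attacker-supplied input.

```python
# Toy illustration: certifying the interpreter binary does not certify
# the computation it performs, because that computation is determined by
# untrusted runtime input. All names here are hypothetical.

CERTIFIED_BINARIES = {"shell-v1"}   # the only thing the hardware can check

def load_program(name):
    # The Palladium-style check happens here, at binary load time...
    if name not in CERTIFIED_BINARIES:
        raise AssertionError("refuse to load uncertified binary")
    return certified_shell

def certified_shell(script: str, protected_data: bytes) -> str:
    # ...but the certified binary then interprets arbitrary input.
    # Whether DRM policy is honoured now depends on this code handling
    # every possible script correctly, not on the load-time check.
    if script.strip() == "hexdump":
        # A policy violation, if the interpreter's author never intended
        # protected data to be exportable as text.
        return protected_data.hex()
    return "unknown command"

shell = load_program("shell-v1")
print(shell("hexdump", b"DRM'd content"))  # prints the protected bytes as hex
```

The enforcement point has moved from the memory-management layer into the application's input handling, which is exactly the layer that, as Hadmut notes, history suggests is the weakest.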
Re: Ross's TCPA paper
Seth Schoen writes: The Palladium security model and features are different from Unix, but you can imagine by rough analogy a Unix implementation on a system with protected memory. Every process can have its own virtual memory space, read and write files, interact with the user, etc. But normally a program can't read another program's memory without the other program's permission. The analogy starts to break down, though: in Unix a process running as the superuser or code running in kernel mode may be able to ignore memory protection and monitor or control an arbitrary process. In Palladium, if a system is started in a trusted mode, not even the OS kernel will have access to all system resources. Wouldn't it be more accurate to say that a trusted OS will not peek at system resources that it is not supposed to? After all, since the OS loads the application, it has full power to molest that application in any way. Any embedded keys or certs in the app could be changed by the OS. There is no way for an application to protect itself against the OS. And there is no need; a trusted OS by definition does not interfere with the application's use of confidential data. It does not allow other applications to get access to that data. And it provides no back doors for root or the system owner or device drivers to get access to the application data, either. At http://vitanuova.loyalty.org/2002-07-03.html you provide more information about your meeting with Microsoft. It's an interesting writeup, but the part about the system somehow protecting the app from the OS can't be right. Apps don't have that kind of structural integrity. A chip in the system cannot protect them from an OS virtualizing that chip. What the chip does do is to let *remote* applications verify that the OS is running in trusted mode. But local apps can never achieve that degree of certainty, they are at the mercy of the OS which can twiddle their bits at will and make them believe anything it wants. 
Of course a trusted OS would never behave in such an uncouth manner. That limitation doesn't stop you from writing your own application software or scripts. Absolutely. The fantasies which have been floating here of filters preventing people from typing virus-triggering command lines are utterly absurd. What are people trying to prove by raising such nonsensical propositions? Palladium needs no such capability. Interestingly, Palladium and TCPA both allow you to modify any part of the software installed on your system (though not your hardware). The worst thing which can happen to you as a result is that the system will know that it is no longer trusted, or will otherwise be able to recognize or take account of the changes you made. In principle, there's nothing wrong with running untrusted; particular applications or services which relied on a trusted feature, including sealed storage (see below), may fail to operate. Right, and you can boot untrusted OS's as well. Recently there was discussion here of HP making a trusted form of Linux that would work with the TCPA hardware. So you will have options in both the closed source and open source worlds to boot trusted OS's, or you can boot untrusted ones, like old versions of Windows. The user will have more choice, not less. Palladium and TCPA both allow an application to make use of hardware-based encryption and decryption in a scheme called sealed storage which uses a hash of the running system's software as part of the key. One result of this is that, if you change relevant parts of the software, the hardware will no longer be able to perform the decryption step. To oversimplify slightly, you could imagine that the hardware uses the currently-running OS kernel's hash as part of this key. Then, if you change the kernel in any way (which you're permitted to do), applications running under it will find that they're no longer able to decrypt sealed files which were created under the original kernel. 
Rebooting with the original kernel will restore the ability to decrypt, because the hash will again match the original kernel's hash. Yes, your web page goes into somewhat more detail about how this would work. This way a program can run under a secure OS and store sensitive data on the disk, such that booting into another OS will then make it impossible to decrypt that data. Some concerns have been raised here about upgrades. Did Microsoft discuss how that was planned to work, migrating from one version of a secure OS to another? Presumably they have different hashes, but it is necessary for the new one to be able to unseal data sealed by the old one. One obvious solution would be for the new OS to present a cert to the chip which basically said that its OS hash should be treated as an alias of the older OS's hash. So the chip would unseal using the old OS hash even when the new OS was running, based on the fact that this cert was
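The migration scheme speculated about above (the new OS presenting a cert that says "treat my hash as an alias of the old OS's hash") might look roughly like this toy model. The class and method names are invented for illustration, and a real chip would of course verify a signature chain on the cert rather than accept it blindly as this sketch does.

```python
import hashlib

def measure(os_image: bytes) -> bytes:
    """Stand-in for the hardware's measurement of the booted OS."""
    return hashlib.sha256(os_image).digest()

class ToyChip:
    """Toy sealed-storage chip with hypothetical upgrade-alias support."""
    def __init__(self):
        self.sealed = {}    # measurement -> data; stands in for sealed blobs
        self.aliases = {}   # new measurement -> old measurement

    def seal(self, running_os: bytes, data: bytes) -> None:
        self.sealed[measure(running_os)] = data

    def accept_alias_cert(self, new_os: bytes, old_os: bytes) -> None:
        # In reality the chip would demand a signature it trusts before
        # recording this equivalence; here we just record it.
        self.aliases[measure(new_os)] = measure(old_os)

    def unseal(self, running_os: bytes):
        m = measure(running_os)
        if m in self.sealed:
            return self.sealed[m]
        return self.sealed.get(self.aliases.get(m))  # fall back via alias

chip = ToyChip()
chip.seal(b"SecureOS 1.0", b"user keys")
print(chip.unseal(b"SecureOS 2.0"))   # None: hash doesn't match
chip.accept_alias_cert(b"SecureOS 2.0", b"SecureOS 1.0")
print(chip.unseal(b"SecureOS 2.0"))   # unseals via the alias
```

Who gets to issue such alias certs is the interesting policy question: whoever holds that signing key decides which OS versions are "the same" for sealing purposes.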
Re: Ross's TCPA paper
anonym n : Mr. and Mrs. John Smith when signed in a motel register. On Sun, Jun 30, 2002 at 09:55:58PM -0400, R. A. Hettinga wrote: More to the point, there is no such thing as an anonym, by definition. -- Barney Wolff I never met a computer I didn't like.
Re: Ross's TCPA paper
R. A. Hettinga wrote: At 12:06 AM +0100 on 7/1/02, Ben Laurie wrote: No, a pseudonym can be linked to stuff (such as reputation, publications, money). An anonym cannot. More to the point, there is no such thing as an anonym, by definition. Hmm. So present the appropriate definition? Cheers, Ben. -- http://www.apache-ssl.org/ben.html http://www.thebunker.net/ There is no limit to what a man can do or how far he can go if he doesn't mind who gets the credit. - Robert Woodruff
Re: Ross's TCPA paper
At 11:30 PM -0400 on 6/30/02, Barney Wolff wrote: anonym n : Mr. and Mrs. John Smith when signed in a motel register. No. Pseudonym(s). Subclass Alias. An anonym (literally, no name, right?) is not signing the book at all, and, thus, as nyms go, can't exist except in your mind. Somewhere St. Anselm is smiling... I'd be tempted to say that an anonym is its own antinym and thus can't exist, but that, as James Coburn said in a movie recently, would be just plain mean... :-). Cheers, RAH -- - R. A. Hettinga mailto: [EMAIL PROTECTED] The Internet Bearer Underwriting Corporation http://www.ibuc.com/ 44 Farquhar Street, Boston, MA 02131 USA ... however it may deserve respect for its usefulness and antiquity, [predicting the end of the world] has not been found agreeable to experience. -- Edward Gibbon, 'Decline and Fall of the Roman Empire'
Re: Ross's TCPA paper
My use of anonym was a joke. Sorry if it was too deadpan. But my serious point was that if a pseudonym costs nothing to get or give up, it makes one effectively anonymous, if one so chooses. On Mon, Jul 01, 2002 at 11:37:28AM +0100, Ben Laurie wrote: R. A. Hettinga wrote: At 12:06 AM +0100 on 7/1/02, Ben Laurie wrote: No, a pseudonym can be linked to stuff (such as reputation, publications, money). An anonym cannot. More to the point, there is no such thing as an anonym, by definition. Hmm. So present the appropriate definition? -- Barney Wolff I never met a computer I didn't like.
Anonyms, Pseudonyms, and Fists (was Re: Ross's TCPA paper)
-BEGIN PGP SIGNED MESSAGE- Hash: SHA1 At 11:37 AM +0100 on 7/1/02, Ben Laurie wrote: Hmm. So present the appropriate definition? Well, like I said, (and to be completely pedantic about it :-)), it seems to me that logically there's no such thing as an anonym even though you could do pseudonymous things that are, prima facie, and probably functionally, anonymous. The closest thing might be a string of single-use keys, pseudonyms, as we've said, or, in the Mr. and Mrs. Smith of motel register fame (or user cypherpunks, password writecode), everyone using the same key. To use a key, or name, as we (and now a dictionary somewhere, though my spell-check dictionary flags it :-)) have also said, is to create an *alternate* name or key for yourself, which is, by definition, a pseudonym, even if it is used once, and unlinked to any other event somehow. And, to throw a curve into the whole discussion, there's also the fist everyone uses on the net, like the fist that people had when keying Morse Code. Or, more recently, the words, syntax, semantics, concordance, whatever, that they use when writing or talking. That stuff has been used in literature -- to apparent lesser effect more recently with Shakespeare, and to greater effect with Joe Klein, for instance. Or the way we buy things in an electronic market, or by mousing around the web. That kind of stuff, as Carl Ellison has noted, is probably as good a biometric as there might ever be, given enough data, so certainly a persistent pseudonym can't be anonymous in the sense of unlinked behavior to itself. Frankly, since we still live in a world of physical IP addresses, and apparently, given the ZKS experience, a still uneconomical way of mixing those addresses, traffic analysis, as usual, is still quite a bitch. Only when we can change the economics of pseudonymity will we have anything approaching anonymity, in other words. 
If it's cheaper to do things anonymously -- especially financial things, which are at the core of most traceable, most linkable, literally accountable, transparent activity, right now -- then we'll get closer and closer to anonymity. So, maybe there isn't such a thing as an anonym, even though we know what anonymity is. We can make generalizations about anonymity all the time. The ultimate generalization being that anonymity, like security and cryptography themselves, is more of an economic asymptote than anything else. Something like perfection; as Anselm said in trying to prove the existence of God before the concept of calculus and limits would have shown him the error of his ways :-), something that we can conceive in our mind, if not actually see in reality. We can probably get close enough to be free, however, even in a world of ubiquitous optical supervision of private property. Dramatically freer than we are now, certainly, which is all that matters. Cheers, RAH -BEGIN PGP SIGNATURE- Version: PGP 7.5 iQA/AwUBPSBYzMPxH8jf3ohaEQKAVACfYeUm0QMu3PIcj9IacILb4S5t87AAoIZJ B51jtZMJN0l+bOITjKVqK5Rn =dZrT -END PGP SIGNATURE- -- - R. A. Hettinga mailto: [EMAIL PROTECTED] The Internet Bearer Underwriting Corporation http://www.ibuc.com/ 44 Farquhar Street, Boston, MA 02131 USA ... however it may deserve respect for its usefulness and antiquity, [predicting the end of the world] has not been found agreeable to experience. -- Edward Gibbon, 'Decline and Fall of the Roman Empire'
Re: Ross's TCPA paper
Barney Wolff wrote: My use of anonym was a joke. Sorry if it was too deadpan. But my serious point was that if a pseudonym costs nothing to get or give up, it makes one effectively anonymous, if one so chooses. Well, yeah, I'd say that single-use pseudonyms are, in fact, the definition of anonyms. Zero cost is not required, of course, except to make anonymity, err, zero cost. Cheers, Ben. -- http://www.apache-ssl.org/ben.html http://www.thebunker.net/ There is no limit to what a man can do or how far he can go if he doesn't mind who gets the credit. - Robert Woodruff
Re: Ross's TCPA paper
[Repost] Bear writes: A few years ago merchants were equally adamant and believed equally in the rightness of maintaining their right to not do business with blacks, chicanos, irish, and women. It'll pass as people wake up and smell the coffee. Unfortunately that won't be until after at least a decade of really vicious abuses of private data by merchants who believe in their god-given right to snoop on their customers. My God, how low the cypherpunk list has sunk. Here we have someone not only demanding that merchants be forced to deal with pseudonymous customers, he invokes civil rights laws to support his argument! Where's Tim May when we need him? His racism is odious but at least he's not trying to force other people to follow his beliefs. I'm sure he'd have a thing or two to say about our wonderful civil rights laws and Bear's proposal to extend similar regulations to cyberspace. Here's a clue, Mr. Bear. The cypherpunks list was founded on the principle that cyberspace can enhance freedom, and that includes freedom to associate with whomever you choose. Racism is evil, but the solution must lie in people's hearts. Pointing a gun at them and forcing them to act in a politically correct manner (which is what civil rights regulations really do) is no solution to the problem. So yeah, I think that the right to privacy implies the right to use a pseudonym. For any non-fraudulent purpose, including doing business with merchants who don't know it's a pseudonym. And I think that's a constitutional right, whether the merchants happen to like it or not... And of course any reference to the constitution betrays utter cluelessness when talking on an international mailing list about technology which spans national borders. Unless you are prepared to be bound by the Iraqi constitution, Mr. Bear, don't ask us to be governed by yours.
Re: Ross's TCPA paper
-- On 1 Jul 2002 at 15:06, Tim May wrote: I have strong views on all this DRM and TCPA stuff, and especially on the claim that some form of DRM is needed to prevent government from taking over control of the arts. But we said everything that needed to be said _years_ ago. No point in repeating the same points. No, it does need to be said again. You cannot merely do a copy and paste from the Cyphernomicon. You will find it necessary to do a copy and paste from the Cyphernomicon, followed by several global search and replaces and a small amount of new material referring to current events. Palladium, as described by Microsoft, is actually a pretty cool idea that would be useful for quite a few cypherpunkly projects. When Microsoft gave its description of Palladium, there were a few caveats and maybes that to me sounded as if they were saying Well our hearts are in the right place, this is the way it will be if only it was not going to be the way that it actually is going to be. Unfortunately it is being introduced at the same time as there is legislation proposed, the SSSCA, to outlaw general purpose computers, turning them into set top boxes, and license software engineers, so that only a small number of specially privileged people will be permitted access to general purpose computers. This timing creates a reasonable suspicion that Palladium is in fact a stalking horse for that project, a preparation for a slightly more acceptable variant of the SSSCA. --digsig James A. Donald 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG jJb9+mkN3R59T+7qqwbaNl6DlnXtC7susSRKhpeg 2XCDBLPYrZ4/b3EazgN2sjfbch9lCok9wmcWkHl6X
Re: Ross's TCPA paper
On Monday, July 1, 2002, at 02:23 PM, Anonymous wrote: [Repost] Bear writes: A few years ago merchants were equally adamant and believed equally in the rightness of maintaining their right to not do business with blacks, chicanos, irish, and women. It'll pass as people wake up and smell the coffee. Unfortunately that won't be until after at least a decade of really vicious abuses of private data by merchants who believe in their god-given right to snoop on their customers. My God, how low the cypherpunk list has sunk. Here we have someone not only demanding that merchants be forced to deal with pseudonymous customers, he invokes civil rights laws to support his argument! Where's Tim May when we need him? I'm right here. But you have missed something very important: Bear did not write that article for the _Cypherpunks_ list. It was one of many articles cross-posted between the _Cryptography_ list and the _Cypherpunks_ list and even some of Hettinga's many lists. Here are the headers: From: bear [EMAIL PROTECTED] Date: Sat Jun 29, 2002 10:03:33 PM US/Pacific To: Barney Wolff [EMAIL PROTECTED] Cc: '[EMAIL PROTECTED] ' [EMAIL PROTECTED], '[EMAIL PROTECTED] ' [EMAIL PROTECTED] Subject: Re: Ross's TCPA paper Here's a clue, Mr. Bear. The cypherpunks list was founded on the principle that cyberspace can enhance freedom, and that includes freedom to associate with whomever you choose. Racism is evil, but the solution must lie in people's hearts. Pointing a gun at them and forcing them to act in a politically correct manner (which is what civil rights regulations really do) is no solution to the problem. Bear left the Cypherpunks list a long while ago, citing fundamental disagreements. Cryptography was formed as a putatively apolitical list. Apparently this is no longer so, and its politics are at odds with the main themes on our list. (I believe this partly comes about precisely _because_ it supposedly has no political compass heading.) 
I have strong views on all this DRM and TCPA stuff, and especially on the claim that some form of DRM is needed to prevent government from taking over control of the arts. But we said everything that needed to be said _years_ ago. No point in repeating the same points. --Tim May Dogs can't conceive of a group of cats without an alpha cat. --David Honig, on the Cypherpunks list, 2001-11
Re: Ross's TCPA paper
A pseudonym that I can give up at will and that can never afterwards be traced to me is equivalent to an anonym. I'm not suggesting that anonymity be outlawed, or that every merchant be required to reject anonymous or pseudonymous customers. All I'm suggesting is that small merchants MUST NOT be required to accept such customers. On Sun, Jun 30, 2002 at 08:38:29AM -0700, bear wrote: On Sun, 30 Jun 2002, Barney Wolff wrote: The trouble I have with this is that I'm not only a consumer, I'm also a merchant, selling my own professional services. And I just will not, ever, perform services for an anonymous client. That's my choice, and the gov't will take it away only when they can pry it from my cold dead fingers. :) Are you one of those who makes no distinction between anonymity and pseudonymity? 'Cause I've been talking about pseudonymity, and all your answers have been talking about anonymity. Bear -- Barney Wolff I never met a computer I didn't like.
Re: Ross's TCPA paper
On Sun, 30 Jun 2002, Barney Wolff wrote: The trouble I have with this is that I'm not only a consumer, I'm also a merchant, selling my own professional services. And I just will not, ever, perform services for an anonymous client. That's my choice, and the gov't will take it away only when they can pry it from my cold dead fingers. :) Are you one of those who makes no distinction between anonymity and pseudonymity? 'Cause I've been talking about pseudonymity, and all your answers have been talking about anonymity. Bear
Re: Ross's TCPA paper
On Sat, Jun 29, 2002 at 10:03:33PM -0700, bear wrote: ... I won't give up the right NOT to do business with anonymous customers, or anyone else with whom I choose not to do business. A few years ago merchants were equally adamant and believed equally in the rightness of maintaining their right to not do business with blacks, chicanos, irish, and women. It'll pass as people wake up and smell the coffee. Unfortunately that won't be until after at least a decade of really vicious abuses of private data by merchants who believe in their god-given right to snoop on their customers. The trouble I have with this is that I'm not only a consumer, I'm also a merchant, selling my own professional services. And I just will not, ever, perform services for an anonymous client. That's my choice, and the gov't will take it away only when they can pry it from my cold dead fingers. :) It's not that I hate my govt, although I liked it a whole lot better before 1/20/01, but I will not risk aiding and abetting criminality, even if I can pretend I don't know I'm doing it. Oh by the way, last time you visited your favorite kinky sex shop, didn't you notice the surveillance camera in the corner? And didn't you see the cashier at your ${house_of_worship} last ${sabbath}? The right to anonymity seems to be a new one, not a traditional one that we're about to lose. It may be a needed defense against the ever-increasing ability to correlate data. All I'm really railing against is the notion that just because I'm selling something I MUST accept your anonymity. ... I don't see any way that DRM addresses the privacy concern of database linking. Especially since I expect database linking to be done using specialized software that doesn't have to get inspected by anybody with a motive to prevent it. I certainly agree that DRM cannot protect against privacy violations by a user with access rights. 
The whole issue of database correlation and anonymity was insightfully explored by Heinlein in The Moon is a Harsh Mistress in 1966. -- Barney Wolff I never met a computer I didn't like.
Re: Ross's TCPA paper
On Sun, 30 Jun 2002, Barney Wolff wrote: A pseudonym that I can give up at will and that can never afterwards be traced to me is equivalent to an anonym. Actually, I don't have a problem with it being traced afterwards, if a crime has been committed and there's a search warrant or equivalent to trace it in order to further the investigation of a specific crime. And that's a pseudonym, not anonymity. My problem is that if merchants' information is easily linkable, or if several merchants have access to the same linkable field, then privacy is out the window. It's reasonable for a merchant to know every deal I've ever done with him (pseudonymity). It's not reasonable for a merchant to know nothing at all about my past dealings with anyone including himself (anonymity), nor for a merchant to know every deal I've done in my life, with everyone (marketing databases based on linkable IDs). Ray
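Ray's three-way distinction (stable history with one merchant, but no field linkable across merchants) can be sketched with a keyed hash. This is purely an illustrative sketch, not a mechanism anyone in the thread proposed; the merchant names and the derivation scheme are invented for the example.

```python
import hashlib
import hmac

def merchant_pseudonym(master_secret: bytes, merchant_id: str) -> str:
    """Derive a stable per-merchant pseudonym from a customer-held secret.

    The same customer always presents the same ID to the same merchant,
    so the merchant can track its own dealings (pseudonymity), but the
    IDs presented to different merchants share no linkable field.
    """
    return hmac.new(master_secret, merchant_id.encode(), hashlib.sha256).hexdigest()

# The master secret never leaves the customer's hands.
secret = b"customer-held master secret (illustrative)"

bookstore_id = merchant_pseudonym(secret, "bookstore.example")
pharmacy_id = merchant_pseudonym(secret, "pharmacy.example")

assert bookstore_id == merchant_pseudonym(secret, "bookstore.example")  # stable per merchant
assert bookstore_id != pharmacy_id  # no common identifier to correlate on
```

Without the master secret, parties comparing the two databases have no common column to join on; with a single shared ID (the "linkable field" Ray objects to), that join is trivial.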
Re: Ross's TCPA paper
On Mon, 24 Jun 2002, Anonymous wrote: The important thing to note is this: you are no worse off than today! You are already in the second state today: you run untrusted, and none of the content companies will let you download their data. But bootlegs are widely available. The problem is that the analog hole is how we debug stuff. When our speakers don't sound right, we tap the signal, put it on an oscilloscope so we can see what's wrong, correct the drivers, and try again. When our monitor can't make sense of the video signal, it's different equipment but the same idea. When you encrypt all the connections to basic display hardware, as proposed in Palladium, it means nobody can write drivers or debug hardware without a million-dollar license. And if you do fix a bug so your system works better, your system's trusted computing system will be shut down. Not that that's any great loss. Likewise, encrypted instruction streams mean you don't know what the hell your CPU is doing. You would have no way to audit a program and make sure it wasn't stealing stuff from you or sending your personal information to someone else. Do we even need to recount how many abuses have been foisted on citizens to harvest marketing data, and exposed after-the-fact by some little-known hero who was looking at the assembly code and went, Hey, look what it's doing here. Why is it accessing the passwords/browser cache/registry/whatever? Do we want to recount how many times personal data has been exported from customers' machines by adware that hoped not to be noticed? Or how popup ads get downloaded by software that has nothing to do with what website people are actually looking at? I don't want to give vendors a tunnel in and out of my system that I can't monitor. I want to be able to shut it down and nail it shut with a hardware switch. 
I don't want to ever run source code that people are so ashamed of that they don't want me to be able to check and see what it does; I want to nail that mode of my CPU off so that no software can turn it on EVER. I'll skip the digital movies if need be, but to me trusted computing means that *I* can trust my computer, not that someone else can. Bear
Re: Ross's TCPA paper
Yes, this is a debate I've had with the medical privacy guys, some of whom like the idea of using Palladium to protect medical records. This is a subject on which I've a lot of experience (see my web page), and I don't think that Palladium will help. Privacy abuses almost always involve abuse of authorised access by an insider. Recent case: 15-year old girl in Croydon, England, gets termination of pregnancy without telling her mother. This is reported to the local health authority, where her uncle works; he sees the report and tells the family. Palladium doesn't help here. Even if the uncle is constrained by the Fritz chip from doing anything other than look at the screen, he still has the information. The fix for this problem is anonymous reporting, with the identity of the girl known only to the treating physician. It is a policy issue, not a technology issue; if technology such as Palladium is introduced it will most likely be by health authorities trying to find an excuse to retain access to data that they shouldn't have in the first place. (We've seen a similar effect with smartcards in healthcare, and in fact the general phenomenon has an interesting similarity with what the environmental economists call the `social reward trap': making `green' goods available often increases pollution as people consume green goods rather than consuming less.) Ross
Re: Ross's TCPA paper
On Wed, 26 Jun 2002, Barney Wolff wrote: Do you really mean that if I'm a business, you can force me to deal with you even though you refuse to supply your real name? Not acceptable. I don't think that privacy (in the sense of having the right to keep private details of your life from being linked for use unauthorized by you) is ever going to happen if merchants have the right to demand true identities. As a merchant, you have the right to be paid and to be sure of your payment. I don't think you have the right to collect data that you can correlate with every public and business record in the universe and build a profile linked to my identity that says what brand of breakfast cereal I eat, how much a month I spend on sex toys, what kind of books I read, and whether I'm in trouble in divorce court. The problem is that there is no way to check what merchants do with the data once they've got it; customers are prevented from getting into the customer databases and finding out what a merchant's got on them. Merchants have no motive whatsoever to police or restrain their actions in invasion of privacy, and they have a financial motive to link data - so there is no reason to believe that DRM stuff on consumer machines is going to apply to their data handling in the least. I just don't see any possible application of DRM that merchants would allow that protects consumer privacy. So yeah, I think that the right to privacy implies the right to use a pseudonym. For any non-fraudulent purpose, including doing business with merchants who don't know it's a pseudonym. 
And I think that's a constitutional right, whether the merchants happen to like it or not, just like the right to eat in a restaurant even if the manager don't like colored folks, or picket outside a merchant's business on public property seeking redress of grievances, or tell the truth about a merchant even if it's not flattering to him, or otherwise exercising ordinary civil rights the merchant might prefer you didn't. You can't have privacy without the option of pseudonymity, any more than you can have bread without flour. I won't give up the right NOT to do business with anonymous customers, or anyone else with whom I choose not to do business. A few years ago merchants were equally adamant and believed equally in the rightness of maintaining their right to not do business with blacks, chicanos, irish, and women. It'll pass as people wake up and smell the coffee. Unfortunately that won't be until after at least a decade of really vicious abuses of private data by merchants who believe in their god-given right to snoop on their customers. The point about DRM, if I understand it, is that you could disclose your information to me for certain purposes without my being able to make use of it in ways you have not agreed to. At least in theory. But this debate appears largely to ignore differences in the number of bits involved. To violate your privacy I can always take a picture of my screen with an old camera, or just read it into a tape-recorder. I can't do that effectively with your new DVD without significant loss of quality. Understand that I don't really give a flying crap about the DVD player; if I want a nice movie, I'll get together with some buddies and make one. And I'll let anybody who wants to watch it download it. 
What I want is the right to prevent my customer records at the bookstore from being correlated with the customer records at my doctor, my dentist, my insurance agent, my therapist, my attorney, my grocery store, my pharmacist, the comics shop, the sex-toy shop, the car dealership, the art gallery, the stained-glass place, the computer store, the video-rental place, my favorite restaurants, and my travel agent, and sold as a nice totally invasive bundle back to the marketing databases of all of the above. This is not a question about number of bits. I figure the database will have an efficient, no-nonsense representation of all of these things, and a photo of the screen, if it can be scanned back, is just as good as a binary copy. I don't see any way that DRM addresses the privacy concern of database linking. Especially since I expect database linking to be done using specialized software that doesn't have to get inspected by anybody with a motive to prevent it, on professional (Non-DRM) machines if necessary. Bear
Re: Ross's TCPA paper
On 27 Jun 2002, David Wagner wrote: No, it's not. Read Ross Anderson's article again. Your analysis misses part of the point. Here's an example of a more problematic vision: you can buy Microsoft Office for $500 and be able to view MS Office documents; or you can refrain from buying it and you won't be able to view MS Office documents. Do you see why this is problematic? It lets one vendor lock the world into a monopoly; no one else will be able to develop compatible MS Word viewers without the consent of Microsoft. (StarOffice on Linux won't work, because to get the session key to decrypt the Word document your viewer has to go online to microsoft.com and ask for it, but microsoft.com won't give you the key unless you've bought a secure trusted OS and purchased Microsoft Office for $500.) Now notice that the same idea can be used to inhibit competition in just about any computer market, and I hope you appreciate Ross's point. TCPA/DRM has the potential for anti-competitive effects, and we may well end up worse off than we are today. As long as MS Office isn't mandated by law, who cares? So what: somebody sends me a file. I tell them I can't read it. Now, they have a choice, they can give me MS Office or they can send me ascii. The market will determine if secure OS's are useful. DRM isn't the problem. Legislating DRM is the problem. You can go buy IBM portables with secure key chips built in right now to help protect your box and your business data. That's TCPA. Nothing wrong with it, it's a good idea. It doesn't become wrong until it becomes forced down our throats. That's where S.2048 becomes something to worry about, it forces us to use hardware we don't need (or may not need for our purposes). TCPA and DRM are not the problem here, and privacy and copyright are side issues too. There is no need for the law to intervene, the market will decide how all this stuff can be used efficiently and effectively. 
And that's what the entertainment industry needs to figure out and fast too. The law is slow. Technology is fast. Patience, persistence, truth, Dr. mike
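Wagner's StarOffice example above can be made concrete with a toy model of remote attestation: a vendor's key server releases a document's decryption key only to clients that present a "quote" proving they run the vendor-approved software stack. Everything here (the server, the quote format, the stack and key names) is invented for illustration of the lock-in mechanism, and is not taken from any real TCPA interface.

```python
import hashlib
import hmac

# Illustrative secret shared with approved platforms' security chips,
# and the one software stack the vendor is willing to release keys to.
VENDOR_ATTESTATION_KEY = b"secret held by approved platforms (illustrative)"
APPROVED_STACK = "TrustedOS-1.0+OfficeViewer-1.0"

def make_quote(stack: str, key: bytes) -> str:
    """What a compliant platform would produce: a MAC over the measured stack."""
    return hmac.new(key, stack.encode(), hashlib.sha256).hexdigest()

def key_server(stack: str, quote: str):
    """Release the document session key only for the approved, attested stack."""
    expected = make_quote(APPROVED_STACK, VENDOR_ATTESTATION_KEY)
    if stack == APPROVED_STACK and hmac.compare_digest(quote, expected):
        return b"document-session-key"
    return None  # competing viewers get nothing, however correct their code

# An approved client gets the key; an independent viewer (Wagner's
# StarOffice-on-Linux case) cannot forge a quote without the secret.
good = key_server(APPROVED_STACK, make_quote(APPROVED_STACK, VENDOR_ATTESTATION_KEY))
bad = key_server("Linux+StarOffice", make_quote("Linux+StarOffice", b"guess"))
assert good is not None and bad is None
```

The anti-competitive effect falls out of the last two lines: interoperability is refused not because the rival viewer is broken, but because it cannot attest to the vendor's chosen stack.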
RE: DRMs vs internet privacy (Re: Ross's TCPA paper)
Adam Back wrote: I don't mean that you would necessarily have to correlate your viewing habits with your TrueName for DRM systems. Though that is mostly (exclusively?) the case for currently deployed (or at least implemented with a view of attempting commercial deployment) copy-mark (fingerprint) systems, there are a number of approaches which have been suggested, or could be used, to have viewing privacy. The TCPA specs were carefully designed to permit the user to obtain multiple certificates from multiple CA's and thus (if, and that's a big if, the CA's don't collude and furthermore indeed discard the true-name identities of the customer) utilize multiple separate identities for various online applications. I.e., the user could have one cert for their True Name, one used to enable Microsoft Office, and one to authenticate the user to other online services. It is very much the intent of the TCPA to permit the use of pseudonymous credentials for many, if not most, applications. Otherwise, the TCPA's carefully planned attempts at winning over the online liberty groups would have been doomed from the start. --Lucky Green
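Lucky's multiple-certificate scheme can be illustrated at its simplest: one independent credential per application context, so two relying parties comparing notes find no common identifier. This toy sketch abstracts the CAs away entirely; as Lucky notes, in the real TCPA design unlinkability also hinges on the CAs neither colluding nor retaining the true-name mapping.

```python
import secrets

class PseudonymousUser:
    """Toy model: one independent credential per application context.

    Each identity is fresh random material, so services that compare
    their records find no shared field linking them back to one user.
    """
    def __init__(self):
        self._identities = {}

    def identity_for(self, context: str) -> str:
        # Mint a new random identity the first time a context is seen,
        # then reuse it, so each service sees one stable pseudonym.
        if context not in self._identities:
            self._identities[context] = secrets.token_hex(16)
        return self._identities[context]

user = PseudonymousUser()
office_id = user.identity_for("office-activation")
forum_id = user.identity_for("online-forum")

assert office_id != forum_id                                 # unlinkable across contexts
assert office_id == user.identity_for("office-activation")   # stable within one context
```

The True Name certificate in Lucky's description would simply be one more context among these, used only where identification is actually required.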
RE: Ross's TCPA paper
David wrote: It's not clear that enabling anti-competitive behavior is good for society. After all, there's a reason we have anti-trust law. Ross Anderson's point -- and it seems to me it's one worth considering -- is that, if there are potentially harmful effects that come with the beneficial effects, maybe we should think about them in advance. I fully agree that the TCPA's efforts offer potentially beneficial effects. Assuming the TPM has not been compromised, the TPM should enable you to detect whether interested parties have replaced your NIC with the rarer, but not unheard of, variant that ships out the contents of your operating RAM via DMA and IP padding, outside the abilities of your OS to detect. However, enabling platform security, as much as might be stressed otherwise by the stakeholders, has never been the motive behind the TCPA. The motive has been DRM. Does this mean that one should ignore the benefits that TCPA might bring? Of course not. But it does mean that one should carefully weigh the benefits against the risks. --Lucky Green
Re: Ross's TCPA paper
From: [EMAIL PROTECTED] As a side note, it seems that a corporation would actually have to demonstrate that I had seen and agreed to the thing and clicked acceptance. Prior to that point, I could reverse engineer, since there is no statement that I cannot reverse engineer agreed to. So what would happen if I reverse engineered the installation so that the agreement that was displayed stated that I could do what I liked with the software? Ok, so there would be no mutual intent, but on the other hand, there would also be no agreement on the click-through agreement either. I have an application that replaces the caption on the I agree button to your liking; I wrote it exactly because of this reasoning. http://picosoft.freeservers.com/NoLicense.htm Of course, it's a stupid little program, I'm sure anyone can come up with something better in no time... BTW, for any lawyers around here - shouldn't the mere existence of this program be enough to blow up the idea that you agreed to the click-through stuff? Mark
RE: Ross's TCPA paper
On Thu, 27 Jun 2002, Lucky Green wrote: David wrote: It's not clear that enabling anti-competitive behavior is good for society. After all, there's a reason we have anti-trust law. Ross Anderson's point -- and it seems to me it's one worth considering -- is that, if there are potentially harmful effects that come with the beneficial effects, maybe we should think about them in advance. I fully agree that the TCPA's efforts offer potentially beneficial effects. Assuming the TPM has not been compromised, the TPM should enable you to detect whether interested parties have replaced your NIC with the rarer, but not unheard of, variant that ships out the contents of your operating RAM via DMA and IP padding, outside the abilities of your OS to detect. However, enabling platform security, as much as might be stressed otherwise by the stakeholders, has never been the motive behind the TCPA. The motive has been DRM. Does this mean that one should ignore the benefits that TCPA might bring? Of course not. But it does mean that one should carefully weigh the benefits against the risks. --Lucky Green I don't see DRM as anti-competitive, I see it as a road block. The French government just signed a contract to put Linux into many of their service machines to help people get data into and out of the government (and I bet there's a lot!). A Microsoft DRM file won't work there, so Microsoft is screwed. The majority of people and businesses want to do things as cheaply as possible. The whole reason Microsoft has gotten as big as they are is because they are cheap. That they happen to be crappy too didn't bother most people; compared to a Sun or DEC workstation, a PC running DOS or WinXX was a factor of 10 cheaper. Controlling secrets for use within a company is what most companies want. The TCPA helps solve that problem, and if Microsoft can sell them something that does it cheaply, they'll happily buy it. 
The line gets crossed when Hollywood wants to sell movies over the net, and they realize all those bits can be sent by anyone, anywhere, anytime once they have them. For Hollywood to mandate that all platforms and devices protect their IP is insane, and we need to make sure it doesn't happen. However, we can build very special devices that connect directly to Hollywood to play their stuff. If somebody steals it, then it's out and there's not much they can do. Most people won't want to do that - the special boxes can be cheap enough that it's not worth the effort. These special boxes are also TCP, but they are not general computing platforms - they are special movie playing or music playing platforms. So technology can be made so we all win - IP is normatively protected, PC's are generic, and consumers and business get solutions that are low cost. It's an economic win too because guys like me get more work building more boxes :-) Certainly there will be people who could tap into a special box and transfer the data to the general net and make it work on a general PC. They will be called thieves and eventually be apprehended. If Hollywood has any brains, these guys will have a lot of work to do. People still counterfeit money too - but they usually lose money!! There are lots of solutions here. The law is not one of them. There is more than enough applicable law to use, and anyone who tries to force their solution down everyone's throat can be taken in for anti-trust violations. I see the risk as being too much law and fixed technology. DRM and TCP are useful tools, they should not be forged into weapons. Patience, persistence, truth, Dr. mike
Re: Ross's TCPA paper
Peter D. Junger wrote: That isn't the reason why a click-through agreement isn't enforceable---the agreement could, were it enforceable, validly forbid reverse engineering for any reason, and that clause would in most cases be upheld. Not in Europe though. EU directive 91/250/EEC on the legal protection of computer programs makes provision for reverse engineering for interoperability. In Britain this was incorporated into domestic law by the Copyright (Computer Programs) Regulations 1992: http://www.hmso.gov.uk/si/si1992/Uksi_19923233_en_1.htm See in particular s.50B(4) which the regulations added to the Copyright, Designs and Patents Act 1988. (And in the actual case involving Linux and DVD players there was no agreement not to circumvent the technological control measures in DVD's; the case was based on the theory that the circumvention violated the Digital Millennium Copyright Act.) The American cases were, but the European case of course wasn't. The DMCA doesn't apply over here, though we have something similar in the works. I think lawyers will hate this. I don't see why we should. We don't hate the law of gravity or the law of large numbers. You should hate it. :-) It is appropriate for the legislature to decide which acts are restricted by copyright and which are not. The DMCA and similar legislation hands that right to private organisations. To some extent anti-trust law guards against the worst abuses, but it is more appropriate for the boundaries of copyright to be set by our elected representatives. BTW, I have been thinking for a while about putting together a UK competition complaint about DVD region coding. No promises that anything will happen quickly. On the other hand, if people offer help (or just tell me that they think it is a worthwhile thing to do) it will probably move faster. -- Pete
RE: Ross's TCPA paper
Privacy abuse is first and foremost the failure of a digital rights management system. A broken safe is not evidence that banks shouldn't use safes. It is only an argument that they shouldn't use the safe that was broken. I'm hard pressed to imagine what privacy without DRM looks like. Perhaps somebody can describe a non-DRM privacy management system. On the other hand, I easily can imagine how I'd use DRM technology to manage my privacy. Yes, it would be nice if we didn't need safes, but until we don't, I'll use one. You can choose not to use DRM to manage your privacy, but like stacking your money on your front porch, you don't get to grump if people take it. It's called contributory negligence, I believe. Cheers, Scott -Original Message- From: Ross Anderson To: [EMAIL PROTECTED] X-Orig-To: Dan Geer Cc: [EMAIL PROTECTED]; [EMAIL PROTECTED]; [EMAIL PROTECTED]; [EMAIL PROTECTED] Sent: 6/25/02 11:56 AM Subject: Re: Ross's TCPA paper I don't believe that the choice is both privacy and TCPA, or neither. Essentially all privacy violations are abuses of authorised access by insiders. Your employer's medical insurance scheme insists on a waiver allowing them access to your records, which they then use for promotion decisions. The fix is fundamentally legislative: that sort of behaviour is generally illegal in Europe, but tolerated in the USA. There may be symmetry when we consider the problem as theoretical computer scientists might, as an issue for abstract machines. This symmetry breaks rapidly when the applications are seen in context. As well as the legal aspects, there are also the economic aspects: most security systems promote the interests of the people who pay for them (surprise, surprise). So I do not agree with the argument that we must allow DRM in order to get privacy. Following that line brings us to a world in which we have DRM, but where the privacy abuses persist just as before. 
There is simply no realistic prospect of American health insurers or HMOs settling for one-time read-only access to your medical records, no matter how well that gets implemented in Palladium. Ross - The Cryptography Mailing List Unsubscribe by sending unsubscribe cryptography to [EMAIL PROTECTED]
Re: Ross's TCPA paper
On Wed, Jun 26, 2002 at 10:01:00AM -0700, bear wrote: As I see it, we can get either privacy or DRM, but there is no way on Earth to get both. [...] Hear, hear! First post on this long thread that got it right. Not sure what the rest of the usually clueful posters were thinking! DRM systems are the enemy of privacy. Think about it... strong DRM requires legal enforcement, as DRM is not technically enforceable: all bit streams can be re-encoded from one digital form to another (CD to MP3, DVD to DivX), encrypted content streams out to the monitor / speakers can be subjected to scrutiny by hardware hackers to get at the digital content, or the analog output can be reconverted back to digital in high fidelity. So I agree with Bear, and re-iterate the prediction I make periodically that the ultimate conclusion of the direction of the DRM laws being pursued by the media cartels will be to attempt to get legislation directly attacking privacy. This is because strong privacy (cryptographically protected privacy) allows people to exchange bit-strings with limited chance of being identified. As the arms race between the media cartels and DRM cohorts continues, file sharing will start to offer privacy as a form of protection for end-users (e.g. freenet has some privacy related features, several others involve encryption already). Donald Eastlake wrote: | There is little *technical* difference between your doctor's records | being passed on to assorted insurance companies, your boss, and/or | tabloid newspapers and the latest Disney movies being passed on from a | country where it has been released to people/theaters in a country | where it has not been released. There is lots of technical difference. When was the last time you saw your doctor use cryptolopes, watermarks etc. to remind himself of his obligations of privacy? The point is that with privacy there is an explicit or implied agreement between the parties about the handling of information. The agreement can not be technically *enforced* to any stringent degree. 
However privacy-policy-aware applications can help the company avoid unintentionally breaching its own agreed policy. Clearly if the company is hostile they can write the information down off the screen at absolute minimum. Information fidelity is hardly a criterion with private information such as health care records, so watermarks, copy protect marks and the rest of the DRM schtick are hardly likely to help! Privacy applications can be successful in helping companies avoid accidental privacy policy breaches. But DRM can not succeed, because DRM systems are inherently insecure: you give the data and the keys to millions of people, some large proportion of whom are hostile to the controls the keys are supposedly restricting. Given the volume of people, and the lack of social stigma attached to wide-spread flouting of copy protection restrictions, there is an ample supply of people to break any scheme, hardware or software, that has been developed so far, and is likely to be developed or is constructible. I think content providers can still make lots of money where the convenience and/or enhanced fidelity of obtaining bought copies means that people would rather do that than obtain content on the net. But I don't think DRM is significantly helping them, and I think they are wasting their money on it. All current DRM systems aren't even a speed bump on the way to unauthorised Net re-distribution of content. Where the media cartels are being somewhat effective, and where we're already starting to see evidence of the prediction I mentioned above about DRM leading to a clash with privacy, is in the area of criminalization of reverse engineering, with the Sklyarov case, Ed Felten's case etc. Already a number of interesting breaks of DRM systems are starting to be released anonymously. As things heat up we may start to see incentives for the users of file-sharing for unauthorised re-distribution to also _use_ the software anonymously. 
Really I think the copyright protections being exploited by the media cartels need to be substantially modified to reduce or remove the existing protections, rather than further restrictions and powers being awarded to the media cartels. Adam
Re: Ross's TCPA paper
On Tue, 25 Jun 2002, Dan Geer wrote: the problem statements for privacy and for digital rights management were identical Hmm, so: privacy : DRM :: wiretapping : fair use - RL Bob
Re: Ross's TCPA paper
I'm slightly confused about this. My understanding of contract law is that five things are required to form a valid contract: offer and acceptance, mutual intent, consideration, capacity, and lawful intent. It seems to me that a click-through agreement is likely to fail on at least one, and possibly two, of these requirements. First, it is doubtful that there is mutual intent. The average user doesn't even read the agreement, so there is hardly mutual intent. However, even if I accept mutual intent, it would be easy to argue that there is no capacity. I have four children under the age of seven. None of them have the legal capacity to form a contract. Three of them have the physical capacity to click a button. A corporation would therefore have to demonstrate that I and not they clicked on the agreement for the contract to be valid. As a side note, it seems that a corporation would actually have to demonstrate that I had seen and agreed to the thing and clicked acceptance. Prior to that point, I could reverse engineer, since there is no statement that I cannot reverse engineer agreed to. So what would happen if I reverse engineered the installation so that the agreement that was displayed stated that I could do what I liked with the software? Ok, so there would be no mutual intent, but on the other hand, there would also be no agreement on the click-through agreement either. Paul Peter D. Junger writes: Pete Chown writes: : Anonymous wrote: : : Furthermore, inherent to the TCPA concept is that the chip can in : effect be turned off. No one proposes to forbid you from booting a : non-compliant OS or including non-compliant drivers. : : Good point. At least I hope they don't. :-) : : There is not even social opprobrium; look at how eager : everyone was to look the other way on the question of whether the DeCSS : reverse engineering violated the click-through agreement. : : Perhaps it did, but the licence agreement was unenforceable. 
It's : clearly reverse engineering for interoperability (between Linux and DVD : players) so the legal exemption applies. You can't escape the exemption : by contract. Now, you might say that morally he should obey the : agreement he made. My view is that there is a reason why this type of : contract is unenforceable; you might as well take advantage of the : exemption. That isn't the reason why a click-through agreement isn't enforceable---the agreement could, were it enforceable, validly forbid reverse engineering for any reason, and that clause would in most cases be upheld. But, unless you buy your software from the copyright owner, you own your copy of the software, and clicking on a so-called agreement with the copyright owner that you won't do certain things with your software is---or, at least, should be---as unenforceable as a promise to your doctor that you won't smoke another cigarette. The important point is not, however, that click-through agreements are probably unenforceable; the important point is that people---at least those people who think that they own their own computers and the software copies that they have purchased---generally believe that they should be unenforceable. (And in the actual case involving Linux and DVD players there was no agreement not to circumvent the technological control measures in DVDs; the case was based on the theory that the circumvention violated the Digital Millennium Copyright Act.) : The prosecution was on some nonsense charge that amounted to him : burgling his own house. A statute that was meant to penalise computer : break-ins was used against someone who owned the computer that he broke : into. : : The TCPA allows you to do something that you can't do today: run your : system in a way which convinces the other guy that you will honor your : promises, that you will guard his content as he requires in exchange for : his providing it to you. : : Right, but it has an odd effect too. 
No legal system gives people : complete freedom to contract. Suppose you really, really want to exempt : a shop from liability if your new toaster explodes. You can't do it; : the legal system does not give you the freedom to contract in that way. : : DRM, however, gives people complete freedom to make contracts about how : they will deal with digital content. Under EU single market rules, a : contract term to the effect that you could pass on your content to : someone in the UK but not the rest of the EU is unenforceable. No : problem for DRM though... I don't think that one should confuse contract limitations, or limitations on enforceable contract limitations, with technological limitations. There is nothing, for example, in any legal system that forbids one from violating the law of gravity. One of the many problems with the use of the Digital Millennium
TCPA / Palladium FAQ (was: Re: Ross's TCPA paper)
http://www.cl.cam.ac.uk/~rja14/tcpa-faq.html Ross
Re: TCPA / Palladium FAQ (was: Re: Ross's TCPA paper)
Interesting QA paper and list comments. Three additional comments: 1. DRM and privacy look like apples and speedboats. Privacy includes the option of not telling, which DRM does not have. 2. Palladium looks like just another piece of vaporware from Microsoft, to preempt a market, as when MS promised Windows and killed IBM's OS/2 in the process. 3. Embedding keys in mass-produced chips has great sales potential. Now we may have to upgrade processors also because the key is compromised ;-) Cheers, Ed Gerck PS: We would be much better off with OS/2, IMO. Ross Anderson wrote: http://www.cl.cam.ac.uk/~rja14/tcpa-faq.html Ross - The Cryptography Mailing List Unsubscribe by sending unsubscribe cryptography to [EMAIL PROTECTED]
DRMs vs internet privacy (Re: Ross's TCPA paper)
On Wed, Jun 26, 2002 at 03:57:15PM -0400, C Wegrzyn wrote: If a DRM system is based on X.509, according to Brand I thought you could get anonymity in the transaction. Wouldn't this accomplish the same thing? I don't mean that you would necessarily have to correlate your viewing habits with your TrueName for DRM systems. Though that is mostly (exclusively?) the case for current deployed (or at least implemented with a view of attempting commercial deployment) copy-mark (fingerprint) systems, there are a number of approaches which have been suggested, or could be used to have viewing privacy. Brands credentials are one example of a technology that allows trap-door privacy (privacy until you reveal more copies than you are allowed to -- eg more than once for ecash). Conceivably this could be used with a somewhat online, or in combination with a tamper-resistant observer chip in lieu of online copy-protection system to limit someone for example to a limited number of viewings. Another is the public key fingerprinting (public key copy-marking) schemes by Birgit Pfitzmann and others. This addresses the issue of proof, such that the user of the marked-object and the verifier (eg a court) of a claim of unauthorised copying can be assured that the copy-marker did not frame the user. Perhaps schemes which combine both aspects (viewer privacy and avoidance of need to trust at face value claims of the copy-marker) can be built and deployed. (With the caveat that though they can be built, they are largely irrelevant as they will no doubt also be easily removable, and anyway do not prevent the copying of the marked object under the real or feigned claim of theft from the user whose identity is marked in the object). 
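The trap-door privacy idea mentioned above (privacy until you show a token more times than allowed, as in offline ecash) can be illustrated with a toy sketch. This is not Brands' actual protocol -- it omits the blind-signature and commitment machinery entirely -- and the 32-bit identity and function names are purely illustrative:

```python
import secrets

def make_token(identity: int) -> dict:
    # One-time token: r is a fresh blinding value; the pair hides the
    # identity, since each half alone is uniformly random.
    r = secrets.randbits(32)
    return {"r": r, "r_xor_id": r ^ identity}

def show(token: dict, challenge_bit: int) -> int:
    # Each showing answers a random challenge bit with one half of the
    # pair; a single answer reveals nothing about the identity.
    return token["r"] if challenge_bit == 0 else token["r_xor_id"]

def recover_identity(answer0: int, answer1: int) -> int:
    # Showing the SAME token against both challenge values lets the
    # verifier unmask the over-spender: r XOR (r XOR id) = id.
    return answer0 ^ answer1
```

One showing is information-theoretically private; a second showing of the same token (more copies than allowed) exposes the identity, which is the trap-door property described above.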
But anyway, my predictions about the impending collision between privacy and the DRM and copy-protection legislation power-grabs stem from the relationship of privacy to the later redistribution of content, namely the observations that: 1) clearly copy protection doesn't and can't a priori prevent copying and conversion into non-DRM formats (eg into MP3, DivX); 2) once 1) happens, the media cartels have an interest to track general file trading on the internet; 3) _but_ strong encryption and cryptographically enforced privacy mean that the media cartels will ultimately be unsuccessful in this endeavour; 4) _therefore_ they will try to outlaw privacy, impose escrowed identity and internet passports etc., and try to get cryptographically assured privacy outlawed. (Similar to the previous escrow on encryption, but for media cartel interests instead of signals intelligence special interests; the media cartels are also a powerful adversary.) Also I note a slip in my earlier post [of Bear's post]: | First post on this long thread that got it right. Ross Anderson's comments were also right on the money (as always). Adam
Re: Ross's TCPA paper
On Wed, 26 Jun 2002, Barney Wolff wrote: Do you really mean that if I'm a business, you can force me to deal with you even though you refuse to supply your real name? Not acceptable. I won't give up the right NOT to do business with anonymous customers, or anyone else with whom I choose not to do business. As a business, you want to get paid. As long as you are sure of your money, what the hell business is it of yours where I live, what name I'm currently registered under, or who I'm screwing? When I buy things with cash or silver, if they ask for ID I leave or lie. I think that people should be free to use a pseudonym for any non-fraudulent purpose. Bear
Re: Ross's TCPA paper
Scott Guthery wrote: Perhaps somebody can describe a non-DRM privacy management system. Uhh, anonymous remailers? I never disclose my identity, hence there is no need for parties I don't trust to manage it. Come on, folks. This ought to be cypherpunks 101. DRM might be one way to achieve privacy, but it is not the only way. One simple way for me to ensure my privacy is simply never to disclose my personal information. There's no DRM here. Sure, maybe we could envision some alternate world where I disclose my personal information in return for some promise from Big Brother to protect my personal information with DRM, but this doesn't mean that DRM is the only way to achieve privacy!
Re: Ross's TCPA paper
Pete Chown wrote: [...] This doesn't help with your other point, though; people wouldn't be able to modify the code and have a useful end product. I wonder if it could be argued that your private key is part of the source code? Am I expected to distribute my password with my code?
Re: Ross's TCPA paper
I don't believe that the choice is both privacy and TCPA, or neither. Essentially all privacy violations are abuses of authorised access by insiders. Your employer's medical insurance scheme insists on a waiver allowing them access to your records, which they then use for promotion decisions. The fix is fundamentally legislative: that sort of behaviour is generally illegal in Europe, but tolerated in the USA. There may be symmetry when we consider the problem as theoretical computer scientists might, as an issue for abstract machines. This symmetry breaks rapidly when the applications are seen in context. As well as the legal aspects, there are also the economic aspects: most security systems promote the interests of the people who pay for them (surprise, surprise). So I do not agree with the argument that we must allow DRM in order to get privacy. Following that line brings us to a world in which we have DRM, but where the privacy abuses persist just as before. There is simply no realistic prospect of American health insurers or HMOs settling for one-time read-only access to your medical records, no matter how well that gets implemented in Palladium. Ross
Re: Ross's TCPA paper
--- begin forwarded text Status: U Date: Sun, 23 Jun 2002 12:53:42 -0700 From: Paul Harrison [EMAIL PROTECTED] Subject: Re: Ross's TCPA paper To: R. A. Hettinga [EMAIL PROTECTED] User-Agent: Microsoft-Outlook-Express-Macintosh-Edition/5.02.2022 on 6/23/02 6:50 AM, R. A. Hettinga at [EMAIL PROTECTED] wrote: --- begin forwarded text Status: U From: Lucky Green [EMAIL PROTECTED] To: [EMAIL PROTECTED] Cc: [EMAIL PROTECTED] Subject: RE: Ross's TCPA paper Date: Sat, 22 Jun 2002 23:01:12 -0700 Sender: [EMAIL PROTECTED] Tres Snippage.. None of these obstacles are impossible to overcome, but not by Joe Computer User, not by even the most talented 16-year old hacker, and not even by many folks in the field. Sure, I know some that could overcome it, but they may not be willing to do the time for what by then will be a crime. Come to think of it, doing so already is a crime. --Lucky Green --- end forwarded text The discussion of TCPA has a tendency to avoid serious discussion of what I feel is the core security issue: ownership of the platform. Comments such as Lucky's: TPM will make it near impossible for the owner of that motherboard to access supervisor mode on the CPU without their knowledge obfuscate this. The Trusted Computing Platform includes the TPM, the motherboard and the CPU, all wired together with some amount of tamper resistance. It is meaningless to speak of different owners of different parts. The owner of a TCP might be a corporate IT department (for employee machines), a cable company (for set-top boxen), or an individual. The important question is not whether trusted platforms are a good idea, but who will own them. Purchasing a TCP without the keys to the TPM is like buying property without doing a title search. Of course it is possible to _rent_ property from a title holder, and in some cases this is desirable. I would think a TCP _with_ ownership of the TPM would be every paranoid cypherpunk's wet dream. 
A box which would tell you if it had been tampered with either in hardware or software? Great. Someone else's TCP is more like a rental car: you want the rental company to be completely responsible for the safety of the vehicle. This is the economic Achilles heel of using TCPA for DRM. Who is going to take financial responsibility for the proper operation of the platform? It can work for a set top box, but it won't fly for a general purpose computer. --- end forwarded text -- - R. A. Hettinga mailto: [EMAIL PROTECTED] The Internet Bearer Underwriting Corporation http://www.ibuc.com/ 44 Farquhar Street, Boston, MA 02131 USA ... however it may deserve respect for its usefulness and antiquity, [predicting the end of the world] has not been found agreeable to experience. -- Edward Gibbon, 'Decline and Fall of the Roman Empire'
Re: Ross's TCPA paper
Date: Sun, 23 Jun 2002 12:53:42 -0700 From: Paul Harrison [EMAIL PROTECTED] Subject: Re: Ross's TCPA paper I would think a TCP _with_ ownership of the TPM would be every paranoid cypherpunk's wet dream. A box which would tell you if it had been tampered with either in hardware or software? Great. Someone else's TCP is more like a rental car: you want the rental company to be completely responsible for the safety of the vehicle. This is the economic Achilles heel of using TCPA for DRM. Who is going to take financial responsibility for the proper operation of the platform? It can work for a set top box, but it won't fly for a general purpose computer. Exactly my point - economically it can't work for the nightmare scenario. The whole DRM concept is seriously flawed, and the fact that it's being pushed by a guy who used to run a paint-ball arena is really no surprise. There's a large group of academics working on DRM concepts for access to university facilities, including libraries and computers. They use secure platforms, but they still have to worry about who gets physical access to the platform. And I also don't think conspiracy is the right term. The article Lucky quoted from indicated that use of the trusted platform for DRM was an afterthought, and that's much more believable. A bunch of sharks looking for money all swim around the same target. It has to do with where the money is, not any collusion between the players. S.2048 is not likely to see the light of day. The automotive industry is bigger than the entertainment industry, and they have more sway in Washington when it comes to how much some bill is going to cost them. S.2048 makes cars way too expensive, and when union workers find out that a) they will have fewer jobs and b) they won't be able to watch videos when they get home, the shit will hit the fan big time. Definitely write a letter to your congress critter to let them know the whole thing is stupid. 
But don't call it a conspiracy; that gives the morons thinking this whole thing up a bit too much intellect. Patience, persistence, truth, Dr. mike
Re: Ross's TCPA paper
I, for one, can vouch for the fact that TCPA could absolutely be applied to a DRM application. In a previous life I actually designed a DRM system (the company has since gone under). In our research and development in '96-98, we decided that you need at least some trusted hardware at the client to perform any DRM, but if you _did_ have some _minimal_ trusted hardware, that would provide a large hook to a fairly secure DRM system. Check the archives of, IIRC, coderpunks... I started a thread entitled The Black Box Problem. The issue is that in a DRM system you (the content provider) want to verify the operation of the client, even though the client is not under your control. We developed an online interactive protocol with a sandbox environment to protect content, but it would certainly be possible for someone to crack it. Our threat model was that we didn't want people to be able to use a hacked client against our distribution system. We discovered that if we had some trusted hardware that had a few key functions (I don't recall the few key functions offhand, but it was more than just encrypt and decrypt) we could increase the effectiveness of the DRM system astoundingly. We thought about using cryptodongles, but the Black Box problem still applies. The trusted hardware must be a core piece of the client machine for this to work. Like everything else in the technical world, TCPA is a tool. It is neither good nor bad; that distinction comes in how we humans apply the technology. -derek Lucky Green [EMAIL PROTECTED] writes: Anonymous writes: Lucky Green writes regarding Ross Anderson's paper at: Ross and Lucky should justify their claims to the community in general and to the members of the TCPA in particular. If you're going to make accusations, you are obliged to offer evidence. Is the TCPA really, as they claim, a secretive effort to get DRM hardware into consumer PCs? 
Or is it, as the documents on the web site claim, a general effort to improve the security in systems and to provide new capabilities for improving the trustworthiness of computing platforms? Anonymous raises a valid question. To hand Anonymous additional rope, I will even assure the reader that when questioned directly, the members of the TCPA will insist that their efforts in the context of TCPA are concerned with increasing platform security in general and are not targeted at providing a DRM solution. Unfortunately, and I apologize for having to disappoint the reader, I do not feel at liberty to provide the proof Anonymous is requesting myself, though perhaps Ross might. (I have no first-hand knowledge of what Ross may or may not be able to provide). I however encourage readers familiar with the state of the art in PC platform security to read the TCPA specifications, read the TCPA's membership list, read the Hollings bill, and then ask themselves if they are aware of, or can locate somebody who is aware of, any other technical solution that enjoys a similar level of PC platform industry support, is anywhere as near to wide-spread production as TPM's, and is of sufficient integration into the platform to be able to form the platform basis for meeting the requirements of the Hollings bill. Would Anonymous perhaps like to take this question? --Lucky Green - The Cryptography Mailing List Unsubscribe by sending unsubscribe cryptography to [EMAIL PROTECTED] -- Derek Atkins Computer and Internet Security Consultant [EMAIL PROTECTED] www.ihtfp.com
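The Black Box Problem Derek describes -- a content provider verifying the operation of a client it does not control -- can be sketched as a challenge-response check rooted in a key held only by the client's trusted hardware. This is a hedged illustration, not Derek's actual protocol; DEVICE_KEY and the function names are hypothetical, and for brevity the server here shares the device key directly, where a real design would verify a signature against a device certificate:

```python
import hashlib
import hmac
import os

# Stand-in for a key that lives only inside the trusted hardware.
DEVICE_KEY = os.urandom(32)

def client_attest(code_image: bytes, nonce: bytes) -> bytes:
    # The trusted hardware measures the running client code and MACs the
    # measurement together with the server's fresh nonce (the nonce
    # prevents replaying an old answer).
    measurement = hashlib.sha256(code_image).digest()
    return hmac.new(DEVICE_KEY, measurement + nonce, hashlib.sha256).digest()

def server_verify(expected_image: bytes, nonce: bytes, response: bytes) -> bool:
    # The server recomputes the expected answer for the client code it
    # shipped; a hacked client produces a different measurement.
    expected = hmac.new(DEVICE_KEY,
                        hashlib.sha256(expected_image).digest() + nonce,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

A hacked client fails the check; the catch, as the thread notes, is that this only holds if the measuring hardware is a core, tamper-resistant piece of the client machine.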
Re: Ross's TCPA paper
It's an interesting claim, but there is only one small problem. Neither Ross Anderson nor Lucky Green offers any evidence that the TCPA (http://www.trustedcomputing.org) is being designed for the support of digital rights management (DRM) applications. Microsoft admits it: http://www.msnbc.com/news/770511.asp Intel admitted it to me too. They said that the reason for TCPA was that their company makes most of its money from the PC microprocessor; they have most of the market; so to grow the company they need to grow the overall market for PCs; that means making sure the PC is the hub of the future home network; and if entertainment's the killer app, and DRM is the key technology for entertainment, then the PC must do DRM. Now here's another aspect of TCPA. You can use it to defeat the GPL. During my investigations into TCPA, I learned that HP has started a development program to produce a TCPA-compliant version of GNU/linux. I couldn't figure out how they planned to make money out of this. On Thursday, at the Open Source Software Economics conference, I figured out how they might. Making a TCPA-compliant version of GNU/linux (or Apache, or whatever) will mean tidying up the code and removing whatever features conflict with the TCPA security policy. The company will then submit the pruned code to an evaluator, together with a mass of documentation for the work that's been done, including a whole lot of analyses showing, for example, that you can't get root by a buffer overflow. The business model, I believe, is this. HP will not dispute that the resulting `pruned code' is covered by the GPL. You will be able to download it, compile it, check it against the binary, and do what you like with it. However, to make it into TCPA-linux, to run it on a TCPA-enabled machine in privileged mode, you need more than the code. You need a valid signature on the binary, plus a cert to use the TCPA PKI. That will cost you money (if not at first, then eventually). 
Anyone will be free to make modifications to the pruned code, but in the absence of a signature the resulting O/S won't enable users to access TCPA features. It will of course be open to competitors to try to re-do the evaluation effort for enhanced versions of the pruned code, but that will cost money; six figures at least. There will likely be little motive for commercial competitors to do it, as HP will have the first mover advantages and will be able to undercut them on price. There will also be little incentive for philanthropists to do it, as the resulting product would not really be a GPL version of a TCPA operating system, but a proprietary operating system that the philanthropist could give away free. (There are still issues about who would pay for use of the PKI that hands out user certs.) The need to go through evaluation with each change is completely incompatible with the business model of free and open source software. People believed that the GPL made it impossible for a company to come along and steal code that was the result of community effort. That may have been the case so long as the processor was open, and anyone could access supervisor mode. But TCPA changes that completely. Once the majority of PCs on the market are TCPA-enabled, the GPL won't work as intended any more. There has never been anything to stop people selling complementary products and services to GPL'ed code; once the functioning of these products can be tied to a signature on the binary, the model breaks. Can anyone from HP comment on whether this is actually their plan? Ross
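The gating mechanism Ross describes -- identical source, but only a blessed binary gets the TCPA features -- can be sketched as follows. This is a simplified model with hypothetical names: the real design would verify a vendor signature on the binary against a cert chained to the TCPA PKI, which is reduced here to a whitelist of approved digests, since the effect on a modified binary is the same:

```python
import hashlib

# Hypothetical list of digests of evaluated, vendor-signed binaries.
APPROVED_DIGESTS = {
    hashlib.sha256(b"hp-pruned-linux-kernel-1.0").hexdigest(),
}

def tcpa_privileged_boot_allowed(kernel_image: bytes) -> bool:
    # The machine will boot anything, but only an approved binary runs
    # in privileged TCPA mode.  Recompiling modified GPL'ed source
    # yields a different digest, so the modified kernel is locked out.
    return hashlib.sha256(kernel_image).hexdigest() in APPROVED_DIGESTS
```

This is why evaluation-plus-signature breaks the free-software model Ross outlines: the freedom to modify the code survives, but any modification invalidates the digest, and re-earning a signature costs six figures.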
Re: Ross's TCPA paper
It seems clear that, if DRM is an application, then DRM applications would benefit from the increased trust, and, architecturally, that such trust would be needed to enforce/ensure some/all of the requirements of the Hollings bill. hawk Lucky Green wrote: other technical solution that enjoys a similar level of PC platform industry support, is anywhere as near to wide-spread production as TPM's, and is of sufficient integration into the platform to be able to form the platform basis for meeting the requirements of the Hollings bill. Would Anonymous perhaps like to take this question?
Re: Ross's TCPA paper
On Mon, Jun 24, 2002 at 08:15:29AM -0400, R. A. Hettinga wrote: Status: U Date: Sun, 23 Jun 2002 12:53:42 -0700 From: Paul Harrison [EMAIL PROTECTED] Subject: Re: Ross's TCPA paper To: R. A. Hettinga [EMAIL PROTECTED] The important question is not whether trusted platforms are a good idea, but who will own them. Purchasing a TCP without the keys to the TPM is like buying property without doing a title search. Of course it is possible to _rent_ property from a title holder, and in some cases this is desirable. I would think a TCP _with_ ownership of the TPM would be every paranoid cypherpunk's wet dream. A box which would tell you if it had been tampered with either in hardware or software? Great. Someone else's TCP is more like a rental car: you want the rental company to be completely responsible for the safety of the vehicle. This is the economic Achilles heel of using TCPA for DRM. Who is going to take financial responsibility for the proper operation of the platform? It can work for a set top box, but it won't fly for a general purpose computer. In general, I'm very fond of this sort of ownership analysis. If I have a TCPA box running my software, and thinking that it's mine, how do I know there isn't one more layer? Leave it off, and my analysis is simpler. I suspect that verifying ownership of the TPM will be like verifying ownership of property in modern Russia: There may be a title that looks clean. But what does the mafia think? What about the security services? There may even be someone with a pre-Bolshevik title floating around. Or a forgery. Hard to tell. It's annoying to have one's transaction costs pushed up that high. I can get very high quality baseline software today. What I need for my cypherpunk wet dreams is ecash, and a nice anonymizing network. What I also need is that the general purpose computing environment stay free of control points, in the Lessig sense. Adam
Re: Ross's TCPA paper
Anonymous wrote: Furthermore, inherent to the TCPA concept is that the chip can in effect be turned off. No one proposes to forbid you from booting a non-compliant OS or including non-compliant drivers. Good point. At least I hope they don't. :-) There is not even social opprobrium; look at how eager everyone was to look the other way on the question of whether the DeCSS reverse engineering violated the click-through agreement. Perhaps it did, but the licence agreement was unenforceable. It's clearly reverse engineering for interoperability (between Linux and DVD players) so the legal exemption applies. You can't escape the exemption by contract. Now, you might say that morally he should obey the agreement he made. My view is that there is a reason why this type of contract is unenforceable; you might as well take advantage of the exemption. The prosecution was on some nonsense charge that amounted to him burgling his own house. A statute that was meant to penalise computer break-ins was used against someone who owned the computer that he broke into. The TCPA allows you to do something that you can't do today: run your system in a way which convinces the other guy that you will honor your promises, that you will guard his content as he requires in exchange for his providing it to you. Right, but it has an odd effect too. No legal system gives people complete freedom to contract. Suppose you really, really want to exempt a shop from liability if your new toaster explodes. You can't do it; the legal system does not give you the freedom to contract in that way. DRM, however, gives people complete freedom to make contracts about how they will deal with digital content. Under EU single market rules, a contract term to the effect that you could pass on your content to someone in the UK but not the rest of the EU is unenforceable. No problem for DRM though... I think lawyers will hate this. -- Pete
RE: Ross's TCPA paper
Mike wrote quoting Lucky: trusted here means that the members of the TCPA trust that the TPM will make it near impossible for the owner of that motherboard to access supervisor mode on the CPU without their knowledge, they trust that the TPM will enable them to determine remotely if the customer has a kernel-level debugger loaded, and they trust that the TPM will prevent a user from bypassing OS protections by installing custom PCI cards to read out memory directly via DMA without going through the CPU. I don't see how they expect this to work. We've already got cheap rip off motherboards, who's gonna stop cheap rip off TPM's that ain't really T? I think it moves the game into a smaller field where the players all have some bucks to begin with, but somebody will create a TPM that looks like the real thing, but runs cypherpunk code just fine. I agree with your assertion that TPM's can't prevent DRM from being broken. Nor is this the intent of introducing TPM's. The vendors have realized that they have to raise the technical bar only so high to keep those most inclined to break their systems (i.e. 16-year old Norwegians) from doing so. Those that have the knowledge and resources to break TCPA systems either won't have the time because they are engaged in gainful employment, won't be willing to take the risk, because they have accumulated sufficient material possessions to be unwilling to risk losing their possessions, not to mention their freedom, in litigation, or will break the security for their own gain, but won't release the crack to the public. Criminal enterprise falls into the latter category. The content vendors, which in this case includes the operating system and application vendors, dislike, but can live with, major criminal enterprise being the only other party to have unfettered access, since criminal enterprise is just another competitor in the market place. Most business models can survive another competitor. 
Where business models threaten to collapse is when the marginal cost of an illegal copy goes to zero and the public at large can obtain your goods without payment. I don't know if the TCPA's efforts will prevent this, but in the process of trying to achieve this objective, the average computer user, and even many advanced computer users, will find themselves in a new relationship with their PC: that of a pure consumer, with only those choices available to them that the 180 TCPA members' digital signatures permit. Cloning TPM's is difficult, though not impossible. Note that all TPM's unique initial internal device keys are signed at time of manufacture by a derivative of the TCPA master key. Unless you are one of the well-known chipset or BIOS manufacturers, you can't get your TPM products signed. It is theoretically possible, though far from easy, to clone an entire TPM, keys and all. However, the moment those fake TPM's show up in the market place, their keys will simply be listed in the next CRL update. And if your OS and TPM's miss a few CRL updates, your commercial OS and all your applications will stop working. As might in the future your video card, your PCI cards, your hard drive, and your peripherals. You can try to hack around the code in the OS or firmware that performs the checks, as long as you are willing to operate your machine permanently off the Net from then on, because your system will fail the remote integrity checks. But given that this and other security-relevant code inside the OS and applications are 3DES encrypted and are only decrypted inside the TPM, you can't just read the object code from disk; you have to first microprobe the decrypted op codes off the bus before taking a debugger to the code. Not a trivial task at today's PC bus speeds. Nor can you get too aggressive with the hacks, since your Fritz may simply flush the keys and leave you with a bunch of 3DES encrypted op codes and no corresponding decryption keys. 
Reverse engineering turns pretty dim at that point. None of these obstacles are impossible to overcome, but not by Joe Computer User, not by even the most talented 16-year old hacker, and not even by many folks in the field. Sure, I know some that could overcome it, but they may not be willing to do the time for what by then will be a crime. Come to think of it, doing so already is a crime. --Lucky Green
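The endorsement-plus-CRL scenario Lucky sketches above can be modeled in a few lines. This is a toy with hypothetical names: endorsement is reduced to a keyed hash standing in for a signature by the TCPA master key, and the CRL is a plain blacklist of device key ids:

```python
import hashlib

# Stands in for the TCPA master key held by the consortium.
MASTER_SECRET = b"tcpa-master"

def endorse(key_id: bytes) -> bytes:
    # Manufacture-time endorsement of a TPM's device key (a real TPM
    # carries a signature, modeled here as a keyed hash).
    return hashlib.sha256(MASTER_SECRET + key_id).digest()

def os_accepts_tpm(key_id: bytes, endorsement: bytes, crl: set) -> bool:
    # The OS rejects both unendorsed clones and endorsed-but-revoked
    # keys; missing CRL updates would fail the remote integrity checks.
    return endorsement == endorse(key_id) and key_id not in crl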
Re: Ross's TCPA paper
Lucky Green writes regarding Ross Anderson's paper at: http://www.ftp.cl.cam.ac.uk/ftp/users/rja14/toulouse.pdf I must confess that after reading the paper I am quite relieved to finally have solid confirmation that at least one other person has realized (outside the authors and proponents of the bill) that the Hollings bill, while failing to mention TCPA anywhere in the text of the bill, was written with the specific technology provided by the TCPA in mind for the purpose of mandating the inclusion of this technology in all future general-purpose computing platforms, now that the technology has been tested, is ready to ship, and the BIOS vendors are on side. It's an interesting claim, but there is only one small problem. Neither Ross Anderson nor Lucky Green offers any evidence that the TCPA (http://www.trustedcomputing.org) is being designed for the support of digital rights management (DRM) applications. In fact if you look at the documents on the TCPA web site you see much discussion of applications such as platform-based ecommerce (so that even if a user's keys get stolen they can't be used on another PC), securing corporate networks (assuring that each workstation is running an IT-approved configuration), detecting viruses, and enhancing the security of VPNs. DRM is not mentioned. Is the claim by Ross and Lucky that the TCPA is a fraud, secretly designed for the purpose of supporting DRM while using the applications above merely as a cover to hide their true purposes? If so, shouldn't we expect to see the media content companies as supporters of this effort? But the membership list at http://www.trustedcomputing.org/tcpaasp4/members.asp shows none of the usual suspects. Disney's not there. Sony's not there. No Viacom, no AOL/Time/Warner, no News Corp. The members are all technology companies, including crypto companies like RSA, Verisign and nCipher. 
Contrast this for example with the Broadcast Protection Discussion Group whose ongoing efforts are being monitored by the EFF at http://www.eff.org/IP/Video/HDTV/. There you do find the big media companies. That effort is plainly aimed at protecting information and supporting DRM, so it makes sense that the companies most interested in those goals are involved. But with the TCPA, the players are completely different. And unlike with the BPDG, the rationale being offered is not based on DRM but on improving the trustworthiness of software for many applications. Ross and Lucky should justify their claims to the community in general and to the members of the TCPA in particular. If you're going to make accusations, you are obliged to offer evidence. Is the TCPA really, as they claim, a secretive effort to get DRM hardware into consumer PCs? Or is it, as the documents on the web site claim, a general effort to improve the security in systems and to provide new capabilities for improving the trustworthiness of computing platforms?
RE: Ross's TCPA paper
Anonymous writes: Lucky Green writes regarding Ross Anderson's paper at: Ross and Lucky should justify their claims to the community in general and to the members of the TCPA in particular. If you're going to make accusations, you are obliged to offer evidence. Is the TCPA really, as they claim, a secretive effort to get DRM hardware into consumer PCs? Or is it, as the documents on the web site claim, a general effort to improve the security in systems and to provide new capabilities for improving the trustworthiness of computing platforms? Anonymous raises a valid question. To hand Anonymous additional rope, I will even assure the reader that when questioned directly, the members of the TCPA will insist that their efforts in the context of TCPA are concerned with increasing platform security in general and are not targeted at providing a DRM solution. Unfortunately, and I apologize for having to disappoint the reader, I do not feel at liberty to provide the proof Anonymous is requesting myself, though perhaps Ross might. (I have no first-hand knowledge of what Ross may or may not be able to provide). I however encourage readers familiar with the state of the art in PC platform security to read the TCPA specifications, read the TCPA's membership list, read the Hollings bill, and then ask themselves if they are aware of, or can locate somebody who is aware of, any other technical solution that enjoys a similar level of PC platform industry support, is anywhere near as close to widespread production as TPMs, and is sufficiently integrated into the platform to be able to form the basis for meeting the requirements of the Hollings bill. Would Anonymous perhaps like to take this question? --Lucky Green
Ross's TCPA paper
I recently had a chance to read Ross Anderson's paper on the activities of the TCPA at http://www.cl.cam.ac.uk/ftp/users/rja14/.temp/toulouse.pdf I must confess that after reading the paper I am quite relieved to finally have solid confirmation that at least one other person has realized (outside the authors and proponents of the bill) that the Hollings bill, while failing to mention TCPA anywhere in the text of the bill, was written with the specific technology provided by the TCPA in mind for the purpose of mandating the inclusion of this technology in all future general-purpose computing platforms, now that the technology has been tested, is ready to ship, and the BIOS vendors are on side. Perhaps the Hollings Consumer Broadband and Digital Television Promotion Act bill would be more accurately termed the TCPA Enablement Act. BTW, the module that Ross calls a Fritz in his paper, after the author of the bill, has long had a name: it is called a Trusted Platform Module (TPM). Granted, in the context of the TCPA and the Hollings bill, the term trusted is used somewhat differently than the customers of future motherboards, which are all slated to include a TPM, might expect: trusted here means that the members of the TCPA trust that the TPM will make it nearly impossible for the owner of that motherboard to access supervisor mode on the CPU without their knowledge, they trust that the TPM will enable them to determine remotely if the customer has a kernel-level debugger loaded, and they trust that the TPM will prevent a user from bypassing OS protections by installing custom PCI cards to read out memory directly via DMA without going through the CPU.
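[Editor's note: the "determine remotely" capability described above is TCPA remote attestation: the TPM signs a quote of its measurement register so a challenger can check which software stack booted. The sketch below is a toy model; the names, and the use of an HMAC in place of the TPM's real signing key, are illustrative assumptions.]

```python
import hashlib
import hmac

# Toy sketch of remote attestation: the TPM "quotes" its PCR together with
# a challenger-supplied nonce, signed with a key the challenger trusts.
# The challenger then compares the reported PCR against an approved value.

TPM_KEY = b"endorsement-key"  # stand-in for the TPM's signing key

def quote(pcr: bytes, nonce: bytes) -> dict:
    """TPM side: sign the current PCR together with the fresh nonce."""
    sig = hmac.new(TPM_KEY, pcr + nonce, hashlib.sha1).digest()
    return {"pcr": pcr, "nonce": nonce, "sig": sig}

def verify(q: dict, nonce: bytes, approved_pcr: bytes) -> bool:
    """Challenger side: check signature, freshness, and expected state."""
    expected = hmac.new(TPM_KEY, q["pcr"] + q["nonce"], hashlib.sha1).digest()
    return (hmac.compare_digest(expected, q["sig"])
            and q["nonce"] == nonce
            and q["pcr"] == approved_pcr)

approved = hashlib.sha1(b"approved kernel, no debugger").digest()
nonce = b"challenge-123"

# An approved software stack passes the check...
assert verify(quote(approved, nonce), nonce, approved)

# ...while a stack with, say, a kernel-level debugger loaded produces a
# different measurement and is detected remotely.
rogue = hashlib.sha1(b"kernel with debugger loaded").digest()
assert not verify(quote(rogue, nonce), nonce, approved)
```

Note that it is the challenger, not the machine's owner, who decides which PCR values count as "approved" -- which is exactly the shift in control the rest of this message objects to.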
The public and the media now need to somehow, preferably soon, arrive at the next stage of realization: the involvement in the TCPA by many companies whose CEOs wrote the widely distributed open letter to the movie studios, telling the studios, or more precisely -- given that it was an open letter -- telling the public, that mandating DRM in general-purpose computing platforms may not be a good idea, is indicative of one of two possible scenarios: 1) the CEOs of said computer companies are utterly unaware of a major strategic initiative their staff has been diligently executing for about 3 years, and in the case of the principals in the TCPA, such as Intel, Compaq, HP, and Microsoft, several years longer. 2) the CEOs wrote this open letter as part of a deliberate good cop, bad cop ploy, feigning opposition to DRM in general computing platforms to pull the wool over the public's eyes for hopefully long enough to achieve widespread deployment of the mother of all DRM solutions in the marketplace. I do not know which of the two potential scenarios holds true. However, I believe public debate regarding the massive change in the way users will interact with their future computers due to the efforts of the TCPA and the Hollings bill would be greatly aided by attempts to establish which of the two scenarios is in fact the case. --Lucky Green
Re: Ross's TCPA paper
Ross has shifted his TCPA paper to: http://www.ftp.cl.cam.ac.uk/ftp/users/rja14/toulouse.pdf At 07:03 PM 6/22/2002 -0700, Lucky wrote: I recently had a chance to read Ross Anderson's paper on the activities of the TCPA at http://www.cl.cam.ac.uk/ftp/users/rja14/.temp/toulouse.pdf