Re: [s-t] bright lights, big computers digest #1

2005-02-04 Thread Eugen Leitl
[from somelist]

> Subject: Re: [s-t] The return of Das Blinkenlight 
> Date: Mon, 31 Jan 2005 19:00:49 -0500
> 
> >In the early 90's I was a product manager for a (now-defunct) company
> >that made LAN hubs-- this was when a 10Base-T port would cost you a couple
> 
> 
> This reminded me of a story from a few years ago.
> 
> Apparently a lot of modem manufacturers tied the activity light on
> the modem directly to the circuit which modulated the sound.
> 
> Then someone realized that with a telescope and an optical
> transistor, one could read that datastream as if hooked to the modem
> directly.
> 
> And astonishing numbers of businesses had their modem pools facing
> windows, because the blinkenlights looked impressive.



Not just modems.  Some Cisco routers, even at megabit rates.  2002
publication, although the research was over the previous couple of
years.

And (for instance) the Paradyne Infolock 2811-11 DES encryptor, which
has an LED on the plaintext data.

How we laughed.

The paper also covers using LEDs (such as keyboard LEDs) as covert
data channels.  And yes, it cites Cryptonomicon.
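As a toy illustration of the LED attack (hypothetical sketch, not from the paper: it assumes the LED tracks the TX line directly, a clean phototransistor signal, and 8x oversampling at a known baud rate; all names and values are invented):

```python
# Toy decoder for the LED side channel: recover a serial bitstream
# from brightness samples of a modem's activity light.

def bits_from_samples(samples, threshold=0.5, oversample=8):
    """Threshold the light samples, then take the center sample of each bit cell."""
    levels = [1 if s > threshold else 0 for s in samples]
    return [levels[i + oversample // 2]
            for i in range(0, len(levels) - oversample + 1, oversample)]

def decode_8n1(bits):
    """Decode 8N1 framing: start bit 0, eight data bits LSB-first, stop bit 1."""
    out, i = [], 0
    while i + 10 <= len(bits):
        if bits[i] == 0 and bits[i + 9] == 1:              # plausible frame
            out.append(sum(b << n for n, b in enumerate(bits[i + 1:i + 9])))
            i += 10
        else:
            i += 1                                          # slide to resync
    return bytes(out)
```

Feed it 8 samples per bit of the frame for 'A' (0x41) and it hands back b'A' -- the point being how little signal processing the attack actually needs.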

I'm not sure whether this was more or less cool than Markus Kuhn's
work on reconstructing CRT displays from reflected light, by reverse
convolution with the impulse-response curves of the various phosphors.
Both papers are fantastic reads, very accessible, very stimulating.



Nick B

- End forwarded message -
-- 
Eugen* Leitl <leitl> http://leitl.org
______________________________________________________________
ICBM: 48.07078, 11.61144  http://www.leitl.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE
http://moleculardevices.org http://nanomachines.net




Re: Dell to Add Security Chip to PCs

2005-02-04 Thread Justin
On 2005-02-04T14:30:48-0500, Mark Allen Earnest wrote:
> The government was not able to get the Clipper chip passed and that was 
> backed with the horror stories of rampant pedophilia, terrorism, and 
> organized crime. Do you honestly believe they will be able to destroy 
> open source, linux, independent software development, and the like with 
> just the fear of movie piracy, mp3 sharing, and such? Do you really 
> think they are willing to piss off large sections of the voting 
> population, the tech segment of the economy, universities, small 
> businesses, and the rest of the world just because the MPAA and RIAA 
> don't like customers owning devices they do not control?

They managed with the HDTV broadcast flag mandate.

-- 
"War is the father and king of all, and some he shows as gods, others as
men; some he makes slaves, others free."  --Heraclitus (Kahn.83/D-K.53) 



RE: Dell to Add Security Chip to PCs

2005-02-04 Thread Jason Holt

On Thu, 3 Feb 2005, Erwann ABALEA wrote:
> And do you seriously think that "you can't do that, it's technically not
> possible" is a good answer? That's what you're saying. For me, a better
> answer is "you don't have the right to deny my ownership".

Yes, Senator McCarthy, I do in fact feel safer knowing that mathematics
protects my data.  Welcome to cypherpunks.

-J



Re: Dell to Add Security Chip to PCs

2005-02-04 Thread Ian G
Ed Reed wrote:
I'm just curious on this point.  I haven't seen much
to indicate that Microsoft and others are ready
for a nymous, tradeable software assets world.
   

No, and neither are corporate customers, to a large extent.
 

Right, so my point (I think) was that without some
indication that those people are ready for a nymous,
tradeable assets world, the notion of a trusted
computing base is limited to working for the
Microsofts of the world as the owners of the
content, not to users as the owners of assets.
Accountability is, in fact, a treasured property of business computing.
Lack of accountability creates things like Enron, Andersen Consulting,
Oil-for-Food scams, and the missing 9 billion dollars or so of
reconstruction aid.  It's the fuel that propels spam, graft, and
identity theft.
What I've not seen is much work providing accountability for anonymous
transactions.
 

I am having trouble with tying in "accountability"
with the above examples.  That doesn't sound like
an accountability issue in the technical sense,
that sounds like a theft problem.  In this sense,
I see two different uses of the word, and they don't
have much of a linkage.
Nymous systems are generally far more accountable
in the technical sense, simply because they give you
the tools to be absolutely sure about your statements.
A nymous account has an audit trail that can be
traced as far as you have access to the information,
and because the audit trail is cryptographically
secured (by usage of hash and digsigs) a complete
picture can be built up.
This stands in contraposition to systems based on
blinding formulas.  That sort of issued money is
intended to be untraceable and is thus less easily
used to 'account' for everything.  Having said that,
there's no reason why a given transaction can't be
set and stabilised in stone with a digital receipt,
which then can form part of an accounting trail.
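The hash-and-digsig audit trail described above can be sketched in a few lines. This is a hypothetical illustration (record fields and structure invented, a bare hash chain rather than any particular payment system's format):

```python
import hashlib, json

def add_receipt(chain, record):
    """Append a receipt whose hash covers the previous receipt's hash,
    so altering any earlier entry breaks every later link."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps(record, sort_keys=True)
    digest = hashlib.sha256((prev + body).encode()).hexdigest()
    chain.append({"prev": prev, "record": record, "hash": digest})
    return chain

def verify(chain):
    """Re-walk the chain; any tampered record or broken link fails."""
    prev = "0" * 64
    for entry in chain:
        body = json.dumps(entry["record"], sort_keys=True)
        if entry["prev"] != prev:
            return False
        if entry["hash"] != hashlib.sha256((prev + body).encode()).hexdigest():
            return False
        prev = entry["hash"]
    return True
```

That is the technical sense of "accountable": given access to the trail, every statement can be checked, without anyone needing to know who is behind the nym.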
But regardless of which system is used (nymous,
blinded or POBA - plain old bank account) the
money can be stolen, statements can be hidden
and fudged, and purposes can be misrepresented,
just like any others...  If there is a reason why
these big companies haven't got into such digital
assets, I'd say it is that such assets haven't yet
succeeded in a form that is 'feel good' enough
for them.
In which case, I'd say that they would consider
'accountability' to mean 'my accountant won't
think it strange.'
iang
--
News and views on what matters in finance+crypto:
   http://financialcryptography.com/


Re: Dell to Add Security Chip to PCs

2005-02-04 Thread Mark Allen Earnest
Trei, Peter wrote:
It could easily be leveraged to make motherboards
which will only run 'authorized' OSs, and OSs
which will run only 'authorized' software.
And you, the owner of the computer, will NOT
necessarily be the authority which gets to decide
what OS and software the machine can run.
If you 'take ownership' as you put it, the internal
keys and certs change, and all of a sudden you
might not have a bootable computer anymore.
Goodbye Linux.
Goodbye Freeware.
Goodbye independent software development.
It would be a very sad world if this comes
to pass.
Yes it would. Many governments are turning to Linux and other freeware.
Many huge companies make heavy use of Linux and freeware; suddenly
losing this would have a massive effect on their bottom line, possibly
enough to impact the economy as a whole. Independent software
developers are a significant part of the economy as well, and most 
politicians do not want to associate themselves with the concept of 
"hurting small business". Universities and other educational 
institutions will fight anything that resembles what you have described 
tooth and nail.

To think that this kind of technology would be mandated by a government 
is laughable. Nor do I believe there will be any conspiracy on the part 
of ISPs to require it in order to get on the Internet. As it stands now,
most people are running 5+ year old computers and Windows 98/ME. I doubt
this is going to change much because for most people, this does what 
they want (minus all the security vulnerabilities, but with NAT 
appliances those are not even that big a deal). There is no customer 
demand for this technology to be mandated, and there is no reason why an ISP 
or vendor would want to piss off significant percentages of their 
clients in this way. The software world is becoming MORE open. Firefox 
and Openoffice are becoming legitimate in the eyes of government and 
businesses, Linux is huge these days, and the open source development 
method is being talked about in business mags, board rooms, and 
universities everywhere.

The government was not able to get the Clipper chip passed and that was 
backed with the horror stories of rampant pedophilia, terrorism, and 
organized crime. Do you honestly believe they will be able to destroy 
open source, linux, independent software development, and the like with 
just the fear of movie piracy, mp3 sharing, and such? Do you really 
think they are willing to piss off large sections of the voting 
population, the tech segment of the economy, universities, small 
businesses, and the rest of the world just because the MPAA and RIAA 
don't like customers owning devices they do not control?

It is entirely possible that a machine like you described will be built;
I wish them luck because they will need it. It is attempted quite
often, and yet history shows us that there is really no widespread demand
for iOpeners, WebTV, and their ilk. I don't see customers demanding
this, therefore there will probably not be much of a supply. Either way, 
there is currently a HUGE market for general use PCs that the end user 
controls, so I imagine there will always be companies willing to supply 
them.

My primary fear regarding TCPA is the remote attestation component. I 
can easily picture Microsoft deciding that they do not like Samba and 
decide to make it so that Windows boxes simply cannot communicate with 
it for domain, filesystem, or authentication purposes. All they need do 
is require that the piece on the other end be signed by Microsoft. Heck, 
they could render HTTP user-agent spoofing useless if they decide to make 
it so that only IE could connect to IIS. Again though, doing so would piss 
off a great many of their customers, some of whom are slowly jumping ship 
to other solutions anyway.

--
Mark Allen Earnest
Lead Systems Programmer
Emerging Technologies
The Pennsylvania State University




mmm, petits filous (was Re: NTK now, 2005-02-04)

2005-02-04 Thread R.A. Hettinga
At 5:45 PM + 2/4/05, Dave Green wrote:
> mmm, petits filous
>
> Everyone else likes to worry about Google's gathering
> conflict of interests, but Verisign's S.P.E.C.T.R.E.-level
> skills still take some beating. This week, orbiting crypto
> analysts Ian Grigg and Adam Shostack belatedly pointed out
> to ICANN that perhaps Verisign couldn't be trusted with
> .net. Why? Well, Verisign these days offers both top level
> domains and SSL certificate authentication. They also, with
> their NetDiscovery service, sell ISPs a complete service for
> complying with law enforcement surveillance orders. So, if an
> American court demands an ISP wiretap its customers, and the
> ISP turns that order over to Verisign to do the dirty: well,
> Verisign can now fake any domain you want, and issue any
> temporary fake certificate, allowing even SSLed
> communications to be monitored. What's even more fun is that
> they are - at least in the US - now moving into providing
> infrastructure for mobile telephony. Yes, NOT EVEN YOUR
> RINGTONES ARE SAFE.
> http://forum.icann.org/lists/net-rfp-verisign/msg8.html
>- you know, this is probably a little late
> http://iang.org/ssl/
> - but then, this is the year of the snail
> http://www.thefeature.com/article?articleid=101334&ref=5459267
>  - stupid network vs stupider company

-- 
-
R. A. Hettinga 
The Internet Bearer Underwriting Corporation 
44 Farquhar Street, Boston, MA 02131 USA
"... however it may deserve respect for its usefulness and antiquity,
[predicting the end of the world] has not been found agreeable to
experience." -- Edward Gibbon, 'Decline and Fall of the Roman Empire'



Re: Dell to Add Security Chip to PCs

2005-02-04 Thread Steven M. Bellovin
In message <[EMAIL PROTECTED]>, Dan Kaminsky writes:
>
>>>Uh, you *really* have no idea how much the black hat community is
>>>looking forward to TCPA.  For example, Office is going to have core
>>>components running inside a protected environment totally immune to
>>>antivirus.
>>>
>>>
>>
>>How? TCPA is only a cryptographic device, and some BIOS code, nothing
>>else. Does the coming of TCPA chips eliminate the bugs, buffer overflows,
>>stack overflows, or any other way to execute arbitrary code? If yes, isn't
>>that a wonderful thing? Obviously it doesn't (eliminate bugs and so on).
>>
>>  
>>
>TCPA eliminates external checks and balances, such as antivirus.  As the 
>user, I'm not trusted to audit operations within a TCPA-established 
>sandbox.  Antivirus is essentially a user system auditing tool, and 
>TCPA-based systems have these big black boxes AV isn't allowed to analyze.
>
>Imagine a sandbox that parses input code signed to an API-derivable 
>public key.  Imagine an exploit encrypted to that.  Can AV decrypt the 
>payload and prevent execution?  No, of course not.  Only the TCPA 
>sandbox can.  But since AV can't get inside of the TCPA sandbox, 
>whatever content is "protected" in there is quite conspicuously unprotected.
>
>It's a little like having a serial killer in San Quentin.  You feel 
>really safe until you realize...uh, he's your cellmate.
>
>I don't know how clear I can say this, your threat model is broken, and 
>the bad guys can't stop laughing about it.
>

I have no idea whether or not the bad guys are laughing about it, but 
if they are, I agree with them -- I'm very afraid that this chip will 
make matters worse, not better.  With one exception -- preventing the 
theft of very sensitive user-owned private keys -- I don't think that 
the TCPA chip is solving the right problems.  *Maybe* it will solve the 
problems of a future operating system architecture; on today's systems, 
it doesn't help, and probably makes matters worse.

TCPA is a way to raise the walls between programs executing in 
different protection spaces.  So far, so good.  Now -- tell me the last 
time you saw an OS flaw that directly exploited flaws in conventional 
memory protection or process isolation?  They're *very* rare.

The problems we see are code bugs and architectural failures.  A buffer 
overflow in a Web browser still compromises the browser; if the 
now-evil browser is capable of writing files, registry entries, etc., 
the user's machine is still capable of being turned into a spam engine, 
etc.  Sure, in some new OS there might be restrictions on what such an 
application can do, but you can implement those restrictions with 
today's hardware.  Again, the problem is in the OS architecture, not in 
the limitations of its hardware isolation.

I can certainly imagine an operating system that does a much better job 
of isolating processes.  (In fact, I've worked on such things; if 
you're interested, see my papers on sub-operating systems and separate 
IP addresses per process group.)  But I don't see that TCPA chips add 
much over today's memory management architectures.  Furthermore, as Dan 
points out, it may make things worse -- the safety of the OS depends on 
the userland/kernel interface, which in turn is heavily dependent on 
the complexity of the privileged kernel modules.  If you put too much 
complex code in your kernel -- and from the talks I've heard this is 
exactly what Microsoft is planning -- it's not going to help the 
situation at all.  Indeed, as Dan points out, it may make matters worse.

Microsoft's current secure coding initiative is a good idea, and from 
what I've seen they're doing a good job of it.  In 5 years, I wouldn't 
be at all surprised if the rate of simple bugs -- the buffer overflows, 
format string errors, race conditions, etc. -- was much lower in 
Windows and Office than in competing open source products.  (I would 
add that this gain has come at a *very* high monetary cost -- training, 
code reviews, etc., aren't cheap.)  The remaining danger -- and it's a 
big one -- is the architecture flaws, where ease of use and 
functionality often lead to danger.  Getting this right -- getting it 
easy to use *and* secure -- is the real challenge.  Nor are competing 
products immune; the drive to make KDE and Gnome (and for that matter 
MacOS X) as easy to use as (well, easier to use than) Windows is likely 
to lead to the same downward security spiral.

I'm ranting, and this is going off-topic.  My bottom line: does this 
chip solve real problems that aren't solvable with today's technology?  
Other than protecting keys -- and, of course, DRM -- I'm very far from 
convinced of it.  "The fault, dear Brutus, is not in our stars but in 
ourselves."

--Prof. Steven M. Bellovin, http://www.cs.columbia.edu/~smb




Re: Dell to Add Security Chip to PCs

2005-02-04 Thread Dan Kaminsky

The best that can happen with TCPA is pretty good -
it could stop a lot of viruses and malware, for one
thing.
 

No, it can't.  That's the point; it's not like the code running inside 
the sandbox becomes magically exploitproof...it just becomes totally 
opaque to any external auditor.  A black hat takes an exploit, encrypts 
it to the public key exported by the TCPA-compliant environment (think 
about a worm that encrypts itself to each cached public key) and sends 
the newly unauditable structure out.  Sure, the worm can only manipulate 
data inside the sandbox, but when the whole *idea* is to put everything 
valuable inside these safe sandboxes, that's not exactly comforting.
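Dan's point can be shown with a toy. This is not TCPA code: a fixed XOR keystream stands in for encryption to the sandbox's public key, the "AV" is a bare signature scanner, and every name and byte here is invented for illustration.

```python
# Toy illustration: a signature-based scanner catches the payload in the
# clear but sees only noise once it is sealed for the sandbox.

SIGNATURE = b"\xde\xad\xbe\xef"              # byte pattern the scanner knows
payload = b"header" + SIGNATURE + b"rest of exploit"

def av_scan(blob):
    """Flag known byte patterns -- all a scanner can do from outside."""
    return SIGNATURE in blob

def seal_for_sandbox(data, key):
    """Stand-in for public-key sealing: XOR with a repeating key."""
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

key = bytes(range(1, 17))                    # fixed toy key, deterministic demo
sealed = seal_for_sandbox(payload, key)

assert av_scan(payload)                      # in the clear: caught
assert not av_scan(sealed)                   # sealed for the sandbox: opaque
```

Only the party holding the sandbox's private key can ever see the plaintext again -- which is exactly the property the worm author wants.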

--Dan


Re: Dell to Add Security Chip to PCs

2005-02-04 Thread Anne & Lynn Wheeler
Peter Gutmann wrote:
Neither.  Currently they've typically been smart-card cores glued to the 
MB and accessed via I2C/SMB.
and chips that typically have had eal4+ or eal5+ evaluations. hot topic 
in 2000, 2001 ... at the intel developer's forums and rsa conferences



Re: Dell to Add Security Chip to PCs

2005-02-04 Thread Anne & Lynn Wheeler
Erwann ABALEA wrote:
 > I've read your objections. Maybe I wasn't clear. What's wrong in
installing a cryptographic device by default on PC motherboards?
I work for a PKI 'vendor', and for me, software private keys is a
nonsense. How will you convince "Mr Smith" (or Mme Michu) to buy an
expensive CC EAL4+ evaluated token, install the drivers, and solve the
inevitable conflicts that will occur, simply to store his private key? You
first have to be good to convince him to justify the extra expense.
If a standard secure hardware cryptographic device is installed by default
on PCs, it's OK! You could obviously say that Mr Smith won't be able to
move his certificates from machine A to machine B, but more than 98% of
the time, Mr Smith doesn't need to do that.
Installing a TCPA chip is not a bad idea. It is as 'trustable' as any
other cryptographic device, internal or external. What is bad is accepting
to buy a software that you won't be able to use if you decide to claim
your ownership... Palladium is bad, TCPA is not bad. Don't confuse the
two.
the cost of EAL evaluation typically has already been amortized across 
large numbers of chips in the smartcard market. the manufacturing cost 
of such a chip is pretty proportional to the chip size ... and the thing 
that drives chip size tends to be the amount of eeprom memory.

in tcpa track at intel developer's forum a couple years ago ... i gave a 
talk and claimed that i had designed and significantly cost reduced such 
a chip by throwing out all features that weren't absolutely necessary 
for security. I also mentioned that two years after i had finished such 
a design ... that tcpa was starting to converge to something similar. 
the head of tcpa in the audience quipped that i didn't have a committee 
of 200 helping me with the design.



Re: Dell to Add Security Chip to PCs

2005-02-04 Thread Dan Kaminsky

Uh, you *really* have no idea how much the black hat community is
looking forward to TCPA.  For example, Office is going to have core
components running inside a protected environment totally immune to
antivirus.
   

How? TCPA is only a cryptographic device, and some BIOS code, nothing
else. Does the coming of TCPA chips eliminate the bugs, buffer overflows,
stack overflows, or any other way to execute arbitrary code? If yes, isn't
that a wonderful thing? Obviously it doesn't (eliminate bugs and so on).
 

TCPA eliminates external checks and balances, such as antivirus.  As the 
user, I'm not trusted to audit operations within a TCPA-established 
sandbox.  Antivirus is essentially a user system auditing tool, and 
TCPA-based systems have these big black boxes AV isn't allowed to analyze.

Imagine a sandbox that parses input code signed to an API-derivable 
public key.  Imagine an exploit encrypted to that.  Can AV decrypt the 
payload and prevent execution?  No, of course not.  Only the TCPA 
sandbox can.  But since AV can't get inside of the TCPA sandbox, 
whatever content is "protected" in there is quite conspicuously unprotected.

It's a little like having a serial killer in San Quentin.  You feel 
really safe until you realize...uh, he's your cellmate.

I don't know how clear I can say this, your threat model is broken, and 
the bad guys can't stop laughing about it.

I use cryptographic devices everyday, and TCPA is not different than the
present situation. No better, no worse.
 

I do a fair number of conferences with exploit authors every few months, 
and I can tell you, much worse.  "Licking chops" is an accurate assessment.

Honestly, it's a little like HID's "radio barcode number" concept of 
RFID.  Everyone expects it to get everywhere, then get exploited 
mercilessly, then get ripped off the market quite painfully. 

--Dan


Re: Dell to Add Security Chip to PCs

2005-02-04 Thread Eric Murray
On Thu, Feb 03, 2005 at 11:45:01PM -0600, Shawn K. Quinn wrote:
> Isn't it possible to emulate the TCPA chip in software, using one's own
> RSA key, and thus signing whatever you damn well please with it instead
> of whatever the chip wants to sign? So in reality, as far as remote
> attestation goes, it's only as secure as the software driver used to
> talk to the TCPA chip, right?

The TCPA chip verifies the (signature on the) BIOS and the OS.
So the software driver is the one that's trusted by the TCPA chip.

Plus the private key is kept in the chip, so it can't
be read by your emulator.  If your emulator picks its own key pair,
then its attestations will be detected as invalid by a
relying party that's using the real TCPA public keys.
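That check can be sketched as a toy model (invented names throughout; the real scheme uses RSA signatures and manufacturer-issued certificates rather than a lookup set, but the failure mode for an emulator is the same):

```python
import hashlib, os

# The manufacturer "certifies" each chip's public endorsement key at
# production time; the relying party trusts only certified keys, so an
# emulator's self-generated key is rejected before any attestation
# content is even looked at.

def make_endorsement_key():
    ek_private = os.urandom(32)                      # never leaves a real chip
    pubek = hashlib.sha256(ek_private).hexdigest()   # public identifier
    return ek_private, pubek

_, genuine_pubek = make_endorsement_key()
certified_pubeks = {genuine_pubek}           # stand-in for manufacturer certs

_, emulator_pubek = make_endorsement_key()   # emulator invents its own key...

def relying_party_trusts(pubek):
    return pubek in certified_pubeks         # ...which was never certified
```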


Eric



Re: Dell to Add Security Chip to PCs

2005-02-04 Thread Anonymous
I spent considerable time a couple years ago on these lists arguing
that people should have the right to use this technology if they want.
I also believe that it has potential good uses.  But let's be accurate.

> Please stop relaying FUD. You have full control over your PC, even if this
> one is equipped with a TCPA chip. See the TCPA chip as a hardware security
> module integrated into your PC. An API exists to use it, and one of the
> functions of this API is 'take ownership', which has the effect of
> erasing it and regenerating new internal keys.

It is not true that the TPM_TakeOwnership command erases and regenerates
the internal keys.  It does generate a new Storage Root Key, which is
used for encrypting local data.  But the main controversy around TC is
the Remote Attestation feature.  That uses a key called the Endorsement
Key, EK.  It is an RSA public key generated on chip at manufacture
time, before it comes into the user's hands.  The manufacturer issues a
certificate on the public part of the EK, called the PUBEK.  This key is
then used (in a somewhat roundabout manner) to issue signed statements
which attest to the software state of the machine.  These attestations
are what allow a remote server to know if you are running a client
software configuration which the server finds acceptable, allowing the
server to refuse service to you if it doesn't like what you're running.
And this is the foundation for DRM.

The point is that the user can't change the PUBEK.  Only one is generated
per chip, and that is the only one which gets a certificate from the
manufacturer.  The private part of this key never leaves the chip and no
one, not the user and not the manufacturer, ever learns the private key.

Now, my personal perspective on this is that this is no real threat.
It allows people who choose to use the capability to issue reasonably
credible and convincing statements about their software configuration.
Basically it allows people to tell the truth about their software in a
convincing way.  Anyone who is threatened by the ability of other people
to tell the truth should take a hard look at his own ethical standards.
Honesty is no threat to the world!

The only people endangered by this capability are those who want to be
able to lie.  They want to agree to contracts and user agreements that,
for example, require them to observe DRM restrictions and copyright
laws, but then they want the power to go back on their word, to dishonor
their commitment, and to lie about their promises.  An honest man is
not affected by Trusted Computing; it would not change his behavior in
any way, because he would be as bound by his word as by the TC software
restrictions.

But I guess Cypherpunks are rogues, thieves and liars, if my earlier
interactions with them are any guide.  It's an ironic and unfortunate
turn for an organization originally devoted to empowering end users
to use new cryptographic technologies in favor of what was once called
crypto anarchy.  TC is the ultimate manifestation of anarchic behavior,
a technology which is purely voluntary and threatens no one, which
allows people to make new kinds of contracts and commitments that no one
else should have the right to oppose.  And yet Cypherpunks are now arch
collectivists, fighting the right of private individuals and companies
to make their own choices about what technologies to use.  How the worm
has turned.

Another poster writes:
> Please stop relaying pro-DRM pabulum. The only reason for Nagscab is
> restricting the user's rights to his own files.
> Of course there are other reasons for having crypto compartments in your
> machine, but the reason Dell/IBM is rolling them out is not that.

A sad illustration of the paranoia and blinkered groupthink so prevalent
on this mailing list today.  Imagine, Dell is providing this chip as part
of a vast conspiracy to restrict the user's rights to his own files.
Anyone whose grasp on reality is so poor as to believe this deserves
what he gets.

The truth is, frankly, that Dell is providing this chip on their laptops
simply because laptop owners like the idea of having a security chip,
most other laptop companies offer them, and the TCG is the main player
in this space.  Dell is neither seeking to advance my libertarian goals
nor promoting the conspiracy-theorist vision of taking away people's
control over their computers.  The truth is far more mundane.



Using TCPA

2005-02-04 Thread Eric Murray
On Thu, Feb 03, 2005 at 11:51:57AM -0500, Trei, Peter wrote:
 
> It could easily be leveraged to make motherboards
> which will only run 'authorized' OSs, and OSs
> which will run only 'authorized' software.

[..]

> If you 'take ownership' as you put it, the internal
> keys and certs change, and all of a sudden you
> might not have a bootable computer anymore.

I have an application for exactly that behaviour.
It's a secure appliance.  Users don't run
code on it.  It needs to be able
to verify that it's running the authorized OS and software,
and that new software is authorized.
(It does this already, but a TCPA chip might do it better.)

So a question for the TCPA proponents (or opponents):
how would I do that using TCPA?
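In outline, the usual answer is measured boot. A simplified sketch (TPM 1.2-style, with invented stage names; a real chip does the extend internally and can also seal secrets to PCR values):

```python
import hashlib

# Each boot stage folds a hash of the next into a Platform Configuration
# Register before running it, so any change to any stage changes the
# final PCR value, which the chip can then report or gate on.

def extend(pcr, component):
    """PCR_new = SHA1(PCR_old || SHA1(component))"""
    return hashlib.sha1(pcr + hashlib.sha1(component).digest()).digest()

def measure_boot(stages):
    pcr = b"\x00" * 20                       # PCRs start zeroed at power-on
    for stage in stages:                     # e.g. BIOS -> loader -> OS -> app
        pcr = extend(pcr, stage)
    return pcr

golden = measure_boot([b"bios-v1", b"loader-v1", b"os-v1", b"app-v1"])
tampered = measure_boot([b"bios-v1", b"loader-v1", b"os-evil", b"app-v1"])
assert golden != tampered                    # unauthorized software shows up
```

The appliance would compare the reported PCR against the known-good value, or have the chip release a needed secret only when they match.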


Eric



RE: Dell to Add Security Chip to PCs

2005-02-04 Thread Marcel Popescu
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
> On Behalf Of Anonymous

> The only people endangered by this capability are those who want to be
> able to lie.  They want to agree to contracts and user agreements that,
> for example, require them to observe DRM restrictions and copyright
> laws, but then they want the power to go back on their word, to dishonor
> their commitment, and to lie about their promises.

This assumes a US world, which is - to say the least - a little unreal. In
my country, contracts are void unless signed in the official language. That
means that, even if I want to agree to the license, I can't legally do so -
because it's in English. Which means that I can click on "I agree" WITHOUT
legally agreeing to anything - and everybody knows that.

> An honest man is
> not affected by Trusted Computing; it would not change his behavior in
> any way, because he would be as bound by his word as by the TC software
> restrictions.

Only in the US and related countries :) We are not bound, legally or even
morally, by a contract in a foreign language - there are people who bought
Windows or some other software even though they don't speak an iota of
English. (Furthermore, I wrote a little application which can change the
caption of a button - so I can change it to "I do not agree" (or the
equivalent in my language) before installing whatever I'm installing. Do you
think that's good enough? )

> And yet Cypherpunks are now arch
> collectivists, fighting the right of private individuals and companies
> to make their own choices about what technologies to use.  How the worm
> has turned.

BS, of course. As has already been explained here, we are paranoids - we try
to defend against the worst that could happen, not against the best. 

> A sad illustration of the paranoia and blinkered groupthink so prevalent
> on this mailing list today.

Today? You're new here, right? Paranoia is the motto of the cypherpunks :)

> Imagine, Dell is providing this chip as part
> of a vast conspiracy to restrict the user's rights to his own files.

It's not THAT vast. The mere idea that it is NOT a conspiracy, OTOH, is
plainly ridiculous. They've been at it for several years, and everyone here
should know that.

> The truth is, frankly, that Dell is providing this chip on their laptops
> simply because laptop owners like the idea of having a security chip,

No really? Name five of these laptop owners. (No, that was rhetorical. Your
phrase was information-free.)

> most other laptop companies offer them, and the TCG is the main player
> in this space.

Name five other (out of the "most") laptop companies offering this chip in
their laptops. (This is NOT rhetorical, I'm really curious.)

> Dell is neither seeking to advance my liberatarian goals
> nor promoting the conspiracy-theorist vision of taking away people's
> control over their computers.  The truth is far more mundane.

Profit is a very good tool, for both good and evil. In this case, they see
profit in doing something that can ultimately be used against consumers. We
comment on that, nothing more. Then again, if the consumers catch on to the
trick, profit will dictate that they remove it. 

Marcel





RE: Dell to Add Security Chip to PCs

2005-02-04 Thread Peter Gutmann
Erwann ABALEA <[EMAIL PROTECTED]> writes:

>I've read your objections. Maybe I wasn't clear. What's wrong in installing a
>cryptographic device by default on PC motherboards? I work for a PKI 'vendor',
>and for me, software private keys is a nonsense. 

A simple crypto device controlled by the same software is only slightly less
nonsensical.  That is, the difference between software-controlled keys and a
device controlling the keys that does anything the software tells it to is
negligible.  To get any real security you need to add a trusted display, I/O
system, clock, and complete crypto message-processing capability (not just
"generate a signature" like the current generation of smart cards do), and
that's a long way removed from what TCPA gives you.

>You could obviously say that Mr Smith won't be able to move his certificates
>from machine A to machine B, but more than 98% of the time, Mr Smith doesn't
>need to do that.

Yes he will.  That is, he may not really need to do it, but he really, really
wants to do it.  Look at the almost-universal use of PKCS #12 to allow people
to spread their keys around all over the place - any product aimed at a mass-
market audience that prevents key moving is pretty much dead in the water.

>Installing a TCPA chip is not a bad idea. 

The only effective thing a TCPA chip gives you is a built-in dongle on every
PC.  Whether having a ready-made dongle hardwired into every PC is a good or
bad thing depends on the user (that is, the software vendor using the TCPA
device, not the PC user).

Peter.



Re: Dell to Add Security Chip to PCs

2005-02-04 Thread Joseph Ashwood
- Original Message - 
From: "Shawn K. Quinn" <[EMAIL PROTECTED]>
Subject: Re: Dell to Add Security Chip to PCs


Isn't it possible to emulate the TCPA chip in software, using one's own
RSA key, and thus signing whatever you damn well please with it instead
of whatever the chip wants to sign? So in reality, as far as remote
attestation goes, it's only as secure as the software driver used to
talk to the TCPA chip, right?
That issue has been dealt with. They do this by initializing the chip at the 
production plant and generating the certs there; thus the process of making 
your software TCPA emulator work actually involves faking out the production 
facility for some chips. This prevents the re-init that I think I saw 
mentioned a few messages ago (unless there's some re-signing process within 
the chip to allow back-registering -- entirely possible, but unlikely). It 
gets even worse from there because the TCPA chip actually verifies the 
operating system on load, and then the OS verifies the drivers -- a solid 
chain of verification. Honestly, Kaminsky has the correct idea about how to 
get into the chip and break the security: one small unchecked buffer and all 
the security disappears forever.
   Joe

Trust Laboratories
Changing Software Development
http://www.trustlaboratories.com 



Re: Dell to Add Security Chip to PCs

2005-02-04 Thread Tyler Durden
I don't know how clear I can say this, your threat model is broken, and the 
bad guys can't stop laughing about it.
Come on, now...who's going to be better at Security than Microsoft? Since 
bad guys won't be allowed inside the TCPA world then everything's going to 
be just fine.

Seems like the "evil packet" idea will be useful here...bad packets should 
have their "evil bit" set to one, and they won't be alllowed inside.

-TD



Re: Dell to Add Security Chip to PCs

2005-02-04 Thread Shawn K. Quinn
On Thu, 2005-02-03 at 22:25 +0100, Anonymous wrote:
> The manufacturer issues a certificate on the public part of the EK,
> called the PUBEK.  This key is then used (in a somewhat roundabout
> manner) to issue signed statements which attest to the software state
> of the machine.  These attestations are what allow a remote server to
> know if you are running a client software configuration which the
> server finds acceptable, allowing the server to refuse service to you
> if it doesn't like what you're running. And this is the foundation for
> DRM.

Isn't it possible to emulate the TCPA chip in software, using one's own
RSA key, and thus signing whatever you damn well please with it instead
of whatever the chip wants to sign? So in reality, as far as remote
attestation goes, it's only as secure as the software driver used to
talk to the TCPA chip, right?

-- 
Shawn K. Quinn <[EMAIL PROTECTED]>