> Is anyone familiar with any weak points that may exist within the CLR
> with regards to ensuring files are not tampered with?  Given a file
> that is strong named and digitally signed, we're meant to rest assured
> that the file is completely tamper proof.  However, there has to be a
> weak point somewhere along the line.

If you mean bugs specifically, then I'm not aware of any. There probably
are some right now, but I'm not sure why you say there "has to be a weak
point". There are, however, several issues with the current system. I
will elaborate anon.

> Does anyone know if hackers have ever succeeded in hacking the CLR so
> that it will pass a file that has been tampered with even though the
> unencrypted hash of the file will not match after the tampering?

Err, if they can modify things like mscorlib, you're screwed, dude. The
whole thing rests on the assumption that the underlying platform is
secure. Meaning, you use NTFS with strong passwords, etc. In the absence
of file system security, of course, they can just modify your client.exe
directly.

> In other words, we rely upon the CLR being hackproof when we rely upon
> a strongly named & digitally signed assembly being tamper proof.

Yes.

> I have plans for an unmanaged app to host the CLR and load an
> assembly.  I hope to rely upon the CLR to ensure that the assembly
> that I am loading has not been altered.

OK. Here's where I explain what I meant before.

First of all, you have to understand what the signature checking really
gets you. Two things, really:

1) It tells you that the public key hasn't changed since you built your
client.
2) It tells you that the private key used to sign this assembly
corresponded to the public key that's embedded in it.

Note what this does *not* tell you.

A) That the public key belonged to someone you should trust.
B) That the public key wasn't changed before you built your client.
C) That the person who owns the private key didn't post it on a web page
somewhere.
D) That the assembly has not been altered since you built your client -
it just tells you that if it was altered, whoever had the right private
key was the one who did the altering.
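To make (2) and (D) concrete, here's a toy sketch in Python of what the signature check amounts to conceptually. The RSA parameters are tiny made-up ones and the "assembly" is just a byte string; real strong names use 1024-bit keys and a specific blob format, so treat this as an illustration of the idea, not the actual mechanism:

```python
import hashlib

# Toy RSA parameters for illustration only; real strong names use
# 1024-bit (or larger) RSA keys and a specific key-blob format.
p, q = 2**31 - 1, 2**61 - 1            # two small Mersenne primes
n = p * q                              # public modulus
e = 65537                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent (Python 3.8+)

def sign(assembly_bytes: bytes) -> int:
    # Hash the file, then apply the *private* key to the hash.
    h = int.from_bytes(hashlib.sha1(assembly_bytes).digest(), "big") % n
    return pow(h, d, n)

def verify(assembly_bytes: bytes, signature: int) -> bool:
    # Recompute the hash and check it against the signature, using only
    # the *public* key that ships embedded in the assembly.
    h = int.from_bytes(hashlib.sha1(assembly_bytes).digest(), "big") % n
    return pow(signature, e, n) == h

il = b"IL and metadata of client.exe"
sig = sign(il)
print(verify(il, sig))              # True: the matching private key signed it
print(verify(il + b"patch", sig))   # False: bytes changed after signing
```

Note that a successful `verify` proves only that whoever produced the signature held the private key matching the embedded public key; it says nothing about who that is or whether the key leaked.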

There is an additional weakness in this scheme: most compilers don't
actually record the public key in the client, but rather a 64-bit hash
of the public key (the public key token). That's hard to attack by brute
force, but (I believe) not impossible. I expect someone has already
launched just such an attack against the MSFT and ECMA public keys, so
they can find other public keys that hash to the same token. It may take
a few years, but if it's less than five, that's still a problem.
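For what it's worth, the token itself is cheap to compute: it's the last 8 bytes of the SHA-1 hash of the public key blob, in reversed byte order. A quick Python sketch (the blob here is a made-up placeholder, not a real key):

```python
import hashlib

def public_key_token(public_key_blob: bytes) -> str:
    """Compute a .NET public key token: the last 8 bytes of the SHA-1
    hash of the public key blob, byte-reversed, as lowercase hex."""
    digest = hashlib.sha1(public_key_blob).digest()
    return digest[-8:][::-1].hex()

# Placeholder blob just to exercise the function; a real blob is the
# header-plus-RSA-key structure that sn.exe would show you.
blob = bytes(range(160))
print(public_key_token(blob))   # 16 hex chars: only 64 of the 160 hash bits
```

Since only 64 of SHA-1's 160 bits survive the truncation, an attacker doesn't need to break SHA-1 itself; he just needs any key whose hash happens to end in the same 8 bytes.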

I think the problem comes down to calculating the SHA-1 (160-bit) hash
of 2^63 (on average) candidate public keys. Someone else might know how
much CPU that would take. Presumably a single hash is not prohibitively
expensive, since MSFT makes the CLR compute it once every time it loads
a signed assembly.
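Back-of-the-envelope, under an assumed hash rate (the 10^9 hashes/sec figure below is a guess, not a benchmark):

```python
SECONDS_PER_YEAR = 365 * 24 * 3600

def years_for_token_search(hashes_per_second: float) -> float:
    # Expected work for a 64-bit token match: ~2**63 candidate hashes.
    return 2**63 / hashes_per_second / SECONDS_PER_YEAR

# Assumed rates, purely illustrative:
print(round(years_for_token_search(1e9), 1))   # one machine at 10**9 h/s: ~292 years
print(round(years_for_token_search(1e12), 1))  # ~1000 such machines: ~0.3 years
```

So a lone attacker is probably out of luck, but the job parallelizes perfectly, which is exactly why "less than five years" isn't far-fetched for someone with a lot of machines.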

Of course, this all comes down to this: who are you trying to protect
your app against? Your coworkers? Teenagers? Determined hackers?
Governments? If the latter, I suggest you unplug the machine from the
network and put it in a locked room.

In short: It Depends. ;)

You can read messages from the Advanced DOTNET archive, unsubscribe from
Advanced DOTNET, or subscribe to other DevelopMentor lists at
http://discuss.develop.com.