Ian asked about the possibility of distributing binaries built with a crypto toolkit. My initial view was that closed source and trustworthy crypto are incompatible, but on reflection, I can think of circumstances where that might not be true.

Example: you're a company that builds hardware devices which need to talk to each other securely (ATMs, say). Obviously it wouldn't make sense for the company to have to supply its ATM-using customers with the source code of the ATMs. So where should one draw the line?

This is an important question (for me, at least) since it affects the licensing of the yet-to-be-written TLS++ project.

After a lot of thought, I think it all boils down to the simple question: "trusted by whom?". I might trust an application for any of a variety of reasons. That does not mean that you have to trust it.

It seems to me, therefore, that if you're putting together a crypto app which is going to run in an embedded software environment, in a chip, on some product, you need to consider WHO is going to be relying on the crypto services of that product. If it's just you (or your company) then only you (or your company) need to trust it, so only you (or your company) need to see the source code. On the other hand, if the public are going to be using it, then how will /they/ be assured that there are no Trojans in the chips? Should they just take your word for it? Even if you gave them the source code, the public would (in general) have no way of verifying that it actually WAS the source code. They couldn't (in general) compile the code down to the processor-specific machine code used on that device and burn in the new binary. Basically, this means you can never trust a hardware device you didn't build yourself.

But ... if nobody apart from you (or your company) is going to be relying on the crypto, then surely you should be allowed to use TLS++?

With software, though, this would be an unusual circumstance. A claim such as "download this app and you will be secure" ought to have to be proven, and if the app is built with TLS++ that would mean distributing the source code. But would that mean distributing the source code for the whole product, or just the crypto library parts? I would argue it would have to be the whole product; otherwise, how can a user know whether what you /claim/ is the source code actually /is/ the source code?
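The verification step above is essentially what is now called a reproducible build: if the build process is deterministic, a sceptical user can rebuild from the published source and byte-compare the result against the shipped binary. A toy sketch of the idea (gzip stands in for a real deterministic compiler, and every file name here is made up for illustration):

```shell
# Toy sketch of "does the published source reproduce the shipped binary?"
# gzip -n (no timestamp in the header) stands in for a deterministic build.
printf 'hello crypto\n' > demo_source.txt            # stand-in for the published source
gzip -nc demo_source.txt > demo_vendor_binary        # what the vendor ships
gzip -nc demo_source.txt > demo_my_rebuild           # what the user rebuilds independently
if cmp -s demo_vendor_binary demo_my_rebuild; then
  echo "MATCH: the published source reproduces the shipped binary"
else
  echo "MISMATCH: the shipped binary was not built from this source"
fi
```

Of course, this only works if the whole toolchain is deterministic, which is exactly why partial source releases don't settle the question.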

I'm lost on this one. I don't have any answers, and I'm hoping someone else does. I don't want to restrict the distribution of TLS++, but I also don't want crippled versions of it being used to fool the public. Could anyone help me outline a reasonable solution?

Maybe the solution should be this: You can distribute the binary without any source code whatsoever, and use this toolkit, unrestricted, in whatever manner you choose, provided that EITHER you distribute the source code for the whole product in a form which allows the user to reconstruct a working executable from the source code, OR you include a message which says something like "Warning - this product is closed source. If you rely on its crypto features, you do so at your own risk".

(Of course, it's also "at your own risk" if it's open source. The difference is, you have a better idea of the risk).

Jill


---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
