Re: the effects of a spy
On Tue, Nov 15, 2005 at 06:31:30PM -0500, Perry E. Metzger wrote:

> Steven M. Bellovin [EMAIL PROTECTED] writes:
> > Bruce Schneier's newsletter Cryptogram has the following fascinating
> > link: http://www.fas.org/irp/eprint/heath.pdf It's the story of the
> > effects of a single spy who betrayed keys and encryptor designs.
>
> Very interesting indeed. I was unaware that the military had such
> astonishingly bad key management practices. One wonders if things have
> actually improved.

Probably not. I'm an outsider listening in, but what I can hear seems to
say they are no better at key management. Or at crypto gear which does
not get in the way of fast, reliable tactical communications.

> One thing one hopes has changed is that it is no longer necessary for
> everyone to share the same keying material among so many different
> endpoints. Public key cryptography and key negotiation could (in
> theory) make it unnecessary to store shared secrets for long periods
> of time before use, where they are rendered vulnerable to espionage.
> One hopes that, over the last thirty years, this or something
> analogous has been implemented.

The term "broadcast" has a special meaning in the radio world: it is by
definition one-way. Thus the fleet broadcast was sent to all the ships
and each picked out its own messages. Key negotiation probably was never
practical on those circuits. The broadcast became available via
satellite sometime in the sixties. It was 75 baud teletype. It is still
there today.

> One intriguing question that I was left with after reading the whole
> thing was not mentioned in the document at all. One portion of the
> NSA's role is to break other people's codes. However, we also have to
> assume that equipment would fall into the wrong people's hands at
> intervals, as happened with the Pueblo incident. If properly designed,
> the compromise of such equipment won't reveal communications, but
> there is no way to prevent it from revealing methods, which could then
> be exploited by an opponent to secure their own communications.

I doubt the top-level equipment could fall into the wrong people's
hands, as it is probably not in the field. The tactical systems don't
need to be as good, since the information is not useful for very long.
With any luck, the EP-3 that landed in China did not give up as much
info. The CD-ROMs for loading the computers become unreadable after a
few seconds in the microwave oven. :)

> Does the tension between securing one's own communications and
> breaking an opponent's communications sometimes drive the use of
> COMSEC gear that may be too close to the edge for comfort, for fear of
> revealing too much about more secure methods? If so, does the public
> revelation of Suite B mean that the NSA has decided it prefers keeping
> communications secure to breaking opposition communications?

There is probably some level where this is considered, but there is
little indication the military is not about as far behind the real
world as they have always been. We also can hope the intel function has
shifted from breaking diplomatic and military communications to sifting
out the gems from the pebbles in the landslide of general telecomm.

And there is the problem of brainpower. The military and NSA probably
have less now than during real wars. Note that by current standards,
Alan Turing could not get a US security clearance.

LRK

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
Re: public-key: the wrong model for email?
On Thu, Sep 16, 2004 at 04:57:39PM -0700, Bill Stewart wrote:

> At 10:19 PM 9/15/2004, Ed Gerck wrote:
> > Yes, PKC provides a workable solution for key distribution... when
> > you look at servers. For email, the PKC solution is not workable
> > (hasn't been) and gives a false impression of security. For example,
> > the sender has no way of knowing if the recipient's key is weak (in
> > spite of its length) or has some key-access feature. Nonetheless,
> > the sender has to use that key.
>
> I don't understand the threat model here.

That seems to be the actual problem. If you want real security, you need
a vault, guards, cryptographers, and to do the crypto in the vault. I
use GnuPG so my e-mail is in an envelope rather than on a postcard. If
the fedz want to read it they bring guns, slammers, and rubber hoses
anyway.

Perhaps it is time to define an e-mail definition of crypto to keep the
postman from reading the postcards. That should be easy enough to
implement for the average user and provide some degree of privacy for
their mail. Call it "envelopes" rather than "crypto". Real security
requires more than a Windoz program.

--
[EMAIL PROTECTED]
Re: Cryptography and the Open Source Security Debate
On Wed, Aug 25, 2004 at 03:17:15PM +0100, Ben Laurie wrote:

> lrk wrote:
> > My examination of RSAREF and OpenSSL code was more toward
> > understanding how they handled big numbers. It appears both generate
> > prime numbers which are half the length of the required N and with
> > both of the two most significant bits set to one. This means the
> > ratio R=P/Q (P being the larger prime) is limited to 1 < R < 4/3.
> > The actual maximum R is less and can be determined by examining N.
>
> This doesn't sound right to me - OpenSSL, IIRC, sets the top and
> bottom bits to 1.

Of course, all large primes have the bottom bit set to one.

The source of OpenSSL I looked at was part of the FreeBSD distribution.
Its man page reads:

    int BN_rand(BIGNUM *rnd, int bits, int top, int bottom);

    BN_rand() generates a cryptographically strong pseudo-random number
    of bits bits in length and stores it in rnd. If top is -1, the most
    significant bit of the random number can be zero. If top is 0, it is
    set to 1, and if top is 1, the two most significant bits of the
    number will be set to 1, so that the product of two such random
    numbers will always have 2*bits length. If bottom is true, the
    number will be odd.

It appears this is called with top=1 for RSA primes. OpenSSL may not use
it that way.

--
[EMAIL PROTECTED]
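A toy sketch of what that top=1, bottom=1 behaviour does (my own Python, not OpenSSL code; the function name is mine) -- forcing the two most significant bits and the least significant bit to 1, so the product of two such numbers always fills exactly 2*bits:

```python
import secrets

def bn_rand_top1_bottom1(bits):
    """Random odd number of exactly `bits` bits with the two most
    significant bits set, mimicking BN_rand(..., top=1, bottom=1)
    as described in the man page quoted above."""
    n = secrets.randbits(bits)
    n |= (1 << (bits - 1)) | (1 << (bits - 2))  # top=1: set top two bits
    n |= 1                                      # bottom=1: make it odd
    return n

a = bn_rand_top1_bottom1(512)
b = bn_rand_top1_bottom1(512)

# Each number is at least 3*2^(bits-2), so the product is at least
# 9*2^(2*bits-4) > 2^(2*bits-1), i.e. exactly 2*bits bits long:
assert (a * b).bit_length() == 1024
```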
Re: Cryptography and the Open Source Security Debate
On Thu, Aug 12, 2004 at 03:27:07PM -0700, Jon Callas wrote:

> On 10 Aug 2004, at 5:16 AM, John Kelsey wrote:
> > So, how many people on this list have actually looked at the PGP key
> > generation code in any depth? Open source makes it possible for
> > people to look for security holes, but it sure doesn't guarantee
> > that anyone will do so, especially anyone who's at all good at it.
>
> Incidentally, none of the issues that lrk brought up (RSA key being
> made from an easy-to-factor composite, a symmetric key that is a weak
> key, etc.) are unique to PGP.

Yep. And I know that. But as my hair turns grey, I make more simple
mistakes and catch fewer of them.

Looks like we are batting zero here. I have seen no responses, nor
received off-list e-mail from anyone admitting to examining the open
source for holes.

My examination of RSAREF and OpenSSL code was more toward understanding
how they handled big numbers. It appears both generate prime numbers
which are half the length of the required N and with both of the two
most significant bits set to one. This means the ratio R=P/Q (P being
the larger prime) is limited to 1 < R < 4/3. The actual maximum R is
less and can be determined by examining N. While this seems not very
helpful, the more bits of R I know, the easier it is to factor N.

Is this well known, and has it been discussed here?

--
[EMAIL PROTECTED]
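The bound on R = P/Q can be checked empirically: if both primes have their top two bits set, both lie in [3*2^(bits-2), 2^bits), so the ratio of the larger to the smaller is below (2^bits)/(3*2^(bits-2)) = 4/3. A sketch in Python (my own code, not RSAREF or OpenSSL):

```python
import secrets

def is_probable_prime(n, rounds=20):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = 2 + secrets.randbelow(n - 3)
        x = pow(a, d, n)
        if x == 1 or x == n - 1:
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def gen_prime(bits):
    """Random prime with the two most significant bits forced to 1."""
    while True:
        n = secrets.randbits(bits) | (1 << (bits - 1)) | (1 << (bits - 2)) | 1
        if is_probable_prime(n):
            return n

q, p = sorted([gen_prime(128), gen_prime(128)])
r = p / q
# Both primes are in [3*2^126, 2^128), hence 1 <= R < 4/3:
assert 1 <= r < 4 / 3
```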
Re: Monoculture / Guild
On Thu, Oct 02, 2003 at 03:34:35PM -0700, John Gilmore wrote:

> ... it does look very much from the outside that there is an informal
> Cryptographers Guild in place...
>
> The Guild, such as it is, is a meritocracy; many previously unknown
> people have joined it since I started watching it in about 1990. The
> way to tell who's in the Guild is that they can break your protocols
> or algorithms, but you can't break theirs.

The problem with guilds is that they become set in their ways. Ask here
how the fact that not all large numbers are hard to factor affects RSA
and you will be ignored or dismissed. Ask whether cubic meters of
special hardware could brute-force keys better than the same cubic
meters of supercomputers and you get the same.

As a perennial outsider, I notice this in several fields. I'm not in
the guild for measuring the specific gravity of gases -- which is
precisely why my name is on the patent for the smallest machine
(4,677,841).

--
| Lyn Kennedy | E-mail: [EMAIL PROTECTED]  |
| K5QWB       | ICBM: 32.5 North 96.9 West |
---Livin' on an information dirt road a few miles off the superhighway---
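One concrete sense in which not all large numbers are hard to factor: when the two primes of N = P*Q are close together (R near 1), Fermat's method recovers them almost immediately. A toy sketch in Python, using deliberately small primes for illustration:

```python
from math import isqrt

def fermat_factor(n, max_steps=10**6):
    """Try to write n = a^2 - b^2 = (a-b)(a+b).
    Very fast when n's two factors are close together."""
    a = isqrt(n)
    if a * a < n:
        a += 1
    for _ in range(max_steps):
        b2 = a * a - n
        b = isqrt(b2)
        if b * b == b2:
            return a - b, a + b
        a += 1
    return None  # gave up; factors too far apart for this many steps

# Two nearby primes: Fermat finds them on the very first step,
# because a = (p+q)/2 is already the integer just above sqrt(n).
p, q = 1000003, 1000033
assert fermat_factor(p * q) == (p, q)
```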