Re: [Cryptography] NSA and cryptanalysis
On Fri, Aug 30, 2013 at 07:17:08AM -0400, Jerry Leichter wrote:

> So the latest Snowden data contains hints that the NSA (a) spends a
> great deal of money on cracking encrypted Internet traffic; (b) recently
> made some kind of a cryptanalytic "breakthrough". What are we to make
> of this? (Obviously, this will all be wild speculation unless Snowden
> leaks more specific information - which wouldn't fit his style, at least
> as demonstrated so far.)

	I wonder how much of the editing of the recent Snowden data is in any way related to Snowden himself (who is presumably very completely controlled and monitored by the Russians at the moment). The story as I understand it (from afar) is that he expropriated roughly 20,000 complete NSA documents and has turned some of them - mostly complete and unedited - over to his journalist collaborators, who have in turn passed some of those to their larger news organizations, where the editors have figured out what parts of them to publish under great pressure from various spooks and high officials NOT to publish certain information.

	What we have seen so far looks like it was heavily bowdlerized under very great pressure from various governments, and it seems very likely MOST if not all of this pressure was aimed at the editorial and management level of the news organizations, not at Snowden himself (who is beyond their reach, obviously, but also not in a position to control much about what is published). In the end it is pretty likely that nobody in senior management of the media organizations involved really wants to take responsibility for leaking something that actually destroys a major US intelligence edge... and what was left out "to protect legitimate US intelligence secrets" or "technical methods" is anyone's guess at the moment.
	Inevitably, however, *some* of this will eventually leak out of the media organizations, to the extent that it has passed beyond a very, very small circle of people there. What is not clear is how many of those folks at the media organizations know enough about the technological implications of what they are reading to understand its long-term significance. A cryptanalytic "breakthrough" might be huge and fundamental, invalidating a lot of currently deployed cryptography - or just a new and very effective attack on some aspect of a commonly used security protocol that can be easily patched once it is known.

> -- Jerry

-- 
Dave Emery N1PRE/AE, d...@dieconsulting.com  DIE Consulting, Weston, Mass 02493
"An empty zombie mind with a forlorn barely readable weatherbeaten 'For Rent' sign still vainly flapping outside on the weed encrusted pole - in celebration of what could have been, but wasn't and is not to be now either."

___
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography
Re: How the Greek cellphone network was tapped.
On Sat, Jul 21, 2007 at 12:56:00PM -0400, Steven M. Bellovin wrote:

> On Sat, 21 Jul 2007 04:46:51 -0700 (PDT)
> look at 18 USC 2512
> (http://www4.law.cornell.edu/uscode/html/uscode18/usc_sec_18_2512000-.html)
>
>     any person who intentionally ...
>
>     manufactures, assembles, possesses, or sells any electronic,
>     mechanical, or other device, knowing or having reason to know
>     that the design of such device renders it primarily useful for the
>     purpose of the surreptitious interception of wire, oral, or
>     electronic communications, and that such device or any component
>     thereof has been or will be sent through the mail or transported
>     in interstate or foreign commerce;
>
>     ...
>
> So simple possession of a surreptitious interception device is illegal,
> with exceptions for things like sale to law enforcement or
> communications companies.

	This language was originally aimed at "bugs" - hidden microphones and other similar devices with essentially no purpose other than intercepting conversations. These devices are usually called "Title III" devices and are indeed illegal as defined above except in the hands of law enforcement and the like; private use and even possession is forbidden. And there have been many prosecutions for possession, sale, trafficking in, and importing "bugs" and similar intercept hardware - mostly of "Spy Shop" operators who import this stuff from abroad and sell it to sleazy private investigators and divorcing spouses.

	This language has been around since the 1968 Omnibus Act was passed, and it was extended with the passage of the 1986 ECPA to cover "wire, oral, or electronic communications". It is not new and did not result from the Newt Gingrich intercept or other more recent incidents. AFAIK (and IANAL), the DOJ has rarely if ever applied Title III to ordinary radio receivers or other hardware which has general-purpose uses.
	Scanners and other radio receivers sold to the general public are regulated by the FCC under authority created in 1993, and FCC rules were substantially toughened around 1999 to require that scanners not be readily modifiable to tune analog cellular frequencies and that they meet certain design criteria intended to make such modification harder and to make it harder to hear cellular calls on image frequencies. These rules also make it illegal to modify scanners to tune cellular calls. I know of no court case which has established that sale or possession of scanners or radio receivers built before the ban on cellular reception went into effect is illegal, and many tens of thousands if not hundreds of thousands of such radios are in circulation (and sold regularly on eBay).

	In recent years there have been a small number of prosecutions under Title III for sale or possession of radio equipment and software to intercept commercial common-carrier pager transmissions. There is at least one precedent that defines such software as a Title III device. This probably means that software specifically intended to enable intercept of any other signal that is not legal to listen to might also be declared a Title III device, though I am unaware of this having happened as of yet.

	However, even though the cell industry asked the FCC to do so, the FCC has declined to regulate test equipment - including test equipment that can tune and demodulate digital cellular and other forbidden RF signals - provided it is not marketed to the general public. It is not illegal to possess, sell, import, export, manufacture, or modify such gear, though of course it is illegal to actually use such gear to intercept signals not included in the list of allowed-to-listen-to signals in section 119 of Title III.
	And obviously regulation of test equipment would pose some very difficult problems - since many, many common real-world RF tests require DC-to-daylight coverage without gaps to spot spurious signals, mixing products, noise, interference, etc., and crippled test equipment could NOT do this job.

-- 
Dave Emery N1PRE/AE, [EMAIL PROTECTED]  DIE Consulting, Weston, Mass 02493

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
Re: It's a Presidential Mandate, Feds use it. How come you are not using FDE?
On Tue, Jan 16, 2007 at 11:33:46AM -0500, Steven M. Bellovin wrote:

> On Tue, 16 Jan 2007 08:19:41 -0800
> "Saqib Ali" <[EMAIL PROTECTED]> wrote:
>
> > Dr. Bellovin,
> >
> > > In most situations, disk encryption is useless and probably harmful.
> > > It's useless because you're still relying on the OS to prevent
> > > access to the cleartext through the file system, and if the OS can
> > > do that it can do that with an unencrypted disk.
> >
> > I am not sure I understand this. With FDE, the HDD is unlocked by a
> > pre-boot kernel (linux). It is not the function of the resident OS to
> > unlock the drive.
>
> Not necessarily -- many of my systems have multiple disk drives and
> file systems, some of which are on removable media. Apart from that,
> though, this is reinforcing my point -- what is the threat model?

	Seems to me the threat model is real and obvious: physical access to the disk hardware, either by theft or (worse) by stealth (eg black bag jobs, or insider access at night or on weekends). Think of someone either image-copying or stealing a drive that contains valuable data... most of the time this necessarily involves either powering it down or disconnecting it in a way that can be readily detected by drive and host interface firmware. If this results in zeroization of the working key in the drive - requiring some kind of re-authentication of host to drive and drive to host, and then a reload of the key, before the data can be read - it at least becomes significantly harder to steal data by just unplugging the drive and either walking out the door with it in your briefcase or plugging it into another system for an image copy before returning it to its normal home. Needless to say, if the drive and its contained file systems aren't encrypted this is pretty low-hanging fruit: relatively unskilled attackers can easily capture very valuable material if they can gain physical access for only a few minutes.
	And further, unusual events - disasters such as floods, fires, tornadoes, building collapses and the like - can result in massive exposure of confidential data amidst the ruins, whereas if the disks in desktops and servers were encrypted, capture of - or covert access to - the drives in the chaos surrounding a crisis is much less useful to an adversary.

	Obviously it may be possible for really sophisticated attackers to somehow unplug drives from live machines without the key zeroization happening, and presumably without the host noticing, raising an alarm, and logging the event. But given the mechanical design of modern high-end desktop and server boxes, with a common connector for power and signals on the current generation of SATA drives, this is at the very least significantly more challenging to do without getting noticed or caught than just causing a fake power fail and removing the disks. And it can be made harder still by appropriately modest hardware, firmware, and system tweaks.

	Obviously too, a disk whose surface is encrypted with a key it forgets when the power is off can be quite safely shipped, stored, or even decommissioned and destroyed without much danger of disclosure of the confidential data contained therein. This is far more useful in practice than it might at first seem, as it reduces costs and risks a lot in many common situations where drives and even entire machines need to be moved, stored, sold, scrapped, and shipped around in untrusted hands.

	And a server or desktop that is depowered (if it is truly depowered - not always the case with modern hardware) can be assumed to be in a fairly secure state (presuming the key reload on power-up requires some external intervention), whereas a traditional server or desktop with in-the-clear disks that contains highly sensitive information is in fact MORE vulnerable when powered down, in that its disks can be removed, image-copied, and returned to the system without much of anything being the wiser.
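The power-loss zeroization behavior described above can be sketched as a toy model. This is a minimal illustration of the idea, not any real drive's firmware interface; the class and method names are assumptions invented for this sketch, and the actual decryption machinery is elided:

```python
import secrets


class SelfEncryptingDrive:
    """Toy model: the media (working) key exists only in volatile drive
    RAM while the drive is powered and authenticated.  The platters hold
    only ciphertext.  (Illustrative sketch, not a real drive interface.)"""

    def __init__(self, auth_secret: bytes):
        self._auth = auth_secret       # shared secret for host<->drive auth
        self._working_key = None       # volatile; never written to media

    def power_on(self, credential: bytes) -> None:
        # The host must re-authenticate before the working key is reloaded.
        if credential != self._auth:
            raise PermissionError("host authentication failed; drive stays locked")
        # Stand-in for unwrapping the real media key from a sealed key blob.
        self._working_key = secrets.token_bytes(32)

    def power_loss(self) -> None:
        # Pulling power or the combined SATA power/signal connector
        # zeroizes the working key: only ciphertext remains recoverable.
        self._working_key = None

    def read_sector(self, n: int) -> bytes:
        if self._working_key is None:
            raise RuntimeError("drive locked: media holds only ciphertext")
        # Real hardware would decrypt the sector with the working key here.
        return b"plaintext of sector %d" % n


# An attacker who unplugs the drive gets only the locked state:
drive = SelfEncryptingDrive(b"host-credential")
drive.power_on(b"host-credential")
data = drive.read_sector(0)            # readable while powered and authed
drive.power_loss()                     # theft / fake power fail
```

After `power_loss()`, any further `read_sector()` call raises, which models the point in the post: image-copying the drive after pulling it yields nothing without a fresh host-to-drive re-authentication.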
	A powered-up machine is much more likely to at least log anomalous events that can be detected - if not suspiciously crash altogether - when its disks are removed or disconnected. This paradoxically makes the systems in a typical office most vulnerable exactly when they are least well monitored and protected: nights, weekends, and other off hours.

	So I do think the classic FDE with AES in the drive ASICs does gain something meaningful against this kind of threat, though obviously the most sophisticated and careful attacks can defeat it. But defeating the less elaborate attacks at least removes an AWFUL lot of low-hanging fruit, and in doing so materially increases overall security. There are far fewer really sophisticated attackers than common (and often pretty stupid) petty criminals near computers, after all.

	Back under my rock...

-- 
Dave Emery N1PRE, [EMAIL PROTECTED]  DIE Consulting, Weston, Mass 02493
Re: gang uses crypto to hide identity theft databases
On Fri, Dec 22, 2006 at 10:57:17AM -0800, Alex Alten wrote:

> I'm curious as to why the cops didn't just pull the plugs right away. It
> would probably take a while (minutes, hours?) to encrypt any significant
> amount of data.

	At the risk of stating the obvious, this is almost certainly a case of key zeroization rather than suddenly encrypting otherwise in-the-clear databases. What one does is ALWAYS encrypt all the data, but store only one single copy of the key(s) required to decrypt it, and make provision for some kind of dead-man switch that zeroizes the key store when pushed. Shutting off the power leaves almost all of the data intact and unaltered, but without the keys it is just random bits.

	Special switches and hardware assistance for key zeroization are a very standard feature of US government crypto gear and installations. The idea is that one zeroes the key if one is expecting to be captured (or crash or sink), and then all the remaining data in non-volatile storage is useless to an adversary who recovers the media and attempts to read it.

-- 
Dave Emery N1PRE, [EMAIL PROTECTED]  DIE Consulting, Weston, Mass 02493
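The always-encrypt-plus-zeroize scheme above can be sketched in a few lines. This is a toy illustration only: the `EncryptedStore` class and its methods are invented for the sketch, and the SHA-256 counter-mode keystream is a stand-in for a real cipher (e.g. AES-XTS on a drive); the point is that zeroizing one small key slot renders all the untouched ciphertext unreadable:

```python
import hashlib
import secrets


def keystream(key: bytes, length: int) -> bytes:
    # Toy counter-mode keystream from SHA-256; NOT a real cipher,
    # just a stand-in so the sketch is self-contained.
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])


class EncryptedStore:
    """Everything is written encrypted; the ONLY copy of the key lives
    in one zeroizable slot (modeling the dead-man switch in the post)."""

    def __init__(self):
        self._key = bytearray(secrets.token_bytes(32))  # single key copy
        self.blocks = []                                # "non-volatile" ciphertext

    def write(self, plaintext: bytes) -> None:
        ks = keystream(bytes(self._key), len(plaintext))
        self.blocks.append(bytes(a ^ b for a, b in zip(plaintext, ks)))

    def read(self, i: int) -> bytes:
        ct = self.blocks[i]
        ks = keystream(bytes(self._key), len(ct))
        return bytes(a ^ b for a, b in zip(ct, ks))

    def zeroize(self) -> None:
        # Dead-man switch: overwrite the key slot in place.  The stored
        # ciphertext blocks are untouched - and now just random bits.
        for j in range(len(self._key)):
            self._key[j] = 0


store = EncryptedStore()
store.write(b"stolen card number database")
recovered = store.read(0)   # readable while the key slot is intact
store.zeroize()             # "cops at the door" - one write, data is gone
```

Note that `zeroize()` costs a single 32-byte overwrite regardless of how many gigabytes sit in `blocks` - which is exactly why pulling the plug afterwards, as in the quoted question, recovers nothing.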