Re: going around the crypto
This isn't really a problem with the servers, though; the problem lies in the fact that client-side certs are (effectively) unworkable. I know of a number of organisations who wanted to use them and ran into so many problems just with pilots involving small numbers of (presumably) experienced users that they gave up on trying to deploy them to the masses. If we had an all-encompassing PKI (i.e. if cert management was easy), and if we could assume technically clueful users, and if those users really cared about security (rather than seeing it as an impediment to getting their work done, which it often is), then client-side certs would be feasible. At the moment they're just not workable except within closed communities where you can control a lot of the parameters (you run the PKI, you vet the users, and you tell them you won't talk to them unless they take the appropriate security precautions, and hope you don't have competitors who'll let them in without this).

It is my understanding that MIT has a number of widely-used web applications (e.g. student registration) that have been using only client certs for authentication for a couple of years with reasonable success. You might say that this makes your point (these are MIT people, after all, hence closed, vetted, clueful, etc.), but it is reasonably large-scale (~20K users or so, I think). The point, perhaps, is that this PKI deployment duplicates, more or less, for the web the functionality that Kerberos provided ten years ago for its suite of applications (telnet etc.). So it's comforting to know that a PKI can do that much. *8^)* - RL "Bob"
Re: going around the crypto
"Steven M. Bellovin" wrote: It's clearly not automatic, but I suspect it would work User behaviour is the weak point here--while the browsers WILL notify you that the cert is signed by a CA you don't recognize, they also give you the option of accepting the cert, which most users will just blindly accept. Netscape gives you a couple of options here--accept the site cert for this session only, or accept it forever; I expect lots of users will choose "forever", since that's simpler. -- -- Marcus Leech Mail: Dept 8M70, MS 012, FITZ Systems Security Architect Phone: (ESN) 393-9145 +1 613 763 9145 Security and Internet Solutions Fax: (ESN) 395-1407 +1 613 765 1407 Nortel Networks [EMAIL PROTECTED] -Expressed opinions are my own, not my employer's--
Bill to spell out court-ordered access to keys (via ISN)
Date: Mon, 16 Aug 1999 22:03:49 -0600 Reply-To: mea culpa [EMAIL PROTECTED] From: mea culpa [EMAIL PROTECTED] Subject: Bill reopens encryption access debate To: [EMAIL PROTECTED] http://www.fcw.com:80/pubs/fcw/1999/0816/fcw-newsencrypt-08-16-99.html Bill reopens encryption access debate BY DOUG BROWN ([EMAIL PROTECTED]) AND L. SCOTT TILLETT ([EMAIL PROTECTED]) Renewing efforts to allow law enforcement agencies to access and read suspected criminals' encrypted electronic files, the Clinton administration has drafted a bill that would give those agencies access to the electronic "keys" held by third parties. The Cyberspace Electronic Security Act, the drafting of which is being led by the Office of Management and Budget and the Justice Department, "updates law enforcement and privacy rules for our emerging world of widespread cryptography," according to an analysis accompanying the bill obtained by Federal Computer Week. Encryption technology, according to the draft, is "an important tool for protecting the privacy of legitimate communications and stored data" but also has been used "to facilitate and hide unlawful activity by terrorists, drug traffickers, child pornographers and other criminals." The new bill seeks to uncover that activity by allowing law enforcement officials to obtain the keys needed to decrypt messages by applying for search warrants or court orders, much as they might do to uncover other evidence. The administration is concerned about the use of encryption technology because advances in recent years have made it extremely difficult for law enforcement officials to crack a code once they have intercepted a message. [snip..] ISN is sponsored by Security-Focus.COM
IP: Latest in computer security revealed
--- begin forwarded text From: [EMAIL PROTECTED] Date: Mon, 16 Aug 1999 13:34:55 -0500 To: [EMAIL PROTECTED] Subject: IP: Latest in computer security revealed Sender: [EMAIL PROTECTED] Reply-To: [EMAIL PROTECTED] Source: EurekAlert! http://www.eurekalert.org/releases/wpi-lic081699.html FOR IMMEDIATE RELEASE: 16 AUGUST 1999 Contact: Arlie Corday [EMAIL PROTECTED] 508-831-6085 Worcester Polytechnic Institute Latest in computer security revealed at WPI international workshop More than 180 computer security experts, half of whom traveled from outside the United States, converged on Worcester Polytechnic Institute for the 1999 Workshop on Cryptographic Hardware and Embedded Systems (CHES), Aug. 12-13. The popular workshop provided a forum for real-world system and design issues. Conference organizers Cetin Koc of Oregon State University and Christof Paar of WPI point out that many consumer products are gaining computer-like capabilities. E-commerce and other electronic communications demand that sensitive data, such as credit card numbers, must be protected from prying eyes. The tool for protecting information, called cryptography, will be required in these products, using embedded systems that offer relatively little computational power. The challenge of adding cryptography to hardware devices and embedded systems led to the development of the WPI workshop. In its inaugural year, international experts presented new results on efficient implementation of cryptographic algorithms and attacks, as well as other practical issues in system design such as random number generation. Among the highlights of the conference was a talk by Adi Shamir, a co-inventor of the RSA code used to protect e-commerce. Shamir called the security of the world's leading web browsers into question with a new fast factoring attack. The most eagerly awaited contribution to CHES involved not only a fast way to make a code, but also a fast way to break one. 
The RSA public-key cryptosystem, which is widely used in web browsers such as Netscape Communicator and Microsoft Internet Explorer, is based on the problem of factoring large numbers. It is an acronym based on its inventors (Rivest-Shamir-Adleman). Fortunately for consumers and businesses, up until now, factoring algorithms have been slow and memory-intensive processes. But at the workshop, Shamir, from Israel's Weizmann Institute of Science, shed light on an ingenious way to speed up part of a factoring computation known as sieving. A sieve procedure consists of repeatedly running through a long list of numbers and finding which small integers divide those in the list. Using optoelectronics, Shamir's new device, called TWINKLE, offers a 500-1000 times speedup over the fastest workstations on the market in this crucial stage of factoring. This development has grave implications for electronic commerce: Due to U.S. export laws, the strongest exportable public-key systems are restricted to 512 bits. If and when the device is actually built, these systems can be easily broken. The systems, Shamir pointed out, "protect 95 percent of today's e-commerce on the Internet," and thus render them "very vulnerable." Brian Snow of the U.S. National Security Agency emphasized the need for more research in assurance technology. "The scene I see is products and services sufficiently robust to counter many, but not all, of the 'hacker' attacks we hear so much about today, but not adequate against the more serious but real attacks mounted by economic adversaries and nation states," Snow noted. "We will be in a truly dangerous stance: We will think we are secure, and act accordingly, when in fact we are not secure." Experts continue to search for answers to computer security. Another development at CHES involved improved methods for generating random numbers. Nearly all real-world cryptosystems need random numbers. 
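[Moderator's note: to make the sieving step concrete, here is a toy sketch of the relation-collection sieve used in factoring -- not TWINKLE itself, and with made-up parameters. For each small prime p you walk its multiples in an interval and accumulate log p; positions whose accumulated logs reach log(n) have all their prime factors in the small set ("smooth" values), which is exactly the stage TWINKLE accelerates.]

```python
import math

def sieve_interval(start, length, primes):
    """For each n in [start, start+length), accumulate log p for every
    small prime p dividing n (counting repeated factors). Entries whose
    accumulated log reaches log(n) are smooth over the prime set."""
    acc = [0.0] * length
    for p in primes:
        first = (-start) % p          # first multiple of p at or after start
        for i in range(first, length, p):
            n = start + i
            while n % p == 0:         # count repeated factors too
                acc[i] += math.log(p)
                n //= p
    return acc

primes = [2, 3, 5, 7, 11, 13]
start, length = 1000, 50
acc = sieve_interval(start, length, primes)

# 1000 = 2^3 * 5^3 and 1024 = 2^10 are both smooth over this prime set
assert abs(acc[0] - math.log(1000)) < 1e-9
assert abs(acc[1024 - start] - math.log(1024)) < 1e-9
```

A real sieve avoids the trial division entirely and just adds precomputed log approximations at each step, which is why the inner loop is so amenable to the massively parallel optoelectronic implementation described.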
Unfortunately, this is an extremely difficult problem, since computers are designed to be completely predictable. At CHES, scientists from Italy's Ugo Bordoni Foundation offered a cost-effective idea based on sampling noisy semiconductor junctions. Normally in circuit design, engineers try to reduce noise. However, by building noisy circuits on purpose, one can use the noise as a source of random numbers. In addition, researchers from Bell Labs Innovations provided a variety of new, practical techniques including one based on chaos theory, which appears to be particularly cost-efficient. Of course, efficiency of performance is just as crucial as cost. Sandia National Labs researchers presented a design for a new computer chip that can encrypt up to 10 gigabits of data per second, satisfying all but the most demanding of applications. In addition, one can use three of the chips together to handle Triple-DES encryption with no loss of performance. The DES, or Data Encryption Standard, algorithm is the most widely used bulk encryption method,
Re: going around the crypto
Michael Helm wrote: The attacker could also present a certificate from a fake CA with an appropriate name -- say, "Netscape Security Services", or something that Right. In which case Netscape brings up a different dialog which says that the server certificate is signed by an unrecognized CA. Again, you can proceed, but it's not like it's automatic. It's clearly not automatic, but I suspect it would work In many organizations which have attempted to do PKI, it is often the case that a home-made certificate authority with a self-signed root CA certificate is created that issues in-house certificates for servers, clients, or whatever. But many of those orgs. have found distributing the root CA certificate very problematical, with the result that the acceptance dialogs alluded to above become routine to the user community. The first time this happens you probably look at what the many pop-up windows are saying, puzzle over them, even dial up the local help desk. The tenth time you just hold down the mouse button and whip through it. It's simply amazing how much disinformation there is floating around about this stuff. If the organization attempting to do PKI isn't incompetent, they burn their cert into the install package for the browser that they give to users, eliminating this problem altogether. Sorry if I'm starting to sound a bit strident. I know that Netscape's PKI isn't anywhere within sight of perfect, but it's better than a lot of people give it credit for. -- What is appropriate for the master is not appropriate| Tom Weinstein for the novice. You must understand Tao before | [EMAIL PROTECTED] transcending structure. -- The Tao of Programming |
[ANNOUNCE] PureTLS: Alpha 2 Release
http://www.rtfm.com/puretls/ Claymore Systems, Inc. is pleased to announce the availability of PureTLS 0.9a2. PureTLS is a free pure Java implementation of TLS and SSLv3. This is the second Alpha release of PureTLS. We consider the code quality to be late Alpha. That is to say, it's undergone some testing, including interoperability testing with OpenSSL, and we think it's a useful product. Some bugs have been fixed since Alpha 1, but there are certainly still bugs. We expect to produce a beta-quality product by mid-September (we'd been hoping to do it earlier, but things got busy), but to do that we need people to try it and send us bug reports. This version makes a number of changes from Alpha 1, including fixing a serious security problem. If you're using Alpha 1, please upgrade. PureTLS is released under a BSD-style license. Quite simply, we feel that good security should be a commodity, and this is our contribution to that end. CHANGES FROM ALPHA 1 PureTLS now works with Cryptix 3.1 and JDK 1.2. A horribly embarrassing packaging oversight has been fixed. Alpha 1 included test-only code that always verified every signature on a certificate as true. Obviously, this is a major security hole and it's been fixed in Alpha 2. X509Cert now includes support for extensions. Several failure modes are now cleaner. In particular, if client auth is requested but not available, an exception is thrown instead of a null pointer error. For details and to download, see: http://www.rtfm.com/puretls/
Re: linux-ipsec: Re: Summary re: /dev/random
At 11:39 AM -0500 8/13/99, Jim Thompson wrote: This thread started over concerns about diskless nodes that want to run IPsec. Worst case, these boxes would not have any slots or other expansion capability. The only source of entropy would be network transactions, which makes me nervous... An interesting alternative, I think, is an add-on RNG which could go on a serial or parallel port. The bandwidth achievable without loading down the machine is limited, but we don't need tremendous speeds, and many PCs used as routers, firewalls, etc. have such ports sitting idle. Even semi-dedicated diskless boxes would *often* have one of those. Of course, such a box already exists. The complete details of its design are available, and purchasing the box gives you the right to reproduce the design (once) such that you can, indeed, verify that you're getting random bits out of the box. I spent some time searching the Web for hardware randomness sources and I have summarized what I found at http://www.world.std.com/~reinhold/truenoise.html. I located several serial port RNG devices and some good sources of white noise that can be plugged into a sound port. I don't think I found the box Mr. Thompson refers to, but I would be glad to add it to the list. I also included serial and USB video cameras, which may be a good source of randomness due to digitization noise, if nothing else. I still feel strongly that diskless machines that are likely to use IPsec or other security software (e.g. SSL) should have a built-in source of randomness, a la the Pentium III. If the other microprocessor manufacturers won't comply, a TRNG should be included on one of the support chips. Randomness generation is so critical to public key cryptography that we should insist it be engineered in, not pasted on. Arnold Reinhold
ElGamal, Barnes, Callas, Parekh, etc., take over Packet Storm?
At 2:00 PM -0400 on 8/17/99, [EMAIL PROTECTED] wrote: Title: Security Firm to Revive Computer-Defense Site Resource Type: News Article Date: August 17, 1999 Source: NYT (Free Registration Required) Author: PETER WAYNER Keywords: KROLL-O'GARA,PACKET STORM,WEBSITE TAKEOVER,HACKERS Abstract/Summary: Kroll-O'Gara, the international security consulting firm, said Monday it would take over an Internet site that not only posted information about defending computer systems against attacks but also told how to break into them. In the shadowy world of hackers and crackers, it is often hard to tell the good guys from the bad. Computer-security experts frequently test systems by breaking into them, and the site, Packet Storm, posted descriptions of those break-ins. Kroll-O'Gara's computer security unit, Securify, which declined to discuss financial terms of its acquisition, said it planned to maintain the site's tradition of high-quality information as a way to market its services. But Kroll-O'Gara executives said that it would rid the site of its more contentious publications. Original URL: http://www.nytimes.com/library/tech/99/08/biztech/articles/17secure.ht ml Added: Tue Aug 17 9:15:18 -040 1999 Contributed by: David Dillard - Robert A. Hettinga mailto: [EMAIL PROTECTED] The Internet Bearer Underwriting Corporation http://www.ibuc.com/ 44 Farquhar Street, Boston, MA 02131 USA "... however it may deserve respect for its usefulness and antiquity, [predicting the end of the world] has not been found agreeable to experience." -- Edward Gibbon, 'Decline and Fall of the Roman Empire'
Fingerprints and smart cards (was: going around the crypto)
Peter Gutmann said: Smart cards with thumbprint readers are one step in this direction, although they're currently prohibitively expensive. American Biometrics (www.abio.com) has their Biomouse II, which I once heard was supposed to retail around $250 or so. The old finger-only Biomouse should cost about half that much. No doubt this is "prohibitive" for many users. Plus you have to ask what level of threat you're willing to accept. Some guys did a terrific article on thumbprint readers and found the Biomouse reader couldn't be tricked by a photocopied fingerprint (like some competitors) but could be tricked some other way (I forget the details). The more you pay for the reader, the harder it is to forge the fingerprint. This suggests interesting implications if people start using thumbprints with, say, ATMs. How good does the ATM's thumbprint reader *really* have to be? What about those cheap machines sprouting up all over the place? In some ways thumbprint-based ATM technology might pose greater risks than PINs. At least people are warned to keep their PINs secret. There's no cultural tradition to keep your fingerprints off of things (unless you're performing a criminal act). Rick. [EMAIL PROTECTED] "Internet Cryptography" at http://www.visi.com/crypto/
Re: linux-ipsec: Re: Summary re: /dev/random
At 09:11 PM 8/17/99 -0700, Nick Szabo wrote: how it was prepared. There simply *cannot* be an all-purpose statistical test. Quite so. I'd like to see what Maurer's "universal" test says about the entropy of completely predictable sequences like the following: (1) pi (2) Champernowne's number (0.12345678901011121314151617181920...) Look, no test can distinguish between an arbitrarily large-state PRNG and a 'real' RNG. Pi's digits will appear fully entropic under MUST, Diehard, etc., even though its Kolmogorov/Chaitin complexity is low (i.e., the program that computes pi is short). Pi is not random, though its digits (and all N-tuples of digits, etc.) are evenly distributed. This *is* a profound point. Dunno about Champernowne's number; I suspect it's the same. Maurer, BTW, points out that his test is only useful if you know you have a real random bitstream generator. (Any faults in this exposition are my own. I have never met or corresponded with Maurer, in fact.) But if you cut through the philosophical boolsheet, and elegant computation-theory definitions of complexity, you are left with a problem: how to measure the entropy of a sample of a source, e.g., /dev/random's input. And it comes down to F log F no matter what algorithm you use to approximate it. The only philosophy you need is this: the Adversary doesn't know the internal (e.g., atomic) state of your hardware. Therefore, the measured state is unpredictable; but it probably isn't uniformly distributed. So you distill it, until you've got fully independent bits. And you hash and stir it when you read it, for 'security in depth', i.e., extra layers of protection. Again, the question is, what is the alternative? I'm willing to discuss, e.g., a function of raw vs. gzip-compressed file size as a measure of entropy. I think my major point is, measure it as best you can. I stumbled upon MUST; it's a single measure, so it's easier to handle than a multidimensional spectrum like Diehard, and more informative than FIPS-140 binary tests. 
I am open to suggestions as to how to quantitatively evaluate RNGs, for /dev/r or otherwise. Cheers, David Honig
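[Moderator's note: for readers unfamiliar with the "F log F" shorthand above, it is the plug-in Shannon estimate over observed symbol frequencies. A minimal sketch (my own illustration, not Maurer's test) -- which also illustrates Honig's point, since a byte-frequency estimator would happily score pi's digits as fully entropic:]

```python
import math
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Plug-in ("F log F") estimate of Shannon entropy, in bits per byte:
    -sum(f * log2(f)) over the observed byte frequencies f. Note this sees
    only single-byte statistics; it cannot detect that a sequence such as
    the digits of pi is completely predictable."""
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

assert entropy_bits_per_byte(b"\x00" * 1024) == 0.0        # constant: no entropy
assert entropy_bits_per_byte(bytes(range(256)) * 4) == 8.0  # uniform: maximal
```

Estimates from short samples are biased low (rare symbols go unobserved), which is one reason Maurer's test works on long runs of blocks rather than raw frequencies.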
Re: linux-ipsec: Re: semantics of /dev/{u}random
On Wed, 18 Aug 1999, Arnold G. Reinhold wrote: Finally, I think thought should be given to the question of how to use copious hardware random number generators on systems where they are available. These could include on-chip RNGs, like the Pentium III's, sound cards with noise input, HRNG boxes connected to a serial or USB port, or even video cameras. I have located a number of relatively inexpensive hardware RNG solutions (http://world.std.com/~reinhold/truenoise.html). I suspect they would be used more if they were fully supported in the OS. I have written a piece of software that may be of assistance here. audio-entropyd periodically reads audio from a stereo soundcard and feeds the difference between the left and right channels into /dev/random (via SHA1). The time between reads, size of the input buffer, length of the hash and number of bits credited to the KRNG are all user configurable. There is an alpha version at: http://toad.ilogic.com.au/~dmiller/files/audio-entropyd-0.0.0.tar.gz Regards, Damien Miller -- | "Bombay is 250ms from New York in the new world order" - Alan Cox | Damien Miller - http://www.ilogic.com.au/~dmiller | Email: [EMAIL PROTECTED] (home) -or- [EMAIL PROTECTED] (work)
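[Moderator's note: a sketch of the idea for the curious -- this is my own illustration, not Damien's code, and the names are made up. Interleaved 16-bit stereo frames are differenced left-minus-right, which cancels the correlated signal and keeps mostly analogue noise; the result is then whitened through SHA-1 before being credited to the kernel RNG. The actual write to /dev/random (via the RNDADDENTROPY ioctl) is omitted here:]

```python
import hashlib
import struct

def distill_audio_entropy(frames: bytes) -> bytes:
    """Given interleaved little-endian 16-bit stereo frames, keep only the
    left-minus-right difference of each frame, then whiten through SHA-1.
    The 20-byte digest is what a daemon would feed to /dev/random."""
    samples = struct.unpack("<%dh" % (len(frames) // 2), frames)
    diffs = [(l - r) & 0xFFFF
             for l, r in zip(samples[0::2], samples[1::2])]
    raw = struct.pack("<%dH" % len(diffs), *diffs)
    return hashlib.sha1(raw).digest()

# Identical channels carry no differential noise: diffs are all zero,
# so the digest is that of a zero buffer -- a reminder that the entropy
# *credited* must be estimated from the raw diffs, not the hash output.
silent = struct.pack("<4h", 100, 100, -3, -3)
assert len(distill_audio_entropy(silent)) == 20
```

The hashing hides any bias in the raw samples, but (as the comment notes) it cannot create entropy, so the bits-credited parameter is the security-critical knob.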
Nonrepudiation and what to do about it (Jueneman - FW)
--- begin forwarded text Date: Fri, 20 Aug 1999 02:27:15 -0400 Reply-To: Law Policy of Computer Communications [EMAIL PROTECTED] Sender: Law Policy of Computer Communications [EMAIL PROTECTED] From: Vin McLellan [EMAIL PROTECTED] Subject: Nonrepudiation and what to do about it (Jueneman - FW) To: [EMAIL PROTECTED] Status: U This is an excerpt -- a "history lesson" -- from an 8/19/99 proposal by cryptographer, network security architect, and PKI guru Bob Jueneman of Novell on the IETF's PKIX and S/MIME mailing lists. Please copy Mr. Jueneman on responses at [EMAIL PROTECTED]. The full post can be found at: http://www.imc.org/ietf-smime/mail-archive/msg02933.html. _Vin ooo Begin Forwarded Text ooo When the ABA Digital Signature Guidelines were being formulated within the Information Security Committee, with lots of very bright, well-informed attorneys and technologists contributing, there was a fundamental, underlying assumption that PKI technology could be used to reduce some of the uncertainty that was perceived to be a barrier to the efficient use of electronic commerce. Instead of having to use proprietary, value added networks and negotiate N*(N-1) contracts between all of the trading partners, it was expected that the use of a common PKI technology and appropriate legal frameworks would eliminate most of that overhead. It was recognized that an accretion of case law had resulted in a situation where printed forms, letterhead, faxes, telegrams and later Telexes, ordinary e-mail, and who knows what other forms of communication could, under the proper circumstances, be interpreted as being a legally binding signature. The trouble was that the technology had moved much faster than the case law, and the uncertainty was increasing at a compounded rate. For example, back when printed forms were created on letterhead presses, and were filled in using either handwriting or a typewriter, it was pretty obvious what the difference was. 
And because going to a printer and having a lot of standard forms printed involved some expense, time and effort, the conventional use of such a form for purposes of trade might reasonably be considered tantamount to a signature of the company. Unfortunately, a legal distinction that was rational at the time is no longer rational in the age of laser printers, when preprinted forms have almost disappeared. But the case law hasn't changed, so the question of what constitutes a signature becomes more of a risk, both for the relying party who thought it was valid, and for the originator, who really didn't intend for it to be anything more than a draft proposal. In addition to these technical/legal issues, there was also the issue of liability in the event of something going wrong, such as a key being compromised. One approach would be the very loose standard of care embodied in the US credit card law (Regulation E), where even the most egregious carelessness on the part of the subscriber could only result in a $50 loss. The problem with that approach is that it effectively required the establishment of a mechanism that would be very similar to the credit card industry to centralize the reporting of every time a certificate was used to verify a transaction, so that loss limits could be enforced. At the other end of the spectrum was "strict liability," which is the standard used between major financial institutions. Because of the volume of the business, and the difficulty of backing out transactions in error that might otherwise leave an innocent third party holding the bag for a transaction gone wrong, inter-bank transactions are generally governed by strict liability -- no matter what the extenuating circumstances might be, the bank was still liable for a transaction that went out in its name. In between these two poles were standards of simple negligence or gross negligence as a possible defense. 
The final decision that was incorporated in the Guidelines, Section 5.6 Presumption in dispute resolution, was to create a "rebuttable presumption" that a digital signature verified by reference to the public key listed in a valid certificate is the digital signature of the subscriber listed in that certificate. The effect of this presumption was to allocate the burden of proof to the person who challenges the validity of the signature. In the case of a claimed forgery, for example, the burden of proof (independent of the risk of loss) falls on the subscriber, who would generally be in a much better position to know how the keys were protected, etc., than the relying party. The State of Utah, in their pioneering Digital Signature Act, didn't go quite so far as that. Instead, they applied the rebuttable presumption argument only to a special class of certificates created by so-called "Licensed Certification Authorities" that were subject to a higher level of assurance, involving inspection and audit and
Another web secure mail service
Visit http://www.1on1mail.com/ It has a downloadable Windows client that I haven't tried yet, and a lot of blather about how secure 2048 bit RSA keys are. It's free, supported by ads. I wonder if it puts them in the encrypted messages. Regards, John Levine, [EMAIL PROTECTED], Primary Perpetrator of "The Internet for Dummies", Information Superhighwayman wanna-be, http://iecc.com/johnl, Sewer Commissioner Finger for PGP key, f'print = 3A 5B D0 3F D9 A0 6A A4 2D AC 1E 9E A6 36 A3 47
Controlled CPU TEMPEST emanations
Hello, After having implemented and successfully tested Ross Anderson's idea to use the video output to synthesize a mediumwave AM signal, I wondered if a similar effect could be obtained by using only the CPU, since it was easy to correlate CPU activity with radio noise. I've just written a quick C program that tries to force activity on the memory bus in a repetitive pattern, with adjustable frequency. After having fiddled with the timings for about one hour, I managed to broadcast a test tune using my Pentium 120 running Linux, giving extremely clear reception on the FM band at about 87.5 MHz (I have in no way calculated or predicted this frequency). Be warned that my understanding of radio waves is bad and incomplete, and that I have no particular radio equipment, save a walkman and a radio cassette player. I found that it is possible to hear the test tune over the whole "consumer" medium- and short-wave spectrum (530-1600 kHz, 2.3-22 MHz) using the walkman, which has a digital synthesized PLL radio (which is generally very sensitive to electrical noise), provided the radio is held at a distance of less than two meters from the CPU, which suggests that there are spectral components of CPU activity at many frequencies dividing the clock frequency and at their harmonics (which gives a very rich spectrum). The reception in the FM band is much cleaner, and it is possible to hear the test tune in the next room (three to four meters). I've found that accesses to the main memory create much more noise than other CPU activity, which is readily understandable. As it is not possible to disable CPU caches in user mode, the program allocates a buffer of 1 megabyte, larger than the CPU caches, and fills it with an arbitrary pattern for a number of cycles, then pauses for a number of cycles. These numbers are supplied on the command line. There is an evident correlation between the pitch of the tones generated and the length of the cycles. 
However, the amplitude of the received signal, although constant for one run, can vary significantly between different runs. My guess is that this has to do with the physical addresses of the memory pages allocated by the process. I guess that with higher frequency processors and careful assembly coding, it should be possible to do good broadcasting up to and including the FM band. Unlike broadcasting done using an attached CRT display, this broadcasting would be totally invisible and undetectable to the user unless he is suspecting such an activity, and either starts to investigate it or is a radio amateur having lots of equipment (like a spectrum analyzer) which could give him hints about weird CPU activity patterns (but it should be possible to use spread-spectrum transmissions to hide them completely, although decoding SS is hard). However, broadcasting done using the CPU and/or system buses is much less powerful than broadcasting done using the CRT display. But, since it is invisible, if one can get reasonably close to the target computer, it might be possible to discreetly record the signals using a concealed receiver, for later processing. Thus, I think that this threat is at least as serious as hidden data transmissions via the CRT. If you are too lazy to write your own and want my quickly hacked, slow, dirty source code for CRT or CPU broadcasting (X11/Linux, DGA), e-mail me and I'll make them available on the net. Berke. -- Berke Durak [EMAIL PROTECTED] PGP 262i F203A409 44780515D0DC5FF1:BBE6C2EE0D1F56A1 GnuPG 1024D/15FAB6E4 2048g/64021883 E38EE35DCED067CEB949:FC77DAFA083A15FAB6E4 Kripto-TR http://gsu.linux.org.tr/kripto-tr/
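[Moderator's note: for readers who don't want to email for the source, the loop structure described above can be sketched as follows. This is my own reconstruction, in Python purely for readability; the real program must be written in C with tight timing control to actually radiate anything useful, and the buffer size and duty-cycle parameters here are arbitrary.]

```python
import time

def thrash(buf, pattern, busy_s, idle_s, repeats):
    """Toy sketch of the described transmitter loop: repeatedly fill a
    buffer larger than the CPU caches for busy_s seconds (forcing main
    memory traffic, hence bus noise), then stay idle for idle_s seconds.
    The busy/idle ratio sets the pitch of the tone heard on the radio."""
    for _ in range(repeats):
        deadline = time.perf_counter() + busy_s
        while time.perf_counter() < deadline:
            buf[:] = pattern          # touch every cache line in the buffer
        time.sleep(idle_s)            # silence between bursts

buf = bytearray(1 << 20)              # 1 MB: larger than the caches of the era
pattern = bytes([0xAA, 0x55]) * (len(buf) // 2)
thrash(buf, pattern, busy_s=0.001, idle_s=0.001, repeats=3)
```

In the author's C version the busy and idle lengths come from the command line, and the fill loop is what defeats the caches that user mode cannot disable.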
decorrelation
What does decorrelation do? -- Mike Stay Cryptographer / Programmer AccessData Corp. mailto:[EMAIL PROTECTED]
Euro-Parl Surveillance Reports
We offer the European Parliament-sponsored reports which have been prepared as follow-up to the 1998 "Appraisal of the Technologies of Political Control." The four-part series is titled "Development of Surveillance Technology and Risk of Abuse of Economic Information (an appraisal of technologies of political control)," April and May 1999. Part 1: "The perception of economic risks arising from the potential vulnerability of electronic commercial media to interception - Survey of opinions of experts. Interim Study," by Nikos Bogonikolos: http://cryptome.org/dst-1.htm (158K, English) Part 2: "The legality of the interception of electronic communications: A concise survey of the principal legal issues and instruments under international, European and national law," by Prof. Chris Elliott: http://cryptome.org/dst-2.htm (42K, English) Part 3: "Encryption and cryptosystems in electronic surveillance: a survey of the technology assessment issues," by Dr. Franck Leprévost: http://cryptome.org/dst-3.htm (81K, FR; EN trans invited) To round out the four parts, we point to the previously published Part 4: "The state of the art in Communications Intelligence (COMINT) of automated processing for intelligence purposes of intercepted broadband multi-language leased or common carrier systems, and its applicability to COMINT targeting and selection, including speech recognition," by Duncan Campbell: http://www.iptvreports.mcmail.com/stoa_cover.htm
$100 secure phones from Starium
Starium is about to start selling $100 phone encryption units, according to this article: http://www.wired.com/news/news/technology/story/21236.html This could potentially change the encryption debate landscape quite dramatically, as even casual users will be able to justify the price. -- Perry Metzger [EMAIL PROTECTED] -- "Ask not what your country can force other people to do for you..."
ADMIN: finally caught up
After a week of machine crashes and internet access problems, I've finally caught up on the moderation backlog. I'm very sorry about the temporary disruption. -- Perry Metzger [EMAIL PROTECTED] -- "Ask not what your country can force other people to do for you..."
bo2k cryptography
I've received some questions by email which are beyond my ability to answer. The questions are about the cryptographic strength of the encryption plugin for bo2k (3DES IIRC; see www.bo2k.com and www.cdc.com, down once in a while it seems). If anyone doesn't know what bo2k is, it's a remote control utility which has caused some discussion regarding ethics that is off topic here... Basically, I wonder if there is any evaluation of how strong the encryption is. I'm aware that 168-bit keys are considered "NSA-secure" and that 3DES is considered secure, but what about -- Is the 3DES algorithm used correctly? -- Key generation: good PRNG or bad PRNG, good hash or bad hash? And any other subject which might come to mind. //blue
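The key-generation question above is usually the decisive one: a cipher's nominal key length means little if the keys come from a weak source. The following is a minimal, hypothetical sketch (not bo2k's actual code, whose internals the poster is asking about) of why a "bad PRNG" collapses a 168-bit 3DES key down to the size of its seed space:

```python
import os
import random

def weak_key(seed: int) -> bytes:
    # "Bad PRNG": a deterministic generator seeded with a small, guessable
    # value. The effective keyspace is the seed space, not 2**168.
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(24))  # 24 bytes of 3DES keying material

def strong_key() -> bytes:
    # OS CSPRNG: the full nominal keyspace (less DES parity bits) is in play.
    return os.urandom(24)

# An attacker who knows the seeding scheme ignores the cipher entirely and
# brute-forces the seed instead. Here the seed is assumed to be a 16-bit
# counter/timestamp, so 2**16 tries suffice:
victim = weak_key(seed=31337)
recovered_seed = next(s for s in range(2**16) if weak_key(s) == victim)
assert recovered_seed == 31337
```

This is the sense in which "3DES is considered secure" and "the product's 3DES is secure" are different claims: the second also requires auditing where the key bytes come from.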
Re: [ANNOUNCE] PureTLS: Alpha 2 Release
David Honig [EMAIL PROTECTED] writes: At 09:26 PM 8/16/99 -0700, Eric Rescorla wrote: A horribly embarrassing packaging oversight has been fixed. Alpha 1 included test-only code that always verified every signature on a certificate as true. Well, at least some of your testing went remarkably smoothly :-) Quite so. It really shows the importance of doing negative controls as well as positive controls. -Ekr -- [Eric Rescorla [EMAIL PROTECTED]] PureTLS - free SSLv3/TLS software for Java http://www.rtfm.com/puretls/
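The "negative controls" point generalizes beyond PureTLS: a verifier stub that accepts everything passes every positive test, and only a test that a *bad* signature is rejected can catch it. A small illustrative sketch (names hypothetical, not PureTLS's Java API; HMAC stands in for certificate signatures):

```python
import hashlib
import hmac

KEY = b"test-key"

def sign(msg: bytes) -> bytes:
    return hmac.new(KEY, msg, hashlib.sha256).digest()

def verify(msg: bytes, sig: bytes) -> bool:
    # Real verifier: constant-time comparison against the expected tag.
    return hmac.compare_digest(sign(msg), sig)

def broken_verify(msg: bytes, sig: bytes) -> bool:
    return True  # the kind of test-only stub that slipped into Alpha 1

msg = b"hello"
good, bad = sign(msg), b"\x00" * 32

# Positive control: both verifiers accept a valid signature, so positive
# tests alone cannot tell them apart.
assert verify(msg, good) and broken_verify(msg, good)

# Negative control: only the real verifier rejects garbage.
assert not verify(msg, bad)
assert broken_verify(msg, bad)  # the stub accepts anything
```

A test suite for any crypto library should include such reject-the-forgery cases as a matter of course.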
Computerworld on FreeS/WAN
At 2:00 PM -0400 on 8/19/99, [EMAIL PROTECTED] wrote: Title: Hackers, Consultants Embrace Secure Tool Resource Type: News Article Date: 08/16/99 Source: Computer World Author: Ann Harrison Keywords: SECURITY,ENCRYPTION ,HACKER/SECURITY ,CONSULTANTS Abstract/Summary: When IT security consultants attend hacker conferences, they have high expectations for finding open-source security tools tested in hostile environments. One that meets the standard for hacker information technology consultants is the FreeS/WAN project's free, open-source Linux-based server software that uses strong encryption to create secure data tunnels between any two points on the Internet -- a needed alternative to expensive, proprietary virtual private networks (VPN). FreeS/WAN uses the IPSec protocol, an interoperable global standard for securing IP connections. It automatically encrypts data packets at 6M bit/sec. and creates secure gateways in a VPN without modifying the operating system or application software. A PC running FreeS/WAN ( www.xs4all.nl/~freeswan ) can set up a secure tunnel in less than a second. The software generated strong interest among the 1,800 hackers who attended the Chaos Communication Camp, the Chaos Computer Club's first international hacker conference, held here last week. Among the attendees was Kurt Seifried, an independent security consultant from Edmonton, Alberta, who uses FreeS/WAN to create secure networks for customers. Original URL: http://www.computerworld.com/home/print.nsf/all/990816BBE2 Added: Thu Aug 19 10:28:26 -0400 1999 Contributed by: Keeffee - Robert A. Hettinga mailto: [EMAIL PROTECTED] The Internet Bearer Underwriting Corporation http://www.ibuc.com/ 44 Farquhar Street, Boston, MA 02131 USA "... however it may deserve respect for its usefulness and antiquity, [predicting the end of the world] has not been found agreeable to experience." -- Edward Gibbon, 'Decline and Fall of the Roman Empire'
s/w radios secure modules
In the Aug 16 '99 EETimes, there are several articles about software radios. These have analog front ends and are digital after down-conversion. This lets you deal with complex back-compatibility/protocol/DSP-improvement/legal issues flexibly. The FCC is flipping out, considering how to regulate these, because the software *is* the radio. And software is anarchy :-) Anyway, on p. 70 there is a three-paragraph mention of: 1. secure computer systems to handle software changes (ref to GSM) 2. monitors, which prevent unwanted sends/receives (sort of a primitive sanity check, probably more an EEPROM'd lawyer) IMO, given that the radios will be field-upgradable (after all, besides development/marketing economic issues, this is the benefit of reprogrammable systems), there will be plenty of fun for the phreak of 2005. David Honig
PECSENC Says Free Up Crypto?
John, Have you heard about this PECSENC recommendation cited by Dorothy Denning? I've written the PECSENC administrator about getting the recommendation. That's Jason Gomberg [EMAIL PROTECTED]. Could you try from your end? Thanks, John -- Date: Fri, 20 Aug 1999 13:49:07 -0400 From: [EMAIL PROTECTED] (Dorothy Denning) Message-Id: [EMAIL PROTECTED] To: [EMAIL PROTECTED] Subject: Re: Proposed US Export changes? The President's Export Council Subcommittee on Encryption, of which I am a member, recommended something to that effect, but I do not know if the Administration will adopt that recommendation. The next meeting is September 29 and perhaps we will learn something then. Dorothy From [EMAIL PROTECTED] Fri Aug 20 13:53:36 1999 From: Jeremy Hilton [EMAIL PROTECTED] To: "'UK Crypto'" [EMAIL PROTECTED] Subject: Proposed US Export changes? Date: Fri, 20 Aug 1999 18:17:37 +0100 I have heard in a couple of areas that the US may be considering easing export controls whereby crypto can be exported up to the same strength that is commercially available in other parts of the world. Does anyone know if there is any truth in this? Jeremy