Re: Revision of US Crypto Export Controls
On Thu, Dec 11, 2003 at 05:08:38AM -0800, John Young ([EMAIL PROTECTED]) wrote: On December 10, 2003, the Bureau of Industry and Security issued a final rule to revise the Commerce Control List, which regulates export of US technology. Below are excerpts involving encryption. The full rule: http://cryptome.org/bis121003.txt [Excerpts] Any chance of highlighting the revisions for those of us who'd not memorized the earlier regs? Peace. -- Karsten M. Self [EMAIL PROTECTED] http://kmself.home.netcom.com/ What Part of Gestalt don't you understand? Geek for hire: http://kmself.home.netcom.com/resume.html
Re: Difference between TCPA-Hardware and other forms of trust
On Wed, 17 Dec 2003, Jerrold Leichter wrote: Given this setup, a music company will sell you a program that you must install with a given set of access rights. The program itself will check (a) that it wasn't modified; (b) that a trusted report indicates that it has been given exactly the rights specified. Among the things it will check in the report is that no one has the right to change the rights! And, of course, the program won't grant generic rights to any music file - it will specifically control what you can do with the files. Copying will, of course, not be one of those things. I think that if the music company wants that much control (which is, btw, in clear violation of the First Sale Doctrine), then the only legal way for them to achieve it is to provide a player specifically for the music which they own, in exactly the same way that banks retain ownership of the credit cards and smartcards we use. As long as the player is not their property, they can't do this. The major reason I want a trusted kernel is that I don't want to trust binaries provided by closed-source software houses. I want my trusted kernel to tell me exactly what privileges they're asking for and I want to tell it exactly what privileges it's allowed to provide them. I want it to be able to tell me exactly when every executable file appeared, and as a result of running which other executable file (all the way back to whichever command *I* gave that resulted in its being there). I want it to tell me exactly how the daemon listening on any tcp port got installed and what privileges it has. I want my trusted kernel to keep tamper-proof logs; in fact I'd go so far as to want to use a write-once medium for logfiles just to make absolutely sure. 
A trusted kernel should absolutely know when any program is reading screen memory it didn't write, or keyboard keystrokes that it then passes as input to another program, and it should be possible for me to set up instant notification for it to alert me when any program does so. A trusted kernel should monitor outgoing network packets and sound an alarm when any of them contains personal information like PINs, passwords, keys, Social Security numbers, driver's license numbers, credit card numbers, addresses, etc. It should even be possible to have a terminate-with-prejudice policy that drops any such packets before sending and terminates and uninstalls any unauthorized application that attempts to send such packets. I really don't care if anyone *else* trusts my system; as far as I'm concerned, their secrets should not be on my system in the first place, any more than my secrets should be on theirs. The fact is I'm building a system out of pieces and parts from hundreds of sources and I don't know all the sources; with an appropriate trusted kernel I wouldn't have to extend nearly as much black-box trust to all the different places software comes from. Yes, you can construct a system that *you* can trust, but no one else has any reason to trust. However, the capability to do that can be easily leveraged to produce a system that *others* can trust as well. There are so many potential applications for the latter type of system that, as soon as systems of the former type are fielded, the pressure to convert them to the latter type will be overwhelming. I do not think so. People want to retain ownership of their computer systems and personal information, and a system that is made for *others* to trust would be used to take that ownership and information. Ultimately, TCPA or no, you will be faced with a stark choice: Join the broad trust community, or live in the woods. No. Lots of bands release music and encourage sharing, as promo for their main revenue source (concert tours). 
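As a concrete illustration of that terminate-with-prejudice idea, here is a minimal sketch of an outbound-payload scanner. The patterns and the policy function are illustrative assumptions: real PII detection (and real packet interception) is far harder than a pair of regexes.

```python
import re

# Toy patterns for a few kinds of personal data; real detection (and real
# packet capture) is far more involved than this sketch.
PII_PATTERNS = {
    "ssn": re.compile(rb"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(rb"\b(?:\d[ -]?){13,16}\b"),
}

def scan_payload(payload: bytes) -> list:
    """Return the names of PII patterns found in an outgoing payload."""
    return [name for name, pat in PII_PATTERNS.items() if pat.search(payload)]

def should_drop(payload: bytes) -> bool:
    """Terminate-with-prejudice policy: drop any packet carrying PII."""
    return bool(scan_payload(payload))
```

A trusted kernel would hook this kind of check into the network stack itself, below any application's reach.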
I see those bands getting a leg up as their released music becomes popular while music only available with onerous conditions languishes. Lots of other artists do graphic or animation work just for the chance to be seen, and some of them are quite good. You may consider it living in the woods to listen to stuff that isn't the top 20; but I think lots of people will find that the woods is a friendlier and more trustworthy place than a world full of weasels who want to control their systems. Bear - The Cryptography Mailing List Unsubscribe by sending unsubscribe cryptography to [EMAIL PROTECTED]
Re: Difference between TCPA-Hardware and other forms of trust
Jerrold Leichter writes: Given this setup, a music company will sell you a program that you must install with a given set of access rights. The program itself will check (a) that it wasn't modified; (b) that a trusted report indicates that it has been given exactly the rights specified. Among the things it will check in the report is that no one has the right to change the rights! And, of course, the program won't grant generic rights to any music file - it will specifically control what you can do with the files. Copying will, of course, not be one of those things. Now, what you'll say is that you want a way to override the trusted system, and grant yourself whatever rights you like. Well, then you no longer have a system *anyone else* can trust, because *you* have become the real security kernel. And in a trusted system, you could certainly create a subject that would automatically be granted all access rights to everything. Of course, since the system is trusted, it would include information about the override account in any reports. The music company would refuse to do business with you. More to the point, many other businesses would refuse to do business with you. There's the rub. The prevalence of systems that can be trusted by third parties who do not trust their owners affects what applications are possible, and it affects the balance of power between computer owners and others. If very few such systems are deployed, it would be absurd to say that the music company would refuse to do business with you -- because the music company has to do business to keep its doors open. Businesses that refuse to serve customers will not last very long. In the entertainment case, the antagonism is already clearly expressed. (Open war is upon you, as I recall Theoden being told in that movie last night.) The more so-called legacy systems do not do DRM or do not do it very well or easily, the more difficult it is for publishers to apply DRM systems and succeed in the market. 
The more systems do DRM natively, or easily or cheaply, the easier it is to be successful publishing things restricted with DRM. In either case, the publishers still have to publish; before the creation of DVD-A and SACD, publishers of audio CDs couldn't very well say CD-DA, we hates it! Nasty, tricksy format! (sorry, um) We are going to stop publishing in the CD-DA format because it isn't encrypted. Even today, they would be hard-pressed to do so, because DVD-A and SACD players are extraordinarily rare compared to audio CD players. The question of whether the supposed added profit that comes with being able to enforce DRM terms provides an important creative incentive comparable to that provided by copyright law goes back to the era immediately before the adoption of the DMCA, when the Bruce Lehman White Paper argued that it did (that copyright law's incentive was becoming inadequate and an additional control-of-the-public and control-of-technology incentive would be required). Indeed, the group that pushed for the DMCA was called the Creative Incentives Coalition, and it said that thus restricting customers was really all a matter of preserving and expanding creative incentives. http://www.uspto.gov/web/offices/com/doc/ipnii/ I think Bruce Lehman was wrong then and is wrong now. On the other hand, the _structure_ of the argument that the prospect of restricting customers provides an incentive to do something that one would not otherwise do is not incoherent on its face. The interesting question about remote attestation is whether there are (as some people have suggested) interesting and important new applications that customers would really value that are infeasible today. For example, it has been argued by Unlimited Freedom that there would be incentives to invest in useful things we don't have now (and things we would benefit from) only if attestation could be used to control what software we used to interact with those things. 
In the entertainment case, though, there is already a large entertainment industry that has to sell into a base of actually deployed platforms (unless it wants to bundle players with entertainment works) -- and its ability to refuse to do business with you is constrained by what it can learn about you as a basis for making that decision. It's also constrained if its rationale for refusing to sell to you would also imply that it needs to refuse to sell to millions of other people. Only if enormous numbers of people in the future can preserve the benefit of creating uncertainty about their software environment's identity will entertainment publishers and others lack the ability to discriminate against people who use disfavored software. Yes, you can construct a system that *you* can trust, but no one else has any reason to trust. However, the capability to do that can be easily leveraged to produce a system that *others* can trust as well. There are so many potential applications for the latter type of system that, as soon as systems of the former type are fielded, the pressure to convert them to the latter type will be overwhelming.
Ross Anderson's Trusted Computing FAQ
Ross Anderson's Trusted Computing FAQ has a lot to say about recent threads: http://www.cl.cam.ac.uk/~rja14/tcpa-faq.html iang
RE: Difference between TCPA-Hardware and a smart card (was: example: secure computing kernel needed)
Stefan Lucks [EMAIL PROTECTED] writes: Currently, I have three smart cards in my wallet, which I did not want to own and which I never paid for. I never used any of them. Conversation from a few years ago, about multifunction smart cards: - Multifunction smart cards are great, because they'll reduce the number of [smart] cards we'll have to carry around. - I'm carrying zero smart cards, so it's working already! Peter.
Re: Difference between TCPA-Hardware and other forms of trust
John Gilmore [EMAIL PROTECTED] writes: They eventually censored out all the sample application scenarios like DRM'd online music, and ramped up the level of jargon significantly, so that nobody reading it can tell what it's for any more. Now all the documents available at that site go on for pages and pages saying things like FIA_UAU.1 Timing of authentication. Hierarchical to: No other components. FIA_UAU.1.1 The TSF shall allow access to data and keys where entity owner has given the 'world' access based on the value of TCPA_AUTH_DATA_USAGE; access to the following commands: TPM_SelfTestFull, TPM_ContinueSelfTest, TPM_GetTestResult, TPM_PcrRead, TPM_DirRead, and TPM_EvictKey on behalf of the user to be performed before the user is authenticated. That gobbledygook sounds like Common Criteria-speak. So it's not deliberate, it's a side-effect of making it CC-friendly. nobody reading it can tell what it's for any more Yup, that's definitely Common Criteria. Peter.
Re: Difference between TCPA-Hardware and a smart card (was: example:secure computing kernel needed)
At 09:38 AM 12/16/2003 -0500, Ian Grigg wrote: In the late nineties, the smart card world worked out that each smart card was so expensive, it would only work if the issuer could do multiple apps on each card. That is, if they could share the cost with different uses (or users). This resulted in a big shift to multi-application cards, and a lot of expensive reworking and a lot of hype. All the smart card people were rushing to present their own architecture; all the user banks were rushing to port their apps back into these environments, and scratching their heads to come up with App #2 (access control, loyalty...). I've maintained since the mid-90s ... that the idea of the multi-app smartcard is from sometime in the '80s. the target was the portable computing environment before there was portable input/output technology. One of the reasons for smartcard standards was to have interoperability between input/output support stations and the portable computing. The mid-90s saw some take-off in capability of multi-app smartcards because the technology that could be packaged into a smartcard got greater. Also by the mid-90s, there was portable input/output technology, and PDAs and cellphones were starting to rapidly fill the target market niche for multi-app smartcards (where everybody had their own portable computing input/output capability w/o having to find a station someplace). One of the other target market niches for the portable computing devices was the offline environment (again left-over from the 80s) however, with the pervasive penetration of the Internet into the world market followed by all sorts of wireless capability any target offline market niche is rapidly going the way of the dinosaurs. One might claim that the continuing momentum for multi-app smartcards is the enormous investment that was made starting by at least the late '80s continuing up through the current time. 
So while there was an escalating amount of capability that could be packaged in a smartcard form-factor by the late 90s along with an escalating cost apparently requiring escalating feature/function to try and justify the escalating costs why would somebody want a significant amount of capability in what is effectively a deaf, dumb device (w/o its support stations) when you could get enormously better usability by packaging the significant amount of capability in PDA/cellphone form factor. i tried to take the opposite track with the aads chip strawman ... find a reasonably compelling business case for a hardware token and then totally focus on that function. the compelling business use selected was authentication. aads attempts to totally focus on KISS authentication as a compelling business reason for a hardware token, aggressively discarding everything that doesn't support the authentication compelling business use (if something beyond KISS authentication is needed, get a PDA or cellphone). misc. aads stuff: http://www.garlic.com/~lynn/index.html#aads -- Anne Lynn Wheeler http://www.garlic.com/~lynn/ Internet trivia 20th anv http://www.garlic.com/~lynn/rfcietff.htm
Re: Difference between TCPA-Hardware and a smart card (was: example: secure computing kernel needed)
Carl Ellison wrote: It is an advantage for a TCPA-equipped platform, IMHO. Smart cards cost money. Therefore, I am likely to have at most 1. If I glance quickly through my wallet, I find 7 smartcards (all credit cards). Plus the one in my phone makes 8. So, run that at most 1 argument past me again? Cheers, Ben. -- http://www.apache-ssl.org/ben.html http://www.thebunker.net/ There is no limit to what a man can do or how far he can go if he doesn't mind who gets the credit. - Robert Woodruff
RE: Difference between TCPA-Hardware and a smart card (was: example: secure computing kernel needed)
We see here a difference between your and my sides of the Atlantic. Here in the US, almost no one has a smart card. Of those cards you carry, how many are capable of doing public key operations? A simple memory smartcard doesn't count for what we were talking about. There are other problems with doing TCPA-like operations with a smartcard, but I didn't go into those. The biggest one to chew on is that I, the computer owner, need verification that my software is in good shape. My agent in my computer (presumably the smartcard) needs a way to examine the software state of my computer without relying on any of the software in my computer (which might have been corrupted, if the computer's S/W has been corrupted). This implies to me that my agent chip needs a H/W path for examining all the S/W of my computer. That's something the TPM gives us that a smartcard doesn't (when that smartcard goes through a normal device driver to access its machine). - Carl +--+ |Carl M. Ellison [EMAIL PROTECTED] http://theworld.com/~cme | |PGP: 75C5 1814 C3E3 AAA7 3F31 47B9 73F1 7E3C 96E7 2B71 | +---Officer, arrest that man. He's whistling a copyrighted song.---+
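The H/W measurement path Carl describes is what the TPM's platform configuration registers (PCRs) provide: each boot component is hashed into a register before it runs, so the final value commits to the entire chain. A minimal sketch of the extend operation (TCPA 1.x used SHA-1, as here; the component names are made up for illustration):

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """TPM-style extend: new PCR = H(old PCR || H(component))."""
    digest = hashlib.sha1(measurement).digest()
    return hashlib.sha1(pcr + digest).digest()

# Boot chain: each stage measures the next before handing control to it.
pcr = b"\x00" * 20  # PCRs start zeroed at platform reset
for component in [b"bios", b"bootloader", b"kernel"]:
    pcr = extend(pcr, component)

# Any change to any component, or to the order of loading, yields a
# different final value; a verifier (local or remote) compares the
# final PCR against the values expected for known-good software.
```

Because extend is one-way, software that runs late in the chain cannot forge a measurement for software that ran earlier, which is exactly the "no reliance on possibly corrupted S/W" property Carl wants.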
RE: Difference between TCPA-Hardware and a smart card (was: example: secure computing kernel needed)
Stefan, I replied to much of this earlier, so I'll skip those parts. - Carl +--+ |Carl M. Ellison [EMAIL PROTECTED] http://theworld.com/~cme | |PGP: 75C5 1814 C3E3 AAA7 3F31 47B9 73F1 7E3C 96E7 2B71 | +---Officer, arrest that man. He's whistling a copyrighted song.---+ -Original Message- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Stefan Lucks Sent: Tuesday, December 16, 2003 1:02 AM To: Carl Ellison Cc: [EMAIL PROTECTED] Subject: RE: Difference between TCPA-Hardware and a smart card (was: example: secure computing kernel needed) On Mon, 15 Dec 2003, Carl Ellison wrote: The point is that Your system is not supposed to prevent You from doing anything I want you not to do! TCPA is supposed to lock You out of some parts of Your system. This has nothing to do with the TCPA / TPM hardware. This is a political argument about the unclean origins of TCPA (as an attempt to woo Hollywood). I, meanwhile, never did buy the remote attestation argument for high price content. It doesn't work. So, I looked at this as an engineer. OK, I've got this hardware. If remote attestation is worthless, then I can and should block that (e.g., with a personal firewall). Now, if I do that, do I have anything of value left? My answer was that I did - as long as I could attest about the state of the software to myself, the machine owner. This required putting the origins of the project out of my head while I thought about the engineering. That took effort, but paid off (to me). [...] If it were my machine, I would never do remote attestation. With that one choice, I get to reap the personal advantages of the TPM while disabling its behaviors that you find objectionable (serving the outside master). I am not sure, whether I fully understand you. If you mean that TCPA comes with the option to run a secure kernel where you (as the owner and physical holder of the machine running) have full control over what the system is doing and isn't doing -- ok, that is a nice thing. 
On the other hand, we would not need a monster such as TCPA for this. What we need is some agent of mine - a chip - that: 1) has access to the machine guts, so it can verify S/W state 2) has a cryptographic channel to me, so it can report that result to me and 3) has its own S/W in a place where no attacker could get to it, even if that attacker had complete control over the OS. The TCPA/TPM can be used that way. Meanwhile, the TPM has no channel to the outside world, so it is not capable of doing remote attestation by itself. You need to volunteer to allow such communications to go through. If you don't like them, then block them. Problem solved. This reminds me of the abortion debate bumper sticker. If you're against abortion, don't have one. - Carl
Re: Difference between TCPA-Hardware and a smart card (was: example: secure computing kernel needed)
On Mon, 15 Dec 2003 19:02:06 -0500 (EST) Jerrold Leichter [EMAIL PROTECTED] wrote: However, this advantage is there only because there are so few smart cards, and so few smart card enabled applications, around. It is not really true that there are so few smartcards. Almost every mobile phone contains one (the SIM module is a smartcard). Also the situation in Europe is quite different from the USA. Electronic purses on smart cards are pretty common here, especially in France and the Netherlands, where most adults have at least one. But it is true that there are only very few smart card enabled applications. I have worked on several projects for multifunctional use of these smart cards and almost all of them were complete failures. Ernst Lippe
Re: Difference between TCPA-Hardware and other forms of trust
At 7:30 AM -0800 12/17/03, Jerrold Leichter wrote: ... If the system were really trusted, it could store things like your credit balance: A vendor would trust your system's word about the contents, because even you would not be able to modify the value. This is what smart cards attempt to offer - and, again, it would be really nice if you didn't have to have a whole bunch of them. The bank records stored on your system could be trusted: By the bank, by you - and, perhaps quite useful to you, by a court if you claimed that the bank's records had been altered. One should note that TCPA is designed to store its data (encrypted) in the standard file system, so standard backup and restore techniques can be used. However, being able to back up my bank balance, buy a bunch of neat stuff, and then restore the previous balance is not really what a banking application wants. Smart cards address this situation by storing the data on the card, which is designed to be difficult to duplicate. [I always considered the biggest contribution from Mondex was the idea of deposit-only purses, which might reduce the incentive to rob late-night businesses.] Cheers - Bill - Bill Frantz | There's nothing so clear as a | Periwinkle (408)356-8506 | vague idea you haven't written | 16345 Englewood Ave www.pwpconsult.com | down yet. -- Dean Tribble | Los Gatos, CA 95032
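One standard answer to the backup-and-restore-the-balance problem Bill raises is to bind sealed state to a tamper-resistant monotonic counter, so that a restored (stale) blob is detectable. A sketch with the encryption omitted; `MonotonicCounter` here merely stands in for counter hardware and is an assumption of this example, not a TCPA API:

```python
class MonotonicCounter:
    """Stands in for tamper-resistant counter hardware (e.g. in a TPM or
    smart card); unlike a file, it cannot be wound back by a restore."""
    def __init__(self):
        self._value = 0
    def increment(self) -> int:
        self._value += 1
        return self._value
    def read(self) -> int:
        return self._value

def seal(balance: int, counter: MonotonicCounter) -> dict:
    """Bind sealed state to a fresh counter value (encryption omitted)."""
    return {"balance": balance, "counter": counter.increment()}

def unseal(blob: dict, counter: MonotonicCounter) -> int:
    """Reject any blob whose counter value is stale, i.e. a restored backup."""
    if blob["counter"] != counter.read():
        raise ValueError("stale state: possible rollback attack")
    return blob["balance"]
```

Restoring an old blob leaves its embedded counter value behind the hardware counter, so the rollback is caught at unseal time.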
Re: Quantum Crypto
Perry is absolutely right. There is no point in pursuing this. It might even be analogous to what we now know about computers. We were warned that there would never be a need for more than a half-dozen - after all, they were extremely expensive just to get a few more digits in the logarithm table ... Thank goodness that we stopped those wasteful government research efforts and put money into improving analog mechanical desktop calculators - which is all anyone ever needed anyway. ;-) Perry, I seem to remember paying excessive amounts for my first installations of 1822, X.25, token-ring, ethernet - in fact all new devices. Even the ones that weren't needed ... Initial cost is a poor metric and you of all people should know it. However, I sincerely applaud your effort to present a snapshot of the state of the art - and the effort to qualify the QKD folks who are prematurely entering the market. Please try to include a view of the long-term potential and imagine how it might be used when you write your report. After all, who would have thought that computers _would_ be linked together to create communication networks ... And that my 75-year-old mother could not only afford one but actually enjoy using it. (Ok, it's a Macintosh ...) Please don't dismiss what is really a very new research area with unknown potential - just leaving the physicist's lab bench for the engineering lab bench - because a few folks are entering the market too soon and claiming that they have product. There is a baby in that bath water! Season's Greetings! John On 12/16/03 10:14, Perry E. Metzger [EMAIL PROTECTED] wrote: There have been more press releases about quantum crypto products lately. I will summarize my opinion simply -- even if they can do what is advertised, they aren't very useful. They only provide link security, and at extremely high cost. 
You can easily just run AES+HMAC on all the bits crossing a line and get what is for all practical purposes similar security, at a fraction of the price. The problem in security is not that we don't have crypto technologies that are good enough -- our algorithms are fine. Our real problem is in much more practical things like getting our software to high enough assurance levels, architectural flaws in our systems, etc. Thus, Quantum Crypto ends up being a very high priced way to solve problems that we don't have. Perry
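Perry's "AES+HMAC on all the bits" is the encrypt-then-MAC construction. A dependency-free sketch follows; a SHA-256 counter-mode keystream stands in for AES-CTR here (an assumption made only to keep the example standard-library-only; use real AES in practice), and HMAC-SHA256 authenticates the nonce and ciphertext:

```python
import hashlib
import hmac
import os

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Counter-mode XOR cipher; a SHA-256 keystream stands in for AES-CTR
    so the sketch needs only the standard library."""
    out = bytearray()
    for block in range(0, len(data), 32):
        ks = hashlib.sha256(key + nonce + block.to_bytes(8, "big")).digest()
        chunk = data[block:block + 32]
        out.extend(b ^ k for b, k in zip(chunk, ks))
    return bytes(out)

def protect(enc_key: bytes, mac_key: bytes, plaintext: bytes) -> bytes:
    """Encrypt-then-MAC: the MAC covers the nonce and the ciphertext."""
    nonce = os.urandom(16)
    ct = keystream_xor(enc_key, nonce, plaintext)
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def unprotect(enc_key: bytes, mac_key: bytes, frame: bytes) -> bytes:
    """Verify the MAC in constant time before decrypting anything."""
    nonce, ct, tag = frame[:16], frame[16:-32], frame[-32:]
    expected = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")
    return keystream_xor(enc_key, nonce, ct)
```

Running something of this shape over every frame on a link gives confidentiality plus integrity at commodity-CPU cost, which is the comparison point for QKD's price tag.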
Re: Difference between TCPA-Hardware and a smart card (was: example: secure computing kernel needed)
At 10:51 AM 12/16/2003 +0100, Stefan Lucks wrote: I agree with you: A good compromise between security and convenience is an issue, when you are changing between different smart cards. E.g., I could imagine using the smart card *once* when logging into my bank account, and then only needing it, perhaps, to authorise a money transfer. This is a difficult user interface issue, but something we should be able to solve. One problem of TCPA is the opposite user interface issue -- the user has lost control over what is going on. (And I believe that this originates much of the resistance against TCPA.) In sci.crypt, there has been a thread discussing whether OTP (one-time-pad) needs integrity and authentication, and a sub-thread about whether authentication of a message involves checking the integrity of the contents and/or checking the origin of the message. A security taxonomy, PAIN: * privacy (aka things like encryption) * authentication (origin) * integrity (contents) * non-repudiation http://www.garlic.com/~lynn/2003p.html#4 Does OTP need authentication? http://www.garlic.com/~lynn/2003p.html#6 Does OTP need authentication? http://www.garlic.com/~lynn/2003p.html#17 Does OTP need authentication? One of the issues is that privacy, authentication, and integrity are totally different business processes, and that the same technology, let's say involving keys, might be involved in all three; aka digital signatures (public/private keys) can be used to simultaneously provide for authentication (of sender) and integrity (of message contents). Both privacy (encryption) and authentication (say digital signatures) can involve keys that need protecting; privacy because key access needs to be controlled to prevent unauthorized access to data, authentication because unauthorized access to keys could lead to impersonation. 
In the authentication case, involving public/private keys, the business requirement has sometimes led to guidelines that the private key is absolutely protected and things like key escrow are not allowed, because escrow could contribute to impersonation. In the privacy case, involving public/private keys ... the business requirement can lead to guidelines that require mandated escrow of private key(s) because of business continuity issues. This can create ambiguity where the same technology can be used for both authentication and privacy, but because the business processes are different, there can be a mandated requirement that the same keys are never used for both authentication and privacy ... and it is mandated that authentication keys are never escrowed and that privacy keys are always escrowed. A TCPA chip can also be used to protect private keys used in authentication -- either authentication of the hardware component as its own entity, say a router in a large network, or possibly implied authentication of a person who owns or possesses the hardware component. An authentication taxonomy is 3-factor authentication: * something you have * something you know * something you are A hardware token (possibly in chipcard form factor) can be designed to generate a unique public/private key pair inside the token such that the private key never leaves the chip. Any digital signature that can be verified by the corresponding public key can be used to imply something you have authentication (i.e. the digital signature is assumed to have originated from a specific hardware token). A hardware token can also be designed to only operate in a specific way when the correct PIN/password has been entered, in which case the digital signature can imply two-factor authentication, both something you have and something you know. 
From a business process standpoint it would be perfectly consistent to mandate that there is never key escrow for keys involved in the authentication business process, while at the same time mandating key escrow for keys involved in privacy. At issue in business continuity are business requirements for things like no single point of failure, offsite storage of backups, etc. The threat model is: 1) data in business files can be one of a business's most valuable assets, 2) it can't afford to have unauthorized access to the data, 3) it can't afford to lose access to the data, 4) encryption is used to help prevent unauthorized access to the data, 5) if the encryption keys are protected by a TCPA chip, are the encryption keys recoverable if the TCPA chip fails? -- Anne Lynn Wheeler http://www.garlic.com/~lynn/ Internet trivia 20th anv http://www.garlic.com/~lynn/rfcietff.htm
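Lynn's observation that one technology serves both authentication (origin) and integrity (contents) is easy to see with a digital signature: verification fails if either the message or the signature changes. A textbook-RSA toy with tiny, completely insecure parameters, for illustration only:

```python
import hashlib

# Textbook-RSA toy parameters (far too small to be secure; illustration only).
p, q = 61, 53
n = p * q            # modulus 3233
e = 17               # public exponent
d = 2753             # private exponent: e*d = 1 mod (p-1)*(q-1)

def sign(message: bytes) -> int:
    """Origin: only the holder of the private exponent d can produce this."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, sig: int) -> bool:
    """Integrity + origin: fails if the message or signature was altered."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(sig, e, n) == h
```

The same key pair thus answers both "who sent this?" and "was it changed?", which is why the business-process rules about escrow, not the technology itself, force the separation of authentication keys from privacy keys.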
Re: Quantum Crypto
John Lowry [EMAIL PROTECTED] writes: Perry is absolutely right. There is no point in pursuing this. It might even be analogous to what we now know about computers. We were warned that there would never be a need for more than a half-dozen - after all, they were extremely expensive just to get a few more digits in the logarithm table ... Thank goodness that we stopped those wasteful government research efforts and put money into improving analog mechanical desktop calculators - which is all anyone ever needed anyway. ;-) Your amusing banter aside, my point remains. QCrypto doesn't solve any problems that anyone has in the real world -- everything it can do can be done far more cheaply and indeed far better by other means -- so it is a large expense that serves no purpose. I know of no company using something like AES+HMAC for link security that has had its cryptographically secured communications successfully attacked by cryptanalysis* -- and AES is free, and running it is nearly free. On the other hand, I know of lots of companies that have had problems because they haven't thought out their remote access systems well or because they are running software vulnerable to buffer overflows. The issue is not that we need unbreakable crypto -- we already have it for practical purposes. The issue is that our systems are not built robustly. Please don't dismiss what is really a very new research area with unknown potential - This is not an issue of unknown potential -- we know what the systems being marketed do. They have specifications and user manuals. I would never suggest that people stop research, of course, but it seems that QCrypto is not a solution to any real world problem. Perry *By this, I don't include things like the key management algorithm only used all ones as the key -- I mean legitimate attacks against AES etc.
I don't know PAIN...
What is the source of the acronym PAIN? Lynn said: ... A security taxonomy, PAIN: * privacy (aka things like encryption) * authentication (origin) * integrity (contents) * non-repudiation I.e., its provenance? Google shows only a few hits, indicating it is not widespread. iang