James A. Donald wrote:
PKI was designed to defeat man in the middle attacks
based on network sniffing, or DNS hijacking, which
turned out to be less of a threat than expected.

asymmetric cryptography has a pair of keys ... what has been encoded by one of the key-pair can only be decoded by the other. a business process was defined using this technology where one of the key-pair is designated as public ... and freely distributed, and the other of the key-pair is designated as confidential and never divulged. an authentication business process, called digital signature, was defined using public/private key-pairs .... a hash of the message is computed and encoded with the private key. the recipient can recompute the hash of the received message and compare it to the digital signature that has been decoded with the corresponding public key. this catches whether the message has been altered and, in terms of 3-factor authentication
http://www.garlic.com/~lynn/subpubkey.html#3factor

* something you have
* something you know
* something you are

implies "something you have" authentication ... i.e. the originator has access and use of the corresponding private key.

PKI was somewhat targeted at the offline email model of the early 80s; the relying party dials up their (electronic) post office, exchanges email, and hangs up. They then may be dealing with first-time correspondence from a total stranger with no (offline or online) recourse for determining information about the sender. Relying parties could be seeded with a trusted public key repository of trusted third party certification authorities. Strangers could be issued "certificates" (digitally signed by one of these certification authorities) containing information about themselves bound to their public key. Email recipients in the offline email days of the early 80s ... could now have a source of information regarding first-time communication from total strangers (sort of the "letters of credit" model from the sailing ship days).

we were asked to work with this small client/server startup in menlo park
http://www.garlic.com/~lynn/aadsm5.htm#asrn2
http://www.garlic.com/~lynn/aadsm5.htm#asrn3

that wanted to do payments on something they called a commerce server. In the year we worked with them ... they moved from menlo park to mountain view and changed their name (trivia question ... who previously had the rights to their new name? also, what large corporate entity was providing most of the funding for the commerce server?). some topic drift ... recent postings reference this original e-commerce work as an example of service-oriented architecture (SOA):
http://www.garlic.com/~lynn/2005i.html#42
http://www.garlic.com/~lynn/2005i.html#43

they had this technology called SSL which was aimed at addressing two issues: a) is the webserver that the user had indicated to the browser ... the actual webserver the browser was talking to, and b) encryption of the transmitted information.

SSL digital certificates would be issued
http://www.garlic.com/~lynn/subpubkey.html#sslcert

which would contain the domain name of the webserver bound to their public key. the browsers would have a trusted public key repository seeded with the public keys of some number of trusted third party certification authorities. the browser SSL process would compare the domain name indicated by the user to the domain name in the digital certificate (after validating the certificate).
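
a rough sketch of that browser-side process ... again python, using the standard library ssl module; the trusted CA repository is whatever the platform ships with, and "example.com" is just a placeholder for the domain the user indicated:

import socket, ssl

user_indicated_domain = "example.com"   # the domain the user typed/clicked

# the default context comes seeded with a repository of trusted CA public keys
context = ssl.create_default_context()

with socket.create_connection((user_indicated_domain, 443)) as sock:
    # wrap_socket validates the server's certificate against the trusted CAs
    # and checks that the certificate's domain name matches the domain the
    # user indicated (server_hostname); a mismatch raises an error
    with context.wrap_socket(sock, server_hostname=user_indicated_domain) as tls:
        print("validated certificate subject:", tls.getpeercert()["subject"])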

(at least) two (other) kinds of vulnerabilities/exploits have shown up.

1) in the name of convenience, the browsers have significantly obscured the certificate operation from the end-user. attackers have devised ways to get end-users to indicate incorrect webservers ... which the browser SSL process (if it is even invoked) will then gladly validate as being the webserver the user indicated.

2) a perceived issue (with knowing that the webserver that a browser is talking to is the webserver the user indicated) was the integrity of the domain name infrastructure. however, as part of doing this consulting with this small client/server startup ... we also had to do detailed end-to-end business process due diligence on some number of these certification authorities. it turns out that a certification authority typically has to check with the authoritative agency for the information they are certifying. the authoritative agency for domain name ownership is the domain name infrastructure ... the very institution whose integrity questions gave rise to the requirement for SSL domain name certificates in the first place.

To address the second vulnerability, the certification authority industry is somewhat backing a proposal that when somebody registers a domain name with the domain name infrastructure ... they also register their public key. then, in future communication with the domain name infrastructure, they digitally sign the communication. the domain name infrastructure can then validate the digital signature using the (certificateless) public key on file for that domain. This supposedly improves the integrity of the communication between the domain name owner and the domain name infrastructure .... mitigating some possible domain name hijacking exploits (where some other organization becomes recorded as the domain name owner).
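
a toy sketch of that proposal ... the registry table and record layout are purely hypothetical, but it shows the intended mechanics: the public key is recorded at registration time (no certificate involved), and later communication is accepted only if its digital signature verifies against that on-file key:

from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

onfile_keys = {}    # domain -> public key registered with the domain name infrastructure

def register_domain(domain, public_key):
    onfile_keys[domain] = public_key

def apply_update(domain, request: bytes, signature: bytes) -> bool:
    # accept the update only if it was signed by the recorded domain owner
    try:
        onfile_keys[domain].verify(signature, request,
                                   padding.PKCS1v15(), hashes.SHA256())
        return True
    except (KeyError, InvalidSignature):
        return False    # unknown domain or possible hijacking attempt

owner_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
register_domain("example.com", owner_key.public_key())
request = b"update name servers for example.com"
signature = owner_key.sign(request, padding.PKCS1v15(), hashes.SHA256())
print(apply_update("example.com", request, signature))    # True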

It turns out that the certification authority industry also has an issue. When somebody applies for an SSL domain name certificate, they need to supply a bunch of identification information. This is so the certification authority can perform the expensive, time-consuming and error-prone identification process ... then do the same with the information on file at the domain name infrastructure as to the owner of the domain name ... and then see if the two domain name owner identifications appear to match. With an on-file public key for the domain name owner, the certification authority industry can also require that an SSL domain name applicant digitally sign their application. The certification authority can then retrieve the on-file (certificateless) public key and change an expensive, error-prone, and time-consuming identification process into a simple and more reliable authentication process (by retrieving the on-file public key and validating the digital signature).
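
the CA side of the same idea, as a separate self-contained sketch ... the dict standing in for the domain name infrastructure's on-file records is hypothetical; the point is that validating a digital signature (authentication) replaces matching two sets of identification information:

from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

dns_onfile_records = {}    # domain -> public key on file with the domain name infrastructure

def authenticate_application(domain, application: bytes, signature: bytes) -> bool:
    # retrieve the on-file (certificateless) public key and validate the
    # digital signature on the SSL domain name certificate application
    try:
        dns_onfile_records[domain].verify(signature, application,
                                          padding.PKCS1v15(), hashes.SHA256())
        return True     # applicant demonstrably controls the domain owner's key
    except (KeyError, InvalidSignature):
        return False    # fall back to the expensive identification process

applicant_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
dns_onfile_records["example.com"] = applicant_key.public_key()
application = b"SSL domain name certificate application for example.com"
signature = applicant_key.sign(application, padding.PKCS1v15(), hashes.SHA256())
print(authenticate_application("example.com", application, signature))    # True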

From an e-commerce perspective ... the SSL process was to protect against credit card information harvesting for use in fraudulent transactions. However, the major vulnerability/exploit both before SSL and after the introduction of SSL ... wasn't against credit card information in flight ... but against huge repositories of credit card information (information at rest). It was much easier for the crooks to steal the information already collected in huge repositories than to go to the effort of eavesdropping on the information in flight and creating their own repositories (fraud return-on-investment: much bigger benefit in stealing large repositories of already collected and organized information). related reference regarding security proportional to risk
http://www.garlic.com/~lynn/2001h.html#61

the financial standards working group, x9a10, was given the task of preserving the integrity of the financial infrastructure for all retail payments (as well as some number of other requirements) for the x9.59 standard
http://www.garlic.com/~lynn/index.html#x959

so some earlier work on PKI-oriented protection for retail payments had involved a digitally signed, transaction-oriented protocol with attached digital certificates.

in the early 90s, there was some work on x.509 identity certificates. however, there were issues with certification authorities predicting exactly what information might be needed by unknown future relying parties ... and so there was some direction to grossly overload these certificates with excessive amounts of personal information. In the mid-90s, some number of institutions were starting to realize that such overloaded repositories of excessive personal information represented significant liability and privacy issues. As a result, you saw some retrenchment to relying-party-only certificates
http://www.garlic.com/~lynn/subpubkey.html#rpo

these were digital certificates that basically contained some kind of database record locator (like an account number) bound to a public key (the database record contained all the real information). however, it became trivial to demonstrate that such relying-party-only certificates were redundant and superfluous. This was, in part, because they violated the original design point for certificates ... the relying party not having any other recourse to the necessary information. If all the information was already in the relying party's database ... then by definition the certificate was redundant and superfluous.

in the latter part of the mid-90s payment scene, these relying-party-only certificates were on the order of 4k-12k bytes. It turns out that a typical retail payment message is 60-80 bytes. Not only were the stale, static, relying-party-only certificates redundant and superfluous ... they would also contribute to enormous payload bloat (on the order of one hundred times).
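
the back-of-the-envelope arithmetic behind "on the order of one hundred times":

# relying-party-only certificate: roughly 4,000-12,000 bytes
# typical retail payment message: roughly 60-80 bytes
low = 4_000 / 80      # smallest certificate vs largest message  -> 50x
high = 12_000 / 60    # largest certificate vs smallest message  -> 200x
print(f"certificate overhead is roughly {low:.0f}x to {high:.0f}x the base message")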

the other problem with the relying-party-only, redundant and superfluous, stale, static, enormous-payload-bloat, digital-certificate-based infrastructure ... was that it effectively targeted only the protection of credit card information "in flight" ... something that SSL was already doing. It provided no countermeasure for the major vulnerability, the data "at rest". the information at rest was still vulnerable (and was already the major exploit, with or without SSL)

So one of the things in the x9a10 financial standards working group was to do a threat and vulnerability analysis ... and design something that could preserve the integrity of the financial infrastructure for all retail payments (credit, debit, stored-value, online, offline, POS, etc).

X9A10 defined a light-weight digitally signed transaction that wouldn't contribute to the enormous payload bloat of the stale, static, redundant and superfluous certificate-based infrastructures.

Another issue was that the analysis demonstrated that the major threat and vulnerability was to the data at rest. So for X9.59, a business rule was defined: for account numbers used in X9.59 transactions, only correctly verified, digitally signed (authenticated) transactions could be authorized.

An x9.59 transaction was digitally signed, and the relying party could use an on-file public key to validate the digital signature .... showing the transaction wasn't modified in transit and providing "something you have" authentication as to the originator (they had access and use of the corresponding private key). furthermore, eavesdropping on the transaction in flight ... and/or harvesting the large transaction databases (information at rest) wouldn't yield information sufficient for a crook to perform a fraudulent transaction. the current class of exploits, where knowledge from an existing transaction is sufficient to generate a fraudulent transaction, goes away ... for vulnerabilities involving both "data in flight" as well as "data at rest".
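
a minimal sketch of that kind of flow ... the account table, field layout and numbers are invented for illustration (not the actual x9.59 format), but it shows why harvested transaction data alone isn't sufficient to generate a new fraudulent transaction:

from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

onfile_account_keys = {}    # account number -> public key on file with the relying party

def authorize(account, transaction: bytes, signature: bytes) -> bool:
    # business rule: only correctly verified (authenticated) transactions
    # against an x9.59 account number may be authorized
    try:
        onfile_account_keys[account].verify(signature, transaction,
                                            padding.PKCS1v15(), hashes.SHA256())
        return True
    except (KeyError, InvalidSignature):
        return False

holder_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
onfile_account_keys["4000-0000-0000-0001"] = holder_key.public_key()
txn = b"pay 19.95 to merchant 42 from account 4000-0000-0000-0001"
sig = holder_key.sign(txn, padding.PKCS1v15(), hashes.SHA256())

print(authorize("4000-0000-0000-0001", txn, sig))                     # True
# a crook with harvested data but no private key can't sign a new transaction
print(authorize("4000-0000-0000-0001", b"pay 500.00 to crook", sig))  # False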

The issue wasn't SSL being designed to protect data-in-flight ... the issue was that the major threat/vulnerability has been to "data-at-rest" ... so to some extent, SSL (and the various other countermeasures to "data-in-flight" vulnerabilities) wasn't responding to the major threats. To some extent, e-commerce/internet was opening theoretical new vulnerabilities ("data-in-flight") compared to the non-internet world ... and so SSL was somewhat theoretically demonstrating that e-commerce/internet use wouldn't make the situation any worse.

Recent studies have indicated that at least 77% of id-theft exploits have involved insiders (supporting the long-standing premise that the majority of fraud is by insiders). The introduction of e-commerce and the internet has opened new avenues for outsiders to attack data-at-rest. As a result, potential e-commerce/internet threats to data-at-rest have contributed to obfuscating responsible insiders in cases of exploits against data-at-rest.


---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
