Hi Ed,

I have demonstrated the theory behind YURLs by providing an implementation, the Waterken Browser, and by explaining how two other widely used systems implement the theory. Please clarify your concerns by providing a detailed attack description for any one of these three implementations.
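The check each of these implementations performs can be sketched in a few lines. This is a minimal illustration, not the HTTPSY specification: the choice of SHA-1 and the raw key bytes are stand-ins for whatever hash and key encoding a given implementation actually uses.

```python
import hashlib


def fingerprint(public_key: bytes) -> str:
    """Hex fingerprint of a site's public key.

    SHA-1 is used here purely for illustration; an implementation
    would fix its own hash and key-encoding conventions.
    """
    return hashlib.sha1(public_key).hexdigest()


def verify_yurl(url_fingerprint: str, offered_key: bytes) -> bool:
    """Accept the connection only if the key offered by the visited
    site matches the fingerprint carried in the URL itself.

    No certificate or Certificate Authority is consulted: the URL is
    self-authenticating, so a man-in-the-middle who substitutes his
    own key fails the comparison.
    """
    return fingerprint(offered_key) == url_fingerprint


# Hypothetical site key and a YURL-style fingerprint derived from it.
site_key = b"example DER-encoded public key bytes"
url_fp = fingerprint(site_key)

assert verify_yurl(url_fp, site_key)        # genuine site: accepted
assert not verify_yurl(url_fp, b"mitm key") # substituted key: rejected
```

The point of the sketch is that the introducer who hands you the URL fixes *which key* you will talk to, not merely *where* you will connect; a spoofed or intercepted connection presents a key that cannot match the fingerprint.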
See: http://www.waterken.com/dev/YURL/Like/

Tyler

On Monday 14 July 2003 14:31, Ed Gerck wrote:
> From your URLs:
>
> "The browser verifies that the fingerprint in the URL matches the public
> key provided by the visited site. Certificates and Certificate Authorities
> are unnecessary."
>
> Spoofing? Man-in-the-middle? Revocation?
>
> Also, in general, we find that one reference is not enough to induce trust.
> Self-references cannot induce trust, either (Trust me!). Thus, it is
> misleading to let the introducer determine the message target, in what you
> call the "y-property". Spoofing and MITM become quite easy to do if you
> trust an introducer to tell you where to go.
>
> Not that I believe CAs are essential (I don't, for reasons already
> presented in '97), but unless the issues of spoofing, MITM and revocation
> are adequately handled according to a threat model that is useful,
> communication cannot be considered secure.
>
> Cheers,
> Ed Gerck
>
> Tyler Close wrote:
> > Now available on the Waterken Inc. site is a specification and
> > implementation for a new HTTP extension, HTTPSY.
> > ...

-- 
The union of REST and capability-based security:
http://www.waterken.com/dev/Web/

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
