I'm having trouble seeing how some additional once-per-year cache revalidation requests to xauth.org's IP would change the amount of information leakage in any appreciable way.

A single point of failure means the system is *not* decentralized.

I assume the caching cannot be reset, and that your intent is to get browser support within the year so it never has to be revalidated; this would limit blocking/interception attacks.

I don't, however, understand why you can't set up a local proxy server, temporarily bounce the JS request through it, and seed the cache accordingly. Post the source code, let everyone review it, then let users download it from mirrors (verifying the hash against wherever it is published), creating a truly decentralized setup. (For at least another year, they wouldn't have to worry about setting up the cache again.)
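The mirror-plus-hash step above could look something like the following sketch. The file name, mirror URL, and the idea that the hash is posted alongside the source are all assumptions for illustration, not real XAuth artifacts; here the download is simulated with a stub file so the verification logic is the only thing shown.

```shell
#!/bin/sh
# Hedged sketch: verify a mirrored copy of the script against a
# published hash before trusting it / letting the browser cache it.
# In practice the file would come from any mirror, e.g.:
#   curl -sO https://mirror.example.org/xauth.js
printf 'document.write("xauth stub");\n' > xauth.js  # stand-in for the mirrored file

# The hash as published by the project (placeholder here; in reality
# you would copy it from wherever the source and hash are posted).
published_hash=$(sha256sum xauth.js | awk '{print $1}')

# Recompute locally and compare before seeding the local cache.
actual_hash=$(sha256sum xauth.js | awk '{print $1}')
if [ "$actual_hash" = "$published_hash" ]; then
  echo "hash OK: safe to cache locally"
else
  echo "hash mismatch: do not trust this copy" >&2
  exit 1
fi
```

Since the hash can be checked against any independent copy of the published value, no single host has to be trusted, which is the whole point of the mirrored setup.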

There *is* a slightly higher barrier to entry, then, but it would do a great deal to alleviate the concerns of those of us who see only a centralized service which *may*, in some idyllic future, become what it was promised to be.

-Shade
_______________________________________________
specs mailing list
[email protected]
http://lists.openid.net/mailman/listinfo/openid-specs