On Tue, Jun 8, 2010 at 7:07 AM, Peter Watkins <[email protected]> wrote:
> On Mon, Jun 07, 2010 at 09:46:35PM -0700, John Panzer wrote:
> > On Mon, Jun 7, 2010 at 7:35 PM, Peter Watkins <[email protected]> wrote:
> >
> > > Whitelists again? Ugh. Chris had that nice video explaining how XAuth
> > > could put a few IdPs up there in a short, relevant list of choices.
> > > Now you're saying that to make privacy OK, IdPs have to whitelist RP
> > > sites?
> >
> > What makes you think their IdP wouldn't be doing this based on the
> > user's preferences?
>
> Because that would just move the NASCAR problem from the RP site to the
> IdP site. Your current draft says the IdP can specify a list of RP domains
> when it deposits a token. In order to give the end user control over what
> sites this should be used at, the IdP would need a UI for determining what
> should go in this "extend" list. And it would have to do so when adding
> each token! NASCAR all over again, and probably even worse, since my
> impression is that there are at least two orders of magnitude more RPs
> than IdPs for any user. (In reality, this means IdP sites would not bother
> implementing such controls.)

This would certainly be a poor UI. I can imagine better ones, but more to
the point, the marketplace can decide what the best UI is in this case.
(For reference, I've put a sketch of the deposit call in question in a P.S.
at the bottom.)

> This is a great example of why this should be in-browser. With an
> in-browser solution, a user could be prompted each time an RP asks for
> XAuth tokens, and could decide at that time which IdP tokens to reveal,
> and whether to always reveal the same set to that RP, etc. Users would
> only be prompted about the tokens they actually possess, and the RP sites
> they actually visit -- solving the privacy/disclosure NASCAR problem
> efficiently.

I think this would be a poor UI too -- it's well known that most users will
simply end up clicking "OK" in this situation, and the experience is worse.
But without getting into that argument: you could implement essentially the
same UX using JS -- the RP doesn't get the data sent back via postMessage()
unless the xauth.org JS says it can (there's a rough sketch of what I mean
a bit further down). You could probably have a better UX with an in-browser
solution, but not a qualitatively different one. In other words, this is
not a strong differentiator for in-browser vs. JS solutions.

> Another reason for in-browser: avoiding one-size-fits-all solutions. You
> could add per-request controls to the code on xauth.org, but many users
> wouldn't want them, as it would slow them down. And you'd spend time
> localizing the UI. Move it to the browser and it becomes someone else's
> problem. You could release minimally compliant extensions, and let "the
> market" handle edge cases, just as the ISV market has for years provided
> additional controls over things like HTTP cookies.

I agree that an in-browser solution could provide a better UX; in fact
that's my argument (make it work, then make it better) ;).
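
Coming back to the JS-gating point for a second, here is roughly the shape
of what I mean. To be clear, this is a sketch rather than the actual
xauth.org code; the storage key and message format are made up for
illustration:

    // Sketch of the xauth.org iframe's message handler.  The RP page never
    // touches the token store directly; it only receives whatever this
    // handler decides to post back.
    window.addEventListener("message", function (event) {
      var requester = event.origin;  // e.g. "https://some-rp.example"
      var all = JSON.parse(localStorage.getItem("xauth_tokens") || "{}");
      var visible = {};
      for (var idp in all) {
        var entry = all[idp];        // { token: ..., extend: [...], expire: ... }
        // Only reveal a token if the depositing IdP listed this RP.  A real
        // implementation would normalize origins vs. bare domains, check
        // expiry, and could insert a user-consent prompt right here.
        if (entry.extend.indexOf(requester) !== -1) {
          visible[idp] = entry.token;
        }
      }
      event.source.postMessage(JSON.stringify({ tokens: visible }), requester);
    }, false);

Whether that check (or a prompt) lives in xauth.org's JS or in a browser
extension, the RP's view of the tokens is gated in exactly the same place.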

> > > > I think that browser support would make some things easier --
> > > > perhaps defending against "pretend" IdPs that use social engineering
> > > > to get themselves on your IdP list -- but (a) those attacks aren't
> > > > privacy issues and (b) they appear low value to me.
> > >
> > > What social engineering? Getting you to open a page with an XAuth
> > > iframe? That sounds like a relatively easy attack to carry out, esp.
> > > in the era of tinyurl and bit.ly.
> >
> > And if successful, it gets the attacker a slightly annoying link with a
> > big glowing verified pointer back to their web site. I hope spammers do
> > attempt this; it'll be much easier to combat than other things they do
> > today.
>
> Many phishers don't really care about having legitimate-looking URLs.
> Some would try using this to phish someone's Facebook credentials, and
> since XAuth code is aggressively cached in the browser, this wouldn't be
> any easier to defend against than current phishing scams (if XAuth.org
> *did* see traffic for each token jar request, then you would have a
> point). Consider Chris' demo again -- the Receiver site using simple
> strings & favicons to represent XAuth-discovered IdPs. Seems a legitimate
> concern to me.

Apparently, in order to phish Facebook credentials, one need do no more
than rank #1 in the search results for "facebook login"
(http://knowyourmeme.com/memes/i-want-the-old-facebook-back). I do agree
that we need more comprehensive defenses against "bad actors" on the
Internet. This is a separate discussion, but the ability to have
distributed, abuse-resistant reputation for both sites and users is
something that is sorely needed if we really want to promote a fully
decentralized architecture for just about anything of interest. If, for
example, xauth.org had a good way to ask for the Internet's opinion of a
site, it could ask the user to confirm for the small subset of sites that
it thinks are hinky (or which it has no information about), with
appropriate warnings. An immune system for the Internet, if you will. On
the other hand, this is also something that is not a differentiator between
in-browser and JS-based implementations; either one could consult such a
service and pop up (infrequent, scary) warnings.

> > > > I will agree that things would be more secure if we encased every
> > > > client computer in concrete with no 'net connection and sank them to
> > > > the bottom of the ocean.
> > >
> > > Excuse me?
> >
> > My somewhat flippant point was that eliminating all possible risks also
> > eliminates all possible usefulness.
>
> And the implication is that you don't want to bother with threat
> assessment. Or that perhaps I'm not a worthy interlocutor. Please try not
> to be flippant; I think most of us here (save perhaps potty-mouth Santosh)
> are trying to be civil and respectful.

Definitely not the intention. I was having trouble figuring out what
specific threats people were concerned about. If you have to handle all
possible threats, there is almost no adequate defense.

> > > > Sure, we could host extensions at xauth.org. And then people could
> > > > download them. From, um, a centralized site. How is that more
> > > > decentralized exactly?
> > >
> > > Users could vet the code and not worry about it changing on them the
> > > way it could from a SaaS site like xauth.org.
> >
> > Of course regular users are not going to vet the code. And xauth.org is
> > not a SaaS site. It's just a stock web server.
>
> Users are relying on code not under their control (whatever is served by
> xauth.org). It might not be full-on REST/SOAP, but it's a centralized
> service.

Yes. This raises the question of what "under their control" means exactly,
though (what threats are we considering?).

> > But yes, it would be better for _security in the long run_ to have this
> > code be baked into an otherwise trusted download (like the browser). It
> > is not necessary to do this to start with, and indeed is a very bad idea
> > while the APIs are still in flux.
>
> I don't buy that.
> It's simple to update a repository XML file to tell Chrome, Firefox,
> etc., that there's a new extension available. If the *API* is in flux,
> then all the RPs will have to adapt anyway. Having the code in the browser
> makes it tougher to update the *implementation*, but not the API, and only
> very, very slightly. Make a new distributable, update the software XML
> feeds, publish. I do this for my own software, and it takes 2-3 short
> commands for me to publish an update. No big deal.
>
> > > xauth.org would have no indication whatsoever that a user was
> > > interacting with XAuth-compatible Extenders or Receivers.
> >
> > Not sure what you're saying here.
>
> Spelled out in my subsequent email -- Referer data for IFRAME requests.
>
> > > I'll post this on the xauth group now, too, but for instance currently
> > > the xauth.org site is not accessible via an https URL (at least not
> > > one that passes normal CN/hostname checks).
> >
> > Heh. They're currently using Akamai to mitigate SPOF issues and Akamai
> > is responding to SSL requests as *.akamai.net hosts. Obviously this
> > configuration issue will be fixed.
>
> Yeah, it's been a month since Allen Tom raised the issue in the xAuth
> Google group, and nobody there actually replied to say they'd fix that.
> Glad to hear it from you.
>
> > (Note that exactly the same issues arise when downloading extensions.
> > JS is just a way of delivering always-latest-version extensions to your
> > browser.)
>
> And the solutions are similar -- code-signing and publishing extension
> info on https pages, as Firefox does.

How does this avoid having to trust a central site (the extension site and
the owner of the signing key)? Or do you see the case of retrieving JS via
TLS as qualitatively different from retrieving similar code from an
extension site? Or is it the fact that users would need to actively
download the extension (until, as some suggest, it is baked into browsers
automatically)?

Thought experiment: Would you be satisfied if XAuth were baked into
Chromium (hosted at www.chromium.org)? If so, would it be sufficient to
CNAME xauth.org to www.chromium.org and serve up the JS from there, signed
with the Chromium.org private key?

> -Peter
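
P.S. For anyone following along without the draft in front of them, the
deposit call that the "extend" discussion at the top refers to looks
roughly like this. The parameter names are from memory and the domains are
placeholders, so treat it as illustrative rather than normative:

    // IdP page, after the user signs in: deposit a token and say which RP
    // domains may retrieve it.  The "extend" array is the per-RP whitelist
    // under discussion.
    XAuth.extend({
      token: "opaque-session-hint",
      expire: new Date().getTime() + 24 * 60 * 60 * 1000,  // roughly one day
      extend: ["rp1.example.com", "rp2.example.com"],
      callback: function () { /* deposited */ }
    });

    // RP page: ask which of the IdPs it cares about have deposited tokens.
    XAuth.retrieve({
      retrieve: ["idp.example.com"],
      callback: function (response) {
        // response.tokens should only contain entries whose extend list
        // (or the user's own policy) allows this RP to see them.
      }
    });

Peter's point is that populating that extend array sensibly pushes a
NASCAR-style choice onto the IdP; mine is that the same data can just as
easily be gated by whatever UI the xauth.org JS (or a browser extension)
puts in front of it.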
