Re: CORS performance
On Feb 19, 2015 3:33 AM, "Anne van Kesteren" wrote:
> > On Thu, Feb 19, 2015 at 12:17 PM, Dale Harvey wrote:
> > > With Couch / PouchDB we are working with an existing REST API wherein
> > > every request is to a different url (which is unlikely to change). The
> > > performance impact is significant since most of the time is used up by
> > > latency; the CORS preflight requests essentially double the time it
> > > takes to do anything.
> >
> > Yeah, also, it should not be up to us how people design their HTTP
> > APIs. Limiting HTTP in that way because it is hard to make CORS scale
> > seems bad.

+1. Forcing developers to change their APIs would be bad form at this stage, not to mention just plain silly. Optimizing with an OPTIONS * preflight is a good option, but it won't be as broadly available to developers as a response header.

Perhaps another approach would be to allow a resource to declare a CORS policy only for subordinate resources, rather than for the entire origin. For instance, an OPTIONS sent to http://example.org/api/ could return CORS headers that cover every URL prefixed with http://example.org/api/. That would logically extend all the way up to OPTIONS * in order to set a policy that covers the entire origin.

- James

> I think we've been too conservative when introducing CORS. It's
> effectively protecting content behind a firewall, but we added all
> these additional opt-in mechanisms beyond protecting content behind a
> firewall due to unease about the potential risks. Figuring out to what
> extent that actually serves a purpose would be good.
>
> If declaring this policy through a header is not acceptable, we could
> attempt a double preflight fetch for the very first CORS fetch against
> an origin (that requires a preflight). Try OPTIONS * before OPTIONS
> /actual-request. If that handshake succeeds (details TBD), no more
> preflights are necessary for the entire origin.
>
> --
> https://annevankesteren.nl/
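The prefix-scoped policy proposed above can be sketched as a preflight cache lookup. This is purely illustrative: neither the CORS spec nor any browser implements prefix- or origin-wide scoping, and all names here are hypothetical.

```python
# Hypothetical prefix-scoped preflight cache, per the proposal above:
# a successful OPTIONS to http://example.org/api/ would cover every URL
# under that prefix, and OPTIONS * would cover the whole origin.
# This is NOT standard CORS behavior; names are illustrative only.
from urllib.parse import urlsplit


class PreflightCache:
    def __init__(self):
        # Maps (origin, path_prefix) -> True for prefix scope, None for exact.
        self._policies = {}

    def record(self, url, scope="exact"):
        """Record a successful preflight. scope is 'exact', 'prefix'
        (covers subordinate resources), or 'origin' (OPTIONS *)."""
        parts = urlsplit(url)
        origin = (parts.scheme, parts.netloc)
        if scope == "origin":
            self._policies[(origin, "/")] = True
        elif scope == "prefix":
            self._policies[(origin, parts.path)] = True
        else:
            self._policies[(origin, parts.path)] = None

    def needs_preflight(self, url):
        parts = urlsplit(url)
        origin = (parts.scheme, parts.netloc)
        for (o, prefix), is_prefix in self._policies.items():
            if o != origin:
                continue
            if is_prefix and parts.path.startswith(prefix):
                return False
            if is_prefix is None and parts.path == prefix:
                return False
        return True


cache = PreflightCache()
cache.record("http://example.org/api/", scope="prefix")
print(cache.needs_preflight("http://example.org/api/docs/123"))  # False
print(cache.needs_preflight("http://example.org/other/"))        # True
```

Under this model, Dale's per-URL REST API would pay the preflight latency once per origin (or prefix) instead of once per URL.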
Re: HTML imports: new XSS hole?
Some initial informal testing shows that import links do make it through the filters I have readily at hand. It was quick work to write up some custom filters, however.

On Jun 2, 2014 1:52 PM, "Boris Zbarsky" wrote:
> On 6/2/14, 4:21 PM, Giorgio Maone wrote:
> > I do hope any filter already blocked out <style> elements, as CSS has
> > been an XSS vector for a long time
>
> <link> elements without "stylesheet" in rel don't load CSS, though.
>
> Hence the worries about blacklist vs whitelist...
>
> -Boris
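Boris's point about rel is worth spelling out: rel is a space-separated list of link types, so a blacklist keyed on the exact value "stylesheet" (or one unaware of "import") is easy to slip past. A minimal sketch, with illustrative values:

```python
# rel is a space-separated, case-insensitive set of link types (per HTML),
# so naive string comparison against rel values is fragile. A filter
# written before HTML imports existed would not know to look for
# "import" at all. Values below are illustrative.
def rel_tokens(rel_value):
    return set(rel_value.lower().split())


print("import" in rel_tokens("Import stylesheet"))  # True
print("stylesheet" in rel_tokens("import"))         # False
```

This is the blacklist-vs-whitelist worry in miniature: a blacklist has to anticipate every dangerous token, while a whitelist only has to enumerate safe ones.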
Re: HTML imports: new XSS hole?
I'm not saying it's perfect, not by any stretch. I'm saying it shouldn't be worse. Any implementation that supports the mechanism will need to be aware of the risk, and content filters will need to evolve. Perhaps an additional strongly worded warning in the spec would be helpful.

On Jun 2, 2014 6:43 AM, "Boris Zbarsky" wrote:
> On 6/2/14, 9:22 AM, James M Snell wrote:
> > Yes, that's true. Content filters are likely to miss the links
> > themselves. Hopefully, the imported documents themselves get filtered
>
> By what, exactly? I mean, CSP will apply to them, but not website content
> filters...
>
> > One assumption we can possibly make is that
> > any implementation that knows how to follow import links ought to know
> > that they need to be filtered.
>
> Sure, but that assumes the filtering we're talking about is being done by
> the UA to start with.
>
> -Boris
Re: HTML imports: new XSS hole?
Yes, that's true. Content filters are likely to miss the links themselves. Hopefully, the imported documents themselves get filtered, but there's no guarantee. One assumption we can possibly make is that any implementation that knows how to follow import links ought to know that they need to be filtered. I'm not aware of any current user agents that are not import-aware yet automatically follow and execute link tags.

On Jun 2, 2014 6:12 AM, "Boris Zbarsky" wrote:
> On 6/2/14, 9:02 AM, James M Snell wrote:
> > I suppose that if you
> > needed the ability to sandbox them further, just wrap them inside a
> > sandboxed iframe.
>
> The worry here is sites that currently have html filters for user-provided
> content that don't know about <link> elements being able to run scripts.
> Clearly once a site knows about this they can adopt various mitigation
> strategies. The question is whether we're creating XSS vulnerabilities in
> sites that are currently not vulnerable by adding this functionality.
>
> -Boris
>
> P.S. A correctly written whitelist filter will filter these things out.
> Are we confident this is standard practice now?
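Boris's P.S. about whitelist filters can be illustrated with a toy sanitizer: because only explicitly allowed elements survive, an unanticipated `<link rel="import">` is dropped automatically, no update to the filter required. This is a deliberately minimal sketch, not a production sanitizer; real deployments should use a vetted library.

```python
# Toy whitelist-style HTML filter: only elements on the allow list are
# emitted, so <link rel="import"> is dropped without the filter ever
# having heard of HTML imports. A blacklist written before imports
# existed would have let it through. Illustrative only.
from html.parser import HTMLParser
from html import escape

ALLOWED = {"p", "b", "i", "em", "strong", "ul", "ol", "li", "br"}


class WhitelistFilter(HTMLParser):
    def __init__(self):
        super().__init__(convert_charrefs=True)
        self.out = []
        self.skip = 0  # depth inside suppressed script/style content

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1
            return
        if tag in ALLOWED and not self.skip:
            self.out.append(f"<{tag}>")  # attributes dropped for simplicity

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = max(0, self.skip - 1)
            return
        if tag in ALLOWED and not self.skip:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if not self.skip:
            self.out.append(escape(data))


def sanitize(html):
    f = WhitelistFilter()
    f.feed(html)
    return "".join(f.out)


print(sanitize('<p>hi</p><link rel="import" href="evil.html"><script>x()</script>'))
# -> <p>hi</p>
```

The asymmetry is the whole argument: the whitelist survives new platform features by default, while a blacklist silently becomes incomplete every time one ships.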
Re: HTML imports: new XSS hole?
Yup, like I said, it shouldn't be any worse. From what I've seen with Chrome, at the very least, import links are handled with the same CSP as script tags, which is certainly a good thing. I suppose that if you needed the ability to sandbox them further, you could just wrap them inside a sandboxed iframe. It's a bit ugly, but it works.

On Jun 2, 2014 5:56 AM, "Anne van Kesteren" wrote:
> On Mon, Jun 2, 2014 at 2:54 PM, James M Snell wrote:
> > So long as they're handled with the same policy and restrictions as the
> > script tag, it shouldn't be any worse.
>
> Well,
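The iframe wrapping suggested above might look something like the following sketch, which escapes untrusted markup into a sandboxed srcdoc attribute. The sandbox and srcdoc attributes come from the HTML spec; the helper function is hypothetical.

```python
# Sketch of wrapping untrusted imported content in a sandboxed iframe,
# as suggested above. A bare sandbox attribute (no tokens) blocks script
# execution, forms, and plugins inside the frame; srcdoc carries the
# untrusted markup, escaped so it cannot break out of the attribute.
# The helper name is illustrative, not from any library.
from html import escape


def wrap_in_sandbox(untrusted_html):
    # quote=True also encodes '"' so the payload stays inside srcdoc.
    return '<iframe sandbox srcdoc="{}"></iframe>'.format(
        escape(untrusted_html, quote=True)
    )


print(wrap_in_sandbox("<script>steal()</script>"))
```

It is ugly, as noted, but the untrusted document then runs in a unique opaque origin rather than in the embedding page's origin.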
Re: HTML imports: new XSS hole?
So long as they're handled with the same policy and restrictions as the script tag, it shouldn't be any worse.

On Jun 2, 2014 2:35 AM, "Anne van Kesteren" wrote:
> How big of a problem is it that we're making <link rel=import> as
> dangerous as <script>?