On Tue, Feb 17, 2015 at 5:02 PM, Marcos Caceres <[email protected]> wrote:
> On February 18, 2015 at 12:49:39 AM, Dale Harvey ([email protected]) wrote:
>> The problem is not browser support for CORS, which has for quite a long
>> time had pretty good support. The issue is that there are applications that
>> in order to function require access to arbitrary remote services which we
>> do not control. This is entirely different from the decision (which I
>> disagree with) for TLS to be required for ServiceWorkers (or Geolocation /
>> GUM), where the developer in the typical case has control over the stack and
>> can choose to implement HTTPS; however, the developer cannot control whether
>> arbitrary remote servers implement CORS.
>
> But that's the whole point. We don't want people going around taking other 
> people's content without permission at scale. That's just wrong. If Site A 
> wants to talk to Site B, then they need to establish a CORS relationship.

We do actually want to allow that, but we can't because it would
circumvent the user's firewall. That's the problem with exposing
"systemXHR" to the web, nothing else.


> No, but they will require to do some work. Flipboard is an example of an app 
> that works because of established relationships.
> https://flipboard.com/publishers/faq/

They likely aggregate on the server, so this use case does not really
apply to them, I think.


>> It's worth mentioning that in the case of ServiceWorkers, an entirely new
>> API for network access that supports a workaround for CORS (no-cors
>> requests with opaque responses) had to be implemented.

Actually, what we added for service workers was an API that can
explain <img>, <form>, etc. That simply wasn't needed until now, but
it's not really opening up anything fundamentally new.


-- 
https://annevankesteren.nl/
_______________________________________________
dev-b2g mailing list
[email protected]
https://lists.mozilla.org/listinfo/dev-b2g