Re: [Edbrowse-dev] XHR same-domain restriction

2018-03-13 Thread Kevin Carhart


I'm interested in the discussion that followed here.  I don't have a 
problem with not simply wrapping the fetchHTTP call.  It might be a good 
idea to address this somehow, but I take Karl's point that circumvention 
is only one 'make' away assuming someone knows what make is.
What Dominique is saying about preventing code injection attacks is also 
important. I think part of the problem is that we are simultaneously a little like 
firefox/chrome/IE and a little like wget/curl, or for that matter, like 
'rm' the C program.  The author of rm doesn't write rm and then try to protect 
you from it, do they?  I don't know.  Do we have a savvy audience 
who can be trusted to exercise their own caution, or a mass audience whom 
we ought to protect from bricking their routers?



> I doubt there are restrictions on xhr domains in other browsers.
> If there were such restrictions, one could get around them easily.


I am not saying this hasn't been superseded later on by workarounds, but 
it's part of bedrock, early AJAX information that there is a restriction 
on the domain, or is supposed to be.  W3Schools has a lot of outdated 
pages and is occasionally ridiculed (there was a site called w3fools 
advising not to use it), but here is their basic AJAX 
information which I probably "grew up with", or used as a reference in 
2007 or 2010: "...For security reasons, modern browsers do not allow 
access across domains.  This means that both the web page and the XML file 
it tries to load, must be located on the same server.  The examples on 
W3Schools all open XML files located on the W3Schools domain..."
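
To make that concrete, the kind of request the restriction is about looks 
roughly like this (just a sketch; the URLs are placeholders):

    // page served from http://example.com
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "http://other-site.example/data.xml");
    xhr.onload = function () {
        // only reached if the target is the same origin as the page,
        // or the remote server opted in with Access-Control-Allow-Origin
        console.log(xhr.responseText);
    };
    xhr.onerror = function () {
        // a cross-domain request without that opt-in ends up here
        // in firefox/chrome/IE
        console.log("blocked by the same-origin policy");
    };
    xhr.send();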


So when I reference it as though it's a fact of life, that's where I'm 
getting it from.  After that it bifurcates into what kind of audience 
you're talking about and what is at stake.




Re: [Edbrowse-dev] XHR same-domain restriction

2018-03-12 Thread Dominique Martinet
Hi,

No worries, I agree this isn't easy.
We/you've been working hard to make more sites work, so I don't want to
break those either; let's take the time needed to research this first.

On top of that, I'm a little bit paranoid and like to disable as
much as I can get away with, but we'll need to come up with a decent
interface to display what's blocked and to set exceptions...
I think that'll be the trickiest part here!

Karl Dahlke wrote on Mon, Mar 12, 2018:
> 1. frames

I'm honestly not sure there.
As the website serving the frame, you can say you don't want to be
displayed, so that protects the remote end's resources, but I think we
really need to check if/how dynamic frames would work and what other
kinds of limits there can be.

> 2. The same guy that writes the js, and the html, also sends out the http 
> headers,
> so if he wants xhr to access anything then he just sets that http header to 0 
> and off he goes.
> It's like we put a lock on our browser for some kind of security,
> but they can open it with an http key, and everybody knows it.

I think the main purpose of this is to protect a website from code
injection.
Say you're a forum or some blog with a comment area. The fields the
users can fill in are supposed to be sanitized, but often enough someone
comes up with a way to insert actual html/js code and can then hijack
users' sessions or do whatever else it is they want to do.

http headers are usually set by the web server directly, without regard
to the content, so no matter how badly the site is defaced, if the site
says not to load external stuff then the browser won't load it.
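
For example (values made up, just to illustrate the mechanism), the server
can attach something like this to every response, and the browser enforces
it regardless of what ended up in the page body:

    Content-Security-Policy: default-src 'self'
    X-Frame-Options: SAMEORIGIN

The first line says only fetch resources from our own origin; the second
says only our own pages may put us in a frame.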

The disabled mode must have been added for compatibility.
Ultimately, if some site depends on it by design and hasn't taken the
time to say that code.jquery.com is allowed, it can just set 0 and
things will keep working, even though it isn't protecting itself.
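
Concretely the escape hatch is just another header value, something like
(again illustrative):

    X-XSS-Protection: 0
    Content-Security-Policy: script-src 'self' https://code.jquery.com

The first line turns the filter off entirely; the second is the more
careful route of whitelisting just the one external host.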

> 3. If we implement restrictions, we have to do it all,
> including the http key that unlocks them, because some website might unlock 
> them
> and expect xhr to work on some other domain, and when it doesn't, then
> the website doesn't work.

Definitely agreed there; both headers I pointed at have a disabled mode,
and it should be easy to implement since it matches what we currently do -
basically we just have to make the checks conditional.
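
Something along these lines is what I have in mind, just a sketch with
made-up names, not actual edbrowse code:

    // hypothetical helper: decide whether an xhr may go through
    function sameOrigin(a, b) {
        // compare scheme, host and port of two parsed URLs
        return a.protocol === b.protocol &&
               a.hostname === b.hostname &&
               a.port === b.port;
    }
    function xhrAllowed(pageUrl, targetUrl, policyDisabled) {
        if (policyDisabled)   // the header's disabled mode: behave as we do today
            return true;
        return sameOrigin(new URL(pageUrl), new URL(targetUrl));
    }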


> 4. bar.com -> foo.bar.com

That's likely true, need to check as well.
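
If they should indeed count as the same domain, the check is probably just
a suffix match on the host, roughly (naive sketch; the real rule involves
document.domain and is more subtle, so this needs verifying):

    // is "foo.bar.com" within "bar.com"?
    function subdomainOf(host, base) {
        return host === base || host.endsWith("." + base);
    }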

-- 
Dominique | Asmadeus


Re: [Edbrowse-dev] XHR same-domain restriction

2018-03-12 Thread Dominique Martinet
Karl Dahlke wrote on Mon, Mar 12, 2018:
> I doubt there are restrictions on xhr domains in other browsers. If
> there were such restrictions, one could get around them easily.

Hm, I don't know how it works, but firefox did not make any network
request to pizza.com when loading Kevin's page.


> var d = document.createElement("div");
> d.innerHTML = "<iframe src='http://pizza.com'></iframe>";
> // this fetches the html, just like an xhr request would
> d.firstChild.contentDocument.innerHTML;

That's interesting.
I tried that as well and it refused too; this time I could follow a bit
better, and it looks like the frame is not loaded without some extra
function call, so contentDocument is null.
Maybe it'd work if the iframe were loaded in the original html, though I
don't have the time to test right now.
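
For reference, what I'd want to test is roughly this (my understanding of
how mainstream browsers behave, not verified in edbrowse):

    var f = document.createElement("iframe");
    f.src = "http://pizza.com/";     // cross-domain target, as in Kevin's page
    f.onload = function () {
        // even after loading, contentDocument is null for a cross-domain
        // frame; it is only exposed when the frame is same-origin
        console.log(f.contentDocument);
    };
    document.body.appendChild(f);    // nothing is fetched until the frame
                                     // is actually attached to the document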


> And I know frames can be inter domain, I see those all the time.

There actually are rules for that with content security policy.
For example, with most browsers you cannot make an iframe with the
normal youtube url directly (https://youtube.com/watch?v=something will
not load), but the "embed" url https://youtube.com/embed/something does
work.
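
The mechanism is the remote server saying who may embed it, with headers
along these lines (illustrative; I haven't checked which exact header
youtube sends on the /watch pages):

    X-Frame-Options: SAMEORIGIN
    Content-Security-Policy: frame-ancestors 'self'

Either one tells the browser to refuse to render the page inside a frame
on another site, which is why the /watch url fails while the /embed url,
served without such a restriction, loads fine.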

I'll post now what I wrote over half a year ago; this is not finished
and I do not expect us to do all of it yet, but there are at least two
more http headers that matter for xhr and js:

 - X-XSS-Protection[1], which can take a few different values:

0 will not block anything (what we do right now)

1 (browser default) will attempt to remove "unsafe parts" of javascript.
The logic is not precisely described anywhere I could see, but I guess
code holds the truth, and a few sites linked directly to webkit's
XSSAuditor.cpp[2] file.
I have not read it completely yet, but it looks like it will go over all
the tags in