Alright, I think that is the furthest
we can get for now, and we have to mind the specs!
Thanks Craig.
On 4/7/19 2:01 PM, Craig Francis wrote:
Hi Joris,
I suspect it's just how the web has developed, where the mixing of
JavaScript and imperfect HTML is normal.
I quite like this video as a demo
I agree, that would be a vulnerability.
But I think this is not the core of my question.
I wonder: why do Web developers have to
guess what the Browser considers to be JS
and will execute, and what it won't?
Why can't they just ask the Web Browser to do
that check for them?
That would be more secure because instead
of matching for every possible insertion of JS, you just match
for the closing of the article tag, and if there is such a tag, just
don't display the content.
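The closing-tag check described above could be sketched like this (the function name and the regex are my own illustration of the idea, not a proposed API):

```javascript
// Sketch of the idea above: instead of trying to recognise every
// possible script injection, only check whether the untrusted text
// tries to close the wrapping <article> element and so escape it.
function containsClosingArticleTag(untrusted) {
  // Case-insensitive; tolerates whitespace inside the tag, e.g. "</article >".
  return /<\s*\/\s*article[\s>]/i.test(untrusted);
}

console.log(containsClosingArticleTag('hello <b>world</b>'));        // false
console.log(containsClosingArticleTag('</article><script>evil()'));  // true
```

The appeal is that the check is a single, simple pattern rather than an ever-growing blocklist of script vectors.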
On 4/23/19 1:38 PM, Craig Francis wrote:
Hi Joris,
I think this suffers from the same issue...
onload="disableScripts(document.getElementById('xss_output'))">
* JS Engine:
there could still be something happening
between sanitizing and rendering
on the web page if you just have
a Browser sanitize function.
But if the JS Engine isn't
even enabled, nobody can execute
XSS at that last point.
On 4/24/19 5:22 PM, Craig Francis wrote:
Hi Joris,
I think we should follow
Hey Craig,
I did open this
discussion somewhere else:
https://discourse.wicg.io/t/xss-prevention-in-the-browser-by-the-browser/3518/4
On 4/24/19 5:55 PM, joris wrote:
Yes,
in a way it would do the same job as a sanitizer,
but it is more than that.
I think that a simple sanitize function could
In a previous Mail I talked about
a noscript tag that, if set on an HTML Element, would
direct the Browser not to execute any Scripts inside
that Element, thus behaving as if JS were disabled
globally. But this approach has the disadvantage of
being enabled and disabled entirely in HTML,
thus
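For illustration, one possible reading of that per-element noscript idea is a boolean attribute (this syntax is hypothetical and not part of any spec or browser):

```html
<!-- Hypothetical: "noscript" as a boolean attribute. Inside this
     element the browser would run no scripts, whether inline <script>
     tags or event handler attributes. -->
<article noscript>
  <p>User-generated content goes here.</p>
  <script>alert('XSS');</script>          <!-- would be ignored -->
  <img src="x" onerror="alert('XSS')">    <!-- would be ignored -->
</article>
```

Because the switch lives in the markup itself, injected content that manages to close the element could escape the no-script region, which seems to be the disadvantage alluded to above.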