begin quoting James G. Sack (jim) as of Sun, Jun 25, 2006 at 11:18:20AM -0700:
[snip]
> Well, one of the attractive uses of javascript/ecmascript in recent
> fashion, uses XMLHttpRequest/AJAX to make the client experience more
> interactive, without moving business logic or other traditional server
> functionality into the client.
 
...and to get everyone to turn on javascript in order to _see_ content,
and thus open themselves up to scripting attacks.

> Andy, for example mentioned/alluded-to google maps, which everybody
> seems to classify as a _good thing_.

Not me.

I want to find out how to get somewhere. Someone sends me a google-maps
URL. I can't just follow the url and see... nooo..., I have to have a
freaking interactive experience with my computer. I don't *WANT* an
interactive experience with my computer, I just want to see where
someone _lives_.

> Another example might be in background-validation of forms input, where
> (say) the js might conduct an XMLHttp round trip with the server, to
> determine whether a new account username was valid/already-taken, etc.
> The exceptions could nicely update the browser output (via js DOM
> interface) to report, say "Username already in use; choose another",
> right on the form, while the user is still composing his input.
>
> Well, that's a simplistic example, but nevertheless illustrates an
> interactivity benefit, compared to how it was done last year.

I don't see that much of a benefit; any sort of validation mechanism
better damn well work _without_ javascript, or it's a waste of time. You
still have to validate on the server side, and provide a reasonable
error if it passes in the javascript but fails on the server.
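To make the point concrete, here's a rough sketch (names and rules are made up, not from any real app) of a server-side username check. The js pre-check is just a convenience layer on top of this; the server has to enforce the same rules itself, because the client can have javascript off, or be lying:

```javascript
// Hypothetical server-side check. Whatever the in-browser js does,
// the server must apply the same rules, since the js layer can be
// disabled, bypassed, or simply broken.
const takenUsernames = new Set(["alice", "bob"]);

function validateUsername(name) {
  // Same format rule the client-side js would apply -- but enforced
  // here, where the user can't turn it off.
  if (!/^[a-z][a-z0-9_]{2,15}$/.test(name)) {
    return { ok: false, error: "Invalid username format" };
  }
  if (takenUsernames.has(name)) {
    return { ok: false, error: "Username already in use; choose another" };
  }
  return { ok: true };
}
```

The "reasonable error" part is then just rendering `error` back into the plain HTML form response, so the no-javascript user gets the same message the AJAX user got inline.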

[snip]
> I haven't actually implemented anything of this sort yet, but clearly
> there does have to be additional server functionality added, as well as
> shipping the javascript with the HTML. The js, of course, may be
> optimized via src=".." in the <script> tags, so that common
> functionality needn't actually be embedded in every page, and the
> browser will find it in its cache.

...and the user can't see it when they "view source"...

I think I want a proxy cache -- capture all the javascript in the proxy,
verify and/or modify it, and then when that javascript is requested by
the client, feed it the cached version.  This will, no doubt, infuriate
web-developers who figure it's their machine when their software is
running, dammit.
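The core of that proxy idea, sketched out (everything here is hypothetical, not a real proxy or a real verifier): vet a script once on first fetch, cache the vetted copy, and serve that copy thereafter no matter what the site is serving today:

```javascript
// Hypothetical verifying js proxy cache. First request for a script
// URL runs it through a local verifier/rewriter and caches the
// result; later requests get the vetted cached copy, not whatever
// the origin server currently feels like sending.
const cache = new Map();

function vet(source) {
  // Stand-in for real verification/modification; here we just strip
  // a pattern the proxy's owner has decided to disallow.
  return source.replace(/document\.cookie/g, "undefined");
}

function proxyFetch(url, fetchFromOrigin) {
  if (!cache.has(url)) {
    cache.set(url, vet(fetchFromOrigin(url)));
  }
  return cache.get(url); // always the vetted, cached version
}
```

A side benefit: the cache is also a record of exactly what javascript ran on your machine, which bears on the auditability complaint below.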

> I have nagging suspicions about javascript security issues (where is the
> audit evidence?), but conventional wisdom seems to be that the
> javascript sandbox barriers are _pretty good_. Anyone care to add
> insight to this?
 
"Conventional wisdom" being what?  Javascript/ECMAScript advocates
waving their hands, saying "just trust us" and "we don't see any real
problems"?

Finding more and more of the web less and less useful, as content is
hidden by developers gratuitously using javascript; our need for the
data increases the risks we're willing to take, and therefore the
sandbox is "pretty good"?

When a vendor sells you software on a CD, and that software does
something terribly damaging... you have the CD. You can, in theory,
hand it to an expert, who can determine that it's that vendor's
software that screwed you.

When I download and run software off the web, such as with Javascript,
and something goes terribly wrong.... I have nothing. The next download
may not include the same code, so I have no record of what actually ran
on my computer. Can't audit --> no accountability.

Remember, "conventional wisdom" says that it's a good idea to send
around M$Word and M$Excel attachments, to run macros when you load a
file, to run programs on a CD when you insert a CD, etc. etc.

-- 
_ |\_
 \|


-- 
[email protected]
http://www.kernel-panic.org/cgi-bin/mailman/listinfo/kplug-list
