On 2/7/12 3:59 PM, Matthew Wilcox wrote:
> Fair enough. This then becomes a cost/benefit issue. But there's nothing to
> stop this from working if the user's default is opt-out and a prompt is
> given to allow it. In exactly the same way that things currently work for
> geolocation data. Right?

Maybe.  That's pretty intrusive UI...  and I'm not a UI designer.

>> Just wait until UAs start shipping with JS disabled for certain domains by
>> default, just like they're already shipping with other capabilities turned
>> on or off for certain domains by default.

> I browse with NoScript and regularly come across people that haven't coded
> sites responsibly (it drives me nuts). While I applaud this notion because
> it'll bring more attention to the issue, I highly doubt it'll have any
> impact on the larger spectrum of sites. JS is now relied on, and if certain
> UAs deliver a broken web page to users, the users are just going to blame
> the software, not the website. Why? Because the site works in every *other*
> browser, including their old ones...

I did say "certain domains". Think "script disabled for Facebook Like buttons by default". ;)

> They should be. But at the same time a hell of a lot of people have read
> that Facebook watches their every move, and yet they happily continue to
> use it. Users don't weigh the issue in the same way we do.

A lot of them don't, true.

> Agreed! It's still my inclination to default to more permissive things,
> though. If people build poor sites, users stop going, the site fails, and
> either the author learns (I hope) or switches to a job with lower skill
> requirements. It's like in business: if you're not providing a decent
> product, you don't survive. The government doesn't just disallow people
> from making rubbish with the resources they have.

A problem comes when "too big to fail" sites are coded somewhat incompetently. Things like Google's web properties, Yahoo, Microsoft, eBay. Heck, CNN (which is coded _very_ incompetently, imo). For sites like that, the user is more likely to switch browsers or even devices than to use a different site... Sometimes there is no "different site": if MSDN breaks in your browser, what alternate documentation repository will you use, exactly? If you use gmail and it breaks in your browser, then what?

>> Plenty of people have been producing websites of ... varying quality ...
>> for a decade or more and surviving just fine.

> In that case I completely agree. But the answer is to educate them to test
> the right thing.

This is a _huge_ undertaking. We (Mozilla) don't have the manpower for it; we've done evangelism and education in the past, and it's _very_ labor-intensive. Opera has literally dozens of people working full-time on this sort of thing and they say they don't really have the manpower for it either.

The real answer needs to be to build things that are easier to use right than to use wrong, because developers will almost always choose the path of least resistance. And I can't blame them.

> Ooo, interesting. OK, doesn't SPDY allow pushing content and keeping
> connections open? Can't we hijack that? And, given that any and all new
> info is triggered by a request of some sort, why wouldn't the browser send
> updated headers with those requests? (Again, this may be a daft question;
> I'm not familiar with how this stuff works at any real level of detail.)

I'm not familiar enough with SPDY to comment intelligently.
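Setting SPDY itself aside, the second half of that question, a browser
re-sending up-to-date information as headers on later requests, can at least
be approximated from page script today. A rough sketch, with the caveat that
the "Viewport-Width" header name here is invented for illustration and is not
defined by SPDY or anything discussed in this thread:

    // Sketch only: simulate a UA hint by attaching the current viewport
    // width to a request as a header. The header name is hypothetical;
    // a server would have to opt in to reading it.
    async function requestWithHint(url: string): Promise<Response> {
      return fetch(url, {
        headers: { "Viewport-Width": String(window.innerWidth) },
      });
    }

    // Any later request (triggered by a resize, a navigation, etc.) then
    // carries the updated value, which is the behavior the question asks
    // about.
    window.addEventListener("resize", () => {
      void requestWithHint("/images/hero.jpg");
    });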

> Interesting example. OK, how could this be fixed? Could hooks not be
> provided for JS so that the site author could watch for these changes and
> re-request as needed?

That might be the right way to go, yeah...  Now we're getting somewhere.  ;)
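For concreteness, one plausible shape for such hooks is a script-visible
media-query listener: watch for the change, then re-request the appropriate
asset. A minimal sketch, where the query, element ID, and image URLs are all
made up for illustration:

    // Sketch: react to a media-query change by swapping in a different
    // asset, re-requesting only when the query actually flips.
    const mq = window.matchMedia("(min-width: 600px)");

    function loadFor(matches: boolean): void {
      const img = document.querySelector<HTMLImageElement>("#hero");
      if (img) {
        img.src = matches ? "/hero-large.jpg" : "/hero-small.jpg";
      }
    }

    mq.addEventListener("change", (e) => loadFor(e.matches));
    loadFor(mq.matches); // pick the right asset on initial load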

-Boris
