Allen Wirfs-Brock <mailto:[email protected]>
January 27, 2012 11:41 AM
On Jan 26, 2012, at 4:57 PM, Brendan Eich wrote:
<script src="someLib.js" type="application/javascript" async="async" id="lib1"></script>
<script src="anotherLib.js" type="application/javascript" async="async" id="lib2"></script>
<script src="baseApp.js" type="application/javascript" async="async" id="base" after="lib1"></script>
<script src="extraApp.js" type="application/javascript" async="async" after="lib2,base"></script>
Because that's brittle and slow -- you don't care about order, you typically 
just want to race and exploit as much parallelism as possible, given host 
CPU(s), the intervening network, TCP connection sharing/limits/etc.

But the ordering dependencies are real.

This is not the right argument. Developers generally do not need to partially or totally order dependencies except in the degenerate partial sense of "all must load before I can rely on my functions". There are exceptional cases of progressive loading and dependency, but those are handled via non-async <script> vs. element-with-event-handler order.

Developers know how to turn on their app when DOMContentLoaded or whatever fires. That means most async script load scheduling can and should be left to the browser implementation.
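That pattern can be sketched as a small readiness gate. `initApp` here is a hypothetical entry point, and the `doc` parameter stands in for the browser's `document` (it just makes the sketch testable outside a browser):

```javascript
// Hedged sketch: start the app once the DOM is ready, instead of relying
// on script load order. `initApp` is a hypothetical app entry point.
function onReady(doc, initApp) {
  if (doc.readyState !== "loading") {
    initApp(); // DOM already parsed; run immediately
  } else {
    doc.addEventListener("DOMContentLoaded", initApp, { once: true });
  }
}
```

In a browser one would call `onReady(document, initApp)` and let the engine race the async script loads however it likes.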

  If you don't explicitly identify them and they also aren't implicitly 
recognized then your application will be unreliable.

People build reliable web apps all the time using onload et seq.


I'm pretty sure the collisions-are-rejected thing is going to burn people. It's the 
opposite of what people not only abuse (unknown latent bugs), but absolutely 
use and rely on today, with var and function among scripts.
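The contrast, as the semantics eventually shipped, can be seen by evaluating two small "scripts" that share one scope. This is a sketch of the general rule, not of any one proposal:

```javascript
// var tolerates redeclaration -- the multi-script status quo:
eval("var x = 1; var x = 2;"); // no error

// let does not: a duplicate binding in one scope is an early error,
// surfaced as a SyntaxError when the code is parsed.
let rejected = false;
try {
  eval("let y = 1; let y = 2;");
} catch (e) {
  rejected = e instanceof SyntaxError;
}
```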

With STL we would be saying that if you need redefinition/multiple-definition 
behavior you need to continue to use var and function, or you need to refactor 
to use modules/export/import.  But you can't use naked let/const.

I understand the proposal, but my response is that instead of a migration path from var to let at top-level (non-strict), you're making a higher barrier: var to let-in-module and proper modularization.

Not the end of the world, mind you -- plausible and perhaps winning on balance.

  Some people would get burned by this and conclude that var is always 
preferable to let.  But others would learn about modules.  In either case, we 
would have a consistent semantics for let/const.  BTW, we should make sure that 
modules can actually support the use cases that need such multi/redefinitions.  
I think it is mostly various ways of doing polyfill-like things.

Yup. Those will use var and global property testing for a while to come.
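A sketch of that style, with a hypothetical `myTrim` standing in for whatever is being polyfilled (`globalThis` is the modern spelling of the global object; in 2012 this would have been `window` or `this`):

```javascript
// Test the global first, then define it, so loading the shim twice --
// or loading it after a native version appears -- is harmless.
if (typeof globalThis.myTrim !== "function") {
  globalThis.myTrim = function (s) {
    return s.replace(/^\s+|\s+$/g, "");
  };
}
var trimmed = globalThis.myTrim("  hi  ");
```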

It also raises issues like: what scope does an indirect eval use?  I haven't heard 
anyone recently advocating for TLpSi.  You can achieve almost the same thing 
using SQ by wrapping a block around each script body.
Right, although TLpSi is still attractive as a parallel to function bodies, 
where we agreed body-level let has to bind in a body block that shadows 
parameters and vars. Not much of a plus but it's a plausible alternative.

Did you just say that this is legal:

function f() {
     var b;
     let b;
}

and interpreted as:

function f() {
     var b;
     { let b; }
}

I didn't say that. Your

https://mail.mozilla.org/pipermail/es-discuss/2012-January/019817.html

has two alternative rules; either way, var vs. let at top level for a given name is an error.

I kind of like TLpSi, but I think it would be counterintuitive for web 
developers who write inline scripts such as:

<script>
const debug = true;
</script>
<!-- a bunch of html -->
<script>
if (debug) {...}
</script>

If we only had src based scripts it might be the right thing...

Yes, making either or both of those out-of-line may change developer expectations. TLpSi seems better with any out-of-line (out of sight is out of mind!) script.

In general, when I start thinking about the ways that lexical declarations in 
separate scripts might interact, I run into issues that are probably best 
resolved using modules. I'm inclined to favor the simpler solution and leave it 
to modules to deal with managing actual interdependencies.
Is STL the simplest solution for users?

It avoids issues like:

<script>
const debug = true;
function f1 () {
     if (debug) log(...);
     ...
}
</script>
<script>
const debug = false;
function f2 () {
     if (debug) log(...);
     f1();     // but this logs regardless
     ...
}
</script>
<script>
let debug = false;  // this has no effect
f1();
f2();
</script>

The behavior of this under TLpSn is perfectly normal given lexical scoping.  
However, there is no physical nesting, so it is likely to surprise many 
users.
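The surprise can be reproduced today by writing the three script bodies as physically nested blocks, which is effectively what TLpSn implies (function expressions stand in for the declarations above):

```javascript
// The three <script> bodies above, modeled as nested blocks.
// Each inner `debug` shadows the outer one, but each function closes
// over the `debug` in force where it was defined.
const calls = [];
let f1, f2;
{
  const debug = true;
  f1 = () => { if (debug) calls.push("f1 logged"); };
  {
    const debug = false;
    f2 = () => {
      if (debug) calls.push("f2 logged");
      f1();               // logs regardless: f1 sees the outer debug
    };
    {
      let debug = false;  // shadows again; no effect on f1 or f2
      f1();
      f2();
    }
  }
}
// calls ends up ["f1 logged", "f1 logged"]: f1 logged twice, f2 never
```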

But a nasty hard error with STL is better? Again, out-of-line scripts and hacking modally, file by file, may just leave users frustrated. Module bodies are file contents and (without export) have isolated let and const at top level.

/be
_______________________________________________
es-discuss mailing list
[email protected]
https://mail.mozilla.org/listinfo/es-discuss
