On 11/01/2013 08:26 AM, Jason Orendorff wrote:
> This proposal is before TC39 for inclusion in the next ECMAScript spec
> edition following ES6:
>   http://wiki.ecmascript.org/doku.php?id=strawman:weak_references
> 
> Mozilla GC hackers are opposed, for reasons they can articulate; I'm
> opposed because I can't stick the nondeterminism and because the total
> cost is out of all proportion with the benefit.
> 
> However. There are use cases. Please see:
>   https://mail.mozilla.org/pipermail/es-discuss/2013-February/028572.html
> and subsequent messages.
> 
> Also, though I think this use case is weaker:
>   https://mail.mozilla.org/pipermail/es-discuss/2013-March/029423.html

I would agree that pretty much all the use cases I've seen have value:
GC is a very handy form of resource management and it is nice not to
have to re-invent it at the program level.

However, I think this proposal is a much more significant change than
most people realize. Right now JS GC is unspecified magic that happens
behind the scenes to prevent OOM. This proposal would change the JS GC
into a generic resource management framework. I'm actually fine with
this, but please understand that it will impose some significant
constraints on how we can evolve the GC.

First, performance: this particular proposal would force us to visit
objects that are being swept. Our nursery design is currently such that
this is not even possible. This is already problematic for weak maps:
we can work around it with a few hacks, but only because we can still
fall back to full mark-and-sweep GCs at some point. If we moved to a
more distributed architecture like G1, this would be a severe
limitation, and I'm not even sure how a concurrent GC would handle
weak maps efficiently.

The performance of the web is vital. With gaming and video as
first-class citizens, we have to consider both the throughput and
latency of the GC as priorities. Any added complexity in the GC will
either directly make the web slower or will disproportionately divert
our energy from making other things faster.

Secondly, correctness. The GC is by definition a cross-cutting concern;
you cannot build anything in SpiderMonkey without considering GC. This
makes weak primitives a cross-cutting concern of a cross-cutting
concern. Our experience with combining generational GC and WeakMaps
reflects this.

When implementing generational GC, our first few attempts tried to
handle weak maps efficiently. Unfortunately, this turns out to be
impossible in the presence of nested weak maps: we cannot reach a
fixed point while visiting only the nursery heap. Sadly, even after we
realized this, we still had to spend a tremendous amount of effort
merely proving to ourselves that our design was safe in the presence
of weak maps.

The GC is a bad place to add complexity: any error in the GC leads to an
impossible-to-debug top-crasher with a sec-crit rating. We can certainly
deal with this -- we have so far -- but it takes a disproportionate
amount of work.

That said, we absolutely do want to create a resource management
framework that can enable the sort of neat implementations that people
are envisioning for this proposal. I just believe there /must/ be a
better way to bring that to the web than piggy-backing on the GC.

Cheers,
Terrence

> If you're a GC hacker and you want to stop this train, your best bet is
> to read those threads and speak up now.
> 
> -j
> 
> _______________________________________________
> dev-tech-js-engine-internals mailing list
> dev-tech-js-engine-internals@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-tech-js-engine-internals
> 
