Value types are supposed to be implementable in hardware if available. You
are assuming they are objects with reference semantics. That contradicts
their purpose and why we are discussing them as a new kind of type.
I was talking about decimal proxies, not the decimal primitives themselves.
Nothing prevents implementing the === operator for decimal primitives,
because that happens on the interpreter side. For backward compatibility
both Decimal() and new Decimal() would be proxies, and again there is
no point in implementing === on the programmer's side, because Decimal()
would be a singleton proxy and new Decimal() would create a copy of the
singleton proxy, so we get the desired behavior: Decimal(1.1) ===
Decimal(1.1), but Decimal(1.1) !== new Decimal(1.1).
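A minimal sketch of what I mean, in today's JS (Decimal and the cache here
are hypothetical, just to illustrate the singleton-proxy idea, not any
actual spec'ed API):

  // Illustrative only: plain calls return an interned singleton per value,
  // while `new Decimal(...)` makes a fresh frozen copy each time.
  const decimalSingletons = new Map();

  function Decimal(value) {
    const key = String(value);
    if (new.target) {
      // new Decimal(...): a distinct frozen object every time
      return Object.freeze({ decimalValue: key });
    }
    // Decimal(...): the interned singleton for this value
    if (!decimalSingletons.has(key)) {
      decimalSingletons.set(key, Object.freeze({ decimalValue: key }));
    }
    return decimalSingletons.get(key);
  }

  Decimal(1.1) === Decimal(1.1);      // true  -- same singleton
  Decimal(1.1) === new Decimal(1.1);  // false -- fresh copy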
Creating a new proxy each time would cause excessive memory overhead,
especially considering decimals.
Not so. IEEE754r fits in 128 bits and that is efficiently passed and returned
by value, compared to heap-boxing with copy on write or eager heap allocation
and copying -- or as you seem to propose, memoization (which is a deal-killer
in performance terms). Yes, literals should be interned or memoized. Not so
every computed intermediate or final result!
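To spell out the cost, memoizing every computed result means something like
the following (names purely illustrative): a table lookup and a retained
entry per distinct value on every operation, versus simply copying a 128-bit
value.

  // Purely illustrative: interning every computed decimal result implies a
  // global table consulted (and grown) on each arithmetic operation.
  const internTable = new Map();

  function internResult(d) {
    const key = d.toString();                // extra work on every operation
    if (!internTable.has(key)) {
      internTable.set(key, Object.freeze(d)); // entries are never collected
    }
    return internTable.get(key);
  }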
Again, I was talking about proxies, not primitives like decimal itself.
Primitives are singletons by default. I meant the _proxy_ for a
decimal object should be implemented as a singleton (yes, in future JS
we could use literals like 1.1m, but anyway that is not
backward-compatible).
Additionally, in JS something like 2 === new
Number(2) returns false,
Now you are comparing apples to oranges. new Number(2) is explicitly
creating an object with reference type semantics. No one is proposing a new
Decimal('1.1') that would do differently, but what we do propose is a
literal 1.1m that does not heap-allocate a mutable object.
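For reference, the Number wrapper behavior in today's JS that this
comparison rests on:

  2 === new Number(2);              // false: the wrapper is a distinct heap object
  2 ==  new Number(2);              // true:  == unwraps the object via valueOf()
  typeof 2;                         // "number"
  typeof new Number(2);             // "object"
  new Number(2) === new Number(2);  // false: two different references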
1.1m would be a synonym for Decimal('1.1'). 1.1m would return the actual
primitive (which is not backward-compatible), and Decimal('1.1') would
return either a frozen singleton proxy _or_, in the future, an actual
primitive indistinguishable from what 1.1m produces.
so in my opinion it is not a good idea to let the
programmer define ===, even for decimal proxies. And if we let the
programmer create fresh 1.1m objects there would be no way to
distinguish them, precisely because of the trapped ===.
So? Why do you need to distinguish 1.1 from compute_1_dot_1() today with
IEEE 754 binary double-precision floating point? You don't.
I meant distinguishing objects like Decimal('1.1') and new Decimal('1.1').
The problem is objects in JS are reference types currently. Value types or
proxies would be objects that, by virtue of being shallowly frozen, say,
could be compared by reference or property values. This needs care in
spec'ing since the shallow vs. deep freezing may not cover value types with
deeper structures that nevertheless want to be compared by value. Deep
comparison is possible of course, just more work.
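To make the shallow-vs-deep point concrete with plain objects (Object.freeze
is shallow):

  const v = Object.freeze({ unit: 'm', parts: { hi: 1, lo: 2 } });
  Object.isFrozen(v);        // true
  Object.isFrozen(v.parts);  // false -- the freeze does not reach nested objects
  v.parts.lo = 3;            // still allowed, so comparing such a value
                             // "by property values" has to walk the structure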
Exactly. === should be used only as a reference comparator (primitive
types behave, in a sense, like singletons).
You are assuming your conclusion. Value types are not reference types, please
re-read the strawmen.
I'm simplifying things, sorry. But anyway, currently value types and
reference objects behave the same way as far as the === operator is
concerned: === does not inspect the object in any complicated way, it
just checks whether the references or the primitive values are the same.
And that should stay.
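Concretely, the current behavior I want to keep:

  'a' + 'b' === 'ab';   // true  -- primitive strings compare by value
  1.1 === 1.1;          // true  -- primitive numbers compare by value
  ({}) === ({});        // false -- distinct object references
  const o = {};
  o === o;              // true  -- same reference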
I'm not saying I understood the strawman entirely. I'm just saying that
allowing programmers to create a trap for === would not be a brilliant
idea. Actually, I'm going to read the strawman once more, right now :-)
adam.