Michel Fortin wrote:
On 2008-11-26 10:24:17 -0500, Andrei Alexandrescu
<[EMAIL PROTECTED]> said:
Also consider:
auto delta = a1.length - a2.length;
What should the type of delta be? Well, it depends. In my scheme that
wouldn't even compile, which I think is a good thing; you must decide
whether prior information makes it an unsigned or a signed integral.
In my scheme it would give you a uint. You'd have to cast to get a
signed integer... I see how it's not ideal, but I can't imagine how it
could be coherent otherwise.
auto diff = cast(int)a1.length - cast(int)a2.length;
Actually, there's no solution.
Imagine a 32-bit system where a single object can be larger than 2GB (not possible on Windows AFAIK, but theoretically possible). Then if a1 is 3GB and a2 is small, delta is about 3GB and cannot be stored in an int. If instead a2 is 3GB, the result is negative and so requires a signed int for storage.
==> I think length has to be an int. It's less bad than uint.
Perhaps we could add a "sign" property to uint and an "unsign" property
to int that would give you the corresponding signed or unsigned value,
and which could do range checking at runtime (enabled by a compiler flag).
auto diff = a1.length.sign - a2.length.sign;
And for the general problem of "uint - uint" giving a result below
uint.min, as I said in my other post, that could be handled by a runtime
check (enabled by a compiler flag), just like array bounds checking.
That's not bad.
Fine. With constants there is some mileage that can be squeezed. But
let's keep in mind that that doesn't solve the larger issue.
Well, by making implicit conversions between uint and int illegal, we're
solving the larger issue. Just not in a seamless manner.
We are of one mind. I think that constants are the root cause of the
problem.