Steven Schveighoffer wrote:
On Thu, 16 Jul 2009 08:49:14 -0400, bearophile
<[email protected]> wrote:
I'm playing with the new D2 a bit; this comes from some real D1 code:
void main(string[] args) {
    int n = args.length;
    ubyte m = (n <= 0 ? 0 : (n >= 255 ? 255 : n));
}
At compile-time the compiler says:
temp.d(3): Error: cannot implicitly convert expression (n <= 0 ? 0 : n >= 255 ? 255 : n) of type int to ubyte
You have to add a silly cast:
void main(string[] args) {
    int n = args.length;
    ubyte m = (n <= 0 ? 0 : (n >= 255 ? 255 : cast(ubyte)n));
}
In theory, if the compiler gets smarter, such a cast can become unnecessary.
I don't see how; doesn't this require semantic analysis to determine
whether the implicit conversion is safe? I think you are asking too much
of the compiler. What if the expression were instead a function call?
Should the compiler look at the function's source to determine whether
its result can fit in a ubyte? Where do you draw the line? I think the
current behavior is fine. The D1 code probably works not because the
compiler is 'smarter' but because it blindly truncates the data.
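To make the function-call case concrete, something along these lines
(the helper name is invented purely for illustration; it is not from
anyone's real code):

int clampToByteRange(int n)  // hypothetical helper, named only for this example
{
    return n <= 0 ? 0 : (n >= 255 ? 255 : n);
}

void main(string[] args)
{
    int n = cast(int) args.length;
    // All the caller sees here is a call that returns int; proving the
    // result always fits 0..255 would mean inspecting the callee's body.
    ubyte m = cast(ubyte) clampToByteRange(n);
}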
Perhaps if it were an optimization it could be implemented, but the
result of an optimization cannot change the validity of the code... In
other words, it couldn't be just a compiler feature; it would have to be
part of the spec, which would mean all compilers must implement it.
BTW, I think the cast is a perfect requirement here -- you are saying:
yes, I know the risks and I'm casting anyway.
-Steve
He's saying the cast shouldn't be required, as the code guarantees that
the value being assigned will fit into a ubyte without loss of
information.
Perhaps it's too much to ask. I'm not sure. I don't think he's sure.
But if he doesn't ask, he won't find out. (And it sure would be nice to
avoid casts in situations analogous to that.)
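For what it's worth, the entailment can be spelled out as a runnable
check rather than anything a compiler currently does (the test values
below are just an illustrative sample, not part of the original code):

void main()
{
    // For any int n, the clamped conditional lands in 0 .. 255,
    // i.e. it always fits a ubyte without loss of information.
    foreach (n; [int.min, -1, 0, 1, 128, 254, 255, 256, int.max])
    {
        int r = n <= 0 ? 0 : (n >= 255 ? 255 : n);
        assert(r >= 0 && r <= 255);
    }
}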