On Saturday, 19 March 2016 at 10:01:41 UTC, Basile B. wrote:
On Saturday, 19 March 2016 at 09:33:25 UTC, tsbockman wrote:
[...] The reason that *attempting* such a comparison produces
such weird results, is because the signed value is being
implicitly cast to an unsigned type.
Yes, and the opposite is what should happen: when signed
and unsigned values are mixed in a comparison, the unsigned
value should be implicitly cast to a wider signed type. Then
it works!
- https://issues.dlang.org/show_bug.cgi?id=15805
- https://github.com/BBasile/iz/blob/v0.5.8/import/iz/sugar.d#L1017
I have no problem with C++ compilers complaining about
signed/unsigned comparisons. It sometimes means you should
reconsider the comparison, so it leads to better code.
The better solution is to add 7-, 15-, 31-, and 63-bit unsigned
integer types that safely convert to signed (this is what Ada
does) and to remove implicit conversion for the unsigned 8-, 16-,
32-, and 64-bit integers.