On Thu, 15 May 2014 10:42:00 -0400, David Nadlinger <[email protected]>
wrote:

> On Thursday, 15 May 2014 at 13:42:58 UTC, Steven Schveighoffer wrote:
>> On Thu, 15 May 2014 09:24:54 -0400, Ola Fosheim Grøstad
>> <[email protected]> wrote:
>>> That's the wrong attitude to take when it comes to the compiler and
>>> runtime.
>>
>> There are always ways around the guarantees. This one isn't as
>> detectable, since there is no "red-flag" cast or attribute. But I see
>> no utility in such code.
>
> I have to agree with Ola here. If you write a piece of pure, @safe
> code, there should be no way for compiler optimizations to make your
> code behave differently.

In general, I would say any code that behaves differently after
optimization is undesirable. But in this case, you have ignored the
rules, and the result is not exactly incorrect or unsafe. In fact, you
can't really argue that it's invalid: randomBit could legitimately
always return true or always return false, even in a non-optimized
program.
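
To make that concrete, here is a sketch of the kind of function I mean
(illustrative only, not necessarily the exact code from upthread):

bool randomBit() pure @safe
{
    auto a = new int;
    auto b = new int;
    // No cast, no dereference, no pointer arithmetic. But the result
    // depends on which addresses the allocator hands back, and the
    // language does not specify their relative order.
    return a > b;
}

An optimizer that caches the result of the strongly pure call, or a
moving collector, can change what this returns, without any rule being
visibly broken.
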
The question here is: can we come up with a static rule that is
effective but not cripplingly prohibitive?

> This is not only because implicitly allowing unsafe code would be
> against the design philosophy behind D, but also because attribute
> inference might tag functions as pure without the programmer
> explicitly asking for it.

So far, I haven't seen code that's unsafe only if optimized. Have an
example?

>> In any case, the alternative is to have D pure functions avoid using
>> pointers. It's as drastic as that.
>
> I'd suspect that it is enough to disallow using the content of
> pointers explicitly, i.e. treating them as a sequence of bytes rather
> than as a handle to an object.

This means format("%x", ptr) isn't allowed to be pure?

What about calculating index offsets? Note that pointer arithmetic
between two pointers into the same block of memory is perfectly
legitimate.
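
Something along these lines, for instance, seems entirely reasonable to
mark pure (a sketch; offsetOf is a made-up name):

size_t offsetOf(T)(const(T)* base, const(T)* p) pure
{
    // Assumes both pointers refer into the same allocation. The
    // difference is then a well-defined element offset, not a use of
    // the pointers' raw bit patterns.
    return cast(size_t)(p - base);
}
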
I would expect that someone should be able to write a type equivalent
to a slice, and it should be allowed to be pure.
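
Roughly like this (a rough sketch of what I have in mind, with bounds
checks reduced to asserts):

struct MySlice(T)
{
    private T* ptr;
    private size_t len;

    // Indexing dereferences only within the single block we point into.
    ref T opIndex(size_t i) pure
    {
        assert(i < len);
        return ptr[i];
    }

    // Slicing is pointer arithmetic within one block: the "handle" use
    // of a pointer, never its bit pattern.
    MySlice opSlice(size_t lo, size_t hi) pure
    {
        assert(lo <= hi && hi <= len);
        return MySlice(ptr + lo, hi - lo);
    }

    @property size_t length() const pure { return len; }
}

If a static rule for pointers in pure code can't accommodate a type
like this, I'd call it too prohibitive.
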
-Steve