```d
import std.stdio;

void main()
{
    immutable int x = 10;
    int* px = cast(int*)&x;
    *px = 9;
    writeln(x);
}
```
It prints 10, where I expected 9. This is on Windows. I'm curious if anyone knows why this happens.
Essentially, because x is immutable, the compiler is free to constant-fold writeln(x) into writeln(10) at compile time; this happens even without -O. Casting away immutable and writing through the pointer is undefined behavior in D, so the compiler never has to account for that store.
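If you actually need to mutate the value, declare it mutable instead of casting away immutable. A minimal sketch of the well-defined version:

```d
import std.stdio;

void main()
{
    int x = 10;     // mutable, so the compiler must reload it after the store
    int* px = &x;
    *px = 9;        // a perfectly legal write
    writeln(x);     // prints 9
}
```

With a mutable x the write through px is ordinary defined behavior, and writeln observes the updated value on any conforming compiler.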