--- Comment #1 from Don <> 2012-06-12 09:48:41 PDT ---
This behaviour is intentional. Pointer operations are strictly checked in CTFE.
It's the same as doing 

int n = 0;
char c = ""[n];

which generates an array bounds error at runtime.

Is the terminating null character still in the spec? A long time ago it was in
there, but now I can only find two references to it in the current spec (in
'arrays' and in 'interfacing to C'), and they both relate to printf. 

The most detailed is in 'interface to C', which states:
"string literals, when they are not part of an initializer to a larger data
structure, have a '\0' character helpfully stored after the end of them."

which is pretty weird. These funky semantics would be difficult to implement in
CTFE, and I doubt they are desirable. Here's an example:

const(char)[] foo(char[] s) { return "abc" ~ s; }

immutable bar = foo("xyz"); // becomes a string literal when it leaves CTFE

bool baz() {
    immutable bar2 = foo("xyz"); // local variable, so isn't a string literal.
    return true;
}
static assert(baz());

---> bar is zero-terminated, bar2 is not, even though they had the same
assignment. When does this magical trailing zero get added?

I think you could reasonably interpret the spec as meaning that a trailing zero
is added to the end of string literals by the linker, not by the compiler. It's
only in CTFE that you can tell the difference.
