On Tuesday, 2 July 2013 at 11:26:59 UTC, Michal Minich wrote:
On Tuesday, 2 July 2013 at 10:09:10 UTC, John Colvin wrote:
The reason you can't implicitly convert a string to a char[] is because string is an alias to immutable(char)[]

Therefore, you would be providing a mutable window to immutable data, which is disallowed unless you *explicitly* ask for it (see below).
would be -> might be

no, "would be" is correct. char is mutable, immutable(char) is not. By definition char[] is a mutable view of data, immutable(char)[] is not.
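To make the distinction concrete, here's a minimal sketch (variable names are just illustrative):

```d
void main()
{
    string s = "hello";          // string is alias for immutable(char)[]

    // char[] m = s;             // compile error: cannot implicitly convert
                                 // immutable(char)[] to char[]

    immutable(char)[] ok = s;    // fine: exactly the same type as string
    char[] copy = s.dup;         // fine: .dup makes a fresh mutable copy
    copy[0] = 'H';               // mutating the copy is perfectly safe
    assert(copy == "Hello");
    assert(s == "hello");        // the original data is untouched
}
```

If you actually want a mutable buffer, `.dup` is the right tool, not a cast.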


You can explicitly cast string to char[], but it is undefined behaviour to modify the contents of the result

ok

(i.e. don't cast away immutable unless you absolutely have to, and even then be very, very careful).
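For example (the write here is commented out because executing it is undefined behaviour):

```d
void main()
{
    string s = "hello";
    char[] w = cast(char[]) s;   // compiles, but w now aliases immutable data
    assert(w.ptr is s.ptr);      // no copy was made: same underlying memory

    // w[0] = 'H';               // undefined behaviour: string literals may
                                 // live in read-only memory; this can segfault
}
```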

This is not correct. You should not cast when you 'absolutely have to', but when you can prove, by means not available to the compiler, that the data being cast is in fact mutable. And the 'proving' of this is what you should be careful about.

Eek... no, unless I'm misunderstanding you or you made a typo, that's not the case at all.

Don't mutate anything that has a live (and used?)* immutable reference to it.

Just because the data was mutable before being cast to immutable, doesn't mean you can just cast away immutable and mutate it.


Or, alternatively presented:

You can cast to or away from immutable when you can prove, by any means, that the data being cast is not mutated through any reference after the creation of, or before the final use of, any immutable references.

Erring on the safe side, you might want to wait for the immutable references to be actually dead and gone before any mutation.
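The one pattern that clearly satisfies this rule is the mutable-to-immutable hand-off, where you give up the mutable reference entirely. Phobos even provides a helper for it, std.exception.assumeUnique:

```d
import std.exception : assumeUnique;

void main()
{
    char[] buf = new char[5];
    buf[] = "hello";               // fill the buffer while it's mutable

    // buf is the only reference to the data, so we can hand ownership
    // to an immutable view; assumeUnique casts and nulls buf for us:
    string s = assumeUnique(buf);

    assert(buf is null);           // no mutable reference survives the cast
    assert(s == "hello");
}
```

Because `buf` is nulled, no mutable reference can outlive the immutable one, which is exactly the "dead and gone" condition above, just applied in the other direction.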


* Would like some clarification on whether it is the usage of the immutable reference or the lifetime that counts, if anyone knows for sure.
