On Sunday, 27 October 2019 at 12:44:05 UTC, Per Nordlöw wrote:
> In which circumstances can a `char` be initialized a non-7-bit value (>= 128)? Is it possible only in non-@safe code?

In all circumstances: `char`'s default initializer is 0xFF (255), which is deliberately an invalid UTF-8 code unit.

char a; // a == 255
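To make that concrete, here is a minimal sketch (assuming a current DMD; the wider character types follow the same convention):

```d
// char.init is 0xFF, an invalid UTF-8 code unit, chosen so that
// uninitialized chars are caught rather than silently decoded as ASCII.
void main()
{
    char a;                        // default-initialized
    assert(a == 0xFF);
    assert(char.init == '\xFF');

    // The wider character types default to invalid code units too:
    assert(wchar.init == 0xFFFF);
    assert(dchar.init == 0xFFFF);
}
```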

> And, if so, what will be the result of casting such a value to `dchar`? Will that result in an exception or will it interpret the `char` using a 8-bit character encoding?

It will just treat the numeric value as a Unicode code point; the cast itself is a plain integer widening and never throws.
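A short sketch of what that means for the 0xFF case:

```d
// Casting a char >= 128 to dchar widens the integer value, so 0xFF
// becomes the code point U+00FF ('ÿ'). It is NOT decoded as UTF-8,
// and no exception is thrown by the cast.
void main()
{
    char c = 0xFF;            // invalid as a standalone UTF-8 code unit
    dchar d = cast(dchar) c;
    assert(d == 0xFF);
    assert(d == 'ÿ');         // U+00FF LATIN SMALL LETTER Y WITH DIAERESIS
}
```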

> I'm asking because I'm pondering about how to specialize the non-7-bit `needle`-case of the following array-overload of `startsWith` when `T` is `char`:

I'd say that is just plain invalid and it should throw; I'm of the opinion the assert there is correct.

But you could also cast it to `dchar`, then call std.utf.encode

http://dpldocs.info/experimental-docs/std.utf.encode.1.html

to get it back to UTF-8 and compare the values then. It'd spit out a two-byte sequence that is probably the closest thing to what the user intended.
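A sketch of that suggested round-trip (the 0xC3 0xBF pair below is simply the UTF-8 encoding of U+00FF):

```d
// Widen the char to a dchar, then re-encode that code point as UTF-8
// with std.utf.encode. For 0xFF this yields two bytes: 0xC3 0xBF.
import std.utf : encode;

void main()
{
    char c = 0xFF;
    char[4] buf;                            // UTF-8 needs at most 4 bytes
    size_t len = encode(buf, cast(dchar) c);
    assert(len == 2);
    assert(buf[0] == 0xC3 && buf[1] == 0xBF);
}
```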

But I'm just not convinced the library should be guessing what the user intended to begin with.
