Nick Sabalausky:
> Having the language expect x"..." to always be a string (let alone a
> *valid UTF* string) is just insane. It's just too damn useful for
> arbitrary binary data.

Big +1.
I'd like an opinion on such topics from one of the D bosses :-)
Bye, bearophile
