Nick Sabalausky:

Big +1

Having the language expect x"..." to always be a string (let alone a *valid UTF* string) is just insane. The syntax is just too damn useful for arbitrary binary data.
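
For context, a minimal sketch of the two cases being contrasted. This is an assumption on my part about current behavior; exactly how x"..." treats non-UTF bytes has varied across compiler versions:

import std.stdio;

void main()
{
    // Hex string literal whose bytes happen to be valid UTF-8: "Hello".
    string text = x"48 65 6C 6C 6F";
    writeln(text);

    // Arbitrary binary data: 0xDE is not valid UTF-8 on its own, so
    // requiring x"..." to be a valid UTF string rejects payloads like
    // this one; viewing the bytes explicitly sidesteps the restriction.
    immutable(ubyte)[] blob = cast(immutable(ubyte)[]) x"DE AD BE EF";
    writeln(blob); // [222, 173, 190, 239]
}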

I'd like an opinion on such topics from one of the D bosses :-)

Bye,
bearophile
