On Sunday, 23 September 2018 at 21:12:13 UTC, Walter Bright wrote:
D supports Unicode in identifiers because C and C++ do, and we want to be able to interoperate with them. Extending Unicode identifier support off into other directions, especially ones that break such interoperability, is just doing a disservice to users.

I always thought D supported Unicode with the goal of embracing it going forward while C was stuck with ASCII:
http://www.drdobbs.com/cpp/time-for-unicode/228700405

"The D programming language has already driven stakes in the ground, saying it will not support 16 bit processors, processors that don't have 8 bit bytes, and processors with crippled, non-IEEE floating point. Is it time to drive another stake in and say the time for Unicode has come? "

Have you changed your mind since?
