On Tue, Nov 14, 2017 at 11:05:51PM +0000, Michael V. Franklin via Digitalmars-d wrote:
> On Tuesday, 14 November 2017 at 13:54:03 UTC, Steven Schveighoffer wrote:
> 
> > IMO, no character types should implicitly convert from integer
> > types. In fact, character types shouldn't convert from ANYTHING
> > (even other character types). We have so many problems with this.
> 
> Is everyone in general agreement on this?  Can anyone think of a
> compelling use case?
[...]

I am 100% for this change.  I've been bitten before by things like this:

        void myfunc(char ch) { ... }
        void myfunc(int i) { ... }

        char c;
        int i;

        myfunc(c);      // calls first overload
        myfunc('a');    // calls second overload (WAT)
        myfunc(i);      // calls second overload
        myfunc(1);      // calls second overload

There is no compelling use case for implicitly converting char types to
int.  If you want to manipulate ASCII values / Unicode code point
values directly, an explicit cast is warranted: it makes the intent of
the code clear.
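For example (a contrived snippet I just made up, but it shows the
intent):

	void main()
	{
	    import std.stdio;

	    char c = 'a';
	    int cp = cast(int) c;   // explicit: we want the numeric
	                            // code point value
	    writeln(cp);            // 97

	    // ASCII-only uppercasing; the cast documents the intent
	    char upper = cast(char)(c - ('a' - 'A'));
	    writeln(upper);         // A
	}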

Implicitly converting char to wchar (or dchar, or vice versa, etc.) is
also fraught with peril: if the char happens to hold one code unit of a
multi-byte UTF-8 sequence, you *implicitly* get a garbage code point.
Not useful at all.  Needing to write an explicit cast will remind you
to think twice, which is a good thing.
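
To make the failure mode concrete (again a contrived snippet; decode
from std.utf is the correct way to get the code point):

	void main()
	{
	    import std.stdio;
	    import std.utf : decode;

	    string s = "é";     // two UTF-8 code units: 0xC3 0xA9
	    char c = s[0];      // 0xC3: a lead byte, not a character
	    dchar bad = c;      // implicit conversion: U+00C3 'Ã' (garbage)

	    size_t i = 0;
	    dchar good = decode(s, i);  // proper decoding: U+00E9 'é'
	    writeln(bad, " vs ", good); // Ã vs é
	}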


T

-- 
Famous last words: I wonder what will happen if I do *this*...
