On Friday, 28 June 2013 at 00:48:23 UTC, Steven Schveighoffer wrote:
On Thu, 27 Jun 2013 20:34:53 -0400, JS <[email protected]> wrote:

Would it be possible for a language (specifically D) to have the ability to automatically type a variable by looking at its use cases, without adding too much complexity? It seems to me that most compilers can already detect type mismatches, which would allow them to handle stuff like:

void main()
{
    auto x;
    auto y;
    x = 3;   // x is an int, same as auto x = 3;
    y = f(); // y is the same type as what f() returns
    x = 3.9; // x is really a double, no mismatch with the previous int usage
}

In this case the types of x and y are inferred from future use. The compiler essentially infers the variable's type lazily. Obviously, ambiguity will generate an error.
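For contrast, this is roughly what D accepts today: auto needs an initializer, and the type is fixed from that point on (a minimal sketch; f is just a placeholder returning double):

double f() { return 1.0; } // placeholder

void main()
{
    // auto x;     // rejected today: auto declarations require an initializer
    auto x = 3;    // x is typed int from its initializer
    auto y = f();  // y takes f()'s return type (double here)
    // x = 3.9;    // error today: cannot implicitly convert double to int
}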


There are very good reasons not to do this, even if it were possible, especially if the type can change.

Consider this case:

void foo(int);
void foo(double);

void main()
{
    auto x;
    x = 5;
    foo(x);

    // ... way later down in main
    x = 6.0;
}

What version of foo should be called? By your logic, it should be the double version, but looking at the code I can't reason about it: I have to read the whole function and look at every usage of x. auto then becomes a liability, not a benefit.


Says who? No one is forcing you to use it instead of immediate inference. If you get easily confused, then simply declare x as a double in the first place!

Most of the time a variable's type is well known by the programmer. That is, the programmer has some idea of the type a variable is to take on. Having the compiler infer the type is tantamount to figuring out what the programmer had in mind, and in most cases this is rather easy to do... in any ambiguous case an error can be thrown.
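To sketch what I mean by an ambiguous case (hypothetical semantics for the proposal, not something current D compiles):

void main()
{
    auto x;
    x = 3;        // int candidate
    x = "three";  // string candidate: no common type exists, so the
                  // compiler would reject x's declaration as ambiguous
}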

Coupling the type of a variable with sparse usages is going to be extremely confusing and problematic. You are better off declaring the variable as a variant.


If you are confused by the usage, then don't use it. Just because it is bad for some programmers in some cases does not mean it can't be useful to other programmers in other cases.

Some programmers want to dumb down the compiler because they themselves want to limit all potential risk... What's amazing is that many times the features they are against do not have to be used in the first place.

If you devise an extremely convoluted example, then simply use a unit test or define the type explicitly. I don't think limiting the compiler's feature set to the lowest common denominator is the way to develop a powerful language.

You say using a variant type is better; how? What is the difference besides performance? An auto type without immediate type inference offers all the benefits of static typing along with some of those of a variant type...

Since it seems you are not against variant, why would you be against a static version, which actually offers more safety?
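To illustrate the safety point, a minimal sketch using std.variant (takesInt is just a placeholder): with Variant a wrong type only fails at run time, while a statically typed variable fails to compile.

import std.variant;

void takesInt(int) {}

void main()
{
    Variant v = "hello";
    // takesInt(v);     // does not compile: Variant is not implicitly an int
    int n = v.get!int;  // compiles, but throws VariantException at run time

    // With a statically typed variable the mistake surfaces at compile time:
    // int m = "hello"; // compile-time error
}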

In fact, my suggestion could simply be seen as an optimization of a variant type.

e.g.,

variant x;
x = 3;


the compiler realizes that x can be reduced to an int type and sees the code as

int x;
x = 3;

Hence, unless you are against variants and think they are evil (which would contradict your suggestion to use one), your argument fails.
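To make the "optimization" framing concrete: a Variant carries its value plus runtime type information and is checked at run time, so lowering it to a plain int when only ints are ever stored would remove that overhead. A rough sketch (exact sizes vary by implementation):

import std.stdio;
import std.variant;

void main()
{
    Variant v = 3;  // stores the value plus a handle describing its runtime type
    int i = 3;      // stores just the value; the type is fixed at compile time

    writeln(Variant.sizeof, " vs ", int.sizeof); // Variant is noticeably larger
    writeln(v.type);                             // runtime type query: prints int
}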
