I have a degree in Theoretical Physics, and I would very much like to grasp the inferred type immediately ;-) That is exactly my point: Inferring obvious, simple cases is a good approach and brings the most benefit. I would call that "Groovy now supports return type inference in the obvious cases, which brings the most benefit to developers". (No one is forced to leave out return types, so no one should complain that the compiler infers Object or throws a CTE in complex cases.)
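
To make "obvious case" concrete, a minimal hypothetical sketch (method name made up): a single return site with a statically known type, where the reader grasps the inferred return type immediately:

private answer() {
  return 42   // single return site, literal of known type => inferring Integer (or int) is the obvious, simple case
}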

The "int to long" inference example is an interesting special case that I think would need to be discussed*: We could either reject these case (throw CTE) - or maybe inferring to a higher up super type (Number in the extreme case) could sort of work... But due to inherently involved performance/precision issues, in my opinion inferring number types does not seem like a good idea...

Cheers,
mg

*Since Groovy goes to BigInteger/BigDecimal pretty quickly (at least people who need the performance of primitive number types have complained about that on the ML), this problem might actually not be as big in Groovy as in other languages ;-)
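
A few concrete examples of what I mean (as far as I understand the current default behaviour):

assert 1.1 instanceof BigDecimal        // decimal literals default to BigDecimal, not double
assert (1 / 3) instanceof BigDecimal    // integer division yields BigDecimal
assert (2 ** 100) instanceof BigInteger // the power operator promotes beyond long automatically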



On 06.09.2018 11:37, Jochen Theodorou wrote:


On 06.09.2018 02:23, MG wrote:
Hi Jochen,

but in what sense are any of these examples confusing for the user? Type inference is not magic, and it can quickly become a hard mathematical problem (https://en.wikipedia.org/wiki/Type_inference).

And you as a user do not have to understand the inferred type? I would like to be able to do that without a mathematical degree.

But in all those cases, we should just fall back to Object or throw. I don't know Daniel's intentions, but for me type inference for methods (same as for fields/variables) should only be used for simple, obvious cases, not for complex ones (even if those are of course the only interesting intellectual challenge ;-) ).

If it is only for simple cases it will always strike the user as a not very powerful system, and the user might wonder what this inference for the return type actually is... unless we call it "simple return type inference" of course.

[...]

private foo() {
  if (something) {
     return x // of class X
  } else {
     return y // of class Y
  }
}

if X and Y implement the interfaces Foo and Bar the common super type would be something like Object+Foo+Bar, which cannot be an actual return type, because the Java type system cannot properly express that type. Which is it then? Object, or Foo or Bar?
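
To spell that out with hypothetical class declarations for the example above:

interface Foo {}
interface Bar {}
class X implements Foo, Bar {}
class Y implements Foo, Bar {}
// the exact common type of X and Y is the intersection "Object + Foo + Bar",
// which cannot be written as a return type - the inference would have to
// pick one of Object, Foo or Bar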

Intuitively I would not infer on interfaces, but only on classes. In practice I would expect X and Y to have a common superclass that is not Object; otherwise infer Object.

if we keep it simple we fail here, which means a compilation error

And if you think this problem is small, you have to consider this one here as well:

private foo() {
  def ret
  if (something) {
     ret = x // of class X
  } else {
     ret = y // of class Y
  }
  return ret
}

Same problem as before obviously, just showing that using local variables makes it even worse.

Infer Object - if you use Object for the return variable type, this is what you should expect...

If I had used X or Y it would have failed compilation. That is my point here.

And how about this one?

private f(List<X> l) {
  if (l.size()==1) {
    return l[0]
  } else {
    return f(l.tail())
  }
}

for me it is obvious the return type is X, but a compiler must be able to see through the recursive call and it must see that it is a recursive call.

Too complex => infer Object

Too complex => fail compilation!

[...]
Or let us say there is also g(Object):Object, and let us assume we delete g(X):X. Then successfully inferring the type above obviously means letting f return Object. The change will go unnoticed in f and cause secondary errors in callers of f, in the worst case even quite far away from the call site itself.
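
The elided example presumably looks something like the following hypothetical sketch (class and method names made up) - the point being that f's inferred return type depends on which overload of g is selected:

class X {}

X g(X x)           { return x }   // the overload that later gets deleted
Object g(Object o) { return o }

private f(X x) {
  return g(x)   // inferred as X today; silently becomes Object once g(X):X is gone
}

X caller(X x) {
  return f(x)   // the secondary error surfaces here, possibly far away from f itself
}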

I do not follow: what secondary errors would those be? How would they differ from the errors that occur when the user explicitly supplies Object as the return type?


let's say you have

def g() {
 f()+1
}

int b = g()


let us say the type of f changed from int to long: g() will still compile, but it will now return long instead of int. This will cause int b = g() to fail compilation because of the loss of precision from long to int. And to fix this you will have to fix not the assignment, not g, but f.
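
Put together as a complete sketch (the body of f and someLongValue() are made up here, the rest is the example above):

private f() {
  return someLongValue()   // used to return an int, now returns a long
}

def g() {
  f() + 1                  // still compiles, but the result is now long instead of int
}

int b = g()                // fails: possible loss of precision from long to int
                           // the fix belongs in f, not in g and not in this assignment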


bye Jochen

