What I am asking is why it has to be this way.
In "return 1 / float", why is 1 assumed to be an integer, and consequently cast to float? In math [real numbers] > [integers] > [natural numbers], so why is 1 an integer, and especially why is it a 32-bit integer?

T foo(T)(T k) {
    return k + 3;
}
The general answer will be: the above code should just fricking work, without my explicit cast! And you are right, that is perfectly fine with me! :) I ask again: for this code to work, is it necessary that "3" be a 32-bit integer, or a native type at all?

Well, as you said, in mathematics a natural number is also an integer, which is also a rational, which is also a real. One could say that you have an injection, an implicit cast, from natural to integer to rational to real. By the way, maybe you fear that the implicit cast is done at runtime, or has some hidden cost, but that is not the case: for literals it is done at compile time.

In reality (and also in mathematics, if you use cyclic groups or approximate numbers) the situation becomes a lot murkier, but in the end not much changes.

On 29-mar-10, at 15:47, so wrote:

It would be nice to say "in this function, assume numeric literals are of type T," but that might be too specific a solution (it could only apply to built-in types). I don't think it's feasible for the compiler to infer which type it should use.

-Steve

It would be nice indeed!
My proposal-ish idea was exactly for this problem:
remove the "default literal" rule and treat every implicit constant (a constant whose type is not written explicitly) as a generic type. I don't think it will ever happen even if I were able to provide a full parsing framework, since C is a strong opponent :)

The example you give (finding the type to use in a function) cannot be solved easily and efficiently without some kind of inference based on the return type, annotations, or Hindley-Milner style type inference. Annotations don't scale; inference based on the return type is very difficult and not doable in general, and few languages do it (Aldor, for example, did); Hindley-Milner is incompatible with C.

Please note that (possibly due to my C background) I like to put in some type annotations; in my (limited) experience that pays off also with ML-style languages, because otherwise, when you have ambiguity, small errors can change which functions are called and give surprising results.

Fawzi
