What if numeric literals acted the same way as irrational numbers like pi, 
and automatically adapted to the environment where they are used?
I know that sort of rule would have made a lot of bit-twiddling 
code much simpler: no worrying that 0xff and 0x1ff,
or 1234567890123456789 and 12345678901234567890, are different types,
or that a & 15 will promote an unsigned value to a signed one.
Just a random thought!
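
For concreteness, here is roughly what a 64-bit Julia session shows for 
those cases (exact printing may differ between versions):

    julia> typeof(0xff), typeof(0x1ff)   # hex literal width follows the digit count
    (UInt8, UInt16)

    julia> typeof(1234567890123456789), typeof(12345678901234567890)
    (Int64, Int128)

    julia> a = 0xff; typeof(a & 15)      # 15 is an Int, so UInt8 & Int gives Int
    Int64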

On Tuesday, April 5, 2016 at 11:37:20 AM UTC-4, Erik Schnetter wrote:
>
> On Tue, Apr 5, 2016 at 11:05 AM, Didier Verna <[email protected]> wrote: 
> > Erik Schnetter <[email protected]> wrote: 
> > 
> >> The literal `1` has type `Int`.  The promotion rules for `Int8` and 
> >> `Int` state that, before the addition, `Int8` is converted to `Int`. 
> >> (On your system, it seems that `Int` is `Int64`.) 
> > 
> >   OK, so indeed, there's modular arithmetic for the non-native 
> >   representations as well. It's just that literals are neither 
> >   overloaded nor implicitly coerced. I see, thanks. 
>
> Some mathematical constants (e.g. pi) automatically adapt to the 
> environment where they are used, defaulting to Float64. Literals, 
> however, do not -- they have a specific type. 
>
> -erik 
>
> -- 
> Erik Schnetter <[email protected]> 
> http://www.perimeterinstitute.ca/personal/eschnetter/ 
>
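
To make the promotion and wraparound points in the quoted exchange concrete, 
a quick check (again on a 64-bit Julia session, where Int is Int64; printing 
may vary by version):

    julia> x = Int8(1); typeof(x + 1)    # Int8 is promoted to Int before adding
    Int64

    julia> Int8(127) + Int8(1)           # same-width arithmetic wraps modularly
    -128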
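
And pi really does adapt to its surroundings, while a literal always keeps 
its own type:

    julia> pi + 1.0f0                    # pi behaves as a Float32 here
    4.141593f0

    julia> pi + 1.0                      # and as a Float64 here
    4.141592653589793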
