Want to see some real magic?

struct A : _ExpressibleByBuiltinIntegerLiteral {
    // The compiler constructs the literal value itself through this
    // underscored protocol; _MaxBuiltinIntegerType is the standard
    // library's widest builtin integer type.
    init(_builtinIntegerLiteral value: _MaxBuiltinIntegerType) {}
}

struct B : ExpressibleByIntegerLiteral {
    // IntegerLiteralType is inferred as A, so the literal 42 is built
    // as an A first and only then passed to this initializer.
    init(integerLiteral value: A) {
        print(type(of: value))
    }
}

let b: B = 42 // prints "A"
My integer literal type is now A rather than any of the (U)Int(8,16,32,64) family.



-- 
Adrian Zubarev
Sent with Airmail

On 22 November 2016 at 19:36:26, Kenny Leung via swift-users 
(swift-users@swift.org) wrote:

Hi Marc.

My old mechanical engineering prof used to say, “C is easy if you know 
assembler”.  

The fact that such a simple construct does not work and requires such a long 
explanation - which may still not be understood by a newbie - is a problem that 
should be addressed.

Even this explanation requires that you “see inside” the compiler to know what 
it’s “thinking”. And the fact that NSNumber comes into this makes it more 
interesting. What would be the behaviour (or at least the error message) on 
Linux, where there is no NSNumber? (or is there? I’m even more confused - have 
to try it out for myself).
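
To make the question concrete, here is a minimal sketch of the two situations, 
assuming the NSNumber bridging described below (the exact error wording is from 
memory and may vary by compiler version):

// Without Foundation in scope there is no Bool-to-Int path at all:
let a = Int(true)   // error: cannot invoke initializer for type 'Int'
                    //        with an argument list of type '(Bool)'

// Same code with `import Foundation` at the top (on Darwin):
let b = Int(true)   // compiles: true is bridged through NSNumber
let t = true        // t is inferred as Bool
let c = Int(t)      // error: no implicit conversion from Bool to Int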

We are also getting complacent when “A literal doesn’t have a type on its own. 
Instead, a literal is parsed as having infinite precision and Swift’s type 
inference attempts to infer a type for the literal.” gets condensed down to 
“literals in Swift are untyped”. I don’t think this helps the explanation when 
there really is a distinction between different kinds of literals (otherwise 
there wouldn’t be things like ExpressibleBy*Boolean*Literal).
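
For the record, here is a quick sketch of that infinite-precision-then-infer 
behaviour in practice (the defaults below are the standard library's):

let a = 2           // inferred as Int, the default IntegerLiteralType
let b: Double = 2   // the same literal becomes 2.0
let c = 2 + 0.5     // here the 2 is inferred as Double
let d: UInt8 = 2    // and here as UInt8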

I think part of it is the way the documentation itself is worded. Another part 
here is the weird side effect Objective-C compatibility brings into the picture.

I think I’m turning this into a swift-evolution topic:
* should Int(Bool) be supported in the standard library? (a sketch of such an 
initializer follows below)
** if so, then Int(t) should work here
** if not, then Int(true) should also error to avoid confusion
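
If it were supported, the conversion could be as simple as this hypothetical 
sketch (the initializer is my own illustration, not an existing standard 
library API):

extension Int {
    // Hypothetical explicit conversion: 1 for true, 0 for false.
    init(_ value: Bool) {
        self = value ? 1 : 0
    }
}

let t = true
let a = Int(true) // 1
let b = Int(t)    // 1, so both spellings now behave consistently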

-Kenny


> On Nov 21, 2016, at 3:09 PM, Marco S Hyman via swift-users 
> <swift-users@swift.org> wrote:
>  
>> Except it does, because if I write
>>  
>> let a = 2
>  
>> a is of type Int (at least, according to Xcode's code completion).
>  
> and if you write
>  
> let b = 2 + 0.5
>  
> 2 is treated as a double. The type of the literal “2” varies with context. Do 
> you also find that inconsistent and confusing?  
>  
>> But this gives inconsistent results:
>>  
>> let t = true
>>  
>> let a = Int(true)
>> let b = Int(t)       // Error
>>  
>> I find this to be very inconsistent and confusing.
>  
> t is a Bool and there is no automatic conversion from Bool to Int.
>  
> true is not a Bool. It may be treated as a Bool depending upon context. In 
> the line `let t = true` it is treated as a Bool. In `let a = Int(true)` it is 
> treated as an NSNumber (assuming you import Foundation).
>  
> Marc
