> On Nov 21, 2016, at 15:09 , Marco S Hyman <m...@snafu.org> wrote:
> 
>> Except it does, because if I write
>> 
>>      let a = 2
> 
>> a is of type Int (at least, according to Xcode's code completion).
> 
> and if you write
> 
>       let b = 2 + 0.5
> 
> 2 is treated as a Double. The type of the literal “2” varies with context. Do 
> you also find that inconsistent and confusing? 

Nope. I can see how the promotion works. Also, Xcode would tell me b is a 
Double.
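
For what it's worth, spelled out the way I understand the inference rules (so
treat this as a sketch, not gospel):

	let x = 2               // no other context, so x is inferred as Int
	let y: Double = 2       // same literal, but the annotation makes y a Double
	let z = 2 + 0.5         // the 2 must be a Double for + to type-check, so z is a Double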

> 
>> But this gives inconsistent results:
>> 
>>      let t = true
>> 
>>      let a = Int(true)
>>      let b = Int(t)          //  Error
>> 
>> I find this to be very inconsistent and confusing.
> 
> t is a Bool and there is no automatic conversion from Bool to Int.
> 
> true is not a Bool.  It may be treated as a Bool depending upon context.  In 
> the line `let t = true` it is treated as a Bool. In `let a = Int(true)` it is 
> treated as an NSNumber (assuming you import Foundation).

That may be what's happening, but it's still confusing and unintuitive. It's 
surprising that something beyond mere "literalness" is lost just by routing 
the value through a variable. 

And really, it would be nice if the language provided a fast way of getting the 
number 1 out of a Bool variable that's true (and 0 out of false). But that 
conversation is a bigger can of worms than I care to open right now.
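
(In the meantime, the workaround I keep reaching for is the obvious ternary,
or a tiny extension if you want something reusable -- the helper name below is
just something I made up for illustration:)

	let flag = true
	let n = flag ? 1 : 0            // the quick way

	extension Bool {
	    // hypothetical helper, named here only for illustration
	    var asInt: Int { return self ? 1 : 0 }
	}
	let m = flag.asInt              // 1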

-- 
Rick Mann
rm...@latencyzero.com


_______________________________________________
swift-users mailing list
swift-users@swift.org
https://lists.swift.org/mailman/listinfo/swift-users
