@Stephen @John: 
Interesting to learn about these low-level details, thank you.
But efficient or not, somewhere along the way
conversions are simply unavoidable,
whether explicit or implicit,
regardless of their performance...

Theoretically, doing:
   aDouble = Double(aFloat)
should have the same performance as
   aDouble = aFloat   // implicitly
The compiler simply generates the same code in both cases, I assume.
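For what it's worth, in today's Swift only the explicit form compiles; a minimal sketch (variable names are just illustrative):

```swift
let aFloat: Float = 1.5

// Explicit widening conversion: Float -> Double.
// This is lossless, since every Float value is exactly
// representable as a Double.
let aDouble = Double(aFloat)

// let bDouble: Double = aFloat   // error in current Swift:
// there is no implicit conversion between numeric types.
```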

Implicit or explicit?  Thinking further, it seems to me that it doesn’t
matter, because the programmer should be equally aware of the
operation, whether it is an explicit or an implicit conversion.

(That seems not so difficult, assisted by verbose compiler warnings
during editing.)

So, thinking along this line (that is, the programmer has to make
almost the same judgment effort in both cases anyway), it seems
logically correct that explicit conversions are in fact superfluous
and pointless,
and could thus be removed from the language, with a possible exception for
explicit conversion functions that take extra parameters to influence
the conversion, such as its precision, magnitude, rounding, etc.
e.g.:
   anInt = Int(aFloat, truncation: .roundingUp) 

?
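As an aside, the current standard library already lets you spell something close to that last example by rounding before converting; a sketch (the `truncation:` initializer above is hypothetical, but `rounded(_:)` and `init(exactly:)` exist today):

```swift
let aFloat: Float = 2.3

// Round up first, then convert; Int(aFloat) alone truncates toward zero.
let anInt = Int(aFloat.rounded(.up))    // 3

// Int(exactly:) returns nil instead of truncating when the
// value is not exactly representable as an integer.
let maybeInt = Int(exactly: aFloat)     // nil
```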

TedvG 


> On 20. Jun 2017, at 00:08, Stephen Canon <[email protected]> wrote:
> 
>> 
>> On Jun 19, 2017, at 5:43 PM, David Sweeris <[email protected] 
>> <mailto:[email protected]>> wrote:
>> 
>> On Jun 19, 2017, at 13:44, John McCall via swift-evolution 
>> <[email protected] <mailto:[email protected]>> wrote:
>> 
>>>> On Jun 19, 2017, at 1:58 PM, Stephen Canon via swift-evolution 
>>>> <[email protected] <mailto:[email protected]>> wrote:
>>>>> On Jun 19, 2017, at 11:46 AM, Ted F.A. van Gaalen via swift-evolution 
>>>>> <[email protected] <mailto:[email protected]>> wrote:
>>>>> 
>>>>> var result: Float = 0.0
>>>>> result = float * integer * uint8 +  double   
>>>>> // here, all operands should be implicitly promoted to Double before the 
>>>>> complete expression evaluation.
>>>> 
>>>> You would have this produce different results than:
>>>> 
>>>>    let temp = float * integer * uint8
>>>>    result = temp + double
>>>> 
>>>> That would be extremely surprising to many unsuspecting users.
>>>> 
>>>> Don’t get me wrong; I *really want* implicit promotions (I proposed one 
>>>> scheme for them  way back when Swift was first unveiled publicly).
>>> 
>>> I don't!  At least not for floating point.  It is important for both 
>>> reliable behavior and performance that programmers understand and minimize 
>>> the conversions they do between different floating-point types.
>> 
>> How expensive is it?
> 
> On most contemporary hardware, it’s comparable to a floating-point add or 
> multiply. On current generation Intel, it’s actually a little bit more 
> expensive than that. Not catastrophic, but expensive enough that you are 
> throwing away half or more of your performance if you incur spurious 
> conversions on every operation.
> 
> This is really common in C and C++ where a naked floating-point literal like 
> 1.2 is double:
> 
>       float x;
>       x *= 1.2;
> 
> Instead of a bare multiplication (current generation x86 hardware: 1 µop and 
> 4 cycles latency) this produces a convert-to-double, multiplication, and 
> convert-to-float (5 µops and 14 cycles latency per Agner Fog).
> 
> –Steve

_______________________________________________
swift-evolution mailing list
[email protected]
https://lists.swift.org/mailman/listinfo/swift-evolution
