And for the record, I am ignoring the verbal quibble over whether
"implementation-defined" means it's defined, or whether something
that is implementation-dependent is or is not "defined".
This isn't a pedantic-C mailing list, and the difference
between the two is moot in *this* discussion.

On Jan 9, 2014, at 9:52 AM, Jim Jagielski <j...@jagunet.com> wrote:

> Undefined means that the specification does not define
> what happens, and that people cannot expect anything,
> since what happens is implementation dependent.
> 
> On Jan 9, 2014, at 8:49 AM, Mattias Engdegård <matti...@bredband.net> wrote:
> 
>> 9 jan 2014 kl. 14.37 skrev Jim Jagielski:
>> 
>>> However, if a is 4,294,967,200, then the behavior
>>> of (int)a is undefined and implementation dependent,
>>> since you can't express that value within the
>>> limits of a signed int (assuming 32 bits).
>> 
>> No, it's not undefined but implementation-defined, which means that an 
>> implementation can decide what to do as long as it documents it. There is 
>> quite a difference.
>> 
>> All compilers I have ever used, and then some, implement conversions to 
>> signed by modular reduction into the interval defined by that type. Nothing 
>> else makes sense, and compilers won't start doing it differently.
>> 
>>> My point is that the possibility in that case of
>>> (int)a resulting in 0 is pretty freakin' remote,
>>> even if it is undefined behavior ;)
>> 
>> It's not undefined behaviour.
>> 
> 
