--- In [email protected], "Brett W. McCoy" <[EMAIL PROTECTED]> wrote: > > On 5/14/07, Ajinkya Kale <[EMAIL PROTECTED]> wrote: > > > I asked this trivial question only for the reason that this > > question was asked in one of my orals and the examiner told > > me that DOS works in real mode and Windows in protected so > > the difference in the sizes of int. > > I dont think this is true...what do you all think? > > He's right -- real mode was introduced with the 286 chip > (to be backward compatible with the 8086/8088 chips, which > only had one mode of operating, which is the same as what > real mode is on later chips) and this is what DOS runs in > -- 20 bit segmented memory addressing, only 1 meg of memory > available (and for userspace applications, only 640k of that > 1 meg). Technically, all CPUs 286 and higher start up in > real mode but then switch into protected mode to load > operating systems like Windows, Linux, FreeBSD, etc., > and able to use 32bit integers. Real mode uses 16 bit > integers. <snip>
I dare to partially disagree, Brett. The point is that every compiler writer is free to choose the size of int (except for the ANSI requirements that short and int are each >= 16 bits and that long is >= 32 bits, if I recall correctly). It would be perfectly legal for a compiler writer to define int and long both as 64-bit integers, be it under MS-DOS or Windows.

Not only did compilers for Windows on a 286 CPU employ 16-bit integers (because this was the natural word size of the 80286); even nowadays "int" is still defined as 16 bits when compiling in console mode. At least that's what Visual C++ 6.0 (yes, I know it's ancient, but I have to use it for reasons beyond my control) still does.

BTW, could anyone update me on the current size of int for console-mode applications under VS.NET and other modern compilers? Just to satisfy my curiosity, I admit it.

Regards,
Nico
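P.S. The question is easy to settle empirically on whatever compiler one has at hand; here is a minimal, strictly standard C check (nothing compiler-specific assumed):

#include <limits.h>
#include <stdio.h>

/* Report the widths the thread is arguing about. The standard only
   fixes minimum ranges (short and int at least 16 bits, long at
   least 32); the actual sizes are the compiler writer's choice. */
int main(void)
{
    printf("CHAR_BIT      = %d\n", CHAR_BIT);
    printf("sizeof(short) = %u\n", (unsigned)sizeof(short));
    printf("sizeof(int)   = %u\n", (unsigned)sizeof(int));
    printf("sizeof(long)  = %u\n", (unsigned)sizeof(long));
    return 0;
}

A 16-bit DOS compiler will typically report sizeof(int) == 2, while a typical 32-bit compiler reports 4.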
