On 5 Jul 2001, Thayne Harbaugh wrote:
> I'm going to open a can of worms. include/ggi/system.h does this:
>
> typedef signed int int32;
> typedef unsigned int uint32;
>
> Why is this int? I know that there is no guarantee for the size of int
> and the size of long, but I think that long is more often 32 bits and is
> (somewhat) more correct.
<ggi/system.h> is always correct because it is autogenerated by
configure for the platform it is built on.
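
For illustration, a hypothetical sketch (not the actual generated
file) of how such a generated header can pick a type of the right
width, using the SIZEOF_* macros that e.g. Autoconf's
AC_CHECK_SIZEOF defines at configure time:

/* hypothetical autogenerated fragment of <ggi/system.h> */
#if SIZEOF_INT == 4
typedef signed int   sint32;
typedef unsigned int uint32;
#elif SIZEOF_LONG == 4
typedef signed long   sint32;
typedef unsigned long uint32;
#else
#error "no 32-bit integer type found"
#endif

Whichever type really is 32 bits wide on the build platform ends up
behind sint32/uint32.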
> Furthermore, it seems that many headers always want to reinvent the
> wheel when it comes to these typedefs. Why doesn't GGI just
> "#include <sys/types.h>"? It provides the following:
>
> typedef char int8_t;
> typedef short int int16_t;
> typedef int int32_t;
>
> typedef unsigned char u_int8_t;
> typedef unsigned short int u_int16_t;
> typedef unsigned int u_int32_t;
I don't know; the decision was made before I came to GGI. My guess
is platform independence: not every platform's <sys/types.h>
provides those typedefs (the u_int*_t names in particular are an
extension), so including it directly would not be portable.
> If the argument is that not every platform provides these then couldn't
> it be "discovered" in the configure stage and the proper header file
> built at that time?
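That is essentially what happens. As a minimal sketch (the file name
and output format are made up for illustration), configure can
compile and run a small probe and substitute the results into the
generated header:

/* conftest-sizes.c - hypothetical configure-time probe */
#include <stdio.h>

int main(void)
{
        printf("short: %u\n", (unsigned) sizeof(short));
        printf("int:   %u\n", (unsigned) sizeof(int));
        printf("long:  %u\n", (unsigned) sizeof(long));
        return 0;
}

(When cross-compiling you cannot run the probe, so compile-only
checks are needed instead, but the idea is the same.)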
>
>
> I propose, at very least, the following change.
>
>
> --- system.h.orig Tue Jul 3 10:14:36 2001
> +++ system.h Thu Jul 5 13:24:51 2001
> @@ -55,8 +55,8 @@
> typedef signed short sint16;
> typedef unsigned short uint16;
>
> -typedef signed int sint32;
> -typedef unsigned int uint32;
> +typedef signed long sint32;
> +typedef unsigned long uint32;
That's wrong, because on 64-bit (LP64) architectures such as Alpha,
"long" is 64 bits wide, so sint32 and uint32 would silently become
64-bit types.
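
To see the problem concretely, here is a small demonstration program
(hypothetical, not part of GGI) built around the typedefs from the
proposed patch:

#include <stdio.h>

/* the typedefs from the proposed patch */
typedef signed long   sint32;
typedef unsigned long uint32;

int main(void)
{
        printf("sizeof(uint32) = %u\n", (unsigned) sizeof(uint32));
        return 0;
}

On an ILP32 system this prints 4, but on an LP64 system such as
Alpha it prints 8: the supposedly 32-bit type would silently be
64 bits wide.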
> typedef signed int ggi_sint;
> typedef unsigned int ggi_uint;
>
CU,
Christoph Egger
E-Mail: [EMAIL PROTECTED]