I'm going to open a can of worms. include/ggi/system.h does this:
typedef signed int sint32;
typedef unsigned int uint32;
Why is this int? I know that there is no guarantee for the size of either
int or long, but I think that long is more often 32 bits and is (somewhat)
the more correct choice.
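For anyone who wants to see what a given compiler actually does, here is a
trivial (untested, but obvious) test program:

    /* Print the sizes this compiler actually gives the basic types. */
    #include <stdio.h>

    int main(void)
    {
            printf("sizeof(int)  = %lu\n", (unsigned long) sizeof(int));
            printf("sizeof(long) = %lu\n", (unsigned long) sizeof(long));
            return 0;
    }

On ia32 both print 4; on a 64-bit platform like the Alpha, long prints 8.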
Furthermore, it seems that many headers want to reinvent the wheel when it
comes to these typedefs. Why doesn't GGI just "#include <sys/types.h>"?
It provides the following:
typedef signed char int8_t;
typedef short int int16_t;
typedef int int32_t;
typedef unsigned char u_int8_t;
typedef unsigned short int u_int16_t;
typedef unsigned int u_int32_t;
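GGI's names could then be thin aliases on top of those, roughly like this
(a sketch only; it assumes the glibc-style names above are present):

    #include <sys/types.h>

    typedef int16_t    sint16;
    typedef u_int16_t  uint16;
    typedef int32_t    sint32;
    typedef u_int32_t  uint32;
    /* ...and similarly for any 8-bit variants. */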
If the argument is that not every platform provides these, then couldn't
it be "discovered" at the configure stage and the proper header file
built at that time?
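Roughly like this (a sketch, not tested; the macros are stock autoconf,
the conditional header is made up). In configure.in:

    AC_CHECK_SIZEOF(short)
    AC_CHECK_SIZEOF(int)
    AC_CHECK_SIZEOF(long)

Then the generated header can pick the right base type from the resulting
SIZEOF_* defines:

    #if SIZEOF_INT == 4
    typedef signed int    sint32;
    typedef unsigned int  uint32;
    #elif SIZEOF_LONG == 4
    typedef signed long   sint32;
    typedef unsigned long uint32;
    #else
    #error "no 32-bit integer type found"
    #endif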
I propose, at the very least, the following change.
--- system.h.orig Tue Jul 3 10:14:36 2001
+++ system.h Thu Jul 5 13:24:51 2001
@@ -55,8 +55,8 @@
 typedef signed short sint16;
 typedef unsigned short uint16;
 
-typedef signed int sint32;
-typedef unsigned int uint32;
+typedef signed long sint32;
+typedef unsigned long uint32;
 
 typedef signed int ggi_sint;
 typedef unsigned int ggi_uint;
--
Thayne Harbaugh
Your eyes are weary from staring at the CRT. You feel sleepy. Notice how
restful it is to watch the cursor blink. Close your eyes. The opinions
stated above are yours. You cannot imagine why you ever felt otherwise.