Note that these sources were originally created in a day when "unsigned long" was a 32-bit quantity on most platforms and was often used to guarantee that you didn't get a 16-bit integer. That sizing rule of thumb changed over the years, and on modern 64-bit platforms it usually gives you a 64-bit quantity instead. (Don't you just love C?)

In looking deeper into the SERVER64 directive which controls this (setting it makes the headers define VisualID as a CARD32, which has a concrete 32-bit definition), it looks like they specifically intend it not to be used for client code, but I can find no explanation via Google. The closest that I could find was a commit comment in the sources of some X11 client code mentioning that SERVER64 should not be used in client sources because it could wreak havoc, but with no explanation as to why.
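
For reference, here is roughly what the header logic looks like. This is a paraphrase rather than a verbatim excerpt (the guard is spelled _XSERVER64 in the copies of <X11/X.h> I've seen, and the exact details vary by release; CARD32 comes from <X11/Xmd.h> and is a fixed 32-bit type):

/* Paraphrase of the relevant logic in <X11/X.h>, not a verbatim copy. */
#ifndef _XSERVER64
typedef unsigned long XID;   /* 32 bits on ILP32, 64 bits on LP64 */
#else
typedef CARD32 XID;          /* always exactly 32 bits */
#endif

typedef XID VisualID;        /* VisualID follows XID either way */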

My guess is that they didn't really embrace "machine dependent typedef" technology to ensure that these quantities were declared as 32-bit quantities until it was too late, and by then they had spawned a mixture of Xlib versions that simply had whatever "unsigned long" compiled to. (Still, you'd think they would have noticed this as soon as the first 64-bit Xlib failed to interpret the protocol correctly, no? Or maybe they thought that "unsigned long" was a more natural type for the compilers to play with?) As a result, going back and fixing the client libraries would cause exactly that compatibility havoc, so they only fix it now for server sources. It's sloppy, but probably necessary due to practical concerns. So I guess it would appear that they really do want these local types to be 64-bit quantities despite the fact that the protocol defines their data as 32-bit values.
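
For what it's worth, the fixed-width approach would have been straightforward; here's a hypothetical sketch (VisualID32 is my name, not anything in the Xlib headers) of what a protocol-faithful client typedef could look like:

#include <stdint.h>

/* Hypothetical: a typedef that matches the protocol's CARD32 exactly. */
typedef uint32_t VisualID32;

/* Classic compile-time check: the array size is negative (an error)
 * if the type is ever not 4 bytes, on any platform. */
typedef char vid32_size_check[(sizeof(VisualID32) == 4) ? 1 : -1];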

So, after quite a bit of obscure digging, it looks like our use of the header files is actually correct, and so this fix is accurate despite the conceptual conflict with the protocol's definition.

According to the printf/scanf man pages, %lx tracks the size of "unsigned long" on any given platform, and it won't misinterpret the correct value of any 32-bit hex string that it parses. So I guess there is no problem here (other than my now feeling quite a bit more disgusted/amused with the cobbled-together state of the X11 APIs ;-).
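
As a sanity check, a minimal standalone sketch of that behavior (this is not the actual awt_GraphicsEnv.c code, and "0x21" is just an arbitrary example value):

#include <stdio.h>

int main(void) {
    unsigned long visualid = 0;    /* same type VisualID resolves to here */
    const char *text = "0x21";     /* a 32-bit hex string, as on the wire */

    /* %lx matches unsigned long exactly, whatever its width on this
     * platform, so the parsed value is correct on ILP32 and LP64 alike. */
    if (sscanf(text, "%lx", &visualid) == 1) {
        printf("visualid = 0x%lx\n", visualid);
    }
    return 0;
}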

Apologies for the disruption... :-(

                        ...jim

On 3/9/11 10:00 AM, Phil Race wrote:
That's what I had supposed too, but the client definition depends on the machine's definition of long.

% more x.c
#include <stdlib.h>
#include <stdio.h>
#include <X11/X.h>

int main(int ac, char** av) {
    VisualID id;
    /* cast: sizeof yields size_t, which %d does not match */
    printf("sz=%d\n", (int)sizeof(id));
    return 0;
}

% cc -I/usr/openwin/include x.c
% file a.out
a.out: ELF 32-bit MSB executable SPARC32PLUS Version 1, V8+ Required,
dynamically linked, not stripped
% a.out
sz=4


% cc -I/usr/openwin/include -m64 x.c
% file a.out
a.out: ELF 64-bit MSB executable SPARCV9 Version 1, dynamically linked,
not stripped
% a.out
sz=8

-phil.

Jim Graham wrote:
Please revert this fix and re-read the X11 protocol. Something *else* went wrong for Parfait to think that we wanted to parse a 64-bit value there. VisualID values are small numbers and are transferred as 32-bit quantities in the protocol. If the typedef ended up with a 64-bit representation from the header files, then we have probably not configured something correctly...

...jim

On 3/9/11 2:37 AM, [email protected] wrote:
Changeset: 69ec42543dd9
Author: bae
Date: 2011-03-09 13:08 +0300
URL: http://hg.openjdk.java.net/jdk7/2d/jdk/rev/69ec42543dd9

7022280: Parfait reports Format string argument mismatch in
/jdk/src/solaris/native/sun/awt/awt_GraphicsEnv.c
Reviewed-by: jgodinez, prr

! src/solaris/native/sun/awt/awt_GraphicsEnv.c

