I'm debugging a problem with a game I'm trying to make work reliably
under Linux. Without the change I proposed, the application gets a
segmentation fault and exits.

The code that calls ggLock/ggUnlock with the NULL value is in the file
unix.c from the libgii-0.6/gii directory. I got the library as a tgz
file and built it from source so that I could build and debug the game.
A snippet of the code around the call in gii/unix.c is:


        if (timeout) {
                if (timeout->tv_usec == 0 && timeout->tv_sec == 0) {
                        zero_timeout = 1;
                } else {
                        gettimeofday(&origtv, NULL);
                }
        }

        /* Give the sources a try. */
        tmpmask = _giiPollall(inp, mask, NULL);  /* <<<<<<<<<<<< here it is */
        if (tmpmask) {
                return tmpmask;
        }

        /* Catch common case, avoid underflow and select() */
        if (zero_timeout) return 0;
_giiPollall, in turn, looks like this:


gii_event_mask
_giiPollall(struct gii_input *inp, gii_event_mask mask, void *arg)
{
....
   retmask |= (curr->GIIeventpoll(curr, arg) & mask);

so the NULL passed in above becomes the value of the void *arg
parameter and is handed on to each source's GIIeventpoll.

The routine behind curr->GIIeventpoll(...) isn't debuggable for me, so I
don't know exactly where it goes. The call chain eventually reaches
ggLock and ggUnlock, where the NULL value used as the mutex causes the
failure in the original version of the code.
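To convince myself that this really is the failure mode, I mocked it up
in a small stand-alone program. my_lock/my_unlock are names I made up to
imitate what I assume ggLock/ggUnlock do with a pthread mutex; this is
not the library's code, just a sketch of the same shape of guard my
change adds:

/* nullmutex.c -- build with: gcc -Wall -pthread nullmutex.c -o nullmutex */
#include <pthread.h>
#include <stdio.h>

/* Mock of a lock wrapper; the NULL check is the kind of guard I mean. */
static void my_lock(pthread_mutex_t *m)
{
        if (m == NULL) {
                fprintf(stderr, "my_lock: NULL mutex, skipping\n");
                return;
        }
        pthread_mutex_lock(m);
}

static void my_unlock(pthread_mutex_t *m)
{
        if (m == NULL) {
                fprintf(stderr, "my_unlock: NULL mutex, skipping\n");
                return;
        }
        pthread_mutex_unlock(m);
}

int main(void)
{
        static pthread_mutex_t real = PTHREAD_MUTEX_INITIALIZER;

        my_lock(&real);         /* normal case */
        my_unlock(&real);

        my_lock(NULL);          /* what the library appears to do; without
                                   the check, pthread_mutex_lock(NULL) is
                                   undefined behaviour and typically
                                   segfaults, just like the game does */
        my_unlock(NULL);

        return 0;
}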

Is there something obvious that avoids this problem? To me it looks
unavoidable, since the call is set up in a library I don't own. Why does
this tightly coupled library abuse the interface like this?
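One thing I've been experimenting with, so that I don't have to patch a
library I don't own, is interposing ggLock/ggUnlock with LD_PRELOAD.
This is only a sketch and assumes the functions take a single void * and
return void; if the real signatures differ, the shim is wrong too:

/* ggshim.c -- build: gcc -shared -fPIC -o ggshim.so ggshim.c -ldl
 * run:   LD_PRELOAD=./ggshim.so ./game
 */
#define _GNU_SOURCE
#include <dlfcn.h>
#include <stdio.h>

/* Forward to the real ggLock unless the mutex pointer is NULL. */
void ggLock(void *mutex)
{
        static void (*real_ggLock)(void *);

        if (!real_ggLock)
                real_ggLock = (void (*)(void *))dlsym(RTLD_NEXT, "ggLock");
        if (!mutex) {
                fprintf(stderr, "ggLock(NULL) called -- skipping\n");
                return;
        }
        real_ggLock(mutex);
}

void ggUnlock(void *mutex)
{
        static void (*real_ggUnlock)(void *);

        if (!real_ggUnlock)
                real_ggUnlock = (void (*)(void *))dlsym(RTLD_NEXT, "ggUnlock");
        if (!mutex) {
                fprintf(stderr, "ggUnlock(NULL) called -- skipping\n");
                return;
        }
        real_ggUnlock(mutex);
}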

Thanks in advance for anyone's help.

Chris Arena
Virginia Beach, VA USA
