I see an issue in the 3.0 codebase with the atomic increment and decrement routines:

- They take an "int" parameter, which isn't a fixed size across architectures.
- The Windows atomic op routines cast this to a long, which could really mess things up on 64-bit platforms if sizeof(int) != sizeof(long).
- The Mac atomic op routines cast this to a 32-bit value, which should work okay assuming sizeof(int) is 32 bits.
- The POSIX routines have no problem, since they just take a mutex around the manipulation and don't twiddle the bits directly.

In grepping the source, I don't see that either of these routines is even used at present, so this may not be the end of the world. But it's probably something we'd like to clean up for 3.0, since we have a chance to change the API.

A couple of options:

- Remove the routines altogether, if we really don't use them anymore.
- Replace the int parameter with XMLUInt32, for instance.
- Ignore the situation (which will undoubtedly cause somebody grief some day).

Thoughts?

James
