Darren Dale wrote:
On my system, SciPy's signbit function reports that the sign bit is not set
for any number, positive or negative. Could someone here help me understand
how to test the libc signbit function? I have to admit I have no experience
with C programming.
Hi Darren,
the signbit function is actually a macro (as the man page says), defined in
math.h, which in turn calls the right inline function (for the type
needed) defined in mathinline.h --- so as far as I can see, libc itself
should not be involved, only header files. I have attached a small
example below showing how to use the function. Please note the use of
-std=c99 (you may also use -std=gnu99): the macro is only activated
in C99 mode, and gcc's default mode is C89 ("ANSI C"). If you're
interested in the differences between the two standards, the Wikipedia
entry on C has some info:
http://en.wikipedia.org/wiki/C_programming_language
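If you are curious what those inline functions actually do, here is a
minimal sketch of the idea (my own illustration, not the real glibc code;
my_signbit_double is a hypothetical name, and it assumes IEEE 754 doubles,
where the sign bit is the most significant bit):

[cut]
#include <stdio.h>
#include <string.h>
#include <stdint.h>

/* Hypothetical helper, not part of libc: copy the double's bits into
   an integer and test the top bit, which on IEEE 754 doubles is the
   sign bit. */
static int my_signbit_double(double x) {
    uint64_t bits;
    memcpy(&bits, &x, sizeof bits);  /* well-defined way to inspect the bits */
    return (int)(bits >> 63);        /* sign bit is bit 63 */
}

int main(void) {
    printf("my_signbit_double(-0.0) = %d\n", my_signbit_double(-0.0));
    printf("my_signbit_double(1.5)  = %d\n", my_signbit_double(1.5));
    return 0;
}
[/cut]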
The test program (save it as signbit_test.c):
[cut]
#include <math.h>   /* the signbit macro lives here */
#include <stdio.h>

int main(void) {
    /* signbit(x) is nonzero if the sign bit of x is set, 0 otherwise;
       being a type-generic macro it works for float, double and
       long double alike. */
    printf("sign of 1.7 is %d\n", signbit(1.7));
    printf("sign of -1.1 is %d\n", signbit(-1.1));
    printf("sign of -0.0 is %d\n", signbit(-0.0));
    printf("sign of 0.0 is %d\n", signbit(0.0));
    return 0;
}
[/cut]
Compile it with:
gcc -Wall -std=c99 signbit_test.c -o signbit_test -lm
(-lm belongs after the source file so the linker resolves symbols in
order; since signbit is a macro it may not even be needed here, but it
does no harm.)
Run it with:
./signbit_test
It should produce output like this:
sign of 1.7 is 0
sign of -1.1 is -2147483648
sign of -0.0 is -2147483648
sign of 0.0 is 0
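By the way, don't let the -2147483648 confuse you: C99 only specifies
that signbit returns nonzero when the sign bit is set, so the exact
value is implementation-defined (here it looks like the raw masked sign
bit reinterpreted as a signed int). If you want a clean 0/1 result you
can force it yourself; a minimal sketch wrapping the same calls:

[cut]
#include <math.h>
#include <stdio.h>

int main(void) {
    /* !! collapses any nonzero value to 1 */
    printf("sign of -1.1 is %d\n", !!signbit(-1.1));
    printf("sign of 1.7 is %d\n", !!signbit(1.7));
    return 0;
}
[/cut]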
This was run with gcc 3.4.4 on amd64; if you want, I can try on an x86
install in qemu.
Marco