FYI, I can't reproduce David's failures on my machine (Intel Core 2 Duo w/ 10.5.5):

* Python 2.6 from MacPorts
* numpy svn r6098
* GCC 4.0.1 (Apple Inc. build 5488)
I have only one failure:

FAIL: test_umath.TestComplexFunctions.test_against_cmath
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/opt/local/lib/python2.6/site-packages/nose-0.10.4-py2.6.egg/nose/case.py", line 182, in runTest
    self.test(*self.arg)
  File "/Users/pierregm/Computing/.pythonenvs/default26/lib/python2.6/site-packages/numpy/core/tests/test_umath.py", line 423, in test_against_cmath
    assert abs(a - b) < atol, "%s %s: %s; cmath: %s" % (fname, p, a, b)
AssertionError: arcsin 2: (1.57079632679-1.31695789692j); cmath: (1.57079632679+1.31695789692j)
----------------------------------------------------------------------

(Well, there's another one in numpy.ma.min, but that's a different matter.)

On Nov 25, 2008, at 2:19 AM, David Cournapeau wrote:

> On Mon, 2008-11-24 at 22:06 -0700, Charles R Harris wrote:
>>
>> Well, it may not be that easy to figure. The (generated) pyconfig-32.h has
>>
>> /* Define to 1 if your processor stores words with the most significant
>>    byte first (like Motorola and SPARC, unlike Intel and VAX).
>>
>>    The block below does compile-time checking for endianness on platforms
>>    that use GCC and therefore allows compiling fat binaries on OS X by
>>    using '-arch ppc -arch i386' as the compile flags. The phrasing was
>>    chosen such that the configure result is used on systems that don't
>>    use GCC.
>> */
>> #ifdef __BIG_ENDIAN__
>> #define WORDS_BIGENDIAN 1
>> #else
>> #ifndef __LITTLE_ENDIAN__
>> /* #undef WORDS_BIGENDIAN */
>> #endif
>> #endif
>>
>
> Hm, interesting: just by grepping, I do have WORDS_BIGENDIAN defined to
> 1 on *both* Python 2.5 and Python 2.6 on Mac OS X (running Intel).
> Looking closer, I do have the above code (conditional) in 2.5, but not
> in 2.6: it is unconditionally defined to BIGENDIAN on 2.6!!
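[Editor's note: the arcsin failure above is a branch-cut disagreement. For real x > 1 approached from above the cut, the C99 convention followed by Python's cmath gives asin(x) = pi/2 + i*acosh(x); the failing numpy build produced the complex conjugate. A minimal standard-library sketch of what the test compares against (this only demonstrates cmath's side; it makes no claim about any particular numpy build):]

```python
import cmath
import math

# For real x > 1, cmath follows the C99 branch-cut convention:
# asin(x) = pi/2 + i*acosh(x), i.e. a *positive* imaginary part.
x = 2.0
expected = complex(math.pi / 2, math.acosh(x))

result = cmath.asin(x)
print(result)  # approximately (1.5707963+1.3169579j), matching the "cmath:" value above

assert abs(result - expected) < 1e-12
assert result.imag > 0  # the failing numpy build returned -1.31695789692j instead
```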
> That's actually part of something I have wondered about for quite some
> time with fat binaries: how do you handle config headers, since they
> are generated only once for every fat binary, but they should really be
> generated for each arch?
>
>> And I guess that __BIG_ENDIAN__ is a compiler flag, it isn't in any
>> of the include files. In any case, this looks like a Python bug, or
>> the Python folks have switched their API on us.
>
> Hm, actually, it is a bug in numpy as much as in Python: Python should
> NOT include any config.h in its public namespace, and we should not
> rely on it.
>
> But with this info, it should be relatively easy to fix (by setting the
> correct endianness ourselves with some detection code).
>
> David
>
> _______________________________________________
> Numpy-discussion mailing list
> Numpy-discussion@scipy.org
> http://projects.scipy.org/mailman/listinfo/numpy-discussion
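[Editor's note: the "detection code" David mentions doesn't have to trust WORDS_BIGENDIAN at all; the byte order can be probed directly at run time. A hypothetical sketch of the idea in Python (numpy's actual fix lives in its C configuration headers; this just illustrates the probe):]

```python
import struct
import sys

def native_is_big_endian():
    """Probe byte order by packing 1 as a native-order ("=") 32-bit uint.

    On a big-endian machine the most significant (zero) byte comes first;
    on a little-endian machine the 0x01 byte comes first.
    """
    first_byte = struct.pack("=I", 1)[0]
    return first_byte == 0

# Sanity check against the interpreter's own report.
assert native_is_big_endian() == (sys.byteorder == "big")
print(sys.byteorder)
```

The same trick in C (casting the address of a known integer to unsigned char* and reading the first byte) works per architecture inside a fat binary, which is exactly what a once-generated config header cannot express.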