On Fri, Aug 20, 2010 at 12:59 PM, Eric Pashman <[email protected]> wrote:

> To add to my previous message, I've just noticed in running through the HDF5
> build process again that ./configure returns this configuration:
> General Information:
> -------------------
>   HDF5 Version: 1.8.5
>  Configured on: Fri Aug 20 16:25:11 BST 2010
>  Configured by: [email protected]
> Configure mode: production
>    Host system: i386-apple-darwin10.4.0
>      Uname information: Darwin my.domain 10.4.0 Darwin Kernel Version
> 10.4.0: Fri Apr 23 18:28:53 PDT 2010; root:xnu-1504.7.4~1/RELEASE_I386 i386

I assume the uname information is just copied from the 'uname -a' output.
Maybe your kernel is running in 32-bit mode, which might be necessary with
older hardware (e.g., lacking 64-bit drivers). In principle you can still
run 64-bit apps, but it may take extra effort to convince 'configure'
scripts to build 64-bit versions. You can try running configure after
something like:

export CFLAGS='-m64'
or
export CFLAGS='-arch x86_64'
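
If configure does produce a 64-bit build, you can confirm it afterwards;
'file' and 'lipo -info' both report the architectures contained in a
Mach-O binary. The path below is the installation point from your
configure output, so adjust it if yours differs:

file /usr/local/hdf5/lib/libhdf5.dylib
lipo -info /usr/local/hdf5/lib/libhdf5.dylib

A correct build should show x86_64 rather than i386.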

>       Byte sex: little-endian
>      Libraries:
>     Installation point: /usr/local/hdf5
>
> That configuration is incorrect. I am running Mac OS 10.6 on an x86_64
> architecture. ./configure shows at various steps that it has found a 64-bit
> architecture and that it's looking at the gcc linker at
> /usr/libexec/gcc/i686-apple-darwin10/4.2.1/ld (which is correct), but it
> also reports
> checking dynamic linker characteristics... darwin10.4.0 dyld
> which possibly is incorrect. I say "possibly" because Apple's version of
> the gcc linker takes multi-architecture input files and outputs just one
> file for the correct architecture, though I don't know why it would
> report as above. I've also tried prefacing make with sudo env
> ARCHFLAGS="-arch x86_64", but the end result is the same.
> The tests wouldn't pass if it were building for the wrong architecture,
> anyway, so this is probably all a red herring. (Recall that I also tried
> the binaries on the website.) Has anyone installed HDF5 on Mac OS 10.6
> lately and seen this behavior?
> Regards,
> Eric
>
> On Fri, Aug 20, 2010 at 4:10 PM, Eric Pashman <[email protected]>
> wrote:
>>
>> I'm having all kinds of problems installing PyTables, whose developer
>> suggested I seek help here, as the problem seems related to my HDF5
>> installation or its use of SZIP.
>> First, my system configuration:
>> Mac OS 10.6.4
>> Intel Core 2 Duo (64-bit)
>> Python 2.7 (binaries from python.org)
>> Numpy 1.5.0b1 (binaries from scipy.org)
>> I've installed all of PyTables's dependencies, and those dependencies'
>> dependencies: Numexpr (1.4, via easy_install), bzip2 (1.0.5, built from
>> source), SZIP (2.1, built from source), LZO (2.03, built from source), zlib
>> (1.2.5, built from source), all without apparent problems. The stuff I built
>> from source passed all tests in make check/test.
>> I tried installing the HDF5 1.8.5 binaries available on the website, and
>> when that didn't work I built it from source. The tests in make check/test
>> all PASS, except a couple are SKIPPED. But when I try to install PyTables
>> (both via easy_install and via setup.py), things go wrong. This first step
>> yields no errors, but there's a scary warning about not finding
>> the HDF5 runtime:
>> tables-2.2 myuser$ sudo python setup.py build_ext --inplace
>> * Found numpy 1.5.0b1 package installed.
>> * Found numexpr 1.4 package installed.
>> * Found HDF5 headers at ``/usr/local/include``, library at
>> ``/usr/local/lib``.
>> .. WARNING:: Could not find the HDF5 runtime.
>>    The HDF5 shared library was *not* found in the default library
>>    paths. In case of runtime problems, please remember to install it.
>> * Found LZO 2 headers at ``/usr/local/include``, library at
>> ``/usr/local/lib``.
>> * Skipping detection of LZO 1 since LZO 2 has already been found.
>> * Found bzip2 headers at ``/usr/include``, library at ``/usr/lib``.
>> * Found pthreads headers at ``/usr/include``, library at ``/usr/lib``.
>> running build_ext
>> cythoning tables/linkExtension.pyx to tables/linkExtension.c
>> building 'tables.linkExtension' extension
>> The same thing happens if I use the --hdf5=whatever option. There's more
>> output that includes a bunch of architecture warnings, but no errors. That
>> looks OK to me because the warnings are all for ppc and i386 architectures,
>> and I should only need the x86_64 stuff. Then if I run Python and try to
>> load PyTables, I get this error:
>> >>> import tables
>> Traceback (most recent call last):
>>   File "<stdin>", line 1, in <module>
>>   File "tables/__init__.py", line 63, in <module>
>>     from tables.utilsExtension import getPyTablesVersion, getHDF5Version
>> ImportError: dlopen(tables/utilsExtension.so, 2): Symbol not found:
>> _SZ_BufftoBuffCompress
>>   Referenced from: /usr/local/src/tables-2.2/tables/utilsExtension.so
>>   Expected in: flat namespace
>>  in /usr/local/src/tables-2.2/tables/utilsExtension.so
>> I've ended up with the same error through every permutation of the
>> installation process that I've tried. I rebuilt HDF5 several times, and I
>> verified that it was finding the SZIP libraries; the HDF5 ./configure even
>> printed this line:
>> checking for SZ_BufftoBuffCompress in -lsz... yes
>> Yet that seems to be the stumbling block.  Does anyone know anything about
>> SZ_BufftoBuffCompress? Very little shows up on Google, and nothing on this
>> mailing list. I assume from the name that it is a SZIP thing, so I tried
>> reinstalling that several times, building HDF5 with and without the SZIP
>> flag, etc., all to no avail.
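>> (In case the exact checks are useful: the paths here are from my install,
>> so adjust as needed. These show whether libsz actually exports the symbol,
>> which architectures it was built for, and which dylibs the extension is
>> linked against:
>>
>> nm -g /usr/local/lib/libsz.dylib | grep SZ_BufftoBuffCompress
>> file /usr/local/lib/libsz.dylib
>> otool -L tables/utilsExtension.so
>>
>> nm -g lists external symbols; file and otool -L show the architectures
>> and linked dylibs.)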
>> I'd be grateful for any pointers.
>> Regards,
>> Eric
>

-- 
George N. White III <[email protected]>
Head of St. Margarets Bay, Nova Scotia

_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org
