Re: [Hdf-forum] H5IMPORT test fail

2013-05-27 Thread Albert Cheng

Hi Dave,

Please try these:

1. ./configure ...
2. make          # build all binaries
3. make check    # run the tests

The reason for the failures you saw is that h5dump is arranged to be
built AFTER h5import.  When you ran make check directly, the h5dump
needed by the H5IMPORT test had not been built yet, resulting in the
"no such file" failure.

It was our mistake to arrange for h5dump to be built after h5import.
We will fix it in the next release.  Thank you for reporting this failure.

-Albert Cheng
The HDF Group

On 5/27/13 1:28 AM, Dave wrote:

Dear hdf5 users

I'm trying to build the HDF5 library on Ubuntu 12.04 64-bit LTS using gcc 4.7.3:

$ ./configure --prefix=/opt/hdf5 --with-szlib=/opt/hdf5 --with-zlib=/opt/hdf5
$ make check

('szlib' and 'zlib' are already installed in /opt/hdf5.)

And I got 8 failures while running 'make check', as shown below.

I've installed the HDF5 tools from the Ubuntu Software Center and
verified that 'h5dump' works fine, but the tests still fail.

It seems there are no posts about this problem in this forum.

Please help this newbie.

Thank you in advance.

==
H5IMPORT tests started
==
Testing ASCII I32 rank 3 - Output BE
h5importtestutil.sh: line 165: ../h5dump: Is a directory
h5importtestutil.sh: line 168: ../h5dump: No such file or directory
  PASSED
Testing ASCII I16 rank 3 - Output LE - CHUNKED - extended
h5importtestutil.sh: line 165: ../h5dump: Is a directory
h5importtestutil.sh: line 168: ../h5dump: No such file or directory
  PASSED
Testing ASCII I8 - rank 3 - Output I8 LE-Chunked+Extended+Compressed
h5importtestutil.sh: line 165: ../h5dump: Is a directory
h5importtestutil.sh: line 168: ../h5dump: No such file or directory
  PASSED
Testing ASCII UI16 - rank 2 - Output LE+Chunked+Compressed
h5importtestutil.sh: line 165: ../h5dump: Is a directory
h5importtestutil.sh: line 168: ../h5dump: No such file or directory
  PASSED
Testing ASCII UI32 - rank 3 - Output BE
h5importtestutil.sh: line 165: ../h5dump: Is a directory
h5importtestutil.sh: line 168: ../h5dump: No such file or directory
  PASSED
Testing ASCII F32 - rank 3 - Output LE
h5importtestutil.sh: line 165: ../h5dump: Is a directory
h5importtestutil.sh: line 168: ../h5dump: No such file or directory
  PASSED
Testing ASCII F64 - rank 3 - Output BE + CHUNKED+Extended+Compressed
h5importtestutil.sh: line 165: ../h5dump: Is a directory
h5importtestutil.sh: line 168: ../h5dump: No such file or directory
  PASSED
Testing BINARY F64 - rank 3 - Output LE+CHUNKED+Extended+Compressed
h5importtestutil.sh: line 165: ../h5dump: Is a directory
h5importtestutil.sh: line 168: ../h5dump: No such file or directory
  PASSED
Testing H5DUMP-BINARY F64 - rank 3 - Output
LE+CHUNKED+Extended+Compreh5importtestutil.sh: line 186: ../h5dump: Is a
directory
One or more of the required fields (RANK, DIMENSION-SIZES) missing.
Configuration parameters are invalid in dbinfp64.h5.dmp.
Error in processing the configuration file: dbinfp64.h5.dmp.
Program aborted.
HDF5-DIAG: Error detected in HDF5 (1.8.11) thread 0:
   #000: H5L.c line 824 in H5Lexists(): unable to get link info
 major: Symbol table
 minor: Object not found
   #001: H5L.c line 2765 in H5L_exists(): path doesn't exist
 major: Symbol table
 minor: Object already exists
   #002: H5Gtraverse.c line 861 in H5G_traverse(): internal path traversal
failed
 major: Symbol table
 minor: Object not found
   #003: H5Gtraverse.c line 755 in H5G_traverse_real(): component not found
 major: Symbol table
 minor: Object not found
*FAILED*
Testing BINARY I8 - rank 3 - Output I16LE +
Chunked+Extended+Compresseh5importtestutil.sh: line 165: ../h5dump: Is a
directory
h5importtestutil.sh: line 168: ../h5dump: No such file or directory
  PASSED
Testing H5DUMP-BINARY I8 - rank 3 - Output I16LE +
Chunked+Extended+Coh5importtestutil.sh: line 186: ../h5dump: Is a directory
One or more of the required fields (RANK, DIMENSION-SIZES) missing.
Configuration parameters are invalid in dbinin8.h5.dmp.
Error in processing the configuration file: dbinin8.h5.dmp.
Program aborted.
HDF5-DIAG: Error detected in HDF5 (1.8.11) thread 0:
   #000: H5L.c line 824 in H5Lexists(): unable to get link info
 major: Symbol table
 minor: Object not found
   #001: H5L.c line 2765 in H5L_exists(): path doesn't exist
 major: Symbol table
 minor: Object already exists
   #002: H5Gtraverse.c line 861 in H5G_traverse(): internal path traversal
failed
 major: Symbol table
 minor: Object not found
   #003: H5Gtraverse.c line 755 in H5G_traverse_real(): component not found
 major: Symbol table
 minor: Object not found
*FAILED*
Testing BINARY I16 - rank 3 - Output order LE + CHUNKED + extended
h5importtestutil.sh: line 165: ../h5dump: Is a directory
h5importtestutil.sh: line 168: ../h5dump: No such file or directory
  PASSED
Testing H5DUMP-BINARY I16 - rank 3 - Output order LE + CHUNKED +
extenh5importtestutil.sh: line 186: ../h5dump

Re: [Hdf-forum] building hdf5-1.8.10-patch1: H5_CFLAGS variable problems for Cray compiler

2013-05-07 Thread Albert Cheng

Hi Neil,

Since the configure process is executed on the front end, which is a
Linux system, configure naturally thinks it is working on a Linux
system, and therefore it uses the Linux configure settings.  I think
the right solution is to tell configure to build for a cross-compiling
Cray system.  This will take some effort and won't be available in the
pending release, v1.8.11.  We are planning to provide it in the next
release.

-Albert Cheng
The HDF Group

On 5/2/13 3:49 PM, Neil wrote:

Hi Albert,

We are using a Cray XK6 running Cray's Compute Linux Environment (CLE)
with the Cray CCE compiler. The front-end nodes run SLES Linux and the
compute nodes run CLE, so building HDF5 (or any other
application/library) is a cross compilation.

I can get the library to build successfully, both serial and parallel
versions, by hacking config/linux-gnulibcl to remove the inclusion of
the intel-flags and intel-fflags files (as stated previously). It would
be handy if the configure script could correctly identify the Cray CCE
compiler, as some users may not be aware that this is the problem.

We use pretty much the same method as described in
release_docs/INSTALL_parallel. As the doc says, the key is to set the
RUNPARALLEL and RUNSERIAL environment variables.

The Cray CCE compiler doesn't generate Fortran module files by default,
so you have to make sure that -em is set in FFLAGS. For CFLAGS and
CXXFLAGS we also have to set -h tolerant and -h system_alloc.
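Pulled together, a hedged sketch of such a configure invocation might look like the following; the aprun launcher settings and the exact placement of the options are my assumptions, not taken from Neil's message:

```shell
# Hypothetical Cray CCE configure sketch, based on the flags mentioned above.
# Module names, launcher arguments, and paths vary by site.
export FFLAGS="-em"                          # CCE: emit Fortran .mod files
export CFLAGS="-h tolerant -h system_alloc"
export CXXFLAGS="-h tolerant -h system_alloc"
export RUNSERIAL="aprun -n 1"                # run serial tests on a compute node
export RUNPARALLEL="aprun -n 6"              # run parallel tests on compute nodes
./configure --enable-parallel --enable-fortran
```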

I'll try to post our configure options for both serial and parallel builds
tomorrow when I next have access to the system in case this is of help.

Cheers,

Neil.




--
View this message in context: 
http://hdf-forum.184993.n3.nabble.com/building-hdf5-1-8-10-patch1-H5-CFLAGS-variable-problems-for-Cray-compiler-tp4026036p4026154.html
Sent from the hdf-forum mailing list archive at Nabble.com.

___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org






Re: [Hdf-forum] HDF5 on Redhat

2013-05-02 Thread Albert Cheng
HDF5 configure detects that you are using a parallel MPI compiler
(.../mpicc) and turns on --enable-parallel by default, which sets
RUNPARALLEL to .../mpirun.  Your version of mpirun probably needs mpd
to be present, hence the failure.  (Note: this is an educated guess,
not necessarily what happened on your machine.)

If you want to use parallel HDF5, you need to set up your environment
to run MPI programs.  You should check with your system administrator
how to do that on your machine.


If you don't mean to build parallel HDF5, then don't use mpicc to build
HDF5.  But if you want to use mpicc without building parallel HDF5,
then you should do:

CC=.../mpicc ./configure --disable-parallel ...

Hope this helps.

-Albert Cheng
The HDF Group

On 5/1/13 6:03 PM, brown wrap wrote:


I have just compiled HDF5 1.8.10 on RHEL Server 5.9. If I compile it 
with gcc and run the test, make check, everything is fine. If I 
compile it using:


CC=/share/apps/opt/intel/impi/4.1.0.024/intel64/bin/mpicc ./configure 
--prefix=/share/apps/hdf5-impi-intel-64


I get the following errors:

cannot connect to local mpd (/tmp/mpd2.console_ramos); possible causes:
  1. no mpd is running on this host
  2. an mpd is running but was started without a console (-n option)
Command exited with non-zero status 255
0.14user 0.02system 0:00.23elapsed 73%CPU (0avgtext+0avgdata 
37520maxresident)k

0inputs+0outputs (0major+5476minor)pagefaults 0swaps
make[4]: *** [t_mpi.chkexe_] Error 1
make[4]: Leaving directory `/users/ramos/hdf5-1.8.10/testpar'
make[3]: *** [build-check-p] Error 1
make[3]: Leaving directory `/users/ramos/hdf5-1.8.10/testpar'
make[2]: *** [test] Error 2
make[2]: Leaving directory `/users/ramos/hdf5-1.8.10/testpar'
make[1]: *** [check-am] Error 2
make[1]: Leaving directory `/users/ramos/hdf5-1.8.10/testpar'
make: *** [check-recursive] Error 1


___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org




Re: [Hdf-forum] building hdf5-1.8.10-patch1: H5_CFLAGS variable problems for Cray compiler

2013-05-02 Thread Albert Cheng

Hi Neil,

What kind of Cray and OS version do you use?
Does it have a frontend for compiling and a backend for compute?
If so, what kind of machine and OS version is the frontend?

There are problems with the configure process for the Cray platform
that we will not be able to fix in time for this pending release.
We plan to fix them in the next release.

Meanwhile, can you follow the instructions in
release_docs/INSTALL_parallel, under section 2.4, Hopper (Cray XE6)
(for v1.8 and later), to see if you can build HDF5 on your Cray?

By the way, are you building the serial HDF5 library or Parallel HDF5,
which requires MPI support?

-Albert Cheng
The HDF Group

On 4/30/13 4:57 AM, Neil wrote:


Hi again,

As promised, here is the version output information for the Cray
compiler (CCE). Note: this version information is obtained with the
command cc -V; cc -v and cc --version are not supported.

$ cc -V
/opt/cray/xt-asyncpe/5.17/bin/cc: INFO: Compiling with CRAYPE_COMPILE_TARGET=native.

Cray C : Version 8.1.7  Tue Apr 30, 2013  09:48:17

I have also tried HDF5 1.8.11-pre2, but it also incorrectly identifies
the Cray compiler as the Intel compiler.


Hope this helps,

Neil.



View this message in context:
http://hdf-forum.184993.n3.nabble.com/building-hdf5-1-8-10-patch1-H5-CFLAGS-variable-problems-for-Cray-compiler-tp4026036p4026141.html
Sent from the hdf-forum mailing list archive at Nabble.com.



___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org




[Hdf-forum] HDF5 1.8.11 release candidate is available for testing

2013-04-30 Thread Albert Cheng

Dear HDF users,

A pre-release candidate version of HDF5 1.8.11 is available for testing
and can be downloaded at the following link:

http://www.hdfgroup.uiuc.edu/ftp/pub/outgoing/hdf5/hdf5-1.8.11/hdf5-1.8.11-pre2.tar.gz

For Windows users, there is an equivalent zip file:

http://www.hdfgroup.uiuc.edu/ftp/pub/outgoing/hdf5/hdf5-1.8.11/hdf5-1.8.11-pre2.zip

The tarball contains fixes for a few CMake problems reported on the
forum for the previous release candidate. In particular, the requested
enhancement for building parallel HDF5 with CMake has been addressed.


Last time we failed to announce that the HDF5 1.8.11 release will
contain two new features that may be of interest to anyone who works
with compressed data in HDF5.

- Writing a compressed chunk while bypassing the HDF5 library
internals, for better speed.  We added a high-level function,
H5DOwrite_chunk, documented here:
http://www.hdfgroup.org/HDF5/doc_test/H51.8docs/Advanced/DirectChunkWrite/UsingDirectChunkWrite.pdf

- Dynamically loaded filters, which allow a compression library to be
loaded at run time.  With this feature, users will be able to use the
HDF5 library and tools distributed by The HDF Group with third-party
HDF5 compression methods (and other filters).  The document that
describes the feature, along with the BZIP2 plugin example, is
available from:
http://www.hdfgroup.org/HDF5/doc_test/H51.8docs/Advanced/DynamicallyLoadedFilters/HDF5DynamicallyLoadedFilters.pdf



If you have some time to test this pre-release, we would greatly 
appreciate it. Newly reported issues will be documented and will be 
addressed in the next release.


Thank you!

-Albert Cheng
The HDF Group
___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] building hdf5-1.8.10-patch1: H5_CFLAGS variable problems for Cray compiler

2013-04-26 Thread Albert Cheng

Hi Neil,

I worked on a Cray about 3 to 4 years ago. Back then, cc and ftn were
actually cross compilers provided by PGI.  Maybe they have changed and
are now provided by Intel.

One of our staff has more recent experience with a Cray XE6
(hopper.nersc.gov) running the Cray Linux Environment (CLE) with
PrgEnv-pgi/4.0.46:

pgcc 12.5-0 64-bit target on x86-64 Linux -tp shanghai
pgf90 12.5-0 64-bit target on x86-64 Linux -tp shanghai
pgCC 12.5-0 64-bit target on x86-64 Linux -tp shanghai


We are in a pre-release testing for v1.8.11 and a pre-release source 
tarball is available at:
http://www.hdfgroup.uiuc.edu/ftp/pub/outgoing/hdf5/hdf5-1.8.11/hdf5-1.8.11-pre1.tar.gz 



After you have downloaded the tar file, untar it into hdf5-1.8.11-pre1/,
cd into it, and try to follow the instructions in
release_docs/INSTALL_parallel, under 2.4. Red Storm (Cray XT3) (for
v1.8 and later) or 2.5. Hopper (Cray XE6) (for v1.8 and later).

Section 2.4 is more detailed, but it is old information for Red Storm.
Section 2.5 is current.  Please see which one is more relevant to your
Cray.  (And let me know too.)

If it still fails for you in any way, please send me all of the
following information:
1. Machine information (Cray model and OS)
2. All commands used and their output.
3. Compiler versions (e.g., the output of cc --version)

Thanks.

-Albert Cheng
The HDF Group

On 4/24/13 9:57 AM, Neil wrote:

Hi,

I too have had the same problem, with both HDF5 1.8.5 and 1.8.8, with
the Cray compiler.

The -std=c99 flag gets added because the HDF5 configure script
incorrectly identifies the Cray compiler as the Intel compiler and
consequently sets a load of compiler flags that are intended for the
Intel compilers.

To work around this, edit config/linux-gnulibcl and comment out the two
lines that source the files named intel-flags and intel-fflags.

It would be handy if future versions of HDF5 could detect the Cray
compiler correctly.

Hope this helps,

Neil.



--
View this message in context: 
http://hdf-forum.184993.n3.nabble.com/building-hdf5-1-8-10-patch1-H5-CFLAGS-variable-problems-for-Cray-compiler-tp4026036p4026136.html
Sent from the hdf-forum mailing list archive at Nabble.com.

___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org






Re: [Hdf-forum] HDF5 1.8.11 release candidate is available for testing -- HighLevel library missing

2013-04-19 Thread Albert Cheng

Hi Pedro,

Thank you for your feedback.
I think you are on a Windows system but could you give more details like
OS versions, platform specifics (32, 64 bits,... ), compilers, ...

I will check with our Cmake person of your suggestions.
A question about your suggestion 2.  Wouldn't it be better
to use folder names to specify different versions?  E.g.,

...\hdf5-1.8.11-pre1\HDF5.sln
...\hdf5-1.8.10\HDF5.sln

On 4/18/13 9:47 PM, Pedro Vicente wrote:

Hi Albert

I tried the CMake build with

T:\hdf5-1.8.11-pre1\build> cmake ..

and, unlike the previously reported hdf5-1.8.10 errors, there were no
errors this time.  However, the High-Level library is not included in
the build.

The netCDF build requires the Dimension Scales library, so this is a 
must for netCDF users.
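As a point of reference, HDF5's CMake builds expose a cache option for the high-level library; the sketch below is hedged (the option name HDF5_BUILD_HL_LIB is taken from later HDF5 releases and may not exist in 1.8.11-pre1):

```shell
# Hypothetical: ask CMake to build the high-level (HL) library, which
# contains the Dimension Scales API (H5DS*), then build the solution.
cmake -DHDF5_BUILD_HL_LIB:BOOL=ON ..
cmake --build . --config Release
```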


1) Would it be possible to have a CMake build for the High-Level
library?

2) I have a suggestion: could you name the CMake-generated Visual
Studio solution with the HDF5 release version, like

hdf5-1.8.11-pre1.sln

instead of

HDF5.sln

This is helpful for anyone who uses several versions.

Thanks

Pedro



--
Pedro Vicente, Earth System Science
University of California, Irvine
http://www.ess.uci.edu/


----- Original Message -----
From: Albert Cheng ach...@hdfgroup.org
To: HDF Users Discussion List hdf-forum@hdfgroup.org
Sent: Friday, April 12, 2013 1:59 PM
Subject: [Hdf-forum] HDF5 1.8.11 release candidate is available for testing




Hello everyone,

A pre-release candidate version of HDF5 1.8.11 is available for 
testing and can be downloaded at the following link:


http://www.hdfgroup.uiuc.edu/ftp/pub/outgoing/hdf5/hdf5-1.8.11/hdf5-1.8.11-pre1.tar.gz 



If you have some time to test this pre-release, we would greatly 
appreciate it. We try to test on a wide variety of platforms and 
environments but are unable to test everywhere so feedback from the 
user community is always welcome.


Please note that while the release notes contained in the pre-release 
are reflective of the changes and additions present in this release, 
the 'platforms tested' and 'tested configurations' sections have yet 
to be updated for this version of HDF5.


We plan to release HDF5 1.8.11 in mid-May barring the discovery of 
any critical issues.


Thank you!

The HDF Group


___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org











[Hdf-forum] HDF5 1.8.11 release candidate is available for testing

2013-04-12 Thread Albert Cheng

Hello everyone,

A pre-release candidate version of HDF5 1.8.11 is available for testing 
and can be downloaded at the following link:


http://www.hdfgroup.uiuc.edu/ftp/pub/outgoing/hdf5/hdf5-1.8.11/hdf5-1.8.11-pre1.tar.gz

If you have some time to test this pre-release, we would greatly 
appreciate it. We try to test on a wide variety of platforms and 
environments but are unable to test everywhere so feedback from the user 
community is always welcome.


Please note that while the release notes contained in the pre-release 
are reflective of the changes and additions present in this release, the 
'platforms tested' and 'tested configurations' sections have yet to be 
updated for this version of HDF5.


We plan to release HDF5 1.8.11 in mid-May barring the discovery of any 
critical issues.


Thank you!

The HDF Group


___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org


Re: [Hdf-forum] [Paraview] Fortran wrapper of HDF5

2013-03-15 Thread Albert Cheng
Years ago, I heard one explanation for the seeming contradiction
between the C order and the Fortran order: the C order is
computer-science based, while the Fortran order is mathematics based.
Here is what that means.

The C order, aka row-major order, is "last index changes first."  So a
10-by-10 two-dimensional array is stored in memory like this:
(0,0), (0,1), ..., (0,9), (1,0), (1,1), ..., (9,9)
which is as natural as 00, 01, ..., 09, 10, 11, ..., 99.

The Fortran order, aka column-major order, is "first index changes
first."  So a 10-by-10 two-dimensional array is stored in memory like
this:
(1,1), (2,1), ..., (10,1), (1,2), (2,2), ..., (10,10)
which seems unnatural.  But recall the Cartesian coordinate system from
high school mathematics: the x-axis is horizontal, the y-axis is
vertical, and points on the Cartesian plane are (x, y):
(1,1), (2,1), ..., (10,1), ...
Since Fortran, first known as FORTRAN, was derived from The IBM
Mathematical Formula Translating System, the Fortran order is natural
too.

One more comment: the two orderings, though they may seem like a simple
rotation in two dimensions, are not really the same in physics for
higher dimensions.  The C order follows the left-hand rule, while the
Fortran order follows the right-hand rule.  If you mix them up, you may
end up operating on the wrong side of the human heart. :-)

-Albert

On 3/14/13 11:57 PM, Elena Pourmal wrote:

John,

I think you explained it well.

It looks like Jason is still not following, so let me try to explain
one more time.

To address Jason's question:
"I've read the section of the hdf5 page that Pradeep linked to (C vs
Fortran ordering) several times, and for the life of me, I can't figure
out whether the hdf5 format ordering is transparent to the application
or not"

Yes, it is transparent.

Think about an N-dim array in any language as a 1-dim flattened array.
Transparency means that one always gets the same 1-dim flattened array
of data in any programming language used.


The dimensions of the user's array (the dataset dimensions) are stored
in the HDF5 file as a 1-dim array, following the convention that the
last element of that array holds the size of the fastest-changing
dimension of the user's array.  The file format is INDEPENDENT of the
language the array is written from (and yes, it turned out to be the C
convention :-).

When an array is written from Fortran, the value of its first dimension
is stored in the last element of the 1-dim array in the HDF5 file.

When the Fortran library queries the dimensions of an array (the
dataset dimensions), it knows, as does the C library, that the last
element of the 1-dim array in the file is the size of the
fastest-changing dimension of the user's array.  That will be the first
dimension for the Fortran array and the last dimension for the C array.
I.e., the Fortran wrappers flip dimensions but NEVER touch the user's
data.

If one reads data written by a Fortran application using a C
application (or vice versa), the data will appear to be transposed, but
it is the same data if you think about it as a 1-dim flattened array!

Is it even more confusing now? :-)

Elena



~~
Elena Pourmal  The HDF Group http://hdfgroup.org
1800 So. Oak St., Suite 203, Champaign IL 61820
217.531.6112
~~



On Mar 13, 2013, at 2:28 PM, Biddiscombe, John A. wrote:

I'm forwarding this to the HDF mailing list in case someone can explain
it better, or correct me.

You write a Fortran array with dimensions, say, arr[10,20,30] {x=10,
y=20, z=30}, and it goes onto disk as an array with dimension sizes

0 = 10
1 = 20
2 = 30

The convention is simply that the dimensions are listed in row-major
order, so if you read the data in C into an array

arr[z][y][x]

then everything will look fine.  Nine times out of ten, I've found that
Fortran arrays are declared as arr[x,y,z] and C arrays as arr[z][y][x],
so the programmer has already flipped the order of the dimensions and
everything works out.  All HDF does is say that the dimensions are
listed in row-major order; you can interpret the data how you like.
HDF doesn't say "this is X, this is Y, this is Z"; it only says dim 0
is size 10, dim 1 is size 20, dim 2 is size 30.  You may add metadata
yourself to tell the user which "should" be X/Y/Z if you wish.

If the OP has data physically stored as a Fortran array[z,y,x], then
the data will be written out transposed relative to what we expect when
reading into ParaView, and transposing will be necessary (we won't go
into coordinate transforms to achieve the same flipping at the graphics
end).

Did I get that right?

JB


-----Original Message-----
From: paraview-boun...@paraview.org [mailto:paraview-boun...@paraview.org] On Behalf Of Jason Fleming
Sent: 13 March 2013 19:51
To: parav...@paraview.org
Subject: Re: [Paraview] Fortran wrapper of HDF5

I've read the section of the hdf5 page that Pradeep linked 

Re: [Hdf-forum] make check FAILED testlibinfo.sh on Mac OS X 10.7.5 with GNU and Intel compilers

2013-02-04 Thread Albert Cheng

(George, thanks for the good information.)
Pat,

I copied a copy of libhdf5.7.dylib I generated in a Darwin 11.4.2 (Lion)
to a Darwin 10.8.0 (Snow Leopard) and did
$ strings libhdf5.7.dylib  /dev/null
strings: object: libhdf5.dylib malformed object (unknown load command 8)

Puzzled, I read man strings and found this:
Description
   Strings  looks  for  ASCII strings in a binary file or standard 
input.
   Strings is useful for identifying random object files and many  
other
   things.   A string is any sequence of 4 (the default) or more 
printing
   characters ending with a newline or a null.   Unless  the -  
flag  is
   given,  strings  looks  in _all sections of the object files_ 
except the
   (__TEXT,__text) section.  If no files are specified standard 
input  is

   read.

So, I guess strings detects libhdf5.7.dylib is an object and tries to
find the sections.  But since the object file is generated in a newer
OS (Lion), the older strings in Snow Leopard does not understand the
newer format.

I then tried this on the Snow Leopard system (note the input
redirection):
$ strings < libhdf5.dylib | grep SUMMARY
SUMMARY OF THE HDF5 CONFIGURATION

My suggestions for you:
1. Ask your system support staff to update the strings command on your
system;

or

2. Edit the test/testlibinfo.sh file by changing this line:

if strings $1 | grep "SUMMARY OF THE HDF5 CONFIGURATION" > /dev/null; then

to

if strings < $1 | grep "SUMMARY OF THE HDF5 CONFIGURATION" > /dev/null; then

(note the added '<': reading from standard input keeps strings from
trying to parse the object file format)

Meanwhile, I will update the future release of HDF5 to use this
solution.  It takes a few microseconds longer to run, but it avoids
this system-version issue.

-Albert

On 2/3/13 12:37 PM, George N. White III wrote:

It looks like you have an old version of the strings program.

On Snow Leopard with Xcode 3.2 I have:

$ /usr/bin/strings /usr/bin/strings | grep cctools
cctools-750
$ /usr/bin/strings /opt/local/bin/strings | grep cctools
cctools-836

$ port provides /opt/local/bin/strings
/opt/local/bin/strings is provided by: cctools
$ port info cctools
cctools @836 (devel)
Variants: llvm29, llvm30, [+]llvm31, llvm32, universal

Description:  A set of essential tools to support development on Mac OS
              X and Darwin.  Conceptually similar to binutils on other
              platforms.
Homepage: http://opensource.apple.com/source/cctools/

Build Dependencies:   ld64
Library Dependencies: cctools-headers
Runtime Dependencies: llvm-3.1
Platforms:darwin
License:  APSL-2 GPL-2+
Maintainers:  jerem...@macports.org, openmaintai...@macports.org


cctools-836 works, but cctools-750 fails.

$ /opt/local/bin/strings /opt/local/lib/libhdf5.7.dylib | grep SUMMARY
SUMMARY OF THE HDF5 CONFIGURATION

$ /usr/bin/strings /opt/local/lib/libhdf5.7.dylib | grep SUMMARY
/usr/bin/strings: object: /opt/local/lib/libhdf5.7.dylib malformed object (unknown load command 8)


I have also run into situations where Apple's ld64 fails but the
MacPorts build of the Apple open-source tool works (e.g.,
LDFLAGS=-B/opt/local/libexec/ld64).





On Fri, Feb 1, 2013 at 11:53 AM, pkabl...@atmos.umd.edu 
mailto:pkabl...@atmos.umd.edu wrote:


Thanks for the advice, Albert.

I get the same error when I execute the if statement from the command
line.  However, I get no error when I execute:
$ strings /etc/group > /dev/null

So, any idea why libhdf5.7.dylib would be a 'malformed object'?

I have included the output from make below with the hope that it
may help.

-Pat

Making all in src
make[1]: Entering directory
`/Users/pkablick/Downloads/tarballs/hdf5-1.8.10-patch1-intel-serial/src'
make  all-am
make[2]: Entering directory
`/Users/pkablick/Downloads/tarballs/hdf5-1.8.10-patch1-intel-serial/src'



...[skipped]







> On 1/30/13 4:36 PM, acheng at hdfgroup.org wrote:
> The testlibinfo.sh does something simple in the first test.  It just
> does this:
> if strings ../src/.libs/libhdf5.7.dylib | grep "SUMMARY OF THE HDF5
> CONFIGURATION" > /dev/null; then
>     echo " PASSED"
> else
>     echo "*FAILED*"
>     nerrors=`expr $nerrors + 1`
> fi
> From the message you showed, I think the error was from the command
> strings.
> Can you cd test and hand run the above commands?  (They are Bourne sh
> style commands.  So, if you use cshell, do a sh first.)
> If you see the same error message, try this:
> $ strings /etc/group > /dev/null
> If you see the same error message again, something is wrong with the
> strings command executable.  In that case, check with your system
> support staff.
> Let me know how it turns out.
> -Albert Cheng
>
>> On 1/28/13 12:02 PM, pkablick at atmos.umd.edu wrote:
>> Hello,
>> I'm trying to build HDF5 on Mac OS

Re: [Hdf-forum] make check FAILED testlibinfo.sh on Mac OS X 10.7.5 with GNU and Intel compilers

2013-01-30 Thread Albert Cheng
The testlibinfo.sh does something simple in the first test.  It just
does this:

if strings ../src/.libs/libhdf5.7.dylib | grep "SUMMARY OF THE HDF5 CONFIGURATION" > /dev/null; then
    echo " PASSED"
else
    echo "*FAILED*"
    nerrors=`expr $nerrors + 1`
fi

From the message you showed, I think the error came from the strings
command.  Can you cd to test and run the above commands by hand?
(They are Bourne sh style commands, so if you use cshell, do a sh
first.)

If you see the same error message, try this:
$ strings /etc/group > /dev/null

If you see the same error message again, something is wrong with the
strings executable.  In that case, check with your system support
staff.

Let me know how it turns out.

-Albert Cheng

-Albert Cheng

On 1/28/13 12:02 PM, pkabl...@atmos.umd.edu wrote:

Hello,

Noob here.  I'm trying to build HDF5 on Mac OS X 10.7.5, but I've run into
a problem that I haven't been able to fix.  Any help or suggestions are
appreciated.

I've tried to build with both GNU and Intel compilers, and I get the same
failure with both compiler suites on make check:




Testing testlibinfo.sh
*** Error ignored
Finished testing testlibinfo.sh

testlibinfo.sh  Test Log

Check file ../src/.libs/libhdf5.7.dylib
strings: object: ../src/.libs/libhdf5.7.dylib malformed object (unknown
load command 15)
   FAILED
Check file ../src/.libs/libhdf5.a  PASSED
Check file testhdf5-SKIP-
***1 errors encountered***
 0.31 real 0.03 user 0.05 sys

Finished testing testlibinfo.sh




Based on suggestions from other fora, here is some information about the
dynamic library using file, otool and lipo:


$ file src/.libs/libhdf5.7.dylib
src/.libs/libhdf5.7.dylib: Mach-O 64-bit dynamically linked shared library
x86_64

$ otool -L src/.libs/libhdf5.7.dylib
src/.libs/libhdf5.7.dylib:
/usr/local/lib/libhdf5.7.dylib (compatibility version 8.0.0, current
version 8.4.0)
/usr/lib/libz.1.dylib (compatibility version 1.0.0, current version
1.2.5)
/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version
159.1.0)
mac64/libcilkrts.5.dylib (compatibility version 0.0.0, current version
0.0.0)
/usr/lib/libstdc++.6.dylib (compatibility version 7.0.0, current version
52.0.0)
/usr/lib/libgcc_s.1.dylib (compatibility version 1.0.0, current version
1094.0.0)

$ lipo -info src/.libs/libhdf5.7.dylib
Non-fat file: src/.libs/libhdf5.7.dylib is architecture: x86_64


I run the following bash script to perform the build:



#!/bin/bash

# version of HDF5
export HDF5=hdf5-1.8.10-patch1

# location of installation
export PREFIX=/usr/local

# location of compiler
export COMPREFIX=/usr

# c preprocessor
export CPP='icc -E'

# c compiler
export CC=$COMPREFIX/bin/icc

# archiver and linker
export AR=$COMPREFIX/bin/xiar
export LD=$COMPREFIX/bin/xild

# c flags
export CFLAGS='-O0'

# linker flags
export LDFLAGS=-L$COMPREFIX/lib

# compression libraries used by HDF5
export ZLIB=$COMPREFIX

rm -rf $HDF5-intel-serial
tar xzf $HDF5.tar.gz
mv $HDF5 $HDF5-intel-serial
cd $HDF5-intel-serial

./configure --prefix=$PREFIX --with-zlib=$ZLIB

# tell make to continue even if errors are encountered
export HDF5_Make_Ignore=yes

# use shared libraries
export HDF5_USE_SHLIB=yes

make all
make check



Also, here is some system information that may be helpful:


$ uname -a
Darwin kestrel.local 11.4.2 Darwin Kernel Version 11.4.2: Thu Aug 23
16:25:48 PDT 2012; root:xnu-1699.32.7~1/RELEASE_X86_64 x86_64

$ which icc && which ifort && which icpc
/usr/bin/icc
/usr/bin/ifort
/usr/bin/icpc

$ icc --version && ifort --version && icpc --version
icc (ICC) 13.0.1 20121010
Copyright (C) 1985-2012 Intel Corporation.  All rights reserved.

ifort (IFORT) 13.0.1 20121010
Copyright (C) 1985-2012 Intel Corporation.  All rights reserved.

icpc (ICC) 13.0.1 20121010
Copyright (C) 1985-2012 Intel Corporation.  All rights reserved.

$ which gcc && which gfortran && which g++
/usr/local/bin/gcc
/usr/local/bin/gfortran
/usr/local/bin/g++

$ gcc --version && gfortran --version && g++ --version
gcc (GCC) 4.8.0 20120603 (experimental)
Copyright (C) 2012 Free Software Foundation, Inc.

GNU Fortran (GCC) 4.8.0 20120603 (experimental)
Copyright (C) 2012 Free Software Foundation, Inc.

g++ (GCC) 4.8.0 20120603 (experimental)
Copyright (C) 2012 Free Software Foundation, Inc.





___
Hdf-forum is for HDF software users discussion.
Hdf-forum@hdfgroup.org
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org





Re: [Hdf-forum] 1.8.10 pre-release problem

2012-10-19 Thread Albert Cheng

Hi Scott,

We are not using cmake to build the HDF5 library on our Mac systems yet.
I have a build of h5dump on my Mountain Lion system and used it
to dump the file you provided.  It worked fine (see the output h5dump.out
in the FTP space below).

Here is my solution, for now:
1. I have deposited my h5dump executable and your hdf5 data file in my
FTP outgoing directory, as below.  Please download them and try them out.
   ftp://ftp.hdfgroup.uiuc.edu/pub/outgoing/acheng/h5dump/

md5sum output to verify you got the right files:
da2f13df506878b406146524eff7d3da  diblock2s_History.h5
8fe01cd79b43ba6f528ee7be314da2b1  h5dump
7d1047e6339ad415f3a2cecc8c3aa4ab  h5dump.out
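The checksums above can be verified with `md5sum -c`; a self-contained
illustration of the workflow (a hypothetical demo file, not the actual
downloads):

```shell
# Create a demo file and record its checksum, then verify it.
printf 'demo payload\n' > demo.txt
md5sum demo.txt > demo.md5      # writes "<digest>  demo.txt"
md5sum -c demo.md5              # prints "demo.txt: OK" when the file matches
```

If the file is later modified, `md5sum -c` reports FAILED and exits nonzero.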

2. If my executable works on your system, try using the following
steps to build your own and see if it works:
   ./configure; make; make check; make install

Let me know if this solution works for you, for now.

-Albert Cheng

On 10/17/12 5:37 PM, Scott Sides wrote:

thanks a lot. Let me know if you find a solution and
if you need any more info.

Scott

On 10/17/12 4:31 PM, Peter Cao wrote:

Hi Scott,

Thank you for the information and the sample file. We will build 
1.8.10 with

the same configuration and try to reproduce the problem.

Thanks
--pc

On 10/17/2012 3:27 PM, Scott Sides wrote:

Here's the info from our build/configure system for hdf5-1.8.10.
I didn't do the 'make check' test. h5dump seg faults when trying to
access the file.


Scott

#!/bin/bash

# Clear cmake cache to ensure a clean configure.
rm -rf CMakeFiles CMakeCache.txt

'/Users/swsides/contrib/cmake/bin/cmake' \
  -DCMAKE_INSTALL_PREFIX:PATH=/Users/swsides/contrib/hdf5-1.8.10-pre1-ser \
  -DCMAKE_BUILD_TYPE:STRING=Release \
  -DCMAKE_COLOR_MAKEFILE:BOOL=FALSE \
  -DCMAKE_VERBOSE_MAKEFILE:BOOL=TRUE \
  -DCMAKE_C_COMPILER:FILEPATH='/usr/bin/gcc' \
  -DCMAKE_CXX_COMPILER:FILEPATH='/usr/bin/g++' \
  -DCMAKE_Fortran_COMPILER:FILEPATH='/usr/local/bin/gfortran' \
  -DCMAKE_C_FLAGS:STRING='-pipe -fPIC' \
  -DCMAKE_CXX_FLAGS:STRING='-pipe -fPIC' \
  -DCMAKE_Fortran_FLAGS:STRING='-fPIC' \
  -DHDF5_BUILD_TOOLS:BOOL=ON \
  -DHDF5_BUILD_HL_LIB:BOOL=ON \
  -DHDF5_BUILD_FORTRAN:BOOL=ON \
  /Users/swsides/chompstall/builds/hdf5-1.8.10-pre1


This is the command I used for looking at the file that's attached:
/Users/swsides/contrib/hdf5-1.8.10-pre1-ser/bin/h5dump diblock2s_History.h5


Thanks!
Scott




On 10/17/12 2:19 PM, Albert Cheng wrote:

Hi Scot,

I would like to ask for some more information about h5dump seg fault
in your Mountain Lion system.

Did you also use gcc compilers to build hdf5 v1.8.10 pre-release in 
your

mountain lion system?

What configure option(s) did you use to build the library and tools?

How does h5dump seg fault on you?
Did it seg fault during make check?
Or did it seg fault when you used it to access your own HDF5 data 
files?


If the former (during make check), can you please send me whatever
output it shows when it seg faulted.

If the latter, any chance I may get a copy of your HDF5 files and
the command you used?

Thanks.

-Albert

On 10/17/12 11:28 AM, Scott Sides wrote:

Dave, thanks for pointing that out. I had two windows open.
Here are the results for my Mac Lion.

polymer:~ swsides$ gfortran --version ;gcc --version ; uname -a
GNU Fortran (GCC) 4.6.2 20111019 (prerelease)
Copyright (C) 2011 Free Software Foundation, Inc.

GNU Fortran comes with NO WARRANTY, to the extent permitted by law.
You may redistribute copies of GNU Fortran
under the terms of the GNU General Public License.
For more information about these matters, see the file named COPYING

i686-apple-darwin11-llvm-gcc-4.2 (GCC) 4.2.1 (Based on Apple Inc. 
build 5658) (LLVM build 2336.1.00)

Copyright (C) 2007 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. 
There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR 
PURPOSE.


Darwin polymer.txcorp.com 11.4.2 Darwin Kernel Version 11.4.2: Thu 
Aug 23 16:25:48 PDT 2012; root:xnu-1699.32.7~1/RELEASE_X86_64 x86_64




On 10/17/12 10:23 AM, Scott Sides wrote:

[swsides@volt lists]$ gfortran --version ; gcc --version ; uname -a
GNU Fortran (GCC) 4.4.4 20100726 (Red Hat 4.4.4-13)
Copyright (C) 2010 Free Software Foundation, Inc.

GNU Fortran comes with NO WARRANTY, to the extent permitted by law.
You may redistribute copies of GNU Fortran
under the terms of the GNU General Public License.
For more information about these matters, see the file named COPYING

gcc (GCC) 4.4.4 20100726 (Red Hat 4.4.4-13)
Copyright (C) 2010 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. 
There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A 
PARTICULAR PURPOSE.


Linux volt.txcorp.com 2.6.32-71.29.1.el6.x86_64 #1 SMP Mon Jun 27 
19:49:27 BST 2011 x86_64 x86_64 x86_64 GNU/Linux


On 10/17/12 10:14 AM, Raymond Lu wrote:

Scot,

What compiler do you use?  What version is it? Thanks.

Ray

On Oct 17, 2012, at 10:03 AM, Scott Sides wrote:


I just tried

Re: [Hdf-forum] HDF5 1.8.9 release candidate is available for testing

2012-10-17 Thread Albert Cheng

On 10/16/12 12:17 PM, Sean McBride wrote:

On Tue, 16 Oct 2012 11:38:15 -0500, Albert Cheng said:


Please note that while the release notes contained in the pre-release
are reflective of the changes and additions present in this release, the
'platforms tested' and 'tested configurations' sections have yet to be
updated for this version of HDF5.

For the Macs listed, might I suggest adding the Xcode version number?  You can get it 
from Xcode's 'About Xcode' menu.  I'd put something like Xcode 4.5.1 (4G1004).

This entry is suspicious:

  Mac OS X Mountain Lion 10.8.1    GCC 4.2.1 (gcc)
  (owl)                            GNU Fortran (GCC) 4.6.1 (gfortran)
                                   GCC 4.2.1 (g++)
                                   Apple clang version 4.0 (cc)
                                   Apple clang version 4.0 (c++)

Are you sure you're using gcc 4.2.1 there?  The current Xcodes on Mountain Lion 
only come with clang, and not gcc.

Cheers,


Hi Sean,

I maintain the Mountain Lion entry.  I started using Mountain Lion just
over a month ago.  Things
are still a bit uncertain for me, especially in the compiler area, as
Apple seems to have made some
changes in Mountain Lion.
For example,
in Snow Leopard, both cc and gcc are links to gcc-4.2.
In Lion, cc and gcc are both links to llvm-gcc-4.2.
Then things changed in Mountain Lion. Here is what I have on my Mac
Mountain Lion system.


% sw_vers
ProductName:Mac OS X
ProductVersion:10.8.2
BuildVersion:12C60

% uname -a
Darwin owl 12.2.0 Darwin Kernel Version 12.2.0: Sat Aug 25 00:48:52 PDT 
2012; root:xnu-2050.18.24~1/RELEASE_X86_64 x86_64


% ll /usr/bin/gcc
lrwxr-xr-x  1 root  wheel  12 Aug 24 13:53 /usr/bin/gcc -> llvm-gcc-4.2

% gcc --version
i686-apple-darwin11-llvm-gcc-4.2 (GCC) 4.2.1 (Based on Apple Inc. build 
5658) (LLVM build 2336.11.00)

Copyright (C) 2007 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

% ll /usr/bin/cc
lrwxr-xr-x  1 root  wheel  5 Aug 24 13:53 /usr/bin/cc -> clang

% cc --version
Apple clang version 4.0 (tags/Apple/clang-421.0.60) (based on LLVM 3.1svn)
Target: x86_64-apple-darwin12.2.0
Thread model: posix

About adding the Xcode version number.  Here is what shows in my 
Mountain Lion.

% xcodebuild -version
Xcode 4.5.1
Build version 4G1004

Would Mac users prefer us reporting:
1. Only the Xcode version and no cc or gcc version?
2. Both the Xcode version and the cc or gcc version?




Re: [Hdf-forum] 1.8.10 pre-release problem

2012-10-17 Thread Albert Cheng

Hi Scot,

I would like to ask for some more information about h5dump seg fault
in your Mountain Lion system.

Did you also use gcc compilers to build hdf5 v1.8.10 pre-release in your
mountain lion system?

What configure option(s) did you use to build the library and tools?

How does h5dump seg fault on you?
Did it seg fault during make check?
Or did it seg fault when you used it to access your own HDF5 data files?

If the former (during make check), can you please send me whatever
output it shows when it seg faulted.

If the latter, any chance I may get a copy of your HDF5 files and
the command you used?

Thanks.

-Albert

On 10/17/12 11:28 AM, Scott Sides wrote:

Dave, thanks for pointing that out. I had two windows open.
Here are the results for my Mac Lion.

polymer:~ swsides$ gfortran --version ;gcc --version ; uname -a
GNU Fortran (GCC) 4.6.2 20111019 (prerelease)
Copyright (C) 2011 Free Software Foundation, Inc.

GNU Fortran comes with NO WARRANTY, to the extent permitted by law.
You may redistribute copies of GNU Fortran
under the terms of the GNU General Public License.
For more information about these matters, see the file named COPYING

i686-apple-darwin11-llvm-gcc-4.2 (GCC) 4.2.1 (Based on Apple Inc. 
build 5658) (LLVM build 2336.1.00)

Copyright (C) 2007 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR 
PURPOSE.


Darwin polymer.txcorp.com 11.4.2 Darwin Kernel Version 11.4.2: Thu Aug 
23 16:25:48 PDT 2012; root:xnu-1699.32.7~1/RELEASE_X86_64 x86_64




On 10/17/12 10:23 AM, Scott Sides wrote:

[swsides@volt lists]$ gfortran --version ; gcc --version ; uname -a
GNU Fortran (GCC) 4.4.4 20100726 (Red Hat 4.4.4-13)
Copyright (C) 2010 Free Software Foundation, Inc.

GNU Fortran comes with NO WARRANTY, to the extent permitted by law.
You may redistribute copies of GNU Fortran
under the terms of the GNU General Public License.
For more information about these matters, see the file named COPYING

gcc (GCC) 4.4.4 20100726 (Red Hat 4.4.4-13)
Copyright (C) 2010 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There 
is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR 
PURPOSE.


Linux volt.txcorp.com 2.6.32-71.29.1.el6.x86_64 #1 SMP Mon Jun 27 
19:49:27 BST 2011 x86_64 x86_64 x86_64 GNU/Linux


On 10/17/12 10:14 AM, Raymond Lu wrote:

Scot,

What compiler do you use?  What version is it?  Thanks.

Ray

On Oct 17, 2012, at 10:03 AM, Scott Sides wrote:


I just tried the new 1.8.10 pre-release on Mac Lion
and the binaries do not work. In particular, the h5dump binary
seg faults.  Actually, this problem seems to have been persistent across
Lion, Mountain Lion, and Snow Leopard since 1.8.7. We've had
to revert to 1.8.7 on Mac for some time. Is this problem known
and is a fix being considered?  Thanks,

--
*
Dr. Scott W. Sides
swsi...@txcorp.com

Tech-X Corporation
5621 Arapahoe Avenue, Suite A
Boulder, CO 80303
office:  (720) 974-1849
fax   :  (303) 448-7756
*
















[Hdf-forum] HDF5 1.8.9 release candidate is available for testing

2012-10-16 Thread Albert Cheng

Hello everyone,

A pre-release candidate version of HDF5 1.8.10 is available for testing 
and can be downloaded at the following link:


http://www.hdfgroup.uiuc.edu/ftp/pub/outgoing/hdf5/hdf5-1.8.10-pre1/

If you have some time to test this pre-release, we would greatly 
appreciate it. We try to test on a wide variety of platforms and 
environments but are unable to test everywhere so feedback from the user 
community is always welcome.


Please note that while the release notes contained in the pre-release 
are reflective of the changes and additions present in this release, the 
'platforms tested' and 'tested configurations' sections have yet to be 
updated for this version of HDF5.


We plan to release HDF5 1.8.10 in mid-November barring the discovery of 
any critical issues.


Thank you!

The HDF Group




Re: [Hdf-forum] h5edit test/Makefile.am shows 'check_PROGRAMS' recursively defined

2011-09-15 Thread Albert Cheng

On 9/13/2011 11:02 AM, Rhys Ulerich wrote:

Hi Albert,

   

h5edit.1.0.0.tar.gz (md5sum 065013585cc483fa7cd945020e3d967f) shows
the fatal error
   test/Makefile.am:22: variable `check_PROGRAMS' recursively defined
after configure when 'make' is invoked.  Modifying test/Makefile.am to read
   check_PROGRAMS= $(EXTRA_TEST)
fixes that issue.  'make check' seems to check nothing.
   
   

Could you tell me more about the machine you test h5edit on?
For example, uname -a output, compilers used.  Thanks.
 

Sure.  Thanks for looking into it.

$ uname -a
Linux setun 2.6.32-31-generic #60-Ubuntu SMP Thu Mar 17 22:15:39 UTC
2011 x86_64 GNU/Linux

$ gcc --version
gcc (GCC) 4.5.1

$ automake --version
automake (GNU automake) 1.11.1

$ autoconf --version
autoconf (GNU Autoconf) 2.68

Also, I used CC=h5pcc (from a parallel build of HDF5).  FWIW, this
problem doesn't seem build-system version-dependent to me.  I suspect
the problem is a mixture of stale development artifacts lingering in
the source tree.

- Rhys
   

Hi Rhys,

I discovered that I forgot to include an autoconf-related file in the
release tarball, which causes some problems during the make stage.  But
the error should not be fatal.  Are you able to complete the following
build and installation steps?

1) % ./configure # assuming h5cc is in your $PATH
2) % gmake # some non-fatal error messages
3) % gmake check # to run tests
4) % gmake prefix=install-path install  # install h5edit in install-path/bin

If it does not work for you, could you please email me
(ach...@hdfgroup.org) the output
of the above commands (or whatever commands you used)?
Thanks.

-Albert
h5edit developer
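For context, the "recursively defined" failure Rhys reported comes from
check_PROGRAMS referencing itself in test/Makefile.am; GNU make raises an
analogous error for any self-referential recursively-expanded variable.
A hypothetical reproduction (not the actual h5edit makefile):

```shell
# A recursively-expanded variable that references itself is an error in
# GNU make; automake flags the same pattern in Makefile.am.
cat > recursive.mk <<'EOF'
check_PROGRAMS = demo $(EXTRA_TEST) $(check_PROGRAMS)
$(info $(check_PROGRAMS))
all: ;
EOF
make -f recursive.mk 2>&1 || true   # reports that the variable references itself
```

Rhys's fix, check_PROGRAMS = $(EXTRA_TEST), simply removes the
self-reference.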





Re: [Hdf-forum] redhat and hdf5

2011-07-26 Thread Albert Cheng

Hi John,

Redhat (equivalent to CentOS, sort of) is one of the HDF5 development 
platforms.

You should be quite okay running HDF5 on Red Hat on x86 hardware.

Parallel I/O is a different issue.  It depends what parallel file 
systems you use.

Two commercial systems, GPFS (by IBM) and Lustre (now owned by Oracle),
are possible options.  Their performance varies and is highly
dependent on the characteristics of the I/O pattern of your applications.

-Albert Cheng

On 7/26/2011 12:28 PM, John Knutson wrote:
I've recently been informed that we're going to be changing operating 
systems and hardware architecture to redhat on an x86 blade system (or 
something of the sort).  I'm starting to investigate what all the 
implications might be and am digging through the archives for items of 
interest, but in the mean-time I was wondering if anyone out there was 
using HDF5 on RedHat (no, I don't know which flavor, either, we 
haven't been given much information), and what complications or 
advantages they ran across in the process.


One thing of particular interest to me is the parallel I/O - I'd be 
interested in hearing comments about that WRT RH.  I'm not entirely 
sure it's relevant to our system but it's worth researching anyway.









Re: [Hdf-forum] unresolved external symbol _H5PGET_DXPL_MPIO_C

2011-06-30 Thread Albert Cheng

Hi Jerome,

You are right.  Thank you for reporting it.
It will be fixed in the next release.

-Albert Cheng
On 6/29/2011 6:58 PM, Soumagne Jerome wrote:


Hi,

As I needed hdf5 parallel/fortran support on windows (I needed it for 
our parallel driver), I got a strange error at compile time:


Error  53   error LNK2019: unresolved external symbol 
_H5PGET_DXPL_MPIO_C referenced in function 
_H5FDMPIO_mp_H5PGET_DXPL_MPIO_F   H5FDmpioff.obj


Error  54   fatal error LNK1120: 1 unresolved 
externals
C:\Projects\hdf5-vfd\bin\RelWithDebInfo\hdf5_fortrandll.dll


It appears that somehow a typo was inserted in the file
fortran/src/H5f90proto.h, line 974:

H5_FCDLL int_f nh5pget_dxpl_mpio_rc(hid_t_f *prp_id, int_f* data_xfer_mode);

which should be

H5_FCDLL int_f nh5pget_dxpl_mpio_c(hid_t_f *prp_id, int_f* data_xfer_mode);


This apparently fixed my build. Just had a quick look, it's still 
present in the svn trunk.


Jerome

--

Jérôme Soumagne

Scientific Computing Research Group

CSCS, Swiss National Supercomputing Centre

Galleria 2, Via Cantonale  | Tel: +41 (0)91 610 8258

CH-6928 Manno, Switzerland | Fax: +41 (0)91 610 8282


   




Re: [Hdf-forum] HDF with MPI and CPP

2010-07-16 Thread Albert Cheng
Hi Phil,

It depends on what you mean by "with MPI".  PHDF5 uses MPI-IO to support
parallel I/O to HDF5 files, and it has C and Fortran APIs
but no C++ API.  Therefore, if you want to do parallel
access to an HDF5 file via C++ calls, it won't work.

If you use MPI for the compute and inter-processes
communication but limit each HDF5 file access to a single
process which in turn gathers and scatters HDF5 file data
with other processes, that would still work.

If you want to use parallel access to the HDF5 files, you
would have to code your C++ application to call the PHDF5
C APIs.

Hope this answers your question.

-Albert

At 04:46 AM 7/16/2010, Kraus Philipp wrote:
Hi,

I have compiled the current HDF5 library with --enable-cxx and use the
C++ interface. Now I would like to use HDF5 with MPI. Can I use C++
and MPI together?

Thanks for the information

Phil
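Albert's last option above, calling the PHDF5 C API directly from a C++
program, can be sketched as follows.  This is a minimal sketch assuming an
MPI-enabled HDF5 build (e.g. compiled with h5pcc or mpicc and linked against
-lhdf5); the file name is hypothetical:

```c
/* Sketch: parallel file creation via the PHDF5 C API.  The same calls are
 * usable from a C++ translation unit, since there is no C++ parallel API. */
#include <mpi.h>
#include <hdf5.h>

int main(int argc, char *argv[])
{
    MPI_Init(&argc, &argv);

    /* File-access property list that routes I/O through MPI-IO */
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_fapl_mpio(fapl, MPI_COMM_WORLD, MPI_INFO_NULL);

    /* All ranks create/open the same file collectively */
    hid_t file = H5Fcreate("parallel.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl);

    H5Fclose(file);
    H5Pclose(fapl);
    MPI_Finalize();
    return 0;
}
```

Dataset reads and writes then go through the usual C calls, with a transfer
property list selecting independent or collective MPI-IO.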
