Hi Carl,

What file system are you testing on? Is it a network file system like NFS, AFS, or SMB?
That test was added in HDF5 1.10.0 and tests single-writer/multiple-readers (SWMR) functionality. Since that is a new feature in 1.10.0, the test is not part of the HDF5 1.8 release.

Dana Robinson
Software Engineer
The HDF Group

From: Hdf-forum [mailto:[email protected]] On Behalf Of Carl Ponder
Sent: Saturday, April 23, 2016 7:45 PM
To: HDF Users Discussion List <[email protected]>
Cc: Cyril Zeller <[email protected]>
Subject: [Hdf-forum] HDF5 release 1.10.0 giving "use_append_chunk" failures

I'm running "make check" with the 1.10.0 release of HDF5 and seeing these failures:

./use_append_chunk                            *FAILED*
./use_append_chunk -z 256                     *FAILED*
./use_append_chunk -z 256 -y 5                *FAILED*
./use_append_mchunks -z 256                   *FAILED*

I listed the details below for one case; the others are similar. The errors happen with the GCC, Intel & PGI compilers, using MVAPICH2 or OpenMPI, so if it's an issue with my software stack, it would have to be deeper than these. The other cases passed:

./use_append_chunk -f /tmp/datatfile.1160     PASSED
./use_append_chunk -l w                       PASSED
./use_append_chunk -l r                       PASSED
./use_append_mchunks                          PASSED
./use_append_mchunks -f /tmp/datatfile.1160   PASSED
./use_append_mchunks -l w                     PASSED
./use_append_mchunks -l r                     PASSED
./use_append_mchunks -z 256 -y 5              PASSED

I don't see these tests being run with HDF5 version 1.8.16, though, so is it possible that they are not formulated correctly?

Thanks,
Carl Ponder

________________________________

./use_append_chunk -z 256   *FAILED*

===Parameters used:===
chunk dims=(1, 256, 256)
dataset max dims=(18446744073709551615, 256, 256)
number of planes to write=256
using SWMR mode=yes(1)
data filename=use_append_chunk.h5
launch part=Reader/Writer
number of iterations=1 (not used yet)
===Parameters shown===
Creating skeleton data file for test...
File created.
1559: launch reader process
===Parameters used:===
chunk dims=(1, 256, 256)
dataset max dims=(18446744073709551615, 256, 256)
number of planes to write=256
using SWMR mode=yes(1)
data filename=use_append_chunk.h5
launch part=Reader/Writer
number of iterations=1 (not used yet)
===Parameters shown===
Creating skeleton data file for test...
File created.
1545: continue as the writer process
dataset rank 3, dimensions 0 x 256 x 256
1545: child process exited with non-zero code (1)
Error(s) encountered
HDF5-DIAG: Error detected in HDF5 (1.10.0) thread 0:
  #000: H5F.c line 579 in H5Fopen(): unable to open file
    major: File accessibilty
    minor: Unable to open file
  #001: H5Fint.c line 1208 in H5F_open(): unable to read superblock
    major: File accessibilty
    minor: Read failed
  #002: H5Fsuper.c line 443 in H5F__super_read(): truncated file: eof = 526815, sblock->base_addr = 0, stored_eof = 33559007
    major: File accessibilty
    minor: File has been truncated
H5Fopen failed
read_uc_file encountered error
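________________________________

For context, the H5Fopen() call that fails in the log above is the SWMR reader's open of the file the writer process is appending to. A minimal sketch of that reader side is shown below; this is an illustration only, not the actual test source. The file name use_append_chunk.h5 comes from the log, but the dataset name "/dataset" and the single refresh/poll are assumptions made for the example.

/* Minimal sketch of an HDF5 1.10 SWMR reader (illustration only;
 * the real use_append_chunk test code is more involved). */
#include <stdio.h>
#include "hdf5.h"

int main(void)
{
    hid_t   fid, did, sid;
    hsize_t dims[3];

    /* Open the file for concurrent SWMR reading.  This is the call
     * that reports "truncated file" in the log above when the reader
     * sees a stale or partially propagated superblock. */
    fid = H5Fopen("use_append_chunk.h5",
                  H5F_ACC_RDONLY | H5F_ACC_SWMR_READ, H5P_DEFAULT);
    if (fid < 0) {
        fprintf(stderr, "H5Fopen failed\n");
        return 1;
    }

    /* "/dataset" is an assumed name for illustration. */
    did = H5Dopen2(fid, "/dataset", H5P_DEFAULT);

    /* Refresh picks up extensions made by the concurrent writer,
     * then re-query the current extent. */
    H5Drefresh(did);
    sid = H5Dget_space(did);
    H5Sget_simple_extent_dims(sid, dims, NULL);
    printf("current extent: %llu x %llu x %llu\n",
           (unsigned long long)dims[0],
           (unsigned long long)dims[1],
           (unsigned long long)dims[2]);

    H5Sclose(sid);
    H5Dclose(did);
    H5Fclose(fid);
    return 0;
}

The writer side opens the same file with write access and calls H5Fstart_swmr_write() before appending planes. SWMR relies on the reader seeing the writer's flushed metadata in order, which is why the file-system question above matters: on a network file system the reader can observe stale file contents, which would be consistent with the "truncated file" error in the log.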
