Hi Nitya,
Can you also give more information about the machine on which you ran the
application that got the failures? E.g.:
1. OS version ("uname -a" usually shows it).
1.1. For Mac, try "sw_vers".
2. Did you build the HDF5 library from source?
2.1. If yes, please send a copy of the output of configure.
2.2. If the output of configure is not available, please send a copy of
libhdf5.settings.
You can usually find "libhdf5.settings" in the same directory as the HDF5
library itself, libhdf5.a or libhdf5.so.
This will help us try to reproduce your problem on the appropriate platform.
Thanks.
-Albert Cheng
THG Staff
On Jul 2, 2013, at 11:20 AM, Dana Robinson <[email protected]> wrote:
> Hi Nitya,
>
> I'll take a look at this. It's a holiday weekend here, so I probably won't
> get to it until next week.
>
> Dana
>
>
> On Mon, Jul 1, 2013 at 1:37 PM, Nitya Hariharan <[email protected]>
> wrote:
> Hi,
>
> I am having some trouble reading in an HDF5 file that is ~5 GB. I
> initially had some problems even writing out a file this large, and
> looked at some postings on the HDF5 forum related to this.
>
> http://hdf-forum.184993.n3.nabble.com/Trouble-writing-2GB-dataset-from-single-task-with-HDF5-1-8-10-td4025821.html
>
> I tried using the serial version of the HDF5 library, v1.8.9, and was
> able to write out a file of size ~5 GB. However, when I try to read it
> back in, I get the following error.
>
> ----------------
> #000: /home/hdftest/snapshots-bin-hdf5_1_8_11/current/src/H5Dio.c line
> 182 in H5Dread(): can't read data
> major: Dataset
> minor: Read failed
> #001: /home/hdftest/snapshots-bin-hdf5_1_8_11/current/src/H5Dio.c line
> 550 in H5D__read(): can't read data
> major: Dataset
> minor: Read failed
> #002: /home/hdftest/snapshots-bin-hdf5_1_8_11/current/src/H5Dcontig.c
> line 543 in H5D__contig_read(): contiguous read failed
> major: Dataset
> minor: Read failed
> #003: /home/hdftest/snapshots-bin-hdf5_1_8_11/current/src/H5Dselect.c
> line 278 in H5D__select_read(): read error
> major: Dataspace
> minor: Read failed
> #004: /home/hdftest/snapshots-bin-hdf5_1_8_11/current/src/H5Dselect.c
> line 213 in H5D__select_io(): read error
> major: Dataspace
> minor: Read failed
> #005: /home/hdftest/snapshots-bin-hdf5_1_8_11/current/src/H5Dcontig.c
> line 873 in H5D__contig_readvv(): can't perform vectorized sieve buffer read
> major: Dataset
> minor: Can't operate on object
> #006: /home/hdftest/snapshots-bin-hdf5_1_8_11/current/src/H5V.c line
> 1457 in H5V_opvv(): can't perform operation
> major: Internal error (too specific to document in detail)
> minor: Can't operate on object
> #007: /home/hdftest/snapshots-bin-hdf5_1_8_11/current/src/H5Dcontig.c
> line 674 in H5D__contig_readvv_sieve_cb(): block read failed
> major: Dataset
> minor: Read failed
> #008: /home/hdftest/snapshots-bin-hdf5_1_8_11/current/src/H5Fio.c line
> 113 in H5F_block_read(): read through metadata accumulator failed
> major: Low-level I/O
> minor: Read failed
> #009: /home/hdftest/snapshots-bin-hdf5_1_8_11/current/src/H5Faccum.c
> line 258 in H5F_accum_read(): driver read request failed
> major: Low-level I/O
> minor: Read failed
> #010: /home/hdftest/snapshots-bin-hdf5_1_8_11/current/src/H5FDint.c
> line 142 in H5FD_read(): driver read request failed
> major: Virtual File Layer
> minor: Read failed
> #011: /home/hdftest/snapshots-bin-hdf5_1_8_11/current/src/H5FDsec2.c
> line 725 in H5FD_sec2_read(): file read failed: time = Mon Jul 1
> 20:21:57 2013
> , filename = '/tmp/file.hdf5', file descriptor = 5, errno = 14, error
> message = 'Bad address', buf = 0x2ae9619ea010, total read size =
> 4677466176, bytes this sub-read = 4677466176, bytes actually read =
> 18446744073709551615, offset = 744468544
> -----------------
>
> I looked at the forum again and saw this posting, which mentioned a
> bug fix for POSIX I/O issues.
>
> http://mail.lists.hdfgroup.org/pipermail/hdf-forum_lists.hdfgroup.org/2012-December/006348.html
>
> I was using v1.8.9, but tried the latest v1.8.11 to rule out any
> issues with the HDF5 version I was using. However, I still get the
> above error.
>
> Could someone please provide some feedback on why this is happening? If
> I am able to write out such a large file, I should be able to read it
> back in as well. Of course, small files work perfectly fine in my
> application.
>
> Since the library I am using, which in turn calls the HDF5 routines,
> needs the v1.6 interface of the HDF5 library, I use the H5_USE_16_API
> flag while compiling. Would this in any way cause the problem?
>
> Thanks in advance.
>
> --
> Regards
>
> Nitya
>
>
> _______________________________________________
> Hdf-forum is for HDF software users discussion.
> [email protected]
> http://mail.lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
>