On Mar 3, 2010, at 11:49 AM, Mark Miller wrote:
> I think the comment in h5test.c is misleading.
>
> The documentation for H5Pset_fapl_core says the inc argument is the
> number of BYTES, not MEGAbytes.
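Since the increment really is in bytes, a corrected core-VFD open might look like the following sketch (untested, assuming HDF5 1.8; the 64 MiB increment and the function name are arbitrary illustrative choices, not from the thread):

```c
#include <hdf5.h>

/* Open a file read-only through the CORE vfd with a byte-sized increment. */
hid_t open_with_core_vfd(const char *filename)
{
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    if (fapl < 0)
        return -1;
    /* second argument is the allocation increment in BYTES: 64 MiB here */
    if (H5Pset_fapl_core(fapl, (size_t)64 << 20, 0 /* no backing store */) < 0) {
        H5Pclose(fapl);
        return -1;
    }
    hid_t file = H5Fopen(filename, H5F_ACC_RDONLY, fapl);
    H5Pclose(fapl); /* the file keeps its own copy of the property list */
    return file;
}
```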
>
> I'd bet the initial malloc is failing for reading your whole file.
> Sorry. Didn't expect that. But I was thinking in terms of 64 bits, and
> 30 gigs is 3 bits too many for a 32-bit number. So, if you are running a
> truly 64-bit build, then I am completely puzzled. Otherwise, if you are
> running 32-bit, then I think the problem has presented itself: we can't
> address a 30 gig buffer in a 32-bit application.
>
> I'd suggest looking at some of the other properties such as
>
> H5Pset_small_data_block_size
> H5Pset_buffer (used during read/xform only)
> H5Pset_cache (relevant only for chunked datasets)
> H5Pset_meta_block_size( hid_t fapl_id, hsize_t size )
>
> and see if setting any/all of these to large values (hundreds of
> megabytes) has any positive impact.
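A sketch of bumping those properties to large values (untested; the 100 MB sizes are arbitrary, and note that H5Pset_buffer is set on a dataset transfer property list, not on the fapl):

```c
#include <hdf5.h>

/* File access property list with the suggested knobs turned up.
 * All sizes are in bytes. */
hid_t make_fapl_with_big_buffers(void)
{
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    if (fapl < 0)
        return -1;
    H5Pset_small_data_block_size(fapl, (hsize_t)100 << 20);
    H5Pset_meta_block_size(fapl, (hsize_t)100 << 20);
    /* chunk cache: mdc_nelmts (ignored in 1.8), nslots, nbytes, w0 */
    H5Pset_cache(fapl, 0, 521, (size_t)100 << 20, 0.75);
    return fapl;
}

/* The type-conversion buffer goes on a dataset transfer property list,
 * passed to H5Dread/H5Dwrite. */
hid_t make_dxpl_with_big_buffer(void)
{
    hid_t dxpl = H5Pcreate(H5P_DATASET_XFER);
    if (dxpl >= 0)
        H5Pset_buffer(dxpl, (size_t)100 << 20, NULL, NULL);
    return dxpl;
}
```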
Those probably won't have any effect; I'm guessing that you are correct
that the machine ran out of memory (or Thorben has a quota on the amount of
RAM his process can use).
Quincey
>
> Mark
>
> On Wed, 2010-03-03 at 09:30, Thorben Kröger wrote:
>> On Wednesday 03 March 2010 18:17:31 Mark Miller wrote:
>>> I think the '5' in the H5Pset_fapl_core call means to allocate only 5 more
>>> bytes each time it needs to allocate. I think that is too small, and
>>> maybe a problem since it's an odd number and not a power of 2.
>>>
>>> Although I would not necessarily expect that to have failed in the way
>>> you are seeing either.
>>>
>>> Try '5<<20' instead of '5'
>>
>> I found this in h5test.c, function h5_fileaccess()
>>
>> /* In-core temporary file with 1MB increment */
>> if (H5Pset_fapl_core(fapl, (size_t)1, TRUE)<0) return -1;
>>
>> Anyway, using a number like 5<<20 did not make a difference, it still
>> aborts...
>>
>> Cheers,
>> Thorben
>>
>>>
>>> Other than that, I don't see anything wrong with what you are doing,
>>> except that maybe allocating the 30 gigabyte buffer in HDF5 to read your
>>> file into core is failing due to too large a request. I guess that is
>>> possible. To see if that is happening, and if your HDF5 installation was
>>> compiled with tracing/debugging features turned on, try setting
>>> HDF5_DEBUG=all in the environment where you are running your
>>> application.
>>>
>>> Mark
>>>
>>> On Wed, 2010-03-03 at 09:07, Thorben Kröger wrote:
>>>>>> On Wednesday 03 March 2010 16:47:20 Mark Miller wrote:
>>>>>>> Hello Thorben,
>>>>>>>
>>>>>>> However, did you in fact try using HDF5-1.8.4
>>>>>>> and reading the file with the CORE vfd (H5Pset_fapl_core)
>>>>
>>>> instead of opening my file like this:
>>>>
>>>> hdf5geometryFile_ = H5Fopen(filename.c_str(), H5F_ACC_RDONLY,
>>>> H5P_DEFAULT);
>>>>
>>>> I am now doing this:
>>>>
>>>> hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
>>>> if(fapl < 0) {throw std::runtime_error("fapl < 0");}
>>>> hid_t ret = H5Pset_fapl_core(fapl, 5/*MB?*/, 0 /*no backing store*/);
>>>> if(ret < 0) {throw std::runtime_error("could not create fapl core");}
>>>> hdf5geometryFile_ = H5Fopen(filename.c_str(), H5F_ACC_RDONLY, fapl);
>>>>
>>>> But using the second way, it cannot open any of my datasets:
>>>>
>>>> HDF5-DIAG: Error detected in HDF5 (1.8.4) thread 0:
>>>>   #000: H5D.c line 321 in H5Dopen2(): not found
>>>>     major: Dataset
>>>>     minor: Object not found
>>>>   #001: H5Gloc.c line 468 in H5G_loc_find(): can't find object
>>>>     major: Symbol table
>>>>     minor: Object not found
>>>>   #002: H5Gtraverse.c line 877 in H5G_traverse(): internal path traversal failed
>>>>     major: Symbol table
>>>>     minor: Object not found
>>>>   #003: H5Gtraverse.c line 665 in H5G_traverse_real(): can't look up component
>>>>     major: Symbol table
>>>>     minor: Object not found
>>>>   #004: H5Gobj.c line 1047 in H5G_obj_lookup(): can't locate object
>>>>     major: Symbol table
>>>>     minor: Object not found
>>>>   #005: H5Gstab.c line 860 in H5G_stab_lookup(): not found
>>>>     major: Symbol table
>>>>     minor: Object not found
>>>>   #006: H5B.c line 360 in H5B_find(): can't lookup key in leaf node
>>>>     major: B-Tree node
>>>>     minor: Object not found
>>>>   #007: H5Gnode.c line 522 in H5G_node_found(): unable to protect symbol table node
>>>>     major: Symbol table
>>>>     minor: Unable to load metadata into cache
>>>>   #008: H5AC.c line 1831 in H5AC_protect(): H5C_protect() failed.
>>>>     major: Object cache
>>>>     minor: Unable to protect metadata
>>>>   #009: H5C.c line 6160 in H5C_protect(): can't load entry
>>>>     major: Object cache
>>>>     minor: Unable to load metadata into cache
>>>>   #010: H5C.c line 10990 in H5C_load_entry(): unable to load entry
>>>>     major: Object cache
>>>>     minor: Unable to load metadata into cache
>>>>   #011: H5Gcache.c line 176 in H5G_node_load(): bad symbol table node signature
>>>>     major: Symbol table
>>>>     minor: Unable to load metadata into cache
>>>>
>>>> I'm at a loss now, what am I doing wrong here?
>>>>
>>>> Cheers,
>>>> Thorben
>>>>
>>>> _______________________________________________
>>>> Hdf-forum is for HDF software users discussion.
>>>> [email protected]
>>>> http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org
>>
> --
> Mark C. Miller, Lawrence Livermore National Laboratory
> ================!!LLNL BUSINESS ONLY!!================
> [email protected] urgent: [email protected]
> T:8-6 (925)-423-5901 M/W/Th:7-12,2-7 (530)-753-851
>
>