Dear all,
I am facing a problem when trying to delete an HDF5 attribute stored in dense
storage in MATLAB R2014b. The test code below works for a small array
(10x10) but fails with the HDF5 library errors shown further down for a
larger array (100x100).

close all;
clear all;
clc;

fileName = 'H5A_delete.h5';
groupName = 'test';
attrName = 'testAttribute';

data = 1.23*ones(100, 100);

if exist(fileName, 'file')
    delete(fileName);
end

faplID = H5P.create('H5P_FILE_ACCESS');
H5P.set_libver_bounds(faplID, 'H5F_LIBVER_18', 'H5F_LIBVER_LATEST');
fileID = H5F.create(fileName, 'H5F_ACC_TRUNC', 'H5P_DEFAULT', faplID);
H5P.close(faplID);
groupID = H5G.create(fileID, groupName, 'H5P_DEFAULT', 'H5P_DEFAULT', 'H5P_DEFAULT');
spaceID = H5S.create_simple(length(size(data)), fliplr(size(data)), []);
% attribute creation
acplID = H5P.create('H5P_ATTRIBUTE_CREATE');
attrID = H5A.create(groupID, attrName, 'H5T_IEEE_F64LE', spaceID, acplID);
H5P.close(acplID);
% writing attribute value
H5A.write(attrID, 'H5T_NATIVE_DOUBLE', data);
H5A.close(attrID);
H5S.close(spaceID);
H5G.close(groupID);
H5F.close(fileID);

clear acplID attrID data faplID fileID groupID spaceID;

% fileattrib(fileName, '+w');
fileID = H5F.open(fileName, 'H5F_ACC_RDWR', 'H5P_DEFAULT');
groupID = H5G.open(fileID, groupName);
H5A.delete(groupID, attrName);
H5G.close(groupID);
H5F.close(fileID);
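
For completeness, a minimal read-back check (just a sketch using the same
low-level H5A calls as above, nothing setup-specific) can be run between the
write and the delete to confirm that the attribute itself was written:

% Sketch: reopen the file read-only and read the attribute back.
% Assumes the script above has already run and created H5A_delete.h5.
fileID   = H5F.open(fileName, 'H5F_ACC_RDONLY', 'H5P_DEFAULT');
groupID  = H5G.open(fileID, groupName);
attrID   = H5A.open(groupID, attrName);
readBack = H5A.read(attrID, 'H5T_NATIVE_DOUBLE');
H5A.close(attrID);
H5G.close(groupID);
H5F.close(fileID);
max(abs(readBack(:) - 1.23))   % should be ~0 if the write succeeded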

Error messages:

Error using hdf5lib2
The HDF5 library encountered an error and produced the following stack
trace information:

    H5FD_windows_read          addr overflow
    H5FD_read                  driver read request failed
    H5F_accum_read             driver read request failed
    H5F_block_read             read through metadata accumulator failed
    H5HF_huge_op_real          can't read 'huge' object's data from the file
    H5HF_huge_op               unable to operate on heap object
    H5HF_op                    can't operate on 'huge' object from fractal heap
    H5A_dense_delete_bt2_cb    heap op callback failed
    H5B2_delete_node           iterator function failed
    H5B2_hdr_delete            unable to delete B-tree nodes
    H5SL_insert_common         can't insert duplicate key
    H5SL_insert                can't create new skip list node
    H5FS_sect_link_rest        can't insert free space node into merging skip list
    H5FS_sect_link             can't add section to non-size tracking data structures
    H5FS_sect_add              can't insert free space section into skip list
    H5MF_xfree                 can't add section to file free space
    H5HF_cache_hdr_dest        unable to free fractal heap header
    H5HF_cache_hdr_clear       unable to destroy fractal heap header
    H5C_flush_single_entry     can't clear entry
    H5C_unprotect              Can't flush.
    H5AC_unprotect             H5C_unprotect() failed.
    H5HF_hdr_delete            unable to release fractal heap header
    H5HF_delete                unable to delete fractal heap
    H5C_unprotect              Entry already unprotected??
    H5AC_unprotect             H5C_unprotect() failed.
    H5HF_delete                unable to release fractal heap header
    H5A_dense_delete           unable to delete fractal heap
    H5O_ainfo_delete           unable to free dense attribute storage
    H5O_delete_mesg            unable to delete file space for object header message
    H5O_release_mesg           unable to delete file space for object header message
    H5O_msg_remove_cb          unable to release message
    H5O_msg_iterate_real       iterator function failed

Error in H5A.delete (line 21)
H5ML.hdf5lib2('H5Adelete', loc_id, name);

Error in h5a_delete (line 37)
H5A.delete(groupID, attrName);
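
In case it matters, one fallback I am considering (only a sketch, and I have
not yet checked whether it runs into the same problem) is to store the array
as a dataset instead of an attribute and remove it later with H5L.delete,
which should bypass the dense attribute storage entirely. The dataset name
below is just for illustration:

% Sketch of the fallback: write the array as a dataset, then unlink it.
dsetName = 'testData';  % illustrative name only
fileID  = H5F.open(fileName, 'H5F_ACC_RDWR', 'H5P_DEFAULT');
groupID = H5G.open(fileID, groupName);
spaceID = H5S.create_simple(2, fliplr([100 100]), []);
dsetID  = H5D.create(groupID, dsetName, 'H5T_IEEE_F64LE', spaceID, 'H5P_DEFAULT');
H5D.write(dsetID, 'H5T_NATIVE_DOUBLE', 'H5S_ALL', 'H5S_ALL', 'H5P_DEFAULT', 1.23*ones(100, 100));
H5D.close(dsetID);
H5S.close(spaceID);
% ... later, removing the dataset is just an unlink:
H5L.delete(groupID, dsetName, 'H5P_DEFAULT');
H5G.close(groupID);
H5F.close(fileID);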

Thank you for any help.

Best regards,
 Vladimir Sedenka
