Not that this is very helpful, but I thought the requirement that
'resizable' datasets be chunked came from the HDF5 library itself, not
from anything above it such as h5py.
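
h5py reflects that requirement directly: ask for a resizable dataset
and chunking is switched on whether or not you request it. A minimal
sketch, assuming any reasonably recent h5py (the file name is just a
placeholder):

    import h5py

    with h5py.File("demo.h5", "w") as f:
        # A fixed-shape dataset may use contiguous storage.
        fixed = f.create_dataset("fixed", shape=(100,), dtype="f8")
        print(fixed.chunks)  # None -> contiguous layout

        # Requesting a resizable dataset (maxshape) forces chunking;
        # h5py picks a chunk shape automatically because HDF5 requires
        # one for extendible datasets.
        grow = f.create_dataset("grow", shape=(100,), maxshape=(None,),
                                dtype="f8")
        print(grow.chunks)   # a tuple -> chunked layout was applied
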
On Thu, 2011-02-17 at 08:45, nls wrote:
> Hi everyone, 
> 
> thanks for the helpful comments.
> 
> I did check that:
> 1. compression is actually on
> 2. I am using the new 1.8 group format
> (this actually required me to write my first nontrivial Cython
> wrapper, since h5py does not provide access to LIBVER_LATEST)
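
For later readers: both checks are easy to script. A minimal sketch,
where the file and dataset names are placeholders, and where the
libver keyword assumes an h5py release newer than the one discussed
above:

    import h5py

    # Confirm the filter pipeline actually took effect.
    with h5py.File("data.h5", "r") as f:    # placeholder file name
        dset = f["mydata"]                  # placeholder dataset name
        print(dset.compression, dset.compression_opts)  # e.g. gzip 4

    # Later h5py releases expose LIBVER_LATEST directly, which makes
    # a hand-rolled Cython wrapper unnecessary there.
    with h5py.File("new.h5", "w", libver="latest") as f:
        pass
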
> 
> Following the helpful advice on the chunk index overhead, I tried to
> use contiguous storage. Unfortunately, I again ran into an
> unsupported feature: h5py only supports resizable datasets when
> they're chunked, even when using the "low-level" functions that wrap
> the HDF5 C API:
> 
> "h5py._stub.NotImplementedError: Extendible contiguous non-external dataset
> (Dataset: Feature is unsupported)"
> 
> Since I do need resizing, I guess I am stuck with chunked datasets
> for now. I tried different chunk sizes, but that did not make a
> noticeable difference.
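
One knob worth noting: the chunk shape can be given explicitly at
creation time. A minimal sketch; the 4096-element chunk below is only
a placeholder, since the best value depends on the data and access
pattern:

    import h5py

    with h5py.File("tuned.h5", "w") as f:
        # Larger chunks mean fewer entries in the chunk B-tree index
        # (less metadata overhead), at the cost of more slack space in
        # the final, partially filled chunk after a resize.
        dset = f.create_dataset("data", shape=(0,), maxshape=(None,),
                                chunks=(4096,), dtype="f8",
                                compression="gzip")
        dset.resize((10000,))
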
> 
> In conclusion, I see no way to get less than about 15x file-size
> overhead when using HDF5 with h5py for my data...
> 
> cheers, Nils
-- 
Mark C. Miller, Lawrence Livermore National Laboratory
================!!LLNL BUSINESS ONLY!!================
[email protected]      urgent: [email protected]
T:8-6 (925)-423-5901    M/W/Th:7-12,2-7 (530)-753-8511

