On 18/08/11 20:49, John Knutson wrote:
> In my own work, I use pre-sized datasets with compound types.  Our data
> are not homogeneous, so the compound types are pretty much required.
> The pre-sized datasets affords us the ability to actively index the data
> on read and write, which is of particular value on the read-side where
> knowing exactly where to find the data you want is highly valuable.
> 
> If, on the other hand, you just need to store data that you're never
> going to do partial reads of, packet tables are easier to implement.

Dear John,

thank you very much for your hints. Partial reads and absolute
performance are not critical for my application, so I think I'll give
packet tables a try. On the other hand, easily accessing the data from
Python and Matlab is a requirement.

Are packet tables (or their underlying structures) easily accessible
from Matlab? Python has the h5py library, which, if it cannot already
handle packet tables, should not be too difficult to extend.

A while back I investigated the kind of data structure used by PyTables,
which offers the kind of functionality I'm interested in. From my
approximate knowledge of HDF5, I understood that it uses a "custom"
version of HDF5 tables that supports chunked writes. However, I need
compatibility with Matlab and the ability to write those files from a
C library. How does such a structure compare with packet tables?

Thank you. Cheers,
-- 
Daniele

_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org
