> This is, more or less, the procedure to follow.  All the important
> methods mentioned here are documented with docstrings and with
> comments.  Please read them carefully as they will hopefully provide
> you with clues for getting your job done.
> 


Francesc, thanks for the advice! I've made real progress and am
successfully reading scalar compound data types in attributes. However,
there's one thing I'm running into that I'm not quite sure about. It may
be more of a general HDF5 question, so if it is off-topic I apologize.
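
For concreteness, here is roughly what the reading side looks like from
the user's point of view with my patched build. All the file, node, and
attribute names below are invented for illustration:

    import tables

    # Everything here is made-up example data; the point is just that a
    # compound ("struct") attribute comes back as a NumPy structured
    # scalar/array instead of failing.
    h5f = tables.open_file('recordings.h5', mode='r')
    table = h5f.root.session01.data

    meta = table.attrs.session_meta   # scalar compound attribute
    print(meta.dtype.names)           # the struct's field names
    print(meta)

    h5f.close()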

To my knowledge I've rarely, if ever, worked with the HDF5 "array"
datatypes; instead I normally use a multidimensional dataspace with a
base datatype. For example, I currently want to associate some
"recording session metadata" with each table. This is a four-field
struct, and I will generally have only one entry in it, but sometimes
there may be up to ~5.
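
To make that concrete, in NumPy terms the struct looks roughly like the
following. The field names and values are made up, and the attribute
write at the end assumes a PyTables that can store compound attributes
natively:

    import numpy as np
    import tables

    # Stand-in for the four-field "recording session metadata" struct.
    session_dtype = np.dtype([('subject',    'S32'),
                              ('date',       'S10'),
                              ('n_channels', np.int32),
                              ('rate_hz',    np.float64)])

    # Usually a single entry, occasionally a handful.
    sessions = np.zeros(2, dtype=session_dtype)
    sessions[0] = (b'rat07', b'2008-08-01', 64, 30000.0)
    sessions[1] = (b'rat07', b'2008-08-02', 64, 30000.0)

    # Attach it to a table as an attribute (illustrative paths/names).
    h5f = tables.open_file('recordings.h5', mode='a')
    h5f.root.session01.data.attrs.session_meta = sessions
    h5f.close()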

Instead of using a non-scalar dataspace with an H5T_COMPOUND type, is it
smarter to create an array datatype with the H5T_COMPOUND as the base
type? In your experience, will PyTables (and perhaps other tools as
well) be happier?
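
In case it helps to see the two layouts spelled out, here is a rough
sketch of what I mean at the HDF5 level. I'm reaching for h5py's
low-level bindings only because they map almost one-to-one onto the C
calls; h5py isn't otherwise part of my setup, and every name here is
invented:

    import numpy as np
    import h5py
    from h5py import h5a, h5s, h5t

    session_dtype = np.dtype([('subject',    'S32'),
                              ('date',       'S10'),
                              ('n_channels', np.int32),
                              ('rate_hz',    np.float64)])
    meta = np.zeros(3, dtype=session_dtype)

    with h5py.File('layout_demo.h5', 'w') as f:
        dset = f.create_dataset('recordings', data=np.arange(10))
        ctid = h5t.py_create(session_dtype, logical=True)

        # Layout 1: simple (1-D) dataspace, element type is the compound.
        sid = h5s.create_simple((3,))
        aid = h5a.create(dset.id, b'meta_simple_space', ctid, sid)
        aid.write(meta)

        # Layout 2: scalar dataspace, datatype is an H5T_ARRAY with the
        # compound as its base type.  (Data write omitted; the point is
        # only how the type and dataspace differ on disk.)
        atid = h5t.array_create(ctid, (3,))
        sid0 = h5s.create(h5s.SCALAR)
        h5a.create(dset.id, b'meta_array_type', atid, sid0)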

PyTables is by far my preferred way of interacting with HDF5, but I have
users who will be working with both the Java HDF5 interface and MATLAB's
built-in HDF5 support, so I'd be curious to know whether anyone on the
list sees any virtue in one approach over the other.


Thanks again, 
                ...Eric



