On Tuesday 13 April 2010 10:42:03, Stamminger, Johannes wrote:
> > Mmh, if you don't want to have the nuisance of maintaining several tables
> > for keeping your data, another possibility would be to compress your data
> > before injecting it into variable length types in HDF5.  You will have to
> > deal with the zlib API to do so, but probably that would be easier than
> > what you are planning.  And you would get better results in terms of
> > efficiency too.  The drawback is that you won't be able to read your data
> > by using standard HDF5 tools (like HDFView, for example).
> 
> I need the references anyway, as the order in which I write the blobs is
> not the correct one. I have to do some sorting, so the refs dataset is
> needed in any case. And this is the only "maintenance" I have to deal
> with, as far as I see. Though I have not yet tested reading the blobs
> through this indirection ...
> 
> Additionally, I'm still not able to write variable length binary data -
> which is what I'm trying to get help with in the mentioned thread
> "Varying length binary data".
> 
> And finally, I doubt that compressing each blob separately would be very
> efficient (the mean size is just 465 bytes - and this is realistic data) -
> and I definitely need the ability to access each blob separately.
> 
> 
> So the refs list seems more appealing to me at the moment ... ?
> 

Fair enough.  It seems so, yes.
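For completeness, here is a minimal sketch of the per-blob compression I was suggesting, in plain Python with the standard zlib module (the blob contents below are invented; a real C program would use the zlib API directly, and the packed bytes would then go into the HDF5 variable-length field). Note your doubt is well founded: zlib adds a fixed header/trailer overhead per stream, so whether per-blob compression pays off at ~465 bytes depends entirely on the redundancy inside each individual blob.

```python
import os
import zlib

def pack(blob: bytes) -> bytes:
    """Compress one blob before storing it in a variable-length field."""
    return zlib.compress(blob, 6)

def unpack(data: bytes) -> bytes:
    """Decompress one blob read back from the file."""
    return zlib.decompress(data)

# Two hypothetical blobs around the 465-byte mean size from the thread:
repetitive = b"sensor=42;" * 47   # 470 bytes, highly redundant
random_like = os.urandom(465)     # incompressible payload

for name, blob in [("repetitive", repetitive), ("random-like", random_like)]:
    packed = pack(blob)
    assert unpack(packed) == blob  # round trip is lossless
    print(f"{name}: {len(blob)} -> {len(packed)} bytes")
```

Running this on real samples of your data would tell you quickly whether the zlib route is worth anything at your blob sizes; for random-like payloads the packed stream actually comes out slightly larger than the original.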

-- 
Francesc Alted

_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org