Well, I had envisioned your 'buffer' as being a collection of datasets.
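
Roughly, that would mean pre-creating N fixed-size datasets up front and
then cycling through them by index as data arrives. A minimal sketch of
that pattern (untested; the file/dataset names, sizes, and float type are
just placeholders, and error checking is omitted):

  #include <stdio.h>
  #include "hdf5.h"

  #define NSLOTS     100     /* datasets making up the circular buffer */
  #define RECORD_LEN 1024    /* samples per dataset (fixed size)       */

  int main(void)
  {
      hsize_t dims[1] = { RECORD_LEN };
      hid_t   file  = H5Fcreate("circular.h5", H5F_ACC_TRUNC,
                                H5P_DEFAULT, H5P_DEFAULT);
      hid_t   space = H5Screate_simple(1, dims, NULL);
      char    name[32];
      int     i;

      /* Create all N fixed-size datasets up front; nothing is written yet. */
      for (i = 0; i < NSLOTS; i++) {
          hid_t dset;
          sprintf(name, "slot%03d", i);
          dset = H5Dcreate2(file, name, H5T_NATIVE_FLOAT, space,
                            H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
          H5Dclose(dset);
      }

      /* Later, to (over)write slot i -- the same calls whether or not the
       * slot has been written before:
       *   dset = H5Dopen2(file, name, H5P_DEFAULT);
       *   H5Dwrite(dset, H5T_NATIVE_FLOAT, H5S_ALL, H5S_ALL,
       *            H5P_DEFAULT, data);
       *   H5Dclose(dset);
       */

      H5Sclose(space);
      H5Fclose(file);
      return 0;
  }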

You could just have a single dataset that is the 'buffer' and then you'd
have to use hyperslabs or selections to write to just a portion of that
dataset (as Quincey already mentioned).
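
For the single-dataset variant, the write path selects the hyperslab for
the record you want to overwrite and writes into just that selection.
Something like this (again only a sketch; the dataset name "buffer", the
2-D record layout, and the slot bookkeeping are illustrative and up to
your application):

  #include "hdf5.h"

  #define NRECORDS   100     /* records kept in the circular buffer */
  #define RECORD_LEN 1024    /* samples per record                  */

  /* Overwrite record 'slot' (0..NRECORDS-1) of a single NRECORDS x
   * RECORD_LEN dataset named "buffer".  The application tracks 'slot'
   * and wraps it modulo NRECORDS, so the oldest record gets overwritten. */
  herr_t write_record(hid_t file, hsize_t slot, const float *data)
  {
      hid_t   dset   = H5Dopen2(file, "buffer", H5P_DEFAULT);
      hid_t   fspace = H5Dget_space(dset);
      hsize_t start[2] = { slot, 0 };
      hsize_t count[2] = { 1, RECORD_LEN };
      hid_t   mspace = H5Screate_simple(2, count, NULL);
      herr_t  status;

      /* Select just the one record in the file dataspace... */
      H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, count, NULL);
      /* ...and write the in-memory record into that selection. */
      status = H5Dwrite(dset, H5T_NATIVE_FLOAT, mspace, fspace,
                        H5P_DEFAULT, data);

      H5Sclose(mspace);
      H5Sclose(fspace);
      H5Dclose(dset);
      return status;
  }

A small 'time of acquisition' attribute on the dataset (H5Acreate2 /
H5Awrite / H5Aclose, or H5Aopen on later passes) is enough bookkeeping to
recover which record is oldest after a restart.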

HTH

Mark

On Thu, 2010-03-25 at 14:03, [email protected] wrote:
> Mark,
> 
> I am new to HDF5 and still working my way through the Tutorials. It looks
> promising thus far, but I have been concerned about the Circular Database
> implementation.
> The dataset size will be static, determined by the time duration for which
> I want to provide data lookup and by the data output rate of the sensors. I
> suppose what I need to figure out, then, based on your approach, is how to
> "seek" to the appropriate location (record) within the dataset for
> continued writing of the data. This is probably where your suggestion of
> adding an attribute (time of acquisition) comes into play.
> 
> Thanks for the reassurance and the tips,
> Kirk
> 
> > You should be able to do that pretty easily with HDF5.
> >
> > If you are absolutely certain your datasets will never, ever change in
> > size, you could create an 'empty' database by going through and creating
> > N datasets (H5Dcreate) of the desired size (H5Screate_simple) but not
> > actually writing anything to any of the datasets.
> >
> > Then, as time evolves, you pick a particular dataset to open (H5Dopen),
> > write to it (writing afresh if the dataset has yet to be written to, or
> > overwriting what's already there if it has already been written to -- it
> > makes no difference to the application; it just calls H5Dwrite), and
> > H5Dclose it.
> >
> > If you think you might want to be able to vary dataset size over time,
> > use 'chunked' datasets (H5Pset_chunk) instead of the default
> > (contiguous). If you need to maintain other tidbits of information about
> > the datasets, such as time of acquisition, sensor number, etc., and that
> > data is 'small' (<16 KB), attach attributes (H5Acreate) to your datasets
> > and overwrite those attributes as you would datasets (H5Aopen, H5Awrite,
> > H5Aclose).
> >
> > Mark
> >
> >
> > On Thu, 2010-03-25 at 13:11, [email protected] wrote:
> >> I am interested in using HDF5 to manage sensor data within a continuous
> >> Circular Database/File. I wish to define a database of a fixed size to
> >> manage a finite amount of historical data. When the database file is
> >> full (i.e., reaches the defined capacity), I would like to begin
> >> overwriting the oldest data within the file. This is for an application
> >> on a system where I only care about the most recent data over a specific
> >> duration, with obvious constraints on the amount of storage available.
> >>
> >> Does HDF5 have such a capability, or does anyone have a recommended
> >> approach or suggestions?
> >>
> >> Best Regards,
> >> Kirk Harrison
> >>
> >>
> >>
> 
-- 
Mark C. Miller, Lawrence Livermore National Laboratory
================!!LLNL BUSINESS ONLY!!================
[email protected]      urgent: [email protected]
T:8-6 (925)-423-5901     M/W/Th:7-12,2-7 (530)-753-851


_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org
