Eric Iverson writes:

 > I am not familiar with HDF5. For big text type files I would use J64 and
 > memory map the big file to a noun. That is a start.

Lettow, Kenneth writes:

 > I have not worked with HDF5 files, but you should take a look at using
 > memory mapped files in J.
 > http://www.jsoftware.com/jwiki/Studio/Mapped%20Files


I had a look at this and it looks quite simple and powerful - as long
as you stay in the J universe. I also suspect that portability of
those files between machines is safe to assume only for text-based
versions; a raw binary mapping carries no metadata, so the reader has
to know the element type, shape, and byte order in advance.
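To make the portability concern concrete, here is a small sketch (in
Python rather than J, using only the standard library) of what goes
wrong when raw memory-mapped binary data crosses machines with
different byte orders. The file name is an illustrative choice.

```python
import mmap, struct, tempfile, os

path = os.path.join(tempfile.mkdtemp(), "data.bin")

# Writer on a little-endian machine: three single-precision floats,
# written as raw bytes with no header describing them.
with open(path, "wb") as f:
    f.write(struct.pack("<3f", 1.0, 2.5, -3.25))

with open(path, "rb") as f:
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
        # A reader that already knows the byte order recovers the values...
        ok = struct.unpack_from("<3f", m)
        # ...but a reader assuming big-endian silently gets garbage.
        wrong = struct.unpack_from(">3f", m)

print(ok)      # (1.0, 2.5, -3.25)
```

The bytes themselves are identical in both reads; only the reader's
assumptions differ, which is exactly what a self-describing format
like netCDF or HDF5 removes.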

So assume you have 60 GB of binary data you need to work with and
archive for ten years - what do you do? That's not an academic
question but my very real situation.

Right now I keep such data in either netCDF or HDF5 files. Both are
platform-neutral, stable binary file formats that let me store
arrays of any data type. My 60 GB are single-precision floats, for
example. I used netCDF in the past, but I am currently transitioning
to HDF5 because of its better performance and more flexible storage
options.
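For readers who have not used HDF5, the workflow described above looks
roughly like this minimal sketch (in Python with the h5py package, not
J; the file name and the dataset name "temps" are illustrative
choices, not anything from this thread):

```python
import tempfile, os
import numpy as np
import h5py

path = os.path.join(tempfile.mkdtemp(), "data.h5")

# Write a small single-precision array. HDF5 records the dtype,
# shape, and byte order inside the file itself.
with h5py.File(path, "w") as f:
    f.create_dataset("temps", data=np.arange(12, dtype="float32").reshape(4, 3))

# Read it back - on any platform - with no prior knowledge of the layout.
with h5py.File(path, "r") as f:
    arr = f["temps"][...]

print(arr.dtype, arr.shape)   # float32 (4, 3)
```

Because the file is self-describing, the ten-year archival question
reduces to the stability of the HDF5 format itself rather than of any
one machine's memory layout.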

Konrad.
----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm
