Hi Peter -
Thanks for the quick response. I had a feeling that I wasn't going to just be able to use the object-level API.

I'm in the middle of exploring the viability of writing HDF files from my Java application. If we decide that's what we want to do, I'll be interested in pursuing the option you mentioned - mapping between Java and C structures for compound data.

-Josiah

On 1/5/2011 1:42 PM, Peter Cao wrote:
Hi Josiah,

Writing nested compound datasets is not supported at the HDF-Java object level.
You may be able to accomplish this at the JNI level, i.e. by using the JNI calls
directly. Even there, however, there is no better way than writing the data field
by field, since our current HDF-Java implementation does not map any data
structures from Java to C.

Below is an example of writing field by field using the HDF5 JNI (the Java wrapper).

If you would like to pursue a better way, e.g. directly mapping between Java and
C structures for compound data, please let us know (a rough sketch of that idea
follows the example below).

=========================
      private static void createNestedcompound(String fname, String dname) throws Exception
      {
          int DIM1 = 50;
          long[] dims = {DIM1};
          int cmpSize = 20;
          int fid=-1, did=-1, tid = -1, tid_nested=-1, sid=-1;
          int indexData[] = new int[DIM1];
          double lonData[] = new double[DIM1];
          double latData[] = new double[DIM1];


          for (int i=0; i<DIM1; i++) {
              indexData[i] = i;
              lonData[i] = 5200.1+i;
              latData[i] = 10.2+i;
          }

          fid = H5.H5Fcreate(fname, HDF5Constants.H5F_ACC_TRUNC,
                  HDF5Constants.H5P_DEFAULT, HDF5Constants.H5P_DEFAULT);
          sid = H5.H5Screate_simple(1, dims, null);

          // nested compound type (16 bytes): "Lon" at offset 0, "Lat" at offset 8
          tid_nested = H5.H5Tcreate(HDF5Constants.H5T_COMPOUND, 16);
          H5.H5Tinsert(tid_nested, "Lon", 0, HDF5Constants.H5T_NATIVE_DOUBLE);
          H5.H5Tinsert(tid_nested, "Lat", 8, HDF5Constants.H5T_NATIVE_DOUBLE);

          // outer compound type (20 bytes): int "index" at offset 0, nested "location" at offset 4
          tid = H5.H5Tcreate(HDF5Constants.H5T_COMPOUND, cmpSize);
          H5.H5Tinsert(tid, "index", 0, HDF5Constants.H5T_NATIVE_INT32);
          H5.H5Tinsert(tid, "location", 4, tid_nested);

          did = H5.H5Dcreate(fid, dname, tid, sid, HDF5Constants.H5P_DEFAULT,
                  HDF5Constants.H5P_DEFAULT, HDF5Constants.H5P_DEFAULT);

          // write the first field "index": the memory type is a compound holding
          // only that field; the library matches compound members by name
          int tid_tmp = H5.H5Tcreate(HDF5Constants.H5T_COMPOUND, 4);
          H5.H5Tinsert(tid_tmp, "index", 0, HDF5Constants.H5T_NATIVE_INT32);
          H5.H5Dwrite(did, tid_tmp, HDF5Constants.H5S_ALL, HDF5Constants.H5S_ALL,
                  HDF5Constants.H5P_DEFAULT, indexData);
          H5.H5Tclose(tid_tmp);

          // write the first field of the nested compound, "location"->"Lon"
          tid_tmp = H5.H5Tcreate(HDF5Constants.H5T_COMPOUND, 8);
          H5.H5Tinsert(tid_tmp, "Lon", 0, HDF5Constants.H5T_NATIVE_DOUBLE);
          int tid_tmp_nested = H5.H5Tcreate(HDF5Constants.H5T_COMPOUND, 8);
          H5.H5Tinsert(tid_tmp_nested, "location", 0, tid_tmp);
          H5.H5Dwrite(did, tid_tmp_nested, HDF5Constants.H5S_ALL, HDF5Constants.H5S_ALL,
                  HDF5Constants.H5P_DEFAULT, lonData);
          H5.H5Tclose(tid_tmp_nested);
          H5.H5Tclose(tid_tmp);

          // write the second field of the nested compound, "location"->"Lat"
          tid_tmp = H5.H5Tcreate(HDF5Constants.H5T_COMPOUND, 8);
          H5.H5Tinsert(tid_tmp, "Lat", 0, HDF5Constants.H5T_NATIVE_DOUBLE);
          tid_tmp_nested = H5.H5Tcreate(HDF5Constants.H5T_COMPOUND, 8);
          H5.H5Tinsert(tid_tmp_nested, "location", 0, tid_tmp);
          H5.H5Dwrite(did, tid_tmp_nested, HDF5Constants.H5S_ALL, HDF5Constants.H5S_ALL,
                  HDF5Constants.H5P_DEFAULT, latData);
          H5.H5Tclose(tid_tmp_nested);
          H5.H5Tclose(tid_tmp);

          H5.H5Tclose(tid);
          H5.H5Tclose(tid_nested);
          H5.H5Sclose(sid);
          H5.H5Dclose(did);
          H5.H5Fclose(fid);
      }
========================================
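
In case it helps to see what such a mapping would amount to if done by hand: you
could pack each record into a byte buffer whose layout matches the 20-byte
compound type above and write the whole dataset in one call. This is only a
rough sketch, not something the object layer does for you; it assumes your
H5.H5Dwrite binding has an overload that accepts a raw byte[] buffer, and it
relies on the member offsets (0, 4, 12) and native byte order used in the example.

=========================
          // pack all records to match the compound layout defined above:
          // int "index" at offset 0, double "Lon" at 4, double "Lat" at 12
          java.nio.ByteBuffer buf = java.nio.ByteBuffer.allocate(DIM1 * cmpSize)
                  .order(java.nio.ByteOrder.nativeOrder());
          for (int i = 0; i < DIM1; i++) {
              buf.putInt(indexData[i]);
              buf.putDouble(lonData[i]);
              buf.putDouble(latData[i]);
          }
          // one write of the full compound instead of three field-by-field writes
          H5.H5Dwrite(did, tid, HDF5Constants.H5S_ALL, HDF5Constants.H5S_ALL,
                  HDF5Constants.H5P_DEFAULT, buf.array());
=========================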

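For completeness, a minimal driver for the method above might look like the
following. The class name, file name, and dataset name are arbitrary
placeholders; it assumes createNestedcompound() sits in the same class and that
the hdf-java jars are on the classpath.

=========================
import ncsa.hdf.hdf5lib.H5;
import ncsa.hdf.hdf5lib.HDF5Constants;

public class NestedCompoundExample {

      public static void main(String[] args) throws Exception {
          // creates "nested.h5" with a 50-element dataset "/Dataset1"
          // whose datatype is the nested compound shown above
          createNestedcompound("nested.h5", "/Dataset1");
      }

      // ... createNestedcompound() as listed above ...
}
=========================
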
On 1/5/2011 11:12 AM, Josiah Slack wrote:
Hi folks -
I'm just starting to get familiar with the HDF Object API, and I'm
interested in creating a nested compound Dataset.  I've been
experimenting with H5File.createCompoundDS(), and am getting the
following stack trace:
Exception in thread "main" ncsa.hdf.hdf5lib.exceptions.HDF5LibraryException
     at ncsa.hdf.hdf5lib.H5._H5Tarray_create(Native Method)
     at ncsa.hdf.hdf5lib.H5.H5Tarray_create(H5.java:4411)
     at ncsa.hdf.object.h5.H5CompoundDS.create(H5CompoundDS.java:1167)
     at ncsa.hdf.object.h5.H5File.createCompoundDS(H5File.java:1147)
     at javaExample.H5CompoundDSCreate.main(H5CompoundDSCreate.java:126)

My naive implementation looks like this:
         // retrieve an instance of H5File
         FileFormat fileFormat = FileFormat.getFileFormat(FileFormat.FILE_TYPE_HDF5);

         if (fileFormat == null) {
             System.err.println("Cannot find HDF5 FileFormat.");
             return;
         }

         // create a new file with a given file name.
         H5File testFile = (H5File)fileFormat.create(fname);

         if (testFile == null) {
             System.err.println("Failed to create file:"+fname);
             return;
         }

         // open the file and retrieve the root group
         testFile.open();
         Group root = (Group) ((javax.swing.tree.DefaultMutableTreeNode)
                 testFile.getRootNode()).getUserObject();

         // create groups at the root
         Group topLevel = testFile.createGroup("top level", root);
         Group secondLevel = testFile.createGroup("second level", topLevel);
         String[] memberNames = {
             "time",
             "distance"
         };

         Datatype[] memberDatatypes = new Datatype[] {
             new H5Datatype(Datatype.CLASS_FLOAT, 8, Datatype.NATIVE, -1),
             new H5Datatype(Datatype.CLASS_FLOAT, 8, Datatype.NATIVE, -1)
         };
         int[] memberSizes = {1, 1};
         int numEntries = 10;
         double[] time = new double[numEntries];
         double[] distance = new double[numEntries];
         Random random = new Random();
         for (int ii = 0; ii < numEntries; ii++) {
             time[ii] = random.nextDouble();
             distance[ii] = 100.0*random.nextDouble();
         }
         Vector<Object> data = new Vector<Object>();
         data.add(time);
         data.add(distance);
         long[] dims = { numEntries };

         Dataset compoundDS = testFile.createCompoundDS("time distance",
                                                        secondLevel,
                                                        dims,
                                                        null,
                                                        null,
                                                        0,
                                                        memberNames,
                                                        memberDatatypes,
                                                        memberSizes,
                                                        data);

         memberNames = new String[] {
             "height",
             "width",
             "time distance"
         };

         memberDatatypes = new Datatype[] {
             new H5Datatype(Datatype.CLASS_FLOAT, 8, Datatype.NATIVE, -1),
             new H5Datatype(Datatype.CLASS_FLOAT, 8, Datatype.NATIVE, -1),
             compoundDS.getDatatype()
         };
         memberSizes = new int[]{1, 1, 1};
         double[] height = new double[] { random.nextDouble() };
         double[] width = new double[] { random.nextDouble() };
         Vector<Object> topLevelData = new Vector<Object>();
         topLevelData.add(height);
         topLevelData.add(width);
         topLevelData.add(compoundDS.getData());
         dims = new long[] {1};
         Dataset topLevelDS = testFile.createCompoundDS("compound ds",
                                                        topLevel,
                                                        dims,
                                                        null,
                                                        null,
                                                        0,
                                                        memberNames,
                                                        memberDatatypes,
                                                        memberSizes,
                                                        topLevelData);
         // close file resource
         testFile.close();


Is there a standard idiom that people use for this sort of thing?

Thanks in advance for any guidance.

-Josiah


--
Josiah Slack
MIT Lincoln Laboratory
Group 36
[email protected]
Ph: (781) 981-1754
