Hi folks -
I'm just starting to get familiar with the HDF Object API, and I'd like to create a nested compound dataset. I've been experimenting with H5File.createCompoundDS() and am getting the following stack trace:
Exception in thread "main" ncsa.hdf.hdf5lib.exceptions.HDF5LibraryException
        at ncsa.hdf.hdf5lib.H5._H5Tarray_create(Native Method)
        at ncsa.hdf.hdf5lib.H5.H5Tarray_create(H5.java:4411)
        at ncsa.hdf.object.h5.H5CompoundDS.create(H5CompoundDS.java:1167)
        at ncsa.hdf.object.h5.H5File.createCompoundDS(H5File.java:1147)
        at javaExample.H5CompoundDSCreate.main(H5CompoundDSCreate.java:126)

My naive implementation looks like this:
        // retrieve an instance of the HDF5 FileFormat
        FileFormat fileFormat = FileFormat.getFileFormat(FileFormat.FILE_TYPE_HDF5);

        if (fileFormat == null) {
            System.err.println("Cannot find HDF5 FileFormat.");
            return;
        }

        // create a new file with a given file name.
        H5File testFile = (H5File)fileFormat.create(fname);

        if (testFile == null) {
            System.err.println("Failed to create file: " + fname);
            return;
        }

        // open the file and retrieve the root group
        testFile.open();
        Group root = (Group) ((javax.swing.tree.DefaultMutableTreeNode)
                testFile.getRootNode()).getUserObject();

        // create groups at the root
        Group topLevel = testFile.createGroup("top level", root);
        Group secondLevel = testFile.createGroup("second level", topLevel);

        // member names and datatypes for the inner "time distance" compound
        String[] memberNames = {
            "time",
            "distance"
        };

        Datatype[] memberDatatypes = new Datatype[] {
            new H5Datatype(Datatype.CLASS_FLOAT, 8, Datatype.NATIVE, -1),
            new H5Datatype(Datatype.CLASS_FLOAT, 8, Datatype.NATIVE, -1)
        };
        int[] memberSizes = {1, 1};
        int numEntries = 10;
        double[] time = new double[numEntries];
        double[] distance = new double[numEntries];
        Random random = new Random();
        for (int ii = 0; ii < numEntries; ii++) {
            time[ii] = random.nextDouble();
            distance[ii] = 100.0*random.nextDouble();
        }
        Vector<Object> data = new Vector<Object>();
        data.add(time);
        data.add(distance);
        long[] dims = { numEntries };

        // create the inner compound dataset under the second-level group
        Dataset compoundDS = testFile.createCompoundDS("time distance",
                                                       secondLevel,
                                                       dims,
                                                       null,
                                                       null,
                                                       0,
                                                       memberNames,
                                                       memberDatatypes,
                                                       memberSizes,
                                                       data);

        // member names for the outer compound, reusing the inner
        // compound's datatype as the third member
        memberNames = new String[] {
            "height",
            "width",
            "time distance"
        };

        memberDatatypes = new Datatype[] {
            new H5Datatype(Datatype.CLASS_FLOAT, 8, Datatype.NATIVE, -1),
            new H5Datatype(Datatype.CLASS_FLOAT, 8, Datatype.NATIVE, -1),
            compoundDS.getDatatype()
        };
        memberSizes = new int[]{1, 1, 1};
        double[] height = new double[] { random.nextDouble() };
        double[] width = new double[] { random.nextDouble() };
        Vector<Object> topLevelData = new Vector<Object>();
        topLevelData.add(height);
        topLevelData.add(width);
        topLevelData.add(compoundDS.getData());
        dims = new long[] {1};
        // create the outer compound dataset, nesting the first compound as a member
        Dataset topLevelDS = testFile.createCompoundDS("compound ds",
                                                       topLevel,
                                                       dims,
                                                       null,
                                                       null,
                                                       0,
                                                       memberNames,
                                                       memberDatatypes,
                                                       memberSizes,
                                                       topLevelData);
        // close the file resource
        testFile.close();


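In case it helps clarify what I'm after, here's the logical record layout I'm trying to model, sketched as plain Java classes (illustration only; the class names `TimeDistance`, `OuterRecord`, and `LayoutSketch` are just for this sketch, while the field names mirror the member names in the code above):

```java
// Illustration only: plain Java classes mirroring the nested compound
// layout I'm trying to create in HDF5.
class TimeDistance {
    double time;     // member "time"
    double distance; // member "distance"
}

class OuterRecord {
    double height;             // member "height"
    double width;              // member "width"
    TimeDistance timeDistance; // the nested "time distance" compound member
}

public class LayoutSketch {
    public static void main(String[] args) {
        // populate one record to show the intended nesting
        OuterRecord r = new OuterRecord();
        r.timeDistance = new TimeDistance();
        r.height = 1.5;
        r.width = 2.5;
        r.timeDistance.time = 0.5;
        r.timeDistance.distance = 50.0;
        System.out.println("height=" + r.height + " width=" + r.width
                + " time=" + r.timeDistance.time
                + " distance=" + r.timeDistance.distance);
        // prints: height=1.5 width=2.5 time=0.5 distance=50.0
    }
}
```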
Is there a standard idiom that people use for this sort of thing?

Thanks in advance for any guidance.

-Josiah

--
Josiah Slack
MIT Lincoln Laboratory
Group 36
[email protected]
Ph: (781) 981-1754

_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org
