Hi, I've written an application that uses the object interface (hdf.object) that ships with HDFView. After migrating from 2.13 to 2.14 or 3.0.0, an exception is thrown when writing the second "slice" of data. The exception and the relevant code are below. Any ideas what I've done wrong?
Thanks,
Scott

hdf.hdf5lib.exceptions.HDF5DataspaceInterfaceException: Out of range
        at hdf.hdf5lib.H5.H5Dwrite_double(Native Method)
        at hdf.hdf5lib.H5.H5Dwrite(H5.java:1972)
        at hdf.hdf5lib.H5.H5Dwrite(H5.java:1905)
        at hdf.object.h5.H5ScalarDS.write(Unknown Source)
        at com.agi.stk.volumetric.VolumetricFile.VolumetricFileFromRasterInfo(VolumetricFile.java:642)
        at Main.originalMain(Main.java:517)
        at Main.main(Main.java:105)

FYI: numCoord1 = 180, numCoord2 = 360, numCoord3 = 1, numTimes = 12

// Create the Values dataset
{
    long[] dims = { numCoord1, numCoord2, (numCoord3 > 1) ? numCoord3 : 2, numTimes };
    long[] maxDims = dims;

    Datatype dataType = null;
    try {
        dataType = h5File.createDatatype(Datatype.CLASS_FLOAT, 8, Datatype.ORDER_LE, -1);
    } catch (Exception e) {
        e.printStackTrace();
    }

    hdf.object.Dataset dataset = null;
    try {
        dataset = h5File.createScalarDS("Values", volumetricData, dataType, dims, maxDims, null, 0, null);
    } catch (Exception e) {
        e.printStackTrace();
    }

    try {
        // Define the dataset
        dataset.write();
    } catch (Exception e) {
        e.printStackTrace();
    }

    // Write each raster to the dataset
    dataset.init();
    long[] start = dataset.getStartDims();

    double[] values = new double[numCoord1 * numCoord2];
    double[] rowValues = new double[numCoord2];
    int valuesOffset;

    // Each file contains data for one altitude at one forecast time
    int a = 0;
    int t = 0;
    for (int r = 0; r < rasterFilePaths.getLength(); r++) {
        System.out.printf("Reading raster: %s%n", rasterFilePaths.item(r).getTextContent());

        Dataset hDataset2 = gdal.Open(rasterFilePaths.item(r).getTextContent(), gdalconst.GA_ReadOnly);
        Band hBand = hDataset2.GetRasterBand(1);
        int bandDataType = hBand.GetRasterDataType();

        for (int row = 0; row < numCoord1; ++row) {
            // GDAL raster (0, 0) is top left. STK wants (0, 0) to be bottom left.
            hBand.ReadRaster(
                0, row,                   // xOff, yOff
                numCoord2, 1,             // xSize, ySize
                numCoord2, 1,             // bufXSize, bufYSize
                bandDataType, rowValues,  // bufType, buffer
                0, 0);                    // nPixelSpace, nLineSpace

            valuesOffset = (numCoord1 - row - 1) * numCoord2;
            System.arraycopy(rowValues, 0, values, valuesOffset, numCoord2);
        }
        hDataset2.delete();

        try {
            start[2] = a++;
            start[3] = t;
            dataset.write(values);

            if (numCoord3 == 1) {
                // Need to duplicate altitude values to create a volume.
                // See if skipping this avoids the exception. Did for Thunderstorm.
                start[2] = a;
                dataset.write(values); // ***** THIS IS LINE 642 *****
            }

            if (a >= numCoord3) {
                a = 0;
                t++;
            }
        } catch (HDF5DataspaceInterfaceException e) {
            e.printStackTrace();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
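For reference, here is my mental model of how the hdf.object selection arrays are supposed to drive per-slice writes, boiled down to a standalone sketch. The file name, parent group (null, which I believe defaults to the root group), dimensions, and fill values are all made up for illustration, and I haven't verified this exact snippet against 2.14/3.0.0; it just shows the pattern I'm trying to follow: init() populates the start/selected arrays, and each write(buf) writes the hyperslab they define, so buf.length has to equal the product of the selected dims and start + selected must stay inside dims.

import hdf.object.Dataset;
import hdf.object.Datatype;
import hdf.object.FileFormat;
import hdf.object.h5.H5File;

import java.util.Arrays;

public class SliceWriteSketch {
    public static void main(String[] args) throws Exception {
        // Made-up dimensions: rows, columns, altitudes, times.
        int ny = 4, nx = 5, nz = 2, nt = 3;

        H5File file = new H5File("sketch.h5", FileFormat.CREATE);
        file.open();

        Datatype dtype = file.createDatatype(Datatype.CLASS_FLOAT, 8, Datatype.ORDER_LE, -1);
        long[] dims = { ny, nx, nz, nt };

        // Parent group is null here, which (as I understand it) places the dataset at the root group.
        Dataset dataset = file.createScalarDS("Values", null, dtype, dims, dims, null, 0, null);

        // init() populates the rank/start/selected arrays used for subsetting.
        dataset.init();
        long[] start = dataset.getStartDims();       // offset of the hyperslab
        long[] selected = dataset.getSelectedDims(); // extent of the hyperslab

        // Select one ny x nx slice at a time; the buffer passed to write()
        // must contain exactly ny * nx elements (the product of selectedDims).
        selected[0] = ny;
        selected[1] = nx;
        selected[2] = 1;
        selected[3] = 1;

        double[] slice = new double[ny * nx];
        for (int z = 0; z < nz; z++) {
            for (int t = 0; t < nt; t++) {
                Arrays.fill(slice, z * 10.0 + t);  // made-up data
                start[2] = z;                      // must stay < dims[2]
                start[3] = t;                      // must stay < dims[3]
                dataset.write(slice);              // writes one hyperslab
            }
        }
        file.close();
    }
}

In my real code above I never touch getSelectedDims(), so I'm relying on whatever init() selects by default; if that default behavior changed between 2.13 and 2.14, it might be related, but I'm not sure.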