I actually just created a buffer array and then passed it back into a vector.
But my point is that this is an extremely common way to store data in C++,
if not the default way for most users. Vectors are extremely common as well,
so why are we still held back by using C-style arrays?
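
For illustration, here is a minimal sketch of the vector-based approach I mean
(the file name, dataset name, and values are just placeholders); the vector's
storage is a single contiguous block, which is what dataset.write() expects:

#include "H5Cpp.h"
#include <vector>
using namespace H5;

int main()
{
  const hsize_t rows = 4, cols = 6;
  hsize_t dims[2] = { rows, cols };

  // Keep the 2D data in one contiguous vector, indexed as r*cols + c.
  std::vector<double> data(rows * cols);
  for (hsize_t r = 0; r < rows; ++r)
    for (hsize_t c = 0; c < cols; ++c)
      data[r * cols + c] = r + c;

  H5File file("vector.h5", H5F_ACC_TRUNC);
  DataSpace dataspace(2, dims);
  DataSet dataset = file.createDataSet("test", PredType::IEEE_F64LE, dataspace);

  // data.data() points at all rows*cols values laid out contiguously.
  dataset.write(data.data(), PredType::NATIVE_DOUBLE);

  return 0;  // file, dataset, and dataspace close automatically via destructors
}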

On Mon, May 9, 2016 at 12:58 PM, David <[email protected]> wrote:

> Hi Steve,
>
> boost::multi_array provides a clean interface for multi dimensional arrays
> in C++.
>
> You can also do something like this:
>
> auto data = new double[rows*cols]; // allocate all data in one block
> auto md_data = new double*[rows];  // allocate pointers for each row
> for (int r = 0; r != rows; ++r)    // set row pointers
>     md_data[r] = data + r*cols;
> md_data[2][5] = 1.0;  // row pointer array can be used as a pseudo md array
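>
> For the boost::multi_array route mentioned above, a minimal sketch (rows and
> cols as before) could look like this; its storage is likewise one contiguous
> block:
>
> #include <boost/multi_array.hpp>
>
> boost::multi_array<double, 2> md(boost::extents[rows][cols]);
> md[2][5] = 1.0;             // natural 2D indexing
> double* block = md.data();  // pointer to the contiguous underlying storage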
>
>
> On Fri, May 6, 2016 at 1:29 PM, Steven Walton <[email protected]>
> wrote:
>
>> So I am noticing some interesting behavior and am wondering if there is a
>> way around it.
>> I am able to assign a rank-1 array dynamically and write it to an HDF5 file,
>> but I do not seem to be able to do the same with higher-rank arrays. I would
>> like to be able to write a PPx array to h5 and retain the data integrity.
>> More specifically, I am trying to create an easy-to-use vector-to-array
>> library <https://github.com/stevenwalton/H5Easy> that can handle
>> multidimensional data (it works with rank 1).
>>
>> Let me give some examples. I will also show the typenames of the arrays.
>>
>> Works:
>> double *a = new double[numPts]; // typename: Pd
>> double a[numPts]; // typename A#pts_d
>> double a[num1][num2]; // typename: Anum1_Anum2_d
>>
>> What doesn't work:
>> double **a = new double*[num1];
>> for ( size_t i = 0; i < num1; ++i )
>>    a[i] = new double[num2];
>> // typename PPd
>>
>> Testing the saved arrays with h5dump (and by loading and reading them
>> directly), I find that if the typename is PPx (not necessarily double) I get
>> garbage stored. Here is an example program and the h5dump output showing the
>> behavior.
>> ------------------------------------------------------------
>> compiled with h5c++ -std=c++11
>> ------------------------------------------------------------
>> #include "H5Cpp.h"
>> using namespace H5;
>>
>> #define FILE "multi.h5"
>>
>> int main()
>> {
>>   hsize_t dims[2];
>>   herr_t status;
>>   H5File file(FILE, H5F_ACC_TRUNC);
>>   dims[0] = 4;
>>   dims[1] = 6;
>>
>>   double **data = new double*[dims[0]];
>>   for ( size_t i = 0; i < dims[0]; ++i )
>>     data[i] = new double[dims[1]];
>>
>>   for ( size_t i = 0; i < dims[0]; ++i )
>>     for ( size_t j = 0; j < dims[1]; ++j )
>>       data[i][j] = i + j;
>>
>>   DataSpace dataspace = DataSpace(2,dims);
>>   DataSet dataset( file.createDataSet( "test", PredType::IEEE_F64LE,
>> dataspace ) );
>>   dataset.write(data, PredType::IEEE_F64LE);
>>   dataset.close();
>>   dataspace.close();
>>   file.close();
>>
>>   return 0;
>> }
>> ------------------------------------------------------------
>> h5dump
>> ------------------------------------------------------------
>> HDF5 "multi.h5" {
>> GROUP "/" {
>>    DATASET "test" {
>>       DATATYPE  H5T_IEEE_F64LE
>>       DATASPACE  SIMPLE { ( 4, 6 ) / ( 4, 6 ) }
>>       DATA {
>>       (0,0): 1.86018e-316, 1.86018e-316, 1.86018e-316, 1.86019e-316, 0,
>>       (0,5): 3.21143e-322,
>>       (1,0): 0, 1, 2, 3, 4, 5,
>>       (2,0): 0, 3.21143e-322, 1, 2, 3, 4,
>>       (3,0): 5, 6, 0, 3.21143e-322, 2, 3
>>       }
>>    }
>> }
>> }
>> ------------------------------------------------------------------
>> As can be seen, the (0,0) row is absolute garbage (except the last value,
>> which is the first number of the actual array), (0,5) is out of bounds and
>> holds garbage data, and (1,0) has always contained real data (though it
>> should be located at (0,0)). So this seems like some kind of addressing
>> problem.
>>
>> Is this a bug in the h5 libraries, which let me read and write Pd data as
>> well as Ax0_...Axn_t data but not P...Pt data? Or is it for some reason
>> intentional? Since using new is a fairly standard way to allocate arrays,
>> making P...Pt type data common, I have a hard time seeing this as
>> intentional. In the meantime, is anyone aware of a workaround? The data I am
>> taking in will be dynamically allocated, so I do not see a way to get
>> Ax_... type data.
>>
>> Thank you,
>> Steven
>>
>
>
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5
