Hi,

Thanks for your hints. I got an example working thanks to this link:
http://www.unidata.ucar.edu/mailing_lists/archives/netcdf-hdf/2003/msg00073.html
There didn't seem to be an example of writing in the tutorial section
(http://www.hdfgroup.org/HDF5/Tutor/select.html).

I still have a query, though, to help me clarify the HDF5 data "system".

From the first link:
         if ((mem_spaceid = H5Screate_simple(NDIMS, h5dim, NULL)) < 0)
            BAIL(-3);

         for (h5start[0] = 0; h5start[0]<XLEN; h5start[0]++)
         {
            ... //Chopped out the H5Dextend, this is not needed in my case.
            if ((file_spaceid = H5Dget_space(datasetid)) < 0)
               BAIL(-3);
            if (H5Sselect_hyperslab(file_spaceid, H5S_SELECT_SET, h5start,
                                    NULL, h5count, NULL) < 0)
               BAIL(-3);
            if (H5Dwrite(datasetid, H5T_STD_I32BE, mem_spaceid,
                         file_spaceid, H5P_DEFAULT, data))
               BAIL(-5);
            H5Sclose(file_spaceid);
         }

What is the reason for being required to create a "memory space"? Surely all
that needs to be known is the slab information and the destination in the
file, which I presume is what is created and tied to the ID passed to
H5Sselect_hyperslab. Couple this with the original memory location ("data"
above) and that should be it, right? So all you would need is file_spaceid
(which stream in the file I want to dump the data into), the hyperslab (where
in that stream to dump the data) and the data itself.
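
To put this (possibly wrong) picture into code: below is roughly the call
sequence I expected to be sufficient, reusing the names from the snippet
above. This is just an untested sketch of my assumption, not anything from
the documentation:

   /* Untested sketch of what I *expected* to be enough: only a file
      dataspace selection plus the raw buffer, with no separate memory
      space (the snippet above instead passes an explicit mem_spaceid). */
   if ((file_spaceid = H5Dget_space(datasetid)) < 0)     /* which "stream" in the file */
      BAIL(-3);
   if (H5Sselect_hyperslab(file_spaceid, H5S_SELECT_SET, h5start,
                           NULL, h5count, NULL) < 0)     /* where in that stream       */
      BAIL(-3);
   if (H5Dwrite(datasetid, H5T_STD_I32BE, H5S_ALL,       /* no memory space supplied   */
                file_spaceid, H5P_DEFAULT, data) < 0)    /* the data itself            */
      BAIL(-5);
   H5Sclose(file_spaceid);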

This interpretation must be incorrect for two reasons:

* The ID retrieved by calling H5Screate_simple is unneeded in the above
interpretation. The only way I can resolve this is to think of it as two
stages of buffering (memory location -> some kind of staging area -> file).
* The need to continually re-fetch file_spaceid is also unclear -- it
shouldn't be changing, since I am continually using the same datasetid, so the
return value from H5Dget_space should always be the same. There must be some
kind of internal state change, but what, and why? See the sketch below for
what I would have expected instead.
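
Concretely, with the H5Dextend removed, I would have expected the loop from
the first link to reduce to something like the following -- again only an
untested sketch of what I assumed, using the same NDIMS/h5dim/h5start/h5count
variables as that example:

   /* Untested sketch: fetch the file dataspace once and reuse it,
      re-selecting a different hyperslab on each pass through the loop. */
   if ((mem_spaceid = H5Screate_simple(NDIMS, h5dim, NULL)) < 0)
      BAIL(-3);
   if ((file_spaceid = H5Dget_space(datasetid)) < 0)   /* once, outside the loop */
      BAIL(-3);

   for (h5start[0] = 0; h5start[0] < XLEN; h5start[0]++)
   {
      if (H5Sselect_hyperslab(file_spaceid, H5S_SELECT_SET, h5start,
                              NULL, h5count, NULL) < 0)
         BAIL(-3);
      if (H5Dwrite(datasetid, H5T_STD_I32BE, mem_spaceid,
                   file_spaceid, H5P_DEFAULT, data) < 0)
         BAIL(-5);
   }
   H5Sclose(file_spaceid);

Is there any reason that would not work when the dataset extent never changes?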


Still confused, even though it is working :)

Thanks.



--- On Fri, 7/16/10, Quincey Koziol <[email protected]> wrote:

> From: Quincey Koziol <[email protected]>
> Subject: Re: [Hdf-forum] hyperslabs with compound objects
> To: "HDF Users Discussion List" <[email protected]>
> Date: Friday, July 16, 2010, 7:15 AM
> Howdy,
> 
> On Jul 15, 2010, at 1:34 PM, D Haley wrote:
> 
> > Hello,
> > 
> > Can anyone tell me where I am going wrong? I have a dataset
> > that is too big to fit into RAM, so I need to read a file,
> > process it and write the output in sections. I have previously
> > been doing this with fixed-width records, but this has proven
> > too inflexible for my needs, hence I thought I would try HDF5.
> > 
> > I have created a little test program below, where I simply try
> > to select each element in the file one at a time using a
> > hyperslab, and attempt to write to that -- however this is
> > causing problems (see the end of the message) when I try to do
> > this.
> > 
> > I am new to the HDF5 API, and may not fully understand what I
> > am doing.
> 
>     You are setting up the hyperslab with
> H5Sselect_hyperslab, but then aren't using it as a memory or
> file dataspace selection in your call to H5Dwrite (and are
> using H5S_ALL instead).  Take a look in the examples
> that are distributed with HDF5 for uses of
> H5Sselect_hyperslab and you should be able to modify your
> program appropriately.
> 
>     Quincey
> 
> > Thanks.
> > 
> > 
> > ====
> > 
> > #include <iostream>
> > #include <hdf5.h>
> > 
> > using namespace std;
> > 
> > int main()
> > {
> >     hid_t fileId,vlenDataTypeId,dataTypeId,dataSetId,
> >           dataSpaceId,pListId;
> > 
> >     //Try to create the output file
> >     fileId = H5Fcreate("det.hdf5",H5F_ACC_TRUNC,H5P_DEFAULT,H5P_DEFAULT);
> > 
> >     if(!fileId)
> >     {
> >         cerr << "Error opening file" << endl;
> >         return 1;
> >     }
> > 
> >     //Create a variable length data type to store TOI info
> >     vlenDataTypeId = H5Tvlen_create(H5T_NATIVE_FLOAT);
> > 
> > 
> >     const unsigned int HIT_OFFSET_X=0;
> >     const unsigned int HIT_OFFSET_Y=HIT_OFFSET_X+4;
> >     const unsigned int HIT_OFFSET_TOIARRAY=HIT_OFFSET_Y+4;
> >     const unsigned int HIT_OFFSET_VOLTAGE=HIT_OFFSET_TOIARRAY+sizeof(hvl_t);
> >     const unsigned int HIT_OFFSET_PULSENUM=HIT_OFFSET_VOLTAGE+4;
> >     const unsigned int HIT_OFFSET_EVENTTYPE=HIT_OFFSET_PULSENUM+8;
> > 
> >     const unsigned int HIT_SIZE=HIT_OFFSET_EVENTTYPE+2;
> > 
> >     dataTypeId = H5Tcreate(H5T_COMPOUND,HIT_SIZE);
> > 
> >     if(!dataTypeId)
> >         cerr << "Some kind of error with datatypeID" << endl;
> > 
> >     if(H5Tinsert(dataTypeId,"x",HIT_OFFSET_X,H5T_NATIVE_FLOAT))
> >     {
> >         cerr << "Err inserting x into type" << endl;
> >         return 1;
> >     }
> > 
> >     if(H5Tinsert(dataTypeId,"y",HIT_OFFSET_Y,H5T_NATIVE_FLOAT))
> >     {
> >         cerr << "Err inserting y into type" << endl;
> >         return 1;
> >     }
> > 
> >     if(H5Tinsert(dataTypeId,"toiArray",HIT_OFFSET_TOIARRAY,vlenDataTypeId))
> >     {
> >         cerr << "Err inserting toiArray into type" << endl;
> >         return 1;
> >     }
> > 
> >     if(H5Tinsert(dataTypeId,"voltage",HIT_OFFSET_VOLTAGE,H5T_NATIVE_FLOAT))
> >     {
> >         cerr << "Err inserting voltage into type" << endl;
> >         return 1;
> >     }
> > 
> >     if(H5Tinsert(dataTypeId,"pulseNum",HIT_OFFSET_PULSENUM,H5T_NATIVE_ULLONG))
> >     {
> >         cerr << "Err inserting pulseNum into type" << endl;
> >         return 1;
> >     }
> > 
> >     if(H5Tinsert(dataTypeId,"eventType",HIT_OFFSET_EVENTTYPE,H5T_NATIVE_SHORT))
> >     {
> >         cerr << "Err inserting eventType into type" << endl;
> >         return 1;
> >     }
> > 
> >     hsize_t columnDim=5;
> >     dataSpaceId=H5Screate_simple(1,&columnDim,NULL);
> > 
> >     pListId = H5Pcreate(H5P_LINK_CREATE); //The example has this as H5P_DATASET_CREATE, but HDF libs doesn't like it
> > 
> >     dataSetId = H5Dcreate2(fileId,"/hitdata",dataTypeId,dataSpaceId,pListId,NULL,NULL);
> >     //Create the input array
> >     char *buffer = new char[HIT_SIZE];
> > 
> >     float *timeData=new float;
> > 
> >     for(unsigned int ui=0;ui<5; ui++)
> >     {
> >         *timeData=1.2345f;
> > 
> >         *((float *)(buffer+HIT_OFFSET_X)) = 0;
> >         *((float *)(buffer+HIT_OFFSET_Y)) = 1;
> >         ((hvl_t*)(buffer+HIT_OFFSET_TOIARRAY))->len = 1;
> >         ((hvl_t*)(buffer+HIT_OFFSET_TOIARRAY))->p = timeData;
> > 
> >         *((float *)(buffer+HIT_OFFSET_VOLTAGE)) = 2;
> >         *((unsigned long long*)(buffer+HIT_OFFSET_PULSENUM)) = 3;
> >         *((unsigned short*)(buffer+HIT_OFFSET_EVENTTYPE)) = 4;
> > 
> >         hsize_t pos,count;
> >         pos = ui+1;
> >         count=1;
> >         if( H5Sselect_hyperslab(dataSpaceId,H5S_SELECT_SET, &pos,NULL, &count, NULL) < 0)
> >         {
> >             cerr << "Hyperslab selection problem" << endl;
> >             return 1;
> >         }
> > 
> >         cerr << "Writing data: " << endl;
> >         if(H5Dwrite(dataSetId,dataTypeId,H5S_ALL,H5S_ALL,H5P_DEFAULT, buffer))
> >         {
> >             cerr << "Error writing data" << endl;
> >             return 1;
> >         }
> >     }
> > 
> >     delete timeData;
> >     delete[] buffer;
> > 
> >     H5Pclose(pListId);
> >     H5Dclose(dataSetId);
> >     H5Sclose(dataSpaceId);
> >     H5Tclose(vlenDataTypeId);
> >     H5Tclose(dataTypeId);
> >     H5Fclose(fileId);
> > 
> >     return 0;
> > }
> > 
> > 
> > 
> > 
> > 
> > ./main 
> > Writing data: 
> > HDF5-DIAG: Error detected in HDF5 (1.8.4) thread 0:
> >  #000: H5Dio.c line 266 in H5Dwrite(): can't write data
> >    major: Dataset
> >    minor: Write failed
> >  #001: H5Dio.c line 578 in H5D_write(): can't write data
> >    major: Dataset
> >    minor: Write failed
> >  #002: H5Dcontig.c line 557 in H5D_contig_write(): contiguous write failed
> >    major: Dataset
> >    minor: Write failed
> >  #003: H5Dscatgath.c line 677 in H5D_scatgath_write(): datatype conversion failed
> >    major: Dataset
> >    minor: Can't convert datatypes
> >  #004: H5T.c line 4704 in H5T_convert(): data type conversion failed
> >    major: Attribute
> >    minor: Unable to encode value
> >  #005: H5Tconv.c line 2470 in H5T_conv_struct_opt(): unable to convert compound datatype member
> >    major: Datatype
> >    minor: Unable to initialize object
> >  #006: H5T.c line 4704 in H5T_convert(): data type conversion failed
> >    major: Attribute
> >    minor: Unable to encode value
> >  #007: H5Tconv.c line 3140 in H5T_conv_vlen(): can't write VL data
> >    major: Datatype
> >    minor: Write failed
> >  #008: H5Tvlen.c line 1015 in H5T_vlen_disk_write(): Unable to write VL information
> >    major: Datatype
> >    minor: Write failed
> >  #009: H5HG.c line 624 in H5HG_insert(): unable to allocate a global heap collection
> >    major: Heap
> >    minor: Unable to initialize object
> >  #010: H5HG.c line 182 in H5HG_create(): unable to allocate file space for global heap
> >    major: Heap
> >    minor: Unable to initialize object
> >  #011: H5MF.c line 488 in H5MF_alloc(): allocation failed from aggr/vfd
> >    major: Virtual File Layer
> >    minor: Can't allocate space
> >  #012: H5MFaggr.c line 114 in H5MF_aggr_vfd_alloc(): can't allocate metadata
> >    major: Resource unavailable
> >    minor: Can't allocate space
> >  #013: H5MFaggr.c line 219 in H5MF_aggr_alloc(): 'normal' file space allocation request will overlap into 'temporary' file space
> >    major: Resource unavailable
> >    minor: Out of range
> > Error writing data
> > 
> > 
> 


      

_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org
