Hi All,
 
I have run into an oddity when trying to read a GRIB1 dataset.
 
When I extract the data using the following code:
 
 #include "gdal.h"      // GDAL C API
 #include "cpl_conv.h"  // CPLMalloc() / CPLFree()
 #include <iostream>

 GDALRasterBandH hBand = GDALGetRasterBand( hSrcDS, 1 );
 int nXSize = GDALGetRasterBandXSize( hBand );

 // Read the first scanline of band 1 as 32-bit floats.
 float *pafScanline = (float *) CPLMalloc(sizeof(float) * nXSize);
 CPLErr eErr = GDALRasterIO( hBand, GF_Read, 0, 0, nXSize, 1,
                             pafScanline, nXSize, 1, GDT_Float32,
                             0, 0 );
 if( eErr != CE_None )
     std::cerr << "GDALRasterIO failed" << std::endl;

 for( int j = 0; j < nXSize; j++ )
     std::cout << pafScanline[j] << std::endl;

 CPLFree( pafScanline );
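
For completeness, hSrcDS is opened with the usual boilerplate before this snippet (the file name here is just a placeholder for my actual GRIB1 file):

 GDALAllRegister();  // register all drivers, including GRIB
 GDALDatasetH hSrcDS = GDALOpen( "input.grb", GA_ReadOnly );
 if( hSrcDS == NULL )
     std::cerr << "GDALOpen failed" << std::endl;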
 
I am seeing values along the lines of:
12.86
12.802
12.734
12.671
12.562
12.425
12.241
12.12
12.043
12.035
11.986
11.917
11.83
11.776
 
The same GRIB1 file, when dumped using wgrib, produces values (in kelvins):
295.552
295.565
295.577
295.589
295.6
295.61
295.619
295.626
295.633
295.639
 
As an additional piece of information: when I use the same code above and 
extract values that are less than 1, the values appear correct.  It is 
almost as if there is some sort of scaling issue
with values greater than 1.
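
Could this be a scale/offset from the band metadata being applied (or not applied)? I am only guessing that this is relevant, but I was going to check it along these lines:

 // Query the scale/offset recorded on the band, if any.
 int bHasScale = FALSE, bHasOffset = FALSE;
 double dfScale  = GDALGetRasterScale( hBand, &bHasScale );
 double dfOffset = GDALGetRasterOffset( hBand, &bHasOffset );
 std::cout << "scale="  << dfScale  << " (set: " << bHasScale  << "), "
           << "offset=" << dfOffset << " (set: " << bHasOffset << ")"
           << std::endl;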
 
I have tried changing pafScanline to a double and making the corresponding 
GDT_Float64 changes to the GDALRasterIO call, and I get the same results.
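
Concretely, the double version looked roughly like this (with hBand and nXSize as above):

 // Same read as before, but into a 64-bit buffer.
 double *padfScanline = (double *) CPLMalloc(sizeof(double) * nXSize);
 GDALRasterIO( hBand, GF_Read, 0, 0, nXSize, 1,
               padfScanline, nXSize, 1, GDT_Float64,
               0, 0 );
 for( int j = 0; j < nXSize; j++ )
     std::cout << padfScanline[j] << std::endl;
 CPLFree( padfScanline );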
 
Am I missing something fundamental here?
 
Thanks,
Bill
