Hi Even,
New Year, new mysteries. I'm seeing strangely slow times when computing the
min/max of netCDF files.
In GMT we can read files via GDAL by appending =gd to the file name, so both
of these do a similar job:
grdinfo grav_29_img.nc=gd
and
gdalinfo grav_29_img.nc -mm
and it takes around 28 sec on a new laptop running Windows and master GDAL.
However in WSL, on the same laptop but with GDAL 2.2.3, they take about 5:30
MINUTES to run. On OSX it takes about 3:30 minutes (that run was not mine; it
used GDAL 3.0.3 from MacPorts).
On Windows, for files of similar size, it runs faster when the data is of type
float (the grid in this example is short int).
Now, perhaps the weirdest thing: if I run the GMT command via our Julia
wrapper it only takes ~8 sec, whilst the same command via the Matlab wrapper
took 1:18 min.
The ~8 sec time is close to what I get with pure GMT, i.e. not using GDAL to
read the nc file (grdinfo grav_29_img.nc -M).
From the GMT side all happens in the call
GDALComputeRasterMinMax(hBand, false, adfMinMax);
And I guess that the same holds from the pure GDAL call.
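For reference, here is a minimal standalone sketch of how that call is exercised through the GDAL C API (the filename argument and band 1 are just assumptions for testing; compile against your GDAL build to reproduce the timings):

```c
/* Minimal sketch: run GDALComputeRasterMinMax() on band 1 of a file,
 * the same call GMT makes internally. Compile against GDAL, e.g.:
 *   gcc minmax.c -o minmax $(gdal-config --cflags --libs)
 */
#include <stdio.h>
#include <gdal.h>

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s file\n", argv[0]);
        return 1;
    }
    GDALAllRegister();
    GDALDatasetH hDS = GDALOpen(argv[1], GA_ReadOnly);
    if (hDS == NULL)
        return 1;
    GDALRasterBandH hBand = GDALGetRasterBand(hDS, 1);
    double adfMinMax[2];
    /* bApproxOK = FALSE forces an exact scan of every block,
     * which is where the slow timings show up. */
    GDALComputeRasterMinMax(hBand, FALSE, adfMinMax);
    printf("min=%g max=%g\n", adfMinMax[0], adfMinMax[1]);
    GDALClose(hDS);
    return 0;
}
```

Timing this binary directly on the grid would tell whether the slowness is entirely inside GDAL or partly in the wrapper layers.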
If you want to test with the grid used in these tests, it's here:
ftp://ftp.soest.hawaii.edu/pwessel/grav_29_img.nc
Happy new year
Joaquim
_______________________________________________
gdal-dev mailing list
[email protected]
https://lists.osgeo.org/mailman/listinfo/gdal-dev