Hi, I converted a 153600 x 249600 pixel raster dataset into a Rasterlite table. After creating overviews the result behaves well in Quantum GIS once the table is opened. Unfortunately, opening the table with QGIS installed on a mini PC takes ages.
I do not know what QGIS is really doing, but I guessed that perhaps it wants some metadata and makes a gdalinfo request or something equivalent. I therefore made a test with a somewhat more powerful computer, and running gdalinfo there took about 3 minutes. That feels rather slow. Is gdalinfo perhaps walking through every single tile in the Rasterlite table to gather the image layer info? Could it be done in a more effective way on the GDAL side?

I have a feeling that this may not really be a GDAL problem. Doing a plain SELECT count(*) FROM test_rasters with spatialite-gui is also very slow and took about the same 3 minutes in my quick test. That makes me think that to get more speed, Rasterlite should store more information about the tiles in the raster_metadata tables to provide instant access to the data. Perhaps this is something to discuss with Alessandro, who is developing the new Rasterlite 2, which should be ready by the end of this year.

When it comes to GDAL, could it make any sense to cache gdalinfo results for Rasterlite layers? Three minutes is rather a long time, and my 153600 x 249600 pixel layer with 780270 rows/tiles at 5 meter resolution is not exceptionally big. If the time grows with the tile count, it would mean 12 minutes to get gdalinfo from a 2.5 meter resolution layer and 48 minutes from a 1.25 meter layer...

A test with gdal_translate makes me think that it also reads the layer info first, spending about 3 minutes on that before starting the real work. The timing is very uncertain, though; I could only take the time before the progress bar appears on the screen.

-Jukka Rahkonen-

_______________________________________________
gdal-dev mailing list
[email protected]
http://lists.osgeo.org/mailman/listinfo/gdal-dev
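The count(*) observation above can be sketched in a few lines: a full-table count has to visit every row, so it scales with the tile count, while a value precomputed once into a small metadata table is read back instantly. This is only an illustration of the idea, assuming a hypothetical schema; the table and key names below are made up and are not the actual Rasterlite layout.

```python
import sqlite3

# Hypothetical sketch: compare a full-table count against a precomputed
# count stored in a small metadata table. Schema names are invented.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Simulate a tile table with many rows.
cur.execute("CREATE TABLE test_rasters (id INTEGER PRIMARY KEY, tile BLOB)")
cur.executemany("INSERT INTO test_rasters (tile) VALUES (?)",
                [(b"",)] * 100000)

# Slow path: scan the whole table; work grows with the number of tiles.
(n_scan,) = cur.execute("SELECT count(*) FROM test_rasters").fetchone()

# Fast path: store the count once and read the single metadata row back.
cur.execute("CREATE TABLE raster_metadata (key TEXT PRIMARY KEY, value INTEGER)")
cur.execute("INSERT INTO raster_metadata VALUES ('tile_count', ?)", (n_scan,))
(n_meta,) = cur.execute(
    "SELECT value FROM raster_metadata WHERE key = 'tile_count'").fetchone()

# The scaling estimate from the message: halving the pixel size quadruples
# the tile count, so 3 minutes at 5 m becomes ~12 at 2.5 m and ~48 at 1.25 m.
estimates = [3 * 4 ** k for k in range(3)]

print(n_scan, n_meta, estimates)
```

The same single-row lookup pattern is what a driver-side or application-side cache of gdalinfo results would amount to: pay the full scan once, then serve the stored answer.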
