Hi all,
I work with OSG for fun, and I have created a database covering the whole of Italy 
at high resolution (1 m/pixel) in about 3 weeks.

The final DB is about 600 GB. The source images are about 13000 GeoTiff files 
(700 GB), plus about 10 DEM files (40 m resolution, 2.5 GB).
I have used an old dual-core 1.6 GHz notebook and two 1 TB USB disks 
(please, don't LOL |-) ), and I have run ONLY 1 osgdem process for each script.

I have used vpbmaster to generate the scripts, BUT I executed the scripts 
manually, for two reasons: 
 . first, vpbmaster ran 2 osgdem processes at once, and my HD was being run to 
death! (I tried to use "--machines", but it did not work)
 . second, all the scripts use the same "build_master.source" file, which is 
not optimized and contains the full list of 13000 GeoTiff files.

The shared build_master.source was a big problem, because every script wasted about 
1 hour examining all 13000 files just to find the few it actually had to process. I 
wrote a little C program to create a distinct "build_master.source" file for each 
script, and of course I updated the scripts to use the correct ".source" file. 
Every per-script "build_master.source" is reduced to a list of no more than 10 
GeoTiff files.
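
Just to illustrate the idea (this is NOT my actual program, and the real VPB 
"build_master.source" is a full build configuration, not a plain file list): 
assuming you keep a simple text index with one line per image in the form 
"path min_lon min_lat max_lon max_lat" (that index format is an assumption of 
this sketch), the per-script selection boils down to a bounding-box filter:

/* Hypothetical sketch: filter a flat "path min_lon min_lat max_lon max_lat"
 * index down to the GeoTiffs that overlap one script's tile extent.
 * The real build_master.source used by VPB is more complex; this only
 * shows the bounding-box filtering idea. */
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    if (argc != 6) {
        fprintf(stderr, "usage: %s index.txt min_lon min_lat max_lon max_lat\n",
                argv[0]);
        return 1;
    }

    double tminx = atof(argv[2]), tminy = atof(argv[3]);
    double tmaxx = atof(argv[4]), tmaxy = atof(argv[5]);

    FILE *in = fopen(argv[1], "r");
    if (!in) { perror("fopen"); return 1; }

    char path[1024];
    double minx, miny, maxx, maxy;

    /* Keep only the images whose extent intersects the script's extent. */
    while (fscanf(in, "%1023s %lf %lf %lf %lf",
                  path, &minx, &miny, &maxx, &maxy) == 5) {
        int overlaps = !(maxx < tminx || minx > tmaxx ||
                         maxy < tminy || miny > tmaxy);
        if (overlaps)
            printf("%s\n", path);  /* goes into the per-script .source list */
    }

    fclose(in);
    return 0;
}

Run it once per script with that script's tile extent, redirect the output to a 
file, and splice the short file list into a copy of the original 
build_master.source.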

I think a big DB can be created with poor hardware (and a poor OS), but:
 1. run no more than 1 osgdem process per USB HD
 2. optimize the build_master.source files if many source files are used
 3. avoid ECW/JPEG2000: they are slower than GeoTiff
 4. read from the USB HD, but write to the internal HD (to avoid many, many 
write failures!)
 5. three weeks is not much time to create a 600 GB DB with a $300 1.6 GHz 
notebook. IT'S GREAT!

...
