On Sat, Dec 30, 2017 at 2:32 PM, Markus Metz <[email protected]> wrote:
>
> On Fri, Dec 29, 2017 at 5:07 PM, Dylan Beaudette <[email protected]>
> wrote:
>>
>> Hi everyone,
>>
>> First of all, thanks in advance for the r.sun.daily module, which is a
>> nice replacement for my amateurish attempts over the last 12 years.
>>
>> I am currently working on an annual beam radiance map for a large
>> geographic region, at 30 m resolution: 70,953 x 46,964 cells. This is
>> far too large for a single pass of r.horizon or r.sun on my machine, so
>> I have split the data into 5,000 x 5,000 cell tiles with 100 cells of
>> overlap. This seems to be sufficient for my purposes, and the edge
>> effects are tolerable.
>>
>> At 8-15 minutes / tile / day (r.sun) and 54 tiles, this job calls for
>> multiple CPU cores. All of the parallel processing that I use is (as
>> far as I know) contained within the same region.
>>
>> I have had good success with running r.horizon in parallel via GNU
>> parallel like this:
>>
>> # 1: start angle
>> # 2: angle step
>> # 3: elevation tile
>> seq 0 $step 330 | parallel --gnu --progress "bash make-hz-maps.sh {} $step $elev"
>>
>> which is just a wrapper around r.horizon, run in parallel "within"
>> tiles.
>>
>> Next, I run r.sun.daily (8 CPU cores) within tiles:
>>
>> r.sun.daily --overwrite elevation=$elev aspect=$asp slope=$slope \
>>     start_day=1 end_day=365 beam_rad=$beam horizon_basename=hzangle \
>>     horizon_step=$step nprocs=8
>>
>> The r.sun.daily module finishes without error about 50-60% of the
>> time, and the results look good.
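The seq/parallel pipeline above fans out one wrapper invocation per start angle. A self-contained sketch of the job list it generates (the wrapper script itself is not shown in the thread; `elev_tile_01` is a hypothetical tile name, and the commands are only printed, not run):

```shell
#!/bin/sh
# Sketch of the fan-out done by 'seq 0 $step 330 | parallel ...':
# one make-hz-maps.sh invocation per start angle (0, 30, ..., 330
# for step=30). 'elev_tile_01' is a hypothetical tile name.
step=30
elev=elev_tile_01

# Build the same job list that GNU parallel would execute, one per line.
jobs=$(seq 0 "$step" 330 | sed "s|^|bash make-hz-maps.sh |; s|\$| $step $elev|")
njobs=$(printf '%s\n' "$jobs" | grep -c '')

printf '%s\n' "$jobs" | head -n 2
echo "total jobs: $njobs"
```

With step=30 this is 12 independent directions; r.horizon writes one horizon raster per direction, which r.sun.daily then consumes via the horizon_basename/horizon_step parameters quoted above.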
>> The other 40-50% of the time I get this:
>>
>> ERROR: Error uncompressing raster data for row 2605 of
>> <r_sun_crop8255_beam_rad_tmp_352>
>> *** buffer overflow detected ***: g.message terminated
>> ======= Backtrace: =========
>> /lib/x86_64-linux-gnu/libc.so.6(+0x7329f)[0x7f8b9a70129f]
>> /lib/x86_64-linux-gnu/libc.so.6(__fortify_fail+0x5c)[0x7f8b9a79c83c]
>> /lib/x86_64-linux-gnu/libc.so.6(+0x10d710)[0x7f8b9a79b710]
>> /lib/x86_64-linux-gnu/libc.so.6(+0x10cc19)[0x7f8b9a79ac19]
>> /lib/x86_64-linux-gnu/libc.so.6(_IO_default_xsputn+0xbc)[0x7f8b9a70961c]
>> /lib/x86_64-linux-gnu/libc.so.6(_IO_vfprintf+0x1cc5)[0x7f8b9a6d9905]
>> /lib/x86_64-linux-gnu/libc.so.6(__vsprintf_chk+0x84)[0x7f8b9a79aca4]
>> /usr/local/grass-7.5.svn/lib/libgrass_gis.7.5.svn.so(+0x1343c)[0x7f8b9aa6a43c]
>> /usr/local/grass-7.5.svn/lib/libgrass_gis.7.5.svn.so(G_fatal_error+0xbf)[0x7f8b9aa6accf]
>> g.message(main+0x254)[0x400dd4]
>> /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf5)[0x7f8b9a6aff45]
>> g.message[0x400ea2]
>>
>> I can't seem to replicate the problem, as subsequent runs with the
>> same parameters and in the same tile are successful!
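Since reruns with identical parameters succeed, one pragmatic stop-gap (a workaround, not a fix for the underlying problem) is to wrap each tile's run in a retry loop keyed on the module's exit status. A generic sketch, with a stand-in `flaky` command simulating the intermittent failure, since r.sun.daily is not runnable outside a GRASS session:

```shell
#!/bin/sh
# Generic retry helper: re-run a command until it exits 0, up to
# $1 attempts. The 'flaky' stand-in fails twice and then succeeds,
# mimicking the intermittent r.sun.daily behavior described above.
retry() {
    max=$1; shift
    n=1
    until "$@"; do
        if [ "$n" -ge "$max" ]; then
            return 1
        fi
        n=$((n + 1))
    done
}

attempts=0
flaky() {
    attempts=$((attempts + 1))
    [ "$attempts" -ge 3 ]    # fail on attempts 1 and 2, succeed on 3
}

retry 5 flaky
echo "succeeded after $attempts attempts"
```

In the thread's setting, the wrapped command would be the per-tile r.sun.daily invocation quoted earlier, e.g. `retry 3 r.sun.daily --overwrite elevation=$elev ...`.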
>> This leads me to think that:
>>
>> * some aspect of this approach is not thread safe
>> * there is something wrong with my computer
>> * there is a subtle bug in the raster writing / reading code when
>>   invoked in parallel
>>
>> I have encountered similar raster reading errors in the past,
>> typically in the context of parallel processing:
>>
>> https://trac.osgeo.org/grass/ticket/2762
>> https://trac.osgeo.org/grass/ticket/2764
>> http://osgeo-org.1560.x6.nabble.com/Error-reading-raster-data-for-row-xxx-only-when-using-r-series-and-t-rast-series-td5229569i20.html
>> https://lists.osgeo.org/pipermail/grass-dev/2015-July/075691.html
>>
>> Here is some information on my system and version of GRASS:
>>
>> ./configure --enable-64bit --with-libs=/usr/lib --without-pthread
>> --without-odbc --without-mysql --with-readline --with-cxx
>> --enable-largefile --with-freetype
>> --with-freetype-includes=/usr/include/freetype2 --with-sqlite
>> --with-python --with-geos=/usr/local/bin/geos-config --without-opencl
>> --with-opencl-includes=/usr/include/CL/ --with-postgres
>> --with-postgres-includes=/usr/include/postgresql/
>> --with-postgres-libs=/usr/lib/
>> --with-proj-share=/usr/local/share/proj/
>>
>> version=7.5.svn
>> date=2017
>> revision=r71964
>> build_date=2017-12-21
>> build_platform=x86_64-pc-linux-gnu
>> build_off_t_size=8
>>
>> Any ideas?
>
> Please try whether the patch attached to ticket #2764 helps to get
> closer to the problem.
>
> Markus M
Hi Markus,

Thank you for the quick reply and the patch. I have recompiled with the
patch from #2764 and am now waiting to see what happens.

Happy New Year!

Dylan

_______________________________________________
grass-user mailing list
[email protected]
https://lists.osgeo.org/mailman/listinfo/grass-user
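For readers reconstructing the setup: the tiling described at the top of the thread (5,000 x 5,000 cell tiles with 100 cells of overlap over a 70,953 x 46,964 grid) implies stepping tile origins by tile size minus overlap. A pure-arithmetic sketch (cell indices only, no GRASS calls; note that the resulting full rectangular grid is 15 x 10 tiles, more than the 54 tiles reported, presumably because only tiles intersecting the study area were processed; that explanation is an assumption, not stated in the thread):

```shell
#!/bin/sh
# Tile-grid arithmetic for the scheme described above: 5,000 x 5,000
# cell tiles sharing 'overlap' cells with each neighbor, so tile
# origins advance by (tile - overlap) cells.
cols=70953
rows=46964
tile=5000
overlap=100
stride=$((tile - overlap))            # 4900 cells between tile origins

# Each tile contributes 'stride' new cells plus the shared overlap;
# ceiling division gives the number of tiles needed to cover each axis.
ntiles_x=$(( (cols - overlap + stride - 1) / stride ))
ntiles_y=$(( (rows - overlap + stride - 1) / stride ))
echo "full grid: ${ntiles_x} x ${ntiles_y} tiles"
```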
