On 16/08/18 12:26, Moritz Lennert wrote:
On 14/08/18 10:13, Moritz Lennert wrote:
On 13/08/18 15:30, Markus Neteler wrote:
Hi Moritz,

On Mon, Aug 13, 2018 at 2:04 PM, Moritz Lennert
<[email protected]> wrote:
On 13/08/18 13:41, Markus Neteler wrote:
...
AFAIK, the only moment where i.cutlines potentially reads the whole image
would be in the edge detection part. That's why there is the tiling option
to avoid just that.

I suppose you refer to

tile_width=integer
      Width of tiles for tiled edge detection (pixels)
tile_height=integer
      Height of tiles for tiled edge detection (pixels)
?
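
As an illustration of what tiled edge detection buys, here is a minimal Python sketch of decomposing a raster into tiles of the kind tile_width/tile_height describe (this is only a conceptual restatement, not i.cutlines' actual implementation, and it ignores the overlap the module may use at tile borders):

```python
def tile_bounds(cols, rows, tile_width, tile_height):
    """Yield (col_start, row_start, col_end, row_end) for each tile,
    so edge detection can run on one tile at a time instead of
    reading the whole image into memory."""
    for row in range(0, rows, tile_height):
        for col in range(0, cols, tile_width):
            yield (col, row,
                   min(col + tile_width, cols),
                   min(row + tile_height, rows))

# A 100000 x 100000 image (10e9 pixels) split into 1000 x 1000 tiles:
tiles = list(tile_bounds(100000, 100000, 1000, 1000))
print(len(tiles))  # 10000 tiles, each holding only 1e6 pixels at a time
```

Each tile is processed independently, which is why the tiling option keeps the edge-detection step from touching the full image at once.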

So, unless I'm forgetting something, you should be able
to work on large images. If you have seen this crash the module, please file
a bug report.

A colleague working with a large dataset (> 10e9 pixels) just reported
that the module does not crash, but that it seems to take "forever" (he
stopped the process after a day). I guess this is in the r.cost phase.

An option would be to tile (and parallelize) the entire process which
would mean finding cutlines in the individual tiles, making sure that
the start and endpoints of these cutlines match the start and endpoints
on the neighboring tiles...

I see this in r.cost's main.c:

/* this is most probably the limitation of r.cost for large datasets
 * segment size needs to be reduced to avoid unnecessary disk IO
 * but it doesn't make sense to go down to 1
 * so use 64 segment rows and cols for <= 200 million cells
 * for larger regions, 32 segment rows and cols
 * maybe go down to 16 for > 500 million cells ? */
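
The heuristic that comment describes can be restated as a small sketch (hypothetical pseudocode in Python, not the actual C code in r.cost's main.c; the `experiment` flag marks the comment's open question about 16):

```python
def segment_dim(n_cells, experiment=False):
    """Pick segment rows/cols for the segment library cache, following
    the heuristic quoted above: smaller segments for larger regions to
    bound memory per segment, but not so small that disk I/O dominates."""
    if n_cells <= 200_000_000:   # <= 200 million cells
        return 64
    if experiment and n_cells > 500_000_000:
        return 16                # the comment's "maybe go down to 16?"
    return 32                    # larger regions currently stay at 32
```

For the > 10e9-cell case reported above, the current heuristic would still use 32, so experimenting with 16 (or making the segment size tunable) is exactly the kind of test the comment invites.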

So maybe it's time to experiment a bit more with r.cost in very large regions?

Moritz
_______________________________________________
grass-dev mailing list
[email protected]
https://lists.osgeo.org/mailman/listinfo/grass-dev