On Fri, Feb 13, 2015 at 10:13 AM, Paulo van Breugel <[email protected]> wrote:
> Hi Pietro,
>
> Thanks for the suggestion, I will have a look at the documentation.
>
> Paulo
>
> On Fri, Feb 13, 2015 at 10:09 AM, Pietro <[email protected]> wrote:
>
>> Dear Paulo,
>>
>> On Fri, Feb 13, 2015 at 9:57 AM, Paulo van Breugel
>> <[email protected]> wrote:
>> > I guess this is because the calculations are done in-memory? Any way to
>> > avoid this memory problem when using large data sets (something like
>> > working with memmap objects?)
>>
>> With memmap you still have a limit of 2 GB, I guess; you should try dask.
>
> Just reading the memmap manual page, where it reads: "Memory-mapped arrays
> use the Python memory-map object which (prior to Python 2.5) does not allow
> files to be larger than a certain size depending on the platform. This size
> is always < 2GB even on 64-bit systems." That is unclear to me; I am not
> sure whether this limit is different, or does not apply at all, on Python
> 2.5 or newer (what is the minimum Python version for GRASS?).
>
>> Dask Array implements the NumPy ndarray interface using blocked
>> algorithms, cutting up the large array into many small arrays. This
>> lets us compute on arrays larger than memory using all of our cores.
>> We coordinate these blocked algorithms using dask graphs.
>>
>> http://dask.readthedocs.org/en/latest/array.html
>>
>> I didn't have a chance to try it yet, but it supports a numpy array
>> syntax, and since you are using quite basic functionality I think
>> you should be able to work with it.
>>
>> All the best
>>
>> Pietro
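To illustrate what the memmap manual page describes, here is a minimal
sketch; the file name, dtype, and shape below are placeholders for
illustration only, not anything GRASS-specific:

    import numpy as np

    # Open an existing disk-backed array in read-only mode; only the
    # slices you actually touch are read from the file into memory.
    a = np.memmap("raster.dat", dtype="float32", mode="r",
                  shape=(20000, 20000))

    # Reading a block pulls just that part of the file into RAM.
    block = a[:1000, :1000]
    print(block.mean())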
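And a rough sketch of the dask.array approach Pietro suggests; the input
file, chunk size, and the normalisation computed here are made up, and I
have not tried this against real GRASS data:

    import numpy as np
    import dask.array as da

    # Open the values without loading them into RAM; "elevation.npy"
    # is just a placeholder file name.
    source = np.load("elevation.npy", mmap_mode="r")

    # Wrap it as a dask array cut into 1000x1000 blocks.
    x = da.from_array(source, chunks=(1000, 1000))

    # Operations build a task graph instead of running immediately...
    result = (x - x.mean()) / x.std()

    # ...and compute() evaluates it block by block, using all cores.
    print(result.mean().compute())

Since dask arrays keep the numpy syntax, existing numpy-based code like
the above should mostly carry over with just the from_array() wrapping.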
_______________________________________________
grass-dev mailing list
[email protected]
http://lists.osgeo.org/mailman/listinfo/grass-dev
