This has also been my experience when dealing with long-running programs that allocate large fractions of GPU memory. However, I'm not sure why normal Python reference counting is insufficient to free GPU memory as soon as the containing Python object goes out of scope.
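To illustrate why reference counting alone can be insufficient: a minimal sketch (plain Python objects standing in for GPU allocations) showing that an object caught in a reference cycle survives `del` and is only freed once the cycle collector runs.

```python
import gc
import weakref

class Holder:
    """Stand-in for an object owning a GPU allocation (hypothetical)."""
    pass

a = Holder()
a.self_ref = a           # create a reference cycle: a refers to itself
observer = weakref.ref(a)

del a                    # refcount never reaches zero because of the cycle
print(observer() is None)  # still alive: refcounting alone did not free it

gc.collect()             # cycle collector breaks the cycle
print(observer() is None)  # now it has been freed
```

If a GPU array's wrapper object is entangled in such a cycle, the underlying device memory is held until a collection happens to run, which matches the behavior described in this thread.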
The fact that gc.collect() fixes the problem suggests that there is a reference cycle associated with each GPU memory allocation, which is why garbage collection is required to free the memory. In my application, all of my GPU arrays were attributes in instances of a Python class, so I added a __del__ method to my class to call gc.collect() for me whenever a class instance was deallocated.

On Jul 23, 2014, at 11:59 AM, Matthias Lee <[email protected]> wrote:

> Hi all,
>
> I noticed something interesting today.
> I am working on an image processing tool which loops several times over each
> of a series of images. Everything is done in place and I should not be
> growing my memory footprint between iterations.
>
> Now when I tracked the actual GPU memory consumption I found that I would
> ultimately run out of GPU memory (just a short excerpt):
> http://i.imgur.com/AjmmpEk.png
>
> I double- and triple-checked that everything is happening in place, and started
> trying to delete GPU objects as soon as I'm finished with them to try to
> trigger the GC, but that only had limited success. I would expect the GC to
> kick in before the GPU runs out of memory.
>
> I then started manually calling gc.collect() every few iterations, and suddenly
> everything started behaving and is now relatively stable. See here (note the
> scale difference): http://i.imgur.com/Zzq5YdC.png
>
> Is this normal? Is this a bug?
>
> Thanks,
>
> Matthias
>
> --
> Matthias Lee
> IDIES/Johns Hopkins University
> Performance @ Rational/IBM
>
> [email protected]
> [email protected]
> (320) 496 6293
>
> To know recursion, you must first know recursion.
> _______________________________________________
> PyCUDA mailing list
> [email protected]
> http://lists.tiker.net/listinfo/pycuda

_______________________________________________
PyCUDA mailing list
[email protected]
http://lists.tiker.net/listinfo/pycuda
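The __del__ workaround described in the reply above can be sketched as follows. This is a minimal illustration with a hypothetical class name and a placeholder attribute; in practice self.gpu_array would be something like a pycuda.gpuarray.GPUArray.

```python
import gc

class ImageProcessor:
    """Hypothetical class whose instances hold GPU arrays as attributes."""

    def __init__(self, data):
        # Placeholder for a GPU allocation (e.g. a GPUArray in real code).
        self.gpu_array = data

    def __del__(self):
        # Run the cycle collector whenever an instance is deallocated,
        # so any cyclic garbage holding GPU allocations is freed promptly
        # instead of waiting for an automatic collection.
        gc.collect()
```

Note that this runs a full collection on every instance deallocation, which can be costly if instances are created and destroyed frequently; calling gc.collect() once per iteration of an outer loop, as in the original post, may be a cheaper alternative.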
