I have a question about repeated image processing and recovering memory via garbage collection.

A couple of images are displayed simultaneously for analysis, using split.screen(). I'm looping over a large collection of images, using jpeg::readJPEG(), png::readPNG(), and caTools::read.gif() to load each image into memory from a file. I then use erase.screen() to erase the previous image, plot() to set up a plotting area, and rasterImage() to display the image. The user then selects some processing, and we move on to the next image(s).

I'm very careful not to hold references to any of the previous images. GC is indeed happening, but memory quickly fills up. It certainly *looks* as though the image memory is not being released: printing trace info via mem_used()[[1]] shows memory use increasing with each image, by roughly the size of the image. Forcing periodic GCs, and measuring memory used before and after each one, shows that no memory is recovered.

Some questions:

(1) Is it known that processing images in this way will allocate memory that is not recovered by GC?

(2) Are there any tools you'd recommend to see exactly *what* is filling up memory, in case the culprit is not the previous images but some other problem?

This is R 4.4.2 (for now), running on macOS on an ARM processor.

-- 
Steve Rowley <s...@alum.mit.edu>

______________________________________________
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide https://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
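P.S. Per the posting guide, here is a minimal, self-contained sketch of the before/after-GC measurement I described. It is not my actual code: a large matrix stands in for the decoded image (the real loop uses readJPEG()/readPNG()/read.gif() and the screen/raster plotting calls), and the function names are just placeholders for this example.

```r
# Hypothetical stand-in for the per-image loop: allocate a big matrix
# (simulating a decoded image), drop the only reference, then force a GC
# and record how many Vcells are still in use afterwards.
track_gc <- function(n_images = 5, px = 1000) {
  used <- numeric(n_images)
  for (i in seq_len(n_images)) {
    img <- matrix(runif(px * px), px, px)  # stands in for readJPEG() output
    # ... display via rasterImage() and user processing would happen here ...
    rm(img)                                # drop the only reference
    used[i] <- gc()["Vcells", "used"]      # Vcells in use after a forced GC
  }
  used
}

u <- track_gc()
# If GC is reclaiming each image, "used" should stay roughly flat across
# iterations instead of growing by about one image's worth each time.
u
```

In this toy version memory is reclaimed; in my real loop the equivalent numbers grow monotonically, which is the puzzle.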
A couple images are displayed simultaneously for analysis, using split.screen(). I'm looping over a large collection of images, using jpeg/readJPEG(), png/readPNG(), and caTools/read.gif() to load an image into memory from a file. Then I use erase.screen() to erase the previous image, use plot() to set up a plotting area, and display the image with rasterImage(). Then the user elects some processing, and we move on to the next image(s). I'm very carefully not holding references to any of the previous images. GC is indeed happening, but memory quickly fills up. It certainly *looks* as though the image memory is not being released: printing out trace info using mem_used()[[1]] says memory use increases with each image, by about the image size. Forcing periodic GC's, measuring memory used before and after, shows that no memory is recovered. Some questions: (1) Is it known that processing images in this way will lead to allocating memory that is not recovered by GC? (2) Are there any tools you'd recommend to see exactly *what* is filling up memory, just in case it's not previous images, but some other problem? This is R 4.4.2 (for now). Running MacOS on an ARM processor. ___________ Steve Rowley <s...@alum.mit.edu> Zoom: 839 529 4589 <https://us04web.zoom.us/j/8395294589?pwd=dlQ4MUFHK1NFOCtoZFpUNFRtZ2lSQT09> It is very dark & after 2000. If you continue, you are likely to be eaten by a bleen. [[alternative HTML version deleted]] ______________________________________________ R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see https://stat.ethz.ch/mailman/listinfo/r-help PLEASE do read the posting guide https://www.R-project.org/posting-guide.html and provide commented, minimal, self-contained, reproducible code.