Searching the Internet a bit, I found a bug report on R's Bugzilla
related to the issue I am having.
From this comment
https://bugs.r-project.org/bugzilla3/show_bug.cgi?id=14611#c3 onwards, Simon
Urbanek clarifies that all of this has nothing to do with R. Rather, it is
about memory allocation on Linux, where memory allocated via brk cannot be
released back to the OS while there are still active blocks.
He also states that the OS would swap out pages from the unused blocks.
I wonder why that didn't work out in my case, since R crashed soon after
re-running the function.
Anyway, I first tried what Hao suggested - freeing everything but the
objects to be returned - but it didn't change much: I could free about
1GB of memory, but rsession kept holding more than 2GB, even after the
C++ function terminated.
After reading those comments in the bug report, I tried calling the
malloc_trim() function (it forces the return of unused heap memory
to the OS) from within R. This trick actually worked: rsession memory
consumption dropped from 3.6GB to about 100MB, and R no longer crashes,
even after several subsequent calls to that "guilty" function.
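For the record, something along these lines is how malloc_trim() can be
exposed to R through Rcpp (a minimal sketch assuming glibc on Linux; the
wrapper name is just illustrative, not the actual code from my application):

    // malloc_trim.cpp
    #include <malloc.h>   // glibc-specific header that declares malloc_trim()
    #include <Rcpp.h>

    // [[Rcpp::export]]
    int malloc_trim_wrapper() {
        // malloc_trim(0) asks glibc to return as much unused heap as
        // possible to the OS; it returns 1 if memory was released, 0 otherwise
        return malloc_trim(0);
    }

Compiling this with Rcpp::sourceCpp("malloc_trim.cpp") and calling
malloc_trim_wrapper() after the heavy C++ call is what releases the memory.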
I have a feeling this is a rather dirty solution to a more serious
problem, but it does not seem to harm anything within my application.
Best,
Fabio
On 06/23/2015 07:45 PM, Hao Ye wrote:
I’m not that familiar with R’s garbage collection, but maybe you can call
something manually to free up memory?

> rewrite your program to work more incrementally and release memory as needed.

Or a simple alternative is to free everything but the objects to be
returned in the function itself.
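In Rcpp terms, that might look like the following sketch (the function and
variable names are made up for illustration): keep the large intermediate
buffers in an inner scope so they are destroyed before the function returns,
leaving only the result object alive:

    #include <Rcpp.h>
    #include <vector>

    // [[Rcpp::export]]
    Rcpp::NumericVector process_data(Rcpp::NumericVector x) {
        Rcpp::NumericVector result(x.size());
        {
            // large scratch buffer confined to this inner scope
            std::vector<double> scratch(x.begin(), x.end());
            for (std::size_t i = 0; i < scratch.size(); ++i)
                result[i] = scratch[i] * 2.0;
        }   // scratch is destroyed here, before the function returns
        return result;
    }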
Best,
--
Hao Ye
h...@ucsd.edu
On Jun 23, 2015, at 9:57 AM, Dirk Eddelbuettel <e...@debian.org> wrote:
On 23 June 2015 at 18:45, Fabio Tordini wrote:
| it seems like the C++ function, once it terminates its execution, does
| not free the memory it used. Instead, rsession keeps holding these objects
| even though the C++ code has terminated. Is that possible?
Of course. We all have bad days every now and then. :)
From your description you seem to be holding on to data for too long, and
maybe even transferring it back to R, when it crashes. "So don't do that." You
identified a run-time constraint. Now you either need a bigger machine
(100GB of RAM is not unheard of these days) or to rewrite your program to
work more incrementally and release memory as needed.
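As a sketch of the "more incrementally" idea (hypothetical names; this
assumes the data can be consumed one chunk at a time via an R callback):

    #include <Rcpp.h>

    // [[Rcpp::export]]
    double chunked_sum(Rcpp::Function read_chunk) {
        double total = 0.0;
        for (;;) {
            // each chunk goes out of scope at the end of the iteration,
            // so only one chunk is resident in memory at any time
            Rcpp::NumericVector chunk = read_chunk();
            if (chunk.size() == 0) break;   // empty chunk signals end of data
            total += Rcpp::sum(chunk);
        }
        return total;
    }

Peak memory then stays bounded by the chunk size rather than the full dataset.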
Dirk
--
http://dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
_______________________________________________
Rcpp-devel mailing list
Rcpp-devel@lists.r-forge.r-project.org
https://lists.r-forge.r-project.org/cgi-bin/mailman/listinfo/rcpp-devel