I am using setTimeLimit(elapsed=60) in R web applications to prevent
processes from consuming too many resources. This works fine when a
calculation takes too long; however, when the machine runs out of memory,
the time limit seems to fail as well, leaving a stuck process at maximum
memory until it is killed manually. This probably makes sense in some way,
but I was wondering if there is a solution. Some example code using ggplot2
that crashes my machine:

# time limit works fine:
setTimeLimit(elapsed = 10)
while (TRUE) { sqrt(pi) }
setTimeLimit()

# time limit fails: the machine runs out of memory and the process hangs
library(ggplot2)
x <- rnorm(1e6)
y <- rnorm(1e6)
setTimeLimit(elapsed = 10)
qplot(x, y, geom = "density2d")
setTimeLimit()
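
One workaround that comes to mind is to fork the expensive call into a child
process and have the parent kill the child when it does not return in time;
that keeps the main process responsive even when the child is stuck
allocating memory, although it does not fix setTimeLimit itself. A rough
sketch (assuming a Unix-alike where fork() is available; the parallel
package and the 10 second timeout are only for illustration):

library(parallel)
library(ggplot2)

x <- rnorm(1e6)
y <- rnorm(1e6)

# fork a child process that does the expensive plot
job <- mcparallel(qplot(x, y, geom = "density2d"))

# poll up to 10 seconds for a result from the child
result <- mccollect(job, wait = FALSE, timeout = 10)

if (is.null(result)) {
  # child still running (or stuck allocating memory): kill it from the parent
  tools::pskill(job$pid, tools::SIGKILL)
  mccollect(job)  # reap the terminated child
}

An OS-level limit on the address space of the R process (e.g. ulimit -v on
Linux) might be another option, so that a runaway allocation fails with an
error instead of exhausting the machine.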

Jeroen Ooms
www.stat.ucla.edu/~jeroen
