Dear list, 

I am using the foreach/doSNOW packages to parallelize a loop over some hundreds
of gzipped files, each about 1 MB compressed and around 10 MB uncompressed.
This is the loop I've written:

library(doSNOW)   # doSNOW also pulls in foreach and snow

cl.tmp <- makeCluster(rep("localhost", 4), type = "SOCK")
registerDoSNOW(cl.tmp)

output <-
  foreach(f = dir(recursive = TRUE)) %dopar% {
    load(f)                        # loads the data frame 'data'
    # do a bunch of operations on the file's data frame
    save(data, file = f, compress = TRUE)
    rm(data)                       # plus the other temporaries created above
    gc()
  }

stopCluster(cl.tmp)
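
To make this self-contained, here is a quick way to generate dummy files of
roughly the same shape (the directory name, file names, sizes, and per-file
operations are placeholders, not my real data):

# Create some dummy .RData files so the loop above can be run verbatim,
# with f iterating over dir("testdata", full.names = TRUE)
dir.create("testdata", showWarnings = FALSE)
for (i in 1:20) {
  data <- data.frame(x = rnorm(2e5), y = rnorm(2e5))
  save(data, file = file.path("testdata", sprintf("file%03d.RData", i)),
       compress = TRUE)
}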

While this works, it eats up all of the PC's memory, as if the data were not
discarded after each iteration finishes. Pagefile usage only drops back to
normal when the stopCluster() command is executed.
Does anyone have an idea where this leak is coming from, and a remedy?
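
For completeness: foreach collects the value of the last expression of each
iteration (here the small matrix returned by gc()), so the result list itself
should stay tiny. One variation I considered, untested, is returning NULL
explicitly to rule that out:

output <-
  foreach(f = dir(recursive = TRUE)) %dopar% {
    load(f)
    # ... the same operations as above ...
    save(data, file = f, compress = TRUE)
    rm(data)
    gc()
    NULL   # return nothing, so the master cannot accumulate per-iteration results
  }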

Thanks in advance! 

Vassilis

