As part of a simulation, I need to sample from a large vector repeatedly.
For some reason, sample() builds up memory usage (> 500 MB for this
example) when used inside a for loop, as illustrated here:

X <- 1:100000       # population to sample from
P <- runif(100000)  # (unnormalised) probability weights
for(i in 1:500) Xsamp <- sample(X, 30000, replace = TRUE, prob = P)
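
To see the growth from within R itself, the same loop can be bracketed
with gc() calls (gc() is used here only to report memory; the exact
figures will of course vary between runs and platforms):

gc()  # baseline memory report
for(i in 1:500) Xsamp <- sample(X, 30000, replace = TRUE, prob = P)
gc()  # the "used" figures have grown by hundreds of MB at this point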

Even worse, I am not able to free the memory without quitting R (see my
attempt below), so I quickly run out of memory when running the full
simulation. Is there any way to prevent this from happening?
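
This is what I tried after running the loop above; gc() does not bring
the usage back down:

rm(Xsamp, X, P)  # remove all the objects created above
gc()             # memory stays high; only restarting R releases it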

The problem seems to appear only when specifying both replace=TRUE and
probability weights for the vector being sampled, and it happens on
both Windows XP and Linux (Ubuntu).
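
For comparison, none of these variants shows the buildup in my tests
(the weighted no-replacement call is much slower in R, hence the
smaller iteration count on that line):

for(i in 1:500) Xsamp <- sample(X, 30000, replace = TRUE)  # no weights: fine
for(i in 1:500) Xsamp <- sample(X, 30000)                  # plain sampling: fine
for(i in 1:10) Xsamp <- sample(X, 30000, replace = FALSE, prob = P)  # weights, no replacement: fine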


Victor
