Scott Zentz wrote:
Hello Everyone,
We have recently purchased a server with 64GB of memory running a
64-bit OS, and I have compiled R from source with the following configuration:
./configure --prefix=/usr/local/R-2.9.1 --enable-R-shlib
--enable-BLAS-shlib --enable-shared --with-readline --with-iconv
--with-x --with-tcltk --with-aqua --with-libpng --with-jpeglib
and I would like to verify that 55-60GB of the 64GB of memory can be
used within R. Does anyone know how to check this? Will R be able
to access that amount of memory from a single process? I am not an R
user myself, but I want to test this before I turn the server
over to the researchers.
Hmm, it's slightly tricky because R often duplicates objects, so you may
hit the limit only transiently. Also, R has an internal 2GB limit on
single vectors (2^28-1 doubles at 8 bytes each comes in just under that).
But something like this
Y <- replicate(30, rnorm(2^28 - 1), simplify = FALSE)
should create a list of 30 vectors of about 2GB each, i.e. roughly 60GB
in total (simplify = FALSE keeps replicate() from trying to collapse the
result into one oversized matrix). Then lapply(Y, mean) should
generate 30 very good and very expensive approximations to 0.
(For obvious reasons, I haven't tested this on a 1GB ThinkPad X40....)
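To check that the memory really ends up in the single R process, a rough
follow-up could use base R's object.size() and gc(); this is a minimal
sketch, not tested at anywhere near this scale:

Y <- replicate(30, rnorm(2^28 - 1), simplify = FALSE)  # list of 30 ~2GB vectors
as.numeric(object.size(Y)) / 2^30   # approximate size of Y in GB
gc()                                # the "max used" column reports peak memory (in Mb)
unlist(lapply(Y, mean))             # 30 means, each close to 0

The "max used" column from gc() is where any transient duplication of Y
would show up, so it is a better indicator of peak usage than
object.size() alone.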
--
O__ ---- Peter Dalgaard Øster Farimagsgade 5, Entr.B
c/ /'_ --- Dept. of Biostatistics PO Box 2099, 1014 Cph. K
(*) \(*) -- University of Copenhagen Denmark Ph: (+45) 35327918
~~~~~~~~~~ - (p.dalga...@biostat.ku.dk) FAX: (+45) 35327907