On Mon, 1 Aug 2011, Paul Rodriguez wrote:

Hello R experts,

I'm trying to test R in a shared memory environment in which addressable memory 
is aggregated to about 600G.

However, I get a 'too many elements specified' error when I try to create a 
45K x 100K matrix.

I tried running R with a --max-nsize=50000000000 option, but got the same 
message.

Is there a way to create such large matrices?

No.  See ?"Memory-limits" (a matrix in R is also a vector).
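For context, the limit that ?"Memory-limits" documents is on the element count, not on available RAM: a matrix is stored as a single vector, and at the time of this thread a vector could hold at most 2^31 - 1 elements (long vectors only arrived later, in R 3.0.0). A minimal check in base R, assuming nothing beyond the arithmetic:

```r
# A matrix is stored as one vector, so what matters is the total element
# count, not the 600G of addressable memory.
n_elements <- 45000 * 100000   # 4.5e9 elements requested
max_length <- 2^31 - 1         # maximum vector length, ~2.147e9
n_elements > max_length        # TRUE: the request exceeds the limit
```

No amount of memory tuning can get around this; the request is over twice the maximum length.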

NB: --max-nsize sets a limit on Ncells (there normally is not one), which isn't going to help with the allocation of vectors, is it?
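To spell out the distinction the NB draws: --max-nsize bounds Ncells (cons cells, used for language objects), while vector memory is governed by --max-vsize; and neither flag can lift the per-vector element cap. A sketch of the invocations (the 50000000000 figure mirrors the original attempt; treat the exact size syntax as an assumption to check against your R version's ?Memory page):

```shell
# --max-nsize limits Ncells (cons cells), which has no bearing on how large
# a single vector can be.
R --max-nsize=50000000000 --no-save   # what was tried: bounds Ncells only

# --max-vsize limits the vector heap, but even this cannot raise the
# 2^31 - 1 cap on the number of elements in one vector.
R --max-vsize=500G --no-save          # hypothetical size, for illustration
```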

thanks,
Paul Rodriguez

--
Brian D. Ripley,                  [email protected]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595

______________________________________________
[email protected] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
