On Dec 11, 2009, at 6:24 AM, Ambrosi Alessandro wrote:


Dear all, I am running into some problems with memory allocation. I know it is an old issue, I'm sorry. I looked for a solution in the FAQs, manuals, and mailing list archives, but without finding a working answer.
I really hope you can help me.
For instance, if I try to read microarray data I get:

mab=ReadAffy(cdfname="hgu133plus2cdf")
Error: cannot allocate vector of size 858.0 Mb


I get similar errors with smaller objects, smaller data sets, and other procedures
("Error: cannot allocate vector of size 123.0 Mb").
I'm running R on SUSE 11.1 Linux, with two Xeon processors (8 cores) and 32 GB of RAM. I would presume I have enough resources to manage these objects and data files.

Any suggestions or hints will be really appreciated!
Many thanks in advance.
Alessandro

Well, you are running into a situation where R cannot allocate a contiguous chunk of memory of the requested size for the vector.

Presuming that you are running a 64-bit version of SUSE (what does 'uname -a' show in a system console?), you should also check that you are running a 64-bit version of R. What does:

  .Machine$sizeof.pointer

show?

If it returns 4, then you are running a 32-bit version of R, which cannot take advantage of your 64-bit platform. You should install a 64-bit version of R.
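The checks above can also be run from a shell in one go. A minimal sketch (the `getconf` check and the non-interactive `Rscript` call are my additions, not from the original mail, and the `Rscript` line assumes it is on your PATH):

```shell
# 'x86_64' here indicates a 64-bit kernel; 'i686'/'i586' indicate 32-bit.
uname -m

# Report the userland word size (prints 64 or 32) -- an additional
# quick check, independent of R.
getconf LONG_BIT

# .Machine$sizeof.pointer is 8 under 64-bit R and 4 under 32-bit R.
# Run non-interactively (assumes Rscript is installed and on the PATH):
Rscript -e 'cat(.Machine$sizeof.pointer, "\n")'
```

If `uname -m` reports a 64-bit kernel but the R check returns 4, reinstalling a 64-bit build of R is the fix; a 32-bit process is limited to a few GB of address space regardless of how much RAM the machine has.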

HTH,

Marc Schwartz

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
