Hi all; I know there has been a lot of discussion about memory usage in R. However, I have an odd situation here. Basically, I have the rare opportunity to run R on a system with 64GB of memory, with no limit on memory usage for any user or process. Even so, I ran into an error message like this:
Error: cannot allocate vector of size 594075 Kb

I got this error message while trying to apply the dChip preprocessing procedure to 150 Affymetrix U133v2 chips, each with > 22,000 probe sets. The actual code I ran was like this:

> Data <- ReadAffy(filenames = paste(HOME, "CelData/", fname, sep=""))
> mem.limits()
nsize vsize
   NA    NA
> gc()
            used  (Mb) gc trigger   (Mb)
Ncells    530216  14.2     899071   24.1
Vcells  76196137 581.4  243983468 1861.5
> eset <- expresso(Data, normalize.method="invariantset", bg.correct=FALSE,
                   pmcorrect.method="pmonly", summary.method="liwong")
normalization: invariantset
PM/MM correction : pmonly
expression values: liwong
normalizing...Error: cannot allocate vector of size 594075 Kb
> gc()
            used  (Mb) gc trigger   (Mb)
Ncells    797983  21.4    1710298   45.7
Vcells  76716811 585.4  305954068 2334.3
> object.size(Data)
[1] 608355664
> memory.profile()
    NILSXP     SYMSXP    LISTSXP     CLOSXP     ENVSXP    PROMSXP    LANGSXP
         1      30484     372373       4845        420        180     127274
SPECIALSXP BUILTINSXP    CHARSXP     LGLSXP                              INTSXP
       203       1168     111434       5296          0          0      44649
   REALSXP    CPLXSXP     STRSXP     DOTSXP     ANYSXP     VECSXP    EXPRSXP
     13382          9      60173          0          0      26002          0
  BCODESXP  EXTPTRSXP WEAKREFSXP
         0        106          0

Although I have little idea how memory allocation in R works, something is apparently wrong here; the problem can hardly be a shortage of physical memory. My question is this: is the error due to a non-optimal memory configuration? If so, what would the optimal configuration be? If not, then the problem must lie in the actual implementation of the functions I used here, right? I ask because, according to the reference manual, the error message I got can arise for roughly three reasons: first, when the system is unable to provide the memory R requests; second, when the requested size exceeds the address-space limit for a process; and finally, when the length of a vector would exceed 2^31-1. I wonder whether my problem has anything to do with the third case.
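As a quick sanity check on that third possibility, the size in the error message can be converted into an element count; a minimal sketch in R, assuming the vector holds doubles (8 bytes each):

```r
# Size of the failed allocation, as reported in the error message
kb <- 594075
bytes <- kb * 1024        # 608,332,800 bytes

# As a vector of doubles (8 bytes per element): about 76 million elements,
# far below the 2^31 - 1 length limit for an R vector.
n_doubles <- bytes / 8
n_doubles                 # 76041600
n_doubles < 2^31 - 1      # TRUE: the vector-length limit is not the issue
```

So the length limit cannot be what is failing here, which leaves the first two cases. In particular, if this is a 32-bit build of R, the process address space tops out at roughly 2-4GB regardless of how much physical RAM the machine has, so a ~580MB allocation on top of an already ~600MB workspace could plausibly fail for that reason.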
(If so, then I think I am hopeless unless the internal implementation changes...)

______________________________________________
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html