Dear list,

I have been trying to run the function "qvalue" from the qvalue package
on a vector of about 20 million p-values.

> asso_p.qvalue<-qvalue(asso_p.vector)
Error: cannot allocate vector of size 156513 Kb
> sessionInfo()
Version 2.3.1 (2006-06-01)
i686-pc-linux-gnu

attached base packages:
[1] "methods"   "stats"     "graphics"  "grDevices" "utils"
"datasets"
[7] "base"

other attached packages:
qvalue
 "1.1"
> gc()
            used  (Mb) gc trigger   (Mb)  max used   (Mb)
Ncells    320188   8.6   23540643  628.7  20464901  546.5
Vcells 101232265 772.4  294421000 2246.3 291161136 2221.4
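If my back-of-the-envelope arithmetic is right (assuming the p-values
are stored as 8-byte doubles), the allocation that fails is roughly the
size of one extra copy of the vector:

> 20e6 * 8 / 1024    # approximate size of one copy of the vector, in Kb
[1] 156250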

I have been told that the Linux box has 4 GB of RAM, so I would expect
it to cope with this.
I searched the R FAQ and found some tips on increasing the memory
limits, but they seem to be Windows-specific, such as memory.size() and
the --max-mem-size flag; on my Linux box R didn't recognise them.

I don't understand the meaning of --max-vsize, --max-nsize and
--max-ppsize. Any help on how to increase the memory allocation on
Linux would be much appreciated.
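In case it helps to see what I was guessing at after reading ?Memory,
this is the kind of invocation I had in mind, though I am not sure the
syntax or the numbers make sense (both values below are just guesses on
my part):

$ R --max-vsize=3000M --max-nsize=20000000

Is something along these lines the right way to raise the limits on
Linux, or am I on the wrong track entirely?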

Many thanks,
Alex

------------------------------------
Alex Lam
PhD student
Department of Genetics and Genomics
Roslin Institute (Edinburgh)
Roslin
Midlothian EH25 9PS

Phone +44 131 5274471
Web   http://www.roslin.ac.uk

______________________________________________
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
