On Friday, 3 January 2014 at 22:40 +0200, Xebar Saram wrote:
Hi again, and thank you all for the answers.
I should add that I'm a relative R newbie, so I apologize in advance.
I started R with the --vanilla option and ran gc(); this is the output I get:
gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 182236  9.8     407500 21.8       35 18.7
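(A side note on reading this output: gc() can also reset the "max used" columns, which makes it easy to measure the peak for a single step. This is standard base-R behavior, shown here as a sketch:)

gc(reset = TRUE)   # zero out the "max used" counters
# ... run the suspect, memory-hungry step here ...
gc()               # "max used" now shows the peak since the reset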
It would help to know the sizes of the objects that you have in your
workspace, and also to see the 10 lines of your script preceding the
point of the error, so that we can see what you are trying to do. The
following command will list the sizes of the objects:
cbind(sapply(ls(), function(x) object.size(get(x))))  # size in bytes per object
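A slightly fuller variant (a sketch, not part of the original message) sorts the listing so the largest objects surface first; my_big_object below is a placeholder name:

# Sizes of all objects in the global environment, largest first (bytes)
obj_sizes <- sapply(ls(), function(x) object.size(get(x)))
sort(obj_sizes, decreasing = TRUE)

# A single object in human-readable units (my_big_object is a placeholder):
format(object.size(my_big_object), units = "Mb")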
Xebar Saram <zeltakc at gmail.com> writes:
Hi All,
I have a terrible issue I can't seem to debug, and it is halting my work
completely. I have R 3.0.2 installed on a Linux machine (Arch Linux, latest)
which I built specifically for running high-memory models. The system is a
16 core, 256 GB RAM machine. It worked well at the start, but [...]
Describing the problem would help a lot more. For example, if you were
using any of the parallel processing options in R, these can make extra
copies of objects and drive memory usage up very quickly.
Max
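A minimal sketch of the effect Max describes (object names are illustrative): with a PSOCK cluster, every exported object is serialized to each worker, so memory use multiplies with the worker count.

library(parallel)

big <- rnorm(1e8)            # one ~800 MB vector on the master

cl <- makeCluster(4)         # four worker processes
clusterExport(cl, "big")     # each worker now holds its own ~800 MB copy
res <- parLapply(cl, 1:4, function(i) mean(big) + i)
stopCluster(cl)

On Linux, mclapply() forks instead and starts out copy-on-write, but pages are duplicated as soon as a worker writes to them, and R's garbage collector writes mark bits into object headers, so copies can accumulate even without explicit modification.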