On Thursday, 02 January 2014 at 09:07 +0200, Xebar Saram wrote:
> Hi all,
>
> I have a terrible issue I can't seem to debug, and it is halting my work
> completely. I have R 3.0.2 installed on a Linux machine (Arch Linux,
> latest) which I built specifically for running high-memory models. The
> system is a 16-core, 256 GB RAM machine. It worked well at the start, but
> in recent days I keep getting errors and crashes related to memory use,
> such as "cannot create vector of size XXX, not enough memory", etc.
>
> When looking at top (the Linux system monitor), I see I barely scrape
> 60 GB of RAM (out of 256 GB).
>
> I really don't know how to debug this, and my whole work is halted
> because of it, so any help would be greatly appreciated.

One important thing to note is that while memory use may appear low, if the memory is fragmented, R may not be able to allocate a *contiguous* memory area for a big vector (you didn't tell us how big it was). In that case, AFAIK the only solution is to restart R (saving the session, or the objects you want to keep, beforehand).
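For what it's worth, a generic defensive sketch (not based on the poster's actual code; the object name `mymodel` below is hypothetical) is to force a garbage collection before the big allocation and wrap the allocation in tryCatch(), so a failure doesn't abort the session and you get a chance to save your objects before restarting:

```r
# Sketch only: inspect R's memory accounting and attempt a large
# allocation defensively, so a failed allocation does not kill the session.

gc()  # run a full garbage collection and report memory currently in use

x <- tryCatch(
  numeric(1e6),  # ~8 MB here; the poster's vector is presumably far larger
  error = function(e) {
    message("allocation failed: ", conditionMessage(e))
    NULL  # fall back gracefully instead of stopping mid-computation
  }
)

# If allocations keep failing despite plenty of free RAM (fragmentation),
# save what you need and restart R, e.g.:
#   save(mymodel, file = "mymodel.RData")   # 'mymodel' is a placeholder
```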
Regards