Stephen Choularton wrote:

Hi
I am trying to fit a large glm and am running into this message:

Error: cannot allocate vector of size 3725426 Kb
In addition: Warning message: Reached total allocation of 494Mb: see help(memory.size)

Am I simply out of memory (I only have 0.5 GB)?
Is there something I can do?

You have to rethink whether the analysis you are doing is sensible this way, or whether you can respecify things. R claims to need almost 4Gb(!) for the next memory allocation step alone, so you will get into trouble even on huge machines....
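For example, here is a minimal sketch of one way to keep the memory footprint down, assuming the data are already in a data frame (called d below, with hypothetical variables y, x1, x2 and a binomial model purely as an illustration) and that the add-on package biglm is installed:

    ## Chunked fitting: bigglm() works through the data in pieces
    ## instead of building the whole model matrix at once, so peak
    ## memory use stays bounded by the chunk size.
    library(biglm)
    fit <- bigglm(y ~ x1 + x2, data = d,
                  family = binomial(), chunksize = 5000)
    summary(fit)

    ## A plain glm() call can at least avoid storing extra copies of
    ## the model frame and model matrix in the returned object:
    fit2 <- glm(y ~ x1 + x2, data = d, family = binomial(),
                model = FALSE, x = FALSE, y = FALSE)

Note that model = FALSE / x = FALSE / y = FALSE only trims the fitted object; it does not reduce the memory needed during the fit itself. On Windows, memory.limit() can raise the 494Mb cap mentioned in the warning, but that will not help with a single ~3.7Gb allocation on a 0.5GB machine.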


 Uwe Ligges


Stephen


______________________________________________
[email protected] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html

