On 02.09.2020 04:44, David Jones wrote:
> I ran a number of analyses in R and saved the workspace, which
> resulted in a 2GB .RData file. When I try to read the file back into R

The workspace is compressed on disk in the .RData file, but uncompressed in main memory....
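A minimal sketch of the comparison (the file name "workspace.RData" is a placeholder for yours):

file.size("workspace.RData") / 1024^3       # compressed size on disk, in GB
e <- new.env()
load("workspace.RData", envir = e)          # may itself fail when RAM is short
sum(sapply(ls(e), function(x)
    object.size(get(x, envir = e)))) / 1024^3  # uncompressed size in memory, GB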


> later, it won't read into R and provides the error: "Error: cannot
> allocate vector of size 37 Kb"

> This error comes after 1 minute of trying to read things in - I
> presume a single vector sends it over the memory limit. But
> memory.limit() shows that I have access to a full 16 GB of RAM on my
> machine (12 GB are free when I try to load the RData file).

But the data may need more than that: the 37 Kb vector is only the first allocation that fails once memory is already exhausted, not the total amount required.
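On Windows (which the memory.limit() call suggests), a quick sketch of the headroom check; memory.size() is Windows-only, like memory.limit():

memory.limit()                  # cap for this R session, in Mb
memory.size()                   # Mb this session currently uses
memory.limit() - memory.size()  # headroom left before "cannot allocate"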


> gc() shows the following after I receive this error:
>
>           used (Mb) gc trigger   (Mb)   max used    (Mb)
> Ncells  623130 33.3    4134347  220.8    5715387   305.3
> Vcells 1535682 11.8  883084810 6737.5 2100594002 16026.3

So roughly 16 GB (the 16026.3 Mb "max used" for Vcells) had been used when R gave up.
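That figure follows directly from the Vcells count, since each Vcell holds 8 bytes:

2100594002 * 8 / 1024^2   # = 16026.35, the peak usage in Mb, i.e. ~16 GB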

Best,
Uwe Ligges



______________________________________________
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

