Re: [R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

2020-09-03 Thread Ista Zahn
On Wed, Sep 2, 2020 at 7:22 PM Leandro Marino wrote: > David, if the ".Rdata" contains more than one object you could (and maybe should) use the SOAR package (from Venables). This package helps you to split the objects over multiple RData files. It's useful when you have numerous ...

Re: [R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

2020-09-02 Thread John via R-help
On Wed, 2 Sep 2020 16:31:53 -0500 David Jones wrote: > Thank you Uwe, John, and Bert - this is very helpful context. If it helps inform the discussion, to address John and Bert's questions - I actually had less memory free when I originally ran the analyses and saved the workspace, than ...

Re: [R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

2020-09-02 Thread Jeff Newmiller
You need more RAM to load this file. While the memory was in use in your original session, certain objects (such as numeric columns) were being shared among different higher-level objects (such as data frames). When serialized into the file those optimizations were lost, and now those columns are ...
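A minimal sketch of the sharing effect Jeff describes, assuming the lobstr package is installed (its obj_size() counts memory shared between objects only once):

    library(lobstr)

    x   <- runif(1e6)          # ~8 MB of doubles
    df1 <- data.frame(x = x)
    df2 <- df1                 # df2 shares df1's column; no copy is made

    obj_size(df1, df2)         # ~8 MB: the shared column is counted once

    f <- tempfile(fileext = ".RData")
    save(df1, df2, file = f)   # serialization writes the shared column twice
    rm(df1, df2)
    load(f)

    obj_size(df1, df2)         # ~16 MB: sharing was lost on save()/load()

After the round trip the two data frames no longer point at the same vector, which is why a reloaded workspace can need far more RAM than the session that produced the file.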

Re: [R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

2020-09-02 Thread Leandro Marino
David, if the ".Rdata" contains more than one object you could (and maybe should) use the SOAR package (from Venables). This package helps you to split the objects over multiple RData files. It's useful when you have numerous medium-large objects in the workspace but don't use them at the same ...
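For reference, a hedged sketch of the workflow Leandro describes, assuming install.packages("SOAR") has been run (Store(), Objects(), and Attach() are the package's main entry points):

    library(SOAR)

    big1 <- matrix(rnorm(1e6), ncol = 100)
    big2 <- data.frame(a = runif(1e6))

    Store(big1, big2)   # writes each object to its own file under ./.R_Cache
                        # and drops it from the workspace; reloads on demand
    Objects()           # lists what is held in the cache

    # In a later session, re-attach the cache instead of load()ing one file:
    Attach()

Because each object lives in its own file, a later session only pays the memory cost of the objects it actually touches.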

Re: [R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

2020-09-02 Thread David Jones
Thank you Uwe, John, and Bert - this is very helpful context. If it helps inform the discussion, to address John and Bert's questions - I actually had less memory free when I originally ran the analyses and saved the workspace than when I read the data back in later on (I rebooted in an ...

Re: [R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

2020-09-02 Thread Bert Gunter
R experts may give you a detailed explanation, but it is certainly possible that the memory available to R when it wrote the file was different than when it tried to read it, is it not? Bert Gunter "The trouble with having an open mind is that people keep coming along and sticking things into ...

Re: [R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

2020-09-02 Thread John via R-help
On Wed, 2 Sep 2020 13:36:43 +0200 Uwe Ligges wrote: > On 02.09.2020 04:44, David Jones wrote: >> I ran a number of analyses in R and saved the workspace, which resulted in a 2GB .RData file. When I try to read the file back into R ... > Compressed in RData but uncompressed in main ...

Re: [R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

2020-09-02 Thread Uwe Ligges
On 02.09.2020 04:44, David Jones wrote: > I ran a number of analyses in R and saved the workspace, which resulted in a 2GB .RData file. When I try to read the file back into R later, it won't read into R and provides the error: ... Compressed in RData but uncompressed in main memory ...
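A minimal illustration of Uwe's point using only base R: the size of a .RData file reflects gzip compression, not the memory the objects need once loaded:

    x <- rep(1, 1e7)                     # 10 million doubles, very compressible
    f <- tempfile(fileext = ".RData")
    save(x, file = f)                    # save() compresses with gzip by default

    file.size(f)                         # on the order of tens of KB on disk
    print(object.size(x), units = "MB")  # ~76 MB once loaded into RAM

Depending on how compressible the saved objects were, a 2 GB .RData file can therefore easily expand to well beyond 16 GB of RAM.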