There seem to have been three almost identical reports of this from three 
different email addresses, including two widely separated UK academic 
institutions, and two user names, which is a bit weird.

However: R will need more than 4GB of memory to handle a 3.99GB file, and 
probably more than 4GB even for a 1.33GB file, since the parsed data frame 
and the intermediate copies made while reading and analysing it typically 
take several times the size of the CSV file.

On 32-bit R I think the recommendation is that the data should occupy no more 
than 10% of the address space, i.e., about 400MB.  For 64-bit R, you are still 
likely to find R very slow if the data set is more than about one third of 
physical memory.  For a 4GB data file I would recommend a computer with at 
least 16GB of memory.
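
A quick way to estimate how much memory the parsed data will actually take is 
to read a small sample of rows and look at object.size().  Something along 
these lines (the file name here is just a placeholder):

    ## Read a small sample of rows to estimate the memory needed for the
    ## full data frame (the file name is a placeholder).
    sample_rows <- read.csv("bigfile.csv", nrows = 1000)
    print(object.size(sample_rows), units = "Mb")

    ## Supplying colClasses stops read.csv() from guessing column types,
    ## which reduces the transient memory overhead on the full read.
    classes <- sapply(sample_rows, class)
    ## full <- read.csv("bigfile.csv", colClasses = classes)

Scaling the per-1000-row figure up by the total number of rows gives a rough 
lower bound on the memory the data alone will occupy, before any analysis.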

Now, it is often possible to load only a small fraction of the data at a time 
and so to analyse large data sets on smaller computers.  Three examples that I 
have worked on:
    - analysing data from a survey data set about the same size as yours, by 
      keeping most of the data in a SQLite database and loading just a few 
      variables at a time
    - fitting linear regression models to large data sets with the biglm 
      package, by keeping the data in a SQLite database and loading just a few 
      rows at a time (a rough sketch of this pattern is below)
    - analysing whole-genome genetic data (40GB or so) by storing the data in a 
      netCDF file and reading appropriate chunks with the ncdf package
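
As a rough sketch of the chunk-at-a-time idea for the biglm case (the table 
name, column names, and file name below are made up, and the data are assumed 
to have been loaded into SQLite already):

    library(DBI)
    library(RSQLite)
    library(biglm)

    ## Connect to an existing SQLite database (file name is made up)
    con <- dbConnect(SQLite(), dbname = "survey.sqlite")

    n <- dbGetQuery(con, "SELECT count(*) AS n FROM survey")$n
    chunk_size <- 100000
    fit <- NULL

    ## Read the table one chunk of rows at a time and update the model,
    ## so only one chunk is ever held in memory.
    for (start in seq(0, n - 1, by = chunk_size)) {
        chunk <- dbGetQuery(con, sprintf(
            "SELECT y, x1, x2 FROM survey LIMIT %d OFFSET %d",
            chunk_size, start))
        if (is.null(fit)) {
            fit <- biglm(y ~ x1 + x2, data = chunk)
        } else {
            fit <- update(fit, chunk)
        }
    }

    summary(fit)
    dbDisconnect(con)

The same idea (read a slab, process it, discard it) carries over to the 
netCDF case.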

          -thomas


On Wed, 17 Mar 2010, Mamun wrote:


Dear List,

I am trying to read some files using read.csv; the total size of those files
is 3.99 GB. I am using a MacBook Pro with 4GB RAM (Snow Leopard). I also tried
to read just a chunk of those files, altogether 1.33 GB, but every time I got
the following error:

R(1200) malloc: *** mmap(size=16777216) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug

I managed to read smaller chunks (400MB) and also saved the objects. But the
problem is that when I try to load some of those objects (no more than 1.5 GB
altogether), I get the same error again. Can someone please help? Why am I
getting this error? Does R need more space than the actual file size? I may
buy a new machine if this is something related to RAM size, but if it is an R
problem then buying a new machine would be useless. So I really need to know
why this is happening.

Any help is appreciated.

Thanks in advance.

regards,
Mamun



Thomas Lumley                   Assoc. Professor, Biostatistics
tlum...@u.washington.edu        University of Washington, Seattle

