On 5/2/2008 2:13 PM, ajoyner wrote:
Hello,
I'm attempting to load a ~110 MB text file with ~500,000 rows and 200
columns using read.table. R hangs and seems to give up. Can anyone tell me
an efficient way to load a file of this size?

It will help a lot if you specify the column types (using the colClasses argument), so that R doesn't have to determine them from the data.
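For example, something along these lines should work (a sketch only -- the file name, separator, and exact column types are placeholders, assuming 200 columns as you describe):

# Sketch: adjust the file name, separator, and types to match your data.
# Supplying the types up front lets read.table skip the type-guessing pass.
colTypes <- c("character", rep("numeric", 199))   # 200 columns in total
dat <- read.table("bigfile.txt", header = TRUE, sep = "\t",
                  colClasses = colTypes)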

It will also help if you've got lots of physical memory available for R; depending on the data, that could take several hundred MB of memory, and if the OS needs to use swap space to get it, you'll find it very slow. If you want to limit the memory footprint, don't read all of the data at once: specify some columns to be skipped (set their class to "NULL") or some rows (using the skip and/or nrows arguments).
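For instance (again just a sketch, with made-up file name, column choices, and row counts):

# Keep only the first two columns; drop the other 198 by declaring them "NULL".
keepTypes <- c("character", "numeric", rep("NULL", 198))
subsetCols <- read.table("bigfile.txt", header = TRUE, sep = "\t",
                         colClasses = keepTypes)

# Or read the file in blocks of rows: skip the header line plus the first
# 100,000 data rows, then read the next 50,000.
chunk <- read.table("bigfile.txt", header = FALSE, sep = "\t",
                    colClasses = keepTypes,
                    skip = 100001, nrows = 50000)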

Duncan Murdoch
