You could also have a look at the LaF package, which is written to handle large text files:

http://cran.r-project.org/web/packages/LaF/index.html

Under the vignettes you'll find a manual.

Note: LaF will not help you fit 9 GB of data into 4 GB of memory, but it can help you read the file block by block and filter it as you go.
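A minimal sketch of that block-by-block pattern with LaF is below. The file name, separator, column types, and the `ids` filter vector are all assumptions -- adjust them to match your .DAT file (the toy input is only there so the example runs end to end):

```r
library(LaF)

ids <- c(101L, 205L)                        # stand-in for your ~560 key values

## Toy input so the example runs; replace with your real 9 GB file.
writeLines(c("101,1.5", "999,2.0", "205,3.7"), "big.dat")

laf <- laf_open_csv("big.dat", sep = ",",
                    column_types = c("integer", "double"),
                    column_names = c("id", "value"))

## Read the file in blocks, keeping only rows whose id is in 'ids'.
chunks <- list()
repeat {
  block <- next_block(laf, nrows = 100000)  # 100k rows per block
  if (nrow(block) == 0) break               # end of file reached
  chunks[[length(chunks) + 1]] <- block[block$id %in% ids, ]
}
close(laf)

filtered <- do.call(rbind, chunks)          # only the matching rows in memory
```

Only the filtered rows ever accumulate in memory, so the peak footprint is one block plus the matches.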

Jan

RHelpPlease <rrum...@trghcsolutions.com> wrote:

Hi Barry,

"You could do a similar thing in R by opening a text connection to
your file and reading one line at a time, writing the modified or
selected lines to a new file."

Great!  I'm aware that this exists, but I don't know the commands for R.  I
have a 560 x 1 variable to use to pare down the incoming large data set (which
surely has millions of rows).  Other data sets have been small enough that
I've been able to use the merge function after reading them in.  Obviously I'm
having trouble reading in this large data set in the first place.
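Barry's suggestion can be sketched in base R as follows. The file name `big.dat`, the tab separator, the key-in-first-column layout, and the `ids` vector are assumptions standing in for your data; the toy input is only there so the example runs:

```r
ids <- c("A001", "B002")                    # stand-in for your ~560 key values

## Toy input so the example runs; replace with your real .DAT file.
writeLines(c("A001\t1.5", "X999\t2.0", "B002\t3.7"), "big.dat")

con <- file("big.dat", open = "r")          # text connection to the input
out <- file("subset.dat", open = "w")       # filtered lines go here

## Read one line at a time; write out only the lines whose key matches.
while (length(line <- readLines(con, n = 1)) > 0) {
  key <- strsplit(line, "\t", fixed = TRUE)[[1]][1]  # key assumed in column 1
  if (key %in% ids) writeLines(line, out)
}
close(con)
close(out)
```

The pared-down `subset.dat` should then be small enough to read with `read.table()` and merge as usual. One line at a time is slow for millions of rows; raising `n` in `readLines()` (e.g. `n = 10000`) and filtering each chunk with `%in%` is much faster with the same memory behaviour.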

Any additional help would be great!


--
View this message in context: http://r.789695.n4.nabble.com/Reading-in-9-6GB-DAT-File-OK-with-64-bit-R-tp4457220p4458074.html
Sent from the R help mailing list archive at Nabble.com.

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
