Hi Jeff & Steve,
Thanks for your responses.  After seven hours, R ran out of memory and the
session ended.  The machine currently has 4GB of RAM; I'm looking to
install more tomorrow.

I will look into SQLite3; thanks!
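For what it's worth, here is a minimal sketch of the SQLite approach: stage the file into an on-disk database in chunks (so nothing close to 9.6GB ever sits in RAM), then pull back only the rows and columns needed via SQL. The file name "big.dat", the table name "bigtable", the tab delimiter, and the column names in the query are all placeholders; adjust them to the actual data.

```r
# Sketch: chunked load of a large delimited file into SQLite, then query it.
# Assumes a tab-delimited file with a header row; names below are placeholders.
library(DBI)
library(RSQLite)

con <- dbConnect(RSQLite::SQLite(), "big.sqlite")

infile <- file("big.dat", "r")
header <- strsplit(readLines(infile, n = 1), "\t")[[1]]

chunk_size <- 100000
repeat {
  # read.table on an open connection continues from where it left off;
  # it errors when the file is exhausted, which we treat as "done"
  chunk <- tryCatch(
    read.table(infile, sep = "\t", nrows = chunk_size,
               col.names = header, colClasses = "character"),
    error = function(e) NULL
  )
  if (is.null(chunk)) break
  dbWriteTable(con, "bigtable", chunk, append = TRUE)
  if (nrow(chunk) < chunk_size) break
}
close(infile)

# Work with subsets via SQL instead of loading everything into memory:
res <- dbGetQuery(con,
  "SELECT col1, col2 FROM bigtable WHERE col3 > 100 LIMIT 10")

dbDisconnect(con)
```

The sqldf package's read.csv.sql() wraps much the same idea in one call, if that is more convenient.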

I've read that a SQL database would handle data of this size well (reading
in and manipulating), but I understand some commercial database products
carry a hefty price tag (similar to the cost of SAS licensing?).  At this
time I'm looking for a low-cost solution, if possible.  After this data
event, a database would not be needed in the future; also, of the multiple
data sets to synthesize, only a handful are of this size.

Thanks, and please share any other advice!

--
View this message in context: 
http://r.789695.n4.nabble.com/Reading-in-9-6GB-DAT-File-OK-with-64-bit-R-tp4457220p4458042.html
Sent from the R help mailing list archive at Nabble.com.

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
