Hello,

I've finally hit a wall: my 2 GB RAM machine simply can't handle
the datasets I'm working with. I've tried a 64-bit build of R on
an 8 GB RAM machine, but that machine isn't always available.

There are several proposed ways around this, but the most general
solution seems to be using an SQL database to manage large datasets.

I thought I'd ask what the best approach is before going off and
spending a lot of time failing to get anything to work. I have no
experience with SQL or database administration.

My data frames are at most in the 100,000 x 10,000 range, with a mix
of numeric, factor, and character data.

For somebody with basically NO experience, what works best right
now: DBI + RSQLite, or would SQLiteDF be better?
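
From skimming the DBI documentation, I gather the basic pattern is
something like the following (untested on my part; the table, column,
and file names are just placeholders I made up):

    library(DBI)
    library(RSQLite)

    ## open an on-disk database so the data lives outside R's memory
    con <- dbConnect(dbDriver("SQLite"), dbname = "mydata.sqlite")

    ## write the data frame into a table once...
    dbWriteTable(con, "big_table", my.big.dataframe)

    ## ...then pull back only the rows/columns a given analysis needs
    sub <- dbGetQuery(con,
        "SELECT var1, var2 FROM big_table WHERE var3 = 'some_level'")

    dbDisconnect(con)

Is that the right general idea?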

Thank you,
Bing
