Hi,
I've been using large datasets (GB), storing them in MySQL
databases and using RMySQL to access them. My feeling is that most of the
time you don't need to keep the whole dataset in your workspace; you only
need to access parts of it, or aggregate it in some way, before running
an analysis. So ...
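A minimal sketch of that workflow, assuming a hypothetical MySQL table
"bigdata" with columns grp and value (table, column names, and
credentials here are all placeholders):

  library(RMySQL)

  ## Connect to the database (user/password/dbname are placeholders)
  con <- dbConnect(MySQL(), user = "me", password = "secret",
                   dbname = "mydb", host = "localhost")

  ## Let MySQL do the aggregation: only the small summary
  ## table comes back into the R workspace
  agg <- dbGetQuery(con,
    "SELECT grp, COUNT(*) AS n, AVG(value) AS mean_value
       FROM bigdata GROUP BY grp")

  dbDisconnect(con)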
Hi, does anyone have experience with loading datasets
larger than 2GB into R? My organization is a
SAS-oriented shop and I'm in the process of switching
it to R. One of the complaints about R has always been
its inability to handle large datasets (GB)
efficiently. I would like some comments.
Absolutely no problem on 64-bit OSes with enough memory. Many 32-bit OSes
have problems with 2GB files.
Please do read the posting guide and tell us basic facts like which OS you
are running on, so we don't have to speculate to answer your question.
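For example, including the output of sessionInfo() covers most of those
basics at once, and a quick check of the pointer size tells you whether
your R build is 32- or 64-bit:

  sessionInfo()            # R version, OS/platform, loaded packages
  .Machine$sizeof.pointer  # 8 on a 64-bit build, 4 on 32-bit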
Also, what you want to do with the data matters.