Hi,
I have a critical question about using R.
I am currently working on research that involves a huge amount of
data (about 15 GB).
I am trying to use R for this research rather than SAS or Stata.
(The company I work for is trying to switch from SAS/Stata to R.)

As far as I know, R's memory limit is 4 GB (at least on 32-bit builds);
however, I believe there are ways to handle large datasets.
Most of my work in R would be something like cleaning the data or running
simple regressions (OLS/logit), though.
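For the regression part, one common approach I have seen is to fit the model in chunks rather than loading all 15 GB at once. Below is a sketch using the `biglm` package, which updates an OLS fit incrementally; the file name "big.csv" and the columns y, x1, x2 are hypothetical, and `biglm` is assumed to be installed. (For logit, `bigglm()` with `family = binomial()` works similarly.)

```r
library(biglm)

chunk_size <- 100000
con <- file("big.csv", open = "r")

## First chunk: read with the header row and fit the initial model
chunk <- read.csv(con, nrows = chunk_size)
cols  <- names(chunk)
fit   <- biglm(y ~ x1 + x2, data = chunk)

## Remaining chunks: the open connection continues where the last read stopped
repeat {
  chunk <- try(read.csv(con, nrows = chunk_size, header = FALSE,
                        col.names = cols), silent = TRUE)
  if (inherits(chunk, "try-error") || nrow(chunk) == 0) break
  fit <- update(fit, chunk)   # incrementally update the OLS fit
}
close(con)

summary(fit)
```

Is something along these lines the recommended way, or are there better tools for data this size?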

The whole company relies on me when it comes to R.
Please teach me how to deal with large data in R.
If possible, please respond soon.
Thank you very much.

Regards,
Hyo


______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
