Hi: Sorry if this is a double post; I posted the same thing this morning and did not see it appear on the list.
I just started using R and am asking the following questions so that I can plan for the future, when I may have to analyze high-volume data.

1) What are the limitations of R when it comes to handling large datasets? Take, for example, a data frame of 200 million rows and 15 columns (roughly 1.5 to 2 GB in size). Will the limitation come from the hardware specifications or from R itself?

2) Is R compiled as 32-bit or 64-bit (say, on Windows and on AIX)?

3) Are there any other points to note or things to keep in mind when handling large datasets?

4) Should I also be looking at SAS for this reason alone? (We do have SAS in-house, but the problem is that I am still not sure what we are licensed for, etc.)

Any pointers / thoughts will be appreciated.

Satish

______________________________________________
[email protected] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
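As a rough sanity check on question 1, here is a back-of-the-envelope estimate of the in-memory footprint of such a data frame. This is a sketch under the assumption (not stated in the post) that all 15 columns are stored as doubles at 8 bytes per value; actual size depends on the column types, and R's copy-on-modify semantics can temporarily need a multiple of this.

```r
# Hypothetical size estimate for a 200M-row, 15-column data frame,
# assuming every column is numeric (double, 8 bytes per element).
rows  <- 200e6
cols  <- 15
bytes <- rows * cols * 8          # total bytes for the data alone
gib   <- bytes / 2^30             # convert to GiB
gib                               # ~22.4 GiB, far above the ~2 GB on-disk size
```

If the arithmetic above is right, the in-memory object would be roughly an order of magnitude larger than the quoted 1.5–2 GB, which presumably refers to the size of the file on disk.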

