> We are new to R and evaluating if we can use it for a project we need to
> do. We have read that R is not well suited to handle very large data
> sets. Assuming we have the data prepped and stored in an RDBMS (Oracle,
> Teradata, SQL Server), what can R reasonably handle from a volume
> perspective? Are there some guidelines on memory/machine sizing based
> on data volume? We need to be able to handle millions of rows from
> several sources. Any advice is much appreciated. Thanks.
The most important thing is what type of analysis you want to do with the data. Is the algorithm that implements the analysis O(n), O(n log n), or O(n^2)?

Hadley

--
http://had.co.nz/
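To make the memory and complexity questions concrete, here is a minimal R sketch (the one-million-row size is just an illustrative assumption, not a recommendation) that estimates the RAM cost of a numeric column and compares an O(n) operation with an O(n log n) one:

```r
# A numeric (double) column costs 8 bytes per value, so one million
# rows of a single numeric column is roughly 8 MB of RAM. R generally
# needs the whole object (plus working copies) to fit in memory.
x <- numeric(1e6)
print(object.size(x))  # roughly 8 MB

# Compare an O(n) operation (sum) with an O(n log n) one (sort).
# Both are quick at this scale; the gap widens as n grows, which is
# why the algorithm's complexity matters more than the row count alone.
v <- runif(1e6)
system.time(sum(v))
system.time(sort(v))
```

Multiplying the per-column estimate by the number of columns (and allowing headroom for intermediate copies) gives a rough machine-sizing figure for data pulled from the RDBMS.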