We are new to R and evaluating if we can use it for a project we need to
do. We have read that R is not well suited to handle very large data
sets. Assuming we have the data prepped and stored in an RDBMS (Oracle,
Teradata, SQL Server), what can R reasonably handle from a volume
perspective? Are there some guidelines on memory/machine sizing based
on data volume? We need to be able to handle millions of rows drawn from
several sources. Any advice is much appreciated. Thanks.
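For context, here is the kind of back-of-envelope sizing we have been doing ourselves; the row/column counts are hypothetical, and RODBC is just one package we have seen mentioned for pulling subsets from an RDBMS (the DSN and query below are placeholders, not a working setup):

```r
## R holds working data in RAM, and a numeric value takes 8 bytes,
## so a hypothetical 10-million-row by 20-column numeric table needs
## roughly 10e6 * 20 * 8 bytes, before any copies made during analysis
## (a common rule of thumb is to allow several times the raw data size).
rows  <- 10e6
cols  <- 20
bytes <- rows * cols * 8
gib   <- bytes / 2^30   # size in GiB, about 1.5 here

## Rather than loading whole tables, one can pull filtered subsets
## from the database, e.g. via RODBC (sketch only):
# library(RODBC)
# ch <- odbcConnect("myDSN")   # "myDSN" is a placeholder data source name
# df <- sqlQuery(ch, "SELECT col1, col2 FROM big_table WHERE region = 'X'")
# close(ch)
```

Does that style of estimate (raw bytes times a safety factor) match how experienced R users size machines?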
______________________________________________
[email protected] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.