We are new to R and evaluating if we can use it for a project we need to
do. We have read that R is not well suited to handle very large data
sets. Assuming we have the data prepped and stored in an RDBMS (Oracle,
Teradata, SQL Server), what can R reasonably handle from a volume
perspective?
Dear Jeff,
R works fine for the 22 rows I tested on a home PC running Windows XP. Memory
is limited by the hardware you have. I suggest beefing up RAM to 2 GB, adding
hard disk space, and then working it out from there. I evaluated R too, on my
site www.decisionstats.com, and I found it comparable if not better to
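To make the RAM point concrete: base R's `object.size()` reports how much memory an object occupies, which gives a quick back-of-envelope for a given data volume. A minimal sketch (the one-million-row figure is just an illustration, not from the thread):

```r
# Rough in-memory cost in R: a numeric vector stores doubles at about
# 8 bytes per value, plus a small fixed overhead for the vector itself.
x <- numeric(1e6)        # one million doubles
print(object.size(x))    # on the order of 8 MB
```

Scaling this up, tens of millions of numeric values already approach the 2 GB of RAM suggested above, which is why the hardware ceiling matters.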
On Tue, Apr 08, 2008 at 09:26:22AM -0500, Jeff Royce wrote:
> We are new to R and evaluating if we can use it for a project we need to
> do. We have read that R is not well suited to handle very large data
> sets. Assuming we have the data prepped and stored in an RDBMS (Oracle,
> Teradata, SQL
Millions of rows can be a problem if everything is loaded into memory,
depending on the type of data. Numeric data should be fine, but if you have
strings and want to process based on that column (string comparisons etc.)
then it will be slow.
You may want to combine sources outside R - stored
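The numeric-versus-string point is easy to check in base R. A minimal sketch (the vector length and values are assumptions for illustration): the same million values cost noticeably more memory as character strings than as numbers, and comparisons on them are slower.

```r
# Same values stored as numbers vs. as character strings: the character
# version needs a pointer per element plus the string storage itself,
# on top of which string comparison is slower than numeric comparison.
n <- 1e6
nums  <- rep(c(1, 2, 3), length.out = n)
chars <- as.character(nums)
print(object.size(nums))    # numeric: ~8 bytes per value
print(object.size(chars))   # character: larger for the same data
```

This is why pushing string-heavy filtering and joining down into the RDBMS, and pulling only the reduced result into R, is usually the better division of labour.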