[R] Practical Data Limitations with R

2008-04-08 Thread Jeff Royce
We are new to R and are evaluating whether we can use it for a project we need to do. We have read that R is not well suited to handling very large data sets. Assuming we have the data prepped and stored in an RDBMS (Oracle, Teradata, SQL Server), what can R reasonably handle from a volume perspective?
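One common answer to this question is to keep the bulk of the data in the RDBMS and let the database do the heavy filtering and aggregation, so only a small result set ever enters R's memory. A minimal sketch, using the DBI interface with RSQLite as a stand-in for the Oracle/Teradata/SQL Server drivers the thread mentions (the table and column names here are invented for illustration):

```r
# Sketch: aggregate inside the database so only the small summary
# crosses into R, instead of loading the whole table into memory.
# RSQLite stands in for a production driver (ROracle, RODBC, etc.).
library(DBI)
library(RSQLite)

con <- dbConnect(SQLite(), ":memory:")

# Toy table standing in for a large RDBMS table.
dbWriteTable(con, "sales",
             data.frame(region = c("N", "S", "N"),
                        amount = c(10, 20, 30)))

# The GROUP BY runs in the database; R receives only two rows.
res <- dbGetQuery(con, "SELECT region, SUM(amount) AS total
                        FROM sales GROUP BY region")
print(res)

dbDisconnect(con)
```

The same pattern applies regardless of backend: the SQL changes, the R side does not.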

Re: [R] Practical Data Limitations with R

2008-04-08 Thread ajay ohri
Dear Jeff, R works fine for 22 rows that I tested on a home PC with XP. Memory is limited to the hardware that you have. I suggest beefing up RAM to 2 GB and hard disk space and then working it out. I evaluated R too on my site www.decisionstats.com and I found it comparable if not better to
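Since R holds working data in RAM, a quick back-of-the-envelope estimate tells you whether a data set will fit before you try to load it: a numeric (double) value takes 8 bytes, so rows × columns × 8 approximates the footprint of an all-numeric table. A small sketch (the row and column counts are illustrative):

```r
# Rough memory estimate for an all-numeric table:
# each double takes 8 bytes, ignoring object headers and any
# temporary copies R makes during computation.
rows <- 1e6
cols <- 10
approx_mb <- rows * cols * 8 / 2^20
approx_mb   # about 76.3 MB

# Cross-check against a real (smaller) object:
m <- matrix(0, nrow = 1e5, ncol = 10)
object.size(m)   # about 8 MB
```

In practice you should budget for two to three times this estimate, since many R operations copy their inputs.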

Re: [R] Practical Data Limitations with R

2008-04-08 Thread Philipp Pagel
On Tue, Apr 08, 2008 at 09:26:22AM -0500, Jeff Royce wrote: We are new to R and evaluating if we can use it for a project we need to do. We have read that R is not well suited to handle very large data sets. Assuming we have the data prepped and stored in an RDBMS (Oracle, Teradata, SQL

Re: [R] Practical Data Limitations with R

2008-04-08 Thread hadley wickham
We are new to R and evaluating if we can use it for a project we need to do. We have read that R is not well suited to handle very large data sets. Assuming we have the data prepped and stored in an RDBMS (Oracle, Teradata, SQL Server), what can R reasonably handle from a volume

Re: [R] Practical Data Limitations with R

2008-04-08 Thread Sankalp Upadhyay
Millions of rows can be a problem if everything is loaded into memory, depending on the type of data. Numeric columns should be fine, but if you have strings and you want to process based on that column (string comparisons etc.) then it will be slow. You may want to combine sources outside - stored
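One standard mitigation for the string-column cost described above is to store repeated strings as a factor: R then keeps one copy of each distinct level plus an integer code per row, so grouping and tabulation work on integers rather than repeated string comparisons. A small sketch with invented example data:

```r
# Sketch: a factor stores integer codes plus one copy of each level,
# so repeated string values cost far less than a character vector.
x_chr <- sample(c("red", "green", "blue"), 1e5, replace = TRUE)
x_fac <- factor(x_chr)

# Comparisons give the same logical result either way:
identical(x_chr == "red", x_fac == "red")   # TRUE

# Tabulating the integer-coded factor avoids per-row string work:
table(x_fac)
```

For high-cardinality strings (e.g. unique IDs) a factor buys little, and it is usually better to push that processing into the database, as the earlier replies suggest.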