On Sun, 13 Feb 2005, Thomas Colson wrote:
Hi,
I've collected quite a bit of elevation data (LIDAR elevation points) and am looking for a suitable platform to do analysis and modeling on it. The data is sitting in an Oracle database: one table, 200 million rows of x, y, and z. I'm trying to figure out what hardware resources we need to reserve in order to run 64-bit R on data of this size.
Here's my question: Is the 64-bit version of R appropriate for data of this size? Or is attempting to read all 200 million rows a pipe dream no matter what platform I'm using?
In principle R can handle this with enough memory. However, 200 million rows of three double-precision (8-byte) columns is 4.8 GB of storage, and R usually needs a few times the size of the data as working space.
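A back-of-envelope check of that figure (assuming every value is stored as an 8-byte double, which is R's default numeric type):

```r
# Raw storage for the full table, before any working copies R makes.
rows  <- 200e6   # 200 million LIDAR points
cols  <- 3       # x, y, z
bytes <- rows * cols * 8   # 8 bytes per double
bytes / 1e9                # 4.8 (GB for the raw data alone)
```

Since R routinely duplicates objects during computation, a machine would want several times that 4.8 GB in RAM to work with the whole table at once.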
You would likely be better off not reading the whole data set at once, but loading sections of it from Oracle as needed.
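A minimal sketch of that chunked approach using the DBI interface with the ROracle driver — the connection details, table name, and column names here are hypothetical, and the chunk size is just an illustration:

```r
library(DBI)
library(ROracle)  # assumes the ROracle DBI driver is installed

# Hypothetical connection details -- substitute your own.
con <- dbConnect(dbDriver("Oracle"),
                 username = "user", password = "pass", dbname = "lidar")

res <- dbSendQuery(con, "SELECT x, y, z FROM elevation_points")
repeat {
  chunk <- fetch(res, n = 1e6)   # pull 1 million rows at a time
  if (nrow(chunk) == 0) break
  # ... process this chunk (e.g., grid, aggregate, or summarize) ...
}
dbClearResult(res)
dbDisconnect(con)
```

Each chunk needs only a few tens of MB of memory, so the analysis never holds more than one million rows in R at a time. (This sketch cannot run without a live Oracle instance, so treat it as an outline rather than tested code.)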
-thomas
______________________________________________
[email protected] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
