On Mon, 4 Oct 2004, Greg Butler wrote:

> Hi,
>
> I have been enjoying R for some time now, but was wondering about
> working with larger data files.  When I try to load big files with
> more than 20,000 records, the program seems unable to store all the
> records.  Is there some way that I can increase the number of records
> that I work with?  Ideally I would like to work with census data,
> which can contain a million records.


You should be able to handle 20,000 records on a reasonable computer (my laptop, with 256Mb of memory, can, very slowly, do survey analyses on a file with 26,000 records and about 100 variables).
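For example (just a generic sketch, not your actual data -- the file name and column classes here are made up), telling read.table() the column types and row count up front keeps the memory overhead of reading such a file down:

dat <- read.table("survey.dat", header = TRUE,
                  comment.char = "",
                  colClasses = c("integer", "factor", rep("numeric", 98)),
                  nrows = 26000)
object.size(dat)   # rough idea of how much memory the data frame takes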


A million records is likely to be infeasible. A 32-bit computer can't even address enough memory to store that much data. You would need to put the data either in a database or in a file format such as netCDF or HDF5 that allows smaller chunks to be read and processed.
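For instance (just a sketch of the database route, assuming the DBI and RSQLite packages and an invented table of census records), you can pull the data through in chunks and accumulate summaries as you go, without ever holding all the rows in memory:

library(DBI)
library(RSQLite)

con <- dbConnect(SQLite(), dbname = "census.db")   # hypothetical SQLite file
res <- dbSendQuery(con, "SELECT age, income FROM persons")

total <- 0
n <- 0
repeat {
    chunk <- fetch(res, n = 10000)       # bring back 10,000 records at a time
    if (nrow(chunk) == 0) break
    total <- total + sum(chunk$income)   # accumulate whatever summary you need
    n <- n + nrow(chunk)
}
dbClearResult(res)
dbDisconnect(con)
total/n                                  # e.g. mean income over all records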

        -thomas
