I am working with data sets that have two matrices of 300 columns by 19,000 rows, and I manage to get the data loaded in a reasonable amount of time. Once it's in, I save the workspace and load from there. Once I start doing some work on the data, I am taking up about 600 MB of the 1 GB of RAM in the computer. I will soon upgrade to 2 GB because I will have to work with an even larger data matrix soon.
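
For what it's worth, here is a minimal sketch of that read-once-then-reload workflow (the file names are made up for illustration):

  # Pay the slow text-parsing cost once:
  m1 <- as.matrix(read.table("expr1.txt", header = TRUE))
  m2 <- as.matrix(read.table("expr2.txt", header = TRUE))
  save.image(file = "session.RData")   # snapshot the whole workspace

  # In later sessions, restore the binary image instead of re-parsing:
  load("session.RData")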

I must say that the speed of R, given what I have been doing, is acceptable.

Peter




At 07:59 PM 6/29/2004, Vadim Ogranovich wrote:
R's I/O is indeed 20-50 times slower than that of equivalent C code no
matter what you do, which has been a pain for some of us. It does,
however, help to read the Import/Export tips, as without them the ratio
gets much worse. As Gabor G. suggested in another mail, if you use the
file repeatedly you can convert it into R's internal format: read.table()
once into R and save with save(); loading the result back is much faster.
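
A sketch along those lines (file names hypothetical; the colClasses and
nrows arguments are the usual tricks from the Data Import/Export manual
for speeding read.table up):

  # Slow path, paid once; declaring the column types and row count
  # up front spares read.table a lot of guessing and re-allocation:
  x <- read.table("big.txt", header = TRUE,
                  colClasses = "numeric", nrows = 19000)
  save(x, file = "big.RData")

  # Every later session skips the text parser entirely:
  load("big.RData")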

In my experience R is not so good at large data sets, where "large" is
roughly 10% of your RAM.
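
To put a number on that: a numeric matrix is stored as 8-byte doubles, so
one of the 19,000 x 300 matrices mentioned above takes about
19000 * 300 * 8 = 45,600,000 bytes, roughly 45 MB, before counting the
copies R makes during computation. You can check this directly:

  x <- matrix(0, nrow = 19000, ncol = 300)
  object.size(x)   # about 45,600,000 bytes (19000 * 300 * 8, plus a small header)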

