I am a SAS user currently evaluating R as a possible addition to, or even
replacement for, SAS. The difficulty I have come across straight away is
R's apparent trouble in handling relatively large data files. Whilst I
would not expect it to handle datasets with millions of records, I still
really need to be able to work with datasets of 100,000+ records and 100+
variables. Yet, when reading a .csv file with 180,000 records and about
200 variables, the software virtually ground to a halt (I stopped it
after 1 hour).

Are there guidelines, or perhaps a limitations document, anywhere that
would help me assess the size of file that R in general, or specific
routines, will handle? Also, mindful of the fact that I am an R novice,
are there guidelines for making efficient use of R in terms of data
handling?
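
For reference, my import attempt was essentially a plain read.csv() call
along these lines (the file name below is a placeholder for my actual
file):

    ## Placeholder file name; the real file has ~180,000 rows and ~200 columns
    dat <- read.csv("large_file.csv", header = TRUE)

Would pre-specifying the column types, so that read.csv() does not have
to guess them, be the recommended approach? Something like the following
(assuming, purely for illustration, that all 200 columns are numeric):

    ## Hypothetical speed-up: declare column classes and row count up front
    dat <- read.csv("large_file.csv", header = TRUE,
                    colClasses = rep("numeric", 200),
                    nrows = 180000)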

Many thanks in advance for your help.

Regards,
Fabiano Vergari
[EMAIL PROTECTED]

