I was trying to read a large .csv file (80 columns, 400,000 rows, about
200 MB) using scan(), with R 2.3.1 on Windows XP. My computer is an AMD 2000+
with 512 MB of RAM.
Sometimes it freezes my PC; other times it just shuts down R quietly.
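Roughly, the call I have been using looks like the sketch below (the file
name and the all-character column specification are placeholders, not my
exact code):

    ## placeholder sketch -- the real file has 80 columns plus a header row
    cols <- as.list(rep("", 80))      # read every field as character
    dat  <- scan("bigfile.csv", what = cols, sep = ",",
                 skip = 1, quiet = TRUE)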
Is there an option or function that handles large files better?
SAS apparently deals with this file without any problem, but I have just
persuaded my professor to switch to R, so this is quite disappointing.
Please help, thank you.