Hi All,
 
I am currently trying to familiarize myself with the "ff" package, which 
allows me to store R objects on the hard drive. One thing I notice when 
reading a text file with the "read.csv.ffdf" function is that 1000+ small 
files get created in the R temp folder, each with a name like 
"ffdf1bcd8b4aa0.ff" and each about 5 KB in size. 
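 
For reference, this is essentially the call I am using (the file name is 
just a placeholder): 
 
    library(ff) 
    # read the CSV into an ffdf; the backing .ff files are created in 
    # the directory given by getOption("fftempdir") 
    dat <- read.csv.ffdf(file = "mydata.csv", header = TRUE) 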
 
My understanding is that the whole file has been split into small pieces 
and stored on the hard drive. What I am trying to find out is: is there a 
way to reduce the number of splits by increasing the size of each file? 
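 
I have looked at the global options that "ff" sets, but I am not sure 
which of them, if any, controls this (the directory below is just an 
example): 
 
    library(ff) 
    getOption("fftempdir")            # where the backing .ff files are written 
    getOption("ffbatchbytes")         # bytes processed per batch while reading 
    options(fftempdir = "D:/fftemp")  # redirect the backing files if desired 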
 
Thanks in advance,
 
Regards,
Indrajit