Type

?memory

into R and that will explain what to do...
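
For example, once the help page is open, gc() and object.size() (both base R) show where the memory is actually going; a minimal sketch, where the vector x is purely illustrative:

gc()                                  # report current memory use; frees unused pages
x <- rnorm(1e6)                       # illustrative object
print(object.size(x), units = "MB")   # memory taken by a single object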

S
----- Original Message ----- From: "Edwin Sendjaja" <edw...@web.de>
To: <r-help@r-project.org>
Sent: Tuesday, January 06, 2009 11:41 AM
Subject: [R] Large Dataset


Hi all,

I have a 3.1 GB dataset (11 columns, lots of integer and string data).
If I use read.table, it takes very long. It seems that my RAM is not big
enough (it gets overloaded). I have 3.2 GB RAM and 7 GB swap on 64-bit Ubuntu.
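
(A commonly recommended first step is tuning read.table itself; a hedged sketch follows, where the file name data.txt and the column layout are assumptions, not details from this thread:

# Giving colClasses up front avoids type-guessing and re-allocation;
# nrows may be a rough overestimate so R can allocate once.
cls <- c(rep("integer", 5), rep("character", 6))   # assumed 11-column layout
dat <- read.table("data.txt", header = TRUE, sep = "\t",
                  colClasses = cls, nrows = 5e6,
                  comment.char = "", stringsAsFactors = FALSE)
)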

Is there a good solution for reading large data into R? I have seen people
suggest the bigmemory and ff packages, but they seem very complicated, and I
don't know how to start with them.

I have tried to use bigmemory, but I got some errors, so I gave up.


Can someone give me a simple example of how to use ff or bigmemory? Or is
there maybe a better solution?
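
(For reference, a minimal sketch of both packages; the file names are hypothetical. Note that a big.matrix holds a single type, so the string columns would have to be dropped or encoded first, while an ffdf handles mixed column types:

library(ff)
# ff keeps the table on disk and pages chunks into RAM as needed
fdat <- read.table.ffdf(file = "data.txt", header = TRUE, sep = "\t")

library(bigmemory)
# file-backed matrix of one type (integer here), numeric columns only
bm <- read.big.matrix("numeric_only.txt", header = TRUE, sep = "\t",
                      type = "integer",
                      backingfile = "data.bin",
                      descriptorfile = "data.desc")
)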



Thank you in advance,


Edwin

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

