A little more information would help, such as the number of columns.  I
imagine it must be large, because 100,000 rows isn't overwhelming.
Second, does read.csv() fail, or does it work but only after a long
time?  And third, how much RAM do you have available?
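
If read.csv() works but is just slow, one general trick (a sketch only;
the file name below is a placeholder, not something from your message) is
to tell read.csv() the column classes up front so it doesn't have to
guess them:

    # Peek at the first few rows to learn the column classes
    # ("mydata.csv" is a placeholder file name)
    peek <- read.csv("mydata.csv", nrows = 5)
    classes <- sapply(peek, class)

    # Re-read the full file with the classes fixed; skipping the
    # repeated type guessing is usually noticeably faster
    system.time(full <- read.csv("mydata.csv", colClasses = classes))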

R Core provides some guidelines in the R Installation and Administration
manual which suggest that a single object of around 10% of your RAM is
reasonable, but beyond that things can become challenging, particularly
once you start working with your data.
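
As a rough back-of-envelope check (the column count below is purely
hypothetical), a numeric data frame needs about 8 bytes per cell:

    rows <- 1e5
    cols <- 100                  # hypothetical number of columns
    rows * cols * 8 / 2^20       # roughly 76 MB, before any copies R makes

    # once the data are in, object.size() reports the actual footprint:
    # print(object.size(full), units = "Mb")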

There is a wide range of packages to help with large data sets.  For
example, RMySQL supports MySQL databases.  At the other end of the
spectrum, there are possibilities discussed on a nice page by Dirk
Eddelbuettel (the CRAN High-Performance Computing task view) which you
might look at:

http://cran.r-project.org/web/views/HighPerformanceComputing.html
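
For illustration only (the database name, table, and credentials below
are placeholders), pulling just the subset you need from a MySQL table
with RMySQL might look like:

    library(RMySQL)    # loads DBI as well

    # connect to an existing MySQL database (all details are placeholders)
    con <- dbConnect(MySQL(), dbname = "mydb", user = "me",
                     password = "secret", host = "localhost")

    # fetch only the rows and columns you actually need, rather than
    # importing the whole table into R
    sub <- dbGetQuery(con, "SELECT id, x, y FROM big_table WHERE year = 2009")

    dbDisconnect(con)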

Jay

-- 
John W. Emerson (Jay)
Associate Professor of Statistics
Department of Statistics
Yale University
http://www.stat.yale.edu/~jay

(original message below)
------------------------------

Date: Sat, 27 Mar 2010 10:19:33 +0100
From: "n\.vial...@libero\.it" <n.via...@libero.it>
To: "r-help" <r-help@r-project.org>
Subject: [R] large dataset

Hi, I have a question:
I am not able to import a csv file which contains a big dataset (100,000
records).  Does anyone know how many records R can handle without giving
problems?  What I'm facing when I try to import the file is that R
generates more than 100,000 records and is very slow...
Thanks a lot!
