Martin == Martin Lam [EMAIL PROTECTED]
    on Tue, 24 Jan 2006 12:13:07 -0800 (PST) writes:

Martin> Dear Gueorgui,
Martin> Is it true that R generally cannot handle medium sized
Martin> data sets (a couple of hundreds of thousands of observations)
Martin> and therefore large data sets (a couple of millions of
Martin> observations)?
Dear R experts,

Is it true that R generally cannot handle medium sized data sets (a
couple of hundreds of thousands of observations) and therefore large
data sets (a couple of millions of observations)?

I googled and I found lots of questions regarding this issue, but
curiously there were no ...
Hello,

It is not true that R cannot handle matrices with 100,000's of
observations... but:

- Importing (typically with read.table() and the like) exhausts memory
  much sooner. Solution: use scan() and fill a preallocated matrix, or,
  better, use a database.
- Data frames are very nice objects, but if ...
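The scan()-and-preallocate suggestion above can be sketched as follows; the file name and the dimensions are made up for illustration, and the real file would have to be plain whitespace-separated numbers:

```r
# Sketch: read a large whitespace-separated numeric file into a
# preallocated matrix instead of going through read.table().
# "big.txt", n and p are hypothetical.
n <- 500000   # number of rows (observations)
p <- 10       # number of columns (variables)

# scan() returns a single numeric vector; telling it what = double()
# and n = n * p avoids repeated reallocation, and matrix() with
# byrow = TRUE lays the values out one observation per row.
x <- matrix(scan("big.txt", what = double(), n = n * p),
            nrow = n, ncol = p, byrow = TRUE)
```

This avoids the per-column type guessing and intermediate copies that make read.table() much hungrier than the final object itself.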
My experience is that 100,000 shouldn't be a problem. Of course, it
also depends on your computer configuration.
On 1/24/06, Gueorgui Kolev [EMAIL PROTECTED] wrote:

Dear R experts,

Is it true that R generally cannot handle medium sized data sets (a
couple of hundreds of thousands ...
On Tue, 24 Jan 2006, Gueorgui Kolev wrote:

Dear R experts,

Is it true that R generally cannot handle medium sized data sets (a
couple of hundreds of thousands of observations) and therefore large
data sets (a couple of millions of observations)?

I googled and I found lots of questions regarding ...
Completely agree. I use R to analyze graphs with millions of vertices
and tens of millions of edges. (Of course this is a bit different from
working with data frames.)

I think your problem is that the foreign package parses/converts your
data file slowly; convert it to R format once and it will be ...
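The convert-once advice above might look like this in practice; read.dta() and the file names are illustrative, since the thread does not say which foreign format is involved:

```r
library(foreign)

# One-time, slow conversion from a foreign format (Stata .dta is used
# here only as an example; "survey.dta" is a hypothetical file).
d <- read.dta("survey.dta")

# Save the result in R's native binary format ...
save(d, file = "survey.RData")

# ... and in every later session load that directly, which skips the
# slow parsing step entirely.
load("survey.RData")
```

The same pattern works for any of the foreign readers (read.spss(), read.xport(), ...): pay the parsing cost once, then work from the .RData file.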
Dear Gueorgui,

Is it true that R generally cannot handle medium sized data sets (a
couple of hundreds of thousands of observations) and therefore large
data sets (a couple of millions of observations)?
It depends on what you want to do with the data sets.
Loading the data sets shouldn't be any ...
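A quick way to see why loading alone is rarely the problem is to compute the memory footprint directly; the dimensions below are illustrative:

```r
# Each numeric value takes 8 bytes, so a 1,000,000 x 10 numeric
# matrix needs roughly 8 * 1e6 * 10 bytes = 80 MB of RAM -- sizable,
# but not out of reach.
m <- matrix(0, nrow = 1e6, ncol = 10)
print(object.size(m), units = "Mb")   # prints the size, about 76 Mb
```

What pushes you over the edge is usually not the object itself but the copies made during import and by whole-object operations on it, which is why the scan()/database advice elsewhere in this thread matters.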