Re: [R] Handling large data sets via scan()

2005-02-04 Thread Christoph Lehmann
Does it partly solve your problem if you use read.table() instead of scan(), since it imports data directly into a data.frame? Let me know if it helps. Nawaaz Ahmed wrote: I'm trying to read in datasets with roughly 150,000 rows and 600 features. I wrote a function using scan() to read it in (I
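A minimal sketch of the suggestion above: read.table() parses a delimited file straight into a data.frame, skipping the intermediate scan()-list step. The sample file written here is hypothetical, just to make the example self-contained.

```r
## Write a tiny tab-delimited sample file, then read it back with
## read.table(), which returns a data.frame directly.
tmp <- tempfile(fileext = ".txt")
writeLines(c("x\ty", "1\t2", "3\t4"), tmp)

df <- read.table(tmp, header = TRUE, sep = "\t")
print(class(df))  # "data.frame" -- no separate conversion needed
print(dim(df))    # 2 rows, 2 columns
```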

Re: [R] Handling large data sets via scan()

2005-02-04 Thread Roger D. Peng
I can usually read in large tables by very careful usage of read.table() without having to resort to scan(). In particular, using the `colClasses', `nrows', and `comment.char' arguments correctly can greatly reduce memory usage (and increase speed) when reading in data. Converting from a list

[R] Handling large data sets via scan()

2005-02-03 Thread Nawaaz Ahmed
I'm trying to read in datasets with roughly 150,000 rows and 600 features. I wrote a function using scan() to read it in (I have a 4GB linux machine) and it works like a charm. Unfortunately, converting the scanned list into a data.frame using as.data.frame() causes the memory usage to explode (it
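One common workaround for the memory blow-up described above (an assumption on my part, since the replies are truncated here): keep the scan() output as a list of columns and stamp the data.frame attributes onto it directly, instead of calling as.data.frame(), which copies every column. Column names and the sample file are hypothetical.

```r
## scan() a whitespace-separated file into a named list of columns.
tmp <- tempfile()
writeLines(c("1 0.5", "2 1.5"), tmp)
cols <- scan(tmp, what = list(id = integer(), val = double()), quiet = TRUE)

## Convert in place: set row.names, then the class, without copying columns.
attr(cols, "row.names") <- seq_along(cols[[1]])
class(cols) <- "data.frame"
print(is.data.frame(cols))  # TRUE
```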

RE: [R] Handling large data sets via scan()

2005-02-03 Thread Mulholland, Tom
Subject: [R] Handling large data sets via scan() I'm trying to read in datasets with roughly 150,000 rows and 600 features. I wrote a function using scan() to read it in (I have a 4GB linux machine) and it works like a charm. Unfortunately, converting the scanned list into a data.frame using