To read in just a few columns you can also use something like

reduced.data <- do.call(cbind,
    scan(file = 'location/name',
         what = list(NULL,NULL,NULL,NULL,NULL,NULL,NULL,0,NULL,0,NULL,NULL,NULL),
         flush = TRUE))

where only columns 8 and 10 (the non-NULL entries of 'what') are saved in your object.
Rubén
On 2/28/08, Gabor Grothendieck [EMAIL PROTECTED] wrote:
The sqldf package can read a subset of rows and columns (actually any
SQL operation) from a file larger than R can otherwise handle. It will
automatically set up a temporary SQLite database for you and load the
file into the database.
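Something along these lines should work (a minimal sketch, not tested
here; 'hapmap.txt' and the column names are made-up placeholders, and
the file is assumed to be tab-separated with a header row):

library(sqldf)   # uses RSQLite behind the scenes
f <- file("hapmap.txt")
# dbname = tempfile() puts the intermediate database on disk rather than
# in memory; only the rows/columns the query selects ever reach R
reduced <- sqldf("select col8, col10 from f",
                 dbname = tempfile(),
                 file.format = list(header = TRUE, sep = "\t"))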
Jorge Iván Vélez wrote:
Does somebody know how I can read a HUGE data set using R? It is a
HapMap data set (txt format) which is around 4 GB. [...]
Sounds like you want to use the filehash package, which was written for
just such problems:
http://yusung.blogspot.com/2007/09/dealing-with-large-data-set-in-r.html
http://cran.r-project.org/web/packages/filehash/index.html
or maybe the ff package.
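With filehash the idea is to keep the object in an on-disk database and
pull out only what you need. A minimal sketch (the database name, key,
and 'reduced.data' object are placeholders):

library(filehash)
dbCreate("hapmap.db")              # one-time: create the on-disk database
db <- dbInit("hapmap.db")
dbInsert(db, "geno", reduced.data) # store the data under a key
# in a later session, fetch it back without re-reading the 4 GB text file
db <- dbInit("hapmap.db")
geno <- dbFetch(db, "geno")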
Hi,

Jorge Iván Vélez wrote:
Does somebody know how I can read a HUGE data set using R? It is a
HapMap data set (txt format) which is around 4 GB. [...]

In such a case, I would use read.table's colClasses= argument, which
can take "NULL" for the columns that you want ignored. Also see the
skip= argument, and ?read.table generally.
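For example (a sketch assuming the 13-column layout mentioned elsewhere
in the thread, tab-separated with no header; 'location/name' is a
placeholder path):

# the string "NULL" tells read.table to drop a column entirely
cls <- rep("NULL", 13)
cls[c(8, 10)] <- "numeric"        # keep only columns 8 and 10
reduced.data <- read.table("location/name", header = FALSE, sep = "\t",
                           colClasses = cls)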
Dear R-list,

Does somebody know how I can read a HUGE data set using R? It is a
HapMap data set (txt format) which is around 4 GB. After reading it, I
need to delete some specific rows and columns. I'm running R 2.6.2
patched on XP SP2, using a 2.4 GHz Core 2 Duo processor and 4 GB of RAM.
Depending on how many rows you will delete, and if you know in advance
which ones they are, one approach is to use the skip= argument of
read.table. If you only need a fraction of the total number of rows,
this will save a lot of RAM.
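For instance (just a sketch; the file name and counts are arbitrary):

# ignore the first 500000 data rows, then read only the next 100000
chunk <- read.table("hapmap.txt", header = FALSE, sep = "\t",
                    skip = 500000, nrows = 100000)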
Mark
Mark W. Kimpel MD ** Neuroinformatics ** Dept. of
I may be mistaken, but I believe R does all its work in memory. If
that is so, you would really only have 2 options:
1. Get a lot of memory
2. Figure out a way to do the desired operation on parts of the data
at a time (see the sketch below).
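Something like the following might work for option 2 (untested;
'hapmap.txt' stands in for your file, assumed tab-separated):

con <- file("hapmap.txt", open = "r")
repeat {
  # read.table signals an error once the connection is exhausted
  chunk <- tryCatch(read.table(con, header = FALSE, sep = "\t",
                               nrows = 100000),
                    error = function(e) NULL)
  if (is.null(chunk)) break
  ## ... drop the unwanted rows/columns of 'chunk' here and write
  ## out (or accumulate) what is left ...
  if (nrow(chunk) < 100000) break   # last, short chunk
}
close(con)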
-Roy M.
On Feb 27, 2008, at 9:03 PM, Jorge Iván Vélez wrote:
On Wed, 27-Feb-2008 at 09:13PM -0800, Roy Mendelssohn wrote:
| I may be mistaken, but I believe R does all its work in memory. If
| that is so, you would really only have 2 options:
|
| 1. Get a lot of memory

But with a 32-bit operating system, 4 GB is all the memory that can be
addressed, and on 32-bit Windows a single process normally gets only
about 2 GB of that.
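On Windows you can query (and try to raise) the ceiling from within R.
A quick sketch; note that memory.limit() exists only on Windows builds
of R:

memory.limit()             # current limit, in MB
memory.limit(size = 3000)  # raising it past ~2 GB needs the /3GB boot
                           # switch on 32-bit Windows, or a 64-bit OS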