It does look like you've got a memory issue. Perhaps passing as.is=TRUE and/or stringsAsFactors=FALSE as optional arguments to read.table will help.
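For instance (just a sketch; "filename" here stands in for your actual file path, and header/sep are taken from your examples below):

    ## tell read.table up front not to convert character columns to factors
    x <- read.table(filename, header=FALSE, sep="\t",
                    as.is=TRUE, stringsAsFactors=FALSE)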
If you don't specify these sorts of things, R has to look through the file and figure out which columns are characters/factors etc., so larger files cause more of a headache for R, I'd guess. Hopefully someone else can comment further on this. I'd try toggling TRUE/FALSE for as.is and stringsAsFactors.

Do you have other objects loaded in memory as well? This file by itself might not be the problem; it could be a cumulative issue. Have you checked the file structure in any other manner? How large (MB/kB) is the file that you're trying to read? If you read in just parts of the file, is it okay?

    read.table(filename, header=FALSE, sep="\t", nrows=100)
    read.table(filename, header=FALSE, sep="\t", skip=20000, nrows=100)
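Putting those checks together, something like the following might help narrow things down (a rough sketch only; "filename" and the chunk size of 10000 are placeholders you'd adjust):

    ## how large is the file, in bytes?
    file.info(filename)$size

    ## anything else already taking up memory?
    ls()
    gc()

    ## step through the file in chunks to see where (if anywhere) reading fails
    chunk <- 10000
    n <- 0
    repeat {
        x <- tryCatch(read.table(filename, header=FALSE, sep="\t",
                                 skip=n, nrows=chunk,
                                 as.is=TRUE, stringsAsFactors=FALSE),
                      error=function(e) { print(e); NULL })
        if (is.null(x)) break                 ## end of file, or a bad region
        cat("rows", n + 1, "to", n + nrow(x), "read OK\n")
        if (nrow(x) < chunk) break
        n <- n + chunk
    }

If one chunk reads fine but another doesn't, that at least tells you roughly where in the file to look.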