I had a similar problem not long ago. My solution was to look at the definition of "write.table" and essentially do it by hand. The key steps are to create a character matrix that includes the dimnames (if desired), and then use "writeLines" to write it to a file.
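Something along these lines (a minimal, untested sketch; 'xxx' stands for the data frame to export and the output path is only an example):

    ## build a character matrix of the values
    m <- as.matrix(format(xxx))
    ## prepend the column names, if desired
    m <- rbind(colnames(m), m)
    ## paste each row into one tab-separated string and write the lines out
    lines <- apply(m, 1, paste, collapse = "\t")
    writeLines(lines, con = "C:/xxx.txt")

This still builds the whole character matrix in memory, so if allocation remains a problem you could open the file connection once and writeLines a block of rows at a time.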
My machine has 1GB as well, and my problem was a numeric matrix that was 5000 square, so you should have no problem.

Patrick Burns
Burns Statistics
[EMAIL PROTECTED]
+44 (0)20 8525 0696
http://www.burns-stat.com
(home of S Poetry and "A Guide for the Unwilling S User")

array chip wrote:

>Hi,
>
>I am having trouble exporting a large data frame out of R to be
>used for other purposes. The data frame is numeric with size
>17000x400. It also takes quite some time to start R. My computer
>has 1GB RAM. I used the following command to write the data frame
>to a text file and got the error message below:
>
>> write.table(xxx, "C:\\xxx", sep="\t",
>>   row.names=FALSE, col.names=FALSE, quote=FALSE)
>
>Error: cannot allocate vector of size 55750 Kb
>In addition: Warning message:
>Reached total allocation of 1023Mb: see help(memory.size)
>
>I tried to increase the memory limit with memory.size(size=), but
>running the above command still seems to take forever.
>
>What can I do about this error message to get the data out?
>
>Thanks
