Hi Manuel,

Check your memory limits (memory.limit() on Windows) and try starting R with the '--max-vsize=' option

Try also using a simpler netCDF library
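
If you do not need the whole grid in memory at once, the ncdf package also lets you read a variable in slices via the start/count arguments of get.var.ncdf(), which keeps the working set small. A minimal sketch, assuming a 2-D variable; the variable name and dimension layout here are guesses, so inspect names(nc$var) for your own file:

```r
## Sketch: process a large netCDF variable slice by slice with the
## 'ncdf' package, instead of loading the full array into memory.
library(ncdf)

nc <- open.ncdf("gridone.grd")
vname <- names(nc$var)[1]           # first variable in the file (assumption)
vsize <- nc$var[[vname]]$varsize    # dimension sizes, e.g. c(nx, ny)
nx <- vsize[1]
ny <- vsize[2]

## Read one y-slice (row of the grid) at a time
for (j in seq_len(ny)) {
    slice <- get.var.ncdf(nc, vname,
                          start = c(1, j),
                          count = c(nx, 1))
    ## ... work on 'slice' here, e.g. accumulate a running sum ...
}
close.ncdf(nc)
```

Each iteration then only holds nx values rather than the full nx * ny array, so peak memory stays far below the 1.8 GB allocation that failed.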


Antonio

> -----Original Message-----
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] on behalf of Manuel Gutierrez
> Sent: Friday, 21 January 2005 10:03
> To: r-help@stat.math.ethz.ch
> Subject: [R] memory and swap space in ncdf
> 
> 
> I have a Linux system with 2 GB of memory, which is not
> enough for reading a 446 MB netCDF file using ncdf:
> library(ncdf)
> ncold <- open.ncdf("gridone.grd")
> Error: cannot allocate vector of size 1822753 Kb
> 
> When I look at the free memory on my system I can see
> that none of the swap space is being used by R.
> I am a newbie to Linux and R; I've read the Memory
> help pages but still have some questions:
> can I use swap space in R to solve my problem of
> lack of memory?
> If not, is there any way to read the data other than
> buying more RAM?
> Thanks,
> M
> 
> ______________________________________________
> R-help@stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
