Hi list folks,

I have been trying to figure out how I can run the script included below. I have 1.2 gigs of memory on the computer, but this is not enough, given the size of the input datasets. The input matrix ('input') is a little under 2400 x 2900 cells, and is about 60 megs when stored on the hard drive...what is R doing that makes a file of this (small) size take up so much memory? The script uses three variables of this size, but I don't see why that would add up to over 1.2 gigs...Is the way I reference variables in the for-loops causing the problem, through some sort of memory leak? I had heard of memory-leak problems with version 1.5.1, but I am using 1.9.1. Alternatively, I wondered if it is just the way memory works in R, as I read a post earlier today suggesting that R requires 5-10 times as much memory as the input data to calculate anything meaningfully...
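For what it's worth, here is the back-of-the-envelope accounting I have been doing (just a sketch; it assumes 8-byte doubles and the dimensions above):

2400 * 2900 * 8 / 2^20    # one full-size numeric matrix: roughly 53 megs
object.size(input)        # actual size in bytes of the object read.table built
gc()                      # how much memory R is holding after a collection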

Thanks...Sam



bflow <- 2
pie <- 3.141592654    # R's built-in constant pi would do the same job

input <- read.table("xxx.txt", sep = "", header = FALSE, na.strings = "-9999")
input <- as.matrix(input)    # plain matrices are cheaper to index than data frames
rowsize <- dim(input)[1]
colsize <- dim(input)[2]

result <- matrix(NA, rowsize, colsize)

for (i in 1:rowsize) {
  for (j in 1:colsize) {

    if (!is.na(input[i, j])) {

      # kernel weight for every cell, centred on (i, j)
      probability <- matrix(NA, rowsize, colsize)

      for (p in 1:rowsize) {
        for (q in 1:colsize) {
          distance <- sqrt((p - i)^2 + (q - j)^2)
          probability[p, q] <- 2 / (bflow * pie * (1 + (distance / bflow)^2))
        }
      }

      # input/input is 1 where input is observed and NA where it is missing,
      # so kernsum is the total weight over the non-missing cells
      kernsum <- sum(probability * input / input, na.rm = TRUE)
      intmean <- sum(input * probability, na.rm = TRUE) / kernsum
      result[i, j] <- sum((input - intmean)^2 * probability, na.rm = TRUE) / kernsum
    }
  }
}
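For what it's worth, I gather the two inner loops could be replaced with a single vectorised call via outer(), which builds the whole distance matrix at once instead of filling it cell by cell; this is just a sketch along those lines, not something I have run on the full dataset:

# all squared distances from (i, j) in one step, then the kernel weights
distance <- sqrt(outer((1:rowsize - i)^2, (1:colsize - j)^2, "+"))
probability <- 2 / (bflow * pie * (1 + (distance / bflow)^2))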

