I am trying to make a predicted vegetation map using the predict()
function and am running into an issue with memory size.

Specifically, I am building a random forest classification model
("vegmap.rf") using the randomForest library, and then trying to use it
to construct a predicted map ("testvegmap.pred"):

          testvegmap.pred <- predict(vegmap.rf, veg)
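
For context, vegmap.rf is fitted along the following lines before the
predict() call above (the formula, predictor names, and training data
frame "trainveg" are placeholders here, not my actual variables):

          library(randomForest)

          ## fit the classifier on the training data
          ## (placeholder formula; the real model has more predictors)
          vegmap.rf <- randomForest(vegclass ~ elev + slope + aspect,
                                    data = trainveg, ntree = 500)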

When I try to run the predict() call, I get the error message:  "cannot
allocate vector of size 88.0Mb"

I have used the commands below to increase the memory limit to 4000 Mb
(the largest I can seemingly expand it to):

          memory.size(max=FALSE)
          memory.limit(size=4000)

Any suggestions?  Is my only option to reduce the size of the area I am
trying to make a predicted map of?
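
Or would predicting in blocks of rows and then combining the pieces be a
reasonable workaround?  Something roughly like this (untested sketch;
block.size is an arbitrary guess):

          block.size <- 50000                  # rows of 'veg' per block
          idx <- split(seq_len(nrow(veg)),
                       ceiling(seq_len(nrow(veg)) / block.size))
          pred.list <- lapply(idx, function(i)
                 as.character(predict(vegmap.rf, veg[i, , drop = FALSE])))
          ## character vector of predicted classes, one per row of 'veg'
          testvegmap.pred <- unlist(pred.list, use.names = FALSE)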

Thanks
Brad
