Kolder [EMAIL PROTECTED]
Cc: r-help@stat.math.ethz.ch; N.C. Onland-moret [EMAIL PROTECTED]
Sent: Monday, December 18, 2006 7:48:23 PM
Subject: RE: [R] Memory problem on a linux cluster using a large data set [Broadcast]
On Thu, 21 Dec 2006, Iris Kolder wrote:
> Thank you all for your help!
> So with all your suggestions we will try to run it on a computer with a
> 64-bit processor. But I've been told that the new R versions all work
> on a 32-bit processor. I read in other posts that only the old R
> versions
Section 8 of the Installation and Administration guide says that on
64-bit architectures the 'size of a block of memory allocated is
limited to 2^32-1 bytes' (roughly 4 GB).
The wording 'a block of memory' here is important, because this sets a
limit on any single allocation rather than on the total memory the R
session consumes.
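To see why one allocation can hit that limit long before total memory is exhausted, it helps to estimate the size of a block before requesting it. The dimensions below are made-up illustrative values, not Iris's actual data; the point is only that a numeric matrix is stored as one contiguous block of 8-byte doubles:

```r
## Hypothetical dimensions, for illustration only.
n.rows <- 500000
n.cols <- 2000

## matrix(0, n.rows, n.cols) would request one contiguous block of
## 8 bytes per double, i.e. roughly this many bytes:
bytes <- 8 * n.rows * n.cols
bytes / 2^30        # about 7.5 GiB in a single allocation

## Compare against the single-block limit quoted from the manual:
bytes < 2^32 - 1    # FALSE: this one allocation alone exceeds the limit
```

So even a machine with plenty of free RAM can fail on such an object, while the same data held as several smaller blocks would fit.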
In addition to my off-list reply to Iris (pointing her to an old post of
mine that detailed the memory requirement of RF in R), she might
consider the following:
- Use a larger nodesize
- Use sampsize to control the size of the bootstrap samples

Both of these have the effect of reducing the sizes of the trees grown,
and hence the memory needed to store the forest.
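As a sketch of how those two arguments are passed (nodesize and sampsize are real arguments of randomForest::randomForest, but the toy data and the specific values here are only illustrative and would need tuning on the real SNP data):

```r
library(randomForest)

## Toy stand-in data; the real (much larger) matrix would go here.
set.seed(1)
x <- matrix(rnorm(200 * 10), nrow = 200)
y <- factor(sample(c("case", "control"), 200, replace = TRUE))

## nodesize: stop splitting once a node holds this many cases,
##           giving shallower (and therefore smaller) trees.
## sampsize: draw bootstrap samples of only 100 rows per tree,
##           which also bounds how large each tree can grow.
rf <- randomForest(x, y, ntree = 100, nodesize = 50, sampsize = 100)
print(rf)
```

Larger nodesize and smaller sampsize both trade some predictive accuracy for a substantial reduction in the memory the forest occupies.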