Use "Rprof" on a small sample and determine where the time is being
spent.  Do some periodic gc() or memory.size() to see how fast you are
using up memory.  Do an object.size on all your objects to see see how
be they are.  This would help in the determination of the problem.
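For example, a minimal sketch along those lines (the commented-out mle2
call and the name "Lily_small" are just placeholders for your own fit on
a small subset of the data):

    ## profile a small run to see where the time goes
    Rprof("mle2_profile.out")
    # fit <- mle2(seedlings ~ dnbinom(mu = a, size = k),
    #             start = list(a = 10, k = 1),
    #             data = Lily_small)   # 'Lily_small' = small subset of your data
    Rprof(NULL)
    summaryRprof("mle2_profile.out")

    ## check memory consumption as you go
    gc()              # report (and trigger) garbage collection
    # memory.size()   # Windows only: MB currently in use

    ## how big is each object in the workspace?
    sort(sapply(ls(), function(x) object.size(get(x))), decreasing = TRUE)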

On Mon, Aug 31, 2009 at 7:01 AM, alexander russell<ssv...@gmail.com> wrote:
> Hello,
>
> After putting together interaction code that worked for a single pair of
> interactions, when I try to evaluate two pairs of interactions
> (flowers*gopher, flowers*rockiness), my computer runs out of memory, and the
> larger desktop I use just doesn't go anywhere after about 20 minutes.
>
> Is it really that big a calculation?
>
> to start:
>
> mle2(minuslogl = Lily_sum$seedlings ~ dnbinom(mu = a, size = k),
>      start = list(a = 10, k = 1))
> then:
> i2<-interaction(Lily_sum$flowers, Lily_sum$gopher)
>
> i3<-interaction(Lily_sum$flowers, Lily_sum$rockiness)
>
> mle2(Lily_sum$seedlings ~ dnbinom(mu = a, size = k), start = list(a = 10, k = 1),
>      parameters = list(a ~ i3 + i2 + Lily_sum$flowers))
>
> (the last run leads to a stalled calculation)
>
> regards,
>
> R
>
>        [[alternative HTML version deleted]]
>
> ______________________________________________
> R-help@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>



-- 
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem that you are trying to solve?

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
