Hi Berend,
Thank you for your reply.
2011/4/13 Berend Hasselman b...@xs4all.nl
Questions:
1. Why are you defining Bo within a loop?
2. Why are you doing library(nleqslv) within the loop?
Yes, I see what you mean. There's no reason to define those within the
loop.
On 14-04-2011, at 09:00, Kristian Lind wrote:
Hi Berend,
Thank you for your reply.
..
Finally, the likelihood function at the end of your code:
#Maximum likelihood estimation using the mle package
library(stats4)
#defining the log-likelihood function
#T <- length(v)
#minuslogLik <- ...
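The truncated snippet might continue along the following lines. This is a hedged sketch only: the data v, the Gaussian transition density, and the parameter names theta and sigma are toy stand-ins, not the poster's actual model.

```r
library(stats4)

set.seed(1)
v <- rnorm(200, mean = 0.1, sd = 0.2)   # toy data standing in for v
T <- length(v)

minuslogLik <- function(theta, sigma) {
  ## mle() minimizes, so return the *negative* log-likelihood
  -sum(dnorm(v, mean = theta, sd = sigma, log = TRUE))
}

fit <- mle(minuslogLik, start = list(theta = 0, sigma = 1))
summary(fit)
```

Note that stats4::mle() matches the names in start to the formal arguments of the minus-log-likelihood function, so the parameters must appear as named arguments rather than a single vector.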
Albyn and others,
Thank you for your replies.
In order to be more specific, I've constructed my program. I know it's long
and in some places quite messy. It works until the last part, where the
log-likelihood function has to be defined and maximized with respect to the parameters.
The log-likelihood has the
Questions:
1. Why are you defining Bo within a loop?
2. Why are you doing library(nleqslv) within the loop?
Doing both those statements outside the loop once is more efficient.
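Berend's point can be sketched as follows; the two-equation system f and its argument a are toy placeholders, not the poster's actual system.

```r
## Load the package and define constants once, before the loop.
library(nleqslv)

B0 <- c(1, 1)                            # starting values, hoisted out

## toy 2-equation system standing in for the poster's equations;
## extra arguments to nleqslv() are passed through to fn
f <- function(x, a) c(x[1]^2 + x[2] - a, x[1] - x[2])

for (t in 1:5) {
  sol <- nleqslv(B0, fn = f, a = t)      # only t-dependent work inside
}
```

Calling library() inside a loop re-runs the package check on every iteration, and redefining constants there repeats work whose result never changes.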
In your transdens function you are not using the function argument
parameters. Why?
Shouldn't there be a
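The issue Berend is pointing at can be illustrated like this; this transdens is a toy stand-in (a plain normal density), not the poster's function.

```r
## The body should use the `parameters` argument it receives,
## not a global object that happens to have the same name.
transdens <- function(parameters, v) {
  kappa <- parameters[1]   # unpack from the *argument*
  theta <- parameters[2]
  dnorm(v, mean = theta, sd = kappa)
}

transdens(c(0.5, 0.1), 0.2)
```

If the body ignores its argument and reads globals instead, an optimizer that varies `parameters` will keep getting the same density back.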
Hi there,
I'm trying to solve an ML problem where the likelihood function is a function
of two numerical procedures and I'm having some problems figuring out how to
do this.
The log-likelihood function is of the form L(c, psi) = (1/T) * sum_t [log f(c,
psi) - log g(c, psi)], where c is a 2xT matrix of
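The stated form can be written down directly in R. In this sketch, f_dens and g_dens stand in for the two numerical procedures (they are assumptions, not the poster's code), and c_mat is the 2 x T matrix of state variables.

```r
## L(psi) = (1/T) * sum_t [ log f(c_t, psi) - log g(c_t, psi) ]
loglik <- function(psi, c_mat, f_dens, g_dens) {
  T <- ncol(c_mat)
  terms <- vapply(seq_len(T), function(t)
    log(f_dens(c_mat[, t], psi)) - log(g_dens(c_mat[, t], psi)),
    numeric(1))
  mean(terms)                 # mean() is exactly (1/T) * sum()
}
```

Each call to loglik() re-runs the two numerical procedures T times, which is what makes the optimization expensive but not conceptually different from any other MLE.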
Hi Kristian
The obvious approach is to treat it like any other MLE problem: evaluation
of the log-likelihood is done as often as necessary by the optimizer
you are using, e.g. a call to optim(psi, LL, ...) where LL(psi) evaluates
the log-likelihood at psi. There may be computational
To clarify: suppose you knew that LL(psi + eps) were well approximated
by LL(psi) for the values of eps used to evaluate numerical
derivatives of LL.
I mean that the derivatives of LL(psi + eps) are close to the derivatives of LL(psi),
and perhaps you would want the Hessian to be close as well.
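The suggested call can be sketched as follows. The data, the log-likelihood LL, and the starting values psi0 are toy placeholders (a normal model with a log-scale parameter), not the poster's model; the shape of the optim() call is the point.

```r
set.seed(1)
x <- rnorm(100, mean = 2, sd = 1)       # toy data

## toy log-likelihood; psi = (mean, log sd) so sd stays positive
LL <- function(psi) sum(dnorm(x, mean = psi[1], sd = exp(psi[2]), log = TRUE))
psi0 <- c(0, 0)

## optim() minimizes by default, so hand it the negative log-likelihood
fit <- optim(par = psi0, fn = function(psi) -LL(psi),
             method = "BFGS", hessian = TRUE)
fit$par               # ML estimates
solve(fit$hessian)    # approximate asymptotic covariance of the estimates
```

With method = "BFGS" and no gradient supplied, optim() differentiates LL numerically, which is exactly where the sensitivity to eps discussed above comes in.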