Hello,
Inline.
At 21:30 on 27/04/20, J C Nash wrote:
After looking at MASS::fitdistr and fitdistrplus::fitdist, the latter seems
to have code to detect a (near-)singular Hessian, which is almost certainly
the "crash site" for this thread. Was that package tried in this work?
I tried it. I di
I agree with Mark that writing one's own code for this is a lot of work.
It's been a long time, but I vaguely remember Rvmminb computing gradients
(and possibly Hessians) subject to constraints. John can say more about
this, but if one is going to go through the anguish of creating a
fitdistr2, then you may want to have it call Rvmminb instead of whatever
is currently called.
I thought about this some more and realized my last suggestion is
unlikely to work.
Another possibility would be to create a new function to compute the
Hessian with a smaller step size, but I suspect there will be more
problems.
Possibly a much simpler approach would be to modify the source for
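The smaller-step-size idea above can be sketched in base R. This is an
illustrative helper only, not code from fitdistr() or any package: the
name num_hessian, the default step h, and the test function are my own.

```r
# A central-difference Hessian with a caller-chosen step size `h`.
# Purely a sketch of the idea; not what optim()/fitdistr() actually do.
num_hessian <- function(f, par, h = 1e-5) {
  n <- length(par)
  H <- matrix(NA_real_, n, n)
  for (i in seq_len(n)) {
    for (j in seq_len(n)) {
      ei <- numeric(n); ei[i] <- h
      ej <- numeric(n); ej[j] <- h
      # standard four-point central-difference formula for d2f/dxi dxj
      H[i, j] <- (f(par + ei + ej) - f(par + ei - ej) -
                  f(par - ei + ej) + f(par - ei - ej)) / (4 * h^2)
    }
  }
  (H + t(H)) / 2  # symmetrize to average out rounding noise
}

# Check on a quadratic, whose Hessian is known exactly:
# f(x, y) = x^2 + 3*x*y has Hessian rbind(c(2, 3), c(3, 0))
f <- function(p) p[1]^2 + 3 * p[1] * p[2]
num_hessian(f, c(1, 2))
```

Shrinking h helps only up to a point: below roughly the square root of
machine epsilon the rounding error in the differences dominates, which is
presumably the "more problems" anticipated above.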
> Dear Ms. Spurdle
I usually refer to myself as "He".
(But then, that's not the whole story...)
I'm not an expert on maximum likelihood approaches.
So, I apologize if the following suggestion is a poor one.
Does your likelihood function have a limit as alpha approaches zero (a limit of zero, say)?
If so, t
Peter is correct. I was about to reply when I saw his post.
It should be possible to suppress the Hessian call. I try to do this
generally in my optimx package, as computing the Hessian by finite
differences uses a lot more compute time than solving the optimization
problem that precedes the usual
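A minimal sketch of the suppression being described, assuming one is
willing to bypass fitdistr() entirely: call optim() directly with
hessian = FALSE, so no finite-difference Hessian is ever evaluated at or
near the boundary. The gamma model, simulated data, and starting values
below are my own illustration, not the poster's actual problem.

```r
# Illustrative only: fit a gamma by direct optim() with hessian = FALSE.
set.seed(1)
x <- rgamma(200, shape = 2, rate = 1)

# negative log-likelihood; p = c(shape, rate)
negll <- function(p) -sum(dgamma(x, shape = p[1], rate = p[2], log = TRUE))

fit <- optim(c(1, 1), negll,
             method  = "L-BFGS-B",
             lower   = c(1e-8, 1e-8),  # keep parameters strictly positive
             hessian = FALSE)          # suppress the post-fit Hessian
fit$par
```

The price is that without a Hessian there are no standard errors; those
would have to be obtained separately (e.g. by a bounded-aware
finite-difference Hessian, or profiling).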
The optim() function has trouble calculating derivatives at or near the
boundary, because it uses a simplistic finite-difference formula centered
on the parameter. optimx::optimr() may work better.
-pd
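To see concretely why a centered difference fails at a bound, here is an
illustrative toy (not optim()'s actual internals): with the parameter
sitting essentially on the lower bound, the formula evaluates the
objective below the bound, where a gamma log-likelihood is undefined.

```r
# Toy negative log-likelihood with three made-up data points.
negll <- function(a) -sum(dgamma(c(1, 2, 3), shape = a, rate = 1, log = TRUE))

lower <- 0
a <- 1e-8   # parameter essentially on the lower bound
h <- 1e-4   # a typical finite-difference step size

# A centered difference needs negll(a + h) AND negll(a - h),
# but a - h is below `lower`, where the shape parameter is invalid:
c(a + h, a - h)
suppressWarnings(negll(a - h))  # NaN: shape must be positive
```

A one-sided (forward) difference, or a step shortened to stay inside the
bounds, avoids this; that is essentially the distinction between the
centered formula above and what a bounds-aware optimizer can do.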
> On 26 Apr 2020, at 09:02 , Abby Spurdle wrote:
>
> I ran your example.
> It's possible t
Dear Ms. Spurdle,
Thanks for looking into this. I think that you are correct that it
is a problem with the Hessian calculation. It seems that fitdistr()
explicitly sets hessian = TRUE, with no possibility of opting out.
It also seems that optim() ignores the "lower" argument when computing
the Hessian.
fitdistr() computes a Hessian matrix.
I think optim() ignores the lower bound when computing the Hessian.
Here's the result after removing the Hessian and Hessian-dependent info:
> str (fit)
List of 3
$ estimate: Named num [1:2] 0.000149 1.0797684972
..- attr(*, "names")= chr [1:2] "alpha" "beta"
I ran your example.
It's possible that it's another bug in the optim function.
Here's the optim call (from within fitdistr):
stats::optim(x = c(1, 4, 1, 2, 3, 1, 1, 1, 2, 2, 2, 2, 1, 1,
1, 1, 1, 4, 4, 3, 1, 2, 2, 1, 1, 3, 1, 1, 1, 4, 1, 1, 1, 1, 1, #more lines...
1, 4, 1, 1, 1, 5, 5, 5, 4, 5, 2,
I haven't run your example.
I may try tomorrow-ish if no one else answers.
But one question: Are you sure the "x" and "i" are correct in your function?
It looks like a typo...
On Sun, Apr 26, 2020 at 2:14 PM Rolf Turner wrote:
For some reason fitdistr() does not seem to be passing on the "..."
argument "lower" to optim() in the proper manner, and as a result
falls over.
Here is my example; note that the data are attached in the file "x.txt".
dhse <- function(i, alpha, beta, topn) {
    x <- seq(0, 1, length = topn + 2)[-c(1, topn + 2)]