Hello:

Below is a toy logistic regression problem: x is the covariate and y is the
number of successes out of n binomial trials. My own Newton-Raphson code
converges in three iterations using the gradient, the Hessian, and the
starting values given below, but I can't get nlm() to work. I would much
appreciate any help.

> x
[1] 10.2  7.7  5.1  3.8  2.6
> y
[1] 9 8 3 2 1
> n
[1] 10  9  6  8 10
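
For comparison, here is the same model fit with glm(), which I assume
should reproduce the estimates from my Newton-Raphson code:

## cross-check via R's built-in binomial GLM
fit <- glm(cbind(y, n - y) ~ x, family = binomial)
coef(fit)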


## Negative log-likelihood for the binomial logistic regression model,
## with the analytic gradient and Hessian attached as attributes in the
## form nlm() expects.
derfs4 <- function(b, x, y, n)
{
    b0  <- b[1]
    b1  <- b[2]
    eta <- b0 + b1 * x            # linear predictor
    d   <- exp(eta)
    p   <- d / (1 + d)            # fitted probabilities
    e   <- d / (1 + d)^2          # p * (1 - p)
    f   <- -sum(log(choose(n, y)) - n * log(1 + d) + y * eta)
    attr(f, "gradient") <- c(-sum(y - n * p), -sum(x * (y - n * p)))
    attr(f, "hessian")  <- matrix(c(sum(n * e),     sum(n * x * e),
                                    sum(n * x * e), sum(n * x^2 * e)),
                                  2, 2)
    f
}
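
For reference, this is essentially the Newton-Raphson loop that converged
in three iterations for me (a minimal sketch using the gradient and
Hessian that derfs4() attaches):

## Newton step on the negative log-likelihood: b <- b - solve(H, g)
b <- c(-3.9, 0.64)
for (i in 1:25) {
    fb   <- derfs4(b, x, y, n)
    g    <- attr(fb, "gradient")
    H    <- attr(fb, "hessian")
    step <- solve(H, g)
    b    <- b - step
    if (max(abs(step)) < 1e-8) break
}
b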


> nlm(derfs4, c(-3.9,.64), hessian=T, print.level=2, x=x, y=y, n=n)
Error in choose(n, y) : argument "n" is missing, with no default
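
One guess on my part (unconfirmed): perhaps `n` is being partially
matched to nlm()'s own ndigit argument instead of being passed through
to my function. Renaming the sample-size argument would sidestep any
such clash:

## hypothetical wrapper with the sample-size argument renamed so it
## cannot collide with nlm()'s own argument names (ndigit, etc.)
derfs5 <- function(b, x, y, size) derfs4(b, x, y, n = size)
nlm(derfs5, c(-3.9, 0.64), hessian = TRUE, print.level = 2,
    x = x, y = y, size = n)
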
I tried a variety of other ways too. When I did get it to run, it failed
to converge within 100 iterations, much like the fgh example on the nlm
help page.
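
(Those runs used the default iterlim of 100; I assume raising it is one
obvious thing to try, e.g.

nlm(derfs5, c(-3.9, 0.64), hessian = TRUE, print.level = 2,
    x = x, y = y, size = n, iterlim = 500)

though that should hardly be needed if the analytic derivatives are right.)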

Mervyn

