Hi, All: What tools exist for diagnosing singular gradient problems with 'nls'? Consider the following toy example:
DF1 <- data.frame(y=1:9, one=rep(1,9))
nlsToyProblem <- nls(y~(a+2*b)*one, DF1, start=list(a=1, b=1),
                 control=nls.control(warnOnly=TRUE))
Error in nlsModel(formula, mf, start, wts) :
 singular gradient matrix at initial parameter estimates

This example is obviously stupid, but other singular gradient problems are not so obvious. If we transfer this problem to 'optim', we can get diagnostics from an eigen analysis of the Hessian:
dumfun <- function(x, y, one){
 # residual sum of squares for y ~ (a + 2*b)*one, with x = c(a, b)
 d <- y-(x[1]+2*x[2])*one
 sum(d^2)
}
optimToyProblem <- optim(c(a=1, b=1), dumfun, hessian=TRUE,
                        y=DF1$y, one=DF1$one)
eigen(optimToyProblem$hessian, symmetric=TRUE)
$values
[1]  9.000000e+01 -7.105427e-10

$vectors
         [,1]       [,2]
[1,] 0.4472136 -0.8944272
[2,] 0.8944272  0.4472136

The smallest eigenvalue is essentially zero relative to the largest, confirming the 'singular gradient' message. The corresponding eigenvector helps diagnose the problem: adding (-0.9, 0.45)*z to any solution gives another, equally good solution, for any z. I've used this technique to diagnose many subtle convergence problems with 'optim'. Are tools of this nature available for 'nls'?
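One partial workaround (a sketch, not a built-in 'nls' tool, and it assumes the model formula is one that 'deriv' can differentiate symbolically) is to build the Jacobian of the model function at the starting values yourself and look at its singular values, analogous to the eigen analysis of the Hessian above:

```r
## Sketch: pre-check the nls model's Jacobian at the starting values.
## For the toy model y ~ (a + 2*b)*one, the columns of the Jacobian
## are the partial derivatives w.r.t. a and b.
DF1 <- data.frame(y = 1:9, one = rep(1, 9))
start <- list(a = 1, b = 1)

## symbolic derivatives of (a + 2*b)*one w.r.t. a and b
g <- deriv(y ~ (a + 2*b)*one, c("a", "b"))
J <- attr(eval(g, c(DF1, start)), "gradient")

svd(J)$d
## the second singular value is numerically zero, confirming the rank
## deficiency; the corresponding right singular vector svd(J)$v[, 2]
## plays the same diagnostic role as the Hessian eigenvector above
```

For model functions 'deriv' cannot handle, 'numericDeriv' can in principle supply the same gradient matrix numerically, at the cost of some care in setting up the evaluation environment.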
     Thanks,
     Spencer Graves

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
