We should probably document this in the main README for Optim. -- John
On Feb 5, 2014, at 2:16 PM, Andreas Noack Jensen <[email protected]> wrote:

> In the Calculus.jl package, there is a hessian function where you can stick
> in your likelihood and the estimate.
>
> 2014-02-05 Bradley Fay <[email protected]>:
>
>> I'm trying to figure out how to compute/recover the standard errors of the
>> estimates for a simple linear regression using MLE and simulated data. Here
>> is the code I'm using to generate the data, compute the likelihood function,
>> and optimize it.
>>
>> using Optim
>> using Distributions
>>
>> xmat = [ones(10000,1) rand(Normal(2,1),10000)]; # matrix of independent variables
>> u = rand(Normal(0,1),10000);  # normally distributed errors, mean 0, std 1
>> ymat = xmat*[3,2] + u;        # dependent variable
>>
>> start_val = [1.0,1.0,1.0];    # starting values for MLE; need to be floats
>>
>> # Negative log-likelihood
>> function ll(param)
>>     b1 = param[1]
>>     b2 = param[2]
>>     sig = param[3]
>>     ee = ymat - xmat*[b1,b2]
>>     loglik = -0.5*log(2*pi*sig^2) - 0.5*(ee.^2)/sig^2
>>     -sum(loglik)
>> end
>>
>> optimize(ll, start_val)
>>
>> I'm more familiar with Gauss and MATLAB, as I just started playing with
>> Julia yesterday. To compute the standard errors in the identical situation
>> in MATLAB, I can directly compute the square root of the diagonal of the
>> inverse of the numerical Hessian, because MATLAB will return the Hessian:
>>
>> mlese = sqrt(diag(inv(hessian))); % MATLAB code for computing standard errors
>>
>> Is there a way to do this in Julia? I'm having trouble figuring this out.
>> Thanks!
>
> --
> Best regards
>
> Andreas Noack Jensen
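For the archive, Andreas's suggestion can be sketched end to end. This is a minimal sketch rather than the poster's exact code: it assumes Optim.jl, Calculus.jl, and Distributions.jl are installed, and it uses current API names (`Optim.minimizer`, `Calculus.hessian`), which may differ from what was current in 2014.

```julia
using Optim, Distributions, Calculus, LinearAlgebra

# Simulate data as in the original post
n = 10_000
xmat = [ones(n) rand(Normal(2, 1), n)]          # design matrix: intercept + one regressor
ymat = xmat * [3.0, 2.0] + rand(Normal(0, 1), n)

# Negative log-likelihood of the normal linear model
function ll(param)
    b1, b2, sig = param
    ee = ymat - xmat * [b1, b2]
    -sum(@. -0.5 * log(2pi * sig^2) - 0.5 * ee^2 / sig^2)
end

res = optimize(ll, [1.0, 1.0, 1.0])             # Nelder-Mead by default
mle = Optim.minimizer(res)                      # point estimates [b1, b2, sig]

# Numerical Hessian of the negative log-likelihood at the optimum;
# it estimates the observed information matrix, so the square roots of
# the diagonal of its inverse are the asymptotic standard errors.
H = Calculus.hessian(ll, mle)
mlese = sqrt.(diag(inv(H)))
```

The last line is the same `sqrt(diag(inv(hessian)))` recipe from the MATLAB question, applied to the Hessian that Calculus.jl computes by finite differences.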
