Got it. Thanks!
On Wednesday, February 5, 2014 3:16:47 PM UTC-7, Andreas Noack Jensen wrote:
>
> In the Calculus.jl package, there is a hessian function where you can
> stick in your likelihood and the estimate.
>
> 2014-02-05 Bradley Fay <[email protected]>:
>
>> I'm trying to figure out how to compute/recover the standard errors of
>> the estimates for a simple linear regression using MLE and simulated
>> data. Here is the code I'm using to generate the data, compute the
>> likelihood function, and optimize it.
>>
>> using Optim
>> using Distributions
>>
>> xmat = [ones(10000,1) rand(Normal(2,1),10000)]; # matrix of independent variables
>> u = rand(Normal(0,1),10000);                    # normally distributed errors, mean 0, std 1
>> ymat = xmat*[3,2] + u;                          # dependent variable
>>
>> start_val = [1.0,1.0,1.0]; # starting values for MLE estimation; need to be floats
>>
>> # Likelihood function
>> function ll(param)
>>     b1 = param[1]
>>     b2 = param[2]
>>     sig = param[3]
>>     ee = ymat - xmat*[b1,b2]
>>     loglik = -0.5*log(2*pi*sig^2) - 0.5*(ee.^2)/sig^2
>>     -sum(loglik)
>> end
>>
>> optimize(ll, start_val)
>>
>> I'm more familiar with Gauss and MATLAB, as I just started playing with
>> Julia yesterday. To compute the standard errors in the identical
>> situation in MATLAB, I can directly compute the square root of the
>> diagonal of the inverse of the numerical Hessian, because MATLAB will
>> return the Hessian:
>>
>> mlese = sqrt(diag(inv(hessian))); % MATLAB code for computing standard errors
>>
>> Is there a way to do this in Julia? I'm having trouble figuring this
>> out. Thanks!
>>
>
> --
> Kind regards
>
> Andreas Noack Jensen
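For anyone finding this thread later, here is a minimal sketch of Andreas's suggestion put together with Bradley's model: minimize the negative log-likelihood, then feed the same function and the optimum into `Calculus.hessian` and invert, exactly like the MATLAB one-liner. This assumes a current Optim.jl (where the solution is retrieved with `Optim.minimizer`) and a Julia version where `diag`/`inv` live in `LinearAlgebra`; older versions will need slightly different syntax.

```julia
using Optim, Distributions, Calculus, LinearAlgebra

# Simulate the data as in the original post.
xmat = [ones(10000) rand(Normal(2, 1), 10000)]  # design matrix
u    = rand(Normal(0, 1), 10000)                # N(0,1) errors
ymat = xmat * [3.0, 2.0] + u

# Negative log-likelihood of the normal linear model.
function ll(param)
    b1, b2, sig = param
    ee = ymat - xmat * [b1, b2]
    loglik = @. -0.5 * log(2 * pi * sig^2) - 0.5 * ee^2 / sig^2
    return -sum(loglik)
end

res = optimize(ll, [1.0, 1.0, 1.0])
mle = Optim.minimizer(res)

# Since ll is the NEGATIVE log-likelihood, its Hessian at the MLE is the
# observed information matrix, so the standard errors are the square roots
# of the diagonal of its inverse -- the same recipe as the MATLAB code.
H  = Calculus.hessian(ll, mle)
se = sqrt.(diag(inv(H)))
```

With 10,000 observations the estimates should land close to the true values (3, 2, 1), and the slope's standard error close to the analytic `sig^2 * inv(X'X)` value.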
