Hello,

I'm using ridge regression and I have a problem finding the variance
of the constant coefficient. So, I have the model
y = b_0 + b_1*x_1 + ... + b_k*x_k + e
and I "standardize" it in order to use ridge regression:
y_(st) = beta_1*x_(st),1 + ... + beta_k*x_(st),k + e_(st)
The variances of the betas can be found from the diagonal of
s^2 * (R + r*I)^(-2) * R,
where s^2 is the estimate of the variance of e_(st), R is the
correlation matrix (which is equal to (X_(st))^T * X_(st)) and r is
the constant I use for the ridge regression.
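As a minimal sketch of this step in Python/NumPy (the data and the choice r = 0.1 are made up for illustration; s^2 is estimated from the standardized residuals, which is one common convention):

```python
import numpy as np

# Hypothetical example data, just to make the sketch runnable
rng = np.random.default_rng(0)
n, k = 50, 3
X = rng.normal(size=(n, k))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=n)

# Standardize so that X_st^T X_st equals the correlation matrix R
Xc = X - X.mean(axis=0)
X_st = Xc / np.sqrt((Xc**2).sum(axis=0))
yc = y - y.mean()
y_st = yc / np.sqrt((yc**2).sum())

r = 0.1                                  # ridge constant (illustrative value)
R = X_st.T @ X_st                        # correlation matrix
beta = np.linalg.solve(R + r * np.eye(k), X_st.T @ y_st)

resid = y_st - X_st @ beta
s2 = resid @ resid / (n - k)             # estimate of Var(e_st)

# Var(beta) from the diagonal of s^2 * (R + r I)^(-2) * R
A = np.linalg.inv(R + r * np.eye(k))
var_beta = s2 * np.diag(A @ A @ R)
```

The diagonal entries of `var_beta` are the variances of the standardized coefficients beta_1, ..., beta_k.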
Now I can transform back and calculate the variance of b_i, i = 1,...,k,
from
var[b_i] = var[beta_i] * (s_y)^2 / (s_i)^2,
with (s_y)^2 = sum((y_j)^2) - n*(mean(y))^2, and (s_i)^2 the analogous
sum for x_i.
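The back-transformation above is just an elementwise rescaling; a tiny sketch (all numbers here are hypothetical placeholders, not results from the fit):

```python
import numpy as np

# Hypothetical inputs: in practice var_beta comes from the standardized
# ridge fit, and s_y2, s_i2 from the raw data sums of squares.
var_beta = np.array([0.010, 0.020, 0.015])   # Var(beta_i)
s_y2 = 120.0                                  # sum(y_j^2) - n*mean(y)^2
s_i2 = np.array([30.0, 45.0, 25.0])           # analogous sum for each x_i

# var[b_i] = var[beta_i] * (s_y)^2 / (s_i)^2
var_b = var_beta * s_y2 / s_i2
```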
But how do I find the variance of b_0?

Thanks, and sorry for my bad English,
Lydia
=================================================================
Instructions for joining and leaving this list, remarks about the
problem of INAPPROPRIATE MESSAGES, and archives are available at:
.                  http://jse.stat.ncsu.edu/                    .
=================================================================