In article <araj91$8iu$[EMAIL PROTECTED]>,
C R <[EMAIL PROTECTED]> wrote:
>Hi

>I have a problem where I am finding that I cannot invert covariance matrices
>because they are ill-conditioned. I have been told that I can 'condition'
>these matrices by adding to their diagonals. My questions are:
>i) what should I add (do I add the same to each element of the diagonal, or
>different quantities and in such a case how do I decide)?
>ii) when do I decide to perform such conditioning (e.g. is there a common
>rule of thumb regarding the condition number etc.)?

>Many thanks in advance.


Covariance matrices tend to be ill-conditioned.  Also, the
usual condition number is not always the relevant measure:
it can be quite large even for matrices that are perfectly
adequate for the purpose at hand.
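Purely as an illustration (the NumPy calls and the choice of eps are mine, not from the post), here is how one can check conditioning and apply the diagonal-loading "conditioning" the questioner describes:

```python
import numpy as np

# Build a nearly singular covariance matrix from near-collinear data
# (any strongly correlated columns behave similarly).
rng = np.random.default_rng(0)
x = rng.standard_normal(200)
data = np.column_stack([x,
                        x + 1e-6 * rng.standard_normal(200),
                        rng.standard_normal(200)])
S = np.cov(data, rowvar=False)
print(np.linalg.cond(S))          # very large: S is ill-conditioned

# "Condition" S by adding a small constant to every diagonal entry
# (diagonal loading).  The scale of eps here is an ad hoc choice.
eps = 1e-6 * np.trace(S) / S.shape[0]
S_loaded = S + eps * np.eye(S.shape[0])
print(np.linalg.cond(S_loaded))   # much smaller
```

Adding the same constant to each diagonal element shifts every eigenvalue up by eps, which bounds the condition number without changing the eigenvectors.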

The procedure you seem to be referring to is ridge regression.
There are lots of "rules of thumb" in use.  The general 
Bayesian formulation is that if one has the regression

        y = X*\beta + u,

and \beta has the prior mean 0 and covariance matrix T, the
Bayes estimate with quadratic loss is to estimate \beta by

        b = (X'X + T^{-1})^{-1} X'y.
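A minimal numerical sketch of this Bayes/ridge estimate, assuming unit error variance and an isotropic prior T = tau^2 * I (tau and the simulated data are made up for illustration):

```python
import numpy as np

# Simulate y = X*beta + u, then compute b = (X'X + T^{-1})^{-1} X'y
# with prior covariance T = tau^2 * I, so T^{-1} = (1/tau^2) * I.
rng = np.random.default_rng(1)
n, p = 100, 5
X = rng.standard_normal((n, p))
beta = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ beta + 0.1 * rng.standard_normal(n)

tau = 10.0
T_inv = (1.0 / tau**2) * np.eye(p)

# Solve the linear system rather than forming an explicit inverse.
b = np.linalg.solve(X.T @ X + T_inv, X.T @ y)
print(b)
```

With T = tau^2 * I this reduces to ordinary ridge regression with penalty 1/tau^2; a diffuse prior (large tau) recovers least squares.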



-- 
This address is for information only.  I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Department of Statistics, Purdue University
[EMAIL PROTECTED]         Phone: (765)494-6054   FAX: (765)494-0558