Trying a second time, without links, to see if I'm still flagged as a spam sender.

josef

On Mon, Oct 22, 2012 at 11:30 AM,  <josef.p...@gmail.com> wrote:
> On Mon, Oct 22, 2012 at 10:51 AM, federico vaggi
> <vaggi.feder...@gmail.com> wrote:
>> Josef,
>>
>> could you explain that in slightly more detail?  I'm afraid I'm not
>> familiar with the literature you are referencing at all.
>
> I worked on this last winter and don't remember which papers I looked at.
> A Google search shows this as the latest:
> journals.cambridge.org/action/displayAbstract?fromPage=online&aid=8136352
> One of the early papers is Trenkler:
> www.sciencedirect.com/science/article/pii/0304407684900459
>
> standard Ridge estimate is
>
> beta = inv(X' X + k * eye) X' Y
>
> If the errors are correlated, N(0, Sigma), and W = inv(Sigma), then
> beta = inv(X' W X + k * eye) X' W Y
>
> If W is positive definite, then there is a T such that T'T = W, and we
> can transform
> X_star = T X
> y_star = T y
>
> and we are back to the identity error covariance matrix with the
> transformed variables, and we can use the original formulas.
>
> The transformation is just the basic idea behind weighted and
> generalized least squares.
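>
> A rough numpy sketch of that equivalence (toy data, and the Cholesky
> factor as one possible choice of T, are just my own illustration):
>
> import numpy as np
>
> rng = np.random.RandomState(0)
> n, p, k = 50, 3, 0.1
> X = rng.randn(n, p)
> y = rng.randn(n)
> A = rng.randn(n, n)
> Sigma = A.dot(A.T) + n * np.eye(n)   # some positive definite error covariance
> W = np.linalg.inv(Sigma)
>
> # weighted ridge: beta = inv(X' W X + k * eye) X' W y
> beta_w = np.linalg.solve(X.T.dot(W).dot(X) + k * np.eye(p),
>                          X.T.dot(W).dot(y))
>
> # whiten with T such that T' T = W, here T = cholesky(W)'
> T = np.linalg.cholesky(W).T
> X_star, y_star = T.dot(X), T.dot(y)
> beta_star = np.linalg.solve(X_star.T.dot(X_star) + k * np.eye(p),
>                             X_star.T.dot(y_star))
>
> print(np.allclose(beta_w, beta_star))   # True: both give the same estimate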
>
>>
>> If you have some code I could take a look at, that would probably be
>> the easiest thing for me.
>
> My code is too messy to show, and I never checked whether that case
> actually works correctly.
>
> Josef
>
>>
>> Thanks!
>>
>> Federico
>>
>> On Mon, Oct 22, 2012 at 4:40 PM,  <josef.p...@gmail.com> wrote:
>>> On Mon, Oct 22, 2012 at 10:33 AM,  <josef.p...@gmail.com> wrote:
>>>> On Mon, Oct 22, 2012 at 10:05 AM, federico vaggi
>>>> <vaggi.feder...@gmail.com> wrote:
>>>>> Hi Gael,
>>>>>
>>>>> I took the time to dig a little bit, and found some MATLAB code
>>>>> written by Diego di Bernardo.
>>>>>
>>>>> http://dibernardo.tigem.it/wiki/index.php/Network_Inference_by_Reverse-engineering_NIR
>>>>>
>>>>> He has a closed-form result (the formula is in calc_cov.m), but the
>>>>> formula is really weird, and I can't figure out where it comes from.
>>>>>
>>>>> function covA = calc_cov(A,X,sX,P,sP,RIDGE,W);
>>>>>
>>>>> % covA = calc_cov(A,X,sX,P,sP [, RIDGE, W]);
>>>>> % X,sX,P,SP are N x M, where N=number of genes, M=number of expts.
>>>>> % RIDGE is an optional ridge regression parameter
>>>>> % W is an optional weight parameter.
>>>>>
>>>>> [rows,N]=size(A);
>>>>> covA=zeros(N,N,rows);
>>>>> [N,M] = size(X);
>>>>>
>>>>> if 1~=exist('RIDGE')
>>>>>     RIDGE = 0;
>>>>> end
>>>>> if 1~=exist('W')
>>>>>     W = eye(M);
>>>>> end
>>>>>
>>>>> Q = W*W';
>>>>>
>>>>> for g=1:rows
>>>>>     idx = find(A(g,:)~=0);
>>>>>     vEta = sP(g,:).^2 + A(g,:).^2 * sX.^2;
>>>>> %     vEta = sP(g,g).^2 + A(g,:).^2 * sX.^2;  % Diego's way
>>>>>     Z=X(idx,:);
>>>>>     T=inv(Z*Q*Z'+RIDGE*eye(length(idx)))*Z*Q';
>>>>>     covA(idx,idx,g) = T*diag(vEta)*T';
>>>>> end
>>>>>
>>>>> I'm not really sure how exactly the formula was derived though.
>>>>
>>>> I don't know about L1 penalization, but for ridge there is a
>>>> literature on ridge regression with heteroscedastic or autocorrelated
>>>> errors, or a general error covariance matrix.
>>>>
>>>> W looks like it's just a weight matrix combined with ridge, and Q is
>>>> the error covariance.
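>>>>
>>>> My guess at the derivation (not checked against any reference): each
>>>> row's estimate is linear in the data, with the linear map
>>>> T = inv(Z*Q*Z' + RIDGE*eye)*Z*Q, so plain error propagation with
>>>> Var(eta) = diag(vEta) gives cov = T*diag(vEta)*T', which is exactly
>>>> the covA line in the loop. In numpy that inner step would look
>>>> roughly like this (function and variable names are mine):
>>>>
>>>> import numpy as np
>>>>
>>>> def cov_one_row(Z, v_eta, Q, ridge=0.0):
>>>>     # Z: (n_idx, M) regressors, v_eta: (M,) error variances, Q: (M, M)
>>>>     n_idx = Z.shape[0]
>>>>     # T = inv(Z Q Z' + ridge*I) Z Q, the linear map from data to estimate
>>>>     T = np.linalg.solve(Z.dot(Q).dot(Z.T) + ridge * np.eye(n_idx),
>>>>                         Z.dot(Q))
>>>>     # error propagation: cov = T diag(v_eta) T'
>>>>     return T.dot(np.diag(v_eta)).dot(T.T)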
>>>
>>> Also, calculating Q is not very efficient in this case.
>>> If you know W, then you can just compute new data by whitening,
>>> i.e. transforming x_new = W * X, for the estimation.
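>>>
>>> Roughly like this (just a sketch with toy data; here W is the
>>> whitening/weight matrix itself, so neither Q nor any inverse ever
>>> needs to be formed, and any plain ridge implementation can be used
>>> on the transformed data):
>>>
>>> import numpy as np
>>> from sklearn.linear_model import Ridge
>>>
>>> rng = np.random.RandomState(0)
>>> n_samples, n_features = 40, 5
>>> X = rng.randn(n_samples, n_features)
>>> y = rng.randn(n_samples)
>>> # W is assumed known, e.g. a diagonal matrix of per-sample weights
>>> W = np.diag(1.0 / (1.0 + rng.rand(n_samples)))
>>>
>>> # whiten once, then fit an ordinary ridge on the transformed data
>>> X_new, y_new = W.dot(X), W.dot(y)
>>> coefs = Ridge(alpha=0.5, fit_intercept=False).fit(X_new, y_new).coef_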
>>>
>>> Josef
>>>
>>>>
>>>> Your problem is a bit different from standard Ridge regression because
>>>> you have several dependent/outcome variables.
>>>>
>>>> Josef
>>>>
>>>>
>>>>>
>>>>> Note - while the code is free to download, he specifically says that
>>>>> he doesn't want the code used for commercial purposes without
>>>>> permission.  I presume reproducing it is ok?
>>>>>
>>>>> Federico
>>>>>
>>>>>
>>>>>
>>>>> On Mon, Oct 22, 2012 at 3:34 PM, Gael Varoquaux
>>>>> <gael.varoqu...@normalesup.org> wrote:
>>>>>> On Fri, Oct 19, 2012 at 01:09:21PM +0200, federico vaggi wrote:
>>>>>>> Assuming that X and B are experimentally measured values with
>>>>>>> uncertainties, what's the correct way to transfer that uncertainty to
>>>>>>> A?
>>>>>>
>>>>>> To my knowledge there is no theoretical/closed-form result. I would
>>>>>> rely on the bootstrap:
>>>>>> http://en.wikipedia.org/wiki/Bootstrapping_%28statistics%29
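>>>>>>
>>>>>> Something along these lines (a crude pairs bootstrap on toy data;
>>>>>> all names and settings are arbitrary):
>>>>>>
>>>>>> import numpy as np
>>>>>> from sklearn.linear_model import Ridge
>>>>>>
>>>>>> rng = np.random.RandomState(0)
>>>>>> n_samples, n_features = 40, 5
>>>>>> X = rng.randn(n_samples, n_features)
>>>>>> y = X.dot(rng.randn(n_features)) + 0.1 * rng.randn(n_samples)
>>>>>>
>>>>>> boot_coefs = []
>>>>>> for _ in range(1000):
>>>>>>     # resample observations with replacement, refit, store coefficients
>>>>>>     idx = rng.randint(0, n_samples, n_samples)
>>>>>>     boot_coefs.append(Ridge(alpha=0.5).fit(X[idx], y[idx]).coef_)
>>>>>> coef_std = np.array(boot_coefs).std(axis=0)  # uncertainty on each coefficient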
>>>>>>
>>>>>> G
>>>>>>
>>>>>
>>>
>>
