Dear Mike,

You are absolutely right, and the amount of covariance is another element that 
is of interest to me. In broad terms, I was thinking of relating the covariation 
back to the initial shape variation to obtain a magnitude measure through its 
predictive percentage of shape variation, but I want and need to think more 
about how to go about it (and about its validity!).
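
To make that idea concrete for myself, below is a minimal, hypothetical R 
sketch (the variable names and the ratio itself are my own illustration, not a 
validated method) that relates the overall covariance between blocks, i.e. the 
summed squared singular values as in your example further down, to the total 
variance of one block:

```r
# Hypothetical sketch (not a validated method): express overall
# cross-block covariance relative to the total variance of one block.
set.seed(1)
n <- 30; p <- 4
X <- scale(matrix(rnorm(n * p), n, p), scale = FALSE)  # block 1, e.g. shape
Y <- scale(matrix(rnorm(n * p), n, p), scale = FALSE)  # block 2

d2 <- svd(crossprod(X, Y) / (n - 1))$d^2  # squared singular values
total.cov <- sum(d2)                      # overall covariance between blocks
total.var.X <- sum(diag(cov(X)))          # total variance of block 1

total.cov / total.var.X  # candidate magnitude measure
```

Whether such a ratio is a defensible magnitude measure (it is not symmetric in 
the two blocks, and it is scale-dependent) is exactly what I still need to 
think through.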

Thank you for helping me; your example is very useful.

Have a lovely day,
Katrien


 

> On 14 May 2020, at 13:09, Mike Collyer <[email protected]> wrote:
> 
> Katrien,
> 
> Even though one can calculate the fractional covariance as Philipp described, 
> one has to consider whether there is much overall covariance between the 
> matrices.  The following is an example, in R, of how the fractional 
> covariances based on squared singular values can be deceiving.
> 
> > set.seed(22)
> > n <- 20
> > p <- 3
> > X <- matrix(rnorm(n * p), n, p)
> > Y <- matrix(rnorm(n * p), n, p)
> > 
> > X <- scale(X, scale = FALSE)
> > Y <- scale(Y, scale = FALSE)
> > 
> > sxy <- svd(crossprod(X, Y)/(n-1))
> > 
> > sxy$d^2 # squared singular values
> [1] 0.2689045372 0.0543512900 0.0001299134
> > sum(sxy$d^2) # their sum
> [1] 0.3233857
> > sxy$d^2/sum(sxy$d^2) # their fraction
> [1] 0.8315287394 0.1680695319 0.0004017288
> > 
> > sx <- svd(crossprod(X)/(n-1))
> > sy <- svd(crossprod(Y)/(n-1))
> > 
> > sx$d # eigenvalues for X
> [1] 1.6143390 1.1115365 0.2711724
> > sum(sx$d) # their sum
> [1] 2.997048
> > 
> > sy$d # eigenvalues for Y
> [1] 1.5372356 0.9224574 0.4432287
> > sum(sy$d) # their sum
> [1] 2.902922
> 
> 
> So, by generating two matrices of random data, for which any covariation 
> between them is incidental, we find that 83.15% of the overall covariance is 
> explained by the first PLS vectors.  However, the sum of the squared singular 
> values (overall covariance) is 0.32, which is quite small compared to the 
> summed eigenvalues (total variances) of 2.99 and 2.90 for the two matrices.  
> It is possible for the first axis to explain much of what is very little 
> covariance.  Focusing on the 83.15% alone could be misleading about the 
> amount of covariance between X and Y.
> 
> Cheers!
> Mike
> 
>> On May 14, 2020, at 2:59 AM, [email protected] wrote:
>> 
>> Dear Katrien,
>> 
>> 
>> 
>> The sum of the squared singular values in PLS equals the sum of all the 
>> squared pairwise covariances between the two blocks of variables. Therefore, 
>> what you can compute is the fraction of "squared" total covariance that each 
>> PLS dimension accounts for. You can do this by dividing each squared 
>> singular value by the summed squared singular values.
>> 
>> Also, if one insists on a significance test, it should be based on the 
>> distribution of these singular values.
>> 
>> 
>> 
>> Best,
>> 
>> 
>> 
>> Philipp 
>> 
>> 
>> 
>> On Thursday, 14 May 2020 at 03:31:12 UTC+2, katrien.janin wrote:
>> Hello everybody,
>> 
>> I am hoping you can help me figure out the following: after having used the 
>> integration.test function (geomorph), I would like to know the percentage of 
>> the total covariance that each PLS axis captures, but I am somewhat stumped 
>> on what might be a good way to go about it.
>> 
>> I was initially thinking along the lines of generating the covariance matrix 
>> of the two blocks (e.g. cov.il.is <- cov(il.is.int$A1.matrix, 
>> il.is.int$A2.matrix)) and computing the eigenvalue for each PLS axis, and 
>> then regressing these. But of course the covariance matrix has a very 
>> different structure, so that will not work. I am clearly thinking in the 
>> wrong direction, but for now I cannot see the forest for the trees. Any 
>> ideas?
>> 
>> Best wishes,
>> Katrien
>> 
>> -- 
>> You received this message because you are subscribed to the Google Groups 
>> "Morphmet" group.
>> To unsubscribe from this group and stop receiving emails from it, send an 
>> email to [email protected].
>> To view this discussion on the web visit 
>> https://groups.google.com/d/msgid/morphmet2/76c0b670-b6ed-448a-b80d-4d662263aa83%40googlegroups.com
> 
> 

