On 6/24/05, Domenico Cozzetto <[EMAIL PROTECTED]> wrote:
> Dear all,
> I have a matrix of dissimilarities and I'd like to get a 2d embedding. I'm
> not an expert in these techniques, so I'm a bit confused... Furthermore, I
> was not able to find any satisfactory tutorial on the web. So even though
> this may not be the most appropriate place to discuss these issues, I'd be
> very grateful to anyone who replies.
>
> My first question is: do I need an initial embedding of my data before
> applying PCA through the princomp or prcomp functions in the stats package?
>
> What does it mean that PCA and classical scaling are equivalent? And where
> can I find a proof of this fact?
>
> If this is true, I should get the same results by applying prcomp() or
> cmdscale() in stats. If it helps, use the dissimilarities in the attached
> "diss.tab" file.
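For the practical question first: with only a dissimilarity matrix in hand,
no initial embedding is needed. cmdscale() produces the 2d configuration
directly, and prcomp()/princomp() do not enter into it. A minimal sketch,
assuming "diss.tab" holds a square symmetric dissimilarity matrix that
read.table() can parse as given (an assumption; adjust the call to the
actual file layout):

diss <- as.dist(as.matrix(read.table("diss.tab")))  # dissimilarities as a dist object
emb <- cmdscale(diss, k = 2)                        # classical scaling into 2 dimensions
plot(emb, xlab = "Coordinate 1", ylab = "Coordinate 2")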
We can verify the equivalence of the PCA components and the MDS components
on the iris data set. Eigenvectors are not unique, but it seems that
changing the signs of the components is all we will need to do here:

iris4 <- iris[, -5]                                        # test data
std <- function(x) x %*% sign(diag(x[1, ]))                # standardize sign
iris4.pca <- std(predict(prcomp(iris4)))
iris4.mds <- std(cmdscale(dist(iris4), k = ncol(iris4)))
all.equal(iris4.pca, iris4.mds)                            # TRUE
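As to why the two agree: classical scaling double-centres the matrix of
squared dissimilarities, and when the dissimilarities are Euclidean
distances computed from a data matrix this recovers the inner-product
matrix of the centred data, whose eigendecomposition gives the PCA scores
(up to sign). A proof can be found in most multivariate analysis texts,
e.g. Mardia, Kent & Bibby, "Multivariate Analysis", in the chapter on
multidimensional scaling. A rough sketch of the key identity in R, again
on the iris data (variable names here are only illustrative):

X  <- scale(as.matrix(iris[, -5]), center = TRUE, scale = FALSE)  # centred data
D2 <- as.matrix(dist(X))^2                  # squared Euclidean distances
n  <- nrow(X)
J  <- diag(n) - matrix(1/n, n, n)           # centring matrix
B  <- -0.5 * J %*% D2 %*% J                 # double-centred dissimilarities
all.equal(unname(B), unname(X %*% t(X)))    # TRUE: B is the inner-product matrix;
                                            # its eigenvectors, scaled by the square
                                            # roots of the eigenvalues, are what
                                            # cmdscale() returns and match the PCA
                                            # scores up to sign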
