Hi Mahmood,

There are different pieces of information that you can get from PCA:
1. How important a given PC is for reconstructing the entire dataset -> this is given by explained_variance_ratio_, as Guillaume suggested.
2. What the contribution of each feature to each PC is (remember that a PC is a linear combination of all the features, i.e. PC_1 = X_1 * alpha_11 + X_2 * alpha_12 + ... + X_m * alpha_1m). The alpha_ij are what you're looking for, and they are given in the components_ matrix, which is an n_components x n_features matrix (see the sketch below).
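For instance, a minimal sketch with made-up data (the array sizes and the feature index are arbitrary, just for illustration):

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.RandomState(0)
    X = rng.rand(100, 6)                      # toy data: 100 samples, 6 features

    pca = PCA(n_components=5).fit(X)

    # 1) importance of each PC for reconstructing the data
    print(pca.explained_variance_ratio_)      # shape (n_components,)

    # 2) contribution (alpha_ij) of each feature to each PC
    print(pca.components_)                    # shape (n_components, n_features)

    # For a given feature, the PC where it has the largest absolute loading
    # is the component that "captures" it best:
    feature_idx = 0                           # arbitrary example feature
    best_pc = np.argmax(np.abs(pca.components_[:, feature_idx]))
    print("feature", feature_idx, "loads most strongly on PC", best_pc + 1)

Taking the argmax of the absolute loadings column by column answers your question programmatically, so you don't have to inspect every PC1-PC2, PC4-PC5, ... plot by hand.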
Nicolas

On 1/22/21 9:13 AM, Mahmood Naderan wrote:
Hi,

I have a question about PCA: how can we determine which factor (principal component) best captures a given variable X? For example, a variable may have a low weight in the first PC but a higher weight in the fifth PC. When I use the PCA from scikit-learn, I have to work with the PCs manually, so I may miss the fact that although a variable is weak in the PC1-PC2 plot, it may be strong in the PC4-PC5 plot. Any comment on that?

Regards,
Mahmood