Hi,
I was looking at the code in the cross-decomposition section.  In PLS, the
components are computed to maximize the covariance between the X and Y
scores, not to maximize the variance of X as in PCA.

def _nipals_twoblocks_inner_loop(X, Y, mode="A", max_iter=500, tol=1e-06,
                                 norm_y_weights=False):
    """Inner loop of the iterative NIPALS algorithm.

    Provides an alternative to the svd(X'Y); returns the first left and right
    singular vectors of X'Y.  See PLS for the meaning of the parameters.  It is
    similar to the Power method for determining the eigenvectors and
    eigenvalues of X'Y.
    """
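For concreteness, what that inner loop computes can be sketched as plain power
iteration on M = X'Y (this is only an illustration of the idea in the
docstring, not the sklearn implementation; the function name and parameters
are my own):

```python
import numpy as np

def first_singular_vectors(X, Y, max_iter=500, tol=1e-6):
    """Illustrative power-method sketch (not sklearn's code):
    compute the first left/right singular vectors of M = X'Y by
    alternating multiplications with M and M'."""
    M = X.T @ Y
    # Arbitrary normalized starting vector for the right singular vector.
    v = np.ones(M.shape[1]) / np.sqrt(M.shape[1])
    for _ in range(max_iter):
        u = M @ v                      # update left vector
        u /= np.linalg.norm(u)
        v_new = M.T @ u                # update right vector
        v_new /= np.linalg.norm(v_new)
        if np.linalg.norm(v_new - v) < tol:
            v = v_new
            break
        v = v_new
    return u, v
```

Each pass is one step of power iteration on M'M, so it converges to the same
vectors svd(X'Y) would give, which is exactly the equivalence the docstring
claims.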

In my opinion, a standard implementation of NIPALS PCA would be a useful
addition to sklearn, and it would probably be easy to develop.
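To show what I mean, here is a minimal sketch of NIPALS PCA: extract one
component at a time by alternating regressions, then deflate X and repeat.
This is only a rough illustration (the function name and defaults are mine,
not a proposed sklearn API):

```python
import numpy as np

def nipals_pca(X, n_components=2, max_iter=500, tol=1e-6):
    """Illustrative NIPALS PCA sketch (hypothetical helper, not sklearn's).
    Returns scores T (n_samples x k) and loadings P (n_features x k)."""
    X = np.asarray(X, dtype=float).copy()
    X -= X.mean(axis=0)                 # center the data
    scores, loadings = [], []
    for _ in range(n_components):
        # Start from the column with the largest variance.
        t = X[:, np.argmax(X.var(axis=0))].copy()
        for _ in range(max_iter):
            p = X.T @ t / (t @ t)       # loading: regress columns of X on t
            p /= np.linalg.norm(p)      # normalize the loading vector
            t_new = X @ p               # score: project X onto the loading
            if np.linalg.norm(t_new - t) < tol * np.linalg.norm(t_new):
                t = t_new
                break
            t = t_new
        X -= np.outer(t, p)             # deflate: remove the rank-1 part
        scores.append(t)
        loadings.append(p)
    return np.column_stack(scores), np.column_stack(loadings)
```

The appeal over a full SVD is that when you only want a handful of components
from a wide X, you never form or decompose the full covariance matrix; each
component costs a few matrix-vector products per iteration.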

Best,
Luca


On Mon, Jun 02, 2014 at 12:27:34AM +0100, Luca Puggini wrote:
> > This is a good alternative to SVD and it is much faster in situations
> > where we have a lot of variables and we are interested only in a small
> > number of components.
> > This is a well known and tested algorithm and I was actually surprised
> > when I discovered that it is not in sklearn. (Maybe it has been replaced
> > by a faster alternative?)
>
> It is in scikit-learn, I believe, but written in such a way that nobody
> finds it or uses it (there are a few lessons to be learned there :) ):
> inside sklearn.cross_decomposition, you'll find some NIPALS.
>
> G
_______________________________________________
Scikit-learn-general mailing list
Scikit-learn-general@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general