Dear Luca,
In terms of efficiency, do you think that NIPALS outperforms
RandomizedPCA? I'm not an expert in these methods, but it sounds like
they rely on similar tricks.
My suggestion would be to run a benchmark on one of the scikit's
datasets to compare the accuracy/computation-time tradeoffs.
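A minimal sketch of such a benchmark on the digits dataset might look like the following. Note this uses the modern `PCA(svd_solver=...)` parameter as a stand-in for the `RandomizedPCA` class discussed here; the dataset choice and number of components are illustrative assumptions.

```python
import time
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

# Digits: 1797 samples x 64 features (an illustrative choice of dataset).
X = load_digits().data

# Compare the exact (full SVD) and randomized solvers on the same task.
for solver in ("full", "randomized"):
    pca = PCA(n_components=5, svd_solver=solver, random_state=0)
    t0 = time.perf_counter()
    pca.fit(X)
    elapsed = time.perf_counter() - t0
    print(f"{solver:>10}: {elapsed:.4f}s, "
          f"explained variance = {pca.explained_variance_ratio_.sum():.4f}")
```

On larger matrices with few requested components, the randomized solver is usually faster while recovering nearly the same explained variance, which is the tradeoff worth measuring here.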
Best,
Bertrand
On 28/05/2014 18:45, Luca Puggini wrote:
Hi,
I was looking at the PCA and SparsePCA implementations in sklearn.
They are both based on the SVD, but I think that the NIPALS implementation
of the same algorithm can really increase the speed in some situations.
In particular, with sparse PCA we usually use a small number of
components, so its speed could be increased by using NIPALS to compute
the initial values of u and v (in the dictionary learning class).
Along the same lines, there is a NIPALS-like algorithm for sparse PCA. I
have already written a Python implementation of it, and it should not
be a problem for me to integrate it with sklearn.
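For context, the core of NIPALS for a single component is a short power
iteration; a minimal sketch (not the sklearn API, and the function name and
tolerances are illustrative assumptions) could be:

```python
import numpy as np

def nipals_pc1(X, tol=1e-8, max_iter=500):
    """Leading principal component of X via NIPALS power iteration.

    Returns the score vector t and the unit-norm loading vector p of
    the first component. Illustrative sketch only.
    """
    X = X - X.mean(axis=0)           # center the columns
    t = X[:, 0].copy()               # initial guess for the scores
    for _ in range(max_iter):
        p = X.T @ t / (t @ t)        # regress X on the scores -> loadings
        p /= np.linalg.norm(p)       # normalize the loadings
        t_new = X @ p                # updated scores
        if np.linalg.norm(t_new - t) < tol * np.linalg.norm(t_new):
            t = t_new
            break
        t = t_new
    return t, p
```

Because it only iterates for the components actually requested, this is
where the speed gain over a full SVD comes from when few components are
needed.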
Is this considered useful for the community or is this off topic?
Are other people already working on it?
Thanks,
Luca
------------------------------------------------------------------------------
Time is money. Stop wasting it! Get your web API in 5 minutes.
www.restlet.com/download
http://p.sf.net/sfu/restlet
_______________________________________________
Scikit-learn-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general