Hi all,

During the PyCon sprint I kept digging into NMF, specifically into ways
of solving each NLS sub-problem. It became clear that the alternating
NLS approach finds good reconstructions and converges well, but
performance hinges on the NLS solving step, so that is what needs to be
optimized.
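
For reference, here is a minimal sketch of what I mean by the
alternating NLS scheme (this is not the notebook code; the function
name and the fixed iteration count are just for illustration). Each
half-step decomposes into independent scipy.optimize.nnls problems:

import numpy as np
from scipy.optimize import nnls


def nmf_alternating_nls(X, n_components, n_iter=30, random_state=0):
    """Alternate NNLS solves for W and H so that X is approximated
    by W.dot(H) with W, H >= 0."""
    rng = np.random.RandomState(random_state)
    n_samples, n_features = X.shape
    W = rng.rand(n_samples, n_components)
    H = rng.rand(n_components, n_features)
    for _ in range(n_iter):
        # With W fixed, each column of X gives an independent NNLS problem.
        for j in range(n_features):
            H[:, j], _ = nnls(W, X[:, j])
        # With H fixed, each row of X gives an independent NNLS problem.
        for i in range(n_samples):
            W[i, :], _ = nnls(H.T, X[i, :])
    return W, H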

I have started looking into different ways to solve multi-target NLS
problems. This is very much a work in progress but I wanted to share
quickly so that I can get your feedback.

Check out the notebook here: http://nbviewer.ipython.org/7224672

Adding "L1" (elementwise) regularization makes L-BFGS-B converge much
quicker.  This is cool because for NMF such a penalty has other
advantages.
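
To make that concrete, here is a rough sketch of a single sub-problem
solved this way (the function name and the alpha parameter are made up
for illustration, not what the notebook uses). Because of the x >= 0
bounds, the L1 penalty reduces to a smooth linear term alpha * sum(x),
so L-BFGS-B can handle it directly:

import numpy as np
from scipy.optimize import minimize


def nls_lbfgsb_l1(A, b, alpha=0.1):
    """Solve min_{x >= 0} 0.5 * ||A x - b||^2 + alpha * sum(x)."""
    def f_and_grad(x):
        resid = A.dot(x) - b
        # Under the nonnegativity bounds, the L1 penalty is just alpha * sum(x).
        obj = 0.5 * resid.dot(resid) + alpha * x.sum()
        grad = A.T.dot(resid) + alpha
        return obj, grad

    x0 = np.zeros(A.shape[1])
    result = minimize(f_and_grad, x0, jac=True, method="L-BFGS-B",
                      bounds=[(0, None)] * A.shape[1])
    return result.x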

I will also add the pure-Python projected gradient solver that we
already have, which seems to be very fast for larger n_targets.
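
For comparison, a bare-bones projected gradient sketch for the
multi-target case (constant step size; this is only an illustration of
the idea, not the solver we have, which is smarter about the step):

import numpy as np


def nnls_projected_gradient(W, X, n_iter=200):
    """Minimize 0.5 * ||X - W H||_F^2 over H >= 0 by projected gradient."""
    H = np.zeros((W.shape[1], X.shape[1]))
    WtW = W.T.dot(W)
    WtX = W.T.dot(X)
    # 1 / Lipschitz constant of the gradient gives a safe constant step size.
    step = 1.0 / np.linalg.norm(WtW, 2)
    for _ in range(n_iter):
        grad = WtW.dot(H) - WtX
        # Gradient step, then project back onto the nonnegative orthant.
        H = np.maximum(H - step * grad, 0.0)
    return H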

Cheers,
Vlad
