On Sat, Feb 4, 2012 at 1:16 AM, Vlad Niculae wrote:
> Sorry for being vague. NMF indeed tends to return sparse
> representations. I meant sparse NMF as in an implementation of NMF that can
> take sparse matrices as inputs (effectively keeping either the larger of W, H,
> or both as sparse).
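For context, a minimal sketch (illustrative, not scikit-learn's NMF) of why a
sparse input is feasible: one Lee-Seung multiplicative update of H only needs
sparse-dense products against V, so V can stay a scipy.sparse matrix
throughout. Keeping the factors W and H themselves sparse is a separate
question; here they stay dense.

    # Illustrative only: one multiplicative update of H for
    # min ||V - W H||_F with non-negative factors, V possibly sparse.
    import numpy as np
    import scipy.sparse as sp

    def update_h(V, W, H, eps=1e-10):
        # W.T @ V computed as (V.T @ W).T so V stays sparse (sparse @ dense).
        numer = (V.T @ W).T
        denom = W.T @ W @ H + eps  # eps guards against division by zero
        return H * (numer / denom)

    # Usage sketch:
    # V = sp.random(100, 50, density=0.05, format="csr")
    # W = np.abs(np.random.rand(100, 5)); H = np.abs(np.random.rand(5, 50))
    # H = update_h(V, W, H)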
On Feb 3, 2012, at 18:07, Mathieu Blondel wrote:
> On Fri, Feb 3, 2012 at 11:55 PM, Vlad Niculae wrote:
>
>> The scipy NNLS is written in Fortran. I'd like to bench _nls_subproblem
>> against it.
>> Maybe we could have a cython projected sgd non-negative least-squares method
>> with L1 constraints.
I had a need for a non-negative logistic classifier a while back, and wrote
a light-weight function that does the optimization directly, along with an
L2 regularizer. The code is on gist: https://gist.github.com/1730797
-Marc
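For reference, a minimal sketch in the spirit of Marc's description (not the
gist's actual code, which does the optimization directly): logistic loss plus
an L2 penalty, with non-negativity enforced instead via box bounds in scipy's
L-BFGS-B. The function name and the alpha parameter are illustrative.

    # Sketch: non-negative logistic regression, L2-penalized,
    # with w >= 0 enforced through L-BFGS-B box constraints.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import expit

    def fit_nonneg_logistic(X, y, alpha=1.0):
        """y in {-1, +1}; returns a non-negative weight vector."""
        def loss_grad(w):
            z = y * (X @ w)
            loss = np.logaddexp(0, -z).sum() + alpha * (w @ w)
            grad = -X.T @ (y * expit(-z)) + 2 * alpha * w
            return loss, grad
        res = minimize(loss_grad, np.zeros(X.shape[1]), jac=True,
                       method="L-BFGS-B", bounds=[(0, None)] * X.shape[1])
        return res.x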
On Fri, Feb 3, 2012 at 9:55 AM, Vlad Niculae wrote:
> A nice idea would be to extend the scipy NNLS in the ways needed to use it in
> scikit-learn's NMF instead of the _nls_subproblem code translated from C.J.
> Lin's code.
>
> The scipy NNLS is written in Fortran. I'd like to bench _nls_subproblem
> against it.
>
> Maybe we could have a cython projected sgd non-negative least-squares method
> with L1 constraints.
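As a rough reference for the quoted idea (plain batch projected gradient in
NumPy, not SGD, not Cython, and not _nls_subproblem itself):

    # Sketch: projected gradient for
    #   min 0.5 * ||X w - y||^2 + lam * sum(w)   s.t.  w >= 0,
    # where sum(w) equals the L1 norm on the feasible set.
    import numpy as np

    def pg_nnls_l1(X, y, lam=0.1, n_iter=500):
        w = np.zeros(X.shape[1])
        step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant
        for _ in range(n_iter):
            grad = X.T @ (X @ w - y) + lam
            w = np.maximum(w - step * grad, 0.0)  # step, then project onto w >= 0
        return w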
For non-negative least-squares, you can use this:
http://docs.scipy.org/doc/scipy-0.7.x/reference/generated/scipy.optimize.nnls.html
We could also add an estimator that implements fit and predict in
scikit-learn (although the above function doesn't support sparse
matrices :$)
Mathieu
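Such an estimator could be as small as the following sketch (the class name is
hypothetical; as noted above, scipy.optimize.nnls only accepts dense arrays):

    # Hypothetical minimal fit/predict wrapper around scipy.optimize.nnls.
    import numpy as np
    from scipy.optimize import nnls

    class NNLSRegressor:
        def fit(self, X, y):
            # nnls solves min ||X w - y||_2 subject to w >= 0
            self.coef_, self.residual_ = nnls(np.asarray(X), np.asarray(y))
            return self

        def predict(self, X):
            return np.asarray(X) @ self.coef_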
--
On Thu, Feb 02, 2012 at 10:17:02PM -0500, Jieyun Fu wrote:
> Is there a way to enforce constraints on sklearn optimizers or
> classifiers? For example, if I put some data into a logistic regression, I
> want to make sure some coefficients are positive / negative.
No. The optimizers assume unconstrained problems, so there is no built-in way
to enforce such sign constraints on the coefficients.