There was a thread back in April about including some metric learning
algorithms in scikit-learn
(http://www.mail-archive.com/scikit-learn-general@lists.sourceforge.net/msg06952.html).
The consensus at that time was to come up with some metric learning code in
a separate project first, to get a
Hello guys,
Recently I did a lot of work on sequential Monte Carlo and online
variational methods for Dirichlet process mixture models, among other
things. I have never contributed to sklearn but was wondering if an online
version of DPMM would be something of interest to the community. Before I
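For background, a minimal sketch of the batch Dirichlet process mixture that
scikit-learn already ships (today exposed as BayesianGaussianMixture with a
DP prior; at the time of this thread the class was DPGMM). The online,
partial_fit-style variant discussed above is not part of the library, and the
data and parameters below are purely illustrative.

    # Batch DP mixture for reference; an online version would update the
    # variational parameters from mini-batches instead of the full dataset.
    import numpy as np
    from sklearn.mixture import BayesianGaussianMixture

    rng = np.random.RandomState(0)
    X = np.vstack([rng.randn(200, 2) + [3, 3], rng.randn(200, 2) - [3, 3]])

    dpgmm = BayesianGaussianMixture(
        n_components=10,  # truncation level of the Dirichlet process
        weight_concentration_prior_type="dirichlet_process",
        random_state=0,
    ).fit(X)
    print(np.round(dpgmm.weights_, 2))  # most components get negligible weight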
2013/11/4 Nantas Nardelli
> Thank you, Andy!
> I've opened a pull request for the README bit. I think I'll have to look
> at the untagged issues, as the majority of the easy open issues either
> already have a pull request open or are really old and irrelevant (although I
> may find something
Hi List,
I'm currently working on some performance documentation [1] and I wanted to
micro-benchmark the dense vs sparse coefficients case.
I created a self-contained script and wanted to bench it using
line_profiler, but it seems that after the call to `sparsify()` my
SGDRegressor can't predict
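The script itself is not reproduced in the archive; the following is a hedged
sketch of the kind of self-contained benchmark described above, with an
assumed toy dataset rather than the one used for [1].

    # Fit an SGDRegressor on mostly-zero data, then predict with dense
    # coefficients and again after sparsify() converts coef_ to scipy.sparse.
    import numpy as np
    from sklearn.linear_model import SGDRegressor

    rng = np.random.RandomState(42)
    X = rng.rand(1000, 100)
    X[X < 0.8] = 0.0          # make the design matrix mostly zeros
    y = rng.rand(1000)

    reg = SGDRegressor(penalty="l1", random_state=42).fit(X, y)
    reg.predict(X)            # dense coef_
    reg.sparsify()            # store coef_ as a scipy.sparse matrix
    reg.predict(X)            # this second call is where the failure showed up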
Hi Eustache,
that's quite a bug - thanks for reporting - I fixed it and added a sparsify
test to test_common.py - pushed directly to master.
thanks,
Peter
2013/11/4 Eustache DIEMERT
> Hi List,
>
> I'm currently working on some performance documentation [1] and I wanted
> to micro-benchmark t
Cool :)
The issue is gone!
Thanks Peter
Eustache
2013/11/4 Peter Prettenhofer
> Hi Eustache,
>
> that's quite a bug - thanks for reporting - I fixed it and added a
> sparsify test to test_common.py - pushed directly to master.
>
> thanks,
> Peter
>
>
> 2013/11/4 Eustache DIEMERT
>
>> Hi List,
Great, thanks Oliver!
I'll have a look around and try to analyse what I've already used in
my past projects. I mainly asked to know whether there was anything in
particular that you'd like people to focus on, given that there is going to
be a 1.0 release pretty soon.
Looking forward to c