Good question, Ed.

This was some good work by mktal to build a multi-class svm module.

The issue with the PR is that the mini-batching is embedded in the svm
code, whereas we would prefer to add mini-batching as a general capability
in the stochastic gradient descent framework, so that it can be reused by
other modules as well.
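
To illustrate the idea (this is just a sketch, not MADlib's actual IGD API; the
function and parameter names here are hypothetical): the framework owns the
batching and update loop, and each module plugs in only its own gradient.

```python
import numpy as np

def minibatch_sgd(grad, w0, X, y, batch_size=10, lr=0.05, epochs=100, seed=0):
    """Generic mini-batch SGD loop: `grad` is supplied by the model module,
    so SVM, regression, etc. all reuse the same batching machinery."""
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float).copy()
    n = len(X)
    for _ in range(epochs):
        idx = rng.permutation(n)           # reshuffle each epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            w -= lr * grad(w, X[b], y[b])  # module-specific gradient
    return w

# Example plug-in module: a least-squares gradient. A multi-class SVM
# module would supply its own gradient function here instead.
def lsq_grad(w, Xb, yb):
    return 2.0 * Xb.T @ (Xb @ w - yb) / len(Xb)
```

The point is the separation of concerns: batching lives in one place, and the
model code never needs to know the batch size or shuffling scheme.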

There is work in progress on this for the next release:

Add mini-batching to IGD framework
https://issues.apache.org/jira/browse/MADLIB-1048

When that JIRA is done, we can then finish:

Multi-class SVM with mini-batching
https://issues.apache.org/jira/browse/MADLIB-1037

which is related to the PR you asked about.

I will add this comment into the PR as well.

Frank


On Wed, Sep 20, 2017 at 8:13 AM, Ed Espino <[email protected]> wrote:

> I'm more curious than anything, what is the current status of the following
> PR?
>
> PR #75: SVM: Implement c++ functions for training multi-class svm in mini-batch
> opened on Nov 14, 2016 by mktal
> https://github.com/apache/madlib/pull/75
>
> Thanks,
> -=e
>
> --
> *Ed Espino*
>
