[ https://issues.apache.org/jira/browse/SPARK-2401?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Gang Bai updated SPARK-2401:
----------------------------

    Description: 
The goal of a multi-class multi-label classifier is to tag a sample data point 
with a subset of labels from a finite, pre-specified set. Given a set of L 
labels, a data point can be tagged with any of the 2^L possible subsets; with 
L = 20 labels, for example, there are already 2^20 (about one million) possible 
subsets. The main challenge in training a multi-class multi-label classifier is 
therefore the exponentially large label space.

Multi-class multi-label classifiers are very useful in many practical applications.

This JIRA is created to track the effort of solving the training problem of 
multi-class, multi-label classifiers by implementing AdaBoost.MH on Apache 
Spark. It will not be an easy task. I will start with a basic DecisionStump 
weak learner, a simple Hamming tree assembling DecisionStumps into a meta 
weak learner, and the iterative boosting procedure. I will be reusing 
Alexander Ulanov's multi-class and multi-label metrics evaluation and Manish 
Amde's decision tree/boosting/ensemble implementations.
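
To make the plan above concrete, here is a minimal, self-contained sketch of 
one way AdaBoost.MH with decision-stump weak learners can look, written with 
plain Scala collections rather than Spark RDDs. All names (MultiLabelPoint, 
Stump, train, predict) are illustrative assumptions and not MLlib APIs; this 
is a sketch of the boosting loop under those assumptions, not the proposed 
implementation.

// Minimal AdaBoost.MH sketch with decision stumps, plain Scala collections.
// All class and method names are illustrative, not MLlib APIs.
object AdaBoostMHSketch {

  // A data point: dense feature vector plus the set of active label indices.
  case class MultiLabelPoint(features: Array[Double], labels: Set[Int])

  // A decision stump over (feature, threshold): for each label it votes
  // votes(l) on one side of the threshold and -votes(l) on the other.
  case class Stump(feature: Int, threshold: Double, votes: Array[Double]) {
    def predict(x: Array[Double], label: Int): Double =
      if (x(feature) <= threshold) votes(label) else -votes(label)
  }

  // y_{i,l} in {-1, +1}: +1 iff label l is assigned to point i.
  private def sign(p: MultiLabelPoint, l: Int): Double =
    if (p.labels.contains(l)) 1.0 else -1.0

  // AdaBoost.MH reduces the multi-label problem to binary classification over
  // (example, label) pairs: keep one weight per pair, fit the stump with the
  // smallest weighted error each round, then upweight the pairs it got wrong.
  def train(data: Seq[MultiLabelPoint], numLabels: Int, rounds: Int): Seq[(Double, Stump)] = {
    val n = data.size
    val d = Array.fill(n, numLabels)(1.0 / (n * numLabels)) // D(i)(l), uniform at start
    val model = scala.collection.mutable.ArrayBuffer.empty[(Double, Stump)]

    for (_ <- 0 until rounds) {
      val numFeatures = data.head.features.length
      // Brute-force candidate search over every (feature, threshold) pair.
      val candidates = for {
        j <- 0 until numFeatures
        theta <- data.map(_.features(j)).distinct
      } yield {
        // Per label, vote +1 on the side that agrees with the weighted majority.
        val votes = Array.tabulate(numLabels) { l =>
          val agree = data.zipWithIndex.map { case (p, i) =>
            val h = if (p.features(j) <= theta) 1.0 else -1.0
            d(i)(l) * h * sign(p, l)
          }.sum
          if (agree >= 0) 1.0 else -1.0
        }
        Stump(j, theta, votes)
      }

      def weightedError(s: Stump): Double =
        data.zipWithIndex.map { case (p, i) =>
          (0 until numLabels).map { l =>
            if (s.predict(p.features, l) * sign(p, l) < 0) d(i)(l) else 0.0
          }.sum
        }.sum

      val stump = candidates.minBy(weightedError)
      val eps = weightedError(stump)
      val alpha = 0.5 * math.log((1.0 - eps) / math.max(eps, 1e-12))
      model += ((alpha, stump))

      // Multiplicative weight update and renormalization over all pairs.
      var z = 0.0
      for ((p, i) <- data.zipWithIndex; l <- 0 until numLabels) {
        d(i)(l) *= math.exp(-alpha * sign(p, l) * stump.predict(p.features, l))
        z += d(i)(l)
      }
      for (i <- 0 until n; l <- 0 until numLabels) d(i)(l) /= z
    }
    model.toSeq
  }

  // Final classifier: predict label l iff the weighted stump vote is positive.
  def predict(model: Seq[(Double, Stump)], x: Array[Double], numLabels: Int): Set[Int] =
    (0 until numLabels).filter { l =>
      model.map { case (a, s) => a * s.predict(x, l) }.sum > 0
    }.toSet
}

Calling train(points, numLabels = L, rounds = T) returns the weighted stumps; 
a distributed version would presumably replace the per-pair weight array with 
data keyed by (example, label) and parallelize the stump search per round.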



> AdaBoost.MH, a multi-class multi-label classifier
> -------------------------------------------------
>
>                 Key: SPARK-2401
>                 URL: https://issues.apache.org/jira/browse/SPARK-2401
>             Project: Spark
>          Issue Type: New Feature
>          Components: MLlib
>            Reporter: Gang Bai
>



--
This message was sent by Atlassian JIRA
(v6.2#6252)
