[ 
https://issues.apache.org/jira/browse/SPARK-21027?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16047243#comment-16047243
 ] 

Joseph K. Bradley commented on SPARK-21027:
-------------------------------------------

Copying from [ML-14450]:

[SPARK-7861] adds a Python wrapper for OneVsRest.  Because of possible issues 
related to using existing libraries like {{multiprocessing}}, we are not 
training multiple models in parallel initially.

This issue is for prototyping, testing, and implementing a way to train 
multiple models at once.  After speaking with [~joshrosen], a good option 
might be the {{concurrent.futures}} package:
* Python 3.x: 
[https://docs.python.org/3/library/concurrent.futures.html#module-concurrent.futures]
* Python 2.x: [https://pypi.python.org/pypi/futures]
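
A rough sketch of the pattern under discussion: use a thread pool from 
{{concurrent.futures}} to fit one binary "class vs. rest" model per label 
concurrently.  Nothing here reflects the final PySpark API; {{fit_binary}} 
and {{fit_one_vs_rest}} are hypothetical placeholders, and the "model" is a 
trivial stand-in so the example is self-contained:

```python
from concurrent.futures import ThreadPoolExecutor


def fit_binary(label, data):
    """Placeholder for training one binary 'label vs. rest' classifier.

    In PySpark this would relabel the DataFrame and call classifier.fit();
    here we just count positive examples so the sketch runs anywhere.
    """
    positives = sum(1 for y in data if y == label)
    return {"class": label, "positives": positives}


def fit_one_vs_rest(labels, data, parallelism=4):
    # Threads (rather than multiprocessing) may suffice in the PySpark
    # setting: the heavy lifting would happen on the cluster, so driver-side
    # fit() calls mostly block on the JVM/network rather than hold the GIL.
    with ThreadPoolExecutor(max_workers=parallelism) as pool:
        futures = [pool.submit(fit_binary, c, data) for c in labels]
        # Collect in label order so results are deterministic.
        return [f.result() for f in futures]


data = [0, 1, 2, 1, 0, 2, 2]
models = fit_one_vs_rest([0, 1, 2], data, parallelism=3)
```

The {{parallelism}} argument is the tunable knob this ticket asks for; a 
value of 1 would degrade gracefully to the current serial behavior.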

> Parallel One vs. Rest Classifier
> --------------------------------
>
>                 Key: SPARK-21027
>                 URL: https://issues.apache.org/jira/browse/SPARK-21027
>             Project: Spark
>          Issue Type: New Feature
>          Components: PySpark
>    Affects Versions: 2.2.0, 2.2.1
>            Reporter: Ajay Saini
>
> Currently, the Scala implementation of OneVsRest allows the user to run a 
> parallel implementation in which each class is evaluated in a different 
> thread. Experiments show this implementation can yield up to a 2x speedup, 
> but the degree of parallelism is not currently tunable. Furthermore, the 
> Python implementation of OneVsRest does not parallelize at all. It would be 
> useful to add a parallel, tunable implementation of OneVsRest to the Python 
> library in order to speed up the algorithm.
>  A ticket for the Scala implementation of this classifier is here: 
> https://issues.apache.org/jira/browse/SPARK-21028



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
