[ https://issues.apache.org/jira/browse/SYSTEMML-1471?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15961112#comment-15961112 ]

Mike Dusenberry commented on SYSTEMML-1471:
-------------------------------------------

+1 for using MLContext for this task.  We are already going to remove the old 
MLContext; could we also aim to deprecate and remove JMLC in favor of MLContext?
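
For reference, a minimal sketch of the current JMLC prepared-script flow that an 
MLContext equivalent would need to cover (Scala; assumes the existing 
org.apache.sysml.api.jmlc API, with a toy scoring script for illustration):

{code}
import org.apache.sysml.api.jmlc.Connection

val conn = new Connection()
try {
  // Registered inputs/outputs are read/written in the script; JMLC rewires
  // them to in-memory variables at prepare time.
  val dml =
    """X = read("./tmp/X.csv");
      |W = read("./tmp/W.csv");
      |Y = X %*% W;
      |write(Y, "./tmp/Y.csv", format="csv");""".stripMargin

  // Parse, validate, and compile once.
  val ps = conn.prepareScript(dml, Array("X", "W"), Array("Y"), false)

  // Bind the model weights once; only X changes per request.
  ps.setMatrix("W", Array(Array(0.5), Array(0.5)))

  // Per scoring request: bind the new input and execute the precompiled plan.
  ps.setMatrix("X", Array(Array(1.0, 2.0)))
  val y = ps.executeScript().getMatrix("Y")
} finally {
  conn.close()
}
{code}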

> Support PreparedScript for MLContext
> ------------------------------------
>
>                 Key: SYSTEMML-1471
>                 URL: https://issues.apache.org/jira/browse/SYSTEMML-1471
>             Project: SystemML
>          Issue Type: Improvement
>            Reporter: Niketan Pansare
>
> The intent of this JIRA is three-fold:
> 1. Allow MLContext to be used in prediction scenarios.
> 2. Consolidate the code of JMLC and MLContext.
> 3. Explore what extensions are needed in SystemML to support Spark streaming.
> For the prediction scenario, it is important to reduce the parsing/validation 
> overhead as much as possible, and reusing the JMLC infrastructure might be a 
> good step in that direction. It is also important that MLContext continue to 
> support dynamic recompilation and other optimizations, since the input size 
> could be small (similar to JMLC) but could also be large (for example, with a 
> large window size), which makes MLContext ideal for this scenario.
> {code}
> val streamingContext = new StreamingContext(sc, SLIDE_INTERVAL)
> val windowDStream = .....window(WINDOW_LENGTH, SLIDE_INTERVAL)
> val preparedScript = ....prepareScript(....)
> windowDStream.foreachRDD(currentWindow => {
>   if (currentWindow.count() > 0) {
>     ml.execute(preparedScript.in("X", currentWindow.toDF()))
>     ...
>   }
> })
> {code}
> [~deron] [~mboehm7] [~reinwald] [~freiss] [~mwdus...@us.ibm.com] [~nakul02] 
> Is this something that interests any of you?



