[
https://issues.apache.org/jira/browse/SPARK-2345?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14056913#comment-14056913
]
Hari Shreedharan commented on SPARK-2345:
-----------------------------------------
Actually, my concern was slightly different (the other concerns were
afterthoughts on top of the original one). Today, if you want to run a Spark job
from the foreachFunc passed to ForEachDStream, you need to explicitly call
context.runJob and pass in the partitions. I was wondering if it makes sense to
have a slightly different DStream that calls runJob for you, passing the
foreachFunc as the function that runs on Spark. The foreachFunc would then be
running on Spark itself, rather than on the driver.
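To illustrate the distinction, here is a rough Scala sketch (assumptions: the method name `foreachRDDOnSpark` is hypothetical, and the `foreachRDD`/`runJob` signatures are from the Spark 1.x-era API; this requires a Spark runtime and is not standalone-runnable):

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.dstream.DStream

object ForeachSketch {

  // Today: the function passed to foreachRDD executes on the driver.
  // To do distributed work you must call runJob (or an RDD action) yourself.
  def driverSideForeach(stream: DStream[String]): Unit =
    stream.foreachRDD { (rdd: RDD[String]) =>
      // Explicit runJob call: the inner closure runs on the executors,
      // but the wiring is the caller's responsibility.
      rdd.sparkContext.runJob(rdd, (iter: Iterator[String]) => iter.foreach(println))
    }

  // Proposed: a DStream variant that performs the runJob call internally,
  // so the user-supplied per-partition function itself runs on Spark.
  // `foreachRDDOnSpark` does not exist in Spark; it is sketched here only
  // to show what the suggested API might look like.
  def proposedForeach(stream: DStream[String]): Unit = {
    def foreachRDDOnSpark[T](s: DStream[T])(f: Iterator[T] => Unit): Unit =
      s.foreachRDD { rdd => rdd.sparkContext.runJob(rdd, f) }

    foreachRDDOnSpark(stream)(iter => iter.foreach(println))
  }
}
```

The difference is purely in who owns the `runJob` call: today the user must remember it, whereas the proposed DStream would make distributed execution the default.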
> ForEachDStream should have an option of running the foreachfunc on Spark
> ------------------------------------------------------------------------
>
> Key: SPARK-2345
> URL: https://issues.apache.org/jira/browse/SPARK-2345
> Project: Spark
> Issue Type: Bug
> Components: Streaming
> Reporter: Hari Shreedharan
>
> Today the Job generated simply calls the foreachFunc, but does not run it on
> Spark itself using the sparkContext.runJob method.
--
This message was sent by Atlassian JIRA
(v6.2#6252)