[
https://issues.apache.org/jira/browse/SPARK-3215?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14112780#comment-14112780
]
Marcelo Vanzin commented on SPARK-3215:
---------------------------------------
Hi Reynold, thanks for the comments.
This definitely needs more details, but I wanted to get the high-level idea out
there first, since I've been told that similar projects have met some
resistance in the past. If everybody is OK with the approach, I'll go ahead and
write a proper spec. (I'm also working on a proof of concept to test some ideas
and to experiment with what the API would look like.)
> Add remote interface for SparkContext
> -------------------------------------
>
> Key: SPARK-3215
> URL: https://issues.apache.org/jira/browse/SPARK-3215
> Project: Spark
> Issue Type: New Feature
> Components: Spark Core
> Reporter: Marcelo Vanzin
> Labels: hive
> Attachments: RemoteSparkContext.pdf
>
>
> A quick description of the issue: as part of running Hive jobs on top of
> Spark, it's desirable to have a SparkContext that is running in the
> background and listening for job requests for a particular user session.
> Running multiple contexts in the same JVM is not a very good solution: not
> only does SparkContext currently have issues sharing a single JVM among
> multiple instances, but that also turns the JVM running the contexts into a
> huge bottleneck in the system.
> So I'm proposing a solution where we have a SparkContext that is running in a
> separate process, and listening for requests from the client application via
> some RPC interface (most probably Akka).
> I'll attach a document shortly with the current proposal. Let's use this bug
> to discuss the proposal and any other suggestions.
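
To make the client/server split concrete, here is a rough sketch of what the
message flow might look like. Everything in it (RemoteContextServer, SubmitJob,
JobResult) is a hypothetical illustration for discussion, not an actual API;
the real design would sit on top of a proper RPC layer such as Akka rather
than an in-process call.

```scala
// Hypothetical sketch of the proposed split: a long-running server process
// owns the SparkContext and services job requests from a remote client.
// All names here are illustrative, not part of any real Spark API.

sealed trait Message
case class SubmitJob(jobId: Long, payload: String) extends Message
case class JobResult(jobId: Long, result: String) extends Message

// Stand-in for the server side: in the real proposal this would run in a
// separate JVM, hold the SparkContext, and receive messages over RPC.
class RemoteContextServer {
  def handle(msg: Message): Message = msg match {
    case SubmitJob(id, payload) =>
      // A real server would translate the request into SparkContext calls.
      JobResult(id, s"ran: $payload")
    case other =>
      sys.error(s"unexpected message: $other")
  }
}

object Demo {
  def main(args: Array[String]): Unit = {
    // The client would send this message over the wire instead of calling
    // handle() directly; the shape of the exchange is the same.
    val server = new RemoteContextServer
    val reply = server.handle(SubmitJob(1L, "count words"))
    println(reply)
  }
}
```

The point of the sketch is only the shape of the exchange: one user session
maps to one server-side context, and clients never share a JVM with it.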
--
This message was sent by Atlassian JIRA
(v6.2#6252)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]