[ https://issues.apache.org/jira/browse/SPARK-3215?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14120708#comment-14120708 ]

Marcelo Vanzin commented on SPARK-3215:
---------------------------------------

Thanks Matei. Looking at a Java API is next on my TODO list - I want to look at 
how the RDD API does things and try to mimic that.

I chose to use SparkConf because that's sort of standard; you may want to 
configure things other than just the cluster URL - e.g., executor count and 
size - so I wanted to avoid having to create yet another config object. It's a 
little unfortunate that SparkConf inherits system properties, but since the 
application using the client is, in theory, not itself a Spark application, it 
won't be using system properties to set SparkConf options. Also note that all 
options in the passed SparkConf instance are actually forwarded to the Spark 
app - so spark-defaults.conf will not be picked up from anywhere.
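To make the forwarding semantics concrete, here is a minimal sketch of how the client might treat the passed conf. The class names (RemoteClientConf, RemoteSparkClient) are illustrative assumptions, not the actual patch: the point is only that every option set on the conf object is shipped verbatim to the remote app, and no defaults file is consulted.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for SparkConf as used by the remote client.
// Unlike SparkConf, it does not read system properties or defaults files.
class RemoteClientConf {
    private final Map<String, String> options = new HashMap<>();

    RemoteClientConf set(String key, String value) {
        options.put(key, value);
        return this;
    }

    Map<String, String> getAll() {
        return new HashMap<>(options);
    }
}

// Hypothetical client: captures exactly the options it was given,
// to be forwarded to the remote SparkContext process.
class RemoteSparkClient {
    private final Map<String, String> forwarded;

    RemoteSparkClient(RemoteClientConf conf) {
        // All options are passed through; spark-defaults.conf is never read.
        this.forwarded = conf.getAll();
    }

    Map<String, String> forwardedOptions() {
        return forwarded;
    }
}
```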

The code already creates a job ID internally, I just didn't expose it. That 
should be pretty simple to do.
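Exposing the internal job ID could look something like the sketch below. The JobHandle type and the submit() signature are assumptions for illustration; the only thing taken from the discussion above is that an ID is already generated per job and merely needs to be surfaced to the caller.

```java
import java.util.UUID;

// Hypothetical handle returned to the caller, exposing the job ID that
// the client already generates internally.
class JobHandle {
    private final String jobId;

    JobHandle(String jobId) {
        this.jobId = jobId;
    }

    String jobId() {
        return jobId;
    }
}

// Hypothetical client-side submit path: assign the internal ID, ship the
// job over RPC (elided), and hand the ID back via the handle.
class ClientSketch {
    JobHandle submit(Runnable job) {
        String id = UUID.randomUUID().toString();
        // ... serialize `job` and send it to the remote SparkContext ...
        return new JobHandle(id);
    }
}
```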

> Add remote interface for SparkContext
> -------------------------------------
>
>                 Key: SPARK-3215
>                 URL: https://issues.apache.org/jira/browse/SPARK-3215
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>            Reporter: Marcelo Vanzin
>              Labels: hive
>         Attachments: RemoteSparkContext.pdf
>
>
> A quick description of the issue: as part of running Hive jobs on top of 
> Spark, it's desirable to have a SparkContext that is running in the 
> background and listening for job requests for a particular user session.
> Running multiple contexts in the same JVM is not a very good solution. Not 
> only does SparkContext currently have issues sharing the same JVM among 
> multiple instances, but doing so also turns the JVM running the contexts into 
> a huge bottleneck in the system.
> So I'm proposing a solution where we have a SparkContext that is running in a 
> separate process, and listening for requests from the client application via 
> some RPC interface (most probably Akka).
> I'll attach a document shortly with the current proposal. Let's use this bug 
> to discuss the proposal and any other suggestions.
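The architecture in the proposal above - a long-running context process listening for job requests from a client - can be sketched as a simple request/response exchange. Plain sockets stand in here for the eventual RPC layer (most probably Akka, per the proposal), and the one-line "protocol" is purely illustrative.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

// Toy stand-in for the proposed design: a "context server" that accepts a
// job request and acknowledges it, and a client that sends one. Sockets
// substitute for the real RPC interface; this is not the actual design.
class ContextServerSketch {
    // Server side: handle a single job request and acknowledge it.
    static String serveOneRequest(ServerSocket server) throws IOException {
        try (Socket s = server.accept();
             BufferedReader in = new BufferedReader(
                 new InputStreamReader(s.getInputStream()));
             PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
            String request = in.readLine();      // e.g. "SUBMIT job-1"
            out.println("ACCEPTED " + request);  // acknowledge the job
            return request;
        }
    }

    // Client side: send a job request and read the acknowledgement.
    static String sendRequest(int port, String request) throws IOException {
        try (Socket s = new Socket("localhost", port);
             PrintWriter out = new PrintWriter(s.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                 new InputStreamReader(s.getInputStream()))) {
            out.println(request);
            return in.readLine();
        }
    }
}
```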



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
