Bob Treacy commented on SPARK-18278:

Is this release going to support using SparkLauncher? I have run into a couple 
of obstacles with this.

First, it wouldn't accept running a local: jar file, e.g. 
local:///opt/spark/examples/jars/spark-examples_2.11-2.2.0-k8s-0.5.0.jar , on 
my Spark Docker image (the allowsMixedArguments and isAppResourceReq booleans 
in SparkSubmitCommandBuilder.java get in the way). When I hacked past that, 
the getState() method of SparkAppHandle always returned UNKNOWN. So my first 
Spark application would execute successfully, but the app would hang waiting 
for the SparkAppHandle state to reach FAILED, FINISHED, or KILLED. The code 
I'm trying to run works with a Spark standalone cluster and previously worked 
on Mesos as well.
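For reference, the wait pattern described above looks roughly like this (a 
minimal sketch, not my exact code; the k8s master URL is a placeholder, and 
the problem is that stateChanged never reports a final state because 
getState() stays UNKNOWN):

```java
import java.util.concurrent.CountDownLatch;
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class K8sLaunchExample {
  public static void main(String[] args) throws Exception {
    CountDownLatch done = new CountDownLatch(1);

    SparkAppHandle handle = new SparkLauncher()
        // Placeholder API-server address; substitute your cluster's URL.
        .setMaster("k8s://https://kubernetes.example.com:6443")
        .setDeployMode("cluster")
        .setMainClass("org.apache.spark.examples.SparkPi")
        // The local: scheme points at a jar already inside the driver image.
        .setAppResource("local:///opt/spark/examples/jars/spark-examples_2.11-2.2.0-k8s-0.5.0.jar")
        .startApplication(new SparkAppHandle.Listener() {
          @Override
          public void stateChanged(SparkAppHandle h) {
            // isFinal() is true for FINISHED, FAILED, and KILLED.
            // On this backend the state reportedly stays UNKNOWN,
            // so the latch is never released and the caller hangs here.
            if (h.getState().isFinal()) {
              done.countDown();
            }
          }

          @Override
          public void infoChanged(SparkAppHandle h) { /* no-op */ }
        });

    done.await();
    System.out.println("Final state: " + handle.getState());
  }
}
```

The same listener-and-latch pattern works against standalone and Mesos 
masters, which is why the hang points at the k8s submission path rather than 
the launcher API itself.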


> SPIP: Support native submission of spark jobs to a kubernetes cluster
> ---------------------------------------------------------------------
>                 Key: SPARK-18278
>                 URL: https://issues.apache.org/jira/browse/SPARK-18278
>             Project: Spark
>          Issue Type: Umbrella
>          Components: Build, Deploy, Documentation, Kubernetes, Scheduler, 
> Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Erik Erlandson
>            Priority: Major
>              Labels: SPIP
>         Attachments: SPARK-18278 Spark on Kubernetes Design Proposal Revision 
> 2 (1).pdf
> A new Apache Spark sub-project that enables native support for submitting 
> Spark applications to a Kubernetes cluster. The submitted application runs 
> in a driver executing on a Kubernetes pod, and executor lifecycles are also 
> managed as pods.

This message was sent by Atlassian JIRA
