[ https://issues.apache.org/jira/browse/SPARK-24793?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16541990#comment-16541990 ]

Anirudh Ramanathan commented on SPARK-24793:
--------------------------------------------

Good points, Erik. These options are not without precedent; they already exist 
and work for Mesos and standalone mode. I agree that the operator is the 
desired way to build more automation. This item is focused on the end user of 
spark-submit, who currently has to learn two different command-line tools 
(spark-submit and kubectl) to use Spark on k8s effectively.

> Make spark-submit more useful with k8s
> --------------------------------------
>
>                 Key: SPARK-24793
>                 URL: https://issues.apache.org/jira/browse/SPARK-24793
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes
>    Affects Versions: 2.3.0
>            Reporter: Anirudh Ramanathan
>            Assignee: Anirudh Ramanathan
>            Priority: Major
>
> Support controlling the lifecycle of Spark Application through spark-submit. 
> For example:
> {{
>   --kill app_name      If given, kills the driver specified.
>   --status app_name    If given, requests the status of the driver specified.
> }}
> We could also add --list to list all running Spark drivers.
> Given that our submission client can launch jobs into many different 
> namespaces, we'll also need a way to specify the namespace, potentially 
> through a --namespace flag.
> I think this is pretty useful to have instead of forcing a user to use 
> kubectl to manage the lifecycle of any k8s Spark application.
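For illustration, the proposed flags might be used roughly as follows. This is a sketch only: the k8s master URL, namespace, and application name are placeholders, and the exact flag names and semantics are as proposed in the issue, not a shipped interface.

```shell
# Sketch of the proposed lifecycle commands (placeholder cluster, namespace,
# and application names; flags per the proposal above).

# Request the status of a running driver in a given namespace:
spark-submit --master k8s://https://example-cluster:6443 \
  --namespace spark-jobs \
  --status my-spark-app

# Kill the specified driver:
spark-submit --master k8s://https://example-cluster:6443 \
  --namespace spark-jobs \
  --kill my-spark-app
```

As the comment notes, --kill and --status already work in this style for standalone and Mesos masters; the proposal extends the same pattern to k8s://.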



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
