[ https://issues.apache.org/jira/browse/SPARK-18278?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16183960#comment-16183960 ]
Anirudh Ramanathan commented on SPARK-18278:
--------------------------------------------

An update on the plan - 2 days ago, we cut https://github.com/apache-spark-on-k8s/spark/releases/tag/v2.2.0-kubernetes-0.4.0 and updated our documentation. We are planning to use that tagged release as the basis of our PRs to apache/spark in the 2.3 timeframe. https://github.com/apache-spark-on-k8s/spark/issues/441#issuecomment-330802935 is a tentative plan that we have. To bring greater visibility (beyond the discussions in our fork), we are posting here as well. It might make sense for us to create sub-JIRAs and track the following individually:

* Basic scheduler backend components
* Spark Submit changes and basic k8s submission logic
* More K8s-specific submission steps
* Dynamic allocation and shuffle service
* CI and e2e testing
* Distributing docker images via the ASF process
* Documentation

[~rxin] [~felixcheung] [~matei]

> SPIP: Support native submission of spark jobs to a kubernetes cluster
> ---------------------------------------------------------------------
>
>                 Key: SPARK-18278
>                 URL: https://issues.apache.org/jira/browse/SPARK-18278
>             Project: Spark
>          Issue Type: Umbrella
>      Components: Build, Deploy, Documentation, Scheduler, Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Erik Erlandson
>              Labels: SPIP
>     Attachments: SPARK-18278 Spark on Kubernetes Design Proposal Revision 2 (1).pdf
>
> A new Apache Spark sub-project that enables native support for submitting
> Spark applications to a kubernetes cluster. The submitted application runs
> in a driver executing on a kubernetes pod, and executor lifecycles are also
> managed as pods.

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
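For context on what "native submission" looks like in practice, the Spark Submit changes listed above eventually surfaced as a `k8s://` master URL accepted by `spark-submit`. The following is a hedged sketch: the API server address and image name are placeholders, and the exact conf key names shown follow the upstream 2.3-era documentation rather than the 0.4.0 fork tag.

```shell
# Sketch of a cluster-mode submission against a Kubernetes API server.
# The k8s:// URL and the container image are placeholders for illustration.
bin/spark-submit \
  --master k8s://https://kubernetes.example.com:6443 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.executor.instances=2 \
  --conf spark.kubernetes.container.image=registry.example.com/spark:v2.3.0 \
  local:///opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar
```

As the issue description says, the driver runs in a pod created for it, and each executor is likewise scheduled as a pod whose lifecycle Spark manages through the Kubernetes API.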