[ https://issues.apache.org/jira/browse/SPARK-31482?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-31482:
---------------------------------
    Priority: Major  (was: Blocker)

> spark.kubernetes.driver.podTemplateFile Configuration not used by the job
> -------------------------------------------------------------------------
>
>                 Key: SPARK-31482
>                 URL: https://issues.apache.org/jira/browse/SPARK-31482
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 3.0.0
>            Reporter: Pradeep Misra
>            Priority: Major
>
> Spark 3.0 - running spark-submit as below, pointed at a Minikube cluster:
> {code:java}
> bin/spark-submit \
>   --master k8s://https://192.168.99.102:8443 \
>   --deploy-mode cluster \
>   --name spark-pi \
>   --class org.apache.spark.examples.SparkPi \
>   --conf spark.kubernetes.driver.podTemplateFile=../driver_1E.template \
>   --conf spark.kubernetes.executor.podTemplateFile=../executor.template \
>   --conf spark.kubernetes.container.image=spark:spark3 \
>   local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0-preview2.jar 1
> {code}
>
> Spark binaries: spark-3.0.0-preview2-bin-hadoop2.7.tgz
>
> Driver template:
> {code:java}
> apiVersion: v1
> kind: Pod
> metadata:
>   labels:
>     spark-app-id: my-custom-id
>   annotations:
>     spark-driver-cpu: 1
>     spark-driver-mem: 1
>     spark-executor-cpu: 1
>     spark-executor-mem: 1
>     spark-executor-count: 1
> spec:
>   schedulerName: spark-scheduler
> {code}
>
> Executor template:
> {code:java}
> apiVersion: v1
> kind: Pod
> metadata:
>   labels:
>     spark-app-id: my-custom-id
> spec:
>   schedulerName: spark-scheduler
> {code}
>
> Kubernetes pods launched - two executor pods were started, which is the default count, so the template settings were not applied:
> {code:java}
> spark-pi-e608e7718f11cc69-driver   1/1   Running   0   10s
> spark-pi-e608e7718f11cc69-exec-1   1/1   Running   0   5s
> spark-pi-e608e7718f11cc69-exec-2   1/1   Running   0   5s
> {code}

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
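
[Editor's note, not part of the original report: one detail worth checking is that Kubernetes annotation values must be strings (`metadata.annotations` is a `map[string]string`), so unquoted numeric values such as `spark-driver-cpu: 1` can cause the template YAML to fail deserialization, which is one plausible reason a pod template would appear to be ignored. A hedged sketch of the driver template with string-quoted annotation values - the keys and values are taken from the report, only the quoting is changed:]

{code:yaml}
apiVersion: v1
kind: Pod
metadata:
  labels:
    spark-app-id: my-custom-id
  annotations:
    # Annotation values quoted so they parse as strings,
    # as required by the Kubernetes API.
    spark-driver-cpu: "1"
    spark-driver-mem: "1"
    spark-executor-cpu: "1"
    spark-executor-mem: "1"
    spark-executor-count: "1"
spec:
  schedulerName: spark-scheduler
{code}

[Note also that executor count is normally controlled by the Spark conf `spark.executor.instances` rather than by a pod-template annotation, which would explain the default of two executors being launched regardless of the template.]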