skonto removed a comment on issue #25229: [SPARK-27900][K8s] Add jvm oom flag
URL: https://github.com/apache/spark/pull/25229#issuecomment-518748224
 
 
   @dongjoon-hyun one thing I noticed is that if I set:
   ```
   case "$1" in
     driver)
       shift 1
       VERBOSE_FLAG=$(get_verbose_flag)
       CMD=(
         "$SPARK_HOME/bin/spark-submit"
         --conf spark.driver.extraJavaOptions="$DEFAULT_DRIVER_JVM_OPTIONS"
   ```
   
    Then by default the user will not be able to pass any `spark.driver.extraJavaOptions` settings, since his properties end up in a properties file, which has lower priority than the `--conf spark.driver.extraJavaOptions` value set above. So `spark.kubernetes.driverEnv.DEFAULT_DRIVER_JVM_OPTIONS=" "` will not work:
   
   
    ```
    Spark properties used, including those specified through
     --conf and those from the properties file /opt/spark/conf/spark.properties:
      (spark.app.id,spark-26132ad61a7b467995596f32f23b8794)
      (spark.testing,false)
      (spark.kubernetes.driver.pod.name,spark-test-app-5fa51e6b917340c783d3c8ef305c1a4a)
      (spark.kubernetes.driver.label.spark-app-locator,1f06190a6a2548b9b66f32ece1c4b923)
      (spark.authenticate,true)
      (spark.kubernetes.driverEnv.DEFAULT_DRIVER_JVM_OPTIONS,)
      (spark.kubernetes.executor.label.spark-app-locator,1f06190a6a2548b9b66f32ece1c4b923)
      (spark.kubernetes.submission.waitAppCompletion,false)
      (spark.kubernetes.driverEnv.DRIVER_VERBOSE,true)
      (spark.submit.pyFiles,)
      (spark.kubernetes.namespace,spark)
      (spark.kubernetes.authenticate.driver.serviceAccountName,spark-sa)
      (spark.kubernetes.submitInDriver,true)
      (spark.driver.host,spark-test-app-a1b3276c67d01ca3-driver-svc.spark.svc)
      (spark.kubernetes.memoryOverheadFactor,0.1)
      (spark.app.name,spark-test-app)
      (spark.driver.blockManager.port,7079)
      (spark.ui.enabled,true)
      (spark.driver.extraJavaOptions, )
      (spark.kubernetes.resource.type,java)
      (spark.jars,local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0-SNAPSHOT.jar)
      (spark.submit.deployMode,cluster)
      (spark.executors.instances,1)
      (spark.kubernetes.container.image,skonto/spark:oom2)
      (spark.master,k8s://https://192.168.2.8:8443/)
      (spark.driver.port,7078)
      (spark.kubernetes.executor.podNamePrefix,spark-test-app-a1b3276c67d01ca3)
      (spark.driver.bindAddress,172.17.0.4)
      (spark.executor.cores,1)
      (spark.kubernetes.container.image.pullPolicy,Always)

    19/08/06 16:43:15 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Main class:
    org.apache.spark.examples.DriverSubmissionTest
    Arguments:
    5
    Spark config:
    (spark.kubernetes.executor.label.spark-app-locator,1f06190a6a2548b9b66f32ece1c4b923)
    (spark.kubernetes.submission.waitAppCompletion,false)
    (spark.driver.host,spark-test-app-a1b3276c67d01ca3-driver-svc.spark.svc)
    (spark.kubernetes.namespace,spark)
    (spark.testing,false)
    (spark.driver.port,7078)
    (spark.executors.instances,1)
    (spark.jars,local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0-SNAPSHOT.jar,local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0-SNAPSHOT.jar)
    (spark.kubernetes.driverEnv.DEFAULT_DRIVER_JVM_OPTIONS,)
    (spark.driver.blockManager.port,7079)
    (spark.ui.enabled,true)
    (spark.app.name,spark-test-app)
    (spark.kubernetes.submitInDriver,true)
    (spark.kubernetes.driverEnv.DRIVER_VERBOSE,true)
    (spark.submit.pyFiles,)
    (spark.kubernetes.memoryOverheadFactor,0.1)
    (spark.driver.bindAddress,172.17.0.4)
    (spark.kubernetes.container.image.pullPolicy,Always)
    (spark.kubernetes.resource.type,java)
    (spark.kubernetes.container.image,skonto/spark:oom2)
    (spark.driver.extraJavaOptions, )
    (spark.submit.deployMode,client)
    (spark.master,k8s://https://192.168.2.8:8443/)
    (spark.kubernetes.driver.label.spark-app-locator,1f06190a6a2548b9b66f32ece1c4b923)
    (spark.kubernetes.authenticate.driver.serviceAccountName,spark-sa)
    (spark.kubernetes.executor.podNamePrefix,spark-test-app-a1b3276c67d01ca3)
    (spark.authenticate,true)
    (spark.executor.cores,1)
    (spark.repl.local.jars,local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0-SNAPSHOT.jar)
    (spark.app.id,spark-26132ad61a7b467995596f32f23b8794)
    (spark.kubernetes.driver.pod.name,spark-test-app-5fa51e6b917340c783d3c8ef305c1a4a)
    ```
    Initially the user options are applied, but they are then overridden, so I need to detect whether the user supplies empty defaults and unset them.
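    A minimal sketch of that detection (the function and variable names here are illustrative, not the actual `entrypoint.sh` change): trim the env var and only emit the `--conf` when something non-blank remains, so a blank `DEFAULT_DRIVER_JVM_OPTIONS` leaves the user's properties-file `spark.driver.extraJavaOptions` in effect instead of shadowing it with an empty value:

    ```shell
    #!/usr/bin/env bash
    # Hypothetical sketch of the "unset empty defaults" logic; names are
    # illustrative and not taken from the real entrypoint.sh.

    # Print the --conf argument for the driver JVM options, or nothing at all
    # if the user supplied an empty/blank DEFAULT_DRIVER_JVM_OPTIONS in order
    # to disable the baked-in defaults.
    build_driver_jvm_conf() {
      local opts="$1" trimmed
      # Pure-bash whitespace trim, so a value of " " counts as empty.
      trimmed="${opts#"${opts%%[![:space:]]*}"}"
      trimmed="${trimmed%"${trimmed##*[![:space:]]}"}"
      if [ -n "$trimmed" ]; then
        printf -- '--conf spark.driver.extraJavaOptions=%s' "$trimmed"
      fi
    }
    ```

    With this, `build_driver_jvm_conf ' '` prints nothing, so `spark-submit` never sees an empty `--conf` that would override the properties file, while a non-blank value is passed through unchanged.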
