skonto commented on issue #25870: [SPARK-27936][K8S] support python deps
URL: https://github.com/apache/spark/pull/25870#issuecomment-541001902
 
 
  @holdenk this is because spark-submit adds the resource to `spark.jars`:
   ```
    19/10/11 13:01:30 WARN Utils: Your hostname, universe resolves to a loopback address: 127.0.1.1; using 192.168.2.4 instead (on interface wlp2s0)
    19/10/11 13:01:30 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
   Parsed arguments:
     master                  k8s://https://10.96.0.1:443
     deployMode              cluster
     executorMemory          1G
     executorCores           null
     totalExecutorCores      null
     propertiesFile          null
     driverMemory            1G
     driverCores             null
     driverExtraClassPath    null
     driverExtraLibraryPath  null
     driverExtraJavaOptions  null
     supervise               false
     queue                   null
     numExecutors            2
     files                   null
     pyFiles                 null
     archives                null
     mainClass               org.apache.spark.examples.SparkPi
      primaryResource         local:///opt/spark/examples/jars/spark-examples_2.12-2.1.2-2.4.4-lightbend.jar
     name                    spark-pi
     childArgs               [100]
     jars                    null
     packages                null
     packagesExclusions      null
     repositories            null
     verbose                 true
   
   Spark properties used, including those specified through
    --conf and those from the properties file null:
     (spark.kubernetes.driver.pod.name,spark-pi-driver)
     (spark.executor.instances,2)
     (spark.driver.memory,1G)
     (spark.executor.memory,1G)
     (spark.kubernetes.authenticate.driver.serviceAccountName,spark-sa)
      (spark.kubernetes.namespace,spark)
      (spark.kubernetes.container.image,lightbend/spark:2.1.2-OpenShift-2.4.4-rh-2.12)
      (spark.kubernetes.container.image.pullPolicy,Always)

    19/10/11 13:01:31 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   Main class:
   org.apache.spark.deploy.k8s.submit.KubernetesClientApplication
   Arguments:
    --primary-java-resource
    local:///opt/spark/examples/jars/spark-examples_2.12-2.1.2-2.4.4-lightbend.jar
   --main-class
   org.apache.spark.examples.SparkPi
   --arg
   100
   Spark config:
    (spark.kubernetes.namespace,spark)
    (spark.jars,local:///opt/spark/examples/jars/spark-examples_2.12-2.1.2-2.4.4-lightbend.jar)
   (spark.app.name,spark-pi)
   (spark.driver.memory,1G)
   (spark.executor.instances,2)
   (spark.submit.pyFiles,)
    (spark.kubernetes.container.image.pullPolicy,Always)
    (spark.kubernetes.container.image,lightbend/spark:2.1.2-OpenShift-2.4.4-rh-2.12)
   (spark.submit.deployMode,cluster)
   (spark.master,k8s://https://10.96.0.1:443)
   (spark.kubernetes.authenticate.driver.serviceAccountName,spark-sa)
   (spark.executor.memory,1G)
   (spark.kubernetes.driver.pod.name,spark-pi-driver)
   Classpath elements:
   ```
   So since the PR [here](https://github.com/apache/spark/pull/23546) already manages that property, we don't need to do the work twice. For Python it's different.
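   The effect described above — spark-submit folding the primary JAR into `spark.jars` without duplicating it — can be sketched roughly like this (illustrative Python only, not Spark's actual Scala implementation; the function name is hypothetical):

   ```python
   def merge_primary_resource(conf: dict, primary_resource: str) -> dict:
       """Illustrative sketch: fold the primary resource into the
       comma-separated spark.jars property, skipping it if already
       present -- mirroring what the verbose log above shows."""
       jars = [j for j in conf.get("spark.jars", "").split(",") if j]
       if primary_resource not in jars:
           jars.append(primary_resource)
       conf["spark.jars"] = ",".join(jars)
       return conf

   conf = {"spark.app.name": "spark-pi"}
   merge_primary_resource(
       conf,
       "local:///opt/spark/examples/jars/spark-examples_2.12-2.1.2-2.4.4-lightbend.jar",
   )
   print(conf["spark.jars"])
   # → local:///opt/spark/examples/jars/spark-examples_2.12-2.1.2-2.4.4-lightbend.jar
   ```

   Because the merge is idempotent, handling it in one place (as #23546 does) is enough; running the same logic again in the K8s code path would be redundant.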
