Oleg Frenkel created SPARK-26789:
------------------------------------

             Summary: [k8s] pyspark needs to upload local resources to driver 
and executor pods
                 Key: SPARK-26789
                 URL: https://issues.apache.org/jira/browse/SPARK-26789
             Project: Spark
          Issue Type: New Feature
          Components: Kubernetes, PySpark
    Affects Versions: 2.4.0
            Reporter: Oleg Frenkel


Kubernetes support provided with [https://github.com/apache-spark-on-k8s/spark] 
allows local dependencies to be used in cluster deploy mode. Specifically, the 
Resource Staging Server uploads local dependencies to Kubernetes so that the 
driver and executor pods can download them. The Spark 2.4.0 release does not 
appear to support local dependencies. 

For example, the following command is expected to automatically upload pi.py 
from the local machine to the Kubernetes cluster and make it available to both 
the driver and executor pods:

{code}
bin/spark-submit \
  --master k8s://http://127.0.0.1:8001 \
  --deploy-mode cluster \
  --conf spark.app.name=example.python.pi \
  --conf spark.kubernetes.container.image=spark-py:spark-2.4.0 \
  ./examples/src/main/python/pi.py
{code}
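Until local-resource upload is supported, one possible workaround is to make 
the application file reachable from inside the cluster and reference it by a 
remote URL, since Spark on Kubernetes can download remote dependencies (e.g. 
HTTP or HDFS) into the pods. This is a sketch only; the file-server hostname 
below is hypothetical:

{code}
# Workaround sketch: host pi.py at a location the driver and executor pods
# can reach, then pass the remote URL instead of a local path.
# (example-fileserver.internal is a hypothetical in-cluster HTTP server.)
bin/spark-submit \
  --master k8s://http://127.0.0.1:8001 \
  --deploy-mode cluster \
  --conf spark.app.name=example.python.pi \
  --conf spark.kubernetes.container.image=spark-py:spark-2.4.0 \
  http://example-fileserver.internal/pi.py
{code}

Alternatively, the script can be baked into the container image itself and 
referenced with a {{local://}} URI, which tells Spark the file is already 
present inside the image.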

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
