Github user mccheah commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21462#discussion_r191948799
  
    --- Diff: docs/running-on-kubernetes.md ---
    @@ -121,8 +121,8 @@ This URI is the location of the example jar that is already in the Docker image.
     
     If your application's dependencies are all hosted in remote locations like HDFS or HTTP servers, they may be referred to
     by their appropriate remote URIs. Also, application dependencies can be pre-mounted into custom-built Docker images.
    -Those dependencies can be added to the classpath by referencing them with `local://` URIs and/or setting the
    -`SPARK_EXTRA_CLASSPATH` environment variable in your Dockerfiles. The `local://` scheme is also required when referring to
    --- End diff ---
    
    You should be able to specify dependencies with `local://` URIs. However,
    in order for the image itself to provide additional jars without having to
    list them in spark-submit, you have to put the jars in the same directory
    as the rest of the Spark distribution jars. It would be good to have an
    API or an environment variable that can point to additional jars.
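    
    For illustration, a minimal sketch of how this reads today, assuming an
    image built from the bundled Dockerfile (`SPARK_HOME=/opt/spark`); the
    image name, jar paths, and application class below are hypothetical:
    
    ```bash
    # Hypothetical example: the dependency jar is assumed to be baked into the
    # custom image at /opt/spark/extra/my-dep.jar, and the application jar at
    # /opt/spark/app/my-app.jar. Both are referenced with local:// URIs so
    # nothing needs to be uploaded at submission time.
    bin/spark-submit \
      --master k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port> \
      --deploy-mode cluster \
      --name my-app \
      --class com.example.MyApp \
      --conf spark.kubernetes.container.image=<my-custom-image> \
      --jars local:///opt/spark/extra/my-dep.jar \
      local:///opt/spark/app/my-app.jar
    ```
    
    If the extra jars are instead copied next to the Spark distribution jars
    (`/opt/spark/jars` in images built from the bundled Dockerfile), they end
    up on the classpath without being listed in spark-submit at all;
    `SPARK_EXTRA_CLASSPATH` in the Dockerfile is the other knob this doc
    section mentions for adding entries to the classpath.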


---
