GitHub user skonto opened a pull request:

    https://github.com/apache/spark/pull/21378

    [SPARK-24326][mesos] add support for local:// scheme for the app jar

    ## What changes were proposed in this pull request?
    
    Adds support for the local:// scheme, as in the k8s case, for image-based deployments where the application jar is already in the image. Affects cluster mode and the Mesos dispatcher, mimicking the k8s approach. Also covers the file:// scheme. The default behaviour is unchanged: jar resolution still happens on the host.
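    
    As a rough usage sketch (not taken from this PR: the dispatcher URL is a placeholder; the image tag and jar path match the test setup below), a plain spark-submit against a Mesos dispatcher in cluster mode would look roughly like this, with spark.mesos.appJar.local.resolution.mode=container telling Spark that the local:// jar lives inside the container image:
    
    ```
    # Sketch only: the dispatcher URL is a placeholder. With resolution mode
    # "container" the local:// path is resolved inside the image; the default
    # ("host") resolves it on the host as before.
    spark-submit \
      --master mesos://dispatcher.example.com:7077 \
      --deploy-mode cluster \
      --conf spark.mesos.appJar.local.resolution.mode=container \
      --conf spark.mesos.executor.docker.image=skonto/spark-local:test \
      --class org.apache.spark.examples.SparkPi \
      local:///spark-examples_2.11-2.2.1.jar
    ```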
    
    ## How was this patch tested?
    
    Dispatcher image with the patch, used to start the DC/OS Spark service:
    skonto/spark-local-disp:test
    
    Test image with my application jar located in the root folder:
    skonto/spark-local:test
    
    Dockerfile for that image:
    
    ```
    FROM mesosphere/spark:2.3.0-2.2.1-2-hadoop-2.6
    COPY spark-examples_2.11-2.2.1.jar /
    WORKDIR /opt/spark/dist
    ```
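    
    For completeness, an image like this can be built and pushed with standard Docker commands (tag shown is the one used in the tests; adjust for your own registry):
    
    ```
    # Build the test image from the Dockerfile above and push it so the
    # Mesos agents can pull it.
    docker build -t skonto/spark-local:test .
    docker push skonto/spark-local:test
    ```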
    
    Tests:
    
    The following work as expected:
    
    * normal example using local://
    ```
    dcos spark run --submit-args="--conf spark.mesos.appJar.local.resolution.mode=container
      --conf spark.executor.memory=1g
      --conf spark.mesos.executor.docker.image=skonto/spark-local:test
      --conf spark.executor.cores=2 --conf spark.cores.max=8
      --class org.apache.spark.examples.SparkPi
      local:///spark-examples_2.11-2.2.1.jar"
    ```
    
    * make sure the flag does not affect other URIs
    ```
    dcos spark run --submit-args="--conf spark.mesos.appJar.local.resolution.mode=container
      --conf spark.executor.memory=1g
      --conf spark.executor.cores=2 --conf spark.cores.max=8
      --class org.apache.spark.examples.SparkPi
      https://s3-eu-west-1.amazonaws.com/fdp-stavros-test/spark-examples_2.11-2.1.1.jar"
    ```
    
    * normal example without local://
    ```
    dcos spark run --submit-args="--conf spark.executor.memory=1g
      --conf spark.executor.cores=2 --conf spark.cores.max=8
      --class org.apache.spark.examples.SparkPi
      https://s3-eu-west-1.amazonaws.com/fdp-stavros-test/spark-examples_2.11-2.1.1.jar"
    ```
    
    The following fails:
    
     * uses local:// without setting the flag; the default is host resolution.
    ```
    dcos spark run --submit-args="--conf spark.executor.memory=1g
      --conf spark.mesos.executor.docker.image=skonto/spark-local:test
      --conf spark.executor.cores=2 --conf spark.cores.max=8
      --class org.apache.spark.examples.SparkPi
      local:///spark-examples_2.11-2.2.1.jar"
    ```
    
![image](https://user-images.githubusercontent.com/7945591/40283021-8d349762-5c80-11e8-9d62-2a61a4318fd5.png)
    
    


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/skonto/spark local-upstream

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/21378.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #21378
    
----
commit e379d92768c6f0ed3eb7f359f9bdbd2313a1705e
Author: Stavros Kontopoulos <stavros.kontopoulos@...>
Date:   2018-05-20T19:50:20Z

    add support for local:// scheme for app jar

----


---
