GitHub user ramaddepally opened a pull request:

    https://github.com/apache/spark/pull/23053

    [SPARK-25957][K8S] Add ability to skip building optional k8s docker images

    ## What changes were proposed in this pull request?
    bin/docker-image-tool.sh tries to build all Docker images (JVM, PySpark,
    and SparkR) by default, but not all Spark distributions are built with
    SparkR, so the script fails on those distributions.
    
    With this change:
    - Building the optional Docker images (PySpark and SparkR) can be
      skipped by passing the -pskip or -Rskip flags.
    - The script autodetects when SparkR is not included in the build and
      skips building the SparkR image.
    - Pushing skips any Docker image that is not available locally.
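
    As a rough illustration of the flag handling and autodetection described
    above, the pattern could look like the sketch below. This is not the
    actual docker-image-tool.sh code: the option names follow the
    description, and the SPARK_HOME/R/lib detection path is an assumption
    made for this sketch.

```shell
#!/usr/bin/env bash
# Sketch only -- not the real docker-image-tool.sh. Passing "skip" as the
# value of -p or -R disables building that optional image.
parse_opts() {
  SKIP_PYSPARK=false
  SKIP_SPARKR=false
  local OPTIND opt  # reset getopts state so the function is reusable
  while getopts ":r:t:p:R:" opt "$@"; do
    case "$opt" in
      r) REPO="$OPTARG" ;;
      t) TAG="$OPTARG" ;;
      p) if [ "$OPTARG" = "skip" ]; then SKIP_PYSPARK=true; fi ;;
      R) if [ "$OPTARG" = "skip" ]; then SKIP_SPARKR=true; fi ;;
    esac
  done
}

# Autodetection: if the distribution carries no SparkR bindings, skip the
# SparkR image. The R/lib path is a hypothetical marker for this sketch.
detect_sparkr() {
  [ -d "${SPARK_HOME:-/opt/spark}/R/lib" ] || SKIP_SPARKR=true
}

parse_opts -r myrepo -t v1 -pskip -Rskip build
echo "skip pyspark=$SKIP_PYSPARK skip sparkr=$SKIP_SPARKR"
# prints: skip pyspark=true skip sparkr=true
```

    Reusing the existing -p/-R options with a sentinel value keeps the
    interface small, at the cost of overloading their meaning.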
    
    ## How was this patch tested?
    
    Tested the following scenarios:
    - On a source tree and distribution with SparkR support:
      - Run bin/docker-image-tool.sh -r <repo> -t <tag> build. Verify that
        the JVM, PySpark, and SparkR Docker images are built.
      - Run bin/docker-image-tool.sh -r <repo> -t <tag> -Rskip -pskip build.
        Verify that only the JVM image is built; building the PySpark and
        SparkR images is skipped.
    - On a source tree and distribution without SparkR support:
      - Run bin/docker-image-tool.sh -r <repo> -t <tag> build. Verify that
        only the JVM and PySpark images are built; building the SparkR
        image is skipped.
    - On a system with the JVM, PySpark, and SparkR images built:
      - Run bin/docker-image-tool.sh -r <repo> -t <tag> push. Verify that
        all images are pushed to the Docker registry.
    - On a system with only the JVM and PySpark images built:
      - Run bin/docker-image-tool.sh -r <repo> -t <tag> push. Verify that
        only the JVM and PySpark images are pushed; pushing the SparkR
        image is skipped.
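
    The push-side behavior exercised in the last two scenarios (skipping
    images that were never built) could be sketched with a local-existence
    check like the one below; the helper name and image name are
    illustrative, not the script's actual code.

```shell
#!/usr/bin/env bash
# Sketch only: push an image tag only when it exists in the local Docker
# daemon; otherwise report that the push is skipped.
push_if_present() {
  local image="$1"
  if docker image inspect "$image" >/dev/null 2>&1; then
    docker push "$image"
  else
    echo "Skipping push of $image: image not found locally"
  fi
}

push_if_present "myrepo/spark-r:v1"
```

    `docker image inspect` exits non-zero when the image is absent, which
    makes it a convenient guard before `docker push`.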


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/ramaddepally/spark SPARK-25957

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/23053.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #23053
    
----
commit ae1d74376b7daa86cd44361553e48f6d508e7ae0
Author: Nagaram Prasad Addepally <ram@...>
Date:   2018-11-15T23:51:04Z

    [SPARK-25957][K8S] Add ability to skip building optional k8s docker images
    
    bin/docker-image-tool.sh tries to build all Docker images (JVM, PySpark,
    and SparkR) by default, but not all Spark distributions are built with
    SparkR, so the script fails on those distributions.
    
    With this change:
    - Building the optional Docker images (PySpark and SparkR) can be
      skipped by passing the -pskip or -Rskip flags.
    - The script autodetects when SparkR is not included in the build and
      skips building the SparkR image.
    - Pushing skips any Docker image that is not available locally.
    
    Tested the following scenarios:
    - On a source tree and distribution with SparkR support:
      - Run bin/docker-image-tool.sh -r <repo> -t <tag> build. Verify that
        the JVM, PySpark, and SparkR Docker images are built.
      - Run bin/docker-image-tool.sh -r <repo> -t <tag> -Rskip -pskip build.
        Verify that only the JVM image is built; building the PySpark and
        SparkR images is skipped.
    - On a source tree and distribution without SparkR support:
      - Run bin/docker-image-tool.sh -r <repo> -t <tag> build. Verify that
        only the JVM and PySpark images are built; building the SparkR
        image is skipped.
    - On a system with the JVM, PySpark, and SparkR images built:
      - Run bin/docker-image-tool.sh -r <repo> -t <tag> push. Verify that
        all images are pushed to the Docker registry.
    - On a system with only the JVM and PySpark images built:
      - Run bin/docker-image-tool.sh -r <repo> -t <tag> push. Verify that
        only the JVM and PySpark images are pushed; pushing the SparkR
        image is skipped.

----


