Re: Spark docker image convention proposal for docker repository

2021-08-21 Thread Mich Talebzadeh
Hi all. These docker images are created on Spark 3.1.1. The reason I have chosen this version for now is that most production offerings (for example Google Dataproc) are based on 3.1.1. So with 3.1.1 we have Scala 2.12 with Java version 11-jre-slim OR Java version 8-jre-slim, all currently on

Re: Spark docker image convention proposal for docker repository

2021-08-21 Thread Mich Talebzadeh
Apologies, ignore the first line spark/spark-py (that is redundant). These are correct:

REPOSITORY    TAG                                  IMAGE ID      CREATED        SIZE
spark/spark   3.1.1-scala_2.12-11-jre-slim-buster  71ff5ed3ca03  9 seconds ago  635MB
openjdk       8-jre-slim

Re: Spark docker image convention proposal for docker repository

2021-08-21 Thread Mich Talebzadeh
Sorry, there was a typo:

BASE_OS="buster"
SPARK_VERSION="3.1.1"
SCALA_VERSION="scala_2.12"
DOCKERFILE="Dockerfile"
DOCKERIMAGETAG="11-jre-slim"

# Building Docker image from provided Dockerfile base 11
cd $SPARK_HOME
/opt/spark/bin/docker-image-tool.sh \
  -r spark -t

Re: Spark docker image convention proposal for docker repository

2021-08-21 Thread Mich Talebzadeh
Hi Ankit. Sure, I suppose that elaboration on the OS base can be added:

BASE_OS="buster"
SPARK_VERSION="3.1.1"
SCALA_VERSION="scala_2.12"
DOCKERFILE="Dockerfile"
DOCKERIMAGETAG="11-jre-slim"

# Building Docker image from provided Dockerfile base 11
cd $SPARK_HOME
/opt/spark/bin/docker-image-tool.sh \
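The invocation above is truncated in the archive, but the variables shown, together with the `3.1.1-scala_2.12-11-jre-slim-buster` tag that appears in the `docker images` listing earlier in the thread, suggest how the tag is composed. The following is a minimal sketch of that composition only; the final `docker-image-tool.sh` command line is an assumption (the original message cuts off), and it is echoed rather than executed here:

```shell
#!/usr/bin/env bash
# Variables as given in the message.
BASE_OS="buster"
SPARK_VERSION="3.1.1"
SCALA_VERSION="scala_2.12"
DOCKERIMAGETAG="11-jre-slim"

# Assumed tag convention, matching the listing in the thread:
#   <spark-version>-<scala-version>-<java-base>-<os>
TAG="${SPARK_VERSION}-${SCALA_VERSION}-${DOCKERIMAGETAG}-${BASE_OS}"
echo "$TAG"

# Hypothetical full build command (printed, not run; the original was
# truncated after "-r spark -t"):
echo "/opt/spark/bin/docker-image-tool.sh -r spark -t $TAG build"
```

Running this prints the tag `3.1.1-scala_2.12-11-jre-slim-buster`, which agrees with the image listing posted earlier in the thread.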

Re: Spark docker image convention proposal for docker repository

2021-08-21 Thread Ankit Gupta
Hey Mich. Adding to what Mich is suggesting, how about having the base OS version in the image tag as well, like:

3.1.2-scala_2.12-java11-slim
3.1.2_sparkpy-scala_2.12-java11-buster
3.1.2_sparkR-scala_2.12-java11-slim

Regards, Ankit Prakash Gupta info.ank...@gmail.com LinkedIn :
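The three sample tags above follow one pattern: an optional variant suffix on the Spark version, then Scala version, Java base, and OS. A small sketch of that pattern, where the `make_tag` helper is hypothetical and introduced only for illustration:

```shell
#!/usr/bin/env bash
# Sketch of the proposed tag convention from this message:
#   <spark-version>[_<variant>]-<scala-version>-java<N>-<os>
# make_tag is a hypothetical helper, not part of any Spark tooling.
make_tag() {
  local version="$1" variant="$2" scala="$3" java="$4" os="$5"
  local prefix="$version"
  if [ -n "$variant" ]; then
    prefix="${version}_${variant}"
  fi
  echo "${prefix}-${scala}-java${java}-${os}"
}

# Reproduce the three examples given in the message.
make_tag "3.1.2" ""        "scala_2.12" 11 "slim"
make_tag "3.1.2" "sparkpy" "scala_2.12" 11 "buster"
make_tag "3.1.2" "sparkR"  "scala_2.12" 11 "slim"
```

Keeping every component in a fixed order like this makes tags sortable and easy to parse, which is presumably the point of the proposal.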