Hey Mich,

Adding to what Mich is suggesting, how about including the base OS in the
image tag as well, e.g.

<PRODUCT_VERSION>-<SCALA_VERSION>-<JAVA_VERSION>-<BASE_OS>

3.1.2-scala_2.12-java11-slim
3.1.2_sparkpy-scala_2.12-java11-buster
3.1.2_sparkR-scala_2.12-java11-slim
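
For illustration, a pull command or Dockerfile reference using such a tag
might look like the following (the repository name apache/spark is only a
placeholder, assuming that is where the images would be published):

# 'apache/spark' is a hypothetical repository name for illustration
docker pull apache/spark:3.1.2-scala_2.12-java11-slim
FROM apache/spark:3.1.2_sparkpy-scala_2.12-java11-buster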

Regards.

Ankit Prakash Gupta
info.ank...@gmail.com
LinkedIn : https://www.linkedin.com/in/infoankitp/
Medium: https://medium.com/@info.ankitp

On Mon, Aug 16, 2021 at 5:39 PM Mich Talebzadeh <mich.talebza...@gmail.com>
wrote:

> Hi,
>
> I propose that for Spark Docker images we follow a naming convention
> similar to Flink's <https://hub.docker.com/_/flink>, as shown in the
> attached file.
>
> So for Spark we will have
>
>
> <PRODUCT_VERSION>-<SCALA_VERSION>-<JAVA_VERSION>
>
> 3.1.2-scala_2.12-java11
> 3.1.2_sparkpy-scala_2.12-java11
> 3.1.2_sparkR-scala_2.12-java11
>
>
> If this makes sense, please respond; otherwise, state your preference.
>
>
> HTH
>
>
> *Disclaimer:* Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>