[
https://issues.apache.org/jira/browse/SPARK-48092?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Dongjoon Hyun resolved SPARK-48092.
-----------------------------------
Resolution: Not A Problem
Please use the latest version, Apache Spark 3.5.4, which ships with Python 3.10.12. As the checks below show, every 3.5.x Python image already includes Python 3.10.12:
{code}
$ docker run -it --rm spark:3.5.0-java17-python3 python3 --version
Python 3.10.12
$ docker run -it --rm spark:3.5.1-java17-python3 python3 --version
Python 3.10.12
$ docker run -it --rm spark:3.5.2-java17-python3 python3 --version
Python 3.10.12
$ docker run -it --rm spark:3.5.3-java17-python3 python3 --version
Python 3.10.12
$ docker run -it --rm spark:3.5.4-java17-python3 python3 --version
Python 3.10.12
{code}
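For completeness, one can also confirm which interpreter the PySpark driver and the executors actually pick up inside one of the containers above. The snippet below is only an illustrative sketch (it assumes a pyspark shell or spark-submit run in, e.g., the spark:3.5.4-java17-python3 image shown above), not part of the original resolution:
{code}
import sys

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Interpreter used by the driver process
print("driver:", sys.version)

# Interpreter used by the executors: run a trivial task and report its Python version
print("executor:",
      spark.sparkContext.parallelize([0], 1).map(lambda _: sys.version).first())

spark.stop()
{code}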
> Spark images are based on Python 3.8, which will soon be EOL
> -------------------------------------------------------------
>
> Key: SPARK-48092
> URL: https://issues.apache.org/jira/browse/SPARK-48092
> Project: Spark
> Issue Type: Bug
> Components: Spark Docker
> Affects Versions: 3.4.0, 3.4.1, 3.4.2, 3.5.0, 3.5.1
> Reporter: Mayur Madnani
> Priority: Blocker
> Attachments: Screenshot 2024-05-02 at 21.00.18.png, Screenshot 2024-05-02 at 21.00.48.png
>
>
> Python 3.8 will be EOL in October 2024, and all of the Spark Docker images
> are currently based on Python 3.8.
> I am proposing to use Python 3.10 as the default. Let me know if I can pick
> this up and make the changes in
> [spark-docker|https://github.com/apache/spark-docker].