This is an automated email from the ASF dual-hosted git repository.
ruifengz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
     new c4f62d459f24 [SPARK-55191][INFRA][DOCS] Adjust the Python version used in the Python-only daily test
c4f62d459f24 is described below
commit c4f62d459f245fa391a5c7361ef2a434f8b869df
Author: yangjie01 <[email protected]>
AuthorDate: Mon Jan 26 15:44:44 2026 +0800
    [SPARK-55191][INFRA][DOCS] Adjust the Python version used in the Python-only daily test
### What changes were proposed in this pull request?
This PR aims to adjust the Python versions used in the Python-only daily tests:
1. Switch the daily tests that use `python_hosted_runner_test.yml` to Python 3.12, including the Python ARM test and the Python macOS 26 test.
2. Update the Python classic-only test to Python 3.12 and modify the Dockerfile generation process accordingly.
3. Move the Python-only test from Python 3.12 to Python 3.11, so that at least one Python-only daily test still validates Python 3.11.
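
Each of these workflows passes a `PYSPARK_IMAGE_TO_TEST` / `PYTHON_TO_TEST` pair that must name the same interpreter version. A minimal sketch of such a consistency check (a hypothetical helper, not part of this change) could look like:

```python
import json

def check_env_pairing(envs_json: str) -> bool:
    """Check that PYSPARK_IMAGE_TO_TEST and PYTHON_TO_TEST agree on the
    Python version, e.g. image "python-312" pairs with "python3.12"."""
    envs = json.loads(envs_json)
    # "python-312-classic-only" -> "312" (version is the second dash-separated field)
    image_ver = envs["PYSPARK_IMAGE_TO_TEST"].split("-")[1]
    # "python3.12" -> "312"
    python_ver = envs["PYTHON_TO_TEST"].removeprefix("python").replace(".", "")
    return image_ver == python_ver

# The pairing used by the renamed build_python_3.11.yml workflow
print(check_env_pairing(
    '{"PYSPARK_IMAGE_TO_TEST": "python-311", "PYTHON_TO_TEST": "python3.11"}'
))  # → True
```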
### Why are the changes needed?
To adjust the Python versions used in the Python-only daily tests so that both Python 3.11 and Python 3.12 remain covered.
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
- Pass GitHub Actions
### Was this patch authored or co-authored using generative AI tooling?
No
Closes #53973 from LuciferYang/SPARK-55191.
Authored-by: yangjie01 <[email protected]>
Signed-off-by: Ruifeng Zheng <[email protected]>
---
.github/workflows/build_infra_images_cache.yml | 22 +++++++++++-----------
...build_python_3.12.yml => build_python_3.11.yml} | 6 +++---
...thon_3.11_arm.yml => build_python_3.12_arm.yml} | 2 +-
...only.yml => build_python_3.12_classic_only.yml} | 6 +++---
...1_macos26.yml => build_python_3.12_macos26.yml} | 2 +-
.github/workflows/python_hosted_runner_test.yml | 2 +-
README.md | 10 +++++-----
.../Dockerfile | 20 ++++++++++----------
8 files changed, 35 insertions(+), 35 deletions(-)
diff --git a/.github/workflows/build_infra_images_cache.yml b/.github/workflows/build_infra_images_cache.yml
index d018d50ed1e3..210e413bbbc5 100644
--- a/.github/workflows/build_infra_images_cache.yml
+++ b/.github/workflows/build_infra_images_cache.yml
@@ -36,8 +36,8 @@ on:
- 'dev/spark-test-image/pypy-311/Dockerfile'
- 'dev/spark-test-image/python-310/Dockerfile'
- 'dev/spark-test-image/python-311/Dockerfile'
- - 'dev/spark-test-image/python-311-classic-only/Dockerfile'
- 'dev/spark-test-image/python-312/Dockerfile'
+ - 'dev/spark-test-image/python-312-classic-only/Dockerfile'
- 'dev/spark-test-image/python-312-pandas-3/Dockerfile'
- 'dev/spark-test-image/python-313/Dockerfile'
- 'dev/spark-test-image/python-314/Dockerfile'
@@ -193,19 +193,19 @@ jobs:
       - name: Image digest (PySpark with Python 3.11)
         if: hashFiles('dev/spark-test-image/python-311/Dockerfile') != ''
         run: echo ${{ steps.docker_build_pyspark_python_311.outputs.digest }}
-      - name: Build and push (PySpark Classic Only with Python 3.11)
-        if: hashFiles('dev/spark-test-image/python-311-classic-only/Dockerfile') != ''
-        id: docker_build_pyspark_python_311_classic_only
+      - name: Build and push (PySpark Classic Only with Python 3.12)
+        if: hashFiles('dev/spark-test-image/python-312-classic-only/Dockerfile') != ''
+        id: docker_build_pyspark_python_312_classic_only
         uses: docker/build-push-action@v6
         with:
-          context: ./dev/spark-test-image/python-311-classic-only/
+          context: ./dev/spark-test-image/python-312-classic-only/
           push: true
-          tags: ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-python-311-classic-only-cache:${{ github.ref_name }}-static
-          cache-from: type=registry,ref=ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-python-311-classic-only-cache:${{ github.ref_name }}
-          cache-to: type=registry,ref=ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-python-311-classic-only-cache:${{ github.ref_name }},mode=max
-      - name: Image digest (PySpark Classic Only with Python 3.11)
-        if: hashFiles('dev/spark-test-image/python-311-classic-only/Dockerfile') != ''
-        run: echo ${{ steps.docker_build_pyspark_python_311_classic_only.outputs.digest }}
+          tags: ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-python-312-classic-only-cache:${{ github.ref_name }}-static
+          cache-from: type=registry,ref=ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-python-312-classic-only-cache:${{ github.ref_name }}
+          cache-to: type=registry,ref=ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-python-312-classic-only-cache:${{ github.ref_name }},mode=max
+      - name: Image digest (PySpark Classic Only with Python 3.12)
+        if: hashFiles('dev/spark-test-image/python-312-classic-only/Dockerfile') != ''
+        run: echo ${{ steps.docker_build_pyspark_python_312_classic_only.outputs.digest }}
       - name: Build and push (PySpark with Python 3.12)
         if: hashFiles('dev/spark-test-image/python-312/Dockerfile') != ''
         id: docker_build_pyspark_python_312
diff --git a/.github/workflows/build_python_3.12.yml b/.github/workflows/build_python_3.11.yml
similarity index 89%
rename from .github/workflows/build_python_3.12.yml
rename to .github/workflows/build_python_3.11.yml
index e0c04700554c..d9cf8ba2af91 100644
--- a/.github/workflows/build_python_3.12.yml
+++ b/.github/workflows/build_python_3.11.yml
@@ -17,7 +17,7 @@
# under the License.
#
-name: "Build / Python-only (master, Python 3.12)"
+name: "Build / Python-only (master, Python 3.11)"
on:
schedule:
@@ -37,8 +37,8 @@ jobs:
hadoop: hadoop3
envs: >-
{
- "PYSPARK_IMAGE_TO_TEST": "python-312",
- "PYTHON_TO_TEST": "python3.12"
+ "PYSPARK_IMAGE_TO_TEST": "python-311",
+ "PYTHON_TO_TEST": "python3.11"
}
jobs: >-
{
diff --git a/.github/workflows/build_python_3.11_arm.yml b/.github/workflows/build_python_3.12_arm.yml
similarity index 95%
rename from .github/workflows/build_python_3.11_arm.yml
rename to .github/workflows/build_python_3.12_arm.yml
index f0a1b467703c..146676e3a89f 100644
--- a/.github/workflows/build_python_3.11_arm.yml
+++ b/.github/workflows/build_python_3.12_arm.yml
@@ -17,7 +17,7 @@
# under the License.
#
-name: "Build / Python-only (master, Python 3.11, ARM)"
+name: "Build / Python-only (master, Python 3.12, ARM)"
on:
schedule:
diff --git a/.github/workflows/build_python_3.11_classic_only.yml b/.github/workflows/build_python_3.12_classic_only.yml
similarity index 87%
rename from .github/workflows/build_python_3.11_classic_only.yml
rename to .github/workflows/build_python_3.12_classic_only.yml
index 0df7381dccd7..b9af9ed044a0 100644
--- a/.github/workflows/build_python_3.11_classic_only.yml
+++ b/.github/workflows/build_python_3.12_classic_only.yml
@@ -17,7 +17,7 @@
# under the License.
#
-name: "Build / Python-only, Classic-only (master, Python 3.11)"
+name: "Build / Python-only, Classic-only (master, Python 3.12)"
on:
schedule:
@@ -37,8 +37,8 @@ jobs:
hadoop: hadoop3
envs: >-
{
- "PYSPARK_IMAGE_TO_TEST": "python-311-classic-only",
- "PYTHON_TO_TEST": "python3.11"
+ "PYSPARK_IMAGE_TO_TEST": "python-312-classic-only",
+ "PYTHON_TO_TEST": "python3.12"
}
jobs: >-
{
diff --git a/.github/workflows/build_python_3.11_macos26.yml b/.github/workflows/build_python_3.12_macos26.yml
similarity index 94%
rename from .github/workflows/build_python_3.11_macos26.yml
rename to .github/workflows/build_python_3.12_macos26.yml
index a6051192bdd3..b3576d838e3c 100644
--- a/.github/workflows/build_python_3.11_macos26.yml
+++ b/.github/workflows/build_python_3.12_macos26.yml
@@ -17,7 +17,7 @@
# under the License.
#
-name: "Build / Python-only (master, Python 3.11, MacOS26)"
+name: "Build / Python-only (master, Python 3.12, MacOS26)"
on:
schedule:
diff --git a/.github/workflows/python_hosted_runner_test.yml b/.github/workflows/python_hosted_runner_test.yml
index 8e1628e05bf6..b82c7447464f 100644
--- a/.github/workflows/python_hosted_runner_test.yml
+++ b/.github/workflows/python_hosted_runner_test.yml
@@ -29,7 +29,7 @@ on:
python:
required: false
type: string
- default: 3.11
+ default: 3.12
branch:
description: Branch to run the build against
required: false
diff --git a/README.md b/README.md
index 11198eea4574..4703a077b346 100644
--- a/README.md
+++ b/README.md
@@ -36,12 +36,12 @@ This README file only contains basic setup instructions.
 | | [](https://github.com/apache/spark/actions/workflows/build_maven_java21_arm.yml) |
 | | [](https://github.com/apache/spark/actions/workflows/build_coverage.yml) |
 | | [](https://github.com/apache/spark/actions/workflows/build_python_pypy3.10.yml) |
-| | [](https://github.com/apache/spark/actions/workflows/build_python_pypy3.11.yml) |
 | | [](https://github.com/apache/spark/actions/workflows/build_python_3.10.yml) |
-| | [](https://github.com/apache/spark/actions/workflows/build_python_3.11_classic_only.yml) |
-| | [](https://github.com/apache/spark/actions/workflows/build_python_3.11_arm.yml) |
-| | [](https://github.com/apache/spark/actions/workflows/build_python_3.11_macos26.yml) |
-| | [](https://github.com/apache/spark/actions/workflows/build_python_3.12.yml) |
+| | [](https://github.com/apache/spark/actions/workflows/build_python_3.11.yml) |
+| | [](https://github.com/apache/spark/actions/workflows/build_python_pypy3.11.yml) |
+| | [](https://github.com/apache/spark/actions/workflows/build_python_3.12_classic_only.yml) |
+| | [](https://github.com/apache/spark/actions/workflows/build_python_3.12_arm.yml) |
+| | [](https://github.com/apache/spark/actions/workflows/build_python_3.12_macos26.yml) |
 | | [](https://github.com/apache/spark/actions/workflows/build_python_3.12_pandas_3.yml) |
 | | [](https://github.com/apache/spark/actions/workflows/build_python_3.13.yml) |
 | | [](https://github.com/apache/spark/actions/workflows/build_python_3.14.yml) |
diff --git a/dev/spark-test-image/python-311-classic-only/Dockerfile b/dev/spark-test-image/python-312-classic-only/Dockerfile
similarity index 84%
rename from dev/spark-test-image/python-311-classic-only/Dockerfile
rename to dev/spark-test-image/python-312-classic-only/Dockerfile
index f7b8d1658956..9461ab86dfab 100644
--- a/dev/spark-test-image/python-311-classic-only/Dockerfile
+++ b/dev/spark-test-image/python-312-classic-only/Dockerfile
@@ -20,7 +20,7 @@
FROM ubuntu:jammy-20240911.1
 LABEL org.opencontainers.image.authors="Apache Spark project <[email protected]>"
 LABEL org.opencontainers.image.licenses="Apache-2.0"
-LABEL org.opencontainers.image.ref.name="Apache Spark Infra Image For PySpark Classic with Python 3.11"
+LABEL org.opencontainers.image.ref.name="Apache Spark Infra Image For PySpark Classic with Python 3.12"
# Overwrite this label to avoid exposing the underlying Ubuntu OS version label
LABEL org.opencontainers.image.version=""
@@ -59,10 +59,10 @@ RUN apt-get update && apt-get install -y \
wget \
zlib1g-dev
-# Install Python 3.11
+# Install Python 3.12
RUN add-apt-repository ppa:deadsnakes/ppa
RUN apt-get update && apt-get install -y \
- python3.11 \
+ python3.12 \
&& apt-get autoremove --purge -y \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*
@@ -71,10 +71,10 @@ RUN apt-get update && apt-get install -y \
 ARG BASIC_PIP_PKGS="numpy pyarrow>=22.0.0 pandas==2.3.3 plotly<6.0.0 matplotlib openpyxl memory-profiler>=0.61.0 mlflow>=2.8.1 scipy scikit-learn>=1.3.2"
 ARG TEST_PIP_PKGS="coverage unittest-xml-reporting"
-# Install Python 3.11 packages
-RUN curl -sS https://bootstrap.pypa.io/get-pip.py | python3.11
-RUN python3.11 -m pip install --ignore-installed 'blinker>=1.6.2' # mlflow needs this
-RUN python3.11 -m pip install $BASIC_PIP_PKGS $TEST_PIP_PKGS && \
-    python3.11 -m pip install torch torchvision --index-url https://download.pytorch.org/whl/cpu && \
-    python3.11 -m pip install deepspeed torcheval && \
-    python3.11 -m pip cache purge
+# Install Python 3.12 packages
+RUN curl -sS https://bootstrap.pypa.io/get-pip.py | python3.12
+RUN python3.12 -m pip install --ignore-installed 'blinker>=1.6.2' # mlflow needs this
+RUN python3.12 -m pip install $BASIC_PIP_PKGS $TEST_PIP_PKGS && \
+    python3.12 -m pip install torch torchvision --index-url https://download.pytorch.org/whl/cpu && \
+    python3.12 -m pip install deepspeed torcheval && \
+    python3.12 -m pip cache purge
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]