This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new d1dbcdab1af9 [SPARK-55413][PYTHON][INFRA] Upgrade Python minimum dep test images to Ubuntu 24.04
d1dbcdab1af9 is described below
commit d1dbcdab1af91d1403d89e0fd8de66c7130ade07
Author: Ruifeng Zheng <[email protected]>
AuthorDate: Sun Feb 8 12:35:45 2026 -0800
[SPARK-55413][PYTHON][INFRA] Upgrade Python minimum dep test images to Ubuntu 24.04
### What changes were proposed in this pull request?
Upgrade Python minimum dep test images to Ubuntu 24.04
### Why are the changes needed?
To test the minimum supported Python dependencies against a newer OS base (Ubuntu 24.04).
### Does this PR introduce _any_ user-facing change?
no
### How was this patch tested?
PR builder with
```
default: '{"PYSPARK_IMAGE_TO_TEST": "python-minimum", "PYTHON_TO_TEST":
"python3.10"}'
```
https://github.com/zhengruifeng/spark/actions/runs/21777887237/job/62837606695
```
default: '{"PYSPARK_IMAGE_TO_TEST": "python-ps-minimum", "PYTHON_TO_TEST":
"python3.10"}'
```
https://github.com/zhengruifeng/spark/actions/runs/21791697723/job/62875766911
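The updated images can also be built locally as a quick sanity check; a minimal sketch, assuming a local Docker installation (the image tags below are illustrative):
```
docker build -t spark-test-python-minimum dev/spark-test-image/python-minimum/
docker build -t spark-test-python-ps-minimum dev/spark-test-image/python-ps-minimum/
# The venv is first on PATH in the image, so this should report Python 3.10.x
docker run --rm spark-test-python-minimum python3.10 --version
```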
### Was this patch authored or co-authored using generative AI tooling?
no
Closes #54200 from zhengruifeng/u24_mini.
Authored-by: Ruifeng Zheng <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
dev/spark-test-image/python-minimum/Dockerfile | 26 ++++++++++++++--------
dev/spark-test-image/python-ps-minimum/Dockerfile | 27 ++++++++++++++---------
2 files changed, 34 insertions(+), 19 deletions(-)
diff --git a/dev/spark-test-image/python-minimum/Dockerfile b/dev/spark-test-image/python-minimum/Dockerfile
index 4f671c2229d5..011618a36d3f 100644
--- a/dev/spark-test-image/python-minimum/Dockerfile
+++ b/dev/spark-test-image/python-minimum/Dockerfile
@@ -15,16 +15,16 @@
# limitations under the License.
#
-# Image for building and testing Spark branches. Based on Ubuntu 22.04.
+# Image for building and testing Spark branches. Based on Ubuntu 24.04.
# See also in https://hub.docker.com/_/ubuntu
-FROM ubuntu:jammy-20240911.1
+FROM ubuntu:noble
LABEL org.opencontainers.image.authors="Apache Spark project <[email protected]>"
LABEL org.opencontainers.image.licenses="Apache-2.0"
LABEL org.opencontainers.image.ref.name="Apache Spark Infra Image For PySpark with old dependencies"
# Overwrite this label to avoid exposing the underlying Ubuntu OS version label
LABEL org.opencontainers.image.version=""
-ENV FULL_REFRESH_DATE=20260127
+ENV FULL_REFRESH_DATE=20260206
ENV DEBIAN_FRONTEND=noninteractive
ENV DEBCONF_NONINTERACTIVE_SEEN=true
@@ -43,20 +43,28 @@ RUN apt-get update && apt-get install -y \
libssl-dev \
openjdk-17-jdk-headless \
pkg-config \
- python3.10 \
- python3-psutil \
tzdata \
software-properties-common \
- zlib1g-dev \
+ zlib1g-dev
+
+# Install Python 3.10
+RUN add-apt-repository ppa:deadsnakes/ppa
+RUN apt-get update && apt-get install -y \
+ python3.10 \
&& apt-get autoremove --purge -y \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*
-ARG BASIC_PIP_PKGS="numpy==1.22.4 pyarrow==18.0.0 pandas==2.2.0 six==1.16.0
scipy scikit-learn coverage unittest-xml-reporting"
-# Python deps for Spark Connect
-ARG CONNECT_PIP_PKGS="grpcio==1.76.0 grpcio-status==1.76.0 googleapis-common-protos==1.71.0 zstandard==0.25.0 graphviz==0.20 protobuf==6.33.5"
+# Setup virtual environment
+ENV VIRTUAL_ENV=/opt/spark-venv
+RUN python3.10 -m venv --without-pip $VIRTUAL_ENV
+ENV PATH="$VIRTUAL_ENV/bin:$PATH"
# Install Python 3.10 packages
RUN curl -sS https://bootstrap.pypa.io/get-pip.py | python3.10
+
+ARG BASIC_PIP_PKGS="numpy==1.22.4 pyarrow==18.0.0 pandas==2.2.0 six==1.16.0
scipy scikit-learn coverage unittest-xml-reporting psutil"
+ARG CONNECT_PIP_PKGS="grpcio==1.76.0 grpcio-status==1.76.0 googleapis-common-protos==1.71.0 zstandard==0.25.0 graphviz==0.20 protobuf==6.33.5"
+
RUN python3.10 -m pip install --force $BASIC_PIP_PKGS $CONNECT_PIP_PKGS && \
python3.10 -m pip cache purge
diff --git a/dev/spark-test-image/python-ps-minimum/Dockerfile b/dev/spark-test-image/python-ps-minimum/Dockerfile
index ca1bd092a2fa..5daecd379498 100644
--- a/dev/spark-test-image/python-ps-minimum/Dockerfile
+++ b/dev/spark-test-image/python-ps-minimum/Dockerfile
@@ -15,16 +15,16 @@
# limitations under the License.
#
-# Image for building and testing Spark branches. Based on Ubuntu 22.04.
+# Image for building and testing Spark branches. Based on Ubuntu 24.04.
# See also in https://hub.docker.com/_/ubuntu
-FROM ubuntu:jammy-20240911.1
+FROM ubuntu:noble
LABEL org.opencontainers.image.authors="Apache Spark project <[email protected]>"
LABEL org.opencontainers.image.licenses="Apache-2.0"
LABEL org.opencontainers.image.ref.name="Apache Spark Infra Image For Pandas API on Spark with old dependencies"
# Overwrite this label to avoid exposing the underlying Ubuntu OS version label
LABEL org.opencontainers.image.version=""
-ENV FULL_REFRESH_DATE=20260127
+ENV FULL_REFRESH_DATE=20260206
ENV DEBIAN_FRONTEND=noninteractive
ENV DEBCONF_NONINTERACTIVE_SEEN=true
@@ -43,21 +43,28 @@ RUN apt-get update && apt-get install -y \
libssl-dev \
openjdk-17-jdk-headless \
pkg-config \
- python3.10 \
- python3-psutil \
tzdata \
software-properties-common \
- zlib1g-dev \
+ zlib1g-dev
+
+# Install Python 3.10
+RUN add-apt-repository ppa:deadsnakes/ppa
+RUN apt-get update && apt-get install -y \
+ python3.10 \
&& apt-get autoremove --purge -y \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*
-
-ARG BASIC_PIP_PKGS="pyarrow==18.0.0 pandas==2.2.0 six==1.16.0 numpy scipy
coverage unittest-xml-reporting"
-# Python deps for Spark Connect
-ARG CONNECT_PIP_PKGS="grpcio==1.76.0 grpcio-status==1.76.0 googleapis-common-protos==1.71.0 zstandard==0.25.0 graphviz==0.20 protobuf==6.33.5"
+# Setup virtual environment
+ENV VIRTUAL_ENV=/opt/spark-venv
+RUN python3.10 -m venv --without-pip $VIRTUAL_ENV
+ENV PATH="$VIRTUAL_ENV/bin:$PATH"
# Install Python 3.10 packages
RUN curl -sS https://bootstrap.pypa.io/get-pip.py | python3.10
+
+ARG BASIC_PIP_PKGS="pyarrow==18.0.0 pandas==2.2.0 six==1.16.0 numpy scipy
coverage unittest-xml-reporting psutil"
+ARG CONNECT_PIP_PKGS="grpcio==1.76.0 grpcio-status==1.76.0 googleapis-common-protos==1.71.0 zstandard==0.25.0 graphviz==0.20 protobuf==6.33.5"
+
RUN python3.10 -m pip install --force $BASIC_PIP_PKGS $CONNECT_PIP_PKGS && \
python3.10 -m pip cache purge
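For reference, the Python bootstrap sequence that both Dockerfiles now share can be reproduced outside the images; a minimal sketch, assuming an Ubuntu 24.04 host with sudo, with the venv placed under $HOME instead of /opt/spark-venv and only a subset of the pinned packages shown:
```
# Python 3.10 is not in the default Ubuntu 24.04 (noble) archive, so pull it
# from the deadsnakes PPA, as the Dockerfiles do.
sudo add-apt-repository -y ppa:deadsnakes/ppa
sudo apt-get update && sudo apt-get install -y python3.10

# Create a pip-less venv; --without-pip skips ensurepip, which Ubuntu ships in
# a separate package that the images do not install.
python3.10 -m venv --without-pip "$HOME/spark-venv"
export PATH="$HOME/spark-venv/bin:$PATH"

# Bootstrap pip inside the venv, then install the pinned minimum versions.
curl -sS https://bootstrap.pypa.io/get-pip.py | python3.10
python3.10 -m pip install numpy==1.22.4 pyarrow==18.0.0 pandas==2.2.0 six==1.16.0
```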
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]