This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 743ea2a8f3ec [SPARK-54548][INFRA] Install `zstandard` python library in Spark Connect post-merge builds
743ea2a8f3ec is described below

commit 743ea2a8f3ec8ffff22193dca36fe1e4f12830d1
Author: Jungtaek Lim <[email protected]>
AuthorDate: Fri Nov 28 14:06:26 2025 -0800

    [SPARK-54548][INFRA] Install `zstandard` python library in Spark Connect post-merge builds
    
    ### What changes were proposed in this pull request?
    
    This PR adds the missing Python library 'zstandard' to the Spark Connect post-merge builds.
    
    ### Why are the changes needed?
    
    Without the zstandard library, the Spark Connect server may not function properly, since it checks for the library as a required dependency.
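    As an illustrative aside (not part of the commit), a startup-style dependency check like the one described above can be sketched as follows; the helper name `has_module` is hypothetical, not Spark's actual API:

```python
# Minimal sketch of a dependency check: verify that a required Python
# library (here 'zstandard') is importable before the server proceeds.
import importlib.util

def has_module(name: str) -> bool:
    """Return True if the named Python module can be imported."""
    return importlib.util.find_spec(name) is not None

# The post-merge builds now install 'zstandard==0.25.0', so in those
# environments this check would succeed.
print("zstandard available:", has_module("zstandard"))
```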
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    N/A
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #53257 from HeartSaVioR/SPARK-54548.
    
    Authored-by: Jungtaek Lim <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 .github/workflows/build_python_connect.yml   | 2 +-
 .github/workflows/build_python_connect40.yml | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/.github/workflows/build_python_connect.yml b/.github/workflows/build_python_connect.yml
index 7a4d277577f6..fcd177f15f1f 100644
--- a/.github/workflows/build_python_connect.yml
+++ b/.github/workflows/build_python_connect.yml
@@ -72,7 +72,7 @@ jobs:
           python packaging/client/setup.py sdist
           cd dist
           pip install pyspark*client-*.tar.gz
-          pip install 'grpcio==1.76.0' 'grpcio-status==1.76.0' 'protobuf==6.33.0' 'googleapis-common-protos==1.71.0' 'graphviz==0.20.3' 'six==1.16.0' 'pandas==2.3.3' scipy 'plotly<6.0.0' 'mlflow>=2.8.1' coverage matplotlib openpyxl 'memory-profiler>=0.61.0' 'scikit-learn>=1.3.2' 'graphviz==0.20.3' 'torch<2.6.0' torchvision torcheval deepspeed unittest-xml-reporting
+          pip install 'grpcio==1.76.0' 'grpcio-status==1.76.0' 'protobuf==6.33.0' 'googleapis-common-protos==1.71.0' 'graphviz==0.20.3' 'six==1.16.0' 'pandas==2.3.3' scipy 'plotly<6.0.0' 'mlflow>=2.8.1' coverage matplotlib openpyxl 'memory-profiler>=0.61.0' 'scikit-learn>=1.3.2' 'graphviz==0.20.3' 'torch<2.6.0' torchvision torcheval deepspeed unittest-xml-reporting 'zstandard==0.25.0'
       - name: List Python packages
         run: python -m pip list
       - name: Run tests (local)
diff --git a/.github/workflows/build_python_connect40.yml b/.github/workflows/build_python_connect40.yml
index 32bb8707db26..c4ffa08e6ccd 100644
--- a/.github/workflows/build_python_connect40.yml
+++ b/.github/workflows/build_python_connect40.yml
@@ -71,7 +71,7 @@ jobs:
           pip install 'numpy' 'pyarrow>=18.0.0' 'pandas==2.2.3' scipy unittest-xml-reporting 'plotly<6.0.0' 'mlflow>=2.8.1' coverage 'matplotlib' openpyxl 'memory-profiler==0.61.0' 'scikit-learn>=1.3.2'
 
           # Add Python deps for Spark Connect.
-          pip install 'grpcio==1.76.0' 'grpcio-status==1.76.0' 'protobuf==6.33.0' 'googleapis-common-protos==1.71.0' 'graphviz==0.20.3'
+          pip install 'grpcio==1.76.0' 'grpcio-status==1.76.0' 'protobuf==6.33.0' 'googleapis-common-protos==1.71.0' 'graphviz==0.20.3' 'zstandard==0.25.0'
 
           # Add torch as a testing dependency for TorchDistributor
           pip install 'torch==2.0.1' 'torchvision==0.15.2' torcheval


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
