This is an automated email from the ASF dual-hosted git repository.

yangjie01 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 4c0f7b581243 [SPARK-50594][BUILD][INFRA] Align the gRPC-related Python package version and pin protobuf version in `pages.yml` and `docs/Dockerfile`
4c0f7b581243 is described below

commit 4c0f7b581243edcd9bd6d265d7ef120d6a963a4c
Author: yangjie01 <[email protected]>
AuthorDate: Tue Dec 17 12:19:28 2024 +0800

    [SPARK-50594][BUILD][INFRA] Align the gRPC-related Python package version and pin protobuf version in `pages.yml` and `docs/Dockerfile`
    
    ### What changes were proposed in this pull request?
    This PR pins the gRPC-related Python packages (grpcio, grpcio-status) to version 1.67.0 in both `pages.yml` and `docs/Dockerfile`, and pins the protobuf version to 5.29.1 to keep these environments consistent with the other test environments.
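    One way to sanity-check that an environment actually matches such pins is to compare installed versions against the expected ones. The sketch below is illustrative only and not part of this commit; the `PINS` mapping and `check_pins` helper are hypothetical names:

```python
from importlib.metadata import PackageNotFoundError, version

# Version pins from this commit (the dict name and helper are hypothetical).
PINS = {"grpcio": "1.67.0", "grpcio-status": "1.67.0", "protobuf": "5.29.1"}


def check_pins(pins):
    """Map each package to (installed_version, matches_pin).

    If a package is not installed, it maps to (None, None).
    """
    report = {}
    for pkg, want in pins.items():
        try:
            got = version(pkg)
        except PackageNotFoundError:
            got = None
        report[pkg] = (got, got == want if got is not None else None)
    return report


if __name__ == "__main__":
    for pkg, (got, ok) in check_pins(PINS).items():
        print(f"{pkg}: installed={got} matches_pin={ok}")
```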
    
    ### Why are the changes needed?
    Unify the Python test dependency versions for gRPC and protobuf across the test environments.
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    Pass GitHub Actions
    
    ### Was this patch authored or co-authored using generative AI tooling?
    No
    
    Closes #49205 from LuciferYang/page-and-docs-grpc-pb.
    
    Authored-by: yangjie01 <[email protected]>
    Signed-off-by: yangjie01 <[email protected]>
---
 .github/workflows/pages.yml          | 2 +-
 dev/spark-test-image/docs/Dockerfile | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/.github/workflows/pages.yml b/.github/workflows/pages.yml
index 8729012c2b8d..637abb86b36c 100644
--- a/.github/workflows/pages.yml
+++ b/.github/workflows/pages.yml
@@ -63,7 +63,7 @@ jobs:
          pip install 'sphinx==4.5.0' mkdocs 'pydata_sphinx_theme>=0.13' sphinx-copybutton nbsphinx numpydoc jinja2 markupsafe 'pyzmq<24.0.0' \
             ipython ipython_genutils sphinx_plotly_directive 'numpy>=1.20.0' pyarrow 'pandas==2.2.3' 'plotly>=4.8' 'docutils<0.18.0' \
             'flake8==3.9.0' 'mypy==1.8.0' 'pytest==7.1.3' 'pytest-mypy-plugins==1.9.3' 'black==23.9.1' \
-            'pandas-stubs==1.2.0.53' 'grpcio==1.62.0' 'grpcio-status==1.62.0' 'grpc-stubs==1.24.11' 'googleapis-common-protos-stubs==2.2.0' \
+            'pandas-stubs==1.2.0.53' 'grpcio==1.67.0' 'grpcio-status==1.67.0' 'protobuf==5.29.1' 'grpc-stubs==1.24.11' 'googleapis-common-protos-stubs==2.2.0' \
             'sphinxcontrib-applehelp==1.0.4' 'sphinxcontrib-devhelp==1.0.2' 'sphinxcontrib-htmlhelp==2.0.1' 'sphinxcontrib-qthelp==1.0.3' 'sphinxcontrib-serializinghtml==1.1.5'
       - name: Install Ruby for documentation generation
         uses: ruby/setup-ruby@v1
diff --git a/dev/spark-test-image/docs/Dockerfile b/dev/spark-test-image/docs/Dockerfile
index 2db7e0717cdf..0f3559fa2d4c 100644
--- a/dev/spark-test-image/docs/Dockerfile
+++ b/dev/spark-test-image/docs/Dockerfile
@@ -86,6 +86,6 @@ RUN curl -sS https://bootstrap.pypa.io/get-pip.py | python3.9
 RUN python3.9 -m pip install 'sphinx==4.5.0' mkdocs 'pydata_sphinx_theme>=0.13' sphinx-copybutton nbsphinx numpydoc jinja2 markupsafe 'pyzmq<24.0.0' \
   ipython ipython_genutils sphinx_plotly_directive 'numpy>=1.20.0' pyarrow pandas 'plotly>=4.8' 'docutils<0.18.0' \
   'flake8==3.9.0' 'mypy==1.8.0' 'pytest==7.1.3' 'pytest-mypy-plugins==1.9.3' 'black==23.9.1' \
-  'pandas-stubs==1.2.0.53' 'grpcio==1.62.0' 'grpcio-status==1.62.0' 'grpc-stubs==1.24.11' 'googleapis-common-protos-stubs==2.2.0' \
+  'pandas-stubs==1.2.0.53' 'grpcio==1.67.0' 'grpcio-status==1.67.0' 'protobuf==5.29.1' 'grpc-stubs==1.24.11' 'googleapis-common-protos-stubs==2.2.0' \
   'sphinxcontrib-applehelp==1.0.4' 'sphinxcontrib-devhelp==1.0.2' 'sphinxcontrib-htmlhelp==2.0.1' 'sphinxcontrib-qthelp==1.0.3' 'sphinxcontrib-serializinghtml==1.1.5' \
   && python3.9 -m pip cache purge


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
