This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-4.1
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-4.1 by this push:
     new 8c1adc07f563 [SPARK-54194][PYTHON][FOLLOWUP] Fix `connectutils.py` to import `pb2` conditionally
8c1adc07f563 is described below

commit 8c1adc07f563c7a0c615281a8fce70e499ca69f4
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Thu Nov 13 09:21:53 2025 -0800

    [SPARK-54194][PYTHON][FOLLOWUP] Fix `connectutils.py` to import `pb2` conditionally
    
    ### What changes were proposed in this pull request?
    
    This PR is a follow-up of the following to fix `connectutils.py` to import `pb2` conditionally.
    - #52894
    
    ### Why are the changes needed?
    
    Currently, the Python CIs are broken as shown below.
    - https://github.com/apache/spark/actions/workflows/build_python_3.11_classic_only.yml
        - https://github.com/apache/spark/actions/runs/19316448951/job/55248810741
    - https://github.com/apache/spark/actions/workflows/build_python_3.12.yml
        - https://github.com/apache/spark/actions/runs/19275741458/job/55212353468
    
    ```
      File "/__w/spark/spark/python/pyspark/testing/connectutils.py", line 26, in <module>
        import pyspark.sql.connect.proto as pb2
      File "/__w/spark/spark/python/pyspark/sql/connect/proto/__init__.py", line 18, in <module>
        from pyspark.sql.connect.proto.base_pb2_grpc import *
      File "/__w/spark/spark/python/pyspark/sql/connect/proto/base_pb2_grpc.py", line 19, in <module>
        import grpc
    ModuleNotFoundError: No module named 'grpc'
    ```
    
    ### Does this PR introduce _any_ user-facing change?
    
    No behavior change. We had been importing `pyspark.sql.connect` conditionally before #52894.
    
    ### How was this patch tested?
    
    Pass the CIs and manual test.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #53037 from dongjoon-hyun/SPARK-54194.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
    (cherry picked from commit 63bcc871bdec31cbc717dc83b54d76d6796c16bb)
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 python/pyspark/testing/connectutils.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/python/pyspark/testing/connectutils.py b/python/pyspark/testing/connectutils.py
index d895c1d8a26b..302ae2a5751f 100644
--- a/python/pyspark/testing/connectutils.py
+++ b/python/pyspark/testing/connectutils.py
@@ -23,7 +23,6 @@ import unittest
 import uuid
 import contextlib
 
-import pyspark.sql.connect.proto as pb2
 from pyspark import Row, SparkConf
 from pyspark.util import is_remote_only
 from pyspark.testing.utils import PySparkErrorTestUtils
@@ -53,6 +52,7 @@ if should_test_connect:
     from pyspark.sql.connect.dataframe import DataFrame
     from pyspark.sql.connect.plan import Read, Range, SQL, LogicalPlan
     from pyspark.sql.connect.session import SparkSession
+    import pyspark.sql.connect.proto as pb2
 
 
 class MockRemoteSession:

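For context, the diff above moves the `pb2` import under the existing `should_test_connect` guard so that merely importing `connectutils.py` no longer pulls in `grpc`. A minimal, generic sketch of that conditional-import pattern follows; the `_dep_available` helper name is an assumption for illustration (pyspark's actual guard is the `should_test_connect` flag), and only the standard-library `importlib` machinery is used:

```python
import importlib.util


def _dep_available(name: str) -> bool:
    """Return True if the top-level module `name` can be imported.

    find_spec() probes the import system without actually importing
    the module, so this never raises ModuleNotFoundError itself.
    """
    return importlib.util.find_spec(name) is not None


# Import the optional dependency only when it is installed; the
# guarding module itself always imports cleanly either way. This is
# the same idea as gating `import pyspark.sql.connect.proto as pb2`
# behind `should_test_connect` in the patch above.
if _dep_available("grpc"):
    import grpc  # noqa: F401
```

The key property is that environments without the optional extra (here, `grpc`) can still import the module, which is exactly what the broken classic-only CI jobs required.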

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
