This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 9927e414222e [SPARK-55615][PYTHON][CONNECT] Move SparkContext import into classic branch
9927e414222e is described below

commit 9927e414222e1bb0f82cb2cb99e4ac36f7ec6c2b
Author: Tian Gao <[email protected]>
AuthorDate: Fri Feb 20 10:25:23 2026 +0900

    [SPARK-55615][PYTHON][CONNECT] Move SparkContext import into classic branch
    
    ### What changes were proposed in this pull request?
    
    Move `from pyspark.core.context import SparkContext` into the classic branch.
    
    ### Why are the changes needed?
    
    In #53820 we added support for `create` in classic mode. However, the
    import was added outside of the classic branch. That works fine when the
    user has a full PySpark installation, but with only the Connect client
    installed, `pyspark.core` cannot be imported and the call fails.
    
    https://github.com/apache/spark/actions/runs/22196452350/job/64197922961
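
    The fix follows the standard deferred-import pattern: import an
    optional dependency inside the branch that needs it, so the other
    branch stays usable when the dependency is absent. A minimal,
    hypothetical sketch (the `create_session` function and the `json`
    stand-in are illustrative, not the actual PySpark code):

```python
def create_session(remote: bool) -> str:
    """Illustrative only: defer an optional import into the branch that uses it."""
    if remote:
        # Connect path: never touches the classic-only module, so this
        # branch works even when that module is not installed.
        return "remote-session"
    else:
        # Classic path: the import happens only here, so an ImportError
        # can surface only when classic mode is actually requested.
        import json  # stand-in for `from pyspark.core.context import SparkContext`
        assert json is not None
        return "classic-session"

print(create_session(remote=True))  # -> remote-session
```

    If the import stayed at the top of the function, merely calling it
    in Connect mode would fail on a connect-only install, which is
    exactly the bug this commit fixes.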
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Scheduled CI should cover it.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #54390 from gaogaotiantian/move-spark-context.
    
    Authored-by: Tian Gao <[email protected]>
    Signed-off-by: Hyukjin Kwon <[email protected]>
---
 python/pyspark/sql/session.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/python/pyspark/sql/session.py b/python/pyspark/sql/session.py
index a3a0f3bc98a7..5bdd663c6293 100644
--- a/python/pyspark/sql/session.py
+++ b/python/pyspark/sql/session.py
@@ -584,8 +584,6 @@ class SparkSession(SparkConversionMixin):
             -----
             This method will update the default and/or active session if they are not set.
             """
-            from pyspark.core.context import SparkContext
-
             opts = dict(self._options)
             # Connect mode
             if "SPARK_REMOTE" in os.environ or "spark.remote" in opts:
@@ -609,6 +607,8 @@ class SparkSession(SparkConversionMixin):
                 return cast(SparkSession, RemoteSparkSession.builder.config(map=opts).create())
             # Classic mode
             else:
+                from pyspark.core.context import SparkContext
+
                 with self._lock:
                     # Build SparkConf from options
                     sparkConf = SparkConf()


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
