This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-4.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-4.0 by this push:
     new a32231b9bf96 [SPARK-50922][ML][PYTHON][CONNECT][FOLLOW-UP] Import pyspark.core module later for Spark Classic only
a32231b9bf96 is described below

commit a32231b9bf96e289bf26534593d585dcc07d2b6c
Author: Hyukjin Kwon <gurwls...@apache.org>
AuthorDate: Thu Feb 6 10:29:01 2025 +0900

    [SPARK-50922][ML][PYTHON][CONNECT][FOLLOW-UP] Import pyspark.core module later for Spark Classic only
    
    ### What changes were proposed in this pull request?
    
    This PR is a follow-up of https://github.com/apache/spark/pull/49704; it imports the pyspark.core module later, for Spark Classic only. For pure-Python packaging, `pyspark.core` is not available.
    
    ### Why are the changes needed?
    
    To recover the build in https://github.com/apache/spark/actions/runs/13164682457/job/36741828881
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, test-only.
    
    ### How was this patch tested?
    
    Will monitor the build.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #49820 from HyukjinKwon/SPARK-50922-followup.
    
    Authored-by: Hyukjin Kwon <gurwls...@apache.org>
    Signed-off-by: Hyukjin Kwon <gurwls...@apache.org>
    (cherry picked from commit 0b7c4f15b301cd31a026266d5a672b3ffbf6d2f9)
    Signed-off-by: Hyukjin Kwon <gurwls...@apache.org>
---
 python/pyspark/ml/classification.py | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/python/pyspark/ml/classification.py b/python/pyspark/ml/classification.py
index d228157def4b..3a6425d0bfcd 100644
--- a/python/pyspark/ml/classification.py
+++ b/python/pyspark/ml/classification.py
@@ -3810,11 +3810,12 @@ class OneVsRestModel(
 
     def __init__(self, models: List[ClassificationModel]):
         super(OneVsRestModel, self).__init__()
-        from pyspark.core.context import SparkContext
-
         self.models = models
         if is_remote() or not isinstance(models[0], JavaMLWritable):
             return
+
+        from pyspark.core.context import SparkContext
+
         # set java instance
         java_models = [cast(_JavaClassificationModel, model)._to_java() for model in self.models]
         sc = SparkContext._active_spark_context

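The diff above follows a common deferred-import pattern: the guard clauses run first, and the optional dependency is imported only on the code path that actually needs it. A minimal, self-contained sketch of that pattern, with `json` standing in for `pyspark.core.context` (which pure-Python packaging lacks) and the arguments only loosely mirroring `OneVsRestModel.__init__`:

```python
from typing import List, Optional


def to_java_models(models: List[str], is_remote: bool) -> Optional[str]:
    """Sketch of the deferred-import fix; names here are illustrative."""
    if is_remote or not models:
        # Spark Connect (remote) path returns before the import below,
        # so a missing pyspark.core module can never break it.
        return None

    # Spark Classic only: the optional module is imported here, after
    # the guards, just as the patch moves the SparkContext import.
    import json  # stand-in for: from pyspark.core.context import SparkContext

    return json.dumps(models)
```

On the remote path the import statement is never executed, so environments without the optional module keep working, while the Classic path still gets the import it needs.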

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
