This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 313e824931fd [SPARK-50392][PYTHON][FOLLOWUP] Move `import`s into methods to fix `connect-only` builds
313e824931fd is described below

commit 313e824931fd9b407b650fb1a8c11157dc3fe676
Author: Takuya Ueshin <[email protected]>
AuthorDate: Mon Jan 13 16:36:34 2025 -0800

    [SPARK-50392][PYTHON][FOLLOWUP] Move `import`s into methods to fix `connect-only` builds
    
    ### What changes were proposed in this pull request?
    
    Move the `pyspark.sql.classic.column` imports from module level into the methods that use them, so that importing `pyspark.sql.table_arg` no longer fails in connect-only builds.
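
    For context, a minimal sketch of the deferred-import pattern this fix relies on (hypothetical names; `heavy_backend` stands in for `pyspark.sql.classic.column`, which is absent in connect-only builds):

    ```python
    # Module-level variant: `import this_module` fails whenever heavy_backend
    # is missing, even if no method that needs it is ever called.
    #
    #     from heavy_backend import convert  # raises ImportError at import time

    class TableArg:
        def partition_by(self, *cols):
            # Deferred variant: an ImportError, if any, is raised only when
            # this method actually runs on the classic code path.
            from heavy_backend import convert  # hypothetical optional dependency
            return convert(cols)
    ```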
    
    ### Why are the changes needed?
    
    #49055 broke the connect-only builds: https://github.com/apache/spark/pull/49055#pullrequestreview-2545547927
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Manually.
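
    For example, one plausible manual check (hypothetical; the exact steps are not recorded in the PR) is to import the module in a connect-only environment, where `pyspark.sql.classic` is unavailable:

    ```python
    # Succeeds only if importing table_arg no longer pulls in
    # pyspark.sql.classic at module import time.
    import pyspark.sql.table_arg  # noqa: F401

    print("import ok")
    ```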
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #49472 from ueshin/issues/SPARK-50392/fup.
    
    Authored-by: Takuya Ueshin <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 python/pyspark/sql/table_arg.py | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/python/pyspark/sql/table_arg.py b/python/pyspark/sql/table_arg.py
index d4b5e1653c7a..cacfd24b2f1b 100644
--- a/python/pyspark/sql/table_arg.py
+++ b/python/pyspark/sql/table_arg.py
@@ -17,7 +17,6 @@
 
 from typing import TYPE_CHECKING
 
-from pyspark.sql.classic.column import _to_java_column, _to_seq
 from pyspark.sql.tvf_argument import TableValuedFunctionArgument
 from pyspark.sql.utils import get_active_spark_context
 
@@ -32,6 +31,8 @@ class TableArg(TableValuedFunctionArgument):
         self._j_table_arg = j_table_arg
 
     def partitionBy(self, *cols: "ColumnOrName") -> "TableArg":
+        from pyspark.sql.classic.column import _to_java_column, _to_seq
+
         sc = get_active_spark_context()
         if len(cols) == 1 and isinstance(cols[0], list):
             cols = cols[0]
@@ -40,6 +41,8 @@ class TableArg(TableValuedFunctionArgument):
         return TableArg(new_j_table_arg)
 
     def orderBy(self, *cols: "ColumnOrName") -> "TableArg":
+        from pyspark.sql.classic.column import _to_java_column, _to_seq
+
         sc = get_active_spark_context()
         if len(cols) == 1 and isinstance(cols[0], list):
             cols = cols[0]

