This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new c0984e70469d [SPARK-49609][PYTHON][TESTS][FOLLOW-UP] Avoid import connect modules when connect dependencies not installed
c0984e70469d is described below
commit c0984e70469d99595b8e6eda0d943308f590aaec
Author: Hyukjin Kwon <[email protected]>
AuthorDate: Thu Sep 26 13:17:59 2024 +0900
[SPARK-49609][PYTHON][TESTS][FOLLOW-UP] Avoid import connect modules when connect dependencies not installed
### What changes were proposed in this pull request?
This PR is a follow-up of https://github.com/apache/spark/pull/48085 that skips the Connect imports, which require the Connect dependencies to be installed.
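The guard pattern applied in this change can be sketched as follows. Note that `can_import` is an illustrative helper, not PySpark's actual implementation; the real `should_test_connect` flag lives in `pyspark.testing.connectutils` and checks the Connect dependencies such as PyArrow.

```python
import importlib.util


def can_import(name: str) -> bool:
    """Return True if the top-level module `name` is importable,
    without actually importing it."""
    return importlib.util.find_spec(name) is not None


# Gate optional imports on dependency availability, mirroring how the
# test module moves the pyspark.sql.connect imports behind
# should_test_connect so the file still loads where PyArrow is absent.
if can_import("pyarrow"):
    import pyarrow  # only reached when the Connect dependency is present
```

This keeps module import side-effect free for environments (like the PyPy3 CI job) that lack the optional dependencies, while tests that need them are skipped via the accompanying requirement message.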
### Why are the changes needed?
To recover the PyPy3 build https://github.com/apache/spark/actions/runs/11035779484/job/30652736098, which does not have PyArrow installed.
### Does this PR introduce _any_ user-facing change?
No, test-only.
### How was this patch tested?
Manually.
### Was this patch authored or co-authored using generative AI tooling?
No
Closes #48259 from HyukjinKwon/SPARK-49609-followup2.
Authored-by: Hyukjin Kwon <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
---
python/pyspark/sql/tests/test_connect_compatibility.py | 8 +++++---
1 file changed, 5 insertions(+), 3 deletions(-)
diff --git a/python/pyspark/sql/tests/test_connect_compatibility.py b/python/pyspark/sql/tests/test_connect_compatibility.py
index 8f3e86f5186a..dfa0fa63b2dd 100644
--- a/python/pyspark/sql/tests/test_connect_compatibility.py
+++ b/python/pyspark/sql/tests/test_connect_compatibility.py
@@ -21,11 +21,13 @@ import inspect
from pyspark.testing.connectutils import should_test_connect, connect_requirement_message
from pyspark.testing.sqlutils import ReusedSQLTestCase
from pyspark.sql.classic.dataframe import DataFrame as ClassicDataFrame
-from pyspark.sql.connect.dataframe import DataFrame as ConnectDataFrame
from pyspark.sql.classic.column import Column as ClassicColumn
-from pyspark.sql.connect.column import Column as ConnectColumn
from pyspark.sql.session import SparkSession as ClassicSparkSession
-from pyspark.sql.connect.session import SparkSession as ConnectSparkSession
+
+if should_test_connect:
+ from pyspark.sql.connect.dataframe import DataFrame as ConnectDataFrame
+ from pyspark.sql.connect.column import Column as ConnectColumn
+ from pyspark.sql.connect.session import SparkSession as ConnectSparkSession
class ConnectCompatibilityTestsMixin:
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]