This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new a75bc841db2 [SPARK-41412][CONNECT][TESTS][FOLLOW-UP] Exclude binary casting to make the tests pass with/without ANSI mode
a75bc841db2 is described below

commit a75bc841db200a8d79ae8aabf3bff0308bcaadbb
Author: Hyukjin Kwon <gurwls...@apache.org>
AuthorDate: Wed Dec 14 08:35:04 2022 +0900

    [SPARK-41412][CONNECT][TESTS][FOLLOW-UP] Exclude binary casting to make the tests pass with/without ANSI mode
    
    ### What changes were proposed in this pull request?
    
    This PR is another follow-up of https://github.com/apache/spark/pull/39034 that instead makes the tests pass with/without ANSI mode.
    
    ### Why are the changes needed?
    
    Spark Connect uses an isolated Spark session, so setting the configuration on the PySpark side does not take effect. Therefore, the test still fails; see https://github.com/apache/spark/actions/runs/3681383627/jobs/6228030132.
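
    For context, the removed `with self.sql_conf(...)` wrapper roughly amounts to the sketch below (an approximation of the testing helper, shown for illustration only): it sets and later restores SQL configurations on the local test session, which the Spark Connect server's session never observes.

    ```python
    from contextlib import contextmanager

    @contextmanager
    def sql_conf(spark, pairs):
        # Approximation: set the given SQL configurations on the *local* session
        # and restore the previous values afterwards.
        old = {key: spark.conf.get(key, None) for key in pairs}
        try:
            for key, value in pairs.items():
                spark.conf.set(key, value)
            yield
        finally:
            for key, value in old.items():
                if value is None:
                    spark.conf.unset(key)
                else:
                    spark.conf.set(key, value)

    # Queries issued through a Spark Connect client are planned and executed by
    # the Connect server's own session, so wrapping them in this block does not
    # change the ANSI behaviour they observe.
    ```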
    
    We should make the tests pass with/without ANSI mode for now.
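
    To make the ANSI sensitivity concrete: with `spark.sql.ansi.enabled` set to true, Spark rejects casting a numeric column to `BinaryType`, while the cast is allowed with ANSI mode off. A minimal sketch against a plain local (non-Connect) session, for illustration only:

    ```python
    from pyspark.sql import SparkSession
    from pyspark.sql.types import BinaryType

    spark = SparkSession.builder.master("local[1]").getOrCreate()
    df = spark.range(1)

    # With ANSI mode enabled, the numeric-to-binary cast is rejected.
    spark.conf.set("spark.sql.ansi.enabled", "true")
    try:
        df.select(df.id.cast(BinaryType())).collect()
    except Exception as error:
        print("Rejected under ANSI mode:", type(error).__name__)

    # With ANSI mode disabled, the same cast succeeds.
    spark.conf.set("spark.sql.ansi.enabled", "false")
    df.select(df.id.cast(BinaryType())).collect()
    ```

    Dropping `BinaryType()` from the cast loop therefore keeps the test meaningful regardless of the ANSI setting on the Connect server.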
    
    ### Does this PR introduce _any_ user-facing change?
    No, test-only
    
    ### How was this patch tested?
    
    Manually tested via:
    
    ```bash
    SPARK_ANSI_SQL_MODE=true ./python/run-tests --testnames 'pyspark.sql.tests.connect.test_connect_column'
    ```
    
    Closes #39050 from HyukjinKwon/SPARK-41412.
    
    Authored-by: Hyukjin Kwon <gurwls...@apache.org>
    Signed-off-by: Hyukjin Kwon <gurwls...@apache.org>
---
 .../sql/tests/connect/test_connect_column.py       | 35 ++++++++++------------
 1 file changed, 15 insertions(+), 20 deletions(-)

diff --git a/python/pyspark/sql/tests/connect/test_connect_column.py b/python/pyspark/sql/tests/connect/test_connect_column.py
index e6701231990..8b70b4d9a44 100644
--- a/python/pyspark/sql/tests/connect/test_connect_column.py
+++ b/python/pyspark/sql/tests/connect/test_connect_column.py
@@ -26,7 +26,6 @@ from pyspark.sql.types import (
     DoubleType,
     LongType,
     DecimalType,
-    BinaryType,
     BooleanType,
 )
 from pyspark.testing.connectutils import should_test_connect
@@ -153,25 +152,21 @@ class SparkConnectTests(SparkConnectSQLTestCase):
             df.select(df.id.cast("string")).toPandas(), df2.select(df2.id.cast("string")).toPandas()
         )
 
-        # Test if the arguments can be passed properly.
-        # Do not need to check individual behaviour for the ANSI mode thoroughly.
-        with self.sql_conf({"spark.sql.ansi.enabled": False}):
-            for x in [
-                StringType(),
-                BinaryType(),
-                ShortType(),
-                IntegerType(),
-                LongType(),
-                FloatType(),
-                DoubleType(),
-                ByteType(),
-                DecimalType(10, 2),
-                BooleanType(),
-                DayTimeIntervalType(),
-            ]:
-                self.assert_eq(
-                    df.select(df.id.cast(x)).toPandas(), df2.select(df2.id.cast(x)).toPandas()
-                )
+        for x in [
+            StringType(),
+            ShortType(),
+            IntegerType(),
+            LongType(),
+            FloatType(),
+            DoubleType(),
+            ByteType(),
+            DecimalType(10, 2),
+            BooleanType(),
+            DayTimeIntervalType(),
+        ]:
+            self.assert_eq(
+                df.select(df.id.cast(x)).toPandas(), df2.select(df2.id.cast(x)).toPandas()
+            )
 
     def test_unsupported_functions(self):
         # SPARK-41225: Disable unsupported functions.

