This is an automated email from the ASF dual-hosted git repository.

yumwang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 7165123  [SPARK-32268][TESTS][FOLLOWUP] Fix `BloomFilterAggregateQuerySuite` failed in ansi mode
7165123 is described below

commit 716512364468cef3c12a85403661de2837cc6fe5
Author: yangjie01 <yangji...@baidu.com>
AuthorDate: Thu Mar 24 07:18:27 2022 +0800

    [SPARK-32268][TESTS][FOLLOWUP] Fix `BloomFilterAggregateQuerySuite` failed in ansi mode
    
    ### What changes were proposed in this pull request?
    `Test that might_contain errors out non-constant Bloom filter` in
    `BloomFilterAggregateQuerySuite` fails in ANSI mode because a
    `Numeric <=> Binary` cast is
    [not allowed in ANSI mode](https://github.com/apache/spark/pull/30260),
    so the content of `exception.getMessage` differs from that in non-ANSI
    mode.
    
    This PR changes the test case so that the error messages are consistent
    between ANSI and non-ANSI modes.
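
    A minimal sketch of the type behavior behind the fix (not part of this
    patch; the `AnsiCastSketch` object name and the local `SparkSession`
    setup are illustrative assumptions): with `spark.sql.ansi.enabled=true`
    a `bigint` column cannot be cast to `binary`, so analysis fails before
    `might_contain` is checked, while a `string` column can be cast, so both
    modes hit the same `might_contain` error.

    ```
    import org.apache.spark.sql.{AnalysisException, SparkSession}

    object AnsiCastSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[1]")
          // Same effect as exporting SPARK_ANSI_SQL_MODE=true for the suite.
          .config("spark.sql.ansi.enabled", "true")
          .getOrCreate()

        // Old test data: bigint values. Under ANSI mode the analyzer rejects
        // cast(a as binary) itself, so the exception message is about the
        // cast rather than might_contain's constant-argument requirement.
        try {
          spark.sql(
            """
              |SELECT might_contain(cast(a as binary), cast(5 as long))
              |FROM values (cast(1 as long)), (cast(2 as long)) as t(a)
              |""".stripMargin)
        } catch {
          case e: AnalysisException => println(e.getMessage)
        }

        // New test data: string values. string -> binary casts are allowed
        // in both modes, so analysis reaches might_contain's own check and
        // the suite's expected message appears with ANSI on or off.
        try {
          spark.sql(
            """
              |SELECT might_contain(cast(a as binary), cast(5 as long))
              |FROM values (cast(1 as string)), (cast(2 as string)) as t(a)
              |""".stripMargin)
        } catch {
          case e: AnalysisException => println(e.getMessage)
        }

        spark.stop()
      }
    }
    ```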
    
    ### Why are the changes needed?
    Bug fix.
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    
    - Pass GA
    - Local Test
    
    **Before**
    
    ```
    export SPARK_ANSI_SQL_MODE=false
    mvn clean test -pl sql/core -am -Dtest=none -DwildcardSuites=org.apache.spark.sql.BloomFilterAggregateQuerySuite
    ```
    
    ```
    Run completed in 23 seconds, 537 milliseconds.
    Total number of tests run: 8
    Suites: completed 2, aborted 0
    Tests: succeeded 8, failed 0, canceled 0, ignored 0, pending 0
    All tests passed.
    ```
    
    ```
    export SPARK_ANSI_SQL_MODE=true
    mvn clean test -pl sql/core -am -Dtest=none -DwildcardSuites=org.apache.spark.sql.BloomFilterAggregateQuerySuite
    ```
    
    ```
    - Test that might_contain errors out non-constant Bloom filter *** FAILED ***
      "cannot resolve 'CAST(t.a AS BINARY)' due to data type mismatch:
       cannot cast bigint to binary with ANSI mode on.
       If you have to cast bigint to binary, you can set spark.sql.ansi.enabled as false.
      ; line 2 pos 21;
      'Project [unresolvedalias('might_contain(cast(a#2424L as binary), cast(5 as bigint)), None)]
      +- SubqueryAlias t
         +- LocalRelation [a#2424L]
      " did not contain "The Bloom filter binary input to might_contain should be either a constant value or a scalar subquery expression" (BloomFilterAggregateQuerySuite.scala:171)
    ```
    
    **After**
    ```
    export SPARK_ANSI_SQL_MODE=false
    mvn clean test -pl sql/core -am -Dtest=none -DwildcardSuites=org.apache.spark.sql.BloomFilterAggregateQuerySuite
    ```
    
    ```
    Run completed in 26 seconds, 544 milliseconds.
    Total number of tests run: 8
    Suites: completed 2, aborted 0
    Tests: succeeded 8, failed 0, canceled 0, ignored 0, pending 0
    All tests passed.
    ```
    
    ```
    export SPARK_ANSI_SQL_MODE=true
    mvn clean test -pl sql/core -am -Dtest=none -DwildcardSuites=org.apache.spark.sql.BloomFilterAggregateQuerySuite
    ```
    
    ```
    Run completed in 25 seconds, 289 milliseconds.
    Total number of tests run: 8
    Suites: completed 2, aborted 0
    Tests: succeeded 8, failed 0, canceled 0, ignored 0, pending 0
    All tests passed.
    ```
    
    Closes #35953 from LuciferYang/SPARK-32268-FOLLOWUP.
    
    Authored-by: yangjie01 <yangji...@baidu.com>
    Signed-off-by: Yuming Wang <yumw...@ebay.com>
---
 .../scala/org/apache/spark/sql/BloomFilterAggregateQuerySuite.scala   | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/sql/core/src/test/scala/org/apache/spark/sql/BloomFilterAggregateQuerySuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/BloomFilterAggregateQuerySuite.scala
index 025593b..7fc89ec 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/BloomFilterAggregateQuerySuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/BloomFilterAggregateQuerySuite.scala
@@ -165,7 +165,7 @@ class BloomFilterAggregateQuerySuite extends QueryTest with SharedSparkSession {
     val exception1 = intercept[AnalysisException] {
       spark.sql("""
                   |SELECT might_contain(cast(a as binary), cast(5 as long))
-                  |FROM values (cast(1 as long)), (cast(2 as long)) as t(a)"""
+                  |FROM values (cast(1 as string)), (cast(2 as string)) as t(a)"""
         .stripMargin)
     }
     assert(exception1.getMessage.contains(
@@ -175,7 +175,7 @@ class BloomFilterAggregateQuerySuite extends QueryTest with SharedSparkSession {
     val exception2 = intercept[AnalysisException] {
       spark.sql("""
                   |SELECT might_contain((select cast(a as binary)), cast(5 as long))
-                  |FROM values (cast(1 as long)), (cast(2 as long)) as t(a)"""
+                  |FROM values (cast(1 as string)), (cast(2 as string)) as t(a)"""
         .stripMargin)
     }
     assert(exception2.getMessage.contains(
