HyukjinKwon opened a new pull request, #38325:
URL: https://github.com/apache/spark/pull/38325

   ### What changes were proposed in this pull request?
   
   This PR proposes to make the tests added in 
https://github.com/apache/spark/pull/38050 pass with ANSI mode enabled, by 
avoiding binary arithmetic operations on string-typed columns.
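
   As a rough illustration (the exact queries below are an assumption for illustration, not a quote of the PR diff): ANSI mode rejects `+` on STRING operands, so a test query of this shape has to avoid the string addition while still exercising the correlated map-typed column:

   ```
   -- Fails under ANSI mode: "+" requires NUMERIC or INTERVAL inputs, but a is STRING
   SELECT (SELECT a + a FROM (SELECT upper(CAST(x['a'] AS STRING)) AS a)) FROM v1;

   -- One way to avoid the string binary operation: skip the cast to STRING,
   -- so "+" sees the map value's element type (INT) while the correlated
   -- reference to the map-typed column x is preserved
   SELECT (SELECT a + a FROM (SELECT x['a'] AS a)) FROM v1;
   ```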
   
   ### Why are the changes needed?
   
   To make the tests pass with ANSI mode enabled. Currently, they fail as below 
(https://github.com/apache/spark/actions/runs/3286184541/jobs/5414029918):
   
   ```
   [info] - SPARK-40615: Check unsupported data type when decorrelating 
subqueries *** FAILED *** (118 milliseconds)
   [info]   "[DATATYPE_MISMATCH.BINARY_OP_WRONG_TYPE] Cannot resolve "(a + a)" 
due to data type mismatch: the binary operator requires the input type 
("NUMERIC" or "INTERVAL DAY TO SECOND" or "INTERVAL YEAR TO MONTH" or 
"INTERVAL"), not "STRING".; line 1 pos 15;
   [info]   'Project [unresolvedalias(scalar-subquery#426412 [], None)]
   [info]   :  +- 'Project [unresolvedalias((a#426411 + a#426411), None)]
   [info]   :     +- SubqueryAlias __auto_generated_subquery_name
   [info]   :        +- Project [upper(cast(outer(x#426413)[a] as string)) AS 
a#426411]
   [info]   :           +- OneRowRelation
   [info]   +- SubqueryAlias v1
   [info]      +- View (`v1`, [x#426413])
   [info]         +- Project [cast(x#426414 as map<string,int>) AS x#426413]
   [info]            +- SubqueryAlias t
   [info]               +- LocalRelation [x#426414]
   [info]   " did not contain "Correlated column reference 'v1.x' cannot be map 
type" (SubquerySuite.scala:2480)
   [info]   org.scalatest.exceptions.TestFailedException:
   [info]   at 
org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
   [info]   at 
org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)
   [info]   at 
org.scalatest.Assertions$.newAssertionFailedException(Assertions.scala:1231)
   [info]   at 
org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:1295)
   [info]   at 
org.apache.spark.sql.SubquerySuite.$anonfun$new$320(SubquerySuite.scala:2480)
   [info]   at 
scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
   [info]   at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1491)
   [info]   at 
org.apache.spark.sql.test.SQLTestUtilsBase.withTempView(SQLTestUtils.scala:276)
   [info]   at 
org.apache.spark.sql.test.SQLTestUtilsBase.withTempView$(SQLTestUtils.scala:274)
   [info]   at 
org.apache.spark.sql.SubquerySuite.withTempView(SubquerySuite.scala:32)
   [info]   at 
org.apache.spark.sql.SubquerySuite.$anonfun$new$319(SubquerySuite.scala:2459)
   [info]   at 
scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
   [info]   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
   [info]   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
   [info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
   [info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
   ```
   
   ### Does this PR introduce _any_ user-facing change?
   
   No, test-only.
   
   ### How was this patch tested?
   
   Manually ran the tests with ANSI mode enabled and verified that they pass.
