dongjoon-hyun commented on pull request #32578:
URL: https://github.com/apache/spark/pull/32578#issuecomment-842881500


   This has been backported to branch-3.1, too:
   - https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test/job/spark-branch-3.1-test-maven-hadoop-2.7-scala-2.13/
   
   I manually checked the result in branch-3.1 with Scala 2.13.
   ```
   ...
   [info] SQLQueryTestSuite:
   23:17:59.055 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   [info] - subquery/subquery-in-from.sql (741 milliseconds)
   [info] - subquery/exists-subquery/exists-joins-and-set-ops.sql (30 seconds, 725 milliseconds)
   [info] - subquery/exists-subquery/exists-having.sql (1 second, 60 milliseconds)
   [info] - subquery/exists-subquery/exists-orderby-limit.sql (2 seconds, 586 milliseconds)
   [info] - subquery/exists-subquery/exists-cte.sql (933 milliseconds)
   [info] - subquery/exists-subquery/exists-basic.sql (546 milliseconds)
   [info] - subquery/exists-subquery/exists-within-and-or.sql (286 milliseconds)
   [info] - subquery/exists-subquery/exists-aggregate.sql (3 seconds, 524 milliseconds)
   [info] - subquery/negative-cases/subq-input-typecheck.sql (91 milliseconds)
   [info] - subquery/negative-cases/invalid-correlation.sql (110 milliseconds)
   23:18:49.431 WARN org.apache.spark.sql.catalyst.util.package: Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.sql.debug.maxToStringFields'.
   [info] - subquery/scalar-subquery/scalar-subquery-select.sql (2 seconds, 641 milliseconds)
   ...
   ```
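
   For anyone who wants to reproduce such a check locally, here is a minimal sketch of the commands involved. The exact invocation is an assumption on my side; the linked Jenkins job uses the Maven build, so the actual check may well have been run with Maven rather than sbt.
   ```
   # Hypothetical reproduction steps: run SQLQueryTestSuite on branch-3.1 with Scala 2.13.
   git checkout branch-3.1
   ./dev/change-scala-version.sh 2.13    # switch the build definitions to Scala 2.13
   ./build/sbt -Pscala-2.13 "sql/testOnly org.apache.spark.sql.SQLQueryTestSuite"
   ```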

