mihailom-db commented on code in PR #46574:
URL: https://github.com/apache/spark/pull/46574#discussion_r1599646376


##########
sql/core/src/test/scala/org/apache/spark/sql/CollationSuite.scala:
##########
@@ -71,6 +71,13 @@ class CollationSuite extends DatasourceV2SQLBase with AdaptiveSparkPlanHelper {
     assert(sql(s"select collate('aaa', 'utf8_binary_lcase')").schema(0).dataType == StringType(1))
   }
 
+  test("collate function syntax with default collation set") {
+    withSQLConf(SqlApiConf.DEFAULT_COLLATION -> "UTF8_BINARY_LCASE") {
+      assert(sql(s"select collate('aaa', 'utf8_binary_lcase')").schema(0).dataType == StringType(1))
+      assert(sql(s"select collate('aaa', 'UNICODE')").schema(0).dataType == StringType(2))

Review Comment:
   In tests, please opt for `StringType(collation_name)` rather than a numeric collation id; it makes the expected collation obvious and easier for everyone to review.
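
   A hypothetical standalone sketch of the readability point (these are stand-in definitions and an assumed name-to-id mapping, not Spark's real `StringType` API):

   ```scala
   // Stand-in for a collated string type, illustration only.
   class StringType(val collationId: Int)

   object StringType {
     // Assumed name-to-id mapping, for illustration only.
     private val collationIds =
       Map("UTF8_BINARY" -> 0, "UTF8_BINARY_LCASE" -> 1, "UNICODE" -> 2)
     def apply(collationId: Int): StringType = new StringType(collationId)
     def apply(name: String): StringType = new StringType(collationIds(name.toUpperCase))
   }

   // The named form reads at a glance; the numeric form makes a reviewer
   // look up which collation id 1 refers to.
   val byName = StringType("UTF8_BINARY_LCASE")
   val byId   = StringType(1)
   ```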



##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collationExpressions.scala:
##########
@@ -57,14 +57,14 @@ object CollateExpressionBuilder extends ExpressionBuilder {
     expressions match {
       case Seq(e: Expression, collationExpr: Expression) =>
         (collationExpr.dataType, collationExpr.foldable) match {
-          case (StringType, true) =>
+          case (_: StringType, true) =>

Review Comment:
   This seems like a problem that was introduced long before default collation. Could we update the title of the PR to something that expresses this? I believe it was previously not possible to collate any expression that did not evaluate to UTF8_BINARY.
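
   A standalone sketch of why the pattern had to change, using simplified stand-in classes (not Spark's actual definitions). The assumption mirrored here is that collated string types are instances of a `StringType` class while plain `StringType` is the default singleton, so `case StringType` only matches the default:

   ```scala
   // Stand-in classes: a string type parameterized by collation id, with a
   // singleton companion playing the role of the default UTF8_BINARY type.
   class StringType(val collationId: Int)
   case object StringType extends StringType(0)

   // `case StringType` compares against the singleton object, so any
   // collated instance (a different object) falls through to the wildcard:
   def matchesSingletonOnly(dt: AnyRef): Boolean = dt match {
     case StringType => true
     case _          => false
   }

   // `case _: StringType` is a type test and accepts every collation:
   def matchesAnyStringType(dt: AnyRef): Boolean = dt match {
     case _: StringType => true
     case _             => false
   }
   ```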



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

