AngersZhuuuu commented on a change in pull request #26053: 
[SPARK-29379][SQL] SHOW FUNCTIONS show '!=', '<>', 'between', 'case'
URL: https://github.com/apache/spark/pull/26053#discussion_r335305020
 
 

 ##########
 File path: sql/core/src/test/scala/org/apache/spark/sql/SQLQuerySuite.scala
 ##########
 @@ -59,7 +59,8 @@ class SQLQuerySuite extends QueryTest with SharedSparkSession {
   test("show functions") {
     def getFunctions(pattern: String): Seq[Row] = {
       StringUtils.filterPattern(
-        spark.sessionState.catalog.listFunctions("default").map(_._1.funcName), pattern)
+        spark.sessionState.catalog.listFunctions("default").map(_._1.funcName)
+        ++ Seq("!=", "<>", "between", "case"), pattern)
 
 Review comment:
   > `Seq("!=", "<>", "between", "case")` appears many times, shall we put it in
   > 
   > ```
   > object ShowFunctionsCommand {
   >   // operators that do not have corresponding functions.
   >   // They should be handled in `DropFunctionCommand` as well. 
   >   val virtualOperators = Seq("!=", "<>", "between", "case")
   > }
   > ```
   
   Since it appears in several commands, not just `ShowFunctionsCommand`, I changed the name to `FunctionsCommand` and replaced every occurrence of `Seq("!=", "<>", "between", "case")` with it:
   
   
   ```
   object FunctionsCommand {
     // operators that do not have corresponding functions.
     // They should be handled in `CreateFunctionCommand`, `DescribeFunctionCommand`,
     // `DropFunctionCommand` and `ShowFunctionsCommand`.
     val virtualOperators = Seq("!=", "<>", "between", "case")
   }
   
   ```
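   For illustration, a minimal self-contained sketch of how the shared constant would be consumed. `getFunctions` here is a hypothetical stand-in for the test helper above: the catalog lookup is replaced by a plain `Seq[String]` argument, and Spark's internal `StringUtils.filterPattern` is approximated by a simple `*`-wildcard match, so only `FunctionsCommand.virtualOperators` itself is taken from the proposal.
   
   ```scala
   // Proposed shared object: operators that are parsed specially and have
   // no corresponding registered function in the catalog.
   object FunctionsCommand {
     val virtualOperators = Seq("!=", "<>", "between", "case")
   }
   
   // Hypothetical stand-in for the test helper: append the virtual operators
   // to whatever the catalog reports, then filter by a '*'-wildcard pattern.
   def getFunctions(catalogFunctions: Seq[String], pattern: String): Seq[String] = {
     val regex = pattern.replace("*", ".*")
     (catalogFunctions ++ FunctionsCommand.virtualOperators).filter(_.matches(regex))
   }
   ```
   
   With this, `SHOW FUNCTIONS`-style filtering picks up the virtual operators without each command re-declaring the list.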

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services