itholic commented on code in PR #39625:
URL: https://github.com/apache/spark/pull/39625#discussion_r1080824796


##########
sql/core/src/test/resources/sql-tests/results/ansi/string-functions.sql.out:
##########
@@ -6,21 +6,13 @@ struct<>
 -- !query output
 org.apache.spark.sql.AnalysisException
 {
-  "errorClass" : "DATATYPE_MISMATCH.WRONG_NUM_ARGS",
-  "sqlState" : "42K09",
+  "errorClass" : "WRONG_NUM_ARGS.WITHOUT_SUGGESTION",
+  "sqlState" : "42605",
   "messageParameters" : {
     "actualNum" : "0",
     "expectedNum" : "> 0",
-    "functionName" : "`concat_ws`",
-    "sqlExpr" : "\"concat_ws()\""
-  },
-  "queryContext" : [ {

Review Comment:
   To be honest, it is hard to figure out where these inconsistent behaviors 
come from.
   
   I used exactly the same error, `wrongNumArgsError`, for all the cases, but 
sometimes it loses the query context and sometimes it does not.
   
   e.g.
   
   `select decode()` keeps the query context in its error message, but `select 
concat_ws()` does not.
   
   But I threw the same error in both cases.
   
   Btw, do we really need to expose the `queryContext` to user space?
   
   IMHO, it does not seem like very helpful information for end-users; rather, 
it sounds a bit confusing to me.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

