yaooqinn commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1166618008
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala:
##########
@@ -742,6 +742,7 @@ object FunctionRegistry {
expression[SparkVersion]("version"),
expression[TypeOf]("typeof"),
expression[EqualNull]("equal_null"),
+ expression[SQLKeywords]("sql_keywords"),
Review Comment:
Although the function itself is not part of the ANSI standard, extending the standard with additional useful functions is still ANSI-compliant behavior. Let's discuss whether to add it based on its usability:
- As we can see, most systems expose this functionality through user-friendly SQL APIs, either as a `command` or as `functions`, even though it is already available in `information_schema`, which is not always accessible to end users.
- Compared with adding a new command, a new function is much more lightweight.
- The standard JDBC API only supports (by contract) listing keywords; it cannot tell whether a keyword is reserved or non-reserved (see the JDBC sketch after this list).
- The return type of `sql_keywords` is compliant with `information_schema`; it simply acts as a `view` of `information_schema.keywords` (see the usage sketch below).
- Is `sql_keywords` a suitable name?
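
For reference, a minimal Scala sketch of the JDBC contract mentioned above: `java.sql.DatabaseMetaData.getSQLKeywords` only returns a flat, comma-separated string of vendor keywords, with no reserved/non-reserved distinction. The JDBC URL below is a placeholder, not a real endpoint.

```scala
import java.sql.DriverManager

object JdbcKeywordsSketch {
  def main(args: Array[String]): Unit = {
    // Placeholder URL; any JDBC-compliant driver exposes the same contract.
    val conn = DriverManager.getConnection("jdbc:example://localhost/db")
    try {
      // getSQLKeywords returns a single comma-separated String of the
      // driver's vendor-specific keywords; the contract carries no flag
      // telling us whether each keyword is reserved or non-reserved.
      val keywords: Array[String] = conn.getMetaData.getSQLKeywords.split(",")
      keywords.foreach(println)
    } finally {
      conn.close()
    }
  }
}
```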
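And a hedged sketch of how the proposed function might be consumed from Spark SQL, assuming it is exposed as a generator producing `(keyword, reserved)` rows mirroring `information_schema.keywords`; the column names and the exact invocation form are assumptions here, not the finalized API.

```scala
import org.apache.spark.sql.SparkSession

object SqlKeywordsUsageSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sql-keywords-sketch")
      .master("local[*]")
      .getOrCreate()

    // Assumed invocation: the generator is queried like a table-valued
    // function and yields one row per keyword with a reserved flag.
    // The (keyword, reserved) schema mirrors information_schema.keywords
    // but is an assumption, not the finalized return type.
    spark.sql("SELECT * FROM sql_keywords() WHERE reserved").show(truncate = false)

    spark.stop()
  }
}
```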
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]