Github user dilipbiswal commented on a diff in the pull request:
https://github.com/apache/spark/pull/20433#discussion_r222544245
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
@@ -335,6 +335,12 @@ object SQLConf {
.booleanConf
.createWithDefault(true)
+ val ANSI_SQL_PARSER =
--- End diff ---
@maropu Isn't this config too generic? When it is set to true, can we
confidently say that we follow the ANSI standard in all the other rules as
well? I don't know whether DDLs are part of the standard or not. If they
are, then we may not be ANSI compliant, right?