Github user chenghao-intel commented on a diff in the pull request:
https://github.com/apache/spark/pull/4015#discussion_r28038983
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala ---
@@ -154,6 +230,18 @@ class SQLContext(@transient val sparkContext: SparkContext)
@transient
protected[sql] val defaultSession = createSession()
+ // Subclasses may need to override the following 2 functions for custom SQL dialect parser support
+
+ protected[sql] def dialectClassName = if (conf.dialect == "sql") {
+ classOf[DefaultSQLDialect].getCanonicalName
+ } else {
+ conf.dialect
+ }
+
+ protected[sql] def resetSQLDialect() {
+ setConf(SQLConf.DIALECT, "sql")
+ }
--- End diff ---
This is for when the user specifies a wrong dialect; we need to switch to the
default one.
For example:
```
spark-sql>set spark.sql.dialect=NotExistedClass; // this is OK
spark-sql>SELECT * FROM src; // this will fail, and then we need to switch back to either hiveql or sql automatically.
```
Previously I was thinking of validating the dialect right after the `SET
spark.sql.dialect=xxx;` command; however, we don't have a `hook` mechanism for
the `SET` command in Spark SQL, so we have to check it the first time we use the
new dialect.
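The fallback described above can be sketched in isolation. This is a minimal, hypothetical sketch (not the actual `SQLContext` code): `resolveDialect`, `DefaultDialect`, and the use of `Class.forName` to detect a bad dialect class are assumptions for illustration; the real patch triggers the reset lazily when the dialect parser is first constructed.

```scala
// Hypothetical sketch of the dialect-fallback behavior discussed above:
// a "sql" setting maps to the default dialect class, any other value is
// treated as a fully qualified class name, and an unloadable class name
// causes a reset back to "sql" (the resetSQLDialect() analogue).
object DialectFallbackSketch {
  // Assumed stand-in for classOf[DefaultSQLDialect].getCanonicalName.
  val DefaultDialect = "org.apache.spark.sql.DefaultSQLDialect"

  // Mirrors dialectClassName: "sql" resolves to the default class name.
  def dialectClassName(dialect: String): String =
    if (dialect == "sql") DefaultDialect else dialect

  // On first use of the dialect, try to load its class; if that fails
  // (e.g. spark.sql.dialect=NotExistedClass), fall back to "sql".
  def resolveDialect(dialect: String): String =
    try {
      Class.forName(dialectClassName(dialect))
      dialect
    } catch {
      case _: ClassNotFoundException => "sql"
    }
}
```

So `resolveDialect("NotExistedClass")` falls back to `"sql"`, while a loadable class name is kept as-is; the check runs only when the dialect is first used, matching the lack of a `SET` hook.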