Github user wzhfy commented on the pull request:
https://github.com/apache/spark/pull/9349#issuecomment-152373512
@davies My intention is to change the dialect that is displayed when spark-sql
starts, not to permanently set "hiveql" as HiveContext's dialect.
Here's my description of the problem in JIRA:
When we start bin/spark-sql, the default context is HiveContext, and the
corresponding dialect is hiveql.
However, if we type "set spark.sql.dialect;", the result is "sql", which is
inconsistent with the actual dialect and is therefore misleading. For example,
we can create tables, which is only allowed in hiveql, yet this dialect conf
still shows "sql".
Although this problem does not cause any execution error, it is misleading
to Spark SQL users, so I think we should fix it.
After the change, we can still use "sql" as the dialect for HiveContext
via "set spark.sql.dialect=sql"; conf.dialect in HiveContext then becomes
"sql", because in SQLConf, def dialect reads its value through getConf(),
and the dialect stored in "settings" is now "sql".