[ https://issues.apache.org/jira/browse/SPARK-6200?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14364264#comment-14364264 ]

Michael Armbrust commented on SPARK-6200:
-----------------------------------------

Thank you for working on this.  I would like to support pluggable dialects, 
but I'm not sure I fully agree with the implementation.  In general it would 
be good to post a design on the JIRA and get agreement before doing too much 
implementation.

At a high level, I wonder if something much simpler would be sufficient.  I 
don't expect that users will spend a lot of time switching between dialects.  
Probably they will configure their preferred one in spark-defaults.conf and 
never think about it again.  Additionally, we will still need {{SET 
spark.sql.dialect=}} since this is public API, so why not just extend that?
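For example (assuming the existing {{spark.sql.dialect}} key keeps its name), 
the one-time setup could be as small as:

{code}
# conf/spark-defaults.conf -- set once and never think about it again
spark.sql.dialect    hiveql
{code}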

Basically I would propose we do the following.  Add a simple interface 
{{Dialect}} that takes a {{String}} and returns a {{LogicalPlan}}, as you have 
done.  For the built-in ones you just say {{SET spark.sql.dialect=sql}} or 
{{SET spark.sql.dialect=hiveql}}.  For external ones you simply provide the 
fully qualified class name.  It would also be good to be clear in the 
interface what the contract is for DDL.  I would suggest that Spark SQL always 
parses its own DDL first and only defers to the dialect when the built-in DDL 
parser does not handle the given string.
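
To make that concrete, here is a minimal Scala sketch.  The {{Dialect}} trait 
is the interface described above; everything else ({{DialectAwareParser}}, 
{{tryParseDdl}}, {{Dialects.lookup}}) is hypothetical plumbing made up to 
illustrate the dispatch and lookup, not a claim about existing Spark classes.

{code:scala}
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan

// A dialect turns a query string into a Catalyst LogicalPlan.
trait Dialect {
  def parse(sqlText: String): LogicalPlan
}

// Suggested DDL contract: Spark SQL always tries its own DDL parser
// first and only defers to the configured dialect when the built-in
// parser does not handle the string.  `tryParseDdl` stands in for
// whatever hook the built-in DDL parser would expose.
class DialectAwareParser(
    tryParseDdl: String => Option[LogicalPlan],
    dialect: Dialect) {
  def parse(sqlText: String): LogicalPlan =
    tryParseDdl(sqlText).getOrElse(dialect.parse(sqlText))
}

// Resolving spark.sql.dialect: built-in names ("sql", "hiveql") map to
// bundled dialects; any other value is treated as the fully qualified
// class name of a user-provided Dialect implementation.
object Dialects {
  def lookup(name: String, builtIns: Map[String, Dialect]): Dialect =
    builtIns.getOrElse(
      name,
      Class.forName(name).newInstance().asInstanceOf[Dialect])
}
{code}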

> Support dialect in SQL
> ----------------------
>
>                 Key: SPARK-6200
>                 URL: https://issues.apache.org/jira/browse/SPARK-6200
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: haiyang
>
> Created a new dialect manager that supports a dialect command and allows 
> adding new dialects via SQL statements, etc.


