[
https://issues.apache.org/jira/browse/SPARK-14507?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15240616#comment-15240616
]
Yan commented on SPARK-14507:
-----------------------------
In terms of Hive support vs. Spark SQL support, the "external table" concept in
Spark SQL seems to go beyond that in Hive, and not just for CTAS. For Hive,
an "external table" is only for the "schema-on-read" scenario over data on,
say, HDFS. It has its own somewhat unique DDL semantics and security features
that differ from those of normal SQL databases. A Spark SQL external table, as
far as I understand, could be a mapping to a data source table. I'm not sure
whether this mapping would need the same special considerations regarding DDL
semantics and security models that Hive external tables do.
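To make the "schema-on-read" case concrete, a typical Hive external-table DDL looks roughly like the sketch below (the table name, columns, and HDFS path are hypothetical, just for illustration):

```sql
-- Hypothetical example: Hive applies this schema when the files are read;
-- DROP TABLE removes only the metastore entry, not the data under LOCATION.
CREATE EXTERNAL TABLE web_logs (
  ts  BIGINT,
  url STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION 'hdfs:///data/raw/web_logs';
```

The point is that the data already exists at the given location and Hive merely overlays a schema on it, which is narrower than what Spark SQL's external tables can express.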
> Decide if we should still support CREATE EXTERNAL TABLE AS SELECT
> -----------------------------------------------------------------
>
> Key: SPARK-14507
> URL: https://issues.apache.org/jira/browse/SPARK-14507
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Reporter: Yin Huai
> Priority: Critical
>
> Looks like we support CREATE EXTERNAL TABLE AS SELECT by accident. Should we
> still support it? It seems Hive does not support it. Based on the Impala
> docs, it seems Impala does support it. Right now, it seems the CreateTables
> rule in HiveMetastoreCatalog.scala does not respect the EXTERNAL keyword when
> {{hive.convertCTAS}} is true and the CTAS query does not specify a storage
> format. In this case, the table becomes a MANAGED_TABLE stored in the
> default metastore location (not the user-specified location).
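For reference, a minimal statement that would hit the case described in the issue (table, path, and source names are hypothetical) might look like:

```sql
-- Hypothetical example: no storage format clause is given, so per the
-- issue description, with hive.convertCTAS=true the EXTERNAL keyword and
-- the LOCATION are not respected and the result becomes a MANAGED_TABLE
-- in the default metastore location.
CREATE EXTERNAL TABLE t
LOCATION '/user/alice/t'
AS SELECT * FROM src;
```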
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)