[
https://issues.apache.org/jira/browse/SPARK-10672?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Reynold Xin updated SPARK-10672:
--------------------------------
Target Version/s: 1.6.0, 1.5.1 (was: 1.5.1)
> We should not fail to create a table if we cannot persist metadata of a data
> source table to the metastore in a Hive-compatible way
> -------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-10672
> URL: https://issues.apache.org/jira/browse/SPARK-10672
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Reporter: Yin Huai
> Assignee: Yin Huai
> Priority: Blocker
>
> It is possible that Hive has internal restrictions on what table metadata it
> accepts (e.g. Hive 0.13 does not support decimal columns stored in Parquet).
> If that is the case, we should not fail table creation when we cannot store
> the metadata in a Hive-compatible way. We should just save it in the Spark
> SQL specific format.
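The fallback behavior described above can be sketched roughly as follows. All class and method names here are illustrative, not Spark's actual internals; the "parquet-decimal" check merely stands in for whatever restriction a given Hive version imposes.

```java
// Hypothetical sketch of the proposed fallback: attempt to persist table
// metadata in a Hive-compatible layout; if Hive rejects it, save the
// metadata in the Spark SQL specific format instead of failing the
// CREATE TABLE. Names are illustrative only.
public class MetadataPersistence {

    static String saveHiveCompatible(String format) {
        // Stand-in for a Hive-internal restriction, e.g. Hive 0.13
        // rejecting decimal columns stored in Parquet.
        if (format.equals("parquet-decimal")) {
            throw new IllegalArgumentException("Hive cannot store this metadata");
        }
        return "hive-compatible";
    }

    static String persistMetadata(String format) {
        try {
            return saveHiveCompatible(format);
        } catch (IllegalArgumentException e) {
            // Do not fail table creation; fall back to Spark's own format.
            return "spark-sql-specific";
        }
    }

    public static void main(String[] args) {
        System.out.println(persistMetadata("orc"));             // hive-compatible
        System.out.println(persistMetadata("parquet-decimal")); // spark-sql-specific
    }
}
```

The key point of the design is that the catch branch swallows the incompatibility and records the table anyway, so only the Hive-interoperability of the metadata is lost, not the table itself.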
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)