[
https://issues.apache.org/jira/browse/SPARK-16803?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16333297#comment-16333297
]
Xiao Li commented on SPARK-16803:
---------------------------------
[~Tagar] It has been resolved by
https://issues.apache.org/jira/browse/SPARK-19152
> SaveAsTable does not work when source DataFrame is built on a Hive Table
> ------------------------------------------------------------------------
>
> Key: SPARK-16803
> URL: https://issues.apache.org/jira/browse/SPARK-16803
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.0
> Reporter: Xiao Li
> Assignee: Xiao Li
> Priority: Major
> Fix For: 2.1.0
>
>
> {noformat}
> scala> sql("create table sample.sample stored as SEQUENCEFILE as select 1 as key, 'abc' as value")
> res2: org.apache.spark.sql.DataFrame = []
> scala> val df = sql("select key, value as value from sample.sample")
> df: org.apache.spark.sql.DataFrame = [key: int, value: string]
> scala> df.write.mode("append").saveAsTable("sample.sample")
> scala> sql("select * from sample.sample").show()
> +---+-----+
> |key|value|
> +---+-----+
> | 1| abc|
> | 1| abc|
> +---+-----+
> {noformat}
> This works in Spark 1.6, but fails in Spark 2.0. The error message from
> Spark 2.0 is
> {noformat}
> scala> df.write.mode("append").saveAsTable("sample.sample")
> org.apache.spark.sql.AnalysisException: Saving data in MetastoreRelation sample, sample is not supported.;
> {noformat}
> So far, we do not plan to support this in Spark 2.0. It works in Spark 1.6
> because {{saveAsTable}} internally delegates to {{insertInto}} there. However,
> restoring that behavior would break the semantics of {{saveAsTable}}: that
> method resolves columns by name, whereas {{insertInto}} resolves them by
> position.
> Instead, users should call the {{insertInto}} API directly. We should correct
> the error message so that users understand how to work around the limitation
> until it is supported.
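> As a sketch of the suggested workaround (assuming the {{sample.sample}} Hive
> table created above already exists), the append can be done with
> {{insertInto}}; note that it matches columns by position, so the DataFrame's
> column order must match the table's:
> {noformat}
> scala> val df = sql("select key, value from sample.sample")
> df: org.apache.spark.sql.DataFrame = [key: int, value: string]
> scala> df.write.insertInto("sample.sample")
> scala> sql("select * from sample.sample").show()
> +---+-----+
> |key|value|
> +---+-----+
> | 1| abc|
> | 1| abc|
> +---+-----+
> {noformat}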
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)