[ https://issues.apache.org/jira/browse/SPARK-16570?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-16570.
-------------------------------
Resolution: Duplicate
Fix Version/s: (was: 1.6.2)
Target Version/s: (was: 1.6.2)
> Not able to access table's data after ALTER TABLE RENAME in Spark 1.6.2
> -----------------------------------------------------------------------
>
> Key: SPARK-16570
> URL: https://issues.apache.org/jira/browse/SPARK-16570
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.6.1, 1.6.2
> Environment: Ubuntu 14.04, Hadoop 2.7.1
> Reporter: Ales Cervenka
>
> Spark 1.6.1 and 1.6.2 are not able to read data from a table after the table
> has been renamed. This can be reproduced with the following actions in the
> spark-shell:
> sqlContext.sql("SELECT 1 as col1, 2 as col2").write.format("parquet").mode("overwrite").saveAsTable("mytesttable")
> sqlContext.sql("SELECT * FROM mytesttable").show()
> +----+----+
> |col1|col2|
> +----+----+
> | 1| 2|
> +----+----+
> sqlContext.sql("ALTER TABLE mytesttable RENAME TO mytesttable_withnewname")
> sqlContext.sql("SELECT * FROM mytesttable_withnewname").show()
> +----+----+
> |col1|col2|
> +----+----+
> +----+----+
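> As a sanity check (the warehouse location below is an assumption for a default
> Hive warehouse layout; substitute the table's actual directory), reading the
> renamed directory directly appears to work, which suggests the data itself is
> intact and only the table metadata has gone stale:
> // hypothetical path; use the directory under hive.metastore.warehouse.dir
> sqlContext.read.parquet("/user/hive/warehouse/mytesttable_withnewname").show()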
> I believe the issue is related to SPARK-14920 and SPARK-15635: Spark stores
> (and later reads back) the table's location in a "path" SerDe property, which
> HiveMetaStore's alter_table does not update. Since the rename also moves the
> actual directory on HDFS, the "path" property no longer points to the correct
> location after the rename, so queries against the renamed table return no rows.
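> A possible workaround (only a sketch; the HDFS path is hypothetical, and whether
> the 1.6.x shell passes this DDL through to Hive is an assumption) is to point
> the stale "path" SerDe property back at the renamed directory:
> // run via HiveContext, or in the Hive CLI if the statement is not passed through
> sqlContext.sql("ALTER TABLE mytesttable_withnewname SET SERDEPROPERTIES ('path' = 'hdfs:///user/hive/warehouse/mytesttable_withnewname')")
> Re-creating the table under the new name with saveAsTable would also work, at
> the cost of rewriting the data.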