[ https://issues.apache.org/jira/browse/SPARK-6043?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14980899#comment-14980899 ]

kevin yu commented on SPARK-6043:
---------------------------------

Hello Trystan: I tried your test case, and it works on Spark 1.5; it seems the 
problem has been fixed. Can you verify and close this JIRA? Thanks.
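
A minimal sketch of a verification run, assuming a 1.5 SQL shell (beeline or spark-sql) and the same {{internalsales}} table from the test case quoted below; the DROP statements are only cleanup so the script can be re-run:

{code:sql}
-- Clean up leftovers from earlier runs
DROP TABLE IF EXISTS tmp_table;
DROP TABLE IF EXISTS not_tmp;

-- Same statements as the original test case
CREATE TABLE tmp_table (salesamount_c1 DOUBLE);
INSERT OVERWRITE TABLE tmp_table SELECT
   MIN(sales_customer.salesamount) salesamount_c1
FROM
(
      SELECT
         SUM(sales.salesamount) salesamount
      FROM
         internalsales sales
) sales_customer;

-- The step that used to fail; on 1.5 it completes
ALTER TABLE tmp_table RENAME TO not_tmp;

-- Should list not_tmp and no longer list tmp_table
SHOW TABLES;
{code}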

> Error when trying to rename table with alter table after using INSERT 
> OVERWRITE to populate the table
> ----------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-6043
>                 URL: https://issues.apache.org/jira/browse/SPARK-6043
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.1
>            Reporter: Trystan Leftwich
>            Priority: Minor
>
> If you populate a table using INSERT OVERWRITE and then try to rename the 
> table using ALTER TABLE, it fails with:
> {noformat}
> Error: org.apache.spark.sql.execution.QueryExecutionException: FAILED: 
> Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. 
> Unable to alter table. (state=,code=0)
> {noformat}
> The following SQL statements reproduce the error:
> {code:sql}
> CREATE TABLE `tmp_table` (salesamount_c1 DOUBLE);
> INSERT OVERWRITE table tmp_table SELECT
>    MIN(sales_customer.salesamount) salesamount_c1
> FROM
> (
>       SELECT
>          SUM(sales.salesamount) salesamount
>       FROM
>          internalsales sales
> ) sales_customer;
> ALTER TABLE tmp_table RENAME TO not_tmp;
> {code}
> But if you change 'OVERWRITE' to 'INTO', the SQL statement works.
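> For comparison, a sketch of the working variant (only the keyword changes; 
> everything else, including the table created above, is identical):
> {code:sql}
> INSERT INTO TABLE tmp_table SELECT
>    MIN(sales_customer.salesamount) salesamount_c1
> FROM
> (
>       SELECT
>          SUM(sales.salesamount) salesamount
>       FROM
>          internalsales sales
> ) sales_customer;
> ALTER TABLE tmp_table RENAME TO not_tmp;
> {code}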
> This is happening on our CDH 5.3 cluster with multiple workers. If we use the 
> CDH 5.3 Quickstart VM, the SQL does not produce an error. Both cases used 
> Spark 1.2.1 built for Hadoop 2.4+.


