[ https://issues.apache.org/jira/browse/SPARK-4232?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14198430#comment-14198430 ]
shengli edited comment on SPARK-4232 at 11/7/14 6:31 AM:
---------------------------------------------------------
There is also a workaround that does not require modifying the Hive code: extract the database name and split the statement into three commands. First, change the database; second, execute the TRUNCATE TABLE command; third, switch back to the original database.


was (Author: oopsoutofmemory):
There is also a workaround that does not require modifying the Hive code: extract the database name and split the statement into two commands. First, change the database; second, execute TRUNCATE TABLE.

> Truncate table does not work when the table is specified from a non-current database session
> --------------------------------------------------------------------------------------------
>
>                 Key: SPARK-4232
>                 URL: https://issues.apache.org/jira/browse/SPARK-4232
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.1.0
>            Reporter: shengli
>            Priority: Minor
>             Fix For: 1.1.1, 1.2.0
>
>
> Currently, TRUNCATE TABLE works fine when the current database session is the one that contains the table, but it does not work when the table belongs to a different database.
> What I mean is:
> Assume we have two databases, default and dw, and a table named test_table in database dw.
> By default we log in with the default database session, so running:
> use dw;
> truncate table test_table [partitions......];
> is OK.
> But if I stay in the default database and run:
> use default;
> truncate table dw.test_table;
> it throws an exception:
> Failed to parse: truncate table dw.test_table.
> line 1:17 missing EOF at '.' near 'dw'
> This is a bug in parsing truncate table xxx

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
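The three-step workaround described in the comment can be sketched in HiveQL, using the database and table names from the report (and assuming the session started in the default database):

```sql
-- Workaround sketch for SPARK-4232: avoid the unsupported
-- "truncate table dw.test_table" form by splitting it up.

USE dw;                      -- 1. switch to the database that owns the table
TRUNCATE TABLE test_table;   -- 2. truncate without the database qualifier
USE default;                 -- 3. switch back to the original database
```

A driver script applying this workaround would extract the `dw` prefix from the qualified name programmatically and issue these three statements in sequence on the same session.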