I'm hoping someone else has had this problem. I tried searching but couldn't find anything.
I'm running Hive version 0.13.1 on EMR.
I have data files staged externally in S3 in the following folder structure:
s3://root-bucket/sourcesystem/source=name/y=2015/m=01/d=XX
I have created the table in Hive with this command:
CREATE EXTERNAL TABLE source_system (
  nId string,
  ...
)
PARTITIONED BY (source string, y string, m string, d string)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION 's3://root-bucket/sourcesystem';
When I try to load all of the available partitions using the RECOVER PARTITIONS 
command, I receive this error:
ALTER TABLE sourcesystem RECOVER PARTITIONS;
NoViableAltException(26@[926:1: alterTableStatementSuffix : ( alterStatementSuffixRename | alterStatementSuffixAddCol | alterStatementSuffixRenameCol | alterStatementSuffixDropPartitions | alterStatementSuffixAddPartitions | alterStatementSuffixTouch | alterStatementSuffixArchive | alterStatementSuffixUnArchive | alterStatementSuffixProperties | alterTblPartitionStatement | alterStatementSuffixSkewedby | alterStatementSuffixExchangePartition | alterStatementPartitionKeyType );])
	at org.antlr.runtime.DFA.noViableAlt(DFA.java:158)
	at org.antlr.runtime.DFA.predict(DFA.java:116)
	at org.apache.hadoop.hive.ql.parse.HiveParser.alterTableStatementSuffix(HiveParser.java:6677)
	at org.apache.hadoop.hive.ql.parse.HiveParser.alterStatement(HiveParser.java:6552)
	at org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:2189)
	at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1398)
	at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1036)
	at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:199)
	at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:404)
	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:322)
	at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:975)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1040)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:275)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:227)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:430)
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:803)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:697)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:636)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
FAILED: ParseException line 1:22 cannot recognize input near 'sourcesystem' 'recover' 'partitions' in alter table statement
However, I am able to load and query successfully when I add a single partition with this command:
ALTER TABLE sourcesystem ADD PARTITION (source = 'name', y = '2014', m = '03', d = '04')
LOCATION 'root-bucket/sourcesystem/source=name/y=2014/m=03/d=04';
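Since adding partitions one at a time does work, my stopgap for now is to script the statements rather than rely on RECOVER PARTITIONS. A rough sketch of what I mean (table, bucket, and source names are the ones from above; the date range is just an illustrative assumption):

```python
# Rough workaround sketch: emit one ADD PARTITION statement per day instead of
# relying on RECOVER PARTITIONS. Table/bucket/source names are the ones from
# the post above; the specific date range here is only an example.
def add_partition_statements(table, base, source, year, month, days):
    """Build ALTER TABLE ... ADD PARTITION statements, one per day."""
    stmts = []
    for d in days:
        suffix = "source=%s/y=%s/m=%s/d=%02d" % (source, year, month, d)
        stmts.append(
            "ALTER TABLE %s ADD IF NOT EXISTS PARTITION "
            "(source = '%s', y = '%s', m = '%s', d = '%02d') "
            "LOCATION '%s/%s';" % (table, source, year, month, d, base, suffix)
        )
    return stmts

if __name__ == "__main__":
    # Print a month's worth of statements to feed to the Hive CLI.
    for stmt in add_partition_statements(
            "source_system", "s3://root-bucket/sourcesystem",
            "name", "2015", "01", range(1, 32)):
        print(stmt)
```

That gets the data queryable, but it obviously doesn't scale across sources and years the way a single recover command would.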
I've used this same structure previously with no issues whatsoever, so I'm 
stumped.
Thanks a ton for any help, direction, or pointers you can provide.
Best,
Joshua Eldridge