I spent a couple of hours digging into this with a colleague (http://www.d4rr3ll.com/) today, and in the end we discovered that the issue was the MySQL Connector JAR not being available in the Oozie share lib.
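
For anyone who finds this thread later: a quick way to check whether the driver is already present (assuming the default share lib path of /user/oozie/share/lib - yours may differ) is something like:

hadoop fs -ls /user/oozie/share/lib/hive/ | grep -i mysql-connector

If nothing comes back, the Hive action cannot reach the MySQL metastore, which is how we ended up with the unintuitive "Table not found" error below.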
We fixed this by copying it from the Sqoop libs folder into the Oozie shared libs in HDFS:

/usr/bin/sudo -u hdfs hadoop fs -copyFromLocal /usr/lib/sqoop/lib/mysql-connector-java-5.1.21-bin.jar /user/oozie/share/lib/hive/
/usr/bin/sudo -u hdfs hadoop fs -chown oozie:oozie /user/oozie/share/lib/hive/mysql-connector-java-5.1.21-bin.jar

I found the error message ("FAILED: SemanticException [Error 10001]: Table not found attempted_calls_import_raw_logs_named_route_name") was not at all intuitive for figuring this out, but an informed hunch from my coworker got us there in the end.

Thank you for taking a look for me.

--
Best wishes,
Dave Cardwell.

http://davecardwell.co.uk/


On 13 December 2012 13:42, Paul Chavez <[email protected]> wrote:

> Dave,
>
> I had a similar issue and had to specify the metastore connection
> properties directly in the workflow definition. Both the hive-default.xml
> and the hive-site.xml specified in the workflow definition and included in
> the deployment directory were ignored, for some reason.
>
> Once I manually entered the following properties in my workflow xml it
> worked (I am using a mysql metastore):
>
> javax.jdo.option.ConnectionURL
> javax.jdo.option.ConnectionDriverName
> javax.jdo.option.ConnectionUserName
> javax.jdo.option.ConnectionPassword
>
> Good luck,
> Paul Chavez
>
>
> -----Original Message-----
> From: Dave Cardwell [mailto:[email protected]]
> Sent: Wednesday, December 12, 2012 3:16 PM
> To: [email protected]
> Subject: Hive Action Failing in Oozie
>
> Hello there,
>
> I recently posted this question in the hadoop-user mailing list and
> somebody suggested I might have more luck here...
>
> I have an Oozie workflow that is failing on a Hive action with the
> following error:
>
> FAILED: SemanticException [Error 10001]: Table not found
> attempted_calls_import_raw_logs_named_route_name
>
> If I run the query file from the command line (as described in the map
> task log), it works fine:
>
> hive --hivevar WORKING_DIR=/user/lpadm/working/attempted_calls \
>      --hivevar TABLE_NAME=attempted_calls_import_raw_logs_named_route_name \
>      -f hive-named_route_name.q
>
> Below are the contents of hive-named_route_name.q and the full output of
> the failing map task log can be found at
> https://gist.github.com/fc665aacb3aa9c663f04 - any ideas?
>
> DROP TABLE IF EXISTS ${TABLE_NAME};
>
> CREATE EXTERNAL TABLE ${TABLE_NAME} (
>     named_route_name_id bigint,
>     prefix              string,
>     handle              string,
>     name                string,
>     iso2                string,
>     country_handle      string,
>     country_name        string,
>     dial_code           string,
>     category            string
> )
> ROW FORMAT DELIMITED
>     FIELDS TERMINATED BY '\001'
>     COLLECTION ITEMS TERMINATED BY '\002'
>     MAP KEYS TERMINATED BY '\003'
> STORED AS SequenceFile
> LOCATION '${WORKING_DIR}/${TABLE_NAME}'
> ;
>
> SET hive.exec.compress.output=true;
> SET io.seqfile.compression.type=BLOCK;
>
> INSERT INTO TABLE ${TABLE_NAME}
> SELECT nrn.named_route_name_id,
>        nrn.prefix,
>        nrn.handle,
>        nrn.name,
>        nrn.iso2,
>        nrn.country_handle,
>        nrn.country_name,
>        nrn.dial_code,
>        nrn.category
> FROM named_route_name nrn
> ORDER BY nrn.prefix ASC
> ;
>
> --
> Best wishes,
> Dave Cardwell.
>
> http://davecardwell.co.uk/
>
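
Following up for the archives: wiring Paul's workaround into the Hive action would look roughly like the sketch below. This is only a sketch, not our exact workflow - the JDBC URL, user name and password are placeholders you must replace, the hive-action schema version and the node names (hive-node, end, fail) are assumptions, and ${jobTracker}/${nameNode} follow the usual Oozie parameterisation convention. The script name and params are the ones from my original message.

<action name="hive-node">
    <hive xmlns="uri:oozie:hive-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <configuration>
            <!-- Point the action at the MySQL metastore directly, since
                 hive-site.xml/hive-default.xml were being ignored. -->
            <property>
                <name>javax.jdo.option.ConnectionURL</name>
                <value>jdbc:mysql://metastore-host:3306/hive</value>
            </property>
            <property>
                <name>javax.jdo.option.ConnectionDriverName</name>
                <value>com.mysql.jdbc.Driver</value>
            </property>
            <property>
                <name>javax.jdo.option.ConnectionUserName</name>
                <value>hive</value>
            </property>
            <property>
                <name>javax.jdo.option.ConnectionPassword</name>
                <value>secret</value>
            </property>
        </configuration>
        <script>hive-named_route_name.q</script>
        <param>WORKING_DIR=/user/lpadm/working/attempted_calls</param>
        <param>TABLE_NAME=attempted_calls_import_raw_logs_named_route_name</param>
    </hive>
    <ok to="end"/>
    <error to="fail"/>
</action>

Note that even with these properties set, the connector JAR still has to be on the action's classpath (hence the share lib fix above).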
