Hi Anil Kumar

The error is basically due to the connector jars being missing from the 
classpath. Please ensure you have derbyclient.jar and derbytools.jar in 
..../hive/lib . In the worst case you may need to add these jars to 
..../hadoop/lib as well.

A snippet from the Hive wiki:

Copy Derby Jar Files
Now since there is a new client you MUST make sure hive has these in the lib 
directory or in the classpath. The same would be true if you used MySQL or some 
other DB.
cp /opt/hadoop/db-derby-10.4.1.3-bin/lib/derbyclient.jar /opt/hadoop/hive/lib
cp /opt/hadoop/db-derby-10.4.1.3-bin/lib/derbytools.jar /opt/hadoop/hive/lib 
If you receive the error "javax.jdo.JDOFatalInternalException: Error creating 
transactional connection factory" where the stack trace originates 
"org.datanucleus.exceptions.ClassNotResolvedException: Class 
'org.apache.derby.jdbc.ClientDriver' was not found in the CLASSPATH. Please 
check your specification and your CLASSPATH", you may benefit from putting the 
derby jar files directly in the hadoop lib directory:
cp /opt/hadoop/db-derby-10.4.1.3-bin/lib/derbyclient.jar 
/opt/hadoop/hadoop-0.17.2.1/lib
cp /opt/hadoop/db-derby-10.4.1.3-bin/lib/derbytools.jar 
/opt/hadoop/hadoop-0.17.2.1/lib 
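If you want to confirm that a jar you copied actually contains the driver class Hive complains about, here is a quick check (an editorial sketch, not from the wiki — the class name `JarClassCheck` and the default jar path are illustrative):

```java
import java.io.IOException;
import java.util.zip.ZipFile;

// Sketch: verify that a jar on disk contains the Derby client driver
// class, before worrying about CLASSPATH ordering.
public class JarClassCheck {

    // Returns true if the jar at jarPath contains the .class entry
    // corresponding to className (dots become slashes).
    static boolean jarContains(String jarPath, String className) throws IOException {
        String entry = className.replace('.', '/') + ".class";
        try (ZipFile zf = new ZipFile(jarPath)) {
            return zf.getEntry(entry) != null;
        }
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical default -- point this at the jar in your hive/lib.
        String jar = args.length > 0 ? args[0] : "derbyclient.jar";
        System.out.println(jarContains(jar, "org.apache.derby.jdbc.ClientDriver"));
    }
}
```

A `false` from this check means the jar you copied is not the one providing `org.apache.derby.jdbc.ClientDriver`, which would explain the ClassNotResolvedException below.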

 
Regards,
Bejoy KS


________________________________
 From: Bertrand Dechoux <[email protected]>
To: [email protected] 
Sent: Thursday, September 27, 2012 10:57 AM
Subject: Re: issue hive with external derby
 

Hi,

For 1), did you follow the wiki?
https://cwiki.apache.org/confluence/display/Hive/HiveDerbyServerMode

Maybe you didn't provide the right jars.
Did you check that you could connect yourself to the database?
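A minimal way to "connect yourself" outside Hive is a standalone JDBC check (an editorial sketch, not from the thread — the class name `DerbyConnectCheck` is made up, and the URL mirrors the hive-site.xml settings quoted below; it assumes the Derby network server is listening on localhost:1527 and derbyclient.jar is on the classpath):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

// Sketch: try the same driver class and JDBC URL that Hive's metastore
// is configured with, so driver and connectivity problems surface
// independently of Hive.
public class DerbyConnectCheck {

    // Returns true if the driver class can be resolved on the classpath.
    // A false here corresponds to the ClassNotResolvedException Hive reports.
    static boolean driverPresent(String driverClass) {
        try {
            Class.forName(driverClass);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String driver = "org.apache.derby.jdbc.ClientDriver";
        String url = "jdbc:derby://localhost:1527/myderby1;create=true";
        if (!driverPresent(driver)) {
            System.err.println(driver + " not on classpath -- add derbyclient.jar");
            return;
        }
        try (Connection c = DriverManager.getConnection(url)) {
            System.out.println("Connected: " + c.getMetaData().getURL());
        } catch (SQLException e) {
            System.err.println("Driver present but connection failed: " + e.getMessage());
        }
    }
}
```

If this small program fails the same way Hive does, the problem is the Derby setup or classpath, not Hive itself.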

For 2), I don't know which Hadoop version is supported.
https://cwiki.apache.org/confluence/display/Hive/GettingStarted
"Most of our testing has been on Hadoop 0.20 - so we advise running it against 
this version even though it may compile/work against other versions"

See http://www.cloudera.com/blog/2012/01/an-update-on-apache-hadoop-1-0/ for a 
short overview of versions.

Regards

Bertrand



On Thu, Sep 27, 2012 at 5:44 AM, AnilKumar B <[email protected]> wrote:

Hi,
>
>Can anybody help me with the following issues?
>
>1) I am using hadoop-1.0.3 with hive-0.9.0
>When I start Hive in embedded Derby mode it works fine, but when I start in 
>external Derby mode I get the following error. What could be the issue?
>hive> show tables;
>FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error creating 
>transactional connection factory
>NestedThrowables:
>java.lang.reflect.InvocationTargetException
>FAILED: Execution Error, return code 1 from 
>org.apache.hadoop.hive.ql.exec.DDLTask
>hive> 
>
>
>myconfig is:
><property>
>  <name>javax.jdo.option.ConnectionURL</name>
>  <value>jdbc:derby://localhost:1527/myderby1;create=true</value>
>  <description>JDBC connect string for a JDBC metastore</description>
></property>
><property>
>  <name>javax.jdo.option.ConnectionDriverName</name>
>  <value>org.apache.derby.jdbc.ClientDriver</value>
>  <description>Driver class name for a JDBC metastore</description>
></property>
>
>2) I tried hadoop-0.22.0 with hive-0.8.0. It always throws a shims issue. 
>Can you please tell me how to overcome this?
>
>Thanks,
>B Anil Kumar.
>


-- 
Bertrand Dechoux
