Thanks Dean, I will try to remove the unnecessary jars and come back with the outcome.

On Jan 23, 2013 7:11 PM, "Dean Wampler" <dean.wamp...@thinkbiganalytics.com> wrote:
> I see from your listing that your jar contains the contents of many, if
> not all, of the Apache and logging jars that are also in the hadoop/lib and
> hive/lib directories, including the core Hadoop and Hive jars themselves,
> plus some Google, JSON, and other libraries. Most likely this is causing
> issues, besides creating an unnecessarily large jar file for your code.
>
> You should remove ALL the contents of other jars from your jar and use ADD
> JAR only on the unique jars, like the Joda-Time jar you're using. I suspect
> that will reduce or eliminate the problems, even though it can be tedious.
> I suspect you only really need to add a few extra jars, though.
>
> The root-cause exception:
>
> Caused by: java.lang.NullPointerException
>     at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:443)
>     at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:355)
>     ...
>
> involves $HIVE_HOME/lib/datanucleus-core-X.Y.Z.jar. I can only guess
> that loading multiple copies of other jars' contents is tripping it up
> somehow.
>
> Good luck,
> dean
>
> On Wed, Jan 23, 2013 at 11:44 AM, Ehsan Haq <ehsan....@klarna.com> wrote:
>
>> I tried to rename my SerDe jar to zzzz.jar so that it would be loaded
>> last, but the behaviour is still the same. When I run "show tables;" in the
>> terminal, I get this exception in the logs. The list of the classes in the
>> SerDe jar is added in the attachment.
>>
>> 2013-01-23 18:29:59,852 ERROR exec.Task (SessionState.java:printError(380)) - FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
>> NestedThrowables:
>> java.lang.reflect.InvocationTargetException
>> org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
>> NestedThrowables:
>> java.lang.reflect.InvocationTargetException
>>     at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1099)
>>     at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1084)
>>     at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1957)
>>     at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:306)
>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:133)
>>     at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1332)
>>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1123)
>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:931)
>>     at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:255)
>>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:212)
>>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
>>     at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:671)
>>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:554)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:601)
>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>> Caused by: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
>> NestedThrowables:
>> java.lang.reflect.InvocationTargetException
>>     at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1186)
>>     at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
>>     at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
>>     at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:246)
>>     at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:275)
>>     at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:208)
>>     at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:183)
>>     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
>>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:407)
>>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:359)
>>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:504)
>>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:266)
>>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:228)
>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:114)
>>     at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2110)
>>     at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2120)
>>     at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1095)
>>     ... 18 more
>> Caused by: java.lang.reflect.InvocationTargetException
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:601)
>>     at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
>>     at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
>>     ... 35 more
>> Caused by: java.lang.NullPointerException
>>     at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:443)
>>     at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:355)
>>     at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensions(NonManagedPluginRegistry.java:215)
>>     at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensionPoints(NonManagedPluginRegistry.java:156)
>>     at org.datanucleus.plugin.PluginManager.registerExtensionPoints(PluginManager.java:82)
>>     at org.datanucleus.OMFContext.<init>(OMFContext.java:156)
>>     at org.datanucleus.OMFContext.<init>(OMFContext.java:137)
>>     at org.datanucleus.ObjectManagerFactoryImpl.initialiseOMFContext(ObjectManagerFactoryImpl.java:132)
>>     at org.datanucleus.jdo.JDOPersistenceManagerFactory.initialiseProperties(JDOPersistenceManagerFactory.java:363)
>>     at org.datanucleus.jdo.JDOPersistenceManagerFactory.<init>(JDOPersistenceManagerFactory.java:307)
>>     at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:255)
>>     at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
>>     ... 43 more
>>
>>
>> On Wed, Jan 23, 2013 at 3:05 PM, Ehsan Haq <ehsan....@klarna.com> wrote:
>>
>>> Thanks Dean, I knew about the .hiverc script; I will try the other
>>> alternative of renaming the jar and get back. Thanks.
>>>
>>> /Ehsan
>>>
>>>
>>> On Wed, Jan 23, 2013 at 2:57 PM, Dean Wampler <
>>> dean.wamp...@thinkbiganalytics.com> wrote:
>>>
>>>> Is there anything in the logs about problems loading the jar, etc.?
>>>>
>>>> The jar files in $HIVE_HOME/lib are added to the CLASSPATH in alphabetical
>>>> order. As an experiment, rename your jar with a name that will sort last,
>>>> something like zzz.jar, and see what happens when you start Hive. If it
>>>> then seems to work normally, it may be that some file in your jar has
>>>> the same name as a file in one of Hive's jars and yours gets read
>>>> instead. An XML config file, for example.
>>>>
>>>> If this experiment appears to work, run "jar tf name-of-your-file.jar"
>>>> and post the listing here, if you don't mind.
>>>>
>>>> Note that an alternative to dropping jar files in Hive's lib directory
>>>> is to put the "add jar ..." command in your $HOME/.hiverc file, which Hive
>>>> reads on startup (Hive v0.7.0 and later). What Hive version are you
>>>> using, by the way? I also put lots of custom configuration commands
>>>> there, e.g., set hive.exec.mode.local.auto=true; (to encourage local
>>>> mode execution, when possible).
>>>>
>>>> dean
>>>>
>>>>
>>>> On Wed, Jan 23, 2013 at 6:41 AM, Ehsan Haq <ehsan....@klarna.com> wrote:
>>>>
>>>>> Hi,
>>>>> I have written a custom SerDe and bundled it in a jar file, which
>>>>> works fine when I add the jar using the CLI command "add jar". However,
>>>>> when I put the jar in the hive/lib folder so that I don't have to
>>>>> explicitly add the jar, it looks like other jars fail to load, and as a
>>>>> result the metadata is not accessible either. The jar works just fine
>>>>> if I put it somewhere else and add it via "add jar".
>>>>> Any idea what might be wrong?
>>>>>
>>>>> /Ehsan
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> *Dean Wampler, Ph.D.*
>>>> thinkbiganalytics.com
>>>> +1-312-339-1330
>>>>
>>>
>>>
>>> --
>>> *Muhammad Ehsan ul Haque*
>>> Klarna AB
>>> Norra Stationsgatan 61
>>> SE-113 43 Stockholm
>>>
>>> Tel: +46 (0)8-120 120 00
>>> Fax: +46 (0)8-120 120 99
>>> Web: www.klarna.com
>>
>>
>> --
>> *Muhammad Ehsan ul Haque*
>> Klarna AB
>> Norra Stationsgatan 61
>> SE-113 43 Stockholm
>>
>> Tel: +46 (0)8-120 120 00
>> Fax: +46 (0)8-120 120 99
>> Web: www.klarna.com
>
>
> --
> *Dean Wampler, Ph.D.*
> thinkbiganalytics.com
> +1-312-339-1330
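
[Editor's note] The $HOME/.hiverc alternative Dean mentions might look like the fragment below. The jar paths and the setting are illustrative examples only, not taken from this thread; the "add jar" and "set" commands themselves are standard Hive CLI commands.

```
-- $HOME/.hiverc: read by the Hive CLI on startup (Hive v0.7.0 and later).
-- Add only the jars that are NOT already in hive/lib or hadoop/lib.
add jar /path/to/my-serde.jar;
add jar /path/to/joda-time.jar;

-- Custom session settings can live here too, e.g.:
set hive.exec.mode.local.auto=true;
```

This keeps hive/lib untouched, so the metastore's own jars (such as datanucleus-core) load exactly as shipped, while your SerDe is still available in every session.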
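
[Editor's note] Dean's advice above, stripping out of the SerDe jar everything that already ships in hadoop/lib or hive/lib, can be checked mechanically rather than by eyeballing "jar tf" listings. The sketch below is not from the thread; it is a rough Python helper (jars are ordinary zip archives, so the standard zipfile module can list their entries), and the directory paths in the example are hypothetical.

```python
"""Find entries a candidate jar shares with jars already on the classpath.

Jar files are plain zip archives, so zipfile can read them directly.
"""
import os
import zipfile


def jar_entries(path):
    """Return the set of file entries (directories excluded) in a jar."""
    with zipfile.ZipFile(path) as jar:
        return {name for name in jar.namelist() if not name.endswith("/")}


def find_overlaps(candidate_jar, lib_dirs):
    """Map each library jar to the entries it shares with candidate_jar.

    Jars with no overlap are omitted from the result.
    """
    candidate = jar_entries(candidate_jar)
    overlaps = {}
    for lib_dir in lib_dirs:
        for name in sorted(os.listdir(lib_dir)):
            if not name.endswith(".jar"):
                continue
            path = os.path.join(lib_dir, name)
            shared = candidate & jar_entries(path)
            if shared:
                overlaps[path] = shared
    return overlaps


# Example usage (paths are hypothetical -- adjust for your installation):
#   for jar, shared in find_overlaps(
#       "my-serde.jar", ["/usr/lib/hive/lib", "/usr/lib/hadoop/lib"]
#   ).items():
#       print("%s: %d shared entries" % (jar, len(shared)))
```

Any jar that shows shared entries (other than META-INF boilerplate) is a candidate for removal from the fat jar; rebuild with those dependencies marked as provided and ADD JAR only what remains unique.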