[ https://issues.apache.org/jira/browse/HIVE-1411?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12891392#action_12891392 ]
Paul Yang commented on HIVE-1411:
---------------------------------

About #4, I was able to reproduce the exception and verify the fix by creating a copy of the datanucleus core jar in hive's lib directory. Makes sense to me.

Now that you mention it, I think I ran into a similar problem while using the 'hadoop jar' command. It would unpack the jar, but include both the jar and the unpacked classes in the classpath. If the jar contained datanucleus stuff, this would throw an exception.

Anyway, the fix looks good. +1

> DataNucleus throws NucleusException if core-3.1.1 JAR appears more than once on CLASSPATH
> ------------------------------------------------------------------------------------------
>
>                 Key: HIVE-1411
>                 URL: https://issues.apache.org/jira/browse/HIVE-1411
>             Project: Hadoop Hive
>          Issue Type: Bug
>          Components: Metastore
>    Affects Versions: 0.4.0, 0.4.1, 0.5.0
>            Reporter: Carl Steinbach
>            Assignee: Carl Steinbach
>             Fix For: 0.6.0, 0.7.0
>
>         Attachments: HIVE-1411.patch.txt
>
>
> DataNucleus barfs when the core-3.1.1 JAR file appears more than once on the CLASSPATH:
> {code}
> 2010-03-06 12:33:25,565 ERROR exec.DDLTask (SessionState.java:printError(279)) - FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
> org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
> at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:258)
> at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:879)
> at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:103)
> at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:379)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:285)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:123)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:181)
> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:287)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
> at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1186)
> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
> at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:164)
> at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:181)
> at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:125)
> at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:104)
> at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
> at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:130)
> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:146)
> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:118)
> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:100)
> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:74)
> at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:783)
> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:794)
> at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:252)
> ... 12 more
> Caused by: java.lang.reflect.InvocationTargetException
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at javax.jdo.JDOHelper$16.run(JDOHelper.java:1956)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.jdo.JDOHelper.invoke(JDOHelper.java:1951)
> at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
> ... 28 more
> Caused by: org.datanucleus.exceptions.NucleusException: Plugin (Bundle) "org.eclipse.jdt.core" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/Users/hadop/hadoop-0.20.1+152/build/ivy/lib/Hadoop/common/core-3.1.1.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/Users/hadop/hadoop-0.20.1+152/lib/core-3.1.1.jar."
> at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:437)
> at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:343)
> at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensions(NonManagedPluginRegistry.java:227)
> at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensionPoints(NonManagedPluginRegistry.java:159)
> at org.datanucleus.plugin.PluginManager.registerExtensionPoints(PluginManager.java:82)
> at org.datanucleus.OMFContext.<init>(OMFContext.java:164)
> at org.datanucleus.OMFContext.<init>(OMFContext.java:145)
> at org.datanucleus.ObjectManagerFactoryImpl.initialiseOMFContext(ObjectManagerFactoryImpl.java:143)
> at org.datanucleus.jdo.JDOPersistenceManagerFactory.initialiseProperties(JDOPersistenceManagerFactory.java:317)
> at org.datanucleus.jdo.JDOPersistenceManagerFactory.<init>(JDOPersistenceManagerFactory.java:261)
> at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:174)
> ... 36 more
> 2010-03-06 12:33:25,575 ERROR ql.Driver (SessionState.java:printError(279)) - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
> 2010-03-06 12:42:30,457 ERROR exec.DDLTask (SessionState.java:printError(279)) - FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
> org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
> NestedThrowables:
> {code}
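For anyone else chasing this: the failure comes from DataNucleus's plugin registry walking the MANIFEST.MF files on the CLASSPATH and refusing to register the same bundle name from two different URLs. Below is a rough standalone sketch of that check (not DataNucleus's actual code; the class name DuplicateBundleCheck and the keying on Bundle-SymbolicName are assumptions about the mechanism). Run with the same CLASSPATH, it lists every bundle that resolves to more than one JAR; with the duplicated core-3.1.1.jar on the path it should print two URLs for org.eclipse.jdt.core.

{code}
import java.io.InputStream;
import java.net.URL;
import java.util.Enumeration;
import java.util.HashMap;
import java.util.Map;
import java.util.jar.Manifest;

// Standalone check: report every OSGi-style bundle name that is supplied by
// more than one entry on the CLASSPATH.
public class DuplicateBundleCheck {
    public static void main(String[] args) throws Exception {
        Map<String, URL> registered = new HashMap<String, URL>();
        Enumeration<URL> manifests = DuplicateBundleCheck.class.getClassLoader()
                .getResources("META-INF/MANIFEST.MF");
        while (manifests.hasMoreElements()) {
            URL url = manifests.nextElement();
            InputStream in = url.openStream();
            try {
                String name = new Manifest(in).getMainAttributes()
                        .getValue("Bundle-SymbolicName");
                if (name == null) {
                    continue;                        // plain JAR, not a bundle
                }
                name = name.split(";")[0].trim();    // drop ";singleton:=true" etc.
                URL first = registered.get(name);
                if (first == null) {
                    registered.put(name, url);
                } else if (!first.equals(url)) {
                    // Roughly the condition on which DataNucleus throws
                    // "Plugin (Bundle) ... is already registered".
                    System.out.println("Duplicate bundle " + name + ": "
                            + first + " and " + url);
                }
            } finally {
                in.close();
            }
        }
    }
}
{code}

Running it before and after removing the extra copy of the jar (or before and after a 'hadoop jar' invocation unpacks things) is a quick way to see exactly which entries collide.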