Samuel,
What version of Oozie are you using?
[The Hive action is not part of Apache Oozie yet; if you are using CDH's Oozie and have a Hive issue, you should move this thread to the [email protected] alias.]

Assuming you are using CDH Oozie: the Oozie sharelib bundled with Oozie (you have to install it as an extra step) contains all the JARs necessary to run the Hive action. You only have to add the JDBC driver JAR for your database to your workflow's lib/ directory and make sure the Hive action is configured to use it. A rough sketch of what I mean is appended below the quoted thread.

Thanks.
Alejandro

On Thu, Oct 27, 2011 at 2:51 PM, Samuel Dehouck <[email protected]> wrote:
> I do see it. Actually I have resolved this issue by adding other jars to my > classpath (the 4 jars in this patch > https://issues.apache.org/jira/browse/HIVE-1373) > > I now have a new error: > > 2011-10-27 14:34:21,529 ERROR DataNucleus.Datastore.Schema: Failed > initialising database. > Communications link failure > > The last packet sent successfully to the server was 0 milliseconds > ago. The driver has not received any packets from the server. > org.datanucleus.exceptions.NucleusDataStoreException: Communications > link failure > > The last packet sent successfully to the server was 0 milliseconds > ago. The driver has not received any packets from the server. > at > org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:536) > at > org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:290) > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native > Method) > at > sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39) > at > sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27) > at java.lang.reflect.Constructor.newInstance(Constructor.java:513) > at > org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:588) > at > org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300) > at > org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161) > at > org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583) > at > org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286) > at > org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) > at java.lang.reflect.Method.invoke(Method.java:597) > at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958) > at java.security.AccessController.doPrivileged(Native Method) > at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953) > at > javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159) > at > javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803) > at > javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698) > at > org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:234) > at > org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:261) > at > org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:196) > at > org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:171) > at >
org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62) > at > org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117) > at > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:354) > at > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:306) > at > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:451) > at > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:232) > at > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:197) > at > org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:108) > at > org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:1868) > at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:1878) > at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:830) > at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:772) > at > org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:782) > at > org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:6599) > at > org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:238) > at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:340) > at org.apache.hadoop.hive.ql.Driver.run(Driver.java:736) > at > org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:209) > at > org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:286) > at > org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:310) > at > org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:317) > at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:490) > at > org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:301) > at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:278) > at > org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:26) > at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:50) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) > at java.lang.reflect.Method.invoke(Method.java:597) > at > org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:391) > at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50) > at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:391) > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325) > at org.apache.hadoop.mapred.Child$4.run(Child.java:270) > at java.security.AccessController.doPrivileged(Native Method) > at javax.security.auth.Subject.doAs(Subject.java:396) > at > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127) > at org.apache.hadoop.mapred.Child.main(Child.java:264) > Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: > Communications link failure > > The last packet sent successfully to the server was 0 milliseconds > ago. The driver has not received any packets from the server. 
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native > Method) > at > sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39) > at > sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27) > at java.lang.reflect.Constructor.newInstance(Constructor.java:513) > at com.mysql.jdbc.Util.handleNewInstance(Util.java:411) > at > com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:1116) > at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:344) > at > com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2333) > at > com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2370) > at > com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2154) > at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:792) > at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47) > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native > Method) > at > sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39) > at > sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27) > at java.lang.reflect.Constructor.newInstance(Constructor.java:513) > at com.mysql.jdbc.Util.handleNewInstance(Util.java:411) > at > com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:381) > at > com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:305) > at java.sql.DriverManager.getConnection(DriverManager.java:582) > at java.sql.DriverManager.getConnection(DriverManager.java:185) > at > org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75) > at > org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582) > at > org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148) > at > org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106) > at > org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521) > ... 64 more > Caused by: java.net.ConnectException: Connection refused > at java.net.PlainSocketImpl.socketConnect(Native Method) > at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:333) > at > java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:195) > at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:182) > at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366) > at java.net.Socket.connect(Socket.java:519) > at java.net.Socket.connect(Socket.java:469) > at java.net.Socket.<init>(Socket.java:366) > at java.net.Socket.<init>(Socket.java:209) > at > com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:257) > at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:294) > ... 83 more > Nested Throwables StackTrace: > com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: > Communications link failure > > The last packet sent successfully to the server was 0 milliseconds > ago. The driver has not received any packets from the server. 
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native > Method) > at > sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39) > at > sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27) > at java.lang.reflect.Constructor.newInstance(Constructor.java:513) > at com.mysql.jdbc.Util.handleNewInstance(Util.java:411) > at > com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:1116) > at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:344) > at > com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2333) > at > com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2370) > at > com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2154) > at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:792) > at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47) > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native > Method) > at > sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39) > at > sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27) > at java.lang.reflect.Constructor.newInstance(Constructor.java:513) > at com.mysql.jdbc.Util.handleNewInstance(Util.java:411) > at > com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:381) > at > com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:305) > at java.sql.DriverManager.getConnection(DriverManager.java:582) > at java.sql.DriverManager.getConnection(DriverManager.java:185) > at > org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75) > at > org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582) > at > org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148) > at > org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106) > at > org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521) > at > org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:290) > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native > Method) > at > sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39) > at > sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27) > at java.lang.reflect.Constructor.newInstance(Constructor.java:513) > at > org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:588) > at > org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300) > at > org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161) > at > org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583) > at > org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286) > at > org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) > at 
java.lang.reflect.Method.invoke(Method.java:597) > at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958) > at java.security.AccessController.doPrivileged(Native Method) > at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953) > at > javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159) > at > javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803) > at > javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698) > at > org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:234) > at > org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:261) > at > org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:196) > at > org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:171) > at > org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62) > at > org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117) > at > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:354) > at > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:306) > at > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:451) > at > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:232) > at > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:197) > at > org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:108) > at > org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:1868) > at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:1878) > at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:830) > at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:772) > at > org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:782) > at > org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:6599) > at > org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:238) > at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:340) > at org.apache.hadoop.hive.ql.Driver.run(Driver.java:736) > at > org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:209) > at > org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:286) > at > org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:310) > at > org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:317) > at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:490) > at > org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:301) > at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:278) > at > org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:26) > at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:50) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) > at java.lang.reflect.Method.invoke(Method.java:597) > at > org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:391) > at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50) > at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:391) > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325) > at 
org.apache.hadoop.mapred.Child$4.run(Child.java:270) > at java.security.AccessController.doPrivileged(Native Method) > at javax.security.auth.Subject.doAs(Subject.java:396) > at > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127) > at org.apache.hadoop.mapred.Child.main(Child.java:264) > Caused by: java.net.ConnectException: Connection refused > at java.net.PlainSocketImpl.socketConnect(Native Method) > at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:333) > at > java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:195) > at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:182) > at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366) > at java.net.Socket.connect(Socket.java:519) > at java.net.Socket.connect(Socket.java:469) > at java.net.Socket.<init>(Socket.java:366) > at java.net.Socket.<init>(Socket.java:209) > at > com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:257) > at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:294) > > > > On Thu, Oct 27, 2011 at 2:43 PM, Mohammad Islam <[email protected]> > wrote: > > > Do you see the "commons-dbcp-1.4.jar" in the class path property shown in > > the log of Launcher Mapper? > > > > Regards, > > Mohammad > > > > > > ________________________________ > > From: Samuel Dehouck <[email protected]> > > To: [email protected] > > Sent: Thursday, October 27, 2011 10:39 AM > > Subject: Issue running hive action > > > > Hey guys, > > > > I'm trying to run a workflow job but all hive actions fail due to a dbpc > > plugin not being added to the classpath. > > > > Here is the error: > > > > 2011-10-27 10:31:21,383 ERROR > > org.apache.hadoop.hive.ql.parse.SemanticAnalyzer: > > org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch > > table <table_name > > at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:838) > > at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:772) > > at > > > org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:782) > > at > > > org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:6599) > > at > > > org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:238) > > at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:340) > > at org.apache.hadoop.hive.ql.Driver.run(Driver.java:736) > > at > org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:209) > > at > org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:286) > > at > > org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:310) > > at > org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:317) > > at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:490) > > at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:301) > > at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:278) > > at > > org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:26) > > at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:50) > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > > at > > > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) > > at > > > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) > > at java.lang.reflect.Method.invoke(Method.java:597) > > at > > > org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:391) > > at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50) > > at 
org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:391) > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325) > > at org.apache.hadoop.mapred.Child$4.run(Child.java:270) > > at java.security.AccessController.doPrivileged(Native Method) > > at javax.security.auth.Subject.doAs(Subject.java:396) > > at > > > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127) > > at org.apache.hadoop.mapred.Child.main(Child.java:264) > > Caused by: javax.jdo.JDOFatalInternalException: Error creating > > transactional connection factory > > NestedThrowables: > > java.lang.reflect.InvocationTargetException > > at > > > org.datanucleus.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:425) > > at > > > org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:601) > > at > > > org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286) > > at > > > org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182) > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > > at > > > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) > > at > > > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) > > at java.lang.reflect.Method.invoke(Method.java:597) > > at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958) > > at java.security.AccessController.doPrivileged(Native Method) > > at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953) > > at > > > javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159) > > at > javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803) > > at > javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698) > > at > > org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:234) > > at > > > org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:261) > > at > > > org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:196) > > at > > > org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:171) > > at > > org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62) > > at > > > org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117) > > at > > > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:354) > > at > > > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:306) > > at > > > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:451) > > at > > > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:232) > > at > > > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:197) > > at > > > org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:108) > > at > > > org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:1868) > > at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:1878) > > at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:830) > > ... 
28 more > > Caused by: java.lang.reflect.InvocationTargetException > > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native > > Method) > > at > > > sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39) > > at > > > sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27) > > at java.lang.reflect.Constructor.newInstance(Constructor.java:513) > > at > > > org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:588) > > at > > > org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:324) > > at > > > org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:215) > > at > > > org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:190) > > at > > > org.datanucleus.store.mapped.MappedStoreManager.<init>(MappedStoreManager.java:137) > > at > > > org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:253) > > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native > > Method) > > at > > > sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39) > > at > > > sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27) > > at java.lang.reflect.Constructor.newInstance(Constructor.java:513) > > at > > > org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:588) > > at > > > org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300) > > at > > > org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161) > > at > > > org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583) > > ... 55 more > > Caused by: org.datanucleus.exceptions.NucleusException: Attempt to > > invoke the "DBCP" plugin to create a ConnectionPool gave an error : > > The connection pool plugin of type "DBCP" was not found in the > > CLASSPATH! > > at > > > org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(ConnectionFactoryImpl.java:165) > > at > > > org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:84) > > ... 73 more > > Caused by: org.datanucleus.exceptions.NucleusUserException: The > > connection pool plugin of type "DBCP" was not found in the CLASSPATH! > > at > > > org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(ConnectionFactoryImpl.java:139) > > ... 
74 more > > > > 2011-10-27 10:31:21,395 ERROR org.apache.hadoop.hive.ql.Driver: > > FAILED: Error in semantic analysis: Unable to fetch table <table_name> > > org.apache.hadoop.hive.ql.parse.SemanticException: Unable to fetch table > > log_api > > at > > > org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:922) > > at > > > org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:6599) > > at > > > org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:238) > > at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:340) > > at org.apache.hadoop.hive.ql.Driver.run(Driver.java:736) > > at > org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:209) > > at > org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:286) > > at > > org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:310) > > at > org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:317) > > at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:490) > > at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:301) > > at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:278) > > at > > org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:26) > > at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:50) > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > > at > > > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) > > at > > > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) > > at java.lang.reflect.Method.invoke(Method.java:597) > > at > > > org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:391) > > at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50) > > at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:391) > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325) > > at org.apache.hadoop.mapred.Child$4.run(Child.java:270) > > at java.security.AccessController.doPrivileged(Native Method) > > at javax.security.auth.Subject.doAs(Subject.java:396) > > at > > > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127) > > at org.apache.hadoop.mapred.Child.main(Child.java:264) > > Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to > > fetch table log_api > > at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:838) > > at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:772) > > at > > > org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:782) > > ... 
26 more > > Caused by: javax.jdo.JDOFatalInternalException: Error creating > > transactional connection factory > > NestedThrowables: > > java.lang.reflect.InvocationTargetException > > at > > > org.datanucleus.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:425) > > at > > > org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:601) > > at > > > org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286) > > at > > > org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182) > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > > at > > > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) > > at > > > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) > > at java.lang.reflect.Method.invoke(Method.java:597) > > at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958) > > at java.security.AccessController.doPrivileged(Native Method) > > at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953) > > at > > > javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159) > > at > javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803) > > at > javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698) > > at > > org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:234) > > at > > > org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:261) > > at > > > org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:196) > > at > > > org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:171) > > at > > org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62) > > at > > > org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117) > > at > > > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:354) > > at > > > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:306) > > at > > > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:451) > > at > > > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:232) > > at > > > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:197) > > at > > > org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:108) > > at > > > org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:1868) > > at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:1878) > > at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:830) > > ... 
28 more > > Caused by: java.lang.reflect.InvocationTargetException > > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native > > Method) > > at > > > sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39) > > at > > > sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27) > > at java.lang.reflect.Constructor.newInstance(Constructor.java:513) > > at > > > org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:588) > > at > > > org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:324) > > at > > > org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:215) > > at > > > org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:190) > > at > > > org.datanucleus.store.mapped.MappedStoreManager.<init>(MappedStoreManager.java:137) > > at > > > org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:253) > > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native > > Method) > > at > > > sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39) > > at > > > sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27) > > at java.lang.reflect.Constructor.newInstance(Constructor.java:513) > > at > > > org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:588) > > at > > > org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300) > > at > > > org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161) > > at > > > org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583) > > ... 55 more > > Caused by: org.datanucleus.exceptions.NucleusException: Attempt to > > invoke the "DBCP" plugin to create a ConnectionPool gave an error : > > The connection pool plugin of type "DBCP" was not found in the > > CLASSPATH! > > at > > > org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(ConnectionFactoryImpl.java:165) > > at > > > org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:84) > > ... 73 more > > Caused by: org.datanucleus.exceptions.NucleusUserException: The > > connection pool plugin of type "DBCP" was not found in the CLASSPATH! > > at > > > org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(ConnectionFactoryImpl.java:139) > > > > > > > > I tried to add commons-dbcp-1.4.jar to my share/lib directory in hdfs > > and/or > > to the lib directory of my workflow but the error still occurs. > > > > Any idea what I'm missing here? > > > > Thanks > > > > Samuel > > >
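
[Sketch referenced above -- assuming CDH Oozie with the sharelib installed in HDFS and a MySQL-backed metastore. Every name below (app directory, action name, script, connection settings, driver JAR version, schema version) is a placeholder to adapt to your setup, not something taken from your workflow:]

  myapp/                      (HDFS workflow application directory)
    workflow.xml
    myscript.q
    hive-site.xml             (points javax.jdo.option.ConnectionURL and
                               javax.jdo.option.ConnectionDriverName at your
                               MySQL metastore database)
    lib/
      mysql-connector-java-5.1.x.jar   (the JDBC driver JAR; it is not part of
                                        the sharelib, the workflow has to ship it)

  <!-- Hive action in workflow.xml (sketch; adjust the schema version to your Oozie) -->
  <action name="hive-node">
      <hive xmlns="uri:oozie:hive-action:0.2">
          <job-tracker>${jobTracker}</job-tracker>
          <name-node>${nameNode}</name-node>
          <!-- hive-site.xml shipped with the workflow app; its connection URL
               must point at a MySQL host reachable from the task nodes -->
          <job-xml>hive-site.xml</job-xml>
          <script>myscript.q</script>
      </hive>
      <ok to="end"/>
      <error to="fail"/>
  </action>

With the driver JAR in the workflow's lib/ directory, Oozie puts it on the launcher job's classpath automatically; the DataNucleus/DBCP JARs that were reported missing earlier in this thread should come from the sharelib.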
