Hi Ramkumar,

I believe ojdbc7.jar is compiled for JDK 7 and as a result won't work on JDK 6; the "Error: oracle/jdbc/OracleDriver : Unsupported major.minor version 51.0" messages in your map task logs are the typical symptom of Java 7 class files being loaded by a Java 6 runtime. Would you mind using ojdbc6.jar instead and re-running the import command with the --verbose parameter? It would be great if you could share the output with us.
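Something along these lines should do it (the Sqoop lib directory below is only the usual CDH location, adjust it to your installation; the connection details are copied from your mail):

    # swap the JDK7-only driver for the JDK6-compatible one
    # (remove ojdbc7.jar from the same directory so it is not picked up instead)
    cp ojdbc6.jar /usr/lib/sqoop/lib/

    # re-run the import with verbose logging
    sqoop import --verbose \
      --connect jdbc:oracle:thin:@10.226.226.55:1521:orcl \
      --username XXX -P \
      --table ONESTAGING \
      --target-dir /user/user/ONE42 \
      --split-by ST_KEY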
Jarcec

On Tue, Sep 24, 2013 at 12:32:07PM +0530, Ramkumar Subramanian wrote:
> Hi,
>
> I am using Sqoop 1.4.2 to import data from oracle database Release 12C to Hive.
>
> Hadoop Version: CDH4.2.1
> Java Version: 1.6
>
> I am getting the following error with the Oracle JDBC driver, I tried with both ojdbc6.jar and ojdbc7.jar of Release 12C.
>
> *Tried list-databases to check for the connectivity:*
> [xxx@aster4 lib]$ sqoop list-databases --connect jdbc:oracle:thin:@10.226.226.55:1521:orcl --username XXX -P
> Enter password:
> 13/09/23 20:34:17 INFO manager.SqlManager: Using default fetchSize of 1000
> 13/09/23 20:34:17 INFO manager.OracleManager: Time zone has been set to GMT
> 13/09/23 20:34:17 ERROR manager.OracleManager: Failed to rollback transaction
> java.sql.SQLException: Could not rollback with auto-commit set on
>     at oracle.jdbc.driver.PhysicalConnection.rollback(PhysicalConnection.java:4510)
>     at org.apache.sqoop.manager.OracleManager.listDatabases(OracleManager.java:615)
>     at org.apache.sqoop.tool.ListDatabasesTool.run(ListDatabasesTool.java:49)
>     at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
>     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
>     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
>     at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
> 13/09/23 20:34:17 ERROR manager.OracleManager: Failed to list databases
> java.sql.SQLException: Could not commit with auto-commit set on
>     at oracle.jdbc.driver.PhysicalConnection.commit(PhysicalConnection.java:4443)
>     at oracle.jdbc.driver.PhysicalConnection.commit(PhysicalConnection.java:4490)
>     at org.apache.sqoop.manager.OracleManager.listDatabases(OracleManager.java:612)
>     at org.apache.sqoop.tool.ListDatabasesTool.run(ListDatabasesTool.java:49)
>     at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
>     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
>     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
>     at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
>
> ============================================================
>
> sqoop import --connect jdbc:oracle:thin:@10.226.226.55:1521:orcl --username XXX -P --table ONESTAGING --target-dir /user/user/ONE42 --split-by ST_KEY
>     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
>     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
>     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
>     at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
> 13/09/23 20:40:55 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-0.20-mapreduce
> 13/09/23 20:40:55 INFO orm.CompilationManager: Found hadoop core jar at: /usr/lib/hadoop-0.20-mapreduce/hadoop-core.jar
> Note: /tmp/sqoop-cts378874/compile/cbdfc758f7e88cf11f63adf171787630/ONESTAGING.java uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 13/09/23 20:40:58 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cts378874/compile/cbdfc758f7e88cf11f63adf171787630/ONESTAGING.jar
> 13/09/23 20:40:58 INFO mapreduce.ImportJobBase: Beginning import of ONESTAGING
> 13/09/23 20:40:58 ERROR manager.OracleManager: Failed to rollback transaction
> java.lang.NullPointerException
>     at org.apache.sqoop.manager.OracleManager.getColumnNames(OracleManager.java:741)
>     at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureInputFormat(DataDrivenImportJob.java:165)
>     at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:203)
>     at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:465)
>     at org.apache.sqoop.manager.OracleManager.importTable(OracleManager.java:380)
>     at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
>     at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
>     at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
>     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
>     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
>     at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
> 13/09/23 20:40:58 ERROR manager.OracleManager: Failed to list columns
> java.sql.SQLException: Could not rollback with auto-commit set on
>     at oracle.jdbc.driver.PhysicalConnection.rollback(PhysicalConnection.java:4510)
>     at org.apache.sqoop.manager.OracleManager$ConnCache.getConnection(OracleManager.java:190)
>     at org.apache.sqoop.manager.OracleManager.makeConnection(OracleManager.java:283)
>     at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
>     at org.apache.sqoop.manager.OracleManager.getColumnNames(OracleManager.java:725)
>     at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureInputFormat(DataDrivenImportJob.java:165)
>     at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:203)
>     at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:465)
>     at org.apache.sqoop.manager.OracleManager.importTable(OracleManager.java:380)
>     at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
>     at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
>     at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
>     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
>     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
>     at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
> 13/09/23 20:40:59 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
> 13/09/23 20:41:00 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(ST_KEY), MAX(ST_KEY) FROM ONESTAGING
> 13/09/23 20:41:00 WARN db.TextSplitter: Generating splits for a textual index column.
> 13/09/23 20:41:00 WARN db.TextSplitter: If your database sorts in a case-insensitive order, this may result in a partial import or duplicate records.
> 13/09/23 20:41:00 WARN db.TextSplitter: You are strongly encouraged to choose an integral split column.
> 13/09/23 20:41:00 INFO mapred.JobClient: Running job: job_201307150904_1154
> 13/09/23 20:41:01 INFO mapred.JobClient: map 0% reduce 0%
> 13/09/23 20:41:09 INFO mapred.JobClient: Task Id : attempt_201307150904_1154_m_000003_0, Status : FAILED
> Error: oracle/jdbc/OracleDriver : Unsupported major.minor version 51.0
> 13/09/23 20:41:11 INFO mapred.JobClient: Task Id : attempt_201307150904_1154_m_000002_0, Status : FAILED
> Error: oracle/jdbc/OracleDriver : Unsupported major.minor version 51.0
> 13/09/23 20:41:13 INFO mapred.JobClient: Task Id : attempt_201307150904_1154_m_000004_0, Status : FAILED
> java.lang.NullPointerException
>     at org.apache.sqoop.mapreduce.db.DataDrivenDBRecordReader.getSelectQuery(DataDrivenDBRecordReader.java:92)
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:484)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:673)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:331)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformati
> attempt_201307150904_1154_m_000004_0: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201307150904_1154_m_000004_0: log4j:WARN Please initialize the log4j system properly.
> attempt_201307150904_1154_m_000004_0: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/09/23 20:41:13 INFO mapred.JobClient: Task Id : attempt_201307150904_1154_m_000005_0, Status : FAILED
> java.lang.NullPointerException
>     at org.apache.sqoop.mapreduce.db.DataDrivenDBRecordReader.getSelectQuery(DataDrivenDBRecordReader.java:92)
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:484)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:673)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:331)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformati
> attempt_201307150904_1154_m_000005_0: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201307150904_1154_m_000005_0: log4j:WARN Please initialize the log4j system properly.
> attempt_201307150904_1154_m_000005_0: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/09/23 20:41:13 INFO mapred.JobClient: Task Id : attempt_201307150904_1154_m_000000_0, Status : FAILED
> java.lang.NullPointerException
>     at org.apache.sqoop.mapreduce.db.DataDrivenDBRecordReader.getSelectQuery(DataDrivenDBRecordReader.java:92)
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:484)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:673)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:331)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformati
> attempt_201307150904_1154_m_000000_0: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201307150904_1154_m_000000_0: log4j:WARN Please initialize the log4j system properly.
> attempt_201307150904_1154_m_000000_0: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/09/23 20:41:13 INFO mapred.JobClient: Task Id : attempt_201307150904_1154_m_000001_0, Status : FAILED
> java.lang.NullPointerException
>     at org.apache.sqoop.mapreduce.db.DataDrivenDBRecordReader.getSelectQuery(DataDrivenDBRecordReader.java:92)
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:484)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:673)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:331)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformati
> attempt_201307150904_1154_m_000001_0: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201307150904_1154_m_000001_0: log4j:WARN Please initialize the log4j system properly.
> attempt_201307150904_1154_m_000001_0: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/09/23 20:41:16 INFO mapred.JobClient: Task Id : attempt_201307150904_1154_m_000004_1, Status : FAILED
> Error: oracle/jdbc/OracleDriver : Unsupported major.minor version 51.0
> 13/09/23 20:41:16 INFO mapred.JobClient: Task Id : attempt_201307150904_1154_m_000005_1, Status : FAILED
> Error: oracle/jdbc/OracleDriver : Unsupported major.minor version 51.0
> 13/09/23 20:41:18 INFO mapred.JobClient: Task Id : attempt_201307150904_1154_m_000002_1, Status : FAILED
> java.lang.NullPointerException
>     at org.apache.sqoop.mapreduce.db.DataDrivenDBRecordReader.getSelectQuery(DataDrivenDBRecordReader.java:92)
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:484)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:673)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:331)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformati
> attempt_201307150904_1154_m_000002_1: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201307150904_1154_m_000002_1: log4j:WARN Please initialize the log4j system properly.
> attempt_201307150904_1154_m_000002_1: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/09/23 20:41:18 INFO mapred.JobClient: Task Id : attempt_201307150904_1154_m_000003_1, Status : FAILED
> java.lang.NullPointerException
>     at org.apache.sqoop.mapreduce.db.DataDrivenDBRecordReader.getSelectQuery(DataDrivenDBRecordReader.java:92)
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:484)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:673)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:331)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformati
> attempt_201307150904_1154_m_000003_1: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201307150904_1154_m_000003_1: log4j:WARN Please initialize the log4j system properly.
> attempt_201307150904_1154_m_000003_1: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/09/23 20:41:19 INFO mapred.JobClient: Task Id : attempt_201307150904_1154_m_000000_1, Status : FAILED
> Error: oracle/jdbc/OracleDriver : Unsupported major.minor version 51.0
> 13/09/23 20:41:20 INFO mapred.JobClient: Task Id : attempt_201307150904_1154_m_000001_1, Status : FAILED
> Error: oracle/jdbc/OracleDriver : Unsupported major.minor version 51.0
> 13/09/23 20:41:22 INFO mapred.JobClient: Task Id : attempt_201307150904_1154_m_000005_2, Status : FAILED
> java.lang.NullPointerException
>     at org.apache.sqoop.mapreduce.db.DataDrivenDBRecordReader.getSelectQuery(DataDrivenDBRecordReader.java:92)
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:484)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:673)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:331)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformati
> attempt_201307150904_1154_m_000005_2: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201307150904_1154_m_000005_2: log4j:WARN Please initialize the log4j system properly.
> attempt_201307150904_1154_m_000005_2: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/09/23 20:41:22 INFO mapred.JobClient: Task Id : attempt_201307150904_1154_m_000004_2, Status : FAILED
> java.lang.NullPointerException
>     at org.apache.sqoop.mapreduce.db.DataDrivenDBRecordReader.getSelectQuery(DataDrivenDBRecordReader.java:92)
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:484)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:673)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:331)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformati
> attempt_201307150904_1154_m_000004_2: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201307150904_1154_m_000004_2: log4j:WARN Please initialize the log4j system properly.
> attempt_201307150904_1154_m_000004_2: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/09/23 20:41:24 INFO mapred.JobClient: Task Id : attempt_201307150904_1154_m_000000_2, Status : FAILED
> java.lang.NullPointerException
>     at org.apache.sqoop.mapreduce.db.DataDrivenDBRecordReader.getSelectQuery(DataDrivenDBRecordReader.java:92)
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:484)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:673)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:331)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformati
> attempt_201307150904_1154_m_000000_2: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201307150904_1154_m_000000_2: log4j:WARN Please initialize the log4j system properly.
> attempt_201307150904_1154_m_000000_2: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/09/23 20:41:25 INFO mapred.JobClient: Task Id : attempt_201307150904_1154_m_000001_2, Status : FAILED
> java.lang.NullPointerException
>     at org.apache.sqoop.mapreduce.db.DataDrivenDBRecordReader.getSelectQuery(DataDrivenDBRecordReader.java:92)
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:484)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:673)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:331)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformati
> attempt_201307150904_1154_m_000001_2: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201307150904_1154_m_000001_2: log4j:WARN Please initialize the log4j system properly.
> attempt_201307150904_1154_m_000001_2: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/09/23 20:41:29 INFO mapred.JobClient: Job complete: job_201307150904_1154
> 13/09/23 20:41:29 INFO mapred.JobClient: Counters: 6
> 13/09/23 20:41:29 INFO mapred.JobClient: Job Counters
> 13/09/23 20:41:29 INFO mapred.JobClient: Failed map tasks=1
> 13/09/23 20:41:29 INFO mapred.JobClient: Launched map tasks=20
> 13/09/23 20:41:29 INFO mapred.JobClient: Total time spent by all maps in occupied slots (ms)=87582
> 13/09/23 20:41:29 INFO mapred.JobClient: Total time spent by all reduces in occupied slots (ms)=0
> 13/09/23 20:41:29 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
> 13/09/23 20:41:29 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
> 13/09/23 20:41:30 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
> 13/09/23 20:41:30 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 31.523 seconds (0 bytes/sec)
> 13/09/23 20:41:30 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
> 13/09/23 20:41:30 INFO mapreduce.ImportJobBase: Retrieved 0 records.
> 13/09/23 20:41:30 ERROR tool.ImportTool: Error during import: Import job failed!
>
> Can someone please help me to resolve this issue.
>
> Thanks & Regards,
> Ramkumar Subramanian.