[Moving the conversation to sqoop-user@incubator.apache.org. Please
subscribe to this mailing list.]

Hi Abhi,

The problem you are seeing is likely due to an unhealthy task node. Can
you check whether you have too many log files under your task log
directory? If so, that would explain it. Please see [1] below for a
discussion of this issue.
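
A quick way to check is to count the per-attempt log directories on the
task node. A rough sketch, assuming the default log location under
$HADOOP_HOME/logs (adjust the path if you have set HADOOP_LOG_DIR):

  # Count entries under the TaskTracker's userlogs directory; a very
  # large number here points to the problem discussed in [1].
  ls $HADOOP_HOME/logs/userlogs | wc -l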

[1]
http://mail-archives.apache.org/mod_mbox/hadoop-common-user/200910.mbox/%3c4acf807a.4020...@gmail.com%3E
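
If the count is indeed very large, pruning old attempt logs usually
brings the node back to health. A minimal sketch under the same
assumptions; the 7-day cutoff is only illustrative, so check it against
your retention needs before deleting anything:

  # Remove per-attempt log directories older than 7 days.
  find $HADOOP_HOME/logs/userlogs -mindepth 1 -maxdepth 1 -mtime +7 -exec rm -rf {} +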

Thanks,
Arvind

On Mon, Oct 10, 2011 at 10:10 AM, Abhi <manu.i...@gmail.com> wrote:

> Forgot to mention that the Sqoop version is sqoop-1.2.0.
>
> ~Abhi
>
> On Oct 10, 10:08 am, Abhi <manu.i...@gmail.com> wrote:
> > Hi,
> >
> > I am executing an incremental append job into HDFS and am facing the
> > following error.
> >
> > Job details:
> >
> >  ./sqoop job --create TABLEABC -- import \
> >      --connect jdbc:oracle:thin:@//test.connection.com:1558/testURI \
> >      --username test_user --password test_pwd \
> >      --table TABLEABC --verbose \
> >      --incremental append --check-column pid --last-value 100000
> >
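> > (The saved job is then run with the job tool, e.g.:
> >
> >  ./sqoop job --exec TABLEABC
> >
> > as reflected by the JobTool.execJob frame in the stack trace below.)
> >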
> > When this job is executed, the following error comes up.
> >
> > 11/10/09 17:39:12 INFO tool.CodeGenTool: Beginning code generation
> > 11/10/09 17:39:13 INFO manager.OracleManager: Time zone has been set to GMT
> > 11/10/09 17:39:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM testABC t WHERE 1=0
> > 11/10/09 17:39:13 INFO orm.CompilationManager: HADOOP_HOME is /user/app/hadoop
> > Note: /tmp/sqoop-user/compile/4c5307f0cee557496b5ad26e61faaf72/testABC.java uses or overrides a deprecated API.
> > Note: Recompile with -Xlint:deprecation for details.
> > 11/10/09 17:39:14 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-user/compile/4c5307f0cee557496b5ad26e61faaf72/testABC.java to /user/app/./testABC.java
> > 11/10/09 17:39:14 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-user/compile/4c5307f0cee557496b5ad26e61faaf72/testABC.jar
> > 11/10/09 17:39:14 INFO manager.OracleManager: Time zone has been set to GMT
> > 11/10/09 17:39:14 INFO tool.ImportTool: Incremental import based on column skey
> > 11/10/09 17:39:14 INFO tool.ImportTool: Lower bound value: 100000
> > 11/10/09 17:39:14 INFO tool.ImportTool: Upper bound value: 222222
> > 11/10/09 17:39:15 INFO manager.OracleManager: Time zone has been set to GMT
> > 11/10/09 17:39:15 INFO mapreduce.ImportJobBase: Beginning import of testABC
> > 11/10/09 17:39:15 INFO manager.OracleManager: Time zone has been set to GMT
> > 11/10/09 18:08:03 INFO mapred.JobClient: Cleaning up the staging area hdfs://test.sample.com:9000/user/hadoop/tmp/mapred/staging/user/.staging/job_201108170009_0027
> > 11/10/09 18:08:03 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Filesystem closed
> >         at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:232)
> >         at org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:632)
> >         at org.apache.hadoop.hdfs.DistributedFileSystem.delete(DistributedFileSystem.java:234)
> >         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:870)
> >         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:793)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at javax.security.auth.Subject.doAs(Subject.java:396)
> >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1063)
> >         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:793)
> >         at org.apache.hadoop.mapreduce.Job.submit(Job.java:465)
> >         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:495)
> >         at com.cloudera.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:107)
> >         at com.cloudera.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:166)
> >         at com.cloudera.sqoop.manager.SqlManager.importTable(SqlManager.java:386)
> >         at com.cloudera.sqoop.manager.OracleManager.importTable(OracleManager.java:343)
> >         at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:350)
> >         at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:423)
> >         at com.cloudera.sqoop.tool.JobTool.execJob(JobTool.java:231)
> >         at com.cloudera.sqoop.tool.JobTool.run(JobTool.java:286)
> >         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
> >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> >         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
> >         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:218)
> >         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:228)
> >
> > Could you kindly suggest why the file system is being closed for this job?
> >
> > ~Abhi
>
> --
> NOTE: The mailing list sqoop-u...@cloudera.org is deprecated in favor of
> Apache Sqoop mailing list sqoop-user@incubator.apache.org. Please
> subscribe to it by sending an email to
> incubator-sqoop-user-subscr...@apache.org.
>
