Right. But as far as I have seen, we never call FileSystem.close() ourselves.
Here is a corresponding stack trace: the exception seems to happen when the
task child VM closes the sequence file, which is outside the control of the
Mapper.map() function.

java.io.IOException: Filesystem closed
        at org.apache.hadoop.dfs.DFSClient.checkOpen(DFSClient.java:166)
        at org.apache.hadoop.dfs.DFSClient.access$500(DFSClient.java:58)
        at org.apache.hadoop.dfs.DFSClient$DFSInputStream.close(DFSClient.java:1103)
        at java.io.FilterInputStream.close(FilterInputStream.java:155)
        at org.apache.hadoop.io.SequenceFile$Reader.close(SequenceFile.java:1541)
        at org.apache.hadoop.mapred.SequenceFileRecordReader.close(SequenceFileRecordReader.java:125)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.close(MapTask.java:155)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:212)
        at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2084)
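
For what it's worth, the only pattern I can think of that would trigger this is
something like the sketch below (entirely hypothetical, none of our classes do
this as far as I can tell): a mapper that grabs the shared FileSystem via
FileSystem.get() and then closes it in its close() method, which also closes
the DFS client that the framework's SequenceFileRecordReader is still using.

import java.io.IOException;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

// Hypothetical mapper, only to illustrate the failure mode.
public class SideDataMapper extends MapReduceBase
    implements Mapper<LongWritable, Text, Text, Text> {

  private FileSystem fs;

  public void configure(JobConf conf) {
    try {
      // FileSystem.get() returns the shared, cached DFS client of this JVM.
      fs = FileSystem.get(conf);
    } catch (IOException e) {
      throw new RuntimeException(e);
    }
  }

  public void map(LongWritable key, Text value,
                  OutputCollector<Text, Text> output, Reporter reporter)
      throws IOException {
    // ... read some side data through fs, then emit ...
    output.collect(new Text("dummy"), value);
  }

  public void close() throws IOException {
    // BAD: this also closes the FileSystem behind the framework's
    // SequenceFileRecordReader, so its close() later throws
    // "java.io.IOException: Filesystem closed".  Dropping this call
    // (and letting Hadoop close the FileSystem itself) avoids the error.
    fs.close();
  }
}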

I'll look into the logs...
Thanks,
Christophe

On Tue, Jun 17, 2008 at 11:14 PM, <[EMAIL PROTECTED]> wrote:

> Hi Christophe,
>
> This exception happens when you access the FileSystem after calling
> FileSystem.close().  Judging from the error message below, a FileSystem input
> stream was accessed after FileSystem.close() had been called.  I guess the
> FileSystem was closed manually (and too early).  In most cases you don't have
> to call FileSystem.close() yourself, since it will be closed automatically.
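>
> As a minimal illustration of that (my sketch, not your code; the class and
> path names are made up): open streams through the shared FileSystem and close
> the streams, but never the FileSystem itself.
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FSDataInputStream;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
>
> public class SideFileReader {
>   /** Counts the bytes of a DFS file without closing the FileSystem. */
>   public static long countBytes(Configuration conf, Path path)
>       throws java.io.IOException {
>     FileSystem fs = FileSystem.get(conf);  // shared client, do not close
>     FSDataInputStream in = fs.open(path);
>     try {
>       long n = 0;
>       while (in.read() != -1) {
>         n++;
>       }
>       return n;
>     } finally {
>       in.close();  // close the stream only; fs is closed automatically
>     }
>   }
> }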
>
> Nicholas
>
>
> ----- Original Message ----
> > From: Christophe Taton <[EMAIL PROTECTED]>
> > To: [EMAIL PROTECTED]
> > Sent: Tuesday, June 17, 2008 4:18:45 AM
> > Subject: Task failing, cause FileSystem close?
> >
> > Hi all,
> >
> > I am experiencing (through my students) the following error on a 28-node
> > cluster running Hadoop 0.16.4.
> > Some jobs fail with many map tasks aborting with this error message:
> >
> > 2008-06-17 12:25:01,512 WARN org.apache.hadoop.mapred.TaskTracker: Error running child
> > java.io.IOException: Filesystem closed
> >     at org.apache.hadoop.dfs.DFSClient.checkOpen(DFSClient.java:166)
> >     at org.apache.hadoop.dfs.DFSClient.access$500(DFSClient.java:58)
> >     at org.apache.hadoop.dfs.DFSClient$DFSInputStream.close(DFSClient.java:1103)
> >     at java.io.FilterInputStream.close(FilterInputStream.java:155)
> >     at org.apache.hadoop.io.SequenceFile$Reader.close(SequenceFile.java:1541)
> >     at org.apache.hadoop.mapred.SequenceFileRecordReader.close(SequenceFileRecordReader.java:125)
> >     at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.close(MapTask.java:155)
> >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:212)
> >     at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2084)
> >
> > Any clue why this would happen?
> >
> > Thanks in advance,
> > Christophe
>
>
