Hello,

I have a simple map-reduce job that reads in gzipped files and converts
them to LZO compression.  Some of the files are not properly gzipped,
which results in Hadoop throwing a "java.io.EOFException: Unexpected end
of input stream" error and causes the job to fail.  Is there a way to
catch this exception and tell Hadoop to just ignore the file and move on?
I think the exception is being thrown by the class reading in the gzip
file and not by my mapper class.  Is this correct?  Is there a way to
handle this type of error gracefully?
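
In case it helps, here is a rough sketch of the kind of workaround I was
imagining: wrapping LineRecordReader so that an EOFException is treated
as end of input for that one file instead of killing the task.  This
assumes the new mapreduce API and plain TextInputFormat; the class names
below are just my own, and I have not verified that the exception is
actually raised inside nextKeyValue() rather than while the codec stream
is being opened or closed.

import java.io.EOFException;
import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.input.LineRecordReader;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;

// Input format that treats a truncated gzip file as "end of input"
// for that split instead of failing the whole job.
public class SkipCorruptGzipTextInputFormat extends TextInputFormat {

    @Override
    public RecordReader<LongWritable, Text> createRecordReader(
            InputSplit split, TaskAttemptContext context) {
        return new SkipCorruptGzipRecordReader();
    }

    private static class SkipCorruptGzipRecordReader extends LineRecordReader {
        @Override
        public boolean nextKeyValue() throws IOException {
            try {
                return super.nextKeyValue();
            } catch (EOFException e) {
                // Corrupt/truncated gzip: log it and report no more records,
                // so the job moves on to the next file.
                System.err.println("Skipping corrupt gzip input: " + e);
                return false;
            }
        }
    }
}

The job would then use it via
job.setInputFormatClass(SkipCorruptGzipTextInputFormat.class).  I also
ran across mapred.max.map.failures.percent
(JobConf.setMaxMapTaskFailuresPercent), which lets a percentage of map
tasks fail without failing the whole job, but that seems like a blunter
instrument than skipping just the bad file.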

Thank you!

~Ed
