Hi,

 

I got the following exceptions when using HDFS to write the logs coming
from Scribe:

1. java.io.IOException: Filesystem closed

     <stack trace>
     ........
     call to org.apache.hadoop.fs.FSDataOutputStream::write failed!
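
(For what it's worth, here is a minimal Java sketch of a pattern I suspect
could cause the first error; the class and path names are just made up for
illustration. As I understand it, FileSystem.get() returns a cached
instance shared by every caller with the same scheme/authority/user, so if
any one writer closes it, every other writer starts failing with
"Filesystem closed".)

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class SharedFsClose {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // FileSystem.get() returns a cached instance shared per
            // (scheme, authority, user), not a fresh connection.
            FileSystem fs1 = FileSystem.get(conf);
            FileSystem fs2 = FileSystem.get(conf); // same object as fs1

            fs1.close(); // closes the shared instance for everyone

            // Any later use of fs2 (create or write) now throws
            // java.io.IOException: Filesystem closed
            FSDataOutputStream out = fs2.create(new Path("/tmp/example"));
            out.writeBytes("hello\n");
            out.close();
        }
    }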

            

2. org.apache.hadoop.ipc.RemoteException:
   org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException: failed to
   create file xxx-2010-04-01-12-40_00000 for DFSClient_1355960219 on
   client 10.18.22.55 because current leaseholder is trying to recreate
   file

     <stack trace>
     ........
     call to org.apache.hadoop.conf.FileSystem::append((Lorg/apache/hadoop/fs/Path;)Lorg/apache/hadoop/fs/FSDataOutputStream;) failed!
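
(My understanding of the second one, please correct me if I'm wrong: the
NameNode rejects an append while the same DFSClient still holds the lease
on the file, e.g. if the previous output stream was never closed before
reopening. A minimal Java sketch of that pattern, with a made-up path and
the 0.20-era dfs.support.append flag:)

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class AppendLease {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.setBoolean("dfs.support.append", true); // append must be enabled
            FileSystem fs = FileSystem.get(conf);
            Path log = new Path("/scribe/example-log"); // hypothetical; must already exist

            FSDataOutputStream out = fs.append(log); // client now holds the lease
            out.writeBytes("first batch\n");

            // Reopening the same file for append before out.close() has
            // released the lease makes the NameNode throw
            // AlreadyBeingCreatedException ("current leaseholder is
            // trying to recreate file"):
            // FSDataOutputStream bad = fs.append(log); // would fail here

            out.close(); // release the lease first
            FSDataOutputStream out2 = fs.append(log); // now succeeds
            out2.writeBytes("second batch\n");
            out2.close();
        }
    }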

 

I haven't applied the HDFS-265 patch to my Hadoop build yet.

 

Are these exceptions due to bugs in the existing append feature, or to
some other cause?

Do I need to apply the complete append patch, or will a simpler fix solve
this?

            

 Thanks,

  Gokul

 

  
