Hello Neil

No matter how big the file is, it always reports this error. The file sizes range
from 10 KB to 100 MB.
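
In case it helps narrow this down: before reformatting anything, it is usually worth checking cluster health and the flagged datanode's logs first. A minimal diagnostic sketch, assuming a Hadoop 0.20-era `bin/hadoop` CLI on the client machine (the log path and the ssh target are illustrative, adjust to your install):

```shell
# Report live/dead datanodes and capacity (standard dfsadmin subcommand).
hadoop dfsadmin -report

# Block-level health check of the whole filesystem.
hadoop fsck / -blocks -locations

# Inspect recent log output on the datanode the client flagged as bad
# (192.168.0.23 in the trace above; log path is the common default).
ssh 192.168.0.23 'tail -n 100 /var/log/hadoop/hadoop-*-datanode-*.log'

# Last resort only, on a disposable cluster: formatting the NameNode
# ERASES all HDFS metadata.
# hadoop namenode -format
```

An EOFException in the ResponseProcessor regardless of file size commonly points to a datanode dying mid-pipeline or a network/firewall problem between nodes rather than anything size-related, so the datanode logs are usually the most informative place to look.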

On Sat, Sep 25, 2010 at 6:08 PM, Neil Ghosh <[email protected]> wrote:

> How big is the file? Did you try formatting the NameNode and DataNode?
>
> On Sun, Sep 26, 2010 at 2:12 AM, He Chen <[email protected]> wrote:
>
> > Hello everyone
> >
> > I cannot load a local file into HDFS. It gives the following errors.
> >
> > WARN hdfs.DFSClient: DFSOutputStream ResponseProcessor exception for block blk_-236192853234282209_419415
> > java.io.EOFException
> >         at java.io.DataInputStream.readFully(DataInputStream.java:197)
> >         at java.io.DataInputStream.readLong(DataInputStream.java:416)
> >         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$ResponseProcessor.run(DFSClient.java:2397)
> > 10/09/25 15:38:25 WARN hdfs.DFSClient: Error Recovery for block blk_-236192853234282209_419415 bad datanode[0] 192.168.0.23:50010
> > 10/09/25 15:38:25 WARN hdfs.DFSClient: Error Recovery for block blk_-236192853234282209_419415 in pipeline 192.168.0.23:50010, 192.168.0.39:50010: bad datanode 192.168.0.23:50010
> > Any response will be appreciated!
> >
> >
> > --
> > Best wishes!
> >
> > --
> > Chen He
> >
>
>
>
> --
> Thanks and Regards
> Neil
> http://neilghosh.com
>



-- 
Best wishes!

--
Chen He
(402)613-9298
PhD. student of CSE Dept.
Research Assistant of Holland Computing Center
University of Nebraska-Lincoln
Lincoln NE 68588
