Brian, thank you very much.

The Hadoop version is 0.19.0; I think the 4616 and 4635 patches are also necessary. I will try it.

-----Original Message-----
From: Brian Bockelman [mailto:[email protected]]
Sent: Monday, December 15, 2008 10:00 PM
To: [email protected]
Subject: Re: The error occurred when a lot of files created use fuse-dfs

Hey,

What version of Hadoop are you running? Have you taken a look at HADOOP-4775?

https://issues.apache.org/jira/browse/HADOOP-4775

Basically, fuse-dfs is not usable on Hadoop 0.19.0 without a patch.

Brian

On Dec 15, 2008, at 12:24 AM, zhuweimin wrote:

> Dear fuse-dfs users
>
> I copied 1000 files from the local disk into Hadoop using fuse-dfs.
> The following errors appeared once about 600 files had been copied:
>
> cp: cannot create regular file `/mnt/dfs/user/hadoop/fuse3/10m/10m_33.dat': Input/output error
> cp: cannot create regular file `/mnt/dfs/user/hadoop/fuse3/10m/10m_34.dat': Input/output error
> cp: cannot create regular file `/mnt/dfs/user/hadoop/fuse3/10m/10m_35.dat': Input/output error
> cp: cannot create regular file `/mnt/dfs/user/hadoop/fuse3/10m/10m_36.dat': Input/output error
> cp: cannot create regular file `/mnt/dfs/user/hadoop/fuse3/10m/10m_37.dat': Input/output error
> cp: cannot create regular file `/mnt/dfs/user/hadoop/fuse3/10m/10m_38.dat': Input/output error
> ...
>
> After that, it is necessary to remount fuse-dfs before copying can continue.
>
> Do you have any thoughts on this error?
>
> Thanks
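For reference, the bulk copy described above can be sketched as the loop below. This is only an illustration of the workload, not the poster's exact commands: the mount point and `10m_N.dat` names mirror the thread, but a local temporary directory stands in for `/mnt/dfs/user/hadoop/fuse3/10m` so the loop itself can run without an HDFS cluster, and the placeholder files are empty rather than 10 MB.

```shell
# Sketch of the bulk copy from the thread. Assumption: a plain temp
# directory stands in for the fuse-dfs mount /mnt/dfs/user/hadoop/fuse3/10m.
set -e
SRC=$(mktemp -d)
DEST=$(mktemp -d)   # stand-in for the fuse-dfs mount point

# Create 1000 small placeholder files (the real ones were ~10 MB each).
for i in $(seq 1 1000); do
  touch "$SRC/10m_$i.dat"
done

# On an unpatched 0.19.0 fuse-dfs mount, cp reportedly starts failing with
# "Input/output error" around the 600th file; on a local directory it succeeds.
cp "$SRC"/10m_*.dat "$DEST"/

echo "copied: $(ls "$DEST" | wc -l)"
```

Against a real fuse-dfs mount on unpatched 0.19.0, the symptom in the thread is that this loop fails partway through and the filesystem has to be unmounted and remounted before any further copies succeed.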
