Thanks for the reply, Allen. I fixed it, though I am not sure of the
exact cause as I found more than one thing wrong. There were some
directories owned by root instead of hadoop that I think were the main
issue.

Allen Wittenauer wrote:
> On 1/7/10 8:34 AM, "Scott" <skes...@weather.com> wrote:
>> WARN hdfs.DFSClient: DataStreamer Exception:
>> org.apache.hadoop.ipc.RemoteException: java.io.IOException: File
>> /user/hadoop/ads3x11-1256301562.log.lzo could only be replicated to 0
>> nodes, instead of 1
>
> This almost always means your HDFS is in safemode and/or has no live
> datanodes.
>
>> I have checked quotas and found none. I have also tried other users,
>> including the hadoop user, and get the same result. Any ideas?
>
> How is the namenode heap? Are you out of physical space? What does
> hadoop fsck / say?
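
For anyone else who hits the same "could only be replicated to 0 nodes"
error, something along these lines covers the checks Allen suggested
and the ownership fix. The dfs.data.dir path below is only a
placeholder; substitute whatever your hdfs-site.xml actually points at.

  # Confirm the namenode is out of safemode and that datanodes are reporting in
  hadoop dfsadmin -safemode get
  hadoop dfsadmin -report

  # Check overall filesystem health, per Allen's suggestion
  hadoop fsck /

  # If the local datanode storage directories ended up owned by root,
  # hand them back to the hadoop user (path is an example only)
  sudo chown -R hadoop:hadoop /var/lib/hadoop/dfs/data

  # Same idea for directories inside HDFS that were created as root
  hadoop fs -chown -R hadoop:hadoop /user/hadoop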