It does not appear that any datanodes have connected to your namenode.
On the datanode machines, look in the Hadoop logs directory at the datanode
log files.
There should be information there that helps you diagnose the problem.
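If it helps, here is a minimal sketch of that check. The log path, the grep
patterns, and the sample log line are all illustrative assumptions based on the
default Hadoop 0.18 layout, not exact output from your cluster:

```shell
# Count log lines that suggest the datanode cannot reach the namenode.
# (The pattern list is an illustrative guess, not exhaustive.)
scan_datanode_log() {
  grep -icE "retrying connect|connection refused|incompatible" "$1"
}

# Demo against an invented one-line log excerpt:
cat > /tmp/datanode.log <<'EOF'
2009-05-21 10:50:01 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: namenode/10.0.0.1:9000
EOF
scan_datanode_log /tmp/datanode.log   # prints 1

# On a real datanode you would point it at something like:
#   $HADOOP_HOME/logs/hadoop-<user>-datanode-<host>.log
# and on the namenode, confirm which datanodes are live with:
#   bin/hadoop dfsadmin -report
```

A non-zero count is only a hint; the actual exception in the log usually names
the cause (misconfigured fs.default.name, a firewall, a namespace ID mismatch
after reformatting the namenode, and so on).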

Chapter 4 of my book provides some detail on working through this problem.

On Thu, May 21, 2009 at 4:29 AM, ashish pareek <pareek...@gmail.com> wrote:

> Hi,
>
>    I have two suggestions:
>
> i) Choose the right version (Hadoop 0.18 is good).
> ii) Replication should be 3, as you are running 3 nodes. (Indirectly, see to
> it that your configuration is correct!)
>
> I am just suggesting this, as I am also new to Hadoop.
>
> Ashish Pareek
>
>
> On Thu, May 21, 2009 at 2:41 PM, Stas Oskin <stas.os...@gmail.com> wrote:
>
> > Hi.
> >
> > I'm testing Hadoop in our lab, and started getting the following message
> > when trying to copy a file:
> > Could only be replicated to 0 nodes, instead of 1
> >
> > I have the following setup:
> >
> > * 3 machines, 2 of them with only 80GB of space, and 1 with 1.5GB
> > * Two clients are copying files all the time (one of them is the 1.5GB
> > machine)
> > * Replication is set to 2
> > * I let the space on the 2 smaller machines run out, to test the behavior
> >
> > Now, one of the clients (the one located on the 1.5GB machine) works fine,
> > but the other one, the external client, is unable to copy and displays the
> > error and the exception below.
> >
> > Any idea if this is expected in my scenario? Or how it can be solved?
> >
> > Thanks in advance.
> >
> >
> >
> > 09/05/21 10:51:03 WARN dfs.DFSClient: NotReplicatedYetException sleeping
> > /test/test.bin retries left 1
> >
> > 09/05/21 10:51:06 WARN dfs.DFSClient: DataStreamer Exception:
> > org.apache.hadoop.ipc.RemoteException: java.io.IOException: File
> > /test/test.bin could only be replicated to 0 nodes, instead of 1
> >         at org.apache.hadoop.dfs.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1123)
> >         at org.apache.hadoop.dfs.NameNode.addBlock(NameNode.java:330)
> >         at sun.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >         at java.lang.reflect.Method.invoke(Method.java:597)
> >         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:481)
> >         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:890)
> >
> >         at org.apache.hadoop.ipc.Client.call(Client.java:716)
> >         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:216)
> >         at org.apache.hadoop.dfs.$Proxy0.addBlock(Unknown Source)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >         at java.lang.reflect.Method.invoke(Method.java:597)
> >         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
> >         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
> >         at org.apache.hadoop.dfs.$Proxy0.addBlock(Unknown Source)
> >         at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:2450)
> >         at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2333)
> >         at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.access$1800(DFSClient.java:1745)
> >         at org.apache.hadoop.dfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:1922)
> >
> > 09/05/21 10:51:06 WARN dfs.DFSClient: Error Recovery for block null bad
> > datanode[0]
> > java.io.IOException: Could not get block locations. Aborting...
> >         at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.processDatanodeError(DFSClient.java:2153)
> >         at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.access$1400(DFSClient.java:1745)
> >         at org.apache.hadoop.dfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:1899)
>
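One more note on the replication point above: the namenode will not pick a
datanode with no usable space, so once the two 80GB machines filled up there
may simply be no eligible targets left for the replicas, which would produce
exactly this "replicated to 0 nodes" error. If the goal is just to keep writes
working in this degraded test, lowering the replication factor is one option; a
minimal hadoop-site.xml fragment (the 0.18-era config file; the value here is
illustrative):

```xml
<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>
```

This only affects files written after the change; existing files keep the
replication factor they were created with (bin/hadoop dfs -setrep can change
those).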



-- 
Alpha Chapters of my book on Hadoop are available
http://www.apress.com/book/view/9781430219422
www.prohadoopbook.com a community for Hadoop Professionals
