That's not an error - that just means that the daemon thread is waiting for
a connection (IO event)
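If you want to double-check that the hanging dfsadmin process is simply idle rather
than wedged, a full thread dump should show those same frames. A minimal sketch,
assuming a Sun JDK with jstack on the PATH and <pid> being the PID of the hanging
command (the placeholder is yours to fill in):

jstack <pid>

(kill -QUIT <pid> prints the same dump to the process's stdout.)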

The logs in $HADOOP_HOME/logs/ are entirely empty? Both the .log and .out
files? I find that hard to believe :)
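
For reference, something like this should show whether the daemons logged anything at
startup (a minimal sketch, assuming the default log directory and file naming; adjust
the paths if you changed HADOOP_LOG_DIR):

ls -l $HADOOP_HOME/logs/
tail -n 100 $HADOOP_HOME/logs/hadoop-*-namenode-*.log
tail -n 100 $HADOOP_HOME/logs/hadoop-*-datanode-*.log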

-Todd

On Mon, Sep 14, 2009 at 7:57 AM, Vincenzo Gulisano <
[email protected]> wrote:

> Hi Todd,
> thanks for your answer. I've already tried that. No error is reported:
> since the program just sits in a "wait state", nothing gets logged.
> I've seen that the frame
> "sun.nio.ch.EPollArrayWrapper.epollWait (native method)"
> shows up in some older Hadoop bug reports, but none of those fixes solved my problem.
> Thanks again
>
>
>
>
> 2009/9/14 Todd Lipcon <[email protected]>
>
> > Hi Vincenzo,
> >
> > Look at the log output of your daemons. My guess is that you'll find
> > something pretty clear there.
> >
> > -Todd
> >
> > On Mon, Sep 14, 2009 at 7:46 AM, Vincenzo Gulisano <
> > [email protected]> wrote:
> >
> > > Hi,
> > > after a lot of unsuccessful attempts at running the Hadoop distributed file
> > > system on my machine, I've located one possible error.
> > > Maybe you have some idea of what's going on.
> > >
> > > Experiment:
> > > What I'm doing is simply executing start-all.sh and hadoop dfsadmin -report
> > >
> > > After the setup I can check that everything is working using:
> > >
> > > jps
> > > ...
> > > 17421 NameNode
> > > 17519 DataNode
> > > 17611 SecondaryNameNode
> > > 17685 JobTracker
> > > 17778 TaskTracker
> > > 18425 Jps
> > > ...
> > >
> > >
> > >
> > >
> > > AND
> > >
> > > sudo netstat -plten | grep java
> > > ...
> > > tcp        0      0 127.0.0.1:54310         0.0.0.0:*    LISTEN      1062    346907    17421/java    (namenode)
> > > tcp        0      0 127.0.0.1:54311         0.0.0.0:*    LISTEN      1062    347480    17685/java    (job tracker)
> > >
> > >
> > >
> > >
> > > Two things can happen when launching the application:
> > > 1) The program waits and nothing happens (99% of the time)
> > > 2) The program works, but the report shows that HDFS has some problems
> > >
> > > Taking a look at the debugger output:
> > >
> > >
> > > main:
> > >
> > >  [1] java.lang.Object.wait (native method)
> > >  [2] java.lang.Object.wait (Object.java:485)
> > >  [3] org.apache.hadoop.ipc.Client.call (Client.java:725)
> > >  [4] org.apache.hadoop.ipc.RPC$Invoker.invoke (RPC.java:220)
> > >  [5] $Proxy0.getProtocolVersion (null)
> > >  [6] org.apache.hadoop.ipc.RPC.getProxy (RPC.java:359)
> > >  [7] org.apache.hadoop.hdfs.DFSClient.createRPCNamenode (DFSClient.java:105)
> > >  [8] org.apache.hadoop.hdfs.DFSClient.<init> (DFSClient.java:208)
> > >  [9] org.apache.hadoop.hdfs.DFSClient.<init> (DFSClient.java:169)
> > >  [10] org.apache.hadoop.hdfs.DistributedFileSystem.initialize (DistributedFileSystem.java:82)
> > >  [11] org.apache.hadoop.fs.FileSystem.createFileSystem (FileSystem.java:1,384)
> > >  [12] org.apache.hadoop.fs.FileSystem.access$200 (FileSystem.java:66)
> > >  [13] org.apache.hadoop.fs.FileSystem$Cache.get (FileSystem.java:1,399)
> > >  [14] org.apache.hadoop.fs.FileSystem.get (FileSystem.java:199)
> > >  [15] org.apache.hadoop.fs.FileSystem.get (FileSystem.java:96)
> > >  [16] org.apache.hadoop.fs.FsShell.init (FsShell.java:85)
> > >  [17] org.apache.hadoop.hdfs.tools.DFSAdmin.run (DFSAdmin.java:777)
> > >  [18] org.apache.hadoop.util.ToolRunner.run (ToolRunner.java:65)
> > >  [19] org.apache.hadoop.util.ToolRunner.run (ToolRunner.java:79)
> > >  [20] org.apache.hadoop.hdfs.tools.DFSAdmin.main (DFSAdmin.java:858)
> > >
> > > IPC Client (47) connection to localhost/127.0.0.1:8020 from vincenzo:
> > >  [1] sun.nio.ch.EPollArrayWrapper.epollWait (native method)
> > >  [2] sun.nio.ch.EPollArrayWrapper.poll (EPollArrayWrapper.java:215)
> > >  [3] sun.nio.ch.EPollSelectorImpl.doSelect (EPollSelectorImpl.java:65)
> > >  [4] sun.nio.ch.SelectorImpl.lockAndDoSelect (SelectorImpl.java:69)
> > >  [5] sun.nio.ch.SelectorImpl.select (SelectorImpl.java:80)
> > >  [6] org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select (SocketIOWithTimeout.java:332)
> > >  [7] org.apache.hadoop.net.SocketIOWithTimeout.doIO (SocketIOWithTimeout.java:157)
> > >  [8] org.apache.hadoop.net.SocketInputStream.read (SocketInputStream.java:155)
> > >  [9] org.apache.hadoop.net.SocketInputStream.read (SocketInputStream.java:128)
> > >  [10] java.io.FilterInputStream.read (FilterInputStream.java:116)
> > >  [11] org.apache.hadoop.ipc.Client$Connection$PingInputStream.read (Client.java:276)
> > >  [12] java.io.BufferedInputStream.fill (BufferedInputStream.java:218)
> > >  [13] java.io.BufferedInputStream.read (BufferedInputStream.java:237)
> > >  [14] java.io.DataInputStream.readInt (DataInputStream.java:370)
> > >  [15] org.apache.hadoop.ipc.Client$Connection.receiveResponse (Client.java:501)
> > >  [16] org.apache.hadoop.ipc.Client$Connection.run (Client.java:446)
> > >
> > > Do you have any idea why this can happen?
> > >
> > > I've also tried to telnet to the host:port and it works. I've tried all the
> > > possible addresses in the configuration (localhost / 127.0.0.1 / name /
> > > name.domain).
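> > > (A minimal sketch of the kind of check I mean, assuming the default conf/ layout
> > > and that fs.default.name is set in core-site.xml or hadoop-site.xml depending on
> > > the version:
> > >
> > > grep -A1 fs.default.name $HADOOP_HOME/conf/*-site.xml
> > > telnet localhost 54310
> > > sudo netstat -plten | grep java
> > >
> > > so I can compare the configured address/port with what is actually listening.)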
> > >
> > > Any help is appreciated,
> > > Thanks in advance
> > >
> > > Vincenzo
> > >
> >
>
>
>
> --
> Vincenzo Massimiliano Gulisano
> PhD student - UPM - Distributed System Lab.
>
