Ah, I am using Ambari, so that does explain what is attempting
to connect to the DataNode so consistently.
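For anyone finding this thread later: the behavior described below can be reproduced with a bare TCP connect that is closed without sending any payload, which is essentially what the Ambari port check does. This is a minimal sketch (the helper name and the use of port 50010 are illustrative, not Ambari's actual code):

```python
import socket

def ping_port(host, port, timeout=5.0):
    """Open a TCP connection and close it immediately, sending no data.

    Against a DataNode transfer port (e.g. 50010), the connect succeeds,
    but the DataXceiver thread then hits EOF while trying to read the
    two-byte operation code -- producing the java.io.EOFException seen
    in the DataNode log.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True   # port is up; no payload is ever written
    except OSError:
        return False      # connection refused / timed out

# e.g. ping_port("127.0.0.1", 50010)
```

Each such probe leaves one "DataXceiver error processing unknown operation" entry in the log, which matches the once-a-minute cadence of the entries below.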

Thank you for the prompt reply, Yusaku!

Regards,
Rajesh

On Thu, Jun 25, 2015 at 5:09 PM, Yusaku Sako <[email protected]> wrote:

>  Hi Rajesh,
>
>  Are you running Ambari?  If so, this is benign and can be ignored.
> Ambari "pings" the DataNode by making a socket connection once a minute to
> make sure it's up and running.  Otherwise, it will trigger an alert.
> Unfortunately, there's no known way to "ping" the DataNode with a valid
> payload that avoids this log entry (or at least when the devs implemented
> this check in Ambari, there didn't seem to be one).
>
>  Yusaku
>
>   From: Rajesh Kartha <[email protected]>
> Reply-To: "[email protected]" <[email protected]>
> Date: Thursday, June 25, 2015 4:57 PM
> To: "[email protected]" <[email protected]>
> Subject: DataNode logs have exceptions - DataXceiver error processing
> unknown operation
>
>   Hello,
>
>  I am using a Hadoop 2.7.1 build and noticed a constant flow of exceptions
> every 60 seconds in the DataNode log files:
>
> 2015-06-25 13:02:36,292 ERROR datanode.DataNode
> (DataXceiver.java:run(278)) - bdavm063.svl.ibm.com:50010:DataXceiver
> error processing unknown operation  src: /127.0.0.1:54415 dst: /
> 127.0.0.1:50010
> java.io.EOFException
>         at java.io.DataInputStream.readShort(DataInputStream.java:315)
>         at
> org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
>         at
> org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
>         at java.lang.Thread.run(Thread.java:745)
> 2015-06-25 13:03:34,328 ERROR datanode.DataNode
> (DataXceiver.java:run(278)) - bdavm063.svl.ibm.com:50010:DataXceiver
> error processing unknown operation  src: /127.0.0.1:54441 dst: /
> 127.0.0.1:50010
> java.io.EOFException
>         at java.io.DataInputStream.readShort(DataInputStream.java:315)
>         at
> org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
>         at
> org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
>         at java.lang.Thread.run(Thread.java:745)
> 2015-06-25 13:05:36,081 ERROR datanode.DataNode
> (DataXceiver.java:run(278)) - bdavm063.svl.ibm.com:50010:DataXceiver
> error processing unknown operation  src: /127.0.0.1:54477 dst: /
> 127.0.0.1:50010
> java.io.EOFException
>         at java.io.DataInputStream.readShort(DataInputStream.java:315)
>         at
> org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
>         at
> org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
>         at java.lang.Thread.run(Thread.java:745)
>
>
>
>  I checked:
> - ulimit for open files (-n): *32768*
> - dfs.datanode.max.transfer.threads: *16384*
>
>  While HDFS seems to work without issues, the logs are filled with these
> errors.
>
>  Any thoughts/ideas on resolving this are greatly appreciated.
>
>  Regards,
>  Rajesh
>
>
