[ 
https://issues.apache.org/jira/browse/HDFS-11525?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15906762#comment-15906762
 ] 

yuxh commented on HDFS-11525:
-----------------------------

I use 2.7.3. The first time it actually did give me the port information: 50010 was 
occupied, so I changed it to 60010. The DataNode still failed to start, but this 
time it no longer reported exactly which address was in conflict.
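For context, the DataNode opens several listeners configured by separate hdfs-site.xml keys (a sketch with the Hadoop 2.x defaults is below), so moving dfs.datanode.address off 50010 does not help if a different listener — the Netty frames in the log below suggest the web server on dfs.datanode.http.address — is the one actually in conflict:

```xml
<!-- hdfs-site.xml: DataNode listener addresses (Hadoop 2.x defaults) -->
<property>
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:50010</value>      <!-- data streaming; the port first reported busy -->
</property>
<property>
  <name>dfs.datanode.http.address</name>
  <value>0.0.0.0:50075</value>      <!-- web UI (Netty in 2.7.x); a separate bind -->
</property>
<property>
  <name>dfs.datanode.ipc.address</name>
  <value>0.0.0.0:50020</value>      <!-- IPC server -->
</property>
```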
Here is my log:
************************************************************/
2017-03-10 10:22:17,893 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
registered UNIX signal handlers for [TERM, HUP, INT]
2017-03-10 10:22:18,153 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to 
load native-hadoop library for your platform... using builtin-java classes 
where applicable
2017-03-10 10:22:18,406 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: 
loaded properties from hadoop-metrics2.properties
2017-03-10 10:22:18,459 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: 
Scheduled snapshot period at 10 second(s).
2017-03-10 10:22:18,459 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: 
DataNode metrics system started
2017-03-10 10:22:18,463 INFO 
org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner 
with targetBytesPerSec 1048576
2017-03-10 10:22:18,463 DEBUG org.apache.hadoop.hdfs.server.datanode.DataNode: 
File descriptor passing was not configured.
2017-03-10 10:22:18,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
Configured hostname is node136
2017-03-10 10:22:18,468 DEBUG 
org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil: 
DataTransferProtocol not using SaslPropertiesResolver, no QOP found in 
configuration for dfs.data.transfer.protection
2017-03-10 10:22:18,472 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
Starting DataNode with maxLockedMemory = 0
2017-03-10 10:22:18,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
Opened streaming server at /192.168.156.136:60010
2017-03-10 10:22:18,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
Balancing bandwith is 1048576 bytes/s
2017-03-10 10:22:18,493 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
Number threads for balancing is 5
2017-03-10 10:22:18,558 INFO org.mortbay.log: Logging to 
org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2017-03-10 10:22:18,565 INFO 
org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable 
to initialize FileSignerSecretProvider, falling back to use random secrets.
2017-03-10 10:22:18,570 INFO org.apache.hadoop.http.HttpRequestLog: Http 
request log for http.requests.datanode is not defined
2017-03-10 10:22:18,574 INFO org.apache.hadoop.http.HttpServer2: Added global 
filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2017-03-10 10:22:18,575 INFO org.apache.hadoop.http.HttpServer2: Added filter 
static_user_filter 
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to 
context datanode
2017-03-10 10:22:18,576 INFO org.apache.hadoop.http.HttpServer2: Added filter 
static_user_filter 
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to 
context static
2017-03-10 10:22:18,576 INFO org.apache.hadoop.http.HttpServer2: Added filter 
static_user_filter 
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to 
context logs
2017-03-10 10:22:18,586 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to 
port 59109
2017-03-10 10:22:18,586 INFO org.mortbay.log: jetty-6.1.26
2017-03-10 10:22:18,708 INFO org.mortbay.log: Started 
HttpServer2$SelectChannelConnectorWithSafeStartup@localhost:59109
2017-03-10 10:22:18,825 INFO org.mortbay.log: Stopped 
HttpServer2$SelectChannelConnectorWithSafeStartup@localhost:0
2017-03-10 10:22:18,926 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
Shutdown complete.
2017-03-10 10:22:18,926 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: 
Exception in secureMain
java.net.BindException: Address already in use
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:433)
        at sun.nio.ch.Net.bind(Net.java:425)
        at 
sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at 
io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
        at 
io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:475)
        at 
io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1021)
        at 
io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:455)
        at 
io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:440)
        at 
io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:844)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:194)
        at 
io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:340)
        at 
io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
        at 
io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
        at 
io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
        at java.lang.Thread.run(Thread.java:745)
2017-03-10 10:22:18,929 INFO org.apache.hadoop.util.ExitUtil: Exiting with 
status 1
2017-03-10 10:22:18,930 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: 
SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at node136/192.168.156.136
************************************************************/
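The Netty stack trace above surfaces the raw java.net.BindException, whose default message omits the address entirely. A minimal sketch of the improvement being requested — folding the attempted address into the exception message, in the spirit of Hadoop's NetUtils.wrapException — is below; the class name BindDiagnostics and the helper bindWithContext are hypothetical, not Hadoop APIs:

```java
import java.net.BindException;
import java.net.InetSocketAddress;
import java.net.ServerSocket;

public class BindDiagnostics {

    // Hypothetical helper: bind, and on failure rethrow with the attempted
    // address included, so "Address already in use" names the conflict.
    static ServerSocket bindWithContext(InetSocketAddress addr) throws Exception {
        ServerSocket socket = new ServerSocket();
        try {
            socket.bind(addr);
            return socket;
        } catch (BindException e) {
            socket.close();
            BindException wrapped = new BindException(
                "Problem binding to " + addr + ": " + e.getMessage());
            wrapped.initCause(e);
            throw wrapped;
        }
    }

    // Bind an ephemeral port, then bind it again to force a conflict and
    // return the enriched message a user would see.
    static String demo() throws Exception {
        ServerSocket first = bindWithContext(new InetSocketAddress("127.0.0.1", 0));
        try {
            bindWithContext(new InetSocketAddress("127.0.0.1", first.getLocalPort()));
            return "no conflict";
        } catch (BindException e) {
            return e.getMessage();
        } finally {
            first.close();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(demo());
    }
}
```

If each of the DataNode's listener binds reported its address this way, the log above would have pointed straight at the port still in conflict after moving 50010 to 60010.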



> "Address already in use" has no exact conflict information
> ----------------------------------------------------------
>
>                 Key: HDFS-11525
>                 URL: https://issues.apache.org/jira/browse/HDFS-11525
>             Project: Hadoop HDFS
>          Issue Type: Improvement
>          Components: datanode
>    Affects Versions: 2.7.3
>            Reporter: yuxh
>
> When I found that the DataNode did not start successfully, I checked the log:
> 2017-03-10 09:36:08,455 FATAL 
> org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
> java.net.BindException: Address already in use
>         at sun.nio.ch.Net.bind0(Native Method)
>         at sun.nio.ch.Net.bind(Net.java:433)
>         at sun.nio.ch.Net.bind(Net.java:425)
> which doesn't show which port is already in use. I had to experiment on another 
> machine to find that port 50075 is needed to start the DataNode.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
