[ 
https://issues.apache.org/jira/browse/AMBARI-14453?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15087806#comment-15087806
 ] 

Lakshmi VS commented on AMBARI-14453:
-------------------------------------

Matt, you are right: 'java.net.ConnectException' has nothing to do with
uppercase/lowercase hostnames. That exception was seen once on our clusters and
is no longer an issue.
Having said that, hostnames are still displayed inconsistently in Ambari when a
cluster is installed with uppercase hostnames. For example, the Hosts page
displays the names in lowercase (attached screenshot hosts.JPG), while the
NameNode UI shows the DataNodes in uppercase (attached screenshot
DataNodes.JPG). The HBase Master UI behaves the same way.
I have lowered the priority of this Jira, but the inconsistent hostname display
still needs to be addressed.
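
For reference, a minimal sketch of the kind of hostname normalization that would keep the display consistent across UIs. The class and method names below are hypothetical illustrations, not actual Ambari code; the point is simply that every page should render and compare the same canonical (lowercase) form, since DNS hostnames are case-insensitive.
{code}
import java.util.Locale;
import java.util.Objects;

// Hypothetical helper: normalize hostnames once at the boundary so every UI
// and every comparison sees the same canonical form.
public final class HostNameUtil {

    private HostNameUtil() { }

    /** Canonical (lowercase) form of a hostname, e.g. "N9-1-1.labs" -> "n9-1-1.labs". */
    public static String canonical(String hostName) {
        Objects.requireNonNull(hostName, "hostName");
        return hostName.trim().toLowerCase(Locale.ROOT);
    }

    /** Case-insensitive hostname comparison. */
    public static boolean sameHost(String a, String b) {
        return canonical(a).equals(canonical(b));
    }
}
{code}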

> Ambari has lowercase hostnames while cluster is installed with uppercase 
> hostnames.
> -----------------------------------------------------------------------------------
>
>                 Key: AMBARI-14453
>                 URL: https://issues.apache.org/jira/browse/AMBARI-14453
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 2.2.0
>            Reporter: Lakshmi VS
>            Assignee: Dmitry Lysnichenko
>            Priority: Minor
>             Fix For: 2.2.1
>
>         Attachments: Ambari.JPG, DataNodes.JPG, 
> hbase-hbase-master-N9-1-1.log, hosts.JPG, yarn-yarn-resourcemanager-N9-1-1.log
>
>
> Steps followed -
> 1. A Hadoop cluster with the configuration below is installed successfully using
> uppercase hostnames.
> {code}
> CLUSTERNAME='N91'
> MASTER1=N9-1-1.labs
> MASTER2=N9-1-2.labs
> DATANODE1=N9-1-3.labs
> DATANODE2=N9-1-4.labs
> DATANODE3=N9-1-5.labs
> {code}
> Snippet of /etc/hosts file -
> {code}
> 10.0.8.1    N9-1-1.labs N9-1-1 byn001-1 hadoopvm1-1
> 10.0.8.2    N9-1-2.labs N9-1-2 byn001-2 hadoopvm1-2
> 10.0.8.3    N9-1-3.labs N9-1-3 byn001-3 hadoopvm1-3
> 10.0.8.4    N9-1-4.labs N9-1-4 byn001-4 hadoopvm1-4
> 10.0.8.5    N9-1-5.labs N9-1-5 byn001-5 hadoopvm1-5
> {code}
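> For reference, hostname lookup through DNS and /etc/hosts is case-insensitive, which is why installation with the uppercase names succeeds. A minimal resolution check (a hypothetical snippet, assuming the /etc/hosts entries above are present on the machine running it) would show both spellings mapping to the same address:
> {code}
> import java.net.InetAddress;
> 
> // Both spellings should resolve to the same IP because hostname lookup
> // (DNS and /etc/hosts) ignores case.
> public class ResolveCheck {
>     public static void main(String[] args) throws Exception {
>         for (String name : new String[] {"N9-1-1.labs", "n9-1-1.labs"}) {
>             System.out.println(name + " -> " + InetAddress.getByName(name).getHostAddress());
>         }
>     }
> }
> {code}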
> 2. The Ambari Hosts page shows all hostnames in lowercase.
> 3. Attempts to run jobs fail with -
> {code}
> Error: java.net.ConnectException: Call From N9-1-4.labs/10.0.8.4 to n9-1-1.labs:8020 failed on connection exception: java.net.ConnectException: Connection refused;
> {code}
> Snippet of the MapReduce job failure -
> {code}
> out: 15/12/21 08:12:09 INFO mapreduce.Job: Task Id : attempt_1450702943449_0001_m_000002_0, Status : FAILED
>  out: Error: java.net.ConnectException: Call From N9-1-4.labs/10.0.8.4 to n9-1-1.labs:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>  out:         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>  out:         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>  out:         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>  out:         at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>  out:         at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
>       at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
>       at org.apache.hadoop.ipc.Client.call(Client.java:1431)
>       at org.apache.hadoop.ipc.Client.call(Client.java:1358)
>       at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
>       at com.sun.proxy.$Proxy13.getFileInfo(Unknown Source)
>       at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:497)
>       at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:252)
>       at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
>       at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
>       at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116)
>       at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1315)
>       at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1311)
>       at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>       at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1311)
>       at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1424)
>       at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.needsTaskCommit(FileOutputCommitter.java:641)
>       at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.needsTaskCommit(FileOutputCommitter.java:630)
>       at org.apache.hadoop.mapred.Task.isCommitRequired(Task.java:1085)
>       at org.apache.hadoop.mapred.Task.done(Task.java:1042)
>       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:345)
>       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:422)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
>  Caused by: java.net.ConnectException: Connection refused
>       at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>       at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
>       at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
>       at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
>       at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
>       at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:612)
>       at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:710)
>       at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:373)
>       at org.apache.hadoop.ipc.Client.getConnection(Client.java:1493)
>       at org.apache.hadoop.ipc.Client.call(Client.java:1397)
>       ... 27 more
> {code}
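> For what it's worth, "Connection refused" here means nothing was accepting connections on n9-1-1.labs:8020 at that moment, independent of hostname case. A minimal reachability probe (a hypothetical sketch using the host and port taken from the trace above):
> {code}
> import java.net.InetSocketAddress;
> import java.net.Socket;
> 
> // Probe the NameNode RPC endpoint seen in the stack trace. A "Connection
> // refused" from this probe means the port is not listening.
> public class PortCheck {
>     public static void main(String[] args) {
>         String host = "n9-1-1.labs";  // NameNode host from the trace
>         int port = 8020;              // NameNode RPC port from the trace
>         try (Socket socket = new Socket()) {
>             socket.connect(new InetSocketAddress(host, port), 5000);
>             System.out.println(host + ":" + port + " is reachable");
>         } catch (Exception e) {
>             System.out.println(host + ":" + port + " is NOT reachable: " + e);
>         }
>     }
> }
> {code}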
> 4. The HDFS, MapReduce, HBase, and YARN services, which were up and running
> after installation, go down.
> Attached are the logs for the YARN and HBase services.


