Seems to be a property name problem: it should be "principal" (the trailing "l" is missing).

<property>
  <name>dfs.secondary.namenode.kerberos.principa</name>
  <value>hadoop/_h...@example.com</value>
</property>
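
For reference, the corrected property would be:

<property>
  <name>dfs.secondary.namenode.kerberos.principal</name>
  <value>hadoop/_h...@example.com</value>
</property>

Note that the same missing "l" also appears in your dfs.namenode.kerberos.internal.spnego.principa, dfs.journalnode.kerberos.principa and dfs.journalnode.kerberos.internal.spnego.principa entries below, so those need the same fix.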


For the NameNode HTTP server startup failure, please check Rakesh's comments:

This is probably due to some missing configuration.
Could you please re-check ssl-server.xml and the keystore and truststore
properties:

ssl.server.keystore.location
ssl.server.keystore.keypassword
ssl.client.truststore.location
ssl.client.truststore.password
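
For reference, a minimal sketch of the two SSL config files (every path and password below is a placeholder, not your actual value; note the ssl.client.* properties belong in ssl-client.xml):

<!-- ssl-server.xml (placeholder values) -->
<property>
  <name>ssl.server.keystore.location</name>
  <value>/etc/hadoop/conf/keystore.jks</value>
</property>
<property>
  <name>ssl.server.keystore.password</name>
  <value>changeit</value>
</property>
<property>
  <name>ssl.server.keystore.keypassword</name>
  <value>changeit</value>
</property>

<!-- ssl-client.xml (placeholder values) -->
<property>
  <name>ssl.client.truststore.location</name>
  <value>/etc/hadoop/conf/truststore.jks</value>
</property>
<property>
  <name>ssl.client.truststore.password</name>
  <value>changeit</value>
</property>

If any of these point to a missing file or carry a wrong password, the SslSocketConnector can fail with exactly the kind of NullPointerException you saw.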


--Brahma Reddy Battula

From: kevin [mailto:kiss.kevin...@gmail.com]
Sent: 20 September 2016 16:53
To: Rakesh Radhakrishnan
Cc: user.hadoop
Subject: Re: hdfs2.7.3 kerberos can not startup

Thanks, but my issue is that the NameNode logs in successfully while the secondary
NameNode does not, and the NameNode's HttpServer.start() threw a non-Bind IOException:

hdfs-site.xml:

<property>
    <name>dfs.webhdfs.enabled</name>
    <value>true</value>
</property>

<property>
  <name>dfs.block.access.token.enable</name>
  <value>true</value>
</property>

<!-- NameNode security config -->
<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>hadoop/_h...@example.com</value>
</property>
<property>
  <name>dfs.namenode.keytab.file</name>
  <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>
<property>
  <name>dfs.https.port</name>
  <value>50470</value>
</property>
<property>
  <name>dfs.namenode.https-address</name>
  <value>dmp1.example.com:50470</value>
</property>
<property>
  <name>dfs.namenode.kerberos.internal.spnego.principa</name>
  <value>HTTP/_h...@example.com</value>
</property>
<property>
  <name>dfs.web.authentication.kerberos.keytab</name>
  <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>
<property>
  <name>dfs.http.policy</name>
  <value>HTTPS_ONLY</value>
</property>
<property>
  <name>dfs.https.enable</name>
  <value>true</value>
</property>


<!-- secondary NameNode security config -->
<property>
  <name>dfs.namenode.secondary.http-address</name>
  <value>dmp1.example.com:50090</value>
</property>
<property>
  <name>dfs.secondary.namenode.keytab.file</name>
  <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>
<property>
  <name>dfs.secondary.namenode.kerberos.principa</name>
  <value>hadoop/_h...@example.com</value>
</property>
<property>
  <name>dfs.secondary.namenode.kerberos.internal.spnego.principal</name>
  <value>HTTP/_h...@example.com</value>
</property>
<property>
  <name>dfs.namenode.secondary.https-port</name>
  <value>50470</value>
</property>


<!-- JournalNode security config -->

<property>
  <name>dfs.journalnode.keytab.file</name>
  <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>
<property>
  <name>dfs.journalnode.kerberos.principa</name>
  <value>hadoop/_h...@example.com</value>
</property>
<property>
  <name>dfs.journalnode.kerberos.internal.spnego.principa</name>
  <value>HTTP/_h...@example.com</value>
</property>
<property>
  <name>dfs.web.authentication.kerberos.keytab</name>
  <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>


<!-- DataNode security config -->
<property>
  <name>dfs.datanode.kerberos.principal</name>
  <value>hadoop/_h...@example.com</value>
</property>
<property>
  <name>dfs.datanode.keytab.file</name>
  <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>
<property>
  <name>dfs.datanode.data.dir.perm</name>
  <value>700</value>
</property>

<!-- datanode SASL-->
<property>
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:61004</value>
</property>
<property>
  <name>dfs.datanode.http.address</name>
  <value>0.0.0.0:61006</value>
</property>
<property>
  <name>dfs.datanode.https.address</name>
  <value>0.0.0.0:50470</value>
</property>

<property>
  <name>dfs.data.transfer.protection</name>
  <value>integrity</value>
</property>

<property>
     <name>dfs.web.authentication.kerberos.principal</name>
     <value>HTTP/_h...@example.com</value>
</property>
<property>
     <name>dfs.web.authentication.kerberos.keytab</name>
     <value>/etc/hadoop/conf/hdfs.keytab</value>
</property>

and [hadoop@dmp1 hadoop-2.7.3]$ klist -ket /etc/hadoop/conf/hdfs.keytab


Keytab name: FILE:/etc/hadoop/conf/hdfs.keytab
KVNO Timestamp           Principal
---- ------------------- ------------------------------------------------------
   2 09/19/2016 16:00:41 hdfs/dmp1.example....@example.com (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp1.example....@example.com (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp1.example....@example.com (des3-cbc-sha1)
   2 09/19/2016 16:00:41 hdfs/dmp1.example....@example.com (arcfour-hmac)
   2 09/19/2016 16:00:41 hdfs/dmp2.example....@example.com (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp2.example....@example.com (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp2.example....@example.com (des3-cbc-sha1)
   2 09/19/2016 16:00:41 hdfs/dmp2.example....@example.com (arcfour-hmac)
   2 09/19/2016 16:00:41 hdfs/dmp3.example....@example.com (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp3.example....@example.com (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 hdfs/dmp3.example....@example.com (des3-cbc-sha1)
   2 09/19/2016 16:00:41 hdfs/dmp3.example....@example.com (arcfour-hmac)
   2 09/19/2016 16:00:41 HTTP/dmp1.example....@example.com (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp1.example....@example.com (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp1.example....@example.com (des3-cbc-sha1)
   2 09/19/2016 16:00:41 HTTP/dmp1.example....@example.com (arcfour-hmac)
   2 09/19/2016 16:00:41 HTTP/dmp2.example....@example.com (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp2.example....@example.com (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp2.example....@example.com (des3-cbc-sha1)
   2 09/19/2016 16:00:41 HTTP/dmp2.example....@example.com (arcfour-hmac)
   2 09/19/2016 16:00:41 HTTP/dmp3.example....@example.com (aes256-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp3.example....@example.com (aes128-cts-hmac-sha1-96)
   2 09/19/2016 16:00:41 HTTP/dmp3.example....@example.com (des3-cbc-sha1)
   2 09/19/2016 16:00:41 HTTP/dmp3.example....@example.com (arcfour-hmac)
   2 09/19/2016 20:21:03 hadoop/dmp1.example....@example.com (aes256-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp1.example....@example.com (aes128-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp1.example....@example.com (des3-cbc-sha1)
   2 09/19/2016 20:21:03 hadoop/dmp1.example....@example.com (arcfour-hmac)
   2 09/19/2016 20:21:03 hadoop/dmp2.example....@example.com (aes256-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp2.example....@example.com (aes128-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp2.example....@example.com (des3-cbc-sha1)
   2 09/19/2016 20:21:03 hadoop/dmp2.example....@example.com (arcfour-hmac)
   2 09/19/2016 20:21:03 hadoop/dmp3.example....@example.com (aes256-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp3.example....@example.com (aes128-cts-hmac-sha1-96)
   2 09/19/2016 20:21:03 hadoop/dmp3.example....@example.com (des3-cbc-sha1)
   2 09/19/2016 20:21:03 hadoop/dmp3.example....@example.com (arcfour-hmac)

2016-09-20 15:52 GMT+08:00 Rakesh Radhakrishnan <rake...@apache.org>:
>>>>>>Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user

Could you please check that the Kerberos principal name is specified correctly in
"hdfs-site.xml"? That principal is used to authenticate against Kerberos.

If the keytab file defined in "hdfs-site.xml" does not exist or the path is wrong, you
will see this error. So, please verify that the path and the keytab filename are
configured correctly.
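
For example, you could verify the keytab and the login from the shell before starting the daemon (the path is taken from your config; substitute the exact principal that klist prints):

ls -l /etc/hadoop/conf/hdfs.keytab                       # keytab exists and is readable
klist -kt /etc/hadoop/conf/hdfs.keytab                   # list the principals it contains
kinit -kt /etc/hadoop/conf/hdfs.keytab hadoop/<host-fqdn>@<REALM>   # try an actual login

If kinit fails, the daemon's login will fail too; if it succeeds, re-check the property names and values in hdfs-site.xml.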

I hope hadoop discussion thread, https://goo.gl/M6l3vv may help you.


>>>>>>>2016-09-20 00:54:06,665 INFO org.apache.hadoop.http.HttpServer2: HttpServer.start() threw a non Bind IOException
java.io.IOException: !JsseListener: java.lang.NullPointerException

This is probably due to some missing configuration.
Could you please re-check ssl-server.xml and the keystore and truststore
properties:

ssl.server.keystore.location
ssl.server.keystore.keypassword
ssl.client.truststore.location
ssl.client.truststore.password
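
You can also sanity-check the keystore file itself with the JDK's keytool (the path below is a placeholder; use the location from your ssl-server.xml and enter the configured store password when prompted):

keytool -list -keystore /etc/hadoop/conf/keystore.jks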

Rakesh

On Tue, Sep 20, 2016 at 10:53 AM, kevin <kiss.kevin...@gmail.com> wrote:
hi all:
My environment: CentOS 7.2, Hadoop 2.7.3, JDK 1.8.
After I configured HDFS with Kerberos, I can't start it up with sbin/start-dfs.sh.

::namenode log as below

STARTUP_MSG:   build = Unknown -r Unknown; compiled by 'root' on 
2016-09-18T09:05Z
STARTUP_MSG:   java = 1.8.0_102
************************************************************/
2016-09-20 00:54:05,822 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: 
registered UNIX signal handlers for [TERM, HUP, INT]
2016-09-20 00:54:05,825 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: 
createNameNode []
2016-09-20 00:54:06,078 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: 
loaded properties from hadoop-metrics2.properties
2016-09-20 00:54:06,149 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: 
Scheduled snapshot period at 10 second(s).
2016-09-20 00:54:06,149 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: 
NameNode metrics system started
2016-09-20 00:54:06,151 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: 
fs.defaultFS is hdfs://dmp1.example.com:9000
2016-09-20 00:54:06,152 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: 
Clients are to use dmp1.example.com:9000 to 
access this namenode/service.
2016-09-20 00:54:06,446 INFO org.apache.hadoop.security.UserGroupInformation: 
Login successful for user 
hadoop/dmp1.example....@example.com using 
keytab file /etc/hadoop/conf/hdfs.keytab
2016-09-20 00:54:06,472 INFO org.apache.hadoop.hdfs.DFSUtil: Starting web 
server as: 
HTTP/dmp1.example....@example.com
2016-09-20 00:54:06,475 INFO org.apache.hadoop.hdfs.DFSUtil: Starting 
Web-server for hdfs at: https://dmp1.example.com:50470
2016-09-20 00:54:06,517 INFO org.mortbay.log: Logging to 
org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2016-09-20 00:54:06,533 INFO 
org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable 
to initialize FileSignerSecretProvider, falling back to use random secrets.
2016-09-20 00:54:06,542 INFO org.apache.hadoop.http.HttpRequestLog: Http 
request log for http.requests.namenode is not defined
2016-09-20 00:54:06,546 INFO org.apache.hadoop.http.HttpServer2: Added global 
filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2016-09-20 00:54:06,548 INFO org.apache.hadoop.http.HttpServer2: Added filter 
static_user_filter 
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to 
context hdfs
2016-09-20 00:54:06,548 INFO org.apache.hadoop.http.HttpServer2: Added filter 
static_user_filter 
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to 
context static
2016-09-20 00:54:06,548 INFO org.apache.hadoop.http.HttpServer2: Added filter 
static_user_filter 
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to 
context logs
2016-09-20 00:54:06,653 INFO org.apache.hadoop.http.HttpServer2: Added filter 
'org.apache.hadoop.hdfs.web.AuthFilter' 
(class=org.apache.hadoop.hdfs.web.AuthFilter)
2016-09-20 00:54:06,654 INFO org.apache.hadoop.http.HttpServer2: 
addJerseyResourcePackage: 
packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
 pathSpec=/webhdfs/v1/*
2016-09-20 00:54:06,657 INFO org.apache.hadoop.http.HttpServer2: Adding 
Kerberos (SPNEGO) filter to getDelegationToken
2016-09-20 00:54:06,658 INFO org.apache.hadoop.http.HttpServer2: Adding 
Kerberos (SPNEGO) filter to renewDelegationToken
2016-09-20 00:54:06,658 INFO org.apache.hadoop.http.HttpServer2: Adding 
Kerberos (SPNEGO) filter to cancelDelegationToken
2016-09-20 00:54:06,659 INFO org.apache.hadoop.http.HttpServer2: Adding 
Kerberos (SPNEGO) filter to fsck
2016-09-20 00:54:06,659 INFO org.apache.hadoop.http.HttpServer2: Adding 
Kerberos (SPNEGO) filter to imagetransfer
2016-09-20 00:54:06,665 WARN org.mortbay.log: java.lang.NullPointerException
2016-09-20 00:54:06,665 INFO org.apache.hadoop.http.HttpServer2: 
HttpServer.start() threw a non Bind IOException
java.io.IOException: !JsseListener: java.lang.NullPointerException
at org.mortbay.jetty.security.SslSocketConnector.newServerSocket(SslSocketConnector.java:516)
at org.apache.hadoop.security.ssl.SslSocketConnectorSecure.newServerSocket(SslSocketConnectorSecure.java:47)
at org.mortbay.jetty.bio.SocketConnector.open(SocketConnector.java:73)
at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:914)
at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:856)
at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:142)
at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:753)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:639)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:812)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:796)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1493)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1559)


::second namenode log as below

STARTUP_MSG:   build = Unknown -r Unknown; compiled by 'root' on 
2016-09-18T09:05Z
STARTUP_MSG:   java = 1.8.0_102
************************************************************/
2016-09-20 00:54:14,885 INFO 
org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode: registered UNIX 
signal handlers for [TERM, HUP, INT]
2016-09-20 00:54:15,263 FATAL 
org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode: Failed to start 
secondary namenode
java.io.IOException: Login failure for hadoop from keytab /etc/hadoop/conf/hdfs.keytab: javax.security.auth.login.LoginException: Unable to obtain password from user
at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:963)
at org.apache.hadoop.security.SecurityUtil.login(SecurityUtil.java:246)
at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:217)
at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:192)
at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:671)
Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user
at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:897)
at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:760)
at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:954)
... 4 more

