>>>>>>Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user

Could you please check that the Kerberos principal name is specified
correctly in "hdfs-site.xml"? It is used to authenticate against Kerberos.

If the keytab file defined in "hdfs-site.xml" does not exist, or its path
is wrong, you will see this error. So, please verify that the path and the
keytab filename are configured correctly.
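
For reference, here is a minimal sketch of the relevant hdfs-site.xml
entries; the realm, hosts and keytab path below are just placeholders,
please substitute your own values:

  <property>
    <name>dfs.namenode.kerberos.principal</name>
    <value>hadoop/_HOST@EXAMPLE.COM</value>
  </property>
  <property>
    <name>dfs.namenode.keytab.file</name>
    <value>/etc/hadoop/conf/hdfs.keytab</value>
  </property>
  <property>
    <name>dfs.secondary.namenode.kerberos.principal</name>
    <value>hadoop/_HOST@EXAMPLE.COM</value>
  </property>
  <property>
    <name>dfs.secondary.namenode.keytab.file</name>
    <value>/etc/hadoop/conf/hdfs.keytab</value>
  </property>

You can also confirm that the keytab really contains an entry for each
host's principal, and that it is readable by the user starting the daemons:

  klist -kt /etc/hadoop/conf/hdfs.keytab
  ls -l /etc/hadoop/conf/hdfs.keytab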

I hope this hadoop discussion thread may help you: https://goo.gl/M6l3vv


>>>>>>>2016-09-20 00:54:06,665 INFO org.apache.hadoop.http.HttpServer2:
HttpServer.start() threw a non Bind IOException
java.io.IOException: !JsseListener: java.lang.NullPointerException

This is probably due to some missing configuration.
Could you please re-check the keystore and truststore properties in
ssl-server.xml:

ssl.server.keystore.location
ssl.server.keystore.keypassword
ssl.client.truststore.location
ssl.client.truststore.password
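
For comparison, here is a minimal ssl-server.xml sketch with placeholder
paths and passwords (note that the ssl.client.truststore.* properties are
read from ssl-client.xml rather than ssl-server.xml):

  <property>
    <name>ssl.server.keystore.location</name>
    <value>/etc/hadoop/conf/keystore.jks</value>
  </property>
  <property>
    <name>ssl.server.keystore.password</name>
    <value>changeit</value>
  </property>
  <property>
    <name>ssl.server.keystore.keypassword</name>
    <value>changeit</value>
  </property>
  <property>
    <name>ssl.server.truststore.location</name>
    <value>/etc/hadoop/conf/truststore.jks</value>
  </property>
  <property>
    <name>ssl.server.truststore.password</name>
    <value>changeit</value>
  </property>

The NullPointerException from SslSocketConnector usually means one of
these values resolved to null, for example when ssl-server.xml is not on
the classpath or the keystore location/password is missing.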

Rakesh

On Tue, Sep 20, 2016 at 10:53 AM, kevin <kiss.kevin...@gmail.com> wrote:

> hi, all:
> My environment: CentOS 7.2, Hadoop 2.7.3, JDK 1.8
> After I configured HDFS with Kerberos, I can't start it up with
> sbin/start-dfs.sh
>
> ::namenode log as below
>
> STARTUP_MSG:   build = Unknown -r Unknown; compiled by 'root' on
> 2016-09-18T09:05Z
> STARTUP_MSG:   java = 1.8.0_102
> ************************************************************/
> 2016-09-20 00:54:05,822 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal
> handlers for [TERM, HUP, INT]
> 2016-09-20 00:54:05,825 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
> 2016-09-20 00:54:06,078 INFO
> org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
> hadoop-metrics2.properties
> 2016-09-20 00:54:06,149 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
> period at 10 second(s).
> 2016-09-20 00:54:06,149 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system
> started
> 2016-09-20 00:54:06,151 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is
> hdfs://dmp1.example.com:9000
> 2016-09-20 00:54:06,152 INFO
> org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use
> dmp1.example.com:9000 to access this namenode/service.
> 2016-09-20 00:54:06,446 INFO
> org.apache.hadoop.security.UserGroupInformation: Login successful for user
> hadoop/dmp1.example....@example.com using
> keytab file /etc/hadoop/conf/hdfs.keytab
> 2016-09-20 00:54:06,472 INFO org.apache.hadoop.hdfs.DFSUtil: Starting web
> server as: HTTP/dmp1.example....@example.com
> 2016-09-20 00:54:06,475 INFO org.apache.hadoop.hdfs.DFSUtil: Starting
> Web-server for hdfs at: https://dmp1.example.com:50470
> 2016-09-20 00:54:06,517 INFO org.mortbay.log: Logging to
> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
> org.mortbay.log.Slf4jLog
> 2016-09-20 00:54:06,533 INFO
> org.apache.hadoop.security.authentication.server.AuthenticationFilter:
> Unable to initialize FileSignerSecretProvider, falling back to use random
> secrets.
> 2016-09-20 00:54:06,542 INFO org.apache.hadoop.http.HttpRequestLog: Http
> request log for http.requests.namenode is not defined
> 2016-09-20 00:54:06,546 INFO org.apache.hadoop.http.HttpServer2: Added
> global filter 'safety'
> (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
> 2016-09-20 00:54:06,548 INFO org.apache.hadoop.http.HttpServer2: Added
> filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context hdfs
> 2016-09-20 00:54:06,548 INFO org.apache.hadoop.http.HttpServer2: Added
> filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context static
> 2016-09-20 00:54:06,548 INFO org.apache.hadoop.http.HttpServer2: Added
> filter static_user_filter
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to
> context logs
> 2016-09-20 00:54:06,653 INFO org.apache.hadoop.http.HttpServer2: Added
> filter 'org.apache.hadoop.hdfs.web.AuthFilter'
> (class=org.apache.hadoop.hdfs.web.AuthFilter)
> 2016-09-20 00:54:06,654 INFO org.apache.hadoop.http.HttpServer2:
> addJerseyResourcePackage:
> packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources,
> pathSpec=/webhdfs/v1/*
> 2016-09-20 00:54:06,657 INFO org.apache.hadoop.http.HttpServer2: Adding
> Kerberos (SPNEGO) filter to getDelegationToken
> 2016-09-20 00:54:06,658 INFO org.apache.hadoop.http.HttpServer2: Adding
> Kerberos (SPNEGO) filter to renewDelegationToken
> 2016-09-20 00:54:06,658 INFO org.apache.hadoop.http.HttpServer2: Adding
> Kerberos (SPNEGO) filter to cancelDelegationToken
> 2016-09-20 00:54:06,659 INFO org.apache.hadoop.http.HttpServer2: Adding
> Kerberos (SPNEGO) filter to fsck
> 2016-09-20 00:54:06,659 INFO org.apache.hadoop.http.HttpServer2: Adding
> Kerberos (SPNEGO) filter to imagetransfer
> 2016-09-20 00:54:06,665 WARN org.mortbay.log:
> java.lang.NullPointerException
> 2016-09-20 00:54:06,665 INFO org.apache.hadoop.http.HttpServer2:
> HttpServer.start() threw a non Bind IOException
> java.io.IOException: !JsseListener: java.lang.NullPointerException
>   at org.mortbay.jetty.security.SslSocketConnector.newServerSocket(SslSocketConnector.java:516)
>   at org.apache.hadoop.security.ssl.SslSocketConnectorSecure.newServerSocket(SslSocketConnectorSecure.java:47)
>   at org.mortbay.jetty.bio.SocketConnector.open(SocketConnector.java:73)
>   at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:914)
>   at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:856)
>   at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:142)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:753)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:639)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:812)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:796)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1493)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1559)
>
>
> ::secondary namenode log as below
>
>
>
> STARTUP_MSG:   build = Unknown -r Unknown; compiled by 'root' on
> 2016-09-18T09:05Z
> STARTUP_MSG:   java = 1.8.0_102
> ************************************************************/
> 2016-09-20 00:54:14,885 INFO
> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode: registered UNIX
> signal handlers for [TERM, HUP, INT]
> 2016-09-20 00:54:15,263 FATAL
> org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode: Failed to start
> secondary namenode
> java.io.IOException: Login failure for hadoop from keytab
> /etc/hadoop/conf/hdfs.keytab:
> javax.security.auth.login.LoginException: Unable to obtain password from
> user
>   at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:963)
>   at org.apache.hadoop.security.SecurityUtil.login(SecurityUtil.java:246)
>   at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:217)
>   at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:192)
>   at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:671)
> Caused by: javax.security.auth.login.LoginException: Unable to obtain
> password from user
>   at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:897)
>   at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:760)
>   at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:498)
>   at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
>   at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
>   at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
>   at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
>   at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
>   at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:954)
>   ... 4 more
>
