Re: Re: hdfs2.7.3 not work with kerberos

2016-09-21 Thread lk_hadoop
Thank you Rakesh, the problem has been resolved.
I should not set default_ccache_name = KEYRING:persistent:%{uid} in
krb5.conf; that setting does not work with Hadoop.
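
For anyone who hits the same error, this is the kind of change involved,
sketched against the [libdefaults] section of a stock CentOS 7
/etc/krb5.conf (the surrounding lines may differ on your machine):

[libdefaults]
 dns_lookup_realm = false
 ticket_lifetime = 24h
 renew_lifetime = 7d
 forwardable = true
 # default_ccache_name = KEYRING:persistent:%{uid}   <- commented out; Java's GSS code cannot read kernel-keyring caches

With that line removed, libkrb5 falls back to its built-in default,
FILE:/tmp/krb5cc_%{uid}, which the JVM can read. Remember to run kinit
again afterwards so the new cache location actually contains a ticket.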

2016-09-22 

lk_hadoop 



From: Rakesh Radhakrishnan <rake...@apache.org>
Sent: 2016-09-21 23:25
Subject: Re: hdfs2.7.3 not work with kerberos
To: "lk_hadoop" <lk_had...@163.com>
Cc: "user" <user@hadoop.apache.org>


Re: hdfs2.7.3 not work with kerberos

2016-09-21 Thread Rakesh Radhakrishnan
I can see "Ticket cache: KEYRING:persistent:1004:1004" in your environment.

Maybe the KEYRING persistent cache setting is causing the trouble: the
Kerberos libraries store the credential cache in the kernel keyring, a
location the Hadoop (JVM) libraries can't seem to access.
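
A quick way to test this theory without editing krb5.conf, if you like
(KRB5CCNAME overrides default_ccache_name for the current shell only):

[hadoop@dmp1 ~]$ export KRB5CCNAME=FILE:/tmp/krb5cc_$(id -u)
[hadoop@dmp1 ~]$ kinit            # re-acquire the TGT into the plain file cache
[hadoop@dmp1 ~]$ klist            # "Ticket cache:" should now show FILE:/tmp/krb5cc_1004
[hadoop@dmp1 ~]$ hdfs dfs -ls /   # should succeed if the keyring cache was the problem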

Please refer to these links:
https://community.hortonworks.com/questions/818/ipa-kerberos-not-liking-my-kinit-ticket.html
https://community.hortonworks.com/articles/11291/kerberos-cache-in-ipa-redhat-idm-keyring-solved-1.html
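
If it still fails after that, the JDK's Kerberos debug output usually shows
which credential cache the JVM is actually trying to open (these are
standard JDK/Hadoop switches, nothing specific to your cluster):

[hadoop@dmp1 ~]$ export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
[hadoop@dmp1 ~]$ hdfs dfs -ls /   # the extra trace names the ccache being read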

Rakesh
Intel



hdfs2.7.3 not work with kerberos

2016-09-21 Thread lk_hadoop
hi, all:
My environment: CentOS 7.2, Hadoop 2.7.3, JDK 1.8, Kerberos 1.13.2-12.el7_2.
My hadoop user is hadoop. I can start HDFS with sbin/start-dfs.sh and I can
see the DataNodes from the web UI, but when I try hdfs dfs -ls /, I get an
error:

[hadoop@dmp1 ~]$ hdfs dfs -ls /
16/09/20 15:00:44 WARN ipc.Client: Exception encountered while connecting to 
the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by 
GSSException: No valid credentials provided (Mechanism level: Failed to find 
any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: 
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: 
No valid credentials provided (Mechanism level: Failed to find any Kerberos 
tgt)]; Host Details : local host is: "dmp1.example.com/192.168.249.129"; 
destination host is: "dmp1.example.com":9000; 
[hadoop@dmp1 ~]$ klist
Ticket cache: KEYRING:persistent:1004:1004
Default principal: had...@example.com

Valid starting   Expires  Service principal
09/20/2016 14:57:34  09/21/2016 14:57:31  krbtgt/example@example.com
 renew until 09/27/2016 14:57:31
[hadoop@dmp1 ~]$ 

Is it because my JDK is 1.8?

2016-09-21


lk_hadoop