[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.

2020-06-18 Thread LOKESKUMAR VIJAYAKUMAR (Jira)


[ 
https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17139552#comment-17139552
 ] 

LOKESKUMAR VIJAYAKUMAR commented on HDFS-13965:
-----------------------------------------------

Hello Team!
Can anyone please help here?

> hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS 
> encryption is enabled.
> --------------------------------------------------------------------------
> [...]

[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.

2020-06-08 Thread LOKESKUMAR VIJAYAKUMAR (Jira)


[ 
https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17128745#comment-17128745
 ] 

LOKESKUMAR VIJAYAKUMAR commented on HDFS-13965:
-----------------------------------------------

Hello Kitti Nanasi,
We implemented the workaround you suggested (setting the KRB5CCNAME 
environment variable to the hadoop user's kerberos cache path). But this 
causes the hadoop user's ticket cache to be renewed automatically as the root 
user when the cache is about to expire or has already expired, which changes 
the ownership of the ticket cache from the hadoop user to root. As a result, 
further kinit logins fail for the hadoop user, because the ticket cache path 
is now owned by root. The debug log below shows the automatic renewal and 
re-login.

Is there any way to avoid this issue? What should we do to not run into it?





20/05/21 21:01:53 DEBUG security.UserGroupInformation: hadoop login
20/05/21 21:01:53 DEBUG security.UserGroupInformation: hadoop login commit
20/05/21 21:01:53 DEBUG security.UserGroupInformation: using kerberos user:hal...@cinfin.com
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Using user: "hal...@cinfin.com" with name hal...@cinfin.com
20/05/21 21:01:53 DEBUG security.UserGroupInformation: User entry: "hal...@cinfin.com"
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Assuming keytab is managed externally since logged in from subject.
20/05/21 21:01:53 DEBUG security.UserGroupInformation: UGI loginUser:hal...@cinfin.com (auth:KERBEROS)
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Found tgt Ticket (hex) =
 
Client Principal = hal...@cinfin.com
Server Principal = krbtgt/cinfin@cinfin.com
 
Forwardable Ticket true
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket true
Initial Ticket true
Auth Time = Thu May 21 03:09:53 EDT 2020
Start Time = Thu May 21 11:10:09 EDT 2020
End Time = Thu May 21 21:10:09 EDT 2020
Renew Till = Thu May 28 03:09:53 EDT 2020
Client Addresses  Null
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Current time is 1590109313240
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Next refresh is 1590102609000
20/05/21 21:01:53 DEBUG security.UserGroupInformation: renewed ticket
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Initiating logout for hal...@cinfin.com
20/05/21 21:01:53 DEBUG security.UserGroupInformation: hadoop logout
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Initiating re-login for hal...@cinfin.com
20/05/21 21:01:53 DEBUG security.UserGroupInformation: hadoop login
20/05/21 21:01:53 DEBUG security.UserGroupInformation: hadoop login commit
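
For JVM-based clients, a minimal sketch of an in-process alternative we are 
considering: log in from the ticket cache explicitly via 
UserGroupInformation.getUGIFromTicketCache and run the read inside doAs, 
instead of exporting KRB5CCNAME for the whole (root-owned) process. The cache 
path, user, and file path are the ones from the reproduction on this issue; 
whether this covers the KMS SPNEGO path on 2.7.x, and whether UGI's renewal 
thread still rewrites the cache as root, are assumptions we would need to 
verify.

import java.io.InputStream;
import java.security.PrivilegedExceptionAction;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class DoAsRead {
    public static void main(String[] args) throws Exception {
        final Configuration conf = new Configuration();
        UserGroupInformation.setConfiguration(conf);
        // Log in from the external ticket cache directly, without exporting
        // KRB5CCNAME for the whole process.
        UserGroupInformation ugi =
                UserGroupInformation.getUGIFromTicketCache("/tmp/krb5cc_1006", "loki");
        ugi.doAs((PrivilegedExceptionAction<Void>) () -> {
            // Both the NameNode RPC and the KMS decryptEncryptedKey call run
            // inside this login context.
            try (FileSystem fs = FileSystem.newInstance(FileSystem.getDefaultUri(conf), conf);
                 InputStream is = fs.open(new Path("/loki/loki.file"))) {
                byte[] buffer = new byte[4096];
                for (int nr = is.read(buffer); nr != -1; nr = is.read(buffer)) {
                    System.out.write(buffer, 0, nr);
                }
            }
            return null;
        });
    }
}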


> hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS 
> encryption is enabled.
> --------------------------------------------------------------------------
> [...]

[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.

2018-12-20 Thread LOKESKUMAR VIJAYAKUMAR (JIRA)


[ 
https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16726300#comment-16726300
 ] 

LOKESKUMAR VIJAYAKUMAR commented on HDFS-13965:
-----------------------------------------------

Hello [~knanasi],

Thanks for checking this issue.

Our software is implemented using the corresponding C APIs provided by 
libhdfs (the ticket cache path is passed through the builder call 
hdfsBuilderSetKerbTicketCachePath, which maps to this same setting). Our 
service runs as the root user; when connecting to the hadoop cluster, we do a 
kerberos login as the hadoop user and use that ticket cache to access the 
data. Will it be possible to fix this so that the API works as expected in 
all cases?

Thanks,
Lokes


> hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS 
> encryption is enabled.
> --------------------------------------------------------------------------
> [...]

[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.

2018-10-31 Thread LOKESKUMAR VIJAYAKUMAR (JIRA)


[ 
https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16670381#comment-16670381
 ] 

LOKESKUMAR VIJAYAKUMAR commented on HDFS-13965:
-----------------------------------------------

Did anyone get a chance to check this?
This has been open for quite some time now.

> hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS 
> encryption is enabled.
> --------------------------------------------------------------------------
> [...]

[jira] [Created] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.

2018-10-05 Thread LOKESKUMAR VIJAYAKUMAR (JIRA)
LOKESKUMAR VIJAYAKUMAR created HDFS-13965:
------------------------------------------

 Summary: hadoop.security.kerberos.ticket.cache.path setting is not 
honored when KMS encryption is enabled.
 Key: HDFS-13965
 URL: https://issues.apache.org/jira/browse/HDFS-13965
 Project: Hadoop HDFS
  Issue Type: Bug
  Components: hdfs-client, kms
Affects Versions: 2.7.7, 2.7.3
Reporter: LOKESKUMAR VIJAYAKUMAR


We use the hadoop.security.kerberos.ticket.cache.path setting to provide a 
custom kerberos cache path, so that all hadoop operations run as the 
specified user. But this setting is not honored when KMS encryption is 
enabled.

The program below, which reads a file, works when KMS encryption is not 
enabled, but fails when KMS encryption is enabled.

It looks like the hadoop.security.kerberos.ticket.cache.path setting is not 
honored by createConnection in KMSClientProvider.java.
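
To make the mismatch visible before running the reproduction below, here is a 
small diagnostic sketch (the class name and output labels are ours, purely 
illustrative). It prints the UGI built from the ticket cache, resolved the 
same way FileSystem.newInstance(uri, conf, user) resolves it, next to the 
JVM-wide login user, which the KMS SPNEGO handshake appears to authenticate 
as:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class UgiCheck {
    // Usage: UgiCheck kerberos_cache hadoop_user
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("hadoop.security.kerberos.ticket.cache.path", args[0]);
        UserGroupInformation.setConfiguration(conf);
        // UGI resolved from the ticket cache (what the HDFS client runs as).
        UserGroupInformation fromCache = UserGroupInformation.getBestUGI(args[0], args[1]);
        // JVM-wide login user (what the KMS SPNEGO handshake appears to use).
        UserGroupInformation login = UserGroupInformation.getLoginUser();
        System.out.println("UGI from ticket cache : " + fromCache
                + " (kerberos: " + fromCache.hasKerberosCredentials() + ")");
        System.out.println("JVM login user        : " + login
                + " (kerberos: " + login.hasKerberosCredentials() + ")");
    }
}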

 

HadoopTest.java (CLASSPATH needs to be set to compile and run)

 

import java.io.InputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HadoopTest {
    public static int runRead(String[] args) throws Exception {
        if (args.length < 3) {
            System.err.println("HadoopTest hadoop_file_path hadoop_user kerberos_cache");
            return 1;
        }
        Path inputPath = new Path(args[0]);
        Configuration conf = new Configuration();
        URI defaultURI = FileSystem.getDefaultUri(conf);
        // Point all hadoop operations at the custom kerberos ticket cache.
        conf.set("hadoop.security.kerberos.ticket.cache.path", args[2]);
        // Create the filesystem as the given user; the HDFS RPC layer honors
        // the ticket cache path set above.
        FileSystem fs = FileSystem.newInstance(defaultURI, conf, args[1]);
        // Opening the file triggers the KMS decryptEncryptedKey call when it
        // is in an encryption zone; this is where the failure occurs.
        InputStream is = fs.open(inputPath);
        byte[] buffer = new byte[4096];
        int nr = is.read(buffer);
        while (nr != -1) {
            System.out.write(buffer, 0, nr);
            nr = is.read(buffer);
        }
        is.close();
        fs.close();
        return 0;
    }

    public static void main(String[] args) throws Exception {
        int returnCode = HadoopTest.runRead(args);
        System.exit(returnCode);
    }
}

[root@lstrost3 testhadoop]# pwd
/testhadoop

[root@lstrost3 testhadoop]# ls
HadoopTest.java

[root@lstrost3 testhadoop]# export CLASSPATH=`hadoop classpath --glob`:.

[root@lstrost3 testhadoop]# javac HadoopTest.java

[root@lstrost3 testhadoop]# java HadoopTest
HadoopTest  hadoop_file_path  hadoop_user  kerberos_cache

[root@lstrost3 testhadoop]# java HadoopTest /loki/loki.file loki /tmp/krb5cc_1006
18/09/27 23:23:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/09/27 23:23:21 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
Exception in thread "main" java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:551)
    at org.apache.hadoop.crypto.key.kms.KMSClientProvider.decryptEncryptedKey(KMSClientProvider.java:831)
    at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.decryptEncryptedKey(KeyProviderCryptoExtension.java:388)
    at org.apache.hadoop.hdfs.DFSClient.decryptEncryptedDataEncryptionKey(DFSClient.java:1393)
    at org.apache.hadoop.hdfs.DFSClient.createWrappedInputStream(DFSClient.java:1463)
    at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:333)
    at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:327)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:340)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:786)
    at HadoopTest.runRead(HadoopTest.java:18)
    at HadoopTest.main(HadoopTest.java:29)
Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.doSpnegoSequence(KerberosAuthenticator.java:333)
    at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:203)
    at