[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.

2020-12-14 Thread Arun Prabu (Jira)


[ 
https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17248902#comment-17248902
 ] 

Arun Prabu commented on HDFS-13965:
---

Thanks for the suggestion. We plan to use a ticket cache file specific to our 
software, as that should solve the issue. We will validate this fully and 
report back.

> hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS 
> encryption is enabled.
> -
>
> Key: HDFS-13965
> URL: https://issues.apache.org/jira/browse/HDFS-13965
> Project: Hadoop HDFS
>  Issue Type: Bug
>  Components: hdfs-client, kms
>Affects Versions: 2.7.3, 2.7.7
>Reporter: LOKESKUMAR VIJAYAKUMAR
>Assignee: Kitti Nanasi
>Priority: Major
>
> We use the hadoop.security.kerberos.ticket.cache.path setting to provide a 
> custom Kerberos ticket cache path so that all Hadoop operations run as the 
> specified user. But this setting is not honored when KMS encryption is enabled.
> The program below reads a file successfully when KMS encryption is not 
> enabled, but fails when KMS encryption is enabled.
> It looks like the hadoop.security.kerberos.ticket.cache.path setting is not 
> honored by createConnection in KMSClientProvider.java.
>  
> HadoopTest.java (CLASSPATH needs to be set to compile and run)
>  
> import java.io.InputStream;
> import java.net.URI;
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
>
> public class HadoopTest {
>     public static int runRead(String[] args) throws Exception {
>         if (args.length < 3) {
>             System.err.println("HadoopTest hadoop_file_path hadoop_user kerberos_cache");
>             return 1;
>         }
>         Path inputPath = new Path(args[0]);
>         Configuration conf = new Configuration();
>         URI defaultURI = FileSystem.getDefaultUri(conf);
>         conf.set("hadoop.security.kerberos.ticket.cache.path", args[2]);
>         FileSystem fs = FileSystem.newInstance(defaultURI, conf, args[1]);
>         InputStream is = fs.open(inputPath);
>         byte[] buffer = new byte[4096];
>         int nr = is.read(buffer);
>         while (nr != -1) {
>             System.out.write(buffer, 0, nr);
>             nr = is.read(buffer);
>         }
>         return 0;
>     }
>
>     public static void main(String[] args) throws Exception {
>         int returnCode = HadoopTest.runRead(args);
>         System.exit(returnCode);
>     }
> }
>  
>  
>  
> [root@lstrost3 testhadoop]# pwd
> /testhadoop
>  
> [root@lstrost3 testhadoop]# ls
> HadoopTest.java
>  
> [root@lstrost3 testhadoop]# export CLASSPATH=`hadoop classpath --glob`:.
>  
> [root@lstrost3 testhadoop]# javac HadoopTest.java
>  
> [root@lstrost3 testhadoop]# java HadoopTest
> HadoopTest  hadoop_file_path  hadoop_user  kerberos_cache
>  
> [root@lstrost3 testhadoop]# java HadoopTest /loki/loki.file loki 
> /tmp/krb5cc_1006
> 18/09/27 23:23:20 WARN util.NativeCodeLoader: Unable to load native-hadoop 
> library for your platform... using builtin-java classes where applicable
> 18/09/27 23:23:21 WARN shortcircuit.DomainSocketFactory: The short-circuit 
> local reads feature cannot be used because libhadoop cannot be loaded.
> Exception in thread "main" java.io.IOException: 
> org.apache.hadoop.security.authentication.client.AuthenticationException: 
> GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
>     at org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:551)
>     at org.apache.hadoop.crypto.key.kms.KMSClientProvider.decryptEncryptedKey(KMSClientProvider.java:831)
>     at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.decryptEncryptedKey(KeyProviderCryptoExtension.java:388)
>     at org.apache.hadoop.hdfs.DFSClient.decryptEncryptedDataEncryptionKey(DFSClient.java:1393)
>     at org.apache.hadoop.hdfs.DFSClient.createWrappedInputStream(DFSClient.java:1463)
>     at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:333)
>     at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:327)
>     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:340)
>     at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:786)
>  

[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.

2020-12-10 Thread Jira


[ 
https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17247381#comment-17247381
 ] 

István Fajth commented on HDFS-13965:
-

Ok, so to refine my understanding:
There is an application running as root which operates with the following flow:
the software starts -> at one point it changes its uid and runs kinit with a 
ticket cache in the default path -> later on, connections to HDFS use that 
ticket cache based on the KRB5CCNAME environment variable.

During HDFS accesses the software is already running as root again.

Auto-renewal happens in the context of these HDFS accesses at some later point 
in time, while the software is running as root, so the ticket cache file's 
ownership is changed by the Kerberos libraries as expected, since the ticket 
should belong to the authenticating user.

What I still don't get: why do you need to kinit as the halsys user if the 
software otherwise uses the ticket cache as root, and with the keytab any user 
can authenticate and obtain a ticket as the halsys user?

What I would do:
- run kinit at software startup as root, specifying 
/etc/mysoftware/ticketcache as the ticket cache
- in the software's environment, set KRB5CCNAME to point to 
/etc/mysoftware/ticketcache

With that we leave the rest of the system alone: we kinit to obtain a ticket 
for the software (since keytab-based auth apparently cannot be used in the C 
library), and the environment variable ensures the software uses a ticket 
cache exclusive to this application. Because the application runs as root, the 
question of who owns the ticket cache goes away: root owns it, the software 
running as root can access it, and nothing else on the system cares about this 
particular cache. The path can be anything, with one exception: it must not be 
the default ticket cache path of any user.
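The startup sequence described above can be sketched as a small shell fragment. The cache path and the halsys principal are the example values used in this thread, not a prescribed layout; the kinit itself is shown as a comment because it requires a Kerberos-enabled host with a readable /etc/krb5.keytab:

```shell
# Run once at software startup, as root.
# Use a cache path private to this application -- never a user's default cache.
CACHE=/etc/mysoftware/ticketcache

# On a Kerberos-enabled host, root would populate the cache from the keytab:
#   kinit -kt /etc/krb5.keytab -c "$CACHE" halsys

# Make every Kerberos-aware component in this process tree (libhdfs, the
# embedded JVM, the KMS client) resolve the same application-private cache.
KRB5CCNAME="$CACHE"
export KRB5CCNAME

echo "$KRB5CCNAME"
```

Because the cache is exclusive to the application and owned by root, the renewal-time ownership change discussed earlier in the thread no longer affects any other user's default cache.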



[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.

2020-11-17 Thread Arun Prabu (Jira)


[ 
https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17233467#comment-17233467
 ] 

Arun Prabu commented on HDFS-13965:
---

Our application has to run as root. So we temporarily change the effective 
user id of our process to the configured OS user and run kinit as that user:

kinit halsys -kt /etc/krb5.keytab -c /tmp/krb5cc_1099

After kinit, the HDFS connection is created as user halsys using the libhdfs C 
APIs.

After running kinit, the effective user id of the process is elevated back to 
root.

When the auto-renewal thread in the Java layer detects it is time to renew, 
the ticket gets renewed and the owner of /tmp/krb5cc_1099 is changed to root.

Say an arbitrary ticket cache is used for user halsys (uid 1099). Per the 
above suggestion, if an arbitrary ticket cache path like 
/custom_path/krb5cc_1099 is used:

kinit halsys -kt /etc/krb5.keytab -c /custom_path/krb5cc_1099

will the auto-renewal thread stop renewing the ticket, or will the ownership 
not be changed? What is the expected behavior?

 


[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.

2020-10-26 Thread Jira


[ 
https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17220582#comment-17220582
 ] 

István Fajth commented on HDFS-13965:
-

I believe you should not use the default ticket cache path for an application 
that runs as root but uses another user's ticket cache. Renewal has to happen 
at some point in time, so you should use root's own ticket cache or an 
arbitrary ticket cache, but not the other user's default one, even though root 
can read it.

At the end of the day, I believe your application should use keytab-based 
authentication via JAAS instead of being handed a ticket cache.

As a possible interim solution you can use an arbitrary ticket cache (but not 
the default one) into which you fetch the halsys principal's ticket with 
kinit, and point the application at it with KRB5CCNAME. But I am convinced 
this is not a good solution; if I were you I would do whatever it takes to get 
a keytab-based authentication model working in the application, as that would 
let it run as any arbitrary OS user with any principal's keytab and would not 
require running as root.
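To illustrate why keytab-based authentication removes the root requirement: obtaining a ticket from a keytab needs only read access to the keytab file, regardless of which OS user runs the command. A hedged sketch follows; the per-application keytab path and cache path are hypothetical examples, and the Kerberos commands are shown as comments since they need a Kerberos-enabled host:

```shell
# Any OS user with read access to the keytab can authenticate as the
# principal it contains -- no root privileges, no shared ticket cache.
KEYTAB=/etc/mysoftware/halsys.keytab   # hypothetical per-application keytab
PRINCIPAL=halsys
CACHE=/tmp/halsys_app_cc               # hypothetical app-private cache

# On a Kerberos-enabled host, running as any unprivileged service account:
#   kinit -kt "$KEYTAB" -c "$CACHE" "$PRINCIPAL"
#   klist -c "$CACHE"                  # inspect the obtained TGT

echo "kinit -kt $KEYTAB -c $CACHE $PRINCIPAL"
```

The same property is what JAAS keytab login gives the application in-process: credentials are re-acquired from the keytab on expiry, so no external cache file ownership ever has to change.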


[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.

2020-06-18 Thread LOKESKUMAR VIJAYAKUMAR (Jira)


[ 
https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17139552#comment-17139552
 ] 

LOKESKUMAR VIJAYAKUMAR commented on HDFS-13965:
---

Hello Team!
Can anyone please help here?


[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.

2020-06-08 Thread LOKESKUMAR VIJAYAKUMAR (Jira)


[ 
https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17128745#comment-17128745
 ] 

LOKESKUMAR VIJAYAKUMAR commented on HDFS-13965:
---

Hello Kitti Nanasi,
We implemented the workaround you suggested (setting the KRB5CCNAME 
environment variable to the hadoop user's Kerberos cache path).
But this causes the hadoop user's ticket cache to be renewed automatically as 
the root user when the cache is about to expire or has already expired, which 
changes the ownership of the ticket cache from the hadoop user to root. As a 
result, further kinit logins fail for the hadoop user, because the ticket 
cache path is now owned by root.

Is there any way to avoid this? What should we do to not run into this issue?
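One way to catch this failure mode before kinit fails is to check the cache file's ownership up front. A minimal sketch; the real cache path would be the thread's /tmp/krb5cc_1099, but here a temporary file stands in so the check can be demonstrated anywhere:

```shell
# Before running kinit as the hadoop user, verify the ticket cache is still
# owned by that user; a renewal performed as root rewrites the file as root.
CACHE=$(mktemp)   # stand-in for /tmp/krb5cc_1099

check_cache_owner() {
  # -O: file exists and is owned by the effective uid of this shell
  if [ ! -e "$1" ] || [ -O "$1" ]; then
    echo "cache-ok: safe to kinit"
  else
    echo "cache-owned-by-another-user: a renewal as root likely rewrote it"
  fi
}

check_cache_owner "$CACHE"
rm -f "$CACHE"
```

If the check fails, the cache was taken over by root and must be removed (or chowned back) before the hadoop user can kinit into it again.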





20/05/21 21:01:53 DEBUG security.UserGroupInformation: hadoop login
20/05/21 21:01:53 DEBUG security.UserGroupInformation: hadoop login commit
20/05/21 21:01:53 DEBUG security.UserGroupInformation: using kerberos 
user:hal...@cinfin.com
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Using user: 
"hal...@cinfin.com" with name hal...@cinfin.com
20/05/21 21:01:53 DEBUG security.UserGroupInformation: User entry: 
"hal...@cinfin.com"
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Assuming keytab is 
managed externally since logged in from subject.
20/05/21 21:01:53 DEBUG security.UserGroupInformation: UGI 
loginUser:hal...@cinfin.com (auth:KERBEROS)
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Found tgt Ticket (hex) =
 
Client Principal = hal...@cinfin.com
Server Principal = krbtgt/cinfin@cinfin.com
 
Forwardable Ticket true
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket true
Initial Ticket true
Auth Time = Thu May 21 03:09:53 EDT 2020
Start Time = Thu May 21 11:10:09 EDT 2020
End Time = Thu May 21 21:10:09 EDT 2020
Renew Till = Thu May 28 03:09:53 EDT 2020
Client Addresses  Null
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Current time is 
1590109313240
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Next refresh is 
1590102609000
20/05/21 21:01:53 DEBUG security.UserGroupInformation: renewed ticket
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Initiating logout for 
hal...@cinfin.com
20/05/21 21:01:53 DEBUG security.UserGroupInformation: hadoop logout
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Initiating re-login for 
hal...@cinfin.com
20/05/21 21:01:53 DEBUG security.UserGroupInformation: hadoop login
20/05/21 21:01:53 DEBUG security.UserGroupInformation: hadoop login commit



[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.

2018-12-21 Thread Kitti Nanasi (JIRA)


[ 
https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16726756#comment-16726756
 ] 

Kitti Nanasi commented on HDFS-13965:
-

[~lokeskumarp], I think it is possible to fix, but it is not a trivial change, 
so until it is fixed you can work around this problem by setting the KRB5CCNAME 
environment variable to the path of the ticket cache.
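Concretely, the workaround amounts to exporting KRB5CCNAME before launching the client, so the JVM's Kerberos login resolves the same cache the configuration property was meant to select. A sketch using this thread's example cache path (the kinit populating the cache and the client invocation need a real Kerberos setup, so they appear as comments):

```shell
# Point every Kerberos-aware component -- including KMSClientProvider, which
# ignores hadoop.security.kerberos.ticket.cache.path -- at the same cache.
KRB5CCNAME=/tmp/krb5cc_1006
export KRB5CCNAME

# On a Kerberos-enabled host one would then run the client as before, e.g.:
#   java HadoopTest /loki/loki.file loki /tmp/krb5cc_1006

echo "$KRB5CCNAME"
```

Note the caveat raised later in this thread: with the renewal thread running under a different uid, this workaround can change the cache file's ownership.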


[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.

2018-12-20 Thread LOKESKUMAR VIJAYAKUMAR (JIRA)


[ 
https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16726300#comment-16726300
 ] 

LOKESKUMAR VIJAYAKUMAR commented on HDFS-13965:
---

Hello [~knanasi],

Thanks for checking this issue.

Our software is implemented using the corresponding C APIs provided by libhdfs. 
Our service runs as the root user; when connecting to the Hadoop cluster, we do 
a Kerberos login as the Hadoop user and use that ticket cache to access the data. 
Would it be possible to fix this so that the API works as expected in all cases?

Thanks,
Lokes



[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.

2018-12-17 Thread Kitti Nanasi (JIRA)


[ 
https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16723104#comment-16723104
 ] 

Kitti Nanasi commented on HDFS-13965:
-

[~jojochuang], you are correct: the problem is that KerberosConfiguration does 
not use the ticket cache set in the configuration.

A workaround is to set the "KRB5CCNAME" environment variable to the ticket 
cache path. However, the root user using another user's ticket cache to read 
that user's encryption zone does not seem like a usual scenario to me. You 
might want to consider running your script in an Oozie workflow, which can run 
it on behalf of the other user using delegation tokens. [~lokeskumarp], let me 
know if you have questions.
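As an illustration of that workaround (the cache path /tmp/krb5cc_1006 and the 
HadoopTest invocation are taken from the reproducer in this issue; adjust both 
to your environment):

```shell
# KerberosConfiguration reads only the KRB5CCNAME environment variable and
# ignores hadoop.security.kerberos.ticket.cache.path, so export it before
# launching the client that has to decrypt KMS-protected data.
export KRB5CCNAME=/tmp/krb5cc_1006
echo "KRB5CCNAME=$KRB5CCNAME"
# then run, e.g.: java HadoopTest /loki/loki.file loki /tmp/krb5cc_1006
```

Note this only helps when one process (or one ticket cache) per user is 
acceptable, since the environment variable applies to the whole JVM.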


[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.

2018-12-11 Thread Wei-Chiu Chuang (JIRA)


[ 
https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16718204#comment-16718204
 ] 

Wei-Chiu Chuang commented on HDFS-13965:


Thanks for reporting the issue, [~lokeskumarp].

It looks like part of the issue is that the KerberosConfiguration class (which 
is used by the KMS client to authenticate) hard-codes the ticket cache path:
{code:java}
String ticketCache = System.getenv("KRB5CCNAME"); // <-- this line
if (IBM_JAVA) {
  USER_KERBEROS_OPTIONS.put("useDefaultCcache", "true");
} else {
  USER_KERBEROS_OPTIONS.put("doNotPrompt", "true");
  USER_KERBEROS_OPTIONS.put("useTicketCache", "true");
}
if (ticketCache != null) {
  if (IBM_JAVA) {
// The first value searched when "useDefaultCcache" is used.
System.setProperty("KRB5CCNAME", ticketCache);
  } else {
USER_KERBEROS_OPTIONS.put("ticketCache", ticketCache);
  }
}{code}
So it always uses the {{KRB5CCNAME}} environment variable. If we make it 
configurable, just as in {{FileSystem#get(URI, Configuration, String)}}, this 
issue should go away.

[~knanasi], what do you think? Thanks.
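
To make the proposal concrete, here is a minimal, hypothetical sketch (plain 
Java, not Hadoop's actual KerberosConfiguration; the class name 
TicketCacheResolver and the Map-based stand-in for Configuration are invented 
for illustration) of resolving the cache path from the configuration first and 
only then falling back to KRB5CCNAME:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch, not Hadoop code: prefer the ticket cache path from the
// client Configuration (modeled here as a plain Map) and fall back to the
// KRB5CCNAME environment variable, which is all the current code consults.
public class TicketCacheResolver {
    static final String CONF_KEY = "hadoop.security.kerberos.ticket.cache.path";

    static String resolve(Map<String, String> conf, Map<String, String> env) {
        String fromConf = conf.get(CONF_KEY);
        if (fromConf != null && !fromConf.isEmpty()) {
            return fromConf;          // honor the configured path, as HDFS reads do
        }
        return env.get("KRB5CCNAME"); // today's behavior: environment only
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put(CONF_KEY, "/tmp/krb5cc_1006");
        Map<String, String> env = new HashMap<>();
        env.put("KRB5CCNAME", "/tmp/krb5cc_0");
        // The configured path wins over the environment variable.
        System.out.println(resolve(conf, env));
    }
}
```

With this ordering, the reporter's conf.set("hadoop.security.kerberos.ticket.cache.path", args[2]) 
call would take effect for the KMS connection as well, while callers that set 
only KRB5CCNAME would keep today's behavior.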


[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.

2018-10-31 Thread LOKESKUMAR VIJAYAKUMAR (JIRA)


[ 
https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16670381#comment-16670381
 ] 

LOKESKUMAR VIJAYAKUMAR commented on HDFS-13965:
---

Did anyone get a chance to check this?
This has been open for quite some time now.
