[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.
[ https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17248902#comment-17248902 ]

Arun Prabu commented on HDFS-13965:
-----------------------------------

Thanks for the suggestion. We are planning to use a ticket cache file specific to our software, as that should solve the issue. We will validate this completely and get back.

> hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS
> encryption is enabled.
> --------------------------------------------------------------------------
>
>                 Key: HDFS-13965
>                 URL: https://issues.apache.org/jira/browse/HDFS-13965
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: hdfs-client, kms
>    Affects Versions: 2.7.3, 2.7.7
>            Reporter: LOKESKUMAR VIJAYAKUMAR
>            Assignee: Kitti Nanasi
>            Priority: Major
>
> We use the hadoop.security.kerberos.ticket.cache.path setting to provide a
> custom Kerberos ticket cache path so that all Hadoop operations run as the
> specified user. But this setting is not honored when KMS encryption is enabled.
> The program below reads a file successfully when KMS encryption is not
> enabled, but fails when KMS encryption is enabled. It looks like the
> hadoop.security.kerberos.ticket.cache.path setting is not honored by
> createConnection in KMSClientProvider.java.
>
> HadoopTest.java (CLASSPATH needs to be set to compile and run):
>
> import java.io.InputStream;
> import java.net.URI;
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
>
> public class HadoopTest {
>     public static int runRead(String[] args) throws Exception {
>         if (args.length < 3) {
>             System.err.println("HadoopTest hadoop_file_path hadoop_user kerberos_cache");
>             return 1;
>         }
>         Path inputPath = new Path(args[0]);
>         Configuration conf = new Configuration();
>         URI defaultURI = FileSystem.getDefaultUri(conf);
>         conf.set("hadoop.security.kerberos.ticket.cache.path", args[2]);
>         FileSystem fs = FileSystem.newInstance(defaultURI, conf, args[1]);
>         InputStream is = fs.open(inputPath);
>         byte[] buffer = new byte[4096];
>         int nr = is.read(buffer);
>         while (nr != -1) {
>             System.out.write(buffer, 0, nr);
>             nr = is.read(buffer);
>         }
>         return 0;
>     }
>
>     public static void main(String[] args) throws Exception {
>         int returnCode = HadoopTest.runRead(args);
>         System.exit(returnCode);
>     }
> }
>
> [root@lstrost3 testhadoop]# pwd
> /testhadoop
>
> [root@lstrost3 testhadoop]# ls
> HadoopTest.java
>
> [root@lstrost3 testhadoop]# export CLASSPATH=`hadoop classpath --glob`:.
>
> [root@lstrost3 testhadoop]# javac HadoopTest.java
>
> [root@lstrost3 testhadoop]# java HadoopTest
> HadoopTest hadoop_file_path hadoop_user kerberos_cache
>
> [root@lstrost3 testhadoop]# java HadoopTest /loki/loki.file loki /tmp/krb5cc_1006
> 18/09/27 23:23:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 18/09/27 23:23:21 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
> Exception in thread "main" java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
>         at org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:551)
>         at org.apache.hadoop.crypto.key.kms.KMSClientProvider.decryptEncryptedKey(KMSClientProvider.java:831)
>         at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.decryptEncryptedKey(KeyProviderCryptoExtension.java:388)
>         at org.apache.hadoop.hdfs.DFSClient.decryptEncryptedDataEncryptionKey(DFSClient.java:1393)
>         at org.apache.hadoop.hdfs.DFSClient.createWrappedInputStream(DFSClient.java:1463)
>         at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:333)
>         at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:327)
>         at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:340)
>         at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:786)
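A possible workaround, sketched here as an assumption rather than something the thread confirms: instead of relying on hadoop.security.kerberos.ticket.cache.path being propagated to the KMS client, build a UserGroupInformation directly from the ticket cache with UserGroupInformation.getUGIFromTicketCache and perform the filesystem calls inside doAs, so the KMS client inherits the same Kerberos credentials. The class name, principal, cache path, and file path below are illustrative values taken from the transcript, not prescribed by the issue:

```java
import java.io.InputStream;
import java.security.PrivilegedExceptionAction;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class TicketCacheRead {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        UserGroupInformation.setConfiguration(conf);
        // Build a UGI directly from an explicit ticket cache file
        // ("loki" and the cache path are illustrative values).
        UserGroupInformation ugi =
            UserGroupInformation.getUGIFromTicketCache("/tmp/krb5cc_1006", "loki");
        ugi.doAs((PrivilegedExceptionAction<Void>) () -> {
            // FileSystem (and the KMS client it creates) is constructed
            // inside doAs, so it should pick up this UGI's credentials.
            FileSystem fs = FileSystem.get(FileSystem.getDefaultUri(conf), conf);
            try (InputStream is = fs.open(new Path("/loki/loki.file"))) {
                byte[] buffer = new byte[4096];
                int nr;
                while ((nr = is.read(buffer)) != -1) {
                    System.out.write(buffer, 0, nr);
                }
            }
            return null;
        });
    }
}
```

Whether this fully sidesteps the KMSClientProvider.createConnection path on 2.7.x would need to be verified against the affected versions.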
[jira] [Commented] (HDFS-13965) hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS encryption is enabled.
[ https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17233467#comment-17233467 ]

Arun Prabu commented on HDFS-13965:
-----------------------------------

Our application has to run as root, so we temporarily change the effective user ID of the process to the configured OS user and run kinit as that user:

    kinit halsys -kt /etc/krb5.keytab -c /tmp/krb5cc_1099

After kinit, the HDFS connection is created as user halsys using the libhdfs C APIs, and the effective user ID of the process is elevated back to root. When the auto-renewal thread in the Java layer detects that it is time to renew, the ticket gets renewed, but the owner of /tmp/krb5cc_1099 is changed to root.

Say an arbitrary ticket cache is used for user halsys (uid 1099). Per the suggestion above, if an arbitrary ticket cache path such as /custom_path/krb5cc_1099 is used:

    kinit halsys -kt /etc/krb5.keytab -c /custom_path/krb5cc_1099

will the auto-renewal thread stop renewing the ticket, or will the ownership no longer be changed? What is the expected behavior?
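One alternative worth noting, offered here as an assumption rather than an answer from the thread: logging in from the keytab inside the JVM with UserGroupInformation.loginUserFromKeytabAndReturnUGI avoids the external kinit and the ticket cache file entirely, so there is no /tmp/krb5cc_* ownership for the renewal thread to change; renewal is then done from the keytab via checkTGTAndReloginFromKeytab. The principal, keytab path, and file path are illustrative values from the comment above, and this assumes the keytab is readable by the process:

```java
import java.security.PrivilegedExceptionAction;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class KeytabLogin {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        UserGroupInformation.setConfiguration(conf);
        // Log in from the keytab inside the JVM; no ticket cache file is
        // written, so no cache ownership is changed by renewal.
        UserGroupInformation ugi = UserGroupInformation
            .loginUserFromKeytabAndReturnUGI("halsys", "/etc/krb5.keytab");
        ugi.doAs((PrivilegedExceptionAction<Void>) () -> {
            // Re-login from the keytab if the TGT is close to expiring;
            // this replaces the external kinit-based renewal.
            ugi.checkTGTAndReloginFromKeytab();
            FileSystem fs = FileSystem.get(conf);
            System.out.println(fs.exists(new Path("/loki/loki.file")));
            return null;
        });
    }
}
```

This does not answer how the auto-renewal thread treats an arbitrary cache path; it sidesteps the question by not using a cache file at all.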