[
https://issues.apache.org/jira/browse/HDFS-13965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17128745#comment-17128745
]
LOKESKUMAR VIJAYAKUMAR commented on HDFS-13965:
-----------------------------------------------
Hello Kitti Nanasi,
We implemented the workaround you suggested (setting the KRB5CCNAME
environment variable to the hadoop user's Kerberos cache path).
However, this causes the hadoop user's ticket cache to be renewed
automatically as the root user when the cache is about to expire or has
already expired, which changes the ownership of the ticket cache file from the
hadoop user to root. As a result, subsequent kinit logins fail for the hadoop
user, because the ticket cache path is now owned by root.
Is there any way to avoid this issue? What should we do to not run into it?
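One possible workaround (a sketch, not a fix for the underlying bug: it assumes the renewal is triggered from root's environment and uses the cache path from the report, /tmp/krb5cc_1006) is to perform the renewal as the hadoop user itself so the cache file's ownership is never changed:

```shell
# Sketch: renew the ticket cache as the hadoop user instead of root,
# so /tmp/krb5cc_1006 keeps its hadoop-user ownership.
# Could be run from the hadoop user's crontab, or via sudo -u from root's.
sudo -u hadoop kinit -R -c /tmp/krb5cc_1006

# Alternatively, restore ownership after a root-side renewal
# (assumes the hadoop user's group is also named "hadoop"):
# chown hadoop:hadoop /tmp/krb5cc_1006
```

Note that `kinit -R` only works while the ticket is still within its renewable lifetime (Renew Till in the debug output above), so the renewal has to run before the cache fully expires.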
20/05/21 21:01:53 DEBUG security.UserGroupInformation: hadoop login
20/05/21 21:01:53 DEBUG security.UserGroupInformation: hadoop login commit
20/05/21 21:01:53 DEBUG security.UserGroupInformation: using kerberos
user:[email protected]
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Using user:
"[email protected]" with name [email protected]
20/05/21 21:01:53 DEBUG security.UserGroupInformation: User entry:
"[email protected]"
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Assuming keytab is
managed externally since logged in from subject.
20/05/21 21:01:53 DEBUG security.UserGroupInformation: UGI
loginUser:[email protected] (auth:KERBEROS)
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Found tgt Ticket (hex) =
Client Principal = [email protected]
Server Principal = krbtgt/[email protected]
Forwardable Ticket true
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket true
Initial Ticket true
Auth Time = Thu May 21 03:09:53 EDT 2020
Start Time = Thu May 21 11:10:09 EDT 2020
End Time = Thu May 21 21:10:09 EDT 2020
Renew Till = Thu May 28 03:09:53 EDT 2020
Client Addresses Null
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Current time is
1590109313240
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Next refresh is
1590102609000
20/05/21 21:01:53 DEBUG security.UserGroupInformation: renewed ticket
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Initiating logout for
[email protected]
20/05/21 21:01:53 DEBUG security.UserGroupInformation: hadoop logout
20/05/21 21:01:53 DEBUG security.UserGroupInformation: Initiating re-login for
[email protected]
20/05/21 21:01:53 DEBUG security.UserGroupInformation: hadoop login
20/05/21 21:01:53 DEBUG security.UserGroupInformation: hadoop login commit
> hadoop.security.kerberos.ticket.cache.path setting is not honored when KMS
> encryption is enabled.
> -------------------------------------------------------------------------------------------------
>
> Key: HDFS-13965
> URL: https://issues.apache.org/jira/browse/HDFS-13965
> Project: Hadoop HDFS
> Issue Type: Bug
> Components: hdfs-client, kms
> Affects Versions: 2.7.3, 2.7.7
> Reporter: LOKESKUMAR VIJAYAKUMAR
> Assignee: Kitti Nanasi
> Priority: Major
>
> _We use the *+hadoop.security.kerberos.ticket.cache.path+* setting to provide
> a custom Kerberos cache path so that all Hadoop operations run as the
> specified user. However, this setting is not honored when KMS encryption is
> enabled._
> _The program below, which reads a file, works when KMS encryption is not
> enabled, but fails when KMS encryption is enabled._
> _It looks like the *hadoop.security.kerberos.ticket.cache.path* setting is
> not honored by *createConnection in KMSClientProvider.java.*_
>
> HadoopTest.java (CLASSPATH needs to be set to compile and run)
>
> import java.io.InputStream;
> import java.net.URI;
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
>
> public class HadoopTest {
>     public static int runRead(String[] args) throws Exception {
>         if (args.length < 3) {
>             System.err.println("HadoopTest hadoop_file_path hadoop_user kerberos_cache");
>             return 1;
>         }
>         Path inputPath = new Path(args[0]);
>         Configuration conf = new Configuration();
>         URI defaultURI = FileSystem.getDefaultUri(conf);
>         // Point all Kerberos-authenticated operations at the given ticket cache.
>         conf.set("hadoop.security.kerberos.ticket.cache.path", args[2]);
>         FileSystem fs = FileSystem.newInstance(defaultURI, conf, args[1]);
>         InputStream is = fs.open(inputPath);
>         byte[] buffer = new byte[4096];
>         int nr = is.read(buffer);
>         while (nr != -1) {
>             System.out.write(buffer, 0, nr);
>             nr = is.read(buffer);
>         }
>         is.close();
>         fs.close();
>         return 0;
>     }
>
>     public static void main(String[] args) throws Exception {
>         int returnCode = HadoopTest.runRead(args);
>         System.exit(returnCode);
>     }
> }
>
>
>
> [root@lstrost3 testhadoop]# pwd
> /testhadoop
>
> [root@lstrost3 testhadoop]# ls
> HadoopTest.java
>
> [root@lstrost3 testhadoop]# export CLASSPATH=`hadoop classpath --glob`:.
>
> [root@lstrost3 testhadoop]# javac HadoopTest.java
>
> [root@lstrost3 testhadoop]# java HadoopTest
> HadoopTest hadoop_file_path hadoop_user kerberos_cache
>
> [root@lstrost3 testhadoop]# java HadoopTest /loki/loki.file loki /tmp/krb5cc_1006
> 18/09/27 23:23:20 WARN util.NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
> 18/09/27 23:23:21 WARN shortcircuit.DomainSocketFactory: The short-circuit
> local reads feature cannot be used because libhadoop cannot be loaded.
> Exception in thread "main" java.io.IOException:
> org.apache.hadoop.security.authentication.client.AuthenticationException:
> GSSException: *{color:#FF0000}No valid credentials provided (Mechanism level:
> Failed to find any Kerberos tgt){color}*
> at
> {color:#FF0000}*org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:551)*{color}
> at
> org.apache.hadoop.crypto.key.kms.KMSClientProvider.decryptEncryptedKey(KMSClientProvider.java:831)
> at
> org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.decryptEncryptedKey(KeyProviderCryptoExtension.java:388)
> at
> org.apache.hadoop.hdfs.DFSClient.decryptEncryptedDataEncryptionKey(DFSClient.java:1393)
> at
> org.apache.hadoop.hdfs.DFSClient.createWrappedInputStream(DFSClient.java:1463)
> at
> org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:333)
> at
> org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:327)
> at
> org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> at
> org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:340)
> at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:786)
> at HadoopTest.runRead(HadoopTest.java:18)
> at HadoopTest.main(HadoopTest.java:29)
> Caused by:
> org.apache.hadoop.security.authentication.client.AuthenticationException:
> GSSException: No valid credentials provided (Mechanism level: Failed to find
> any Kerberos tgt)
> at
> org.apache.hadoop.security.authentication.client.KerberosAuthenticator.doSpnegoSequence(KerberosAuthenticator.java:333)
> at
> org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:203)
> at
> org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.authenticate(DelegationTokenAuthenticator.java:133)
> at
> org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:216)
> at
> org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticatedURL.openConnection(DelegationTokenAuthenticatedURL.java:322)
> at
> org.apache.hadoop.crypto.key.kms.KMSClientProvider$1.run(KMSClientProvider.java:542)
> at
> org.apache.hadoop.crypto.key.kms.KMSClientProvider$1.run(KMSClientProvider.java:537)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
> at
> org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:536)
> ... 11 more
> Caused by: GSSException: No valid credentials provided (Mechanism level:
> Failed to find any Kerberos tgt)
> at
> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
> at
> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
> at
> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
> at
> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
> at
> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
> at
> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
> at
> org.apache.hadoop.security.authentication.client.KerberosAuthenticator$1.run(KerberosAuthenticator.java:309)
> at
> org.apache.hadoop.security.authentication.client.KerberosAuthenticator$1.run(KerberosAuthenticator.java:285)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at
> org.apache.hadoop.security.authentication.client.KerberosAuthenticator.doSpnegoSequence(KerberosAuthenticator.java:285)
> ... 21 more
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)