bharatviswa504 commented on a change in pull request #2000:
URL: https://github.com/apache/ozone/pull/2000#discussion_r600356252



##########
File path: 
hadoop-hdds/framework/src/main/java/org/apache/hadoop/hdds/scm/proxy/SCMBlockLocationFailoverProxyProvider.java
##########
@@ -70,11 +70,20 @@
   private final int maxRetryCount;
   private final long retryInterval;
 
+  private final UserGroupInformation ugi;
+
 
   public SCMBlockLocationFailoverProxyProvider(ConfigurationSource conf) {
     this.conf = conf;
     this.scmVersion = RPC.getProtocolVersion(ScmBlockLocationProtocolPB.class);
 
+    try {
+      this.ugi = UserGroupInformation.getCurrentUser();

Review comment:
       This captures the UGI at the time the FailoverProxyProvider is created; otherwise the UGI would be resolved lazily at proxy-creation time, when the current user on the calling thread may not be the correct one.
   Without this change, we see the following error:
   
   
   ```
   om1_1        | 2021-03-23 05:59:25,420 [IPC Server handler 7 on default port 9862] WARN ipc.Client: Exception encountered while connecting to the server
   om1_1        | javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
   om1_1        |       at jdk.security.jgss/com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
   om1_1        |       at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:408)
   om1_1        |       at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:622)
   om1_1        |       at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:413)
   om1_1        |       at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:822)
   om1_1        |       at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:818)
   om1_1        |       at java.base/java.security.AccessController.doPrivileged(Native Method)
   om1_1        |       at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
   om1_1        |       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
   om1_1        |       at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:818)
   om1_1        |       at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:413)
   om1_1        |       at org.apache.hadoop.ipc.Client.getConnection(Client.java:1636)
   om1_1        |       at org.apache.hadoop.ipc.Client.call(Client.java:1452)
   om1_1        |       at org.apache.hadoop.ipc.Client.call(Client.java:1405)
   om1_1        |       at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
   om1_1        |       at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
   om1_1        |       at com.sun.proxy.$Proxy32.send(Unknown Source)
   om1_1        |       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   om1_1        |       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   om1_1        |       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   om1_1        |       at java.base/java.lang.reflect.Method.invoke(Method.java:566)
   om1_1        |       at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
   om1_1        |       at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
   om1_1        |       at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
   om1_1        |       at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
   om1_1        |       at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
   om1_1        |       at com.sun.proxy.$Proxy32.send(Unknown Source)
   om1_1        |       at org.apache.hadoop.hdds.scm.protocolPB.ScmBlockLocationProtocolClientSideTranslatorPB.submitRequest(ScmBlockLocationProtocolClientSideTranslatorPB.java:118)
   om1_1        |       at org.apache.hadoop.hdds.scm.protocolPB.ScmBlockLocationProtocolClientSideTranslatorPB.allocateBlock(ScmBlockLocationProtocolClientSideTranslatorPB.java:172)
   om1_1        |       at org.apache.hadoop.ozone.om.request.key.OMKeyRequest.allocateBlock(OMKeyRequest.java:128)
   om1_1        |       at org.apache.hadoop.ozone.om.request.key.OMKeyCreateRequest.preExecute(OMKeyCreateRequest.java:151)
   om1_1        |       at org.apache.hadoop.ozone.protocolPB.OzoneManagerProtocolServerSideTranslatorPB.processRequest(OzoneManagerProtocolServerSideTranslatorPB.java:139)
   om1_1        |       at org.apache.hadoop.hdds.server.OzoneProtocolMessageDispatcher.processRequest(OzoneProtocolMessageDispatcher.java:87)
   om1_1        |       at org.apache.hadoop.ozone.protocolPB.OzoneManagerProtocolServerSideTranslatorPB.submitRequest(OzoneManagerProtocolServerSideTranslatorPB.java:122)
   om1_1        |       at org.apache.hadoop.ozone.protocol.proto.OzoneManagerProtocolProtos$OzoneManagerService$2.callBlockingMethod(OzoneManagerProtocolProtos.java)
   om1_1        |       at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:528)
   om1_1        |       at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1086)
   om1_1        |       at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1029)
   om1_1        |       at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:957)
   om1_1        |       at java.base/java.security.AccessController.doPrivileged(Native Method)
   om1_1        |       at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
   om1_1        |       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
   om1_1        |       at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2957)
   om1_1        | Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
   om1_1        |       at java.security.jgss/sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:162)
   om1_1        |       at java.security.jgss/sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:126)
   om1_1        |       at java.security.jgss/sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:193)
   om1_1        |       at java.security.jgss/sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:218)
   om1_1        |       at java.security.jgss/sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:230)
   om1_1        |       at java.security.jgss/sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:196)
   om1_1        |       at jdk.security.jgss/com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
   om1_1        |       ... 42 more
   ```
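
   The capture-at-construction idea can be sketched in isolation. This is a minimal, self-contained illustration, not the actual Ozone/Hadoop classes: `CurrentUser` and `FailoverProxyProviderSketch` are hypothetical stand-ins, and in the real patch the identity is obtained via `UserGroupInformation.getCurrentUser()`. The point is that the identity saved in the constructor, not whichever identity is current on the thread at proxy-creation time, determines the credentials used later.

   ```java
   // Stand-in for UserGroupInformation.getCurrentUser(): a thread-scoped
   // "current user". On a real OM, a server handler thread may be running
   // as a remote client when the proxy is eventually created.
   final class CurrentUser {
       static final ThreadLocal<String> CURRENT =
           ThreadLocal.withInitial(() -> "service-login");
       static String get() { return CURRENT.get(); }
   }

   final class FailoverProxyProviderSketch {
       private final String ugi; // captured once, at construction

       FailoverProxyProviderSketch() {
           // Capture the identity of whoever constructs the provider
           // (the service's login user)...
           this.ugi = CurrentUser.get();
       }

       String createProxy() {
           // ...so later proxy creation uses the captured identity, not
           // whatever identity happens to be current on the calling thread.
           return "proxy-as-" + ugi;
       }
   }

   public class Demo {
       public static void main(String[] args) {
           // Provider built while the service login user is current.
           FailoverProxyProviderSketch provider = new FailoverProxyProviderSketch();

           // Later, the same thread runs on behalf of a remote client that
           // has no Kerberos credentials for SCM.
           CurrentUser.CURRENT.set("remote-client");

           // Without the captured ugi the proxy would be created as
           // "remote-client", which is what triggers the GSSException above.
           System.out.println(provider.createProxy()); // prints "proxy-as-service-login"
       }
   }
   ```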

##########
File path: 
hadoop-hdds/framework/src/main/java/org/apache/hadoop/hdds/scm/proxy/SCMContainerLocationFailoverProxyProvider.java
##########
@@ -70,9 +70,18 @@
   private final int maxRetryCount;
   private final long retryInterval;
 
+  private final UserGroupInformation ugi;
+
 
   public SCMContainerLocationFailoverProxyProvider(ConfigurationSource conf) {
     this.conf = conf;
+
+    try {
+      this.ugi = UserGroupInformation.getCurrentUser();

Review comment:
       Same as above 



