cxzl25 opened a new pull request, #5834:
URL: https://github.com/apache/paimon/pull/5834

   ### Purpose
   
   <!-- Linking this pull request to the issue -->
   Linked issue: close #xxx
   
   <!-- What is the purpose of the change -->
   
   A Spark executor attempts to connect to HMS, but the connection fails because the executor has no Kerberos authentication information for HMS.
   
   Due to the changes in https://github.com/apache/paimon/pull/4010, `HiveCatalog` connects to HMS as soon as it is initialized.
   
   In this PR, the initialization only connects to HMS when SASL is not enabled (see the sketch after the Hive excerpt below).
   
   ```java
   ERROR TSaslTransport: SASL negotiation failure
   javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:95)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:38)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1743)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:486)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:246)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1740)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:89)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.paimon.hive.RetryingMetaStoreClientFactory.lambda$static$7(RetryingMetaStoreClientFactory.java:136)
        at org.apache.paimon.hive.RetryingMetaStoreClientFactory.createClient(RetryingMetaStoreClientFactory.java:160)
        at org.apache.paimon.hive.pool.HiveClientPool.lambda$clientSupplier$0(HiveClientPool.java:46)
        at org.apache.paimon.client.ClientPool$ClientPoolImpl.<init>(ClientPool.java:52)
        at org.apache.paimon.hive.pool.HiveClientPool.<init>(HiveClientPool.java:39)
        at org.apache.paimon.hive.pool.CachedClientPool.lambda$clientPool$1(CachedClientPool.java:98)
        at org.apache.paimon.shade.caffeine2.com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2406)
        at java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1853)
        at org.apache.paimon.shade.caffeine2.com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2404)
        at org.apache.paimon.shade.caffeine2.com.github.benmanes.caffeine.cache.BoundedLocalCache.computeIfAbsent(BoundedLocalCache.java:2387)
        at org.apache.paimon.shade.caffeine2.com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:108)
        at org.apache.paimon.shade.caffeine2.com.github.benmanes.caffeine.cache.LocalManualCache.get(LocalManualCache.java:62)
        at org.apache.paimon.hive.pool.CachedClientPool.clientPool(CachedClientPool.java:97)
        at org.apache.paimon.hive.pool.CachedClientPool.run(CachedClientPool.java:133)
        at org.apache.paimon.hive.pool.CachedClientPool.<init>(CachedClientPool.java:86)
        at org.apache.paimon.hive.HiveCatalog.<init>(HiveCatalog.java:190)
        at org.apache.paimon.hive.HiveCatalogLoader.load(HiveCatalogLoader.java:52)
        at org.apache.paimon.tag.SnapshotLoaderImpl.load(SnapshotLoaderImpl.java:45)
        at org.apache.paimon.utils.SnapshotManager.latestSnapshot(SnapshotManager.java:169)
        at org.apache.paimon.operation.AbstractFileStoreWrite.createWriterContainer(AbstractFileStoreWrite.java:436)
        at org.apache.paimon.operation.AbstractFileStoreWrite.lambda$getWriterWrapper$5(AbstractFileStoreWrite.java:414)
        at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
        at org.apache.paimon.operation.AbstractFileStoreWrite.getWriterWrapper(AbstractFileStoreWrite.java:413)
        at org.apache.paimon.operation.AbstractFileStoreWrite.write(AbstractFileStoreWrite.java:160)
        at org.apache.paimon.table.sink.TableWriteImpl.writeAndReturn(TableWriteImpl.java:187)
        at org.apache.paimon.table.sink.TableWriteImpl.write(TableWriteImpl.java:152)
        at org.apache.paimon.spark.SparkTableWrite.write(SparkTableWrite.scala:44)
        at org.apache.paimon.spark.commands.PaimonSparkWriter.$anonfun$write$4(PaimonSparkWriter.scala:124)
        at org.apache.paimon.spark.commands.PaimonSparkWriter.$anonfun$write$4$adapted(PaimonSparkWriter.scala:124)
        at scala.collection.Iterator.foreach(Iterator.scala:943)
        at scala.collection.Iterator.foreach$(Iterator.scala:943)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   ```
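
   For context, the trace above comes from a plain executor-side write. A minimal, hypothetical repro, assuming a Kerberized HMS and a Paimon Hive catalog wired into the Spark session (catalog name, URI, and warehouse path below are placeholders):

   ```java
   import org.apache.spark.sql.SparkSession;

   public class KerberizedHmsRepro {
       public static void main(String[] args) {
           // Hypothetical wiring; adjust to your environment.
           SparkSession spark =
                   SparkSession.builder()
                           .appName("paimon-hms-sasl-repro")
                           .config("spark.sql.catalog.paimon", "org.apache.paimon.spark.SparkCatalog")
                           .config("spark.sql.catalog.paimon.metastore", "hive")
                           .config("spark.sql.catalog.paimon.uri", "thrift://hms-host:9083")
                           .config("spark.sql.catalog.paimon.warehouse", "hdfs:///paimon")
                           .getOrCreate();

           // The INSERT runs SparkTableWrite.write on executors; per the trace,
           // that path loads a HiveCatalog, which eagerly opened an HMS
           // connection without a Kerberos TGT on the executor and failed.
           spark.sql("INSERT INTO paimon.db.t VALUES (1, 'a')");
       }
   }
   ```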

   For reference, `HiveMetaStoreClient#open` only calls `set_ugi` in unsecured mode:

   https://github.com/apache/hive/blob/895f3f3f5c5a1f5b4ecbba5af63e6a60db083dfc/standalone-metastore/metastore-common/src/main/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java#L857-L862
   
   ```java
   if (isConnected && !useSasl && !usePasswordAuth && !isHttpTransportMode &&
           MetastoreConf.getBoolVar(conf, ConfVars.EXECUTE_SET_UGI)) {
     // Call set_ugi, only in unsecure mode.
     try {
       UserGroupInformation ugi = SecurityUtils.getUGI();
       client.set_ugi(ugi.getUserName(), Arrays.asList(ugi.getGroupNames()));
   ```

   In other words, the Hive client itself only issues `set_ugi` when SASL is disabled, which is the same condition this PR checks before connecting during initialization.

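   A minimal sketch of the guard, assuming the eager connection happens during `HiveCatalog` initialization (class and method names below are hypothetical, not the exact patch):

   ```java
   import org.apache.hadoop.conf.Configuration;

   /** Hypothetical sketch of the guard; not the exact change in this PR. */
   final class EagerHmsConnection {

       /**
        * Only warm up the HMS connection during catalog initialization when
        * SASL (Kerberos) is disabled; otherwise defer until a caller holding
        * valid credentials actually needs HMS.
        */
       static boolean shouldConnectEagerly(Configuration conf) {
           // "hive.metastore.sasl.enabled" is the Hive key behind
           // HiveConf.ConfVars.METASTORE_USE_THRIFT_SASL.
           return !conf.getBoolean("hive.metastore.sasl.enabled", false);
       }
   }
   ```

   With SASL enabled, the connection is simply deferred to the first real metastore call.
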
   ### Tests
   
   <!-- List UT and IT cases to verify this change -->
   
   ### API and Format
   
   <!-- Does this change affect API or storage format -->
   
   ### Documentation
   
   <!-- Does this change introduce a new feature -->
   

