Nicholas-ehui opened a new issue, #22643: URL: https://github.com/apache/doris/issues/22643
### Search before asking

- [X] I had searched in the [issues](https://github.com/apache/doris/issues?q=is%3Aissue) and found no similar issues.

### Version

1.2.3 RC2

### What's Wrong?

When loading Parquet files from a Kerberos-secured HDFS via broker load (with a single broker), loads from some HDFS paths fail, e.g. a path with a wildcard partition:

```
hdfs://x.x.108.68:8020/user/mcbigdata/hive_db/cdm.db/dwd_fico_income_sku_cost/ds=2023-06-28/type=*/*
```

However, a more specific path loads successfully, e.g.:

```
hdfs://x.x.108.68:8020/user/mcbigdata/hive_db/cdm.db/dwd_fico_income_sku_cost/ds=2023-06-28/type=VI/*
```

Broker log:

```
2023-08-06 16:18:49 [ TThreadPoolServer WorkerProcess-%d:340832861 ] - [ INFO ] file system BrokerFileSystem [identity=org.apache.doris.broker.hdfs.FileSystemIdentity@717800c7, dfsFileSystem=DFS[DFSClient[clientName=DFSClient_NONMAPREDUCE_520979005_80, [email protected] (auth:KERBEROS)]], fileSystemId=1bcf3ba9-ff0c-4dc6-82a6-9002fc77ac42] is expired, update it.
2023-08-06 16:18:49 [ TThreadPoolServer WorkerProcess-%d:340832871 ] - [ INFO ] receive a open reader request, request detail: TBrokerOpenReaderRequest(version:VERSION_ONE, path:hdfs://x.x.108.68:8020/user/mcbigdata/hive_db/cdm.db/dwd_fico_income_sku_cost/ds=2023-06-28/type=I/part-00361-5bda1cf8-a4d1-4bb3-9461-65fe9282b11b.c000, startOffset:0, clientId:x.x.98.191:29984, properties:{hadoop.security.authentication=kerberos, [email protected], _DORIS_STORAGE_TYPE_=BROKER, kerberos_keytab=/home/omm/minix2023_keytab/user.keytab})
2023-08-06 16:18:49 [ TThreadPoolServer WorkerProcess-%d:340832872 ] - [ INFO ] file system BrokerFileSystem [identity=org.apache.doris.broker.hdfs.FileSystemIdentity@717800c7, dfsFileSystem=DFS[DFSClient[clientName=DFSClient_NONMAPREDUCE_520979005_80, [email protected] (auth:KERBEROS)]], fileSystemId=1bcf3ba9-ff0c-4dc6-82a6-9002fc77ac42] is expired, update it.
2023-08-06 16:18:49 [ TThreadPoolServer WorkerProcess-%d:340832877 ] - [ ERROR ] errors while open path
java.io.IOException: DestHost:destPort x.x.108.68:8020 , LocalHost:localPort doris-node-group-1Xakh0004/x.x.98.191:0. Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:1051)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:1026)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1583)
    at org.apache.hadoop.ipc.Client.call(Client.java:1525)
    at org.apache.hadoop.ipc.Client.call(Client.java:1422)
    at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:245)
    at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:131)
    at com.sun.proxy.$Proxy14.getBlockLocations(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:348)
    at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:435)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:170)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:162)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:100)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:366)
    at com.sun.proxy.$Proxy15.getBlockLocations(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.tryFromCacheForBlockLocations(DFSClient.java:3599)
    at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocationsWithCache(DFSClient.java:3579)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:933)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:922)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1091)
    at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:355)
    at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:351)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:368)
    at org.apache.doris.broker.hdfs.FileSystemManager.openReader(FileSystemManager.java:1078)
    at org.apache.doris.broker.hdfs.HDFSBrokerServiceImpl.openReader(HDFSBrokerServiceImpl.java:150)
    at org.apache.doris.thrift.TPaloBrokerService$Processor$openReader.getResult(TPaloBrokerService.java:915)
    at org.apache.doris.thrift.TPaloBrokerService$Processor$openReader.getResult(TPaloBrokerService.java:895)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:38)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:38)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:248)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:759)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1890)
    at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:713)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:817)
    at org.apache.hadoop.ipc.Client$Connection.access$3900(Client.java:362)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1644)
    at org.apache.hadoop.ipc.Client.call(Client.java:1469)
    ... 33 more
Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
    at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:410)
    at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:576)
    at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:362)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:804)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:800)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1890)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:800)
    ... 36 more
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:162)
    at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
    at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:189)
    at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
    ... 45 more
2023-08-06 16:18:49 [ TThreadPoolServer WorkerProcess-%d:340832878 ] - [ WARN ] failed to open reader for path: hdfs://x.x.108.68:8020/user/mcbigdata/hive_db/cdm.db/dwd_fico_income_sku_cost/ds=2023-06-28/type=VI/part-00250-5bda1cf8-a4d1-4bb3-9461-65fe9282b11b.c000
org.apache.doris.broker.hdfs.BrokerException: could not open file hdfs://x.x.108.68:8020/user/mcbigdata/hive_db/cdm.db/dwd_fico_income_sku_cost/ds=2023-06-28/type=VI/part-00250-5bda1cf8-a4d1-4bb3-9461-65fe9282b11b.c000
    at org.apache.doris.broker.hdfs.FileSystemManager.openReader(FileSystemManager.java:1087)
    at org.apache.doris.broker.hdfs.HDFSBrokerServiceImpl.openReader(HDFSBrokerServiceImpl.java:150)
    at org.apache.doris.thrift.TPaloBrokerService$Processor$openReader.getResult(TPaloBrokerService.java:915)
    at org.apache.doris.thrift.TPaloBrokerService$Processor$openReader.getResult(TPaloBrokerService.java:895)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:38)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:38)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:248)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.IOException: DestHost:destPort x.x.108.68:8020 , LocalHost:localPort doris-node-group-1Xakh0004/x.x.98.191:0. Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:1051)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:1026)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1583)
    at org.apache.hadoop.ipc.Client.call(Client.java:1525)
    at org.apache.hadoop.ipc.Client.call(Client.java:1422)
    at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:245)
    at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:131)
    at com.sun.proxy.$Proxy14.getBlockLocations(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:348)
    at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:435)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:170)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:162)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:100)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:366)
    at com.sun.proxy.$Proxy15.getBlockLocations(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.tryFromCacheForBlockLocations(DFSClient.java:3599)
    at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocationsWithCache(DFSClient.java:3579)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:933)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:922)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1091)
    at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:355)
    at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:351)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:368)
    at org.apache.doris.broker.hdfs.FileSystemManager.openReader(FileSystemManager.java:1078)
    ... 9 more
Caused by: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:759)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1890)
    at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:713)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:817)
    at org.apache.hadoop.ipc.Client$Connection.access$3900(Client.java:362)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1644)
    at org.apache.hadoop.ipc.Client.call(Client.java:1469)
    ... 33 more
Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
    at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:410)
    at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:576)
    at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:362)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:804)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:800)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1890)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:800)
    ... 36 more
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:162)
    at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
    at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:189)
    at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
    ... 45 more
2023-08-06 16:18:49 [ TThreadPoolServer WorkerProcess-%d:340832878 ] - [ INFO ] create file system for new path: hdfs://x.x.108.68:8020/user/mcbigdata/hive_db/cdm.db/dwd_fico_income_sku_cost/ds=2023-06-28/type=I/part-00361-5bda1cf8-a4d1-4bb3-9461-65fe9282b11b.c000
2023-08-06 16:18:49 [ TThreadPoolServer WorkerProcess-%d:340832953 ] - [ INFO ] receive a open reader request, request detail: TBrokerOpenReaderRequest(version:VERSION_ONE, path:hdfs://x.x.108.68:8020/user/mcbigdata/hive_db/cdm.db/dwd_fico_income_sku_cost/ds=2023-06-28/type=VI/part-00250-5bda1cf8-a4d1-4bb3-9461-65fe9282b11b.c000, startOffset:0, clientId:x.x.104.86:29984, properties:{hadoop.security.authentication=kerberos, [email protected], _DORIS_STORAGE_TYPE_=BROKER, kerberos_keytab=/home/omm/minix2023_keytab/user.keytab})
```

### What You Expected?

The files load successfully.

### How to Reproduce?

_No response_

### Anything Else?

_No response_

### Are you willing to submit PR?

- [ ] Yes I am willing to submit a PR!

### Code of Conduct

- [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)

--
This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
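For context, a broker load statement matching the paths and broker properties seen in the log would look roughly like the sketch below. The label, target table, and broker name are hypothetical placeholders, as is `<principal>` (the principal is redacted in the log); only the HDFS paths, the `hadoop.security.authentication` and `kerberos_keytab` values, and the Parquet format come from the report:

```sql
-- Hypothetical broker load against the Kerberos-secured HDFS.
-- The wildcard partition path below (type=*/*) is the form that fails;
-- substituting a specific partition (type=VI/*) reportedly succeeds.
LOAD LABEL example_db.load_dwd_fico_income_sku_cost_20230628
(
    DATA INFILE("hdfs://x.x.108.68:8020/user/mcbigdata/hive_db/cdm.db/dwd_fico_income_sku_cost/ds=2023-06-28/type=*/*")
    INTO TABLE dwd_fico_income_sku_cost
    FORMAT AS "parquet"
)
WITH BROKER "broker_name"
(
    "hadoop.security.authentication" = "kerberos",
    "kerberos_principal" = "<principal>",
    "kerberos_keytab" = "/home/omm/minix2023_keytab/user.keytab"
);
```

Both variants go through the same broker, so the "Failed to find any Kerberos tgt" error appearing only for the wildcard form suggests the broker's cached file system login, not the load statement itself, is at fault.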
