avishnus opened a new issue, #6393:
URL: https://github.com/apache/kyuubi/issues/6393

   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   
   
   ### Search before asking
   
   - [X] I have searched in the 
[issues](https://github.com/apache/kyuubi/issues?q=is%3Aissue) and found no 
similar issues.
   
   
   ### Describe the bug
   
   When a session launches the Spark engine on Kubernetes, `spark-submit` fails while uploading the engine jar to `spark.kubernetes.file.upload.path` on HDFS: the upload is attempted with SIMPLE authentication, which the Kerberized NameNode rejects, even though `kyuubi.authentication=KERBEROS` and a kinit principal/keytab are configured. The server log below shows the proxy chain `as:avishnus (auth:PROXY) via kyuubi (auth:SIMPLE)`.

   Error: org.apache.kyuubi.KyuubiSQLException: org.apache.kyuubi.KyuubiSQLException: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
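
   The `via kyuubi (auth:SIMPLE)` in that proxy chain suggests the real (service) user's UGI was created without Kerberos credentials. For reference, a Kerberized proxy submission is normally built roughly as below with the Hadoop `UserGroupInformation` API — a minimal sketch, not Kyuubi's actual code; the principal, keytab path, and user name are placeholders:

   ```scala
   import java.security.PrivilegedExceptionAction
   import org.apache.hadoop.conf.Configuration
   import org.apache.hadoop.fs.{FileSystem, Path}
   import org.apache.hadoop.security.UserGroupInformation

   // Hedged sketch of a Kerberized proxy-user HDFS call; all names are placeholders.
   val conf = new Configuration()
   // Without this (or an equivalent core-site.xml), the client falls back to SIMPLE.
   conf.set("hadoop.security.authentication", "kerberos")
   UserGroupInformation.setConfiguration(conf)

   // The real (service) user logs in from its keytab -> auth:KERBEROS, not auth:SIMPLE.
   val serviceUgi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
     "hive/host@REALM", "/path/to/hive.keytab")

   // The end user is impersonated on top of the Kerberos-authenticated service user.
   val proxyUgi = UserGroupInformation.createProxyUser("avishnus", serviceUgi)

   proxyUgi.doAs(new PrivilegedExceptionAction[Boolean] {
     override def run(): Boolean =
       // The same FileSystem.exists call that fails in the trace below; the
       // default filesystem comes from the loaded Hadoop configuration.
       FileSystem.get(conf).exists(new Path("/tmp"))
   })
   ```

   If the service UGI ends up as `auth:SIMPLE` instead (as in the log), the Kerberized NameNode rejects the RPC with exactly the error reported here.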
   
   ### Affects Version(s)
   
   1.8.2
   
   ### Kyuubi Server Log Output
   
   ```log
   2024-05-20 09:35:10.526 ERROR KyuubiTBinaryFrontendHandler-Pool: Thread-57 org.apache.kyuubi.server.KyuubiTBinaryFrontendService: Error getting info:
   java.util.concurrent.ExecutionException: org.apache.kyuubi.KyuubiSQLException: org.apache.kyuubi.KyuubiSQLException: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
        at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1604)
        at org.apache.hadoop.ipc.Client.call(Client.java:1550)
        at org.apache.hadoop.ipc.Client.call(Client.java:1447)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
        at com.sun.proxy.$Proxy22.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:910)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
        at com.sun.proxy.$Proxy23.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1671)
        at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1603)
        at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1600)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1615)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1690)
        at org.apache.spark.deploy.k8s.KubernetesUtils$.uploadFileUri(KubernetesUtils.scala:325)
        at org.apache.spark.deploy.k8s.KubernetesUtils$.$anonfun$uploadAndTransformFileUris$1(KubernetesUtils.scala:276)
        at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
        at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
        at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
        at scala.collection.TraversableLike.map(TraversableLike.scala:286)
        at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
        at scala.collection.AbstractTraversable.map(Traversable.scala:108)
        at org.apache.spark.deploy.k8s.KubernetesUtils$.uploadAndTransformFileUris(KubernetesUtils.scala:275)
        at org.apache.spark.deploy.k8s.features.BasicDriverFeatureStep.$anonfun$getAdditionalPodSystemProperties$1(BasicDriverFeatureStep.scala:174)
        at scala.collection.immutable.List.foreach(List.scala:431)
        at org.apache.spark.deploy.k8s.features.BasicDriverFeatureStep.getAdditionalPodSystemProperties(BasicDriverFeatureStep.scala:165)
        at org.apache.spark.deploy.k8s.submit.KubernetesDriverBuilder.$anonfun$buildFromFeatures$4(KubernetesDriverBuilder.scala:65)
        at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
        at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
        at scala.collection.immutable.List.foldLeft(List.scala:91)
        at org.apache.spark.deploy.k8s.submit.KubernetesDriverBuilder.buildFromFeatures(KubernetesDriverBuilder.scala:63)
        at org.apache.spark.deploy.k8s.submit.Client.run(KubernetesClientApplication.scala:107)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.$anonfun$run$4(KubernetesClientApplication.scala:223)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.$anonfun$run$4$adapted(KubernetesClientApplication.scala:217)
        at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2742)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.run(KubernetesClientApplication.scala:217)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.start(KubernetesClientApplication.scala:189)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:984)
        at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:172)
        at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:170)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:170)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:211)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1072)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1081)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   09:35:09.404 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedActionException as:avishnus (auth:PROXY) via kyuubi (auth:SIMPLE) cause:org.apache.spark.SparkException: Uploading file /opt/kyuubi/externals/engines/spark/kyuubi-spark-sql-engine_2.12-1.8.2-SNAPSHOT.jar failed...
    See more: /opt/kyuubi/work/avishnus/kyuubi-spark-sql-engine.log.0
        at org.apache.kyuubi.KyuubiSQLException$.apply(KyuubiSQLException.scala:70)
        at org.apache.kyuubi.engine.ProcBuilder.$anonfun$start$1(ProcBuilder.scala:232)
        at java.lang.Thread.run(Thread.java:750)
   .
   FYI: The last 10 line(s) of log are:
   09:35:09.393 [IPC Client (371440613) connection to sl73dpihmnu0108.visa.com/10.207.184.24:8020 from avishnus] DEBUG org.apache.hadoop.ipc.Client - IPC Client (371440613) connection to sl73dpihmnu0108.visa.com/10.207.184.24:8020 from avishnus: stopped, remaining connections 0
   09:35:09.399 [main] DEBUG org.apache.hadoop.io.retry.RetryInvocationHandler - Exception while invoking call #0 ClientNamenodeProtocolTranslatorPB.getFileInfo over sl73dpihmnu0108.visa.com/10.207.184.24:8020. Not retrying because try once and fail.
   09:35:09.414 [shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-df9998e8-20b6-45b9-9c3d-a5c5f5444b6e
   09:35:09.424 [shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-25cd417f-3c44-4890-9dca-fbfa816d0ca6
   09:35:09.433 [shutdown-hook-0] DEBUG org.apache.hadoop.ipc.Client - stopping client from cache: Client-539fe933aad14d059e90457605f9693d
   09:35:09.433 [shutdown-hook-0] DEBUG org.apache.hadoop.ipc.Client - removing client from cache: Client-539fe933aad14d059e90457605f9693d
   09:35:09.434 [shutdown-hook-0] DEBUG org.apache.hadoop.ipc.Client - stopping actual client because no more references remain: Client-539fe933aad14d059e90457605f9693d
   09:35:09.434 [shutdown-hook-0] DEBUG org.apache.hadoop.ipc.Client - Stopping client
   09:35:09.436 [Thread-2] DEBUG org.apache.hadoop.util.ShutdownHookManager - Completed shutdown in 0.026 seconds; Timeouts: 0
   09:35:09.448 [Thread-2] DEBUG org.apache.hadoop.util.ShutdownHookManager - ShutdownHookManger completed shutdown.
        at java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.util.concurrent.FutureTask.get(FutureTask.java:192)
        at org.apache.kyuubi.session.KyuubiSessionImpl.$anonfun$waitForEngineLaunched$1(KyuubiSessionImpl.scala:242)
        at org.apache.kyuubi.session.KyuubiSessionImpl.$anonfun$waitForEngineLaunched$1$adapted(KyuubiSessionImpl.scala:238)
        at scala.Option.foreach(Option.scala:407)
        at org.apache.kyuubi.session.KyuubiSessionImpl.waitForEngineLaunched(KyuubiSessionImpl.scala:238)
        at org.apache.kyuubi.session.KyuubiSessionImpl.$anonfun$getInfo$1(KyuubiSessionImpl.scala:285)
        at org.apache.kyuubi.session.AbstractSession.withAcquireRelease(AbstractSession.scala:82)
        at org.apache.kyuubi.session.KyuubiSessionImpl.getInfo(KyuubiSessionImpl.scala:284)
        at org.apache.kyuubi.service.AbstractBackendService.getInfo(AbstractBackendService.scala:54)
        at org.apache.kyuubi.server.KyuubiServer$$anon$1.org$apache$kyuubi$server$BackendServiceMetric$$super$getInfo(KyuubiServer.scala:171)
        at org.apache.kyuubi.server.BackendServiceMetric.$anonfun$getInfo$1(BackendServiceMetric.scala:51)
        at org.apache.kyuubi.metrics.MetricsSystem$.timerTracing(MetricsSystem.scala:112)
        at org.apache.kyuubi.server.BackendServiceMetric.getInfo(BackendServiceMetric.scala:51)
        at org.apache.kyuubi.server.BackendServiceMetric.getInfo$(BackendServiceMetric.scala:47)
        at org.apache.kyuubi.server.KyuubiServer$$anon$1.getInfo(KyuubiServer.scala:171)
        at org.apache.kyuubi.service.TFrontendService.GetInfo(TFrontendService.scala:226)
        at org.apache.hive.service.rpc.thrift.TCLIService$Processor$GetInfo.getResult(TCLIService.java:1537)
        at org.apache.hive.service.rpc.thrift.TCLIService$Processor$GetInfo.getResult(TCLIService.java:1522)
        at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
        at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
        at org.apache.kyuubi.service.authentication.HadoopThriftAuthBridgeServer$TUGIAssumingProcessor.process(HadoopThriftAuthBridgeServer.scala:163)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)
   Caused by: org.apache.kyuubi.KyuubiSQLException: org.apache.kyuubi.KyuubiSQLException: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
        at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1604)
        at org.apache.hadoop.ipc.Client.call(Client.java:1550)
        at org.apache.hadoop.ipc.Client.call(Client.java:1447)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
        at com.sun.proxy.$Proxy22.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:910)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
        at com.sun.proxy.$Proxy23.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1671)
        at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1603)
        at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1600)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1615)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1690)
        at org.apache.spark.deploy.k8s.KubernetesUtils$.uploadFileUri(KubernetesUtils.scala:325)
        at org.apache.spark.deploy.k8s.KubernetesUtils$.$anonfun$uploadAndTransformFileUris$1(KubernetesUtils.scala:276)
        at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
        at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
        at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
        at scala.collection.TraversableLike.map(TraversableLike.scala:286)
        at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
        at scala.collection.AbstractTraversable.map(Traversable.scala:108)
        at org.apache.spark.deploy.k8s.KubernetesUtils$.uploadAndTransformFileUris(KubernetesUtils.scala:275)
        at org.apache.spark.deploy.k8s.features.BasicDriverFeatureStep.$anonfun$getAdditionalPodSystemProperties$1(BasicDriverFeatureStep.scala:174)
        at scala.collection.immutable.List.foreach(List.scala:431)
        at org.apache.spark.deploy.k8s.features.BasicDriverFeatureStep.getAdditionalPodSystemProperties(BasicDriverFeatureStep.scala:165)
        at org.apache.spark.deploy.k8s.submit.KubernetesDriverBuilder.$anonfun$buildFromFeatures$4(KubernetesDriverBuilder.scala:65)
        at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
        at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
        at scala.collection.immutable.List.foldLeft(List.scala:91)
        at org.apache.spark.deploy.k8s.submit.KubernetesDriverBuilder.buildFromFeatures(KubernetesDriverBuilder.scala:63)
        at org.apache.spark.deploy.k8s.submit.Client.run(KubernetesClientApplication.scala:107)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.$anonfun$run$4(KubernetesClientApplication.scala:223)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.$anonfun$run$4$adapted(KubernetesClientApplication.scala:217)
        at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2742)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.run(KubernetesClientApplication.scala:217)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.start(KubernetesClientApplication.scala:189)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:984)
        at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:172)
        at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:170)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:170)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:211)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1072)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1081)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   09:35:09.404 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedActionException as:avishnus (auth:PROXY) via kyuubi (auth:SIMPLE) cause:org.apache.spark.SparkException: Uploading file /opt/kyuubi/externals/engines/spark/kyuubi-spark-sql-engine_2.12-1.8.2-SNAPSHOT.jar failed...
   ```
   
   
   ### Kyuubi Engine Log Output
   
   _No response_
   
   ### Kyuubi Server Configurations
   
   ```properties
   kyuubi.authentication=KERBEROS
   kyuubi.frontend.bind.host=localhost
   kyuubi.frontend.bind.port=10009
   kyuubi.kinit.principal=hive/xxxx@org
   kyuubi.kinit.keytab=hive.keytab
   kyuubi.zookeeper.embedded.client.port=2181
   spark.driver.port=7078
   spark.kubernetes.namespace=default
   kyuubi.kubernetes.master.address=k8s://https://abc
   spark.master=k8s://https://abc
   kyuubi.kubernetes.namespace=default
   spark.submit.deployMode=cluster
   spark.kubernetes.authenticate.driver.serviceAccountName=spark
   spark.kubernetes.container.image=xxxx
   spark.driver.extraJavaOptions=-Divy.home=/tmp
   spark.kubernetes.kerberos.krb5.path=/etc/krb5.conf
   spark.kubernetes.executor.deleteOnTermination=false
   spark.hadoop.scaas.skipDeleteOnTerminationValidation=true
   spark.kubernetes.file.upload.path=hdfs://xxxx:xxxx/tmp
   ```
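
   For context, below is a hedged sketch of the Kerberos-related settings such a setup usually needs so that the jar upload itself is authenticated. The key names are existing Spark/Kyuubi configs, but the values are placeholders, and this reflects an assumption about the cause rather than a verified fix:

   ```properties
   # Assumption: the SIMPLE fallback happens in the spark-submit JVM on the Kyuubi
   # server, so that JVM needs a Kerberos login and a Kerberized Hadoop client conf.
   kyuubi.kinit.principal=hive/xxxx@org
   # An absolute keytab path readable by the kyuubi process (the report uses a
   # relative one, which may not resolve):
   kyuubi.kinit.keytab=/path/to/hive.keytab
   # Standard Spark 3.x keys so spark-submit can log in before uploading the jar:
   spark.kerberos.principal=hive/xxxx@org
   spark.kerberos.keytab=/path/to/hive.keytab
   ```

   In addition, the `core-site.xml` under the server's `HADOOP_CONF_DIR` should set `hadoop.security.authentication=kerberos`; otherwise the HDFS client falls back to SIMPLE regardless of the settings above, which would match the `(auth:SIMPLE)` seen in the log.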
   
   
   ### Kyuubi Engine Configurations
   
   _No response_
   
   ### Additional context
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes. I would be willing to submit a PR with guidance from the Kyuubi 
community to fix.
   - [ ] No. I cannot submit a PR at this time.

