tnk-dev commented on issue #4629:
URL: https://github.com/apache/kyuubi/issues/4629#issuecomment-1488819840
Hey @zwangsheng @dnskr,
Using one of the mentioned releases indeed solved the error, thank you!
Running Kyuubi locally according to the quick start guide worked fine
without any users set up (anonymous).
With Helm, though, it complains and I cannot connect to Kyuubi via DBeaver
using the above-mentioned connection URL:
## POD Logs
```
    at org.apache.kyuubi.operation.LaunchEngine.$anonfun$runInternal$2(LaunchEngine.scala:60) ~[kyuubi-server_2.12-1.7.0.jar:1.7.0]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_362]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_362]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_362]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_362]
    at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_362]
2023-03-29 15:05:40.392 WARN org.apache.kyuubi.session.HadoopGroupProvider: There is no group for anonymous, use the client user name as group directly
2023-03-29 15:05:40.427 INFO org.apache.zookeeper.server.PrepRequestProcessor: Got user-level KeeperException when processing sessionid:0x1006c9c32bb0001 type:create cxid:0x2 zxid:0x5 txntype:-1 reqpath:n/a Error Path:/kyuubi_1.7.0_USER_SPARK_SQL_lock/anonymous/default/locks Error:KeeperErrorCode = NoNode for /kyuubi_1.7.0_USER_SPARK_SQL_lock/anonymous/default/locks
2023-03-29 15:05:40.432 WARN org.apache.curator.utils.ZKPaths: The version of ZooKeeper being used doesn't support Container nodes. CreateMode.PERSISTENT will be used instead.
2023-03-29 15:05:40.460 INFO org.apache.zookeeper.server.PrepRequestProcessor: Got user-level KeeperException when processing sessionid:0x1006c9c32bb0001 type:create cxid:0xd zxid:0xb txntype:-1 reqpath:n/a Error Path:/kyuubi_1.7.0_USER_SPARK_SQL_lock/anonymous/default/leases Error:KeeperErrorCode = NoNode for /kyuubi_1.7.0_USER_SPARK_SQL_lock/anonymous/default/leases
2023-03-29 15:05:40.506 INFO org.apache.kyuubi.engine.ProcBuilder: Creating anonymous's working directory at /opt/kyuubi/work/anonymous
2023-03-29 15:05:40.535 INFO org.apache.kyuubi.engine.EngineRef: Launching engine:
/opt/kyuubi/externals/spark-3.3.2-bin-hadoop3/bin/spark-submit \
  --class org.apache.kyuubi.engine.spark.SparkSQLEngine \
  --conf spark.hive.server2.thrift.resultset.default.fetch.size=1000 \
  --conf spark.kyuubi.client.ipAddress=192.168.178.43 \
  --conf spark.kyuubi.client.version=1.7.0 \
  --conf spark.kyuubi.engine.submit.time=1680102340489 \
  --conf spark.kyuubi.frontend.protocols=THRIFT_BINARY \
  --conf spark.kyuubi.ha.addresses=kyuubi-88df75948-4v5gx:2181 \
  --conf spark.kyuubi.ha.engine.ref.id=1d0aff22-2bd5-4370-aea4-71c5f154a79c \
  --conf spark.kyuubi.ha.namespace=/kyuubi_1.7.0_USER_SPARK_SQL/anonymous/default \
  --conf spark.kyuubi.ha.zookeeper.auth.type=NONE \
  --conf spark.kyuubi.kubernetes.namespace=kyuubi \
  --conf spark.kyuubi.server.ipAddress=127.0.0.1 \
  --conf spark.kyuubi.session.connection.url=localhost:10009 \
  --conf spark.kyuubi.session.real.user=anonymous \
  --conf spark.app.name=kyuubi_USER_SPARK_SQL_anonymous_default_1d0aff22-2bd5-4370-aea4-71c5f154a79c \
  --conf spark.kubernetes.driver.label.kyuubi-unique-tag=1d0aff22-2bd5-4370-aea4-71c5f154a79c \
  --conf spark.master=k8s://https://172.20.0.1:443 \
  --conf spark.kubernetes.driverEnv.SPARK_USER_NAME=anonymous \
  --conf spark.executorEnv.SPARK_USER_NAME=anonymous \
  --proxy-user anonymous /opt/kyuubi/externals/engines/spark/kyuubi-spark-sql-engine_2.12-1.7.0.jar
2023-03-29 15:05:40.550 INFO org.apache.kyuubi.engine.ProcBuilder: Logging to /opt/kyuubi/work/anonymous/kyuubi-spark-sql-engine.log.0
```
## DBeaver logs
```
Could not open client transport with JDBC Uri: jdbc:hive2://127.0.0.1:10009: org.apache.kyuubi.KyuubiSQLException:
The engine application has been terminated. Please check the engine log.
ApplicationInfo: (
name -> null,
state -> NOT_FOUND,
url -> null,
id -> null,
error -> null
)
    at org.apache.kyuubi.engine.EngineRef.$anonfun$create$6(EngineRef.scala:230)
    at org.apache.kyuubi.engine.EngineRef.$anonfun$create$6$adapted(EngineRef.scala:219)
    at scala.Option.foreach(Option.scala:407)
    at org.apache.kyuubi.engine.EngineRef.$anonfun$create$5(EngineRef.scala:219)
    at org.apache.kyuubi.engine.EngineRef.$anonfun$create$5$adapted(EngineRef.scala:218)
    at scala.Option.foreach(Option.scala:407)
    at org.apache.kyuubi.engine.EngineRef.$anonfun$create$1(EngineRef.scala:218)
    at org.apache.kyuubi.ha.client.zookeeper.ZookeeperDiscoveryClient.tryWithLock(ZookeeperDiscoveryClient.scala:180)
    at org.apache.kyuubi.engine.EngineRef.tryWithLock(EngineRef.scala:166)
    at org.apache.kyuubi.engine.EngineRef.create(EngineRef.scala:171)
    at org.apache.kyuubi.engine.EngineRef.$anonfun$getOrCreate$1(EngineRef.scala:266)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.kyuubi.engine.EngineRef.getOrCreate(EngineRef.scala:266)
    at org.apache.kyuubi.session.KyuubiSessionImpl.$anonfun$openEngineSession$2(KyuubiSessionImpl.scala:147)
    at org.apache.kyuubi.session.KyuubiSessionImpl.$anonfun$openEngineSession$2$adapted(KyuubiSessionImpl.scala:123)
    at org.apache.kyuubi.ha.client.DiscoveryClientProvider$.withDiscoveryClient(DiscoveryClientProvider.scala:36)
    at org.apache.kyuubi.session.KyuubiSessionImpl.$anonfun$openEngineSession$1(KyuubiSessionImpl.scala:123)
    at org.apache.kyuubi.session.KyuubiSession.handleSessionException(KyuubiSession.scala:49)
    at org.apache.kyuubi.session.KyuubiSessionImpl.openEngineSession(KyuubiSessionImpl.scala:123)
    at org.apache.kyuubi.operation.LaunchEngine.$anonfun$runInternal$2(LaunchEngine.scala:60)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.kyuubi.KyuubiSQLException: org.apache.kyuubi.KyuubiSQLException: io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: GET at: https://172.20.0.1/api/v1/namespaces/default/pods?labelSelector=spark-app-selector%3Dspark-de265d35dd144a23b24be9d979e75884%2Cspark-role%3Dexecutor&allowWatchBookmarks=true&watch=true. Message: Forbidden.
    at io.fabric8.kubernetes.client.dsl.base.OperationSupport.requestFailure(OperationSupport.java:682)
    at io.fabric8.kubernetes.client.dsl.base.OperationSupport.requestFailure(OperationSupport.java:661)
    at io.fabric8.kubernetes.client.dsl.internal.WatchConnectionManager.lambda$run$2(WatchConnectionManager.java:126)
    at java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:836)
    at java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:811)
    at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
    at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1990)
    at io.fabric8.kubernetes.client.okhttp.OkHttpWebSocketImpl$BuilderImpl$1.onFailure(OkHttpWebSocketImpl.java:66)
    at okhttp3.internal.ws.RealWebSocket.failWebSocket(RealWebSocket.java:571)
    at okhttp3.internal.ws.RealWebSocket$2.onResponse(RealWebSocket.java:198)
    at okhttp3.RealCall$AsyncCall.execute(RealCall.java:203)
    at okhttp3.internal.NamedRunnable.run(NamedRunnable.java:32)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
See more: /opt/kyuubi/work/anonymous/kyuubi-spark-sql-engine.log.0
    at org.apache.kyuubi.KyuubiSQLException$.apply(KyuubiSQLException.scala:69)
    at org.apache.kyuubi.engine.ProcBuilder.$anonfun$start$1(ProcBuilder.scala:229)
    at java.lang.Thread.run(Thread.java:750)
.
FYI: The last 10 line(s) of log are:
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:163)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
23/03/29 15:08:40 INFO ShutdownHookManager: Shutdown hook called
23/03/29 15:08:40 INFO ShutdownHookManager: Deleting directory /tmp/spark-5630ff13-cd25-4944-8c78-a3859e6cfb7e
23/03/29 15:08:40 INFO ShutdownHookManager: Deleting directory /tmp/spark-fa6a2d02-82a7-4965-b7a8-487957b15a08
    at org.apache.kyuubi.KyuubiSQLException$.apply(KyuubiSQLException.scala:69)
    at org.apache.kyuubi.engine.ProcBuilder.getError(ProcBuilder.scala:275)
    at org.apache.kyuubi.engine.ProcBuilder.getError$(ProcBuilder.scala:264)
    at org.apache.kyuubi.engine.spark.SparkProcessBuilder.getError(SparkProcessBuilder.scala:37)
    ... 25 more
```
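One thing I notice in the trace: the Forbidden GET goes to the `default` namespace even though I set `spark.kyuubi.kubernetes.namespace=kyuubi`, so my guess is the driver's service account simply lacks pod permissions there. For reference, this is roughly the RBAC I would expect to need (a sketch only; the service account name `kyuubi` and the namespace are guesses from my setup, not something the chart documents):

```yaml
# Hypothetical Role/RoleBinding granting the Kyuubi/Spark service account
# access to pods and services in the namespace where executors run.
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: kyuubi-spark-role
  namespace: kyuubi          # guess: should match spark.kyuubi.kubernetes.namespace
rules:
  - apiGroups: [""]
    resources: ["pods", "services", "configmaps", "persistentvolumeclaims"]
    verbs: ["get", "list", "watch", "create", "delete", "deletecollection"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: kyuubi-spark-rolebinding
  namespace: kyuubi
subjects:
  - kind: ServiceAccount
    name: kyuubi             # guess: whatever service account the Helm chart creates
    namespace: kyuubi
roleRef:
  kind: Role
  name: kyuubi-spark-role
  apiGroup: rbac.authorization.k8s.io
```

The work log below also shows `Must specify the executor container image`, so presumably `spark.kubernetes.container.image` needs to be set as well.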
## Work logs
This Spark error also occurs:
```
kyuubi@kyuubi-88df75948-4v5gx:/opt/kyuubi$ cat
work/anonymous/kyuubi-spark-sql-engine.log.0
23/03/29 15:05:43 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
23/03/29 15:05:44 INFO SignalRegister: Registering signal handler for TERM
23/03/29 15:05:44 INFO SignalRegister: Registering signal handler for HUP
23/03/29 15:05:44 INFO SignalRegister: Registering signal handler for INT
23/03/29 15:05:44 INFO HiveConf: Found configuration file null
23/03/29 15:05:44 INFO SparkContext: Running Spark version 3.3.2
23/03/29 15:05:44 INFO ResourceUtils:
==============================================================
23/03/29 15:05:44 INFO ResourceUtils: No custom resources configured for
spark.driver.
23/03/29 15:05:44 INFO ResourceUtils:
==============================================================
23/03/29 15:05:44 INFO SparkContext: Submitted application:
kyuubi_USER_SPARK_SQL_anonymous_default_1d0aff22-2bd5-4370-aea4-71c5f154a79c
23/03/29 15:05:44 INFO ResourceProfile: Default ResourceProfile created,
executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: ,
memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name:
offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name:
cpus, amount: 1.0)
23/03/29 15:05:44 INFO ResourceProfile: Limiting resource is cpus at 1 tasks
per executor
23/03/29 15:05:44 INFO ResourceProfileManager: Added ResourceProfile id: 0
23/03/29 15:05:44 INFO SecurityManager: Changing view acls to:
kyuubi,anonymous
23/03/29 15:05:44 INFO SecurityManager: Changing modify acls to:
kyuubi,anonymous
23/03/29 15:05:44 INFO SecurityManager: Changing view acls groups to:
23/03/29 15:05:44 INFO SecurityManager: Changing modify acls groups to:
23/03/29 15:05:44 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users with view permissions: Set(kyuubi,
anonymous); groups with view permissions: Set(); users with modify
permissions: Set(kyuubi, anonymous); groups with modify permissions: Set()
23/03/29 15:05:45 INFO Utils: Successfully started service 'sparkDriver' on
port 34783.
23/03/29 15:05:45 INFO SparkEnv: Registering MapOutputTracker
23/03/29 15:05:45 INFO SparkEnv: Registering BlockManagerMaster
23/03/29 15:05:45 INFO BlockManagerMasterEndpoint: Using
org.apache.spark.storage.DefaultTopologyMapper for getting topology information
23/03/29 15:05:45 INFO BlockManagerMasterEndpoint:
BlockManagerMasterEndpoint up
23/03/29 15:05:45 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
23/03/29 15:05:45 INFO DiskBlockManager: Created local directory at
/tmp/blockmgr-3d826bde-2af8-4d6a-b559-f32254ee744e
23/03/29 15:05:45 INFO MemoryStore: MemoryStore started with capacity 413.9
MiB
23/03/29 15:05:45 INFO SparkEnv: Registering OutputCommitCoordinator
23/03/29 15:05:45 INFO Utils: Successfully started service 'SparkUI' on port
44019.
23/03/29 15:05:45 INFO SparkContext: Added JAR
file:/opt/kyuubi/externals/engines/spark/kyuubi-spark-sql-engine_2.12-1.7.0.jar
at spark://10.2.45.168:34783/jars/kyuubi-spark-sql-engine_2.12-1.7.0.jar with
timestamp 1680102344497
23/03/29 15:05:45 INFO SparkKubernetesClientFactory: Auto-configuring K8S
client using current context from users K8S config file
23/03/29 15:05:46 INFO ExecutorPodsAllocator: Going to request 2 executors
from Kubernetes for ResourceProfile Id: 0, target: 2, known: 0,
sharedSlotFromPendingPods: 2147483647.
23/03/29 15:05:46 WARN ExecutorPodsSnapshotsStoreImpl: Exception when
notifying snapshot subscriber.
org.apache.spark.SparkException: Must specify the executor container image
at
org.apache.spark.deploy.k8s.features.BasicExecutorFeatureStep.$anonfun$executorContainerImage$1(BasicExecutorFeatureStep.scala:44)
at scala.Option.getOrElse(Option.scala:189)
at
org.apache.spark.deploy.k8s.features.BasicExecutorFeatureStep.<init>(BasicExecutorFeatureStep.scala:44)
at
org.apache.spark.scheduler.cluster.k8s.KubernetesExecutorBuilder.buildFromFeatures(KubernetesExecutorBuilder.scala:69)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$requestNewExecutors$1(ExecutorPodsAllocator.scala:398)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:158)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.requestNewExecutors(ExecutorPodsAllocator.scala:389)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$onNewSnapshots$35(ExecutorPodsAllocator.scala:349)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$onNewSnapshots$35$adapted(ExecutorPodsAllocator.scala:342)
at
scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at
scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.onNewSnapshots(ExecutorPodsAllocator.scala:342)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$start$3(ExecutorPodsAllocator.scala:120)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$start$3$adapted(ExecutorPodsAllocator.scala:120)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsSnapshotsStoreImpl$SnapshotsSubscriber.org$apache$spark$scheduler$cluster$k8s$ExecutorPodsSnapshotsStoreImpl$SnapshotsSubscriber$$processSnapshotsInternal(ExecutorPodsSnapshotsStoreImpl.scala:138)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsSnapshotsStoreImpl$SnapshotsSubscriber.processSnapshots(ExecutorPodsSnapshotsStoreImpl.scala:126)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsSnapshotsStoreImpl.$anonfun$addSubscriber$1(ExecutorPodsSnapshotsStoreImpl.scala:81)
at
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
23/03/29 15:05:46 INFO ExecutorPodsAllocator: Going to request 2 executors
from Kubernetes for ResourceProfile Id: 0, target: 2, known: 0,
sharedSlotFromPendingPods: 2147483647.
23/03/29 15:05:46 WARN ExecutorPodsSnapshotsStoreImpl: Exception when
notifying snapshot subscriber.
org.apache.spark.SparkException: Must specify the executor container image
at
org.apache.spark.deploy.k8s.features.BasicExecutorFeatureStep.$anonfun$executorContainerImage$1(BasicExecutorFeatureStep.scala:44)
at scala.Option.getOrElse(Option.scala:189)
at
org.apache.spark.deploy.k8s.features.BasicExecutorFeatureStep.<init>(BasicExecutorFeatureStep.scala:44)
at
org.apache.spark.scheduler.cluster.k8s.KubernetesExecutorBuilder.buildFromFeatures(KubernetesExecutorBuilder.scala:69)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$requestNewExecutors$1(ExecutorPodsAllocator.scala:398)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:158)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.requestNewExecutors(ExecutorPodsAllocator.scala:389)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$onNewSnapshots$35(ExecutorPodsAllocator.scala:349)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$onNewSnapshots$35$adapted(ExecutorPodsAllocator.scala:342)
at
scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at
scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.onNewSnapshots(ExecutorPodsAllocator.scala:342)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$start$3(ExecutorPodsAllocator.scala:120)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$start$3$adapted(ExecutorPodsAllocator.scala:120)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsSnapshotsStoreImpl$SnapshotsSubscriber.org$apache$spark$scheduler$cluster$k8s$ExecutorPodsSnapshotsStoreImpl$SnapshotsSubscriber$$processSnapshotsInternal(ExecutorPodsSnapshotsStoreImpl.scala:138)
at
org.apache.spark.scheduler.cluster.k8s.ExecutorPodsSnapshotsStoreImpl$SnapshotsSubscriber$$anon$2.run(ExecutorPodsSnapshotsStoreImpl.scala:158)
at
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
23/03/29 15:05:47 WARN WatchConnectionManager: Exec Failure: HTTP 403,
Status: 403 - Forbidden
23/03/29 15:05:47 WARN ExecutorPodsWatchSnapshotSource: Kubernetes client
has been closed.
23/03/29 15:05:47 ERROR SparkContext: Error initializing SparkContext.
io.fabric8.kubernetes.client.KubernetesClientException: Failure executing:
GET at:
https://172.20.0.1/api/v1/namespaces/default/pods?labelSelector=spark-app-selector%3Dspark-de265d35dd144a23b24be9d979e75884%2Cspark-role%3Dexecutor&allowWatchBookmarks=true&watch=true.
Message: Forbidden.
at
io.fabric8.kubernetes.client.dsl.base.OperationSupport.requestFailure(OperationSupport.java:682)
at
io.fabric8.kubernetes.client.dsl.base.OperationSupport.requestFailure(OperationSupport.java:661)
at
io.fabric8.kubernetes.client.dsl.internal.WatchConnectionManager.lambda$run$2(WatchConnectionManager.java:126)
at
java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:836)
at
java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:811)
at
java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at
java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1990)
at
io.fabric8.kubernetes.client.okhttp.OkHttpWebSocketImpl$BuilderImpl$1.onFailure(OkHttpWebSocketImpl.java:66)
at
okhttp3.internal.ws.RealWebSocket.failWebSocket(RealWebSocket.java:571)
at
okhttp3.internal.ws.RealWebSocket$2.onResponse(RealWebSocket.java:198)
at okhttp3.RealCall$AsyncCall.execute(RealCall.java:203)
at okhttp3.internal.NamedRunnable.run(NamedRunnable.java:32)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Suppressed: java.lang.Throwable: waiting here
at io.fabric8.kubernetes.client.utils.Utils.waitUntilReady(Utils.java:169)
at io.fabric8.kubernetes.client.utils.Utils.waitUntilReadyOrFail(Utils.java:180)
at io.fabric8.kubernetes.client.dsl.internal.WatchConnectionManager.waitUntilReady(WatchConnectionManager.java:96)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.watch(BaseOperation.java:572)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.watch(BaseOperation.java:547)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.watch(BaseOperation.java:83)
at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsWatchSnapshotSource.start(ExecutorPodsWatchSnapshotSource.scala:53)
at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterSchedulerBackend.start(KubernetesClusterSchedulerBackend.scala:109)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:222)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:595)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2714)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
at org.apache.kyuubi.engine.spark.SparkSQLEngine$.createSpark(SparkSQLEngine.scala:253)
at org.apache.kyuubi.engine.spark.SparkSQLEngine$.main(SparkSQLEngine.scala:326)
at org.apache.kyuubi.engine.spark.SparkSQLEngine.main(SparkSQLEngine.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:165)
at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:163)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
23/03/29 15:05:47 INFO SparkUI: Stopped Spark web UI at http://10.2.45.168:44019
23/03/29 15:05:47 INFO KubernetesClusterSchedulerBackend: Shutting down all executors
23/03/29 15:05:47 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Asking each executor to shut down
23/03/29 15:05:47 ERROR Utils: Uncaught exception in thread main
io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: GET at: https://172.20.0.1/api/v1/namespaces/default/services?labelSelector=spark-app-selector%3Dspark-de265d35dd144a23b24be9d979e75884. Message: Forbidden!Configured service account doesn't have access. Service account may have been revoked. services is forbidden: User "system:serviceaccount:kyuubi:kyuubi" cannot list resource "services" in API group "" in the namespace "default".
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.requestFailure(OperationSupport.java:682)
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.requestFailure(OperationSupport.java:661)
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.assertResponseCode(OperationSupport.java:610)
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:555)
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:518)
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:502)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.listRequestHelper(BaseOperation.java:133)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.list(BaseOperation.java:415)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.list(BaseOperation.java:404)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.deleteList(BaseOperation.java:537)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.delete(BaseOperation.java:455)
at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterSchedulerBackend.$anonfun$stop$5(KubernetesClusterSchedulerBackend.scala:139)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1484)
at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterSchedulerBackend.stop(KubernetesClusterSchedulerBackend.scala:140)
at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:931)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2785)
at org.apache.spark.SparkContext.$anonfun$stop$11(SparkContext.scala:2105)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1484)
at org.apache.spark.SparkContext.stop(SparkContext.scala:2105)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:695)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2714)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
at org.apache.kyuubi.engine.spark.SparkSQLEngine$.createSpark(SparkSQLEngine.scala:253)
at org.apache.kyuubi.engine.spark.SparkSQLEngine$.main(SparkSQLEngine.scala:326)
at org.apache.kyuubi.engine.spark.SparkSQLEngine.main(SparkSQLEngine.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:165)
at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:163)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
23/03/29 15:05:47 ERROR Utils: Uncaught exception in thread main
io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: GET at: https://172.20.0.1/api/v1/namespaces/default/pods?labelSelector=spark-app-selector%3Dspark-de265d35dd144a23b24be9d979e75884%2Cspark-role%3Dexecutor. Message: Forbidden!Configured service account doesn't have access. Service account may have been revoked. pods is forbidden: User "system:serviceaccount:kyuubi:kyuubi" cannot list resource "pods" in API group "" in the namespace "default".
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.requestFailure(OperationSupport.java:682)
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.requestFailure(OperationSupport.java:661)
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.assertResponseCode(OperationSupport.java:610)
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:555)
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:518)
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:502)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.listRequestHelper(BaseOperation.java:133)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.list(BaseOperation.java:415)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.list(BaseOperation.java:404)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.deleteList(BaseOperation.java:537)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.delete(BaseOperation.java:455)
at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$stop$1(ExecutorPodsAllocator.scala:477)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1484)
at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.stop(ExecutorPodsAllocator.scala:478)
at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterSchedulerBackend.stop(KubernetesClusterSchedulerBackend.scala:155)
at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:931)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2785)
at org.apache.spark.SparkContext.$anonfun$stop$11(SparkContext.scala:2105)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1484)
at org.apache.spark.SparkContext.stop(SparkContext.scala:2105)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:695)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2714)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
at org.apache.kyuubi.engine.spark.SparkSQLEngine$.createSpark(SparkSQLEngine.scala:253)
at org.apache.kyuubi.engine.spark.SparkSQLEngine$.main(SparkSQLEngine.scala:326)
at org.apache.kyuubi.engine.spark.SparkSQLEngine.main(SparkSQLEngine.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:165)
at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:163)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
23/03/29 15:05:47 ERROR Utils: Uncaught exception in thread main
io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: GET at: https://172.20.0.1/api/v1/namespaces/default/configmaps?labelSelector=spark-app-selector%3Dspark-de265d35dd144a23b24be9d979e75884%2Cspark-role%3Dexecutor. Message: Forbidden!Configured service account doesn't have access. Service account may have been revoked. configmaps is forbidden: User "system:serviceaccount:kyuubi:kyuubi" cannot list resource "configmaps" in API group "" in the namespace "default".
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.requestFailure(OperationSupport.java:682)
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.requestFailure(OperationSupport.java:661)
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.assertResponseCode(OperationSupport.java:610)
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:555)
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:518)
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:502)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.listRequestHelper(BaseOperation.java:133)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.list(BaseOperation.java:415)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.list(BaseOperation.java:404)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.deleteList(BaseOperation.java:537)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.delete(BaseOperation.java:455)
at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterSchedulerBackend.$anonfun$stop$7(KubernetesClusterSchedulerBackend.scala:162)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1484)
at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterSchedulerBackend.stop(KubernetesClusterSchedulerBackend.scala:163)
at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:931)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2785)
at org.apache.spark.SparkContext.$anonfun$stop$11(SparkContext.scala:2105)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1484)
at org.apache.spark.SparkContext.stop(SparkContext.scala:2105)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:695)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2714)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
at org.apache.kyuubi.engine.spark.SparkSQLEngine$.createSpark(SparkSQLEngine.scala:253)
at org.apache.kyuubi.engine.spark.SparkSQLEngine$.main(SparkSQLEngine.scala:326)
at org.apache.kyuubi.engine.spark.SparkSQLEngine.main(SparkSQLEngine.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:165)
at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:163)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
23/03/29 15:05:47 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
23/03/29 15:05:47 INFO MemoryStore: MemoryStore cleared
23/03/29 15:05:47 INFO BlockManager: BlockManager stopped
23/03/29 15:05:47 INFO BlockManagerMaster: BlockManagerMaster stopped
23/03/29 15:05:47 WARN MetricsSystem: Stopping a MetricsSystem that is not running
23/03/29 15:05:47 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
23/03/29 15:05:47 INFO SparkContext: Successfully stopped SparkContext
23/03/29 15:05:47 ERROR SparkSQLEngine: Failed to instantiate SparkSession: Failure executing: GET at: https://172.20.0.1/api/v1/namespaces/default/pods?labelSelector=spark-app-selector%3Dspark-de265d35dd144a23b24be9d979e75884%2Cspark-role%3Dexecutor&allowWatchBookmarks=true&watch=true. Message: Forbidden.
io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: GET at: https://172.20.0.1/api/v1/namespaces/default/pods?labelSelector=spark-app-selector%3Dspark-de265d35dd144a23b24be9d979e75884%2Cspark-role%3Dexecutor&allowWatchBookmarks=true&watch=true. Message: Forbidden.
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.requestFailure(OperationSupport.java:682)
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.requestFailure(OperationSupport.java:661)
at io.fabric8.kubernetes.client.dsl.internal.WatchConnectionManager.lambda$run$2(WatchConnectionManager.java:126)
at java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:836)
at java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:811)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1990)
at io.fabric8.kubernetes.client.okhttp.OkHttpWebSocketImpl$BuilderImpl$1.onFailure(OkHttpWebSocketImpl.java:66)
at okhttp3.internal.ws.RealWebSocket.failWebSocket(RealWebSocket.java:571)
at okhttp3.internal.ws.RealWebSocket$2.onResponse(RealWebSocket.java:198)
at okhttp3.RealCall$AsyncCall.execute(RealCall.java:203)
at okhttp3.internal.NamedRunnable.run(NamedRunnable.java:32)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Suppressed: java.lang.Throwable: waiting here
at io.fabric8.kubernetes.client.utils.Utils.waitUntilReady(Utils.java:169)
at io.fabric8.kubernetes.client.utils.Utils.waitUntilReadyOrFail(Utils.java:180)
at io.fabric8.kubernetes.client.dsl.internal.WatchConnectionManager.waitUntilReady(WatchConnectionManager.java:96)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.watch(BaseOperation.java:572)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.watch(BaseOperation.java:547)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.watch(BaseOperation.java:83)
at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsWatchSnapshotSource.start(ExecutorPodsWatchSnapshotSource.scala:53)
at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterSchedulerBackend.start(KubernetesClusterSchedulerBackend.scala:109)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:222)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:595)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2714)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
at org.apache.kyuubi.engine.spark.SparkSQLEngine$.createSpark(SparkSQLEngine.scala:253)
at org.apache.kyuubi.engine.spark.SparkSQLEngine$.main(SparkSQLEngine.scala:326)
at org.apache.kyuubi.engine.spark.SparkSQLEngine.main(SparkSQLEngine.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:165)
at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:163)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
23/03/29 15:08:40 INFO ShutdownHookManager: Shutdown hook called
23/03/29 15:08:40 INFO ShutdownHookManager: Deleting directory /tmp/spark-5630ff13-cd25-4944-8c78-a3859e6cfb7e
23/03/29 15:08:40 INFO ShutdownHookManager: Deleting directory /tmp/spark-fa6a2d02-82a7-4965-b7a8-487957b15a08
```
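
Reading the log, the root cause seems to be RBAC rather than Kyuubi itself: the engine pods are launched into the `default` namespace, but the `system:serviceaccount:kyuubi:kyuubi` service account only has permissions in the `kyuubi` namespace, so every `list`/`watch` on pods, services, and configmaps comes back `Forbidden`. A minimal, untested sketch of a Role/RoleBinding that would presumably grant the missing access (resource names `kyuubi-spark-engine` etc. are made up for illustration; the exact verb list Spark needs may be wider):

```yaml
# Hypothetical sketch: grant the kyuubi service account (from the "kyuubi"
# namespace) the verbs the Spark driver needs on the core resources named
# in the log, in the "default" namespace where the engine runs.
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: kyuubi-spark-engine
  namespace: default
rules:
  - apiGroups: [""]
    resources: ["pods", "services", "configmaps"]
    verbs: ["get", "list", "watch", "create", "delete", "deletecollection"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: kyuubi-spark-engine
  namespace: default
subjects:
  - kind: ServiceAccount
    name: kyuubi
    namespace: kyuubi
roleRef:
  kind: Role
  name: kyuubi-spark-engine
  apiGroup: rbac.authorization.k8s.io
```

Alternatively, pointing the engine at the namespace where the service account already has a role (e.g. setting `spark.kubernetes.namespace=kyuubi` in the engine's Spark conf) might avoid the extra RBAC objects entirely, though I haven't verified which approach the chart intends.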