wForget opened a new issue #1796:
URL: https://github.com/apache/incubator-kyuubi/issues/1796


   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   
   
   ### Search before asking
   
   - [X] I have searched in the 
[issues](https://github.com/apache/incubator-kyuubi/issues?q=is%3Aissue) and 
found no similar issues.
   
   
   ### Describe the bug
   
   Because `spark.yarn.submit.waitAppCompletion=false` is configured, the spark-submit process exits normally before the Spark application completes. As a result, when the Spark application fails, Kyuubi does not notice the failure and the session still waits for the full engine initialization timeout before giving up.
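   
   For illustration only, a minimal sketch (not Kyuubi code; the paths, jar and class below are placeholders mirroring the server log) of why the early exit hides failures: in YARN cluster mode with `spark.yarn.submit.waitAppCompletion=false`, spark-submit returns as soon as YARN accepts the application, so its exit code only reflects the submission, not the application's outcome.
   
   ```scala
   import scala.sys.process._
   
   // Placeholder spark-submit invocation, similar to the one in the server log.
   val cmd = Seq(
     "/path/to/bin/spark-submit",
     "--master", "yarn",
     "--deploy-mode", "cluster",
     "--conf", "spark.yarn.submit.waitAppCompletion=false",
     "--class", "org.apache.kyuubi.engine.spark.SparkSQLEngine",
     "/path/to/kyuubi-spark-sql-engine.jar")
   
   // With waitAppCompletion=false this returns right after YARN accepts the
   // application; exit code 0 only means "submitted", not "succeeded".
   val exitCode = cmd.!
   println(s"spark-submit exited with $exitCode; the application may still fail later")
   ```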
   
   ### Affects Version(s)
   
   master
   
   ### Kyuubi Server Log Output
   
   ```logtalk
   2022-01-19 08:50:01,593 ERROR 
[KyuubiThriftBinaryFrontendServiceHandler-Pool: Thread-13751]: 
org.apache.kyuubi.server.KyuubiThriftBinaryFrontendService(72) - Error opening 
session: 
   org.apache.kyuubi.KyuubiSQLException: Error opening session for cupid_dp 
client ip *.*.*.*, due to org.apache.kyuubi.KyuubiSQLException: 
Timeout(43200000 ms) to launched SPARK_SQL engine with /*/bin/spark-submit \
           --class org.apache.kyuubi.engine.spark.SparkSQLEngine \
           --conf spark.kyuubi.session.engine.initialize.timeout=43200000 \
   ......
   ```
   
   
   ### Kyuubi Engine Log Output
   
   ```logtalk
   22/01/18 20:51:59 ERROR ApplicationMaster: Uncaught exception: 
   java.util.concurrent.TimeoutException: Futures timed out after [100000 
milliseconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:259)
        at 
scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:263)
        at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:293)
        at 
org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:504)
        at 
org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:268)
        at 
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:899)
        at 
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:898)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
        at 
org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:898)
        at 
org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
   22/01/18 20:51:59 INFO ApplicationMaster: Final app status: FAILED, 
exitCode: 13, (reason: Uncaught exception: 
java.util.concurrent.TimeoutException: Futures timed out after [100000 
milliseconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:259)
        at 
scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:263)
        at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:293)
        at 
org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:504)
        at 
org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:268)
        at 
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:899)
        at 
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:898)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
        at 
org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:898)
        at 
org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
   ```
   
   
   ### Kyuubi Server Configurations
   
   ```yaml
   kyuubi.session.engine.initialize.timeout 43200000
   spark.yarn.submit.waitAppCompletion false
   ```
   
   
   ### Kyuubi Engine Configurations
   
   _No response_
   
   ### Additional context
   
   Thanks to @turboFei and @ulysses-you for their suggestions; there are two possible approaches (rough sketches of both follow this list):
   1. Remove the `spark.yarn.submit.waitAppCompletion=false` configuration and call `process.destroyForcibly()` in `org.apache.kyuubi.engine.ProcBuilder#close` to terminate the spark-submit process. (The process cannot be destroyed in client mode.)
   2. When the spark-submit process completes the submission normally, add a shell script to check the Spark application status.
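   
   A rough sketch of approach 1, assuming a simplified `ProcBuilder`-like class that keeps a reference to the launched spark-submit `java.lang.Process` (the class shape and field name are illustrative, not the actual Kyuubi code):
   
   ```scala
   // Simplified sketch, not the real org.apache.kyuubi.engine.ProcBuilder.
   class ProcBuilderSketch {
     // Assumed field holding the spark-submit java.lang.Process started by this builder.
     @volatile protected var process: Process = _
   
     def close(): Unit = {
       if (process != null) {
         // Once spark.yarn.submit.waitAppCompletion=false is removed, spark-submit
         // keeps running until the application finishes, so close() kills it
         // forcibly to stop the engine bootstrap when the session is abandoned.
         process.destroyForcibly()
       }
     }
   }
   ```
   
   And a sketch of approach 2, checking the application status after spark-submit returns. The `yarn application -status <appId>` CLI and the parsing of its report are used here only for illustration; how the application id is obtained is left out:
   
   ```scala
   import scala.sys.process._
   
   // Hypothetical helper: query YARN for the application's report and fail fast
   // if it has already terminated abnormally.
   def checkYarnAppStatus(appId: String): Unit = {
     val report = Seq("yarn", "application", "-status", appId).!!
     // The report contains lines such as "Final-State : FAILED".
     val failed = report.split("\n").exists { line =>
       val t = line.trim
       t.startsWith("Final-State") && (t.contains("FAILED") || t.contains("KILLED"))
     }
     if (failed) {
       throw new RuntimeException(s"Spark application $appId failed; aborting engine launch")
     }
   }
   ```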
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!

