cxzl25 commented on code in PR #2669:
URL: https://github.com/apache/incubator-kyuubi/pull/2669#discussion_r874831326
##########
kyuubi-server/src/test/scala/org/apache/kyuubi/engine/JpsApplicationOperationSuite.scala:
##########
@@ -81,16 +81,18 @@ class JpsApplicationOperationSuite extends KyuubiFunSuite {
assert(desc1.contains("id"))
assert(desc1("name").contains(id))
assert(desc1("state") === "RUNNING")
+ val response = jps.killApplicationByTag(id)
Review Comment:
The root cause of this problem is that `spark-submit` spawns two
processes: it first starts `org.apache.spark.launcher.Main
org.apache.spark.deploy.SparkSubmit` to build the submission arguments,
and then starts `org.apache.spark.deploy.SparkSubmit` to run the main logic.
The call to `getApplicationInfoByTag` succeeds because it sees the first
process, but `killApplicationByTag` sometimes runs exactly while the two
processes are being swapped, and at that instant `jps` shows no process
carrying the relevant tag.
Wrapping the kill in an `eventually` block seems to fix it.
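To illustrate why the retry helps: during the hand-off between the launcher process and the real `SparkSubmit` process there is a brief window where `jps` sees neither, so a single probe can fail even though the application is alive. Below is a minimal, self-contained sketch of that retry pattern (it stands in for ScalaTest's `eventually`; `killByTag` and the "invisible for the first two probes" behavior are simulated, not the real Kyuubi API):

```scala
import scala.annotation.tailrec

object RetryKillSketch {
  // Retry `op` until it returns true or the deadline (epoch millis) passes.
  // This mimics what ScalaTest's eventually(timeout, interval) does.
  @tailrec
  def eventually(deadlineMs: Long, intervalMs: Long)(op: () => Boolean): Boolean = {
    if (op()) true
    else if (System.currentTimeMillis() >= deadlineMs) false
    else {
      Thread.sleep(intervalMs)
      eventually(deadlineMs, intervalMs)(op)
    }
  }

  def main(args: Array[String]): Unit = {
    // Simulate the process hand-off: the tag is invisible to jps for the
    // first two probes (launcher exited, SparkSubmit not yet started), then
    // the kill succeeds on the third probe.
    var probes = 0
    def killByTag(): Boolean = { probes += 1; probes >= 3 }

    val killed = eventually(System.currentTimeMillis() + 10000L, 100L)(() => killByTag())
    println(s"killed=$killed, probes=$probes")
  }
}
```

A single un-retried call here would fail on the first probe, which is exactly the flake the test was hitting; the 10s deadline mirrors the `eventually` timeout in the suite.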
Is there still a failure mode, e.g. the engine taking more than 10s to
start, or the kill command itself failing?
In any case, now that we understand the cause, the next time we hit this
flake it should be quick to fix.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]