juliuszsompolski commented on a change in pull request #25611:
[SPARK-28901][SQL] SparkThriftServer's Cancel SQL Operation show it in JDBC Tab UI
URL: https://github.com/apache/spark/pull/25611#discussion_r321920608
##########
File path: sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala
##########
@@ -249,32 +261,43 @@ private[hive] class SparkExecuteStatementOperation(
       }
       dataTypes = result.queryExecution.analyzed.output.map(_.dataType).toArray
     } catch {
-      case e: HiveSQLException =>
-        if (getStatus().getState() == OperationState.CANCELED) {
-          return
+      // Actually do need to catch Throwable as some failures don't inherit from Exception and
+      // HiveServer will silently swallow them.
+      case e: Throwable =>
Review comment:
@AngersZhuuuu
I think I found one more problem:
If `cancel()` and `close()` are called very quickly after the query is started, they may both call `cleanup()` before any Spark jobs have been submitted. In that case, `sqlContext.sparkContext.cancelJobGroup(statementId)` does nothing. The `execute` thread can then start the jobs, get interrupted only afterwards, and exit through here, with no one left to cancel those jobs: they will keep running even though this execution has exited.
I think it can be fixed by:
```
case e: Throwable =>
  // In any case, cancel any remaining running jobs.
  // E.g. a cancel() operation could have called cleanup(), which canceled the jobs
  // before they started.
  if (statementId != null) {
    sqlContext.sparkContext.cancelJobGroup(statementId)
  }
```
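
For context on why the first cancel can be a no-op: `cancelJobGroup` only affects jobs already submitted under that group at call time. A minimal standalone sketch of the ordering (my own illustration, not the PR code; the object name, local master, and `statementId` value are made up):

```
import org.apache.spark.{SparkConf, SparkContext}

object CancelRaceSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("cancel-race-sketch"))
    // Hypothetical stand-in for the operation's statementId.
    val statementId = "stmt-1"

    // execute() tags its jobs with the statement's job group.
    sc.setJobGroup(statementId, "demo query", interruptOnCancel = true)

    // cleanup() path racing ahead: no jobs exist under the group yet,
    // so this cancel is a no-op.
    sc.cancelJobGroup(statementId)

    // execute() path: the job is submitted only now and runs to completion,
    // unaffected by the earlier cancel.
    sc.parallelize(1 to 1000, 10).map(_ * 2).count()

    // The proposed catch block re-issues the cancel at this point, which
    // covers any jobs that slipped past the first call.
    sc.cancelJobGroup(statementId)
    sc.stop()
  }
}
```

In the real operation the job runs on the `execute` thread rather than inline, but the ordering is the same: a cancel that fires before submission has nothing to act on, so re-issuing it in the catch block closes the window.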