Eric Liang created SPARK-20217:
----------------------------------
Summary: Executor should not fail stage if killed task throws non-interrupted exception
Key: SPARK-20217
URL: https://issues.apache.org/jira/browse/SPARK-20217
Project: Spark
Issue Type: Bug
Components: Spark Core
Affects Versions: 2.2.0
Reporter: Eric Liang
This is reproducible as follows. Run the job below, then use
SparkContext.killTaskAttempt to kill one of its tasks while it is sleeping.
The entire stage fails, because the killed task surfaced a RuntimeException
rather than an InterruptedException.
We should probably report TaskKilled instead of TaskFailed whenever the task
was killed by the driver, regardless of the exception actually thrown; a rough
sketch of that rule follows the reproduction below.
{code}
spark.range(100).repartition(100).foreach { i =>
  try {
    Thread.sleep(10000000)
  } catch {
    // User code may legitimately swallow the interrupt and rethrow it
    // wrapped in an unrelated exception type.
    case t: InterruptedException =>
      throw new RuntimeException(t)
  }
}
{code}
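For completeness, one way to drive the kill from spark-shell: run the job in
the background so the shell stays free to issue the kill. The task ID (42) is
a placeholder; in practice it would come from the Stages page of the web UI or
from a SparkListener's onTaskStart events.
{code}
// Run the job asynchronously so we can kill one of its tasks.
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

Future {
  spark.range(100).repartition(100).foreach { i =>
    try Thread.sleep(10000000) catch {
      case t: InterruptedException => throw new RuntimeException(t)
    }
  }
}

Thread.sleep(5000)      // give the tasks time to launch
sc.killTaskAttempt(42)  // 42 is a placeholder; interrupts the task thread by default
{code}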
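And a self-contained sketch of the proposed rule, simplified and standalone
rather than the actual Executor.scala code (the type and method names here are
illustrative): if the driver has marked the task as killed, the final status
is TaskKilled no matter what exception the user code threw.
{code}
// Illustrative only: these names do not match Executor.scala exactly.
sealed trait TaskEndReason
case class TaskKilled(reason: String) extends TaskEndReason
case class TaskFailed(error: Throwable) extends TaskEndReason

def classify(error: Throwable, reasonIfKilled: Option[String]): TaskEndReason =
  reasonIfKilled match {
    // The driver requested the kill: report TaskKilled even if user code
    // wrapped the InterruptedException in a RuntimeException.
    case Some(reason) => TaskKilled(reason)
    // No kill was requested: a genuine failure that should count
    // against the stage.
    case None => TaskFailed(error)
  }

assert(classify(new RuntimeException("wrapped interrupt"),
                Some("killed via SparkContext.killTaskAttempt"))
       == TaskKilled("killed via SparkContext.killTaskAttempt"))
{code}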