Sahil Takiar created HIVE-20273:
-----------------------------------

             Summary: Spark jobs aren't cancelled if getSparkJobInfo or getSparkStagesInfo
                 Key: HIVE-20273
                 URL: https://issues.apache.org/jira/browse/HIVE-20273
             Project: Hive
          Issue Type: Sub-task
          Components: Spark
            Reporter: Sahil Takiar
            Assignee: Sahil Takiar
HIVE-19053 and HIVE-19733 added handling of {{InterruptedException}} to {{#getSparkJobInfo}} and {{#getSparkStagesInfo}} in {{RemoteSparkJobStatus}}, but as a result the {{InterruptedException}} is wrapped in a {{HiveException}} and then thrown. That {{HiveException}} is then caught in {{RemoteSparkJobMonitor}} and wrapped in another {{HiveException}}. The double nesting of {{HiveException}}s breaks the logic in {{SparkTask#setSparkException}}, so the job isn't killed when an {{InterruptedException}} is thrown.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
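A minimal sketch (not the actual Hive code; class and method names here are hypothetical stand-ins) of why double-wrapping defeats a cause check that only looks one level deep, and how walking the full cause chain avoids the problem:

```java
// Sketch: a single-level getCause() check misses an InterruptedException
// that has been wrapped in two layers of HiveException.
public class DoubleWrapDemo {

    // Stand-in for org.apache.hadoop.hive.ql.metadata.HiveException
    static class HiveException extends Exception {
        HiveException(String msg, Throwable cause) { super(msg, cause); }
    }

    // Fragile check: inspects only the immediate cause (one level deep).
    static boolean isInterruptedShallow(Throwable t) {
        return t.getCause() instanceof InterruptedException;
    }

    // Robust check: walks the entire cause chain.
    static boolean isInterruptedDeep(Throwable t) {
        for (Throwable c = t; c != null; c = c.getCause()) {
            if (c instanceof InterruptedException) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        InterruptedException ie = new InterruptedException("query cancelled");
        // First wrap, as in RemoteSparkJobStatus#getSparkJobInfo:
        HiveException once = new HiveException("failed to get job info", ie);
        // Second wrap, as in RemoteSparkJobMonitor:
        HiveException twice = new HiveException("monitor failed", once);

        System.out.println(isInterruptedShallow(once));  // detected
        System.out.println(isInterruptedShallow(twice)); // missed: nested too deep
        System.out.println(isInterruptedDeep(twice));    // detected
    }
}
```

The deep-walk variant is the usual fix for this class of bug: detection logic should never assume how many wrapping layers sit between it and the root cause.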