[ https://issues.apache.org/jira/browse/SPARK-8612?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-8612.
------------------------------
    Resolution: Duplicate

I believe so. I think Marcelo is following up on this general issue; there are a few tickets.

> Yarn application status is misreported for failed PySpark apps.
> ---------------------------------------------------------------
>
>                 Key: SPARK-8612
>                 URL: https://issues.apache.org/jira/browse/SPARK-8612
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.3.0, 1.3.1, 1.4.0
>         Environment: PySpark job run in yarn-client mode on CDH 5.4.2
>            Reporter: Juliet Hougland
>            Priority: Minor
>
> When a PySpark job fails, YARN records and reports its status as
> successful. Hari Shreedharan pointed out to me that [the ApplicationMaster
> records app success when System.exit is called. |
> https://github.com/apache/spark/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala#L124]
> PySpark always [exits by calling os._exit. |
> https://github.com/apache/spark/blob/master/python/pyspark/daemon.py#L169]
> Because of this, every PySpark application run on YARN is marked as completed
> successfully.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
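The failure mode described above can be reproduced outside Spark. A minimal Python sketch (not Spark code; the child one-liners are illustrative stand-ins for a failing PySpark worker): a process that surfaces failure through sys.exit reports a nonzero status to its parent, while a process that calls os._exit(0) reports success no matter what went wrong earlier, which is why a supervisor keyed on exit status, like the ApplicationMaster here, records the app as successful.

```python
import subprocess
import sys

# Child 1: failure surfaced the normal way -> parent sees nonzero status.
fail_normally = subprocess.run(
    [sys.executable, "-c", "import sys; sys.exit(1)"])

# Child 2: something failed, but the process then calls os._exit(0),
# which terminates immediately with status 0 and skips atexit handlers.
# The parent (standing in for YARN here) sees success.
fail_masked = subprocess.run(
    [sys.executable, "-c", "import os; os._exit(0)"])

print(fail_normally.returncode)  # 1
print(fail_masked.returncode)    # 0
```

Since os._exit bypasses normal interpreter shutdown entirely, any failure that occurred before the call never reaches the exit status the resource manager inspects.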