[ https://issues.apache.org/jira/browse/LIVY-712?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16979271#comment-16979271 ]

Yiheng Wang commented on LIVY-712:
----------------------------------

This code was changed by this patch:
https://github.com/apache/incubator-livy/commit/ca4cad22968e1a2f88fa0ec262c1088812e3d251
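
For context, a minimal, purely illustrative Scala sketch (not the code from that
commit) of the kind of decision involved: deriving a batch session's final state
from the spark-submit exit code.

{code:scala}
// Hypothetical sketch, not the actual Livy source touched by the commit above.
// It only illustrates the decision at issue: mapping the spark-submit child
// process exit code to the batch session's final state.
object ExitCodeToState {
  sealed trait BatchState
  case object Success extends BatchState
  case object Dead    extends BatchState

  // If the final state is instead taken from the YARN application report,
  // a non-zero spark-submit exit can go unnoticed, which is the behaviour
  // reported in this issue.
  def stateFromExitCode(exitCode: Int): BatchState =
    if (exitCode == 0) Success else Dead

  def main(args: Array[String]): Unit =
    println(stateFromExitCode(1)) // prints Dead for a failed spark-submit
}
{code}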

[~jshao] Any suggestion about this?

> EMR 5.23/5.27 - Livy does not recognise that Spark job failed
> -------------------------------------------------------------
>
>                 Key: LIVY-712
>                 URL: https://issues.apache.org/jira/browse/LIVY-712
>             Project: Livy
>          Issue Type: Bug
>          Components: API
>    Affects Versions: 0.5.0, 0.6.0
>         Environment: AWS EMR 5.23/5.27, Scala
>            Reporter: Michal Sankot
>            Priority: Major
>              Labels: EMR, api, spark
>
> We've upgraded from AWS EMR 5.13 to 5.23 (Livy 0.4.0 -> 0.5.0, Spark 2.3.0 ->
> 2.4.0), and an issue appeared: when an exception is thrown during Spark job
> execution, Spark shuts down as if there were no problem and the job appears
> as Completed in EMR. So we are not notified when the system crashes. The same
> problem appears in EMR 5.27 (Livy 0.6.0, Spark 2.4.4).
> Is it something with Spark, or a known issue with Livy?
> In the Livy logs I see that spark-submit exits with code 1:
> {quote}{{05:34:59 WARN BatchSession$: spark-submit exited with code 1}}
> {quote}
> And then the Livy API reports that the batch state is
> {quote}{{"state": "success"}}
> {quote}
> How can it be made to work again?
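
As a sketch for checking the reported mismatch (the batch id 0 and the host
localhost:8998 below are assumptions; adjust them for the cluster), the batch
state can be fetched from Livy's REST API and compared with the spark-submit
exit code in the Livy log:

{code:scala}
// Minimal sketch: fetch the batch state that Livy reports over its REST API.
// Assumes Livy listens on localhost:8998 and a batch with id 0 exists.
import scala.io.Source

object CheckBatchState {
  def main(args: Array[String]): Unit = {
    val url  = "http://localhost:8998/batches/0"  // GET /batches/{batchId}
    val body = Source.fromURL(url).mkString
    println(body)  // the returned JSON includes a "state" field
    // On an affected cluster this prints "state":"success" even though the
    // Livy log shows: WARN BatchSession$: spark-submit exited with code 1
  }
}
{code}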



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
