[ 
https://issues.apache.org/jira/browse/SPARK-2604?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Twinkle Sachdeva updated SPARK-2604:
------------------------------------

    Description: 
In a YARN environment, let's say:
MaxAM = maximum allocatable memory (per the YARN config)
ExecMem = requested executor memory

if (MaxAM >= ExecMem && (MaxAM - ExecMem) < 384m)
  then the maximum-resource validation does not flag the executor memory and the
application master gets launched; but the actual container request is ExecMem
plus the ~384 MB overhead, which exceeds MaxAM, so when the resource is
allocated and validated again it is returned, and the application appears to hang.

The typical way to hit this is to ask for executor memory equal to the maximum
allowed memory as per the YARN config.
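
A minimal sketch of the mismatch (illustrative only, not the actual Spark code;
the object/method names and the 8192 MB maximum are hypothetical, and the 384 MB
default overhead is assumed):

object MemoryCheckSketch {
  val memoryOverheadMb = 384                    // assumed default memory overhead

  // What the client-side validation effectively looks at: the bare executor memory.
  def clientSideCheckPasses(execMemMb: Int, maxAmMb: Int): Boolean =
    execMemMb <= maxAmMb

  // What is actually requested from YARN: executor memory plus the overhead.
  def actualRequestMb(execMemMb: Int): Int =
    execMemMb + memoryOverheadMb

  def main(args: Array[String]): Unit = {
    val maxAmMb = 8192      // hypothetical maximum allocatable memory
    val execMemMb = 8192    // typical case: ask for the maximum

    println(s"validation passes: ${clientSideCheckPasses(execMemMb, maxAmMb)}")  // true
    println(s"actual request: ${actualRequestMb(execMemMb)} MB vs max $maxAmMb MB")
    // 8576 MB > 8192 MB, so the executor containers can never be granted and the app hangs.
  }
}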

> Spark Application hangs on yarn in edge case scenario of executor memory 
> requirement
> ------------------------------------------------------------------------------------
>
>                 Key: SPARK-2604
>                 URL: https://issues.apache.org/jira/browse/SPARK-2604
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.0.0
>            Reporter: Twinkle Sachdeva
>



--
This message was sent by Atlassian JIRA
(v6.2#6252)
