TopGunViper opened a new pull request #23509: SPARK-26588: Idle executor should properly be killed when no job is su…
URL: https://github.com/apache/spark/pull/23509
 
 
   ## What changes were proposed in this pull request?
   
   I enabled the dynamic allocation feature with **spark-shell** and did not submit any task. After **spark.dynamicAllocation.executorIdleTimeout** seconds (default 60s), there was still one active executor, which is abnormal: all idle executors have timed out and should be removed, since the default **spark.dynamicAllocation.minExecutors** is 0. The spark-shell command is shown below:
   
   ```
   spark-shell --master=yarn \
     --conf spark.ui.port=8040 \
     --conf spark.dynamicAllocation.enabled=true \
     --conf spark.dynamicAllocation.maxExecutors=8 \
     --conf spark.dynamicAllocation.initialExecutors=4 \
     --conf spark.shuffle.service.enabled=true
   ```
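   The expected behavior can be sketched as follows. This is a simplified, hypothetical model (not Spark's actual `ExecutorAllocationManager` code): every executor idle for longer than `executorIdleTimeout` should be eligible for removal, down to `minExecutors`.
   
   ```python
   # Hypothetical sketch of the intended idle-timeout policy; the function
   # name and signature are illustrative, not Spark internals.
   def executors_to_remove(idle_since, now, idle_timeout, total, min_executors):
       """idle_since: dict of executor_id -> timestamp when it became idle.
       Returns the executors that should be removed."""
       expired = [eid for eid, t in sorted(idle_since.items())
                  if now - t >= idle_timeout]
       # Never drop below the configured minimum number of executors.
       removable = max(0, total - min_executors)
       return expired[:removable]
   
   # Scenario from this report: 4 initial executors, all idle past the
   # 60s default timeout, minExecutors=0 -> all four should be removed,
   # yet one executor survives, which is the bug being fixed.
   now = 1000.0
   idle = {f"exec-{i}": now - 61.0 for i in range(1, 5)}
   print(executors_to_remove(idle, now, 60.0, total=4, min_executors=0))
   ```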
   
   ## How was this patch tested?
   unit tests
   