Yes, this is a change in Spark 2.0. You can take a look at 
https://issues.apache.org/jira/browse/SPARK-13723

In the latest Spark on YARN documentation for Spark 2.0 
<http://spark.apache.org/docs/latest/running-on-yarn.html>, there is an 
updated description for --num-executors:
> spark.executor.instances (default: 2): The number of executors for static 
> allocation. With spark.dynamicAllocation.enabled, the initial set of 
> executors will be at least this large.
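As a minimal sketch of the new behavior (assuming the YARN external shuffle 
service is already configured, and using a placeholder class name and jar), a 
Spark 2.0 submission like this starts with at least 4 executors but can still 
scale up or down at runtime:

    # Dynamic allocation stays in effect; --num-executors only sets the
    # initial executor count under Spark 2.0
    spark-submit \
      --master yarn \
      --num-executors 4 \
      --conf spark.dynamicAllocation.enabled=true \
      --conf spark.shuffle.service.enabled=true \
      --class com.example.MyApp \
      myapp.jar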
You can disable dynamic allocation for a particular application by specifying 
"--conf spark.dynamicAllocation.enabled=false" on the command line.
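For example (again with a placeholder class name and jar), this runs a single 
job with the old static behavior, pinned to exactly 4 executors:

    # Dynamic allocation disabled for this job only; the job keeps a
    # static set of 4 executors for its whole lifetime
    spark-submit \
      --master yarn \
      --num-executors 4 \
      --conf spark.dynamicAllocation.enabled=false \
      --class com.example.MyApp \
      myapp.jar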

> On Jul 28, 2016, at 15:44, LONG WANG <wanglong_...@163.com> wrote:
> 
> Hi Spark Experts,
> 
> Today I tried Spark 2.0 on YARN with the Dynamic Resource Allocation 
> feature enabled, and I find that Dynamic Resource Allocation is used 
> whether or not I specify --num-executors in the spark-submit command. But 
> I remember that in Spark 1.6, when I specified the --num-executors option 
> in the spark-submit command, Dynamic Resource Allocation was not used for 
> that job, and I could see the log below in Spark 1.6:
> 
> [attachment: 截图1.png (screenshot of the Spark 1.6 log)]
> 
> Is this a behavior change in Spark 2.0? How can I disable Dynamic 
> Resource Allocation temporarily for a specific job submission, as before?
> 
