It's either --num-workers or --num-executors when using the spark-class 
interface directly.  If you use spark-submit with --num-executors, it ends up 
setting spark.executor.instances, which works around the issue.
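For illustration, a minimal sketch of the workaround described above; the master, class name, and jar are placeholders, not from the original report:

```shell
# Works: spark-submit translates --num-executors into the
# spark.executor.instances config property before launching.
spark-submit \
  --master yarn \
  --num-executors 4 \
  --class com.example.MyApp \
  myapp.jar

# Equivalent: set the property explicitly via --conf.
spark-submit \
  --master yarn \
  --conf spark.executor.instances=4 \
  --class com.example.MyApp \
  myapp.jar
```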
Tom 


     On Friday, November 6, 2015 2:14 PM, Marcelo Vanzin <van...@cloudera.com> 
wrote:
   

 The way I read Tom's report, it just affects a long-deprecated command
line option (--num-workers). I wouldn't block the release for it.

On Fri, Nov 6, 2015 at 12:10 PM, Sean Owen <so...@cloudera.com> wrote:
> Hm, if I read that right, it looks like --num-executors doesn't work at
> all on YARN unless dynamic allocation is on? The fix is easy, but it
> sounds like it could be a Blocker.
>
> On Fri, Nov 6, 2015 at 2:51 PM, Tom Graves <tgraves...@yahoo.com> wrote:
>>  While running our regression tests I found
>> https://issues.apache.org/jira/browse/SPARK-11555.  It is a break in
>> backwards compatibility, but it's using the old spark-class and --num-workers
>> interface, which I hope no one is still using.
>>
>> I'm a +0, as it doesn't seem super critical, but I hate to break backwards
>> compatibility unless we explicitly decide to.
>>
>>
>> Tom
>>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org
>



-- 
Marcelo
