If you're really sure that 4 executors are running on one machine, then your
resource manager allowed it. What are you using, YARN? Check that you
really are limited to 40 vcores per machine in the YARN config.
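
FWIW, the usual reason this "works" on YARN is that the CapacityScheduler's
default DefaultResourceCalculator only looks at memory when it hands out
containers, so vcore requests are not enforced at all. As a rough sketch
(these are the standard YARN property names; the exact files and values on
your EMR cluster may differ), the two things to check are:

    <!-- yarn-site.xml: how many vcores each NodeManager advertises -->
    <property>
      <name>yarn.nodemanager.resource.cpu-vcores</name>
      <value>40</value>
    </property>

    <!-- capacity-scheduler.xml: make the scheduler count vcores,
         not just memory, when placing containers -->
    <property>
      <name>yarn.scheduler.capacity.resource-calculator</name>
      <value>org.apache.hadoop.yarn.util.resource.DominantResourceCalculator</value>
    </property>

With the default (memory-only) calculator, 4 executors x 11 vcores on a
40-vcore node will be scheduled happily as long as the memory fits.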

On Mon, Oct 24, 2016 at 3:33 PM TheGeorge1918 . <zhangxuan1...@gmail.com>
wrote:

> Hi all,
>
> I'm deeply confused by the executor configuration in Spark. I have two
> machines, each with 40 vcores. By mistake, I assigned 7 executors, each
> with 11 vcores (it ran without any problem). As a result, one machine got
> 4 executors and the other got 3 executors plus the driver. But this means
> the machine with 4 executors needs 4 x 11 = 44 vcores, which is more than
> the 40 vcores available on that machine. Am I missing something here?
> Thanks a lot.
>
> aws emr cluster:
> 2 x m4.10xlarge machine, each with 40 vcores, 160G memory
>
> spark:
> num executors: 7
> executor memory: 33G
> num cores: 11
> driver memory: 39G
> driver cores: 6
>
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
