start per box.
Yong
Date: Mon, 28 Sep 2015 20:47:18 -0700
Subject: Re: Setting executors per worker - Standalone
From: james.p...@gmail.com
To: zjf...@gmail.com
CC: user@spark.apache.org
Thanks for your help.
You were correct about the memory settings. Previously I had the following
config:

--executor-memory 8g --conf spark.executor.cores=1

Which was really conflicting, as in spark-env.sh I had:

export SPARK_WORKER_CORES=4
export SPARK_WORKER_MEMORY=8192m

So the memory budget per worker was already consumed by a single 8g executor,
leaving no room to launch any more.
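The conflict can be seen with a little arithmetic. The numbers below are the ones quoted in this thread; the floor division is a sketch of how the standalone Master fits executors into a worker's memory budget, not Spark code:

```shell
# Memory advertised by each worker (SPARK_WORKER_MEMORY=8192m)
WORKER_MEM_MB=8192
# Memory requested per executor (--executor-memory 8g)
EXEC_MEM_MB=8192
# Integer division: how many executors fit in the worker's memory budget
EXECUTORS_BY_MEM=$((WORKER_MEM_MB / EXEC_MEM_MB))
echo "$EXECUTORS_BY_MEM"   # 1 -- the whole budget goes to a single executor
```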
If you use "--executor-cores 1", you will get 4 executors per worker, since
you have 4 cores per worker.
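Put together, a worker's executor count is capped by both its cores and its memory. A sketch of that rule with this cluster's numbers (the 2g executor memory is an illustrative value chosen so that four executors fit, and taking the smaller of the two limits is an assumption about how the standalone scheduler packs executors, not an official formula):

```shell
WORKER_CORES=4        # SPARK_WORKER_CORES=4
WORKER_MEM_MB=8192    # SPARK_WORKER_MEMORY=8192m
EXEC_CORES=1          # --executor-cores 1
EXEC_MEM_MB=2048      # --executor-memory 2g (illustrative)

BY_CORES=$((WORKER_CORES / EXEC_CORES))   # limit imposed by cores
BY_MEM=$((WORKER_MEM_MB / EXEC_MEM_MB))   # limit imposed by memory
# Effective executors per worker = the smaller of the two limits
EXECUTORS=$(( BY_CORES < BY_MEM ? BY_CORES : BY_MEM ))
echo "$EXECUTORS"   # 4
```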
On Tue, Sep 29, 2015 at 8:24 AM, James Pirz wrote:
> Hi,
>
> I am using Spark 1.5 (standalone mode) on a cluster with 10 nodes, while
> each machine has 12GB of RAM and 4 cores.
Thanks for your reply.
Setting it as

--conf spark.executor.cores=1

when I start spark-shell (as an example application) indeed sets the number
of cores per executor to 1 (it was 4 before), but I still have 1 executor
per worker. What I am really looking for is having 1 worker with 4 executors.
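Assuming the goal stated above (4 one-core executors per worker), one way to express it in standalone mode is sketched below. The master URL and the 2g executor memory are illustrative assumptions, not values from the thread; the point is that the per-executor memory must be small enough for four executors to fit in the worker's budget:

```shell
# spark-env.sh on each worker (as already configured above)
export SPARK_WORKER_CORES=4
export SPARK_WORKER_MEMORY=8192m

# Launch the application with 1-core executors small enough that
# four of them fit in the worker's 8 GB memory budget.
spark-shell --master spark://master:7077 \
  --executor-cores 1 \
  --executor-memory 2g
```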