It totally depends on the use case you are solving with Spark. For instance,
there was some discussion around this same question that you could read
over here:
http://apache-spark-user-list.1001560.n3.nabble.com/How-does-one-decide-no-of-executors-cores-memory-allocation-td23326.html
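
If it helps, here is a minimal sketch (not from the linked thread) of where
the relevant knobs are set. The app name and the numbers are illustrative
assumptions, not recommendations: executor memory in Spark is shared by all
tasks running concurrently on that executor, so a common starting point is
roughly the per-task estimate multiplied by spark.executor.cores.

    import org.apache.spark.{SparkConf, SparkContext}

    // Illustrative sizing sketch: assumes each task needs roughly 2 GB.
    val conf = new SparkConf()
      .setAppName("memory-sizing-sketch")   // hypothetical app name
      .set("spark.executor.cores", "4")     // tasks running concurrently in one executor
      .set("spark.executor.memory", "8g")   // ~2 GB per task * 4 cores (illustrative)

    val sc = new SparkContext(conf)

The actual numbers depend on the job (caching, shuffles, data skew), which is
why profiling your own workload, as discussed in the thread above, matters
more than any formula.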

Thanks
Best Regards

On Mon, Jun 22, 2015 at 10:57 AM, pth001 <patcharee.thong...@uni.no> wrote:

> Hi,
>
> How can I know how much memory each executor (with one core) needs to
> execute each job? If there are many cores per executor, will the memory
> needed be the product (memory needed per single-core executor * no. of
> cores)?
>
> Any suggestions/guidelines?
>
> BR,
> Patcharee
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
