I noticed that it is configurable at the job level via spark.task.cpus. Is
there any way to support it at the task level?
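
For reference, a minimal sketch of the job-level knob I mean; the app name is
a placeholder, and the setting reserves the same number of CPUs for every task
in the application:

import org.apache.spark.{SparkConf, SparkContext}

// Job-level setting: every task in this application reserves 2 CPUs.
// Equivalent to passing --conf spark.task.cpus=2 to spark-submit.
val conf = new SparkConf()
  .setAppName("heavy-record-processing") // placeholder name
  .set("spark.task.cpus", "2")
val sc = new SparkContext(conf)

Because it is a single value for the whole application, light tasks end up
over-reserving cores whenever only the heavy ones need the headroom.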
Thanks.
Zhan Zhang
On Dec 11, 2015, at 10:46 AM, Zhan Zhang wrote:
> Hi Folks,
>
> Is it possible to assign multiple cores per task, and how? Suppose we have a
> scenario in which some tasks do really heavy processing of each record and
> require multi-threading, and we want to avoid similar tasks being assigned
> to the same executors/hosts.
Hi Folks,
Is it possible to assign multiple cores per task, and how? Suppose we have a
scenario in which some tasks do really heavy processing of each record and
require multi-threading, and we want to avoid similar tasks being assigned to
the same executors/hosts.
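
To make the scenario concrete, here is a minimal sketch (the record type, the
4-thread pool size, and the heavyProcess function are all made up for
illustration) of a task that multi-threads over its partition; today the
scheduler still books such a task as a single core unless spark.task.cpus is
raised for the whole job:

import java.util.concurrent.Executors
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration.Duration
import org.apache.spark.{SparkConf, SparkContext}

object HeavyTaskSketch {
  // Hypothetical CPU-heavy per-record function.
  def heavyProcess(record: String): String = record.reverse

  def main(args: Array[String]): Unit = {
    val sc  = new SparkContext(new SparkConf().setAppName("heavy-task-sketch"))
    val rdd = sc.parallelize(Seq("a", "b", "c", "d"), numSlices = 2)

    val processed = rdd.mapPartitions { records =>
      // Each task spins up its own 4-thread pool; the scheduler has no
      // idea these extra threads exist and still counts the task as
      // occupying one core.
      val pool = Executors.newFixedThreadPool(4)
      implicit val ec: ExecutionContext =
        ExecutionContext.fromExecutorService(pool)
      val results = records.toList
        .map(r => Future(heavyProcess(r)))     // fan records out to the pool
        .map(f => Await.result(f, Duration.Inf))
      pool.shutdown()
      results.iterator
    }

    processed.collect().foreach(println)
    sc.stop()
  }
}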
If it is not supported, does it