You cannot dynamically change the number of cores per executor or cores
per task, but you can change the number of executors.
In one of my jobs I do something like this: when I know that I don't
need more than 4 executors, I kill all the other executors (assuming that
they don't hold any cached data).
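A minimal sketch of that pattern in Scala, assuming you run without the
dynamic-allocation scheduler managing executors for you. It uses the
`requestTotalExecutors` and `killExecutors` developer APIs on
`SparkContext`; the executor IDs passed to `killExecutors` are
illustrative, not real:

```scala
import org.apache.spark.sql.SparkSession

// Sketch: shrink the application to 4 executors once the heavy stage is done.
// requestTotalExecutors and killExecutors are @DeveloperApi methods on
// SparkContext and only take effect on cluster managers that support
// dynamic resizing (e.g. YARN, Kubernetes).
val spark = SparkSession.builder().appName("resize-sketch").getOrCreate()
val sc = spark.sparkContext

// Ask the cluster manager to settle at 4 executors in total.
sc.requestTotalExecutors(4, localityAwareTasks = 0,
  hostToLocalTaskCount = Map.empty)

// Executor IDs here are illustrative; in practice pick executors that hold
// no cached blocks (e.g. by inspecting the status tracker or the REST API).
sc.killExecutors(Seq("5", "6", "7"))
```

Note that if dynamic allocation is enabled, Spark's own allocation manager
may immediately request replacements for the executors you kill.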
Hi Donni,
Please check Spark dynamic allocation and the external shuffle service.
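For reference, a sketch of the settings that turn this on (values are
illustrative; set them in spark-defaults.conf or via --conf on
spark-submit):

```
# Let Spark add and remove executors based on the pending task backlog.
spark.dynamicAllocation.enabled              true
spark.dynamicAllocation.minExecutors         2
spark.dynamicAllocation.maxExecutors         20
spark.dynamicAllocation.executorIdleTimeout  60s

# The external shuffle service keeps shuffle files available even after
# the executor that wrote them is removed.
spark.shuffle.service.enabled                true
```

With this in place, Spark scales executors up for the shuffle-heavy first
task and releases idle ones during the second, without manual intervention.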
On Fri, 27 Apr 2018 at 2:52 AM, Donni Khan
wrote:
> Hi All,
>
> Is there any way to change the number of executors/cores while a Spark
> job is running?
> I have a Spark job containing two
Hi All,
Is there any way to change the number of executors/cores while a Spark
job is running?
I have a Spark job containing two tasks: the first task needs many
executors to run quickly; the second task has many input and output
operations and shuffling, so it needs few executors, otherwise it takes
a long time.