You can open the application web UI (it runs on port 4040) and see how much
memory is allocated to each executor under the Executors tab, plus the
configured settings under the Environment tab.
Thanks
Best Regards
On Wed, Oct 22, 2014 at 9:55 PM, Holden Karau wrote:
Hi Michael Campbell,
Are you deploying against YARN or in standalone mode? On YARN, try setting the
shell variable SPARK_EXECUTOR_MEMORY=2G; in standalone mode, try setting
SPARK_WORKER_MEMORY=2G.
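The suggestion above, sketched as it would look in `conf/spark-env.sh` or the
launching shell (the 2G value is just the figure from this thread; tune it to
your cluster):

```shell
# Standalone mode: set in conf/spark-env.sh on each worker node.
# SPARK_WORKER_MEMORY caps the total memory a worker can give to executors.
export SPARK_WORKER_MEMORY=2G

# YARN mode: export before launching spark-shell / spark-submit.
# SPARK_EXECUTOR_MEMORY sets the heap requested for each executor.
export SPARK_EXECUTOR_MEMORY=2G
```

Both variables can also be expressed through `spark.executor.memory` in the
job configuration rather than the environment.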
Cheers,
Holden :)
On Thu, Oct 16, 2014 at 2:22 PM, Michael Campbell <
michael.campb...@gmail.com> wrote:
TL;DR - a Spark SQL job fails with an OOM (out of heap space) error. If given
an "--executor-memory" value, it won't even start. Even (!) if the value
given IS THE SAME AS THE DEFAULT.
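For reference, a submit command of the shape being described; the class name
and jar path are placeholders, and only the `--executor-memory` flag is the
point of the sketch:

```shell
# Hypothetical invocation; com.example.MySqlJob and the jar path are
# made up for illustration. 512M matches the Spark 1.x default that
# the poster says still fails when passed explicitly.
spark-submit \
  --class com.example.MySqlJob \
  --executor-memory 512M \
  /path/to/my-app.jar
```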
Without --executor-memory:
14/10/16 17:14:58 INFO TaskSetManager: Serialized task 1.0:64 as 14710
bytes in 1