On Wed, 11 Mar 2015 11:19:56 +0100
Marcin Cylke <marcin.cy...@ext.allegro.pl> wrote:

> Hi
> 
> I'm trying to do a join of two datasets: 800GB with ~50MB.

The job finishes if I set spark.yarn.executor.memoryOverhead to 2048MB.
If it is around 1000MB it fails with "executor lost" errors.

My Spark settings are:

- executor cores - 8
- num executors - 32
- executor memory - 4g
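For reference, these settings map onto a spark-submit invocation like the one below (a sketch only: option names are for Spark 1.x on YARN, `yarn-cluster` mode and the jar name are placeholders, and 2048 is the overhead value that worked above):

```shell
# Sketch of the job submission with the settings listed above.
# spark.yarn.executor.memoryOverhead is the off-heap headroom (in MB)
# YARN adds on top of --executor-memory for each executor container.
spark-submit \
  --master yarn-cluster \
  --num-executors 32 \
  --executor-cores 8 \
  --executor-memory 4g \
  --conf spark.yarn.executor.memoryOverhead=2048 \
  my-join-job.jar
```

With ~1000MB overhead the containers presumably exceed their YARN memory limit during the shuffle and get killed, which surfaces as the "executor lost" errors.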

Regards
Marcin

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
