I understand that I should set the executor memory. I tried with the parameters
below, but the OOM still occurs...

./spark-submit --class main.scala.Test1 --master local[8] \
  --driver-memory 20g --executor-memory 20g
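
(Side note: with --master local[8] there is no separate executor process; tasks run
inside the driver JVM, so the heap is effectively governed by --driver-memory and
--executor-memory has nothing to apply to.) A rough sketch of the same submit against
a cluster master, where --executor-memory does take effect (the master URL, jar path
and memory values below are only placeholders):

./spark-submit --class main.scala.Test1 \
  --master spark://master-host:7077 \
  --driver-memory 4g \
  --executor-memory 20g \
  /path/to/test1.jar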

 

From: Sean Owen [mailto:so...@cloudera.com] 
Sent: Monday, November 7, 2016 12:21 PM
To: Kürşat Kurt <kur...@kursatkurt.com>; user@spark.apache.org
Subject: Re: Out of memory at 60GB free memory.

 

You say "out of memory", and you allocate a huge amount of driver memory, but it's
your executor that's running out of memory. You want --executor-memory.
You can't set it after the driver has run.
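
To make that concrete, a rough sketch of the launch-time ways to set it (the jar
path is a placeholder):

./spark-submit --executor-memory 20g --class main.scala.Test1 /path/to/test1.jar

or, equivalently, in conf/spark-defaults.conf before launching:

spark.executor.memory  20g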

On Mon, Nov 7, 2016 at 5:35 AM Kürşat Kurt <kur...@kursatkurt.com> wrote:

Hi;

I am trying to use Naive Bayes for multi-class classification.

I am getting an OOM at the "pipeline.fit(train)" line. When I submit the code,
everything is OK up to the stage "collect at NaiveBayes.scala:400".

At this stage, the first ~375 tasks start very fast and then it slows down. The
task count never reaches 500; I get the OOM around the 380th-390th task.
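
For context, a rough sketch of the kind of spark.ml pipeline described above; the
feature stages, column names and input path are assumptions, only the NaiveBayes
stage and the pipeline.fit(train) call come from this report:

import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.classification.NaiveBayes
import org.apache.spark.ml.feature.{HashingTF, Tokenizer}
import org.apache.spark.sql.SparkSession

object Test1 {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("Test1").getOrCreate()

    // Assumed input: a DataFrame with "label" and "text" columns (path is a placeholder).
    val train = spark.read.parquet("/path/to/train")

    val tokenizer = new Tokenizer().setInputCol("text").setOutputCol("words")
    val tf = new HashingTF().setInputCol("words").setOutputCol("features")
    val nb = new NaiveBayes().setLabelCol("label").setFeaturesCol("features")
    val pipeline = new Pipeline().setStages(Array(tokenizer, tf, nb))

    // The OOM reported above happens here, during the
    // "collect at NaiveBayes.scala:400" stage of the fit.
    val model = pipeline.fit(train)

    spark.stop()
  }
}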

 
