Re: Spark 1.5.2 memory error

2016-02-03 Thread Nirav Patel

Re: Spark 1.5.2 memory error

2016-02-03 Thread Nirav Patel

Re: Spark 1.5.2 memory error

2016-02-03 Thread Jerry Lam

RE: Spark 1.5.2 memory error

2016-02-03 Thread Mohammed Guller
From: Nirav Patel [mailto:npa...@xactlycorp.com]. Sent: Wednesday, February 3, 2016 11:31 AM. To: Stefan Panayotov. Cc: Jim Green; Ted Yu; Jakob Odersky; user@spark.apache.org. Subject: Re: Spark 1.5.2 memory error. Hi Stefan, Welcome to the OOM - heap space club. I have been struggling with s…

Re: Spark 1.5.2 memory error

2016-02-03 Thread Nirav Patel

Re: Spark 1.5.2 memory error

2016-02-03 Thread Rishabh Wadhawan

Re: Spark 1.5.2 memory error

2016-02-03 Thread Nirav Patel
Mohammed Guller <moham...@glassbeam.com> wrote: Nirav, Sorry to hear about your experience with Spark; however, "sucks" is a very strong word. Many organizations are process…

Re: Spark 1.5.2 memory error

2016-02-03 Thread Ted Yu
…more than 150GB of data with Spark. Mohammed. Author: Big Data Analytics with Spark <http://www.amazon.com/Big-Data-Analytics-Spark-Practitioners/dp/1484209656/>

Re: Spark 1.5.2 memory error

2016-02-03 Thread Nirav Patel
…mpl.java:removeOrTrackCompletedContainersFromContext(529)) - Removed completed containers from NM context: [container_1454509557526_0014_01_93]. I'll appreciate any suggestions. Thanks, Stefan Panayotov, PhD. Home: 610-355-0919. Cell: 610-517-5586. email…

RE: Spark 1.5.2 memory error

2016-02-03 Thread Stefan Panayotov
Re: Spark 1.5.2 memory error. From: openkbi...@gmail.com. To: spanayo...@msn.com. Cc: yuzhih...@gmail.com; ja...@odersky.com; user@spark.apache.org. Look at part #3 in the blog below: http://www.openkb.info/2015/06/resource-allocation-configurations-for.html You may want to increase the executor mem…
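The advice here boils down to giving the executor JVM and its YARN overhead more room. A minimal sketch of such a submit command for Spark 1.5.x on YARN (the flag values and the application jar name are illustrative placeholders, not tuned recommendations):

```shell
# Sketch only: raise executor heap and the off-heap YARN overhead.
# 16g heap is the value Stefan mentions; 2048 MB overhead is an
# illustrative bump above the 10% default.
spark-submit \
  --master yarn \
  --executor-memory 16g \
  --conf spark.yarn.executor.memoryOverhead=2048 \
  your-app.jar
```

Note that `spark.yarn.executor.memoryOverhead` is specified in megabytes, and YARN will request a container sized at the sum of the two settings.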

Re: Spark 1.5.2 memory error

2016-02-03 Thread Rishabh Wadhawan

Spark 1.5.2 memory error

2016-02-02 Thread Stefan Panayotov
Hi Guys, I need help with Spark memory errors when executing ML pipelines. The error that I see is:
16/02/02 20:34:17 INFO Executor: Executor is trying to kill task 32.0 in stage 32.0 (TID 3298)
16/02/02 20:34:17 INFO Executor: Executor is trying to kill task 12.0 in stage 32.0 (TID 3278)

Re: Spark 1.5.2 memory error

2016-02-02 Thread Jakob Odersky
Can you share some code that produces the error? It is probably not due to Spark itself but rather to the way data is handled in the user code. Does your code call any reduceByKey operations? These are often a source of OOM errors.
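Jakob's point about keyed aggregations can be made concrete with a plain-Scala analogy (local collections only, not Spark API code): grouping buffers every value for a key before combining, while an incremental fold merges values as they arrive. This is roughly why reduceByKey, with its map-side combine, is gentler on memory than groupByKey followed by a reduce.

```scala
// Local-collection analogy for the two aggregation styles.
object ReduceByKeyAnalogy {
  val pairs = Seq(("a", 1), ("b", 2), ("a", 3), ("b", 4), ("a", 5))

  // groupByKey-style: materializes the full value list per key first,
  // which on a skewed key can blow the executor heap.
  def grouped: Map[String, Int] =
    pairs.groupBy(_._1).map { case (k, vs) => k -> vs.map(_._2).sum }

  // reduceByKey-style: folds each value into a running sum, so only
  // one accumulated value per key is ever held in memory.
  def reduced: Map[String, Int] =
    pairs.foldLeft(Map.empty[String, Int]) {
      case (acc, (k, v)) => acc.updated(k, acc.getOrElse(k, 0) + v)
    }
}
```

In actual Spark code the same contrast is `rdd.groupByKey().mapValues(_.sum)` versus `rdd.reduceByKey(_ + _)`.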

RE: Spark 1.5.2 memory error

2016-02-02 Thread Stefan Panayotov
For the memoryOverhead I have the default of 10% of 16g, and the Spark version is 1.5.2. Stefan Panayotov, PhD. Sent from Outlook Mail for Windows 10 phone.
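Stefan's "default of 10% of 16g" can be sanity-checked against the overhead formula documented for Spark 1.5.x, max(384 MB, 0.10 * executor memory); the arithmetic below assumes that formula:

```scala
// Back-of-envelope check of the default YARN overhead for a 16g executor,
// assuming the Spark 1.5.x formula max(384 MB, 10% of executor memory).
object OverheadMath {
  val executorMemoryMb = 16 * 1024                            // --executor-memory 16g
  val overheadMb = math.max(384, (0.10 * executorMemoryMb).toInt)
  // YARN sizes the container as executor heap plus the overhead.
  val containerMb = executorMemoryMb + overheadMb
}
```

So each executor's YARN container request is about 16384 + 1638 MB; if the off-heap usage exceeds that ~1.6 GB cushion, YARN kills the container, which is one common shape of this class of error.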

Re: Spark 1.5.2 memory error

2016-02-02 Thread Ted Yu
What value do you use for spark.yarn.executor.memoryOverhead? Please see https://spark.apache.org/docs/latest/running-on-yarn.html for a description of the parameter. Which Spark release are you using? Cheers

Re: Spark 1.5.2 memory error

2016-02-02 Thread Jim Green