Re: [Worker Crashing] OutOfMemoryError: GC overhead limit exceeded

2017-03-24 Thread Yong Zhang
Yea, we also didn't find anything related to this online. Are you aware of any memory leaks in the worker in Spark 1.6.2 which might be causing this? Do you know of any documentation which explains…
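Neither message above names a specific leak, so treat the following as a generic way to hunt for one, not as the thread's resolution. Assuming the standard JDK tools (jps, jstat, jmap) are available on the worker host, you can watch the Worker daemon's GC behaviour and see which classes dominate its heap; the pid-discovery grep is just an illustration:

  # Find the standalone Worker daemon's pid by its main class.
  WORKER_PID=$(jps -l | grep org.apache.spark.deploy.worker.Worker | awk '{print $1}')

  # Sample GC and heap occupancy every 5 seconds; old-gen utilisation that
  # keeps climbing across full GCs is the classic leak signature.
  jstat -gcutil "$WORKER_PID" 5000

  # Histogram of live objects, largest first; rerun later and compare.
  jmap -histo:live "$WORKER_PID" | head -n 30

If a handful of classes keeps growing between snapshots, that points at what the daemon is retaining.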

Re: [Worker Crashing] OutOfMemoryError: GC overhead limit exceeded

2017-03-24 Thread Behroz Sikander
Thank you for the response. Yes, I am sure…

Re: [Worker Crashing] OutOfMemoryError: GC overhead limit exceeded

2017-03-24 Thread Yong Zhang
Thank you for the response. Yes, I am sure because the driver was working fine. Only 2 workers went down with OOM. Regards, Behroz. On Fri, Mar 24, 2017 at 2:12 PM, Yong Zhang <java8...@hotmail.com> wrote:…
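Behroz's check (driver up, workers down) can be verified on the hosts themselves. This is a generic verification sketch, not something prescribed in the thread; the log path follows the standalone deploy mode's default layout under $SPARK_HOME/logs and may differ on your install:

  # Is the Worker JVM still running on the affected host?
  jps -l | grep org.apache.spark.deploy.worker.Worker

  # The daemon log usually records the fatal error before the process dies.
  grep -n "OutOfMemoryError" "$SPARK_HOME"/logs/spark-*-org.apache.spark.deploy.worker.Worker-*.out

An OOM recorded in the Worker's own .out log, rather than in an executor's stderr, confirms it was the daemon itself and not a task that ran out of heap.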

Re: [Worker Crashing] OutOfMemoryError: GC overhead limit exceeded

2017-03-24 Thread Behroz Sikander
…the driver OOM. Are you sure your workers OOM? Yong

Re: [Worker Crashing] OutOfMemoryError: GC overhead limit exceeded

2017-03-24 Thread Yong Zhang
…Memory? How can we avoid that in the future? View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Worker-Crashing-OutOfMemoryError-GC-overhead-limit-execeeded-tp28535.html

[Worker Crashing] OutOfMemoryError: GC overhead limit exceeded

2017-03-24 Thread bsikander
…avoid that in the future? View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Worker-Crashing-OutOfMemoryError-GC-overhead-limit-execeeded-tp28535.html. Sent from the Apache Spark User List mailing list archive at Nabble.com.

[Worker Crashing] OutOfMemoryError: GC overhead limit exceeded

2017-03-23 Thread Behroz Sikander
Hello,
Spark version: 1.6.2
Hadoop: 2.6.0
Cluster: all VMs are deployed on AWS.
1 Master (t2.large)
1 Secondary Master (t2.large)
5 Workers (m4.xlarge)
Zookeeper (t2.large)
Recently, 2 of our workers went down with an out of memory exception:
> java.lang.OutOfMemoryError: GC overhead limit exceeded…
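The archived thread cuts off before any resolution, so the following is only a sketch of the configuration knobs commonly adjusted when the standalone Worker daemon itself (not an executor) dies with this error. The environment variables and property names are real Spark 1.x settings, but the values are illustrative assumptions, not a confirmed fix:

  # spark-env.sh on each worker host

  # Heap for the standalone daemons (Master/Worker); the default is 1g,
  # which a long-running Worker can outgrow.
  export SPARK_DAEMON_MEMORY=2g

  # Bound the finished-executor/driver metadata the Worker retains for its
  # UI, and capture a heap dump for post-mortem analysis on the next OOM.
  export SPARK_DAEMON_JAVA_OPTS="\
    -Dspark.worker.ui.retainedExecutors=100 \
    -Dspark.worker.ui.retainedDrivers=100 \
    -XX:+HeapDumpOnOutOfMemoryError \
    -XX:HeapDumpPath=/tmp/spark-worker-oom.hprof"

Lowering the retained-UI caps (they default to 1000 entries each) reduces what a Worker accumulates across many short-lived applications, which is one common way for the daemon heap to fill up over time.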