Please use user@, not dev@
This message does not appear to be from your driver. It also doesn't say
you ran out of memory. It says you didn't tell YARN to let it use the
memory you want. Look at the memory overhead param and please search first
for related discussions.
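For reference, the setting being pointed at is the YARN executor memory overhead. A minimal sketch of raising it at submit time, assuming Spark 1.x on YARN; the 4g / 1024 values and the jar name are illustrative, not taken from this thread:

```shell
# Sketch: ask YARN for extra off-heap headroom beyond the executor heap,
# so the container is not killed for exceeding its allocation.
# spark.yarn.executor.memoryOverhead is in megabytes in Spark 1.x.
spark-submit \
  --master yarn \
  --executor-memory 4g \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  my-streaming-app.jar
```

If the overhead is left at its default, YARN may kill containers whose total (heap plus off-heap) footprint exceeds the requested allocation, which looks like an out-of-memory problem even though the JVM heap itself never filled up.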
On Apr 29, 2015 11:43 AM,
The memory leak could be related to this
https://issues.apache.org/jira/browse/SPARK-5967 defect that was resolved
in Spark 1.2.2 and 1.3.0.
@Sean
Will it be backported to CDH? I didn't find that bug in the CDH 5.4 release
notes.
2015-04-29 14:51 GMT+02:00 Conor Fennell conor.fenn...@altocloud.com:
The memory leak could be related to this
https://issues.apache.org/jira/browse/SPARK-5967 defect that was resolved
in Spark 1.2.2 and 1.3.0.
It also was a HashMap causing the issue.
-Conor
On Wed, Apr 29, 2015 at 12:01 PM, Sean Owen so...@cloudera.com wrote:
Please use user@, not dev@
Not sure what you mean. It's already in CDH, since CDH 5.4 = Spark 1.3.0.
(This isn't the place to ask about CDH)
I also don't think that's the problem. The process did not run out of
memory.
On Wed, Apr 29, 2015 at 2:08 PM, Serega Sheypak serega.shey...@gmail.com
wrote:
It could also be related to this:
https://issues.apache.org/jira/browse/SPARK-6737
This was fixed in Spark 1.3.1.