It could be related to this:
https://issues.apache.org/jira/browse/SPARK-6737

This was fixed in Spark 1.3.1.



On Wed, Apr 29, 2015 at 8:38 AM, Sean Owen <so...@cloudera.com> wrote:

> Not sure what you mean. It's already in CDH, since CDH 5.4 ships Spark 1.3.0.
> (This isn't the place to ask about CDH.)
> I also don't think that's the problem. The process did not run out of
> memory.
>
> On Wed, Apr 29, 2015 at 2:08 PM, Serega Sheypak <serega.shey...@gmail.com>
> wrote:
>
>> >The memory leak could be related to this
>> <https://issues.apache.org/jira/browse/SPARK-5967> defect that was
>> resolved in Spark 1.2.2 and 1.3.0.
>> @Sean
>> Will it be backported to CDH? I didn't find that bug in the CDH 5.4 release
>> notes.
>>
>> 2015-04-29 14:51 GMT+02:00 Conor Fennell <conor.fenn...@altocloud.com>:
>>
>>> The memory leak could be related to this
>>> <https://issues.apache.org/jira/browse/SPARK-5967> defect that was
>>> resolved in Spark 1.2.2 and 1.3.0.
>>>
>>> It was also a HashMap that caused the issue.
>>>
>>> -Conor
>>>
>>>
>>>
>>> On Wed, Apr 29, 2015 at 12:01 PM, Sean Owen <so...@cloudera.com> wrote:
>>>
>>>> Please use user@, not dev@
>>>>
>>>> This message does not appear to be from your driver. It also doesn't
>>>> say you ran out of memory. It says you didn't tell YARN to let it use the
>>>> memory you want. Look at the memory overhead param and please search first
>>>> for related discussions.
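>>>>
>>>> For illustration, a minimal sketch of what raising that parameter could
>>>> look like, assuming spark-submit against YARN on a 1.2+ build (the 512 MB
>>>> value and the class/jar names here are placeholders, not recommendations):
>>>>
>>>>   # Reserve extra off-heap room in the driver's YARN container, on top of --driver-memory
>>>>   spark-submit \
>>>>     --master yarn-cluster \
>>>>     --driver-memory 2g \
>>>>     --conf spark.yarn.driver.memoryOverhead=512 \
>>>>     --class com.example.StreamingJob \
>>>>     streaming-job.jar
>>>>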
>>>> On Apr 29, 2015 11:43 AM, "wyphao.2007" <wyphao.2...@163.com> wrote:
>>>>
>>>>> Hi, dear developers, I am using Spark Streaming to read data from
>>>>> Kafka. The program had already run for about 120 hours, but today it
>>>>> failed because of the driver's OOM, as follows:
>>>>>
>>>>> Container
>>>>> [pid=49133,containerID=container_1429773909253_0050_02_000001] is running
>>>>> beyond physical memory limits. Current usage: 2.5 GB of 2.5 GB physical
>>>>> memory used; 3.2 GB of 50 GB virtual memory used. Killing container.
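>>>>>
>>>>> For context, a rough reconstruction of that 2.5 GB limit, assuming the
>>>>> default spark.yarn.driver.memoryOverhead of max(384 MB, 7% of driver
>>>>> memory) on this Spark line and a 512 MB YARN allocation increment (both
>>>>> assumptions worth checking against the actual configuration):
>>>>>
>>>>>     2048 MB  (--driver-memory 2g, the driver heap)
>>>>>   +  384 MB  (default driver memory overhead)
>>>>>   = 2432 MB  requested, rounded up by YARN to 2560 MB = 2.5 GB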
>>>>>
>>>>> I set --driver-memory to 2g. In my mind, the driver is responsible for
>>>>> job scheduling and job monitoring (please correct me if I'm wrong), so
>>>>> why is it using so much memory?
>>>>>
>>>>> So I used jmap to monitor another program (which had already run for
>>>>> about 48 hours):
>>>>> sudo /home/q/java7/jdk1.7.0_45/bin/jmap -histo:live 31256
>>>>> The result is as follows: the java.util.HashMap$Entry and java.lang.Long
>>>>> objects were using about 600 MB of memory!
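>>>>>
>>>>> As a side note, one way to dig deeper than the histogram is to take a full
>>>>> heap dump for offline analysis and to repeat the live histogram over time
>>>>> to see which classes keep growing; a sketch, reusing the jmap and pid from
>>>>> above (the dump path is a placeholder, and the :live option forces a full GC):
>>>>>
>>>>>   # Dump live objects for analysis in a tool such as Eclipse MAT
>>>>>   sudo /home/q/java7/jdk1.7.0_45/bin/jmap -dump:live,format=b,file=/tmp/driver-heap.hprof 31256
>>>>>
>>>>>   # Re-run the histogram periodically and compare the top entries over time
>>>>>   sudo /home/q/java7/jdk1.7.0_45/bin/jmap -histo:live 31256 | head -n 20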
>>>>>
>>>>> I also used jmap to monitor another program (which had run for about 1
>>>>> hour); there the java.util.HashMap$Entry and java.lang.Long objects were
>>>>> not using nearly as much memory. But I found that, as time goes by, the
>>>>> java.util.HashMap$Entry and java.lang.Long objects occupy more and more
>>>>> memory.
>>>>> Is this a memory leak in the driver, or is there some other reason?
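>>>>>
>>>>> For what it's worth, one well-known source of slowly growing HashMap
>>>>> entries on a long-running driver is the job/stage history retained for
>>>>> the web UI; it can be bounded with settings along these lines (a sketch
>>>>> only; exact property names and defaults depend on the Spark version):
>>>>>
>>>>>   # Cap how many finished jobs/stages the driver keeps for the UI (defaults are 1000 each)
>>>>>   spark-submit \
>>>>>     --conf spark.ui.retainedJobs=200 \
>>>>>     --conf spark.ui.retainedStages=200 \
>>>>>     ...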
>>>>>
>>>>> Thanks
>>>>> Best Regards
>>>>>
>>>
>>
>
