I don't know the exact relation between the DAG size and the AppMaster
memory yet; maybe others can fill in.
When the situation you mentioned happens, I just raise the AppMaster
memory by a few GB.
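
For reference, a minimal properties sketch for raising the AppMaster
memory, using the MASTER_MEMORY_MB attribute that appears later in this
thread (the <APP_NAME> placeholder and the 8192 value are just examples;
adjust for your setup):

<property>
    <name>dt.application.<APP_NAME>.attr.MASTER_MEMORY_MB</name>
    <value>8192</value>
</property>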

On Tue, Jul 12, 2016 at 2:33 PM Raja.Aravapalli <[email protected]>
wrote:

>
> Sure Sandesh Thanks.
>
> Also, one quick question,
>
> When does the size/memory of the Application Master grow?
>
> Does the AM's memory depend on the number of operators in the pipeline?
>
> One issue I observed with my DAG:
>
> The memory of the Application Master keeps growing, and after it reaches
> the maximum allowed memory it is killed; after the maximum allowed
> attempts, the entire DAG fails!
>
> I wish to know why the memory of my AM is growing and how to control it,
> so that the Application Master doesn't fail and take the entire DAG down
> with it!
>
>
> Regards,
> Raja.
>
> From: Sandesh Hegde <[email protected]>
> Reply-To: "[email protected]" <[email protected]>
> Date: Tuesday, July 12, 2016 at 2:43 PM
>
> To: "[email protected]" <[email protected]>
> Subject: Re: DAG is failing due to memory issues
>
> UI Memory = Total Memory - AppMaster Memory
>
> DAG size can vary between setups; that happens because the maximum
> container size is defined by the YARN parameter mentioned above.
>
> Apex does the following:
>
> if (csr.container.getRequiredMemoryMB() > maxMem) {
>   LOG.warn("Container memory {}m above max threshold of cluster. Using max value {}m.",
>       csr.container.getRequiredMemoryMB(), maxMem);
>   csr.container.setRequiredMemoryMB(maxMem);
> }
>
>
> On Tue, Jul 12, 2016 at 10:21 AM Raja.Aravapalli <
> [email protected]> wrote:
>
>>
>> Hi,
>>
>>
>> What memory does the “allocated mem.” on the UI refer to for a DAG?
>> The Application Master's memory, or the operators' container memory?
>>
>>
>>
>>
>> I included the properties below as well and re-triggered the DAG; it is
>> still showing only 32GB!
>>
>> <property>
>>     <name>dt.application.<APP_NAME>.attr.MASTER_MEMORY_MB</name>
>>     <value>4096</value>
>> </property>
>>
>> <property>
>>     <name>dt.application.<APP_NAME>.operator.*.attr.MEMORY_MB</name>
>>     <value>4096</value>
>> </property>
>>
>>
>>
>> I have the same DAG running in another Hadoop environment, where it shows
>> approx. 125GB, but in this environment only 32GB, which is what I assume
>> to be the problem!
>>
>>
>> Regards,
>> Raja.
>>
>>
>> From: Sandesh Hegde <[email protected]>
>> Reply-To: "[email protected]" <[email protected]>
>> Date: Tuesday, July 12, 2016 at 11:35 AM
>>
>> To: "[email protected]" <[email protected]>
>> Subject: Re: DAG is failing due to memory issues
>>
>> Raja,
>>
>> Please increase the container size and launch the app again.
>> yarn.scheduler.maximum-allocation-mb is for the container, not for the
>> DAG, and the error message you showed is for the container.
>>
>> Here is one quick way: use the following attribute.
>>
>> <property>
>>   <name>dt.operator.*.attr.MEMORY_MB</name>
>>   <value>4096</value>
>> </property>
>>
>>
>>
>> On Tue, Jul 12, 2016 at 9:24 AM Raja.Aravapalli <
>> [email protected]> wrote:
>>
>>>
>>> Hi Ram,
>>>
>>> Sorry, I did not share the details of the 32GB with you.
>>>
>>> I am saying 32GB is allocated because I observed it on the UI while the
>>> application was running. But now that the DAG has failed, I cannot take
>>> a screenshot and send it!
>>>
>>>
>>> Regards,
>>> Raja.
>>>
>>> From: Munagala Ramanath <[email protected]>
>>> Reply-To: "[email protected]" <[email protected]>
>>> Date: Tuesday, July 12, 2016 at 11:06 AM
>>>
>>> To: "[email protected]" <[email protected]>
>>> Subject: Re: DAG is failing due to memory issues
>>>
>>> How do you know it is allocating 32GB? The diagnostic message you
>>> posted does not show that.
>>>
>>> Ram
>>>
>>> On Tue, Jul 12, 2016 at 8:51 AM, Raja.Aravapalli <
>>> [email protected]> wrote:
>>>
>>>>
>>>> Thanks for the response Sandesh.
>>>>
>>>> Since our yarn-site is configured with the value *32768* for the
>>>> property *yarn.scheduler.maximum-allocation-mb*, it allocates a
>>>> maximum of *32GB* and no more!
>>>>
>>>>
>>>> I wish to know: is there a way I can increase the maximum allowed
>>>> value? Or, since it is configured in yarn-site.xml, can I *not*
>>>> increase it?
>>>>
>>>>
>>>>
>>>> Regards,
>>>> Raja.
>>>>
>>>> From: Sandesh Hegde <[email protected]>
>>>> Reply-To: "[email protected]" <[email protected]>
>>>> Date: Tuesday, July 12, 2016 at 10:46 AM
>>>> To: "[email protected]" <[email protected]>
>>>> Subject: Re: DAG is failing due to memory issues
>>>>
>>>> Quoting from the doc Ram shared, those parameters control operator
>>>> memory size:
>>>>
>>>>  actual container memory allocated by the RM has to lie between
>>>>  [yarn.scheduler.minimum-allocation-mb, yarn.scheduler.maximum-allocation-mb]
>>>>
>>>>
>>>> On Tue, Jul 12, 2016 at 8:38 AM Raja.Aravapalli <
>>>> [email protected]> wrote:
>>>>
>>>>>
>>>>> Hi Ram,
>>>>>
>>>>> I see that in the cluster's yarn-site.xml, the two properties below
>>>>> are configured with the following settings:
>>>>>
>>>>> yarn.scheduler.minimum-allocation-mb ===> 1024
>>>>> yarn.scheduler.maximum-allocation-mb ===> 32768
>>>>>
>>>>>
>>>>> So with the above settings at the cluster level, can I not increase
>>>>> the memory allocated for my DAG? Is there any other way I can
>>>>> increase the memory?
>>>>>
>>>>>
>>>>> Thanks a lot.
>>>>>
>>>>>
>>>>> Regards,
>>>>> Raja.
>>>>>
>>>>> From: Munagala Ramanath <[email protected]>
>>>>> Reply-To: "[email protected]" <[email protected]>
>>>>> Date: Tuesday, July 12, 2016 at 9:31 AM
>>>>> To: "[email protected]" <[email protected]>
>>>>> Subject: Re: DAG is failing due to memory issues
>>>>>
>>>>> Please see:
>>>>> http://docs.datatorrent.com/troubleshooting/#configuring-memory
>>>>>
>>>>> Ram
>>>>>
>>>>> On Tue, Jul 12, 2016 at 6:57 AM, Raja.Aravapalli <
>>>>> [email protected]> wrote:
>>>>>
>>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> My DAG is failing with container memory issues. I am seeing the
>>>>>> information below in the log.
>>>>>>
>>>>>>
>>>>>>
>>>>>> Diagnostics: Container [pid=xxx,containerID=container_xyclksdjf] is
>>>>>> running beyond physical memory limits. Current usage: 1.0 GB of 1 GB
>>>>>> physical memory used; 2.9 GB of 2.1 GB virtual memory used. Killing
>>>>>> container.
>>>>>>
>>>>>>
>>>>>> Can someone help me with how to fix this issue? Thanks a lot.
>>>>>>
>>>>>>
>>>>>>
>>>>>> Regards,
>>>>>> Raja.
>>>>>>
>>>>>
>>>>>
>>>