You can increase yarn.scheduler.maximum-allocation-mb, but it will require a
ResourceManager restart.
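
A minimal sketch of what that change to yarn-site.xml might look like (the
65536 value is only an illustrative example, not a recommendation for your
cluster):

```xml
<!-- yarn-site.xml: raises the per-container memory ceiling the RM will grant.
     The 64 GB value below is illustrative; size it for your nodes. -->
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>65536</value>
</property>
```

After editing the file on the ResourceManager node, restart the
ResourceManager so the new ceiling takes effect.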

Thanks,
Dev

On Jul 12, 2016 9:01 AM, "Raja.Aravapalli" <[email protected]>
wrote:

>
> Thanks for the response Sandesh.
>
> Since our yarn-site is configured with the value *32768* for the property *
> yarn.scheduler.maximum-allocation-mb*, it allocates a maximum of *32 GB*
> and no more.
>
>
> I would like to know: is there a way I can increase the maximum allowed
> value? Or, since it is configured in yarn-site.xml, can I *not* increase it?
>
>
>
> Regards,
> Raja.
>
> From: Sandesh Hegde <[email protected]>
> Reply-To: "[email protected]" <[email protected]>
> Date: Tuesday, July 12, 2016 at 10:46 AM
> To: "[email protected]" <[email protected]>
> Subject: Re: DAG is failing due to memory issues
>
> Quoting from the doc shared by Ram, those parameters control the operator
> memory size:
>
>  actual container memory allocated by RM has to lie between
>
> [yarn.scheduler.minimum-allocation-mb, yarn.scheduler.maximum-allocation-mb]
>
>
> On Tue, Jul 12, 2016 at 8:38 AM Raja.Aravapalli <
> [email protected]> wrote:
>
>>
>> Hi Ram,
>>
>> In the cluster yarn-site.xml, I see the below two properties configured
>> with these settings:
>>
>> yarn.scheduler.minimum-allocation-mb ===> 1024
>> yarn.scheduler.maximum-allocation-mb ===> 32768
>>
>>
>> So with the above settings at the cluster level, I can't increase the
>> memory allocated for my DAG? Is there any other way I can increase the
>> memory?
>>
>>
>> Thanks a lot.
>>
>>
>> Regards,
>> Raja.
>>
>> From: Munagala Ramanath <[email protected]>
>> Reply-To: "[email protected]" <[email protected]>
>> Date: Tuesday, July 12, 2016 at 9:31 AM
>> To: "[email protected]" <[email protected]>
>> Subject: Re: DAG is failing due to memory issues
>>
>> Please see:
>> http://docs.datatorrent.com/troubleshooting/#configuring-memory
>>
>> Ram
>>
>> On Tue, Jul 12, 2016 at 6:57 AM, Raja.Aravapalli <
>> [email protected]> wrote:
>>
>>>
>>> Hi,
>>>
>>> My DAG is failing with container memory issues. I see the below
>>> information in the log:
>>>
>>>
>>>
>>> Diagnostics: Container [pid=xxx,containerID=container_xyclksdjf] is
>>> running beyond physical memory limits. Current usage: 1.0 GB of 1 GB
>>> physical memory used; 2.9 GB of 2.1 GB virtual memory used. Killing
>>> container.
>>>
>>>
>>> Can someone help me with how to fix this issue? Thanks a lot.
>>>
>>>
>>>
>>> Regards,
>>> Raja.
>>>
>>
>>
