Hi,
What memory does the “allocated mem.” field on the UI refer to for a DAG? The
Application Master’s memory, or the container memory of the operators?
[inline screenshot of the UI’s “allocated mem.” value]
I included the properties below as well and re-launched the DAG, but it is
still showing only 32GB!!
<property>
<name>dt.application.<APP_NAME>.attr.MASTER_MEMORY_MB</name>
<value>4096</value>
</property>
<property>
<name>dt.application.<APP_NAME>.operator.*.attr.MEMORY_MB</name>
<value>4096</value>
</property>
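For what it's worth, the same attribute can also be scoped to one specific
operator instead of the * wildcard, if only a single operator in the DAG is
memory-hungry. A minimal sketch (the name "LargeOperator" is hypothetical;
substitute an operator name declared in your DAG):
<property>
<!-- "LargeOperator" is a hypothetical example name -->
<name>dt.application.<APP_NAME>.operator.LargeOperator.attr.MEMORY_MB</name>
<value>8192</value>
</property>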
I have the same DAG running in another Hadoop environment, where it shows
approx. 125GB; in this environment it shows only 32GB, which is what I am
assuming to be the problem!!
Regards,
Raja.
From: Sandesh Hegde <[email protected]>
Reply-To: "[email protected]" <[email protected]>
Date: Tuesday, July 12, 2016 at 11:35 AM
To: "[email protected]" <[email protected]>
Subject: Re: DAG is failing due to memory issues
Raja,
Please increase the container size and launch the app again.
yarn.scheduler.maximum-allocation-mb applies to a single container, not to the
DAG as a whole, and the error message you showed is about a container.
Here is one quick way: use the following attribute.
<property>
<name>dt.operator.*.attr.MEMORY_MB</name>
<value>4096</value>
</property>
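Note that operator heap is not the only contributor to container size; the
buffer server on each output port reserves memory as well. A hedged sketch,
assuming the per-port BUFFER_MEMORY_MB attribute described in the
troubleshooting doc Ram linked below:
<property>
<!-- assumption: BUFFER_MEMORY_MB is the buffer server memory reserved per
output port; it is added on top of the operator's MEMORY_MB when sizing the
container request -->
<name>dt.operator.*.port.*.attr.BUFFER_MEMORY_MB</name>
<value>512</value>
</property>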
On Tue, Jul 12, 2016 at 9:24 AM Raja.Aravapalli
<[email protected]> wrote:
Hi Ram,
Sorry, I did not share the details of the 32GB with you.
I am saying 32GB is allocated because I observed it on the UI while the
application was running. But now that the DAG has failed, I cannot take a
screenshot and send it!!
Regards,
Raja.
From: Munagala Ramanath <[email protected]>
Reply-To: "[email protected]" <[email protected]>
Date: Tuesday, July 12, 2016 at 11:06 AM
To: "[email protected]" <[email protected]>
Subject: Re: DAG is failing due to memory issues
How do you know it is allocating 32GB? The diagnostic message you posted does
not show that.
Ram
On Tue, Jul 12, 2016 at 8:51 AM, Raja.Aravapalli
<[email protected]> wrote:
Thanks for the response, Sandesh.
Since our yarn-site.xml configures the property
yarn.scheduler.maximum-allocation-mb with the value 32768, it is allocating a
maximum of 32GB and no more than that!!
I wish to know: is there a way I can increase the maximum allowed value? Or,
since it is configured in yarn-site.xml, can I not increase it?
Regards,
Raja.
From: Sandesh Hegde <[email protected]>
Reply-To: "[email protected]" <[email protected]>
Date: Tuesday, July 12, 2016 at 10:46 AM
To: "[email protected]" <[email protected]>
Subject: Re: DAG is failing due to memory issues
Quoting from the doc Ram shared, those parameters control operator memory
size: "the actual container memory allocated by the RM has to lie between
[yarn.scheduler.minimum-allocation-mb, yarn.scheduler.maximum-allocation-mb]"
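Raising that upper bound is a cluster-level change: a YARN admin edits
yarn-site.xml on the ResourceManager and restarts it. A minimal sketch (the
65536 value is purely illustrative):
<property>
<!-- cluster admin change in yarn-site.xml; takes effect after a
ResourceManager restart; 65536 is an illustrative value, not a
recommendation -->
<name>yarn.scheduler.maximum-allocation-mb</name>
<value>65536</value>
</property>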
On Tue, Jul 12, 2016 at 8:38 AM Raja.Aravapalli
<[email protected]> wrote:
Hi Ram,
I see that in the cluster's yarn-site.xml, the two properties below are
configured with these settings:
yarn.scheduler.minimum-allocation-mb ===> 1024
yarn.scheduler.maximum-allocation-mb ===> 32768
So with the above settings at the cluster level, can I not increase the memory
allocated for my DAG? Is there any other way I can increase the memory?
Thanks a lot.
Regards,
Raja.
From: Munagala Ramanath <[email protected]>
Reply-To: "[email protected]" <[email protected]>
Date: Tuesday, July 12, 2016 at 9:31 AM
To: "[email protected]" <[email protected]>
Subject: Re: DAG is failing due to memory issues
Please see: http://docs.datatorrent.com/troubleshooting/#configuring-memory
Ram
On Tue, Jul 12, 2016 at 6:57 AM, Raja.Aravapalli
<[email protected]> wrote:
Hi,
My DAG is failing with memory issues for a container. I am seeing the
information below in the log.
Diagnostics: Container [pid=xxx,containerID=container_xyclksdjf] is running
beyond physical memory limits. Current usage: 1.0 GB of 1 GB physical memory
used; 2.9 GB of 2.1 GB virtual memory used. Killing container.
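For context on those numbers: 1 GB is the container's physical memory limit,
and the 2.1 GB virtual ceiling is that 1 GB multiplied by YARN's vmem-pmem
ratio, which defaults to 2.1. The kill above is triggered by the physical
limit, so the fix is a larger container rather than a ratio change; the stock
default is shown here for reference:
<property>
<!-- stock YARN default, shown for reference only; the diagnostic's 2.1 GB
virtual ceiling = 1 GB physical limit x this ratio -->
<name>yarn.nodemanager.vmem-pmem-ratio</name>
<value>2.1</value>
</property>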
Can someone help me with how to fix this issue? Thanks a lot.
Regards,
Raja.