[ https://issues.apache.org/jira/browse/YARN-8558?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16555518#comment-16555518 ]
genericqa commented on YARN-8558:
---------------------------------
| (/) *{color:green}+1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
| {color:blue}0{color} | {color:blue} reexec {color} | {color:blue} 0m 21s{color} | {color:blue} Docker mode activated. {color} |
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} |
| {color:green}+1{color} | {color:green} test4tests {color} | {color:green} 0m 0s{color} | {color:green} The patch appears to include 1 new or modified test files. {color} |
|| || || || {color:brown} trunk Compile Tests {color} ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 25m 14s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 0s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 0m 16s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green} 0m 32s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} shadedclient {color} | {color:green} 10m 32s{color} | {color:green} branch has no errors when building and testing our client artifacts. {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green} 0m 56s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 0m 23s{color} | {color:green} trunk passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 0m 34s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 0m 53s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 0m 53s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 0m 8s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green} 0m 32s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:green}+1{color} | {color:green} shadedclient {color} | {color:green} 10m 52s{color} | {color:green} patch has no errors when building and testing our client artifacts. {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green} 0m 57s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 0m 20s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} unit {color} | {color:green} 18m 42s{color} | {color:green} hadoop-yarn-server-nodemanager in the patch passed. {color} |
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green} 0m 25s{color} | {color:green} The patch does not generate ASF License warnings. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 72m 45s{color} | {color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| Docker | Client=17.05.0-ce Server=17.05.0-ce Image:yetus/hadoop:ba1ab08 |
| JIRA Issue | YARN-8558 |
| JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/12933027/YARN-8558.002.patch |
| Optional Tests | asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux 98dac162c806 4.4.0-89-generic #112-Ubuntu SMP Mon Jul 31 19:38:41 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | /testptch/patchprocess/precommit/personality/provided.sh |
| git revision | trunk / 955f795 |
| maven | version: Apache Maven 3.3.9 |
| Default Java | 1.8.0_171 |
| findbugs | v3.1.0-RC1 |
| Test Results | https://builds.apache.org/job/PreCommit-YARN-Build/21366/testReport/ |
| Max. process+thread count | 410 (vs. ulimit of 10000) |
| modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager U: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager |
| Console output | https://builds.apache.org/job/PreCommit-YARN-Build/21366/console |
| Powered by | Apache Yetus 0.8.0-SNAPSHOT http://yetus.apache.org |
This message was automatically generated.
> NM recovery level db not cleaned up properly on container finish
> ----------------------------------------------------------------
>
> Key: YARN-8558
> URL: https://issues.apache.org/jira/browse/YARN-8558
> Project: Hadoop YARN
> Issue Type: Bug
> Affects Versions: 3.0.0, 3.1.0
> Reporter: Bibin A Chundatt
> Assignee: Bibin A Chundatt
> Priority: Critical
> Attachments: YARN-8558.001.patch, YARN-8558.002.patch
>
>
> {code}
> 2018-07-20 16:49:23,117 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationImpl: Application application_1531994217928_0054 transitioned from NEW to INITING
> 2018-07-20 16:49:23,204 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000018 with incomplete records
> 2018-07-20 16:49:23,204 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000019 with incomplete records
> 2018-07-20 16:49:23,204 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000020 with incomplete records
> 2018-07-20 16:49:23,205 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000021 with incomplete records
> 2018-07-20 16:49:23,205 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000022 with incomplete records
> 2018-07-20 16:49:23,205 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000023 with incomplete records
> 2018-07-20 16:49:23,205 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000024 with incomplete records
> 2018-07-20 16:49:23,205 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000025 with incomplete records
> 2018-07-20 16:49:23,205 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000038 with incomplete records
> 2018-07-20 16:49:23,205 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000039 with incomplete records
> 2018-07-20 16:49:23,206 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000041 with incomplete records
> 2018-07-20 16:49:23,206 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000044 with incomplete records
> 2018-07-20 16:49:23,206 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000046 with incomplete records
> 2018-07-20 16:49:23,206 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000049 with incomplete records
> 2018-07-20 16:49:23,206 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000052 with incomplete records
> 2018-07-20 16:49:23,206 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000054 with incomplete records
> 2018-07-20 16:49:23,206 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000073 with incomplete records
> 2018-07-20 16:49:23,207 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000074 with incomplete records
> 2018-07-20 16:49:23,207 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000075 with incomplete records
> 2018-07-20 16:49:23,207 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000078 with incomplete records
> 2018-07-20 16:49:23,207 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000079 with incomplete records
> 2018-07-20 16:49:23,207 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000082 with incomplete records
> 2018-07-20 16:49:23,207 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000083 with incomplete records
> 2018-07-20 16:49:23,207 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_000085 with incomplete records
> 2018-07-20 16:49:23,208 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_1099511627738 with incomplete records
> 2018-07-20 16:49:23,208 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_1099511627742 with incomplete records
> 2018-07-20 16:49:23,208 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_1099511627746 with incomplete records
> 2018-07-20 16:49:23,208 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_1099511627749 with incomplete records
> 2018-07-20 16:49:23,208 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_1099511627753 with incomplete records
> 2018-07-20 16:49:23,208 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_1099511627757 with incomplete records
> 2018-07-20 16:49:23,208 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_1099511627761 with incomplete records
> 2018-07-20 16:49:23,209 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_1099511627765 with incomplete records
> 2018-07-20 16:49:23,209 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_1099511627769 with incomplete records
> 2018-07-20 16:49:23,209 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0001_01_1099511627773 with incomplete records
> 2018-07-20 16:49:23,210 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0002_01_1099511627679 with incomplete records
> 2018-07-20 16:49:23,210 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0002_01_1099511627681 with incomplete records
> 2018-07-20 16:49:23,210 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0002_01_1099511627684 with incomplete records
> 2018-07-20 16:49:23,210 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0002_01_1099511627690 with incomplete records
> 2018-07-20 16:49:23,210 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0002_01_1099511627695 with incomplete records
> 2018-07-20 16:49:23,210 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0002_01_1099511627696 with incomplete records
> 2018-07-20 16:49:23,210 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0002_01_1099511627702 with incomplete records
> 2018-07-20 16:49:23,210 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0002_01_1099511627706 with incomplete records
> 2018-07-20 16:49:23,210 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0002_01_1099511627710 with incomplete records
> 2018-07-20 16:49:23,211 WARN org.apache.hadoop.yarn.server.nodemanager.recovery.NMLeveldbStateStoreService: Remove container container_1531994217928_0002_01_1099511627712 with incomplete records
> {code}
> NM state store size could increase in long-running scenarios, and recovery could be slow.
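
For readers following the fix: the sketch below shows the kind of cleanup the description is asking for, deleting every state-store key that belongs to a finished container in a single write batch so stale records cannot accumulate in the recovery leveldb. This is only a minimal illustration against the iq80 leveldb API; the ContainerStateCleanup class and the "ContainerManager/containers/<containerId>/" key layout are assumptions made for the example, not the actual YARN-8558 patch.

{code:java}
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.Map;

import org.iq80.leveldb.DB;
import org.iq80.leveldb.DBIterator;
import org.iq80.leveldb.WriteBatch;

/**
 * Minimal sketch: purge every state-store key belonging to a finished
 * container so the NM recovery leveldb does not keep growing.
 * The key layout ("ContainerManager/containers/<containerId>/") is an
 * assumption for illustration, not necessarily the exact YARN schema.
 */
public class ContainerStateCleanup {

  private static final String CONTAINERS_KEY_PREFIX =
      "ContainerManager/containers/";

  private final DB db;  // already-open leveldb handle (assumed)

  public ContainerStateCleanup(DB db) {
    this.db = db;
  }

  /** Remove all records stored under the given container's key prefix. */
  public void removeContainer(String containerId) throws IOException {
    String prefix = CONTAINERS_KEY_PREFIX + containerId + "/";
    byte[] prefixBytes = prefix.getBytes(StandardCharsets.UTF_8);

    try (WriteBatch batch = db.createWriteBatch();
         DBIterator iter = db.iterator()) {
      // Seek to the first key of this container and walk its key range.
      iter.seek(prefixBytes);
      while (iter.hasNext()) {
        Map.Entry<byte[], byte[]> entry = iter.peekNext();
        String key = new String(entry.getKey(), StandardCharsets.UTF_8);
        if (!key.startsWith(prefix)) {
          break;  // past the container's key range
        }
        batch.delete(entry.getKey());
        iter.next();
      }
      db.write(batch);
    }
  }
}
{code}

Deleting in one batch keeps the removal atomic: either all of a container's records disappear or none do, so a NodeManager crash mid-cleanup would not leave the partial entries reported above as "incomplete records".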