[ https://issues.apache.org/jira/browse/HDDS-399?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16608484#comment-16608484 ]
Hadoop QA commented on HDDS-399:
--------------------------------

-1 overall

|| Vote || Subsystem || Runtime || Comment ||
| 0 | reexec | 0m 26s | Docker mode activated. |
|| || || || Prechecks ||
| +1 | @author | 0m 0s | The patch does not contain any @author tags. |
| +1 | test4tests | 0m 0s | The patch appears to include 5 new or modified test files. |
|| || || || trunk Compile Tests ||
| 0 | mvndep | 2m 27s | Maven dependency ordering for branch |
| +1 | mvninstall | 22m 39s | trunk passed |
| +1 | compile | 18m 41s | trunk passed |
| +1 | checkstyle | 3m 29s | trunk passed |
| +1 | mvnsite | 1m 55s | trunk passed |
| +1 | shadedclient | 17m 38s | branch has no errors when building and testing our client artifacts. |
| 0 | findbugs | 0m 0s | Skipped patched modules with no Java source: hadoop-ozone/integration-test |
| +1 | findbugs | 1m 51s | trunk passed |
| +1 | javadoc | 2m 0s | trunk passed |
|| || || || Patch Compile Tests ||
| 0 | mvndep | 0m 22s | Maven dependency ordering for patch |
| +1 | mvninstall | 1m 44s | the patch passed |
| +1 | compile | 23m 34s | the patch passed |
| +1 | javac | 23m 34s | the patch passed |
| -0 | checkstyle | 4m 3s | root: The patch generated 5 new + 20 unchanged - 1 fixed = 25 total (was 21) |
| +1 | mvnsite | 2m 31s | the patch passed |
| +1 | whitespace | 0m 0s | The patch has no whitespace issues. |
| +1 | shadedclient | 12m 10s | patch has no errors when building and testing our client artifacts. |
| 0 | findbugs | 0m 0s | Skipped patched modules with no Java source: hadoop-ozone/integration-test |
| -1 | findbugs | 1m 26s | hadoop-hdds/common generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) |
| -1 | findbugs | 0m 56s | hadoop-hdds/server-scm generated 4 new + 0 unchanged - 0 fixed = 4 total (was 0) |
| +1 | javadoc | 2m 4s | the patch passed |
|| || || || Other Tests ||
| +1 | unit | 1m 6s | common in the patch passed. |
| +1 | unit | 1m 44s | server-scm in the patch passed. |
| -1 | unit | 7m 50s | integration-test in the patch failed. |
| -1 | asflicense | 0m 50s | The patch generated 2 ASF License warnings. |
| | | 129m 17s | |

|| Reason || Tests ||
| FindBugs | module:hadoop-hdds/common |
| | org.apache.hadoop.hdds.scm.container.common.helpers.Pipeline defines equals and uses Object.hashCode() At Pipeline.java:[lines 250-258] |
| FindBugs | module:hadoop-hdds/server-scm |
| | Dead store to factor in org.apache.hadoop.hdds.scm.pipelines.PipelineManager.createPipeline(HddsProtos$ReplicationFactor, HddsProtos$ReplicationType) At PipelineManager.java:[line 156] |
| | org.apache.hadoop.hdds.scm.container.common.helpers.PipelineID is incompatible with expected argument type PipelineManager$ActivePipelines in org.apache.hadoop.hdds.scm.pipelines.PipelineManager.finalizePipeline(Pipeline) At PipelineManager.java:[line 172] |
| | Should org.apache.hadoop.hdds.scm.pipelines.PipelineManager$ActivePipelines be a _static_ inner class? At PipelineManager.java:[lines 53-81] |
| | Switch statement found in org.apache.hadoop.hdds.scm.pipelines.PipelineStateManager.addExistingPipeline(Pipeline) where default case is missing At PipelineStateManager.java:[lines 149-162] |
| Failed junit tests | hadoop.hdds.scm.pipeline.TestNodeFailure |
| | hadoop.ozone.client.rpc.TestCloseContainerHandlingByClient |
| | hadoop.ozone.freon.TestDataValidate |

|| Subsystem || Report/Notes ||
| Docker | Client=17.05.0-ce Server=17.05.0-ce Image:yetus/hadoop:ba1ab08 |
| JIRA Issue | HDDS-399 |
| JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/12938989/HDDS-399.003.patch |
| Optional Tests | asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux d962eace4bbc 3.13.0-153-generic #203-Ubuntu SMP Thu Jun 14 08:52:28 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | /testptch/patchprocess/precommit/personality/provided.sh |
| git revision | trunk / d924ca2 |
| maven | version: Apache Maven 3.3.9 |
| Default Java | 1.8.0_181 |
| findbugs | v3.1.0-RC1 |
| checkstyle | https://builds.apache.org/job/PreCommit-HDDS-Build/1014/artifact/out/diff-checkstyle-root.txt |
| findbugs | https://builds.apache.org/job/PreCommit-HDDS-Build/1014/artifact/out/new-findbugs-hadoop-hdds_common.html |
| findbugs | https://builds.apache.org/job/PreCommit-HDDS-Build/1014/artifact/out/new-findbugs-hadoop-hdds_server-scm.html |
| unit | https://builds.apache.org/job/PreCommit-HDDS-Build/1014/artifact/out/patch-unit-hadoop-ozone_integration-test.txt |
| Test Results | https://builds.apache.org/job/PreCommit-HDDS-Build/1014/testReport/ |
| asflicense | https://builds.apache.org/job/PreCommit-HDDS-Build/1014/artifact/out/patch-asflicense-problems.txt |
| Max. process+thread count | 2263 (vs. ulimit of 10000) |
| modules | C: hadoop-hdds/common hadoop-hdds/server-scm hadoop-ozone/integration-test U: . |
| Console output | https://builds.apache.org/job/PreCommit-HDDS-Build/1014/console |
| Powered by | Apache Yetus 0.8.0-SNAPSHOT http://yetus.apache.org |

This message was automatically generated.

> Handle pipeline discovery on SCM restart.
> -----------------------------------------
>
>                 Key: HDDS-399
>                 URL: https://issues.apache.org/jira/browse/HDDS-399
>             Project: Hadoop Distributed Data Store
>          Issue Type: Bug
>          Components: SCM
>    Affects Versions: 0.2.1
>            Reporter: Mukul Kumar Singh
>            Assignee: Mukul Kumar Singh
>            Priority: Blocker
>             Fix For: 0.2.1
>
>         Attachments: HDDS-399.001.patch, HDDS-399.002.patch, HDDS-399.003.patch
>
> On SCM restart, as part of node registration, SCM should find out the list of
> open pipelines on each node. Once all the nodes of a pipeline have reported
> back, the pipeline should be added back as an active pipeline for further
> allocations.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: hdfs-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: hdfs-issues-h...@hadoop.apache.org
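
The recovery flow the issue describes (activate a pipeline only after every member datanode has re-registered and reported it) can be sketched roughly as below. All class, method, and field names here are hypothetical illustrations, not the actual HDDS/SCM API from the patch; this is a minimal sketch of the bookkeeping idea only, assuming the expected pipeline membership is already known from persisted SCM metadata.

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

/**
 * Sketch of pipeline discovery on SCM restart (HDDS-399 idea):
 * each registering datanode reports the open pipelines it belongs to,
 * and a pipeline is marked active only once all its members are back.
 */
public class PipelineRecoverySketch {
    // pipelineId -> expected member nodes (from persisted metadata)
    private final Map<String, Set<String>> expectedMembers = new HashMap<>();
    // pipelineId -> nodes that have re-registered and reported so far
    private final Map<String, Set<String>> reportedMembers = new HashMap<>();
    // pipelines eligible again for block allocation
    private final Set<String> activePipelines = new HashSet<>();

    public PipelineRecoverySketch(Map<String, Set<String>> expected) {
        this.expectedMembers.putAll(expected);
    }

    /** Called when a registering datanode reports an open pipeline. */
    public void onNodeReport(String pipelineId, String nodeId) {
        Set<String> expected = expectedMembers.get(pipelineId);
        if (expected == null || !expected.contains(nodeId)) {
            return; // unknown pipeline, or node is not a member: ignore
        }
        Set<String> reported =
            reportedMembers.computeIfAbsent(pipelineId, k -> new HashSet<>());
        reported.add(nodeId);
        if (reported.equals(expected)) {
            activePipelines.add(pipelineId); // all members back: reactivate
        }
    }

    public boolean isActive(String pipelineId) {
        return activePipelines.contains(pipelineId);
    }

    public static void main(String[] args) {
        Map<String, Set<String>> expected = new HashMap<>();
        expected.put("p1", new HashSet<>(Arrays.asList("dn1", "dn2", "dn3")));
        PipelineRecoverySketch scm = new PipelineRecoverySketch(expected);
        scm.onNodeReport("p1", "dn1");
        scm.onNodeReport("p1", "dn2");
        System.out.println("after 2 of 3 reports active=" + scm.isActive("p1"));
        scm.onNodeReport("p1", "dn3");
        System.out.println("after 3 of 3 reports active=" + scm.isActive("p1"));
    }
}
```

The point of counting reports against the expected membership, rather than activating on the first report, is that allocating blocks on a pipeline whose members have not all returned would fail writes; the actual patch additionally has to handle finalization and stale pipelines, which this sketch omits.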