[GitHub] [hadoop] hadoop-yetus commented on issue #1480: HDFS-14857 FS operations fail in HA mode: DataNode fails to connect to NameNode

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1480: HDFS-14857 FS operations fail in HA 
mode: DataNode fails to connect to NameNode
URL: https://github.com/apache/hadoop/pull/1480#issuecomment-533770419
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 39 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 23 | Maven dependency ordering for branch |
   | +1 | mvninstall | 1059 | trunk passed |
   | +1 | compile | 183 | trunk passed |
   | +1 | checkstyle | 54 | trunk passed |
   | +1 | mvnsite | 122 | trunk passed |
   | -1 | shadedclient | 222 | branch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 115 | trunk passed |
   | 0 | spotbugs | 163 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | +1 | findbugs | 291 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 14 | Maven dependency ordering for patch |
   | +1 | mvninstall | 108 | the patch passed |
   | +1 | compile | 178 | the patch passed |
   | +1 | javac | 178 | the patch passed |
   | -0 | checkstyle | 48 | hadoop-hdfs-project: The patch generated 13 new + 
22 unchanged - 0 fixed = 35 total (was 22) |
   | +1 | mvnsite | 113 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 33 | patch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 106 | the patch passed |
   | -1 | findbugs | 142 | hadoop-hdfs-project/hadoop-hdfs-client generated 1 
new + 0 unchanged - 0 fixed = 1 total (was 0) |
   ||| _ Other Tests _ |
   | +1 | unit | 120 | hadoop-hdfs-client in the patch passed. |
   | -1 | unit | 5233 | hadoop-hdfs in the patch failed. |
   | +1 | asflicense | 36 | The patch does not generate ASF License warnings. |
   | | | 8430 | |
   
   
   | Reason | Tests |
   |-------:|:------|
   | FindBugs | module:hadoop-hdfs-project/hadoop-hdfs-client |
   |  |  Inconsistent synchronization of 
org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider.currentProxyIndex;
 locked 75% of time; unsynchronized access at 
ConfiguredFailoverProxyProvider.java:[line 67] |
   | Failed junit tests | 
hadoop.hdfs.server.namenode.TestAddOverReplicatedStripedBlocks |
   |   | hadoop.fs.TestHdfsNativeCodeLoader |
   |   | hadoop.hdfs.server.blockmanagement.TestUnderReplicatedBlocks |
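
   The FindBugs item above reports that ConfiguredFailoverProxyProvider.currentProxyIndex is read on one path without the lock that guards its other accesses ("locked 75% of time"). As a minimal sketch of one way such a warning is typically resolved (an illustrative class, not the actual Hadoop provider), the index can be held in an AtomicInteger so every reader and writer gets the same visibility and atomicity guarantees:

```java
import java.util.concurrent.atomic.AtomicInteger;

/**
 * Illustrative sketch only; not the actual ConfiguredFailoverProxyProvider.
 * Holding the failover index in an AtomicInteger gives every reader and
 * writer the same visibility guarantees, so no access path is left
 * unsynchronized.
 */
class ProxyIndexHolder {
  private final AtomicInteger currentProxyIndex = new AtomicInteger(0);
  private final int proxyCount;

  ProxyIndexHolder(int proxyCount) {
    this.proxyCount = proxyCount;
  }

  /** Read the current index; no explicit lock needed. */
  int current() {
    return currentProxyIndex.get();
  }

  /** Advance to the next proxy atomically, wrapping around. */
  int failover() {
    return currentProxyIndex.updateAndGet(i -> (i + 1) % proxyCount);
  }
}
```

   An equivalent fix is to synchronize the remaining accessor on the same monitor; the atomic field simply removes the mixed locking that FindBugs flags.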
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1480/3/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1480 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux b8db6bd4a757 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1480/3/artifact/out/diff-checkstyle-hadoop-hdfs-project.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1480/3/artifact/out/new-findbugs-hadoop-hdfs-project_hadoop-hdfs-client.html
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1480/3/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1480/3/testReport/ |
   | Max. process+thread count | 4124 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs-client 
hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1480/3/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org

[GitHub] [hadoop] hadoop-yetus commented on issue #1484: [HADOOP-16590] - LoginModule Classes and Principal Classes deprecated in IBM Java.

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1484: [HADOOP-16590] - LoginModule Classes and 
Principal Classes deprecated in IBM Java.
URL: https://github.com/apache/hadoop/pull/1484#issuecomment-533769770
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 103 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | +1 | mvninstall | 1341 | trunk passed |
   | +1 | compile | 1059 | trunk passed |
   | +1 | checkstyle | 46 | trunk passed |
   | +1 | mvnsite | 82 | trunk passed |
   | -1 | shadedclient | 181 | branch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 84 | trunk passed |
   | 0 | spotbugs | 127 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | +1 | findbugs | 125 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | +1 | mvninstall | 48 | the patch passed |
   | +1 | compile | 992 | the patch passed |
   | +1 | javac | 992 | the patch passed |
   | +1 | checkstyle | 46 | hadoop-common-project/hadoop-common: The patch 
generated 0 new + 76 unchanged - 3 fixed = 76 total (was 79) |
   | +1 | mvnsite | 77 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 41 | patch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 83 | the patch passed |
   | +1 | findbugs | 131 | the patch passed |
   ||| _ Other Tests _ |
   | +1 | unit | 563 | hadoop-common in the patch passed. |
   | +1 | asflicense | 45 | The patch does not generate ASF License warnings. |
   | | | 5111 | |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=19.03.0 Server=19.03.0 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1484/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1484 |
   | JIRA Issue | HADOOP-16590 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 44288b07d517 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1484/1/testReport/ |
   | Max. process+thread count | 1733 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common U: 
hadoop-common-project/hadoop-common |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1484/1/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
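
HADOOP-16590, the subject of the report above and of the Hadoop QA comment that follows, deals with IBM Java deprecating its OS login module and principal classes. The snippet below is only a hedged sketch of the vendor-detection pattern such a change typically touches; the IBM and OpenJDK module names are illustrative assumptions, not the classes the patch actually uses:

```java
import java.util.Collections;
import javax.security.auth.login.AppConfigurationEntry;

/**
 * Hedged sketch of vendor-based JAAS module selection; the IBM and
 * OpenJDK class names below are illustrative assumptions, not the
 * classes referenced by the HADOOP-16590 patch.
 */
class OsLoginModuleSelector {
  private static final boolean IBM_JAVA =
      System.getProperty("java.vendor", "").contains("IBM");

  /** Pick an OS login module class name for the running JVM. */
  static String osLoginModuleName() {
    return IBM_JAVA
        ? "com.ibm.security.auth.module.LinuxLoginModule"  // assumed IBM name
        : "com.sun.security.auth.module.UnixLoginModule";  // OpenJDK/Oracle
  }

  /** Wrap the selected module in a REQUIRED JAAS configuration entry. */
  static AppConfigurationEntry osLoginEntry() {
    return new AppConfigurationEntry(
        osLoginModuleName(),
        AppConfigurationEntry.LoginModuleControlFlag.REQUIRED,
        Collections.<String, Object>emptyMap());
  }
}
```

In Hadoop itself this selection lives in the security/login code around UserGroupInformation, which also distinguishes operating systems; the point here is only the vendor check that the deprecation forces the project to revisit.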



[jira] [Commented] (HADOOP-16590) IBM Java has deprecated OS login module classes and OS principal classes.

2019-09-20 Thread Hadoop QA (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16934918#comment-16934918
 ] 

Hadoop QA commented on HADOOP-16590:


| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
| {color:blue}0{color} | {color:blue} reexec {color} | {color:blue}  1m 
43s{color} | {color:blue} Docker mode activated. {color} |
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} dupname {color} | {color:green}  0m  
0s{color} | {color:green} No case conflicting files found. {color} |
| {color:green}+1{color} | {color:green} @author {color} | {color:green}  0m  
0s{color} | {color:green} The patch does not contain any @author tags. {color} |
| {color:red}-1{color} | {color:red} test4tests {color} | {color:red}  0m  
0s{color} | {color:red} The patch doesn't appear to include any new or modified 
tests. Please justify why no new tests are needed for this patch. Also please 
list what manual steps were performed to verify this patch. {color} |
|| || || || {color:brown} trunk Compile Tests {color} ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 22m 
21s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 17m 
39s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  0m 
46s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green}  1m 
22s{color} | {color:green} trunk passed {color} |
| {color:red}-1{color} | {color:red} shadedclient {color} | {color:red}  3m  
1s{color} | {color:red} branch has errors when building and testing our client 
artifacts. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  1m 
24s{color} | {color:green} trunk passed {color} |
| {color:blue}0{color} | {color:blue} spotbugs {color} | {color:blue}  2m  
7s{color} | {color:blue} Used deprecated FindBugs config; considering switching 
to SpotBugs. {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green}  2m  
5s{color} | {color:green} trunk passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  0m 
48s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 16m 
32s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 16m 
32s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  0m 
46s{color} | {color:green} hadoop-common-project/hadoop-common: The patch 
generated 0 new + 76 unchanged - 3 fixed = 76 total (was 79) {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green}  1m 
17s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green}  0m 
 0s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:red}-1{color} | {color:red} shadedclient {color} | {color:red}  0m 
41s{color} | {color:red} patch has errors when building and testing our client 
artifacts. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  1m 
23s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green}  2m 
11s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} unit {color} | {color:green}  9m 
23s{color} | {color:green} hadoop-common in the patch passed. {color} |
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green}  0m 
45s{color} | {color:green} The patch does not generate ASF License warnings. 
{color} |
| {color:black}{color} | {color:black} {color} | {color:black} 85m 11s{color} | 
{color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| Docker | Client=19.03.0 Server=19.03.0 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1484/1/artifact/out/Dockerfile
 |
| GITHUB PR | https://github.com/apache/hadoop/pull/1484 |
| JIRA Issue | HADOOP-16590 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite 
unit shadedclient findbugs checkstyle |
| uname | Linux 44288b07d517 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | personality/hadoop.sh |
| git revision | trunk / d7d6ec8 |
| Default Java | 1.8.0_222 |
|  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1484/1/testReport/ |
| Max. process+thread count | 1733 (vs. ulimit of 5500) |
| modules | C: 

[GitHub] [hadoop] dineshchitlangia commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo structure to be saved in deletedTable

2019-09-20 Thread GitBox
dineshchitlangia commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo 
structure to be saved in deletedTable
URL: https://github.com/apache/hadoop/pull/1491#issuecomment-533767921
 
 
   /retest


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on issue #1487: HADOOP-16591 Fix S3A ITest*MRjob failures.

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1487: HADOOP-16591 Fix S3A ITest*MRjob 
failures.
URL: https://github.com/apache/hadoop/pull/1487#issuecomment-533767353
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 110 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | +1 | mvninstall | 1455 | trunk passed |
   | +1 | compile | 37 | trunk passed |
   | +1 | checkstyle | 28 | trunk passed |
   | +1 | mvnsite | 42 | trunk passed |
   | -1 | shadedclient | 116 | branch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 34 | trunk passed |
   | 0 | spotbugs | 67 | Used deprecated FindBugs config; considering switching 
to SpotBugs. |
   | +1 | findbugs | 65 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | +1 | mvninstall | 38 | the patch passed |
   | +1 | compile | 28 | the patch passed |
   | +1 | javac | 28 | the patch passed |
   | +1 | checkstyle | 21 | the patch passed |
   | +1 | mvnsite | 37 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 33 | patch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 26 | the patch passed |
   | +1 | findbugs | 70 | the patch passed |
   ||| _ Other Tests _ |
   | +1 | unit | 92 | hadoop-aws in the patch passed. |
   | +1 | asflicense | 30 | The patch does not generate ASF License warnings. |
   | | | 2355 | |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=18.09.7 Server=18.09.7 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1487/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1487 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 314cce32d3ca 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1487/1/testReport/ |
   | Max. process+thread count | 264 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1487/1/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on issue #1482: HADOOP-16589. [pb-upgrade] Update docker image to make 3.7.1 protoc as default

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1482: HADOOP-16589. [pb-upgrade] Update docker 
image to make 3.7.1 protoc as default
URL: https://github.com/apache/hadoop/pull/1482#issuecomment-533767129
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 1381 | Docker mode activated. |
   | -1 | patch | 13 | https://github.com/apache/hadoop/pull/1482 does not 
apply to trunk. Rebase required? Wrong Branch? See 
https://wiki.apache.org/hadoop/HowToContribute for help. |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1482/3/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1482 |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1482/3/console |
   | versions | git=2.7.4 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16589) [pb-upgrade] Update docker image to make 3.7.1 protoc as default

2019-09-20 Thread Hudson (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16589?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16934907#comment-16934907
 ] 

Hudson commented on HADOOP-16589:
---------------------------------

FAILURE: Integrated in Jenkins build Hadoop-trunk-Commit #17349 (See 
[https://builds.apache.org/job/Hadoop-trunk-Commit/17349/])
HADOOP-16589. [pb-upgrade] Update docker image to make 3.7.1 protoc as (github: 
rev efed4450bf1c820a5ad38fb96c28bf162fca9c9f)
* (edit) dev-support/docker/Dockerfile
* (edit) hadoop-project/pom.xml


> [pb-upgrade] Update docker image to make 3.7.1 protoc as default
> -----------------------------------------------------------------
>
> Key: HADOOP-16589
> URL: https://issues.apache.org/jira/browse/HADOOP-16589
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> Right now, docker image contains both 2.5.0 protoc and 3.7.1 protoc.
> 2.5.0 is default protoc in PATH.
> After HADOOP-16557, protoc version expected in PATH is 3.7.1. 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] vinayakumarb merged pull request #1482: HADOOP-16589. [pb-upgrade] Update docker image to make 3.7.1 protoc as default

2019-09-20 Thread GitBox
vinayakumarb merged pull request #1482: HADOOP-16589. [pb-upgrade] Update 
docker image to make 3.7.1 protoc as default
URL: https://github.com/apache/hadoop/pull/1482
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] vinayakumarb commented on issue #1482: HADOOP-16589. [pb-upgrade] Update docker image to make 3.7.1 protoc as default

2019-09-20 Thread GitBox
vinayakumarb commented on issue #1482: HADOOP-16589. [pb-upgrade] Update docker 
image to make 3.7.1 protoc as default
URL: https://github.com/apache/hadoop/pull/1482#issuecomment-533766926
 
 
   > Well I found the test results to be available
   > https://builds.apache.org/job/hadoop-multibranch/job/PR-1482/2/testReport/
   > Seems nothing to worry.
   > 
   > I guess you need to change comment in L109 too.
   > Other then that LGTM
   
   Updated the comment. Thanks @ayushtkn .


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on issue #1489: HDDS-2019. Handle Set DtService of token in S3Gateway for OM HA.

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1489: HDDS-2019. Handle Set DtService of token 
in S3Gateway for OM HA.
URL: https://github.com/apache/hadoop/pull/1489#issuecomment-533766921
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 44 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 1 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | -1 | mvninstall | 31 | hadoop-ozone in trunk failed. |
   | -1 | compile | 21 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 65 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | shadedclient | 109 | branch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 49 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 179 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 24 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | -1 | mvninstall | 35 | hadoop-ozone in the patch failed. |
   | -1 | compile | 23 | hadoop-ozone in the patch failed. |
   | -1 | javac | 23 | hadoop-ozone in the patch failed. |
   | -0 | checkstyle | 28 | hadoop-ozone: The patch generated 1 new + 0 
unchanged - 0 fixed = 1 total (was 0) |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 32 | patch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 53 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 25 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | +1 | unit | 258 | hadoop-hdds in the patch passed. |
   | -1 | unit | 25 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 29 | The patch does not generate ASF License warnings. |
   | | | 1819 | |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1489 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux a444e8fa4173 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/diff-checkstyle-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/patch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/patch-findbugs-hadoop-ozone.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/patch-unit-hadoop-ozone.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/testReport/ |
   | Max. process+thread count | 542 (vs. ulimit of 5500) |
   | modules | C: hadoop-ozone/s3gateway U: hadoop-ozone/s3gateway |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 

[GitHub] [hadoop] hadoop-yetus commented on issue #1486: HDDS-2158. Fixing Json Injection Issue in JsonUtils.

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1486: HDDS-2158. Fixing Json Injection Issue 
in JsonUtils.
URL: https://github.com/apache/hadoop/pull/1486#issuecomment-533766865
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 38 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 1 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 70 | Maven dependency ordering for branch |
   | -1 | mvninstall | 32 | hadoop-ozone in trunk failed. |
   | -1 | compile | 22 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 64 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | shadedclient | 114 | branch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 46 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 157 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 24 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 25 | Maven dependency ordering for patch |
   | -1 | mvninstall | 30 | hadoop-ozone in the patch failed. |
   | -1 | compile | 23 | hadoop-ozone in the patch failed. |
   | -1 | javac | 23 | hadoop-ozone in the patch failed. |
   | -0 | checkstyle | 24 | hadoop-hdds: The patch generated 1 new + 0 
unchanged - 0 fixed = 1 total (was 0) |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 30 | patch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 48 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 23 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | +1 | unit | 211 | hadoop-hdds in the patch passed. |
   | -1 | unit | 27 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 28 | The patch does not generate ASF License warnings. |
   | | | 1745 | |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1486 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 17bb1cefdfb0 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/diff-checkstyle-hadoop-hdds.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/patch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/patch-findbugs-hadoop-ozone.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/patch-unit-hadoop-ozone.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/testReport/ |
   | Max. process+thread count | 578 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdds/common hadoop-hdds/tools 
hadoop-ozone/ozone-manager U: . |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from 
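
The report above covers HDDS-2158, described as fixing a JSON injection issue in JsonUtils. As a general, hypothetical illustration of the underlying idea (not the Ozone patch itself), building JSON with a serializer such as Jackson instead of concatenating strings keeps untrusted values from altering the document structure:

```java
import java.util.Collections;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

/** Hypothetical illustration; not the Ozone JsonUtils change itself. */
final class SafeJsonExample {
  private static final ObjectMapper MAPPER = new ObjectMapper();

  // Vulnerable pattern: a value containing quotes or "," can break out of
  // the string literal and inject extra fields into the document.
  static String concatenated(String key, String value) {
    return "{\"" + key + "\":\"" + value + "\"}";
  }

  // Safer pattern: the serializer escapes the value, so untrusted input
  // cannot change the shape of the resulting JSON.
  static String serialized(String key, String value)
      throws JsonProcessingException {
    return MAPPER.writeValueAsString(Collections.singletonMap(key, value));
  }
}
```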

[GitHub] [hadoop] hadoop-yetus commented on issue #1488: HDDS-2159. Fix Race condition in ProfileServlet#pid.

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1488: HDDS-2159. Fix Race condition in 
ProfileServlet#pid.
URL: https://github.com/apache/hadoop/pull/1488#issuecomment-533766842
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 39 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 1 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | -1 | mvninstall | 31 | hadoop-ozone in trunk failed. |
   | -1 | compile | 22 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 50 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | shadedclient | 100 | branch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 47 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 186 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 25 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | -1 | mvninstall | 29 | hadoop-ozone in the patch failed. |
   | -1 | compile | 20 | hadoop-ozone in the patch failed. |
   | -1 | javac | 20 | hadoop-ozone in the patch failed. |
   | +1 | checkstyle | 51 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 32 | patch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 47 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 22 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | +1 | unit | 243 | hadoop-hdds in the patch passed. |
   | -1 | unit | 26 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 25 | The patch does not generate ASF License warnings. |
   | | | 1714 | |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1488 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 250fc8e627d1 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/patch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/patch-findbugs-hadoop-ozone.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/patch-unit-hadoop-ozone.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/testReport/ |
   | Max. process+thread count | 487 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdds/framework U: hadoop-hdds/framework |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To 

[GitHub] [hadoop] hadoop-yetus commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo structure to be saved in deletedTable

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo 
structure to be saved in deletedTable
URL: https://github.com/apache/hadoop/pull/1491#issuecomment-533766852
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 42 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 1 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 3 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 15 | Maven dependency ordering for branch |
   | -1 | mvninstall | 30 | hadoop-ozone in trunk failed. |
   | -1 | compile | 22 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 52 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | shadedclient | 102 | branch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 44 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 183 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 25 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 15 | Maven dependency ordering for patch |
   | -1 | mvninstall | 30 | hadoop-ozone in the patch failed. |
   | -1 | compile | 24 | hadoop-ozone in the patch failed. |
   | -1 | cc | 24 | hadoop-ozone in the patch failed. |
   | -1 | javac | 24 | hadoop-ozone in the patch failed. |
   | +1 | checkstyle | 51 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 30 | patch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 46 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 22 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 243 | hadoop-hdds in the patch failed. |
   | -1 | unit | 24 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 27 | The patch does not generate ASF License warnings. |
   | | | 1732 | |
   
   
   | Reason | Tests |
   |-------:|:------|
   | Failed junit tests | hadoop.hdds.scm.block.TestBlockManager |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1491 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle cc |
   | uname | Linux aeefee5b3fd6 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | cc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/patch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/patch-findbugs-hadoop-ozone.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/patch-unit-hadoop-hdds.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/patch-unit-hadoop-ozone.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/testReport/ |
   | Max. process+thread count | 506 (vs. ulimit of 5500) |
   | modules | C: hadoop-ozone/common hadoop-ozone/ozone-manager U: 
hadoop-ozone |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


[GitHub] [hadoop] vinayakumarb opened a new pull request #1494: HADOOP-16558. [COMMON+HDFS] use protobuf-maven-plugin to generate protobuf classes

2019-09-20 Thread GitBox
vinayakumarb opened a new pull request #1494: HADOOP-16558. [COMMON+HDFS] use 
protobuf-maven-plugin to generate protobuf classes
URL: https://github.com/apache/hadoop/pull/1494
 
 
   Use "protoc-maven-plugin" to dynamically download protobuf executable to 
generate protobuf classes from proto files.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on issue #1490: HDDS-2160. Add acceptance test for ozonesecure-mr compose. Contribute…

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1490: HDDS-2160. Add acceptance test for 
ozonesecure-mr compose. Contribute…
URL: https://github.com/apache/hadoop/pull/1490#issuecomment-533766473
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 104 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | 0 | shelldocs | 0 | Shelldocs was not available. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | -1 | mvninstall | 35 | hadoop-ozone in trunk failed. |
   | -1 | compile | 25 | hadoop-ozone in trunk failed. |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | shadedclient | 34 | branch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 56 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | -1 | mvninstall | 33 | hadoop-ozone in the patch failed. |
   | -1 | compile | 23 | hadoop-ozone in the patch failed. |
   | -1 | javac | 23 | hadoop-ozone in the patch failed. |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | shellcheck | 0 | There were no new shellcheck issues. |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 30 | patch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 48 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 150 | hadoop-hdds in the patch failed. |
   | -1 | unit | 25 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 26 | The patch does not generate ASF License warnings. |
   | | | 1238 | |
   
   
   | Reason | Tests |
   |-------:|:------|
   | Failed junit tests | hadoop.ozone.container.ozoneimpl.TestOzoneContainer |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=19.03.0 Server=19.03.0 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1490 |
   | Optional Tests | dupname asflicense mvnsite unit shellcheck shelldocs 
compile javac javadoc mvninstall shadedclient |
   | uname | Linux a01eca9bda51 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/patch-javadoc-hadoop-ozone.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/patch-unit-hadoop-hdds.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/patch-unit-hadoop-ozone.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/testReport/ |
   | Max. process+thread count | 292 (vs. ulimit of 5500) |
   | modules | C: hadoop-ozone/dist U: hadoop-ozone/dist |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/console |
   | versions | git=2.7.4 maven=3.3.9 shellcheck=0.4.6 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on issue #1482: HADOOP-16589. [pb-upgrade] Update docker image to make 3.7.1 protoc as default

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1482: HADOOP-16589. [pb-upgrade] Update docker 
image to make 3.7.1 protoc as default
URL: https://github.com/apache/hadoop/pull/1482#issuecomment-533766104
 
 
   (!) A patch to the testing environment has been detected. 
   Re-executing against the patched versions to perform further tests. 
   The console is at 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1482/3/console in case 
of problems.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo structure to be saved in deletedTable

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo 
structure to be saved in deletedTable
URL: https://github.com/apache/hadoop/pull/1491#issuecomment-533762852
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 39 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 3 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 15 | Maven dependency ordering for branch |
   | -1 | mvninstall | 29 | hadoop-ozone in trunk failed. |
   | -1 | compile | 21 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 50 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | shadedclient | 97 | branch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 43 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 161 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 24 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 16 | Maven dependency ordering for patch |
   | -1 | mvninstall | 30 | hadoop-ozone in the patch failed. |
   | -1 | compile | 23 | hadoop-ozone in the patch failed. |
   | -1 | cc | 23 | hadoop-ozone in the patch failed. |
   | -1 | javac | 23 | hadoop-ozone in the patch failed. |
   | +1 | checkstyle | 50 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 30 | patch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 46 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 23 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 229 | hadoop-hdds in the patch failed. |
   | -1 | unit | 26 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 28 | The patch does not generate ASF License warnings. |
   | | | 1666 | |
   
   
   | Reason | Tests |
   |-------:|:------|
   | Failed junit tests | 
hadoop.hdds.scm.container.placement.algorithms.TestSCMContainerPlacementRackAware
 |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/3/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1491 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle cc |
   | uname | Linux 5864805974c4 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/3/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/3/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/3/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/3/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/3/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/3/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | cc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/3/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/3/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/3/artifact/out/patch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/3/artifact/out/patch-findbugs-hadoop-ozone.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/3/artifact/out/patch-unit-hadoop-hdds.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/3/artifact/out/patch-unit-hadoop-ozone.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/3/testReport/ |
   | Max. process+thread count | 588 (vs. ulimit of 5500) |
   | modules | C: hadoop-ozone/common hadoop-ozone/ozone-manager U: 
hadoop-ozone |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/3/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This 

[GitHub] [hadoop] hadoop-yetus commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo structure to be saved in deletedTable

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo 
structure to be saved in deletedTable
URL: https://github.com/apache/hadoop/pull/1491#issuecomment-533761420
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 39 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 3 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 25 | Maven dependency ordering for branch |
   | -1 | mvninstall | 28 | hadoop-ozone in trunk failed. |
   | -1 | compile | 22 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 57 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | shadedclient | 110 | branch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 45 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 157 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 23 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 15 | Maven dependency ordering for patch |
   | -1 | mvninstall | 30 | hadoop-ozone in the patch failed. |
   | -1 | compile | 22 | hadoop-ozone in the patch failed. |
   | -1 | cc | 22 | hadoop-ozone in the patch failed. |
   | -1 | javac | 22 | hadoop-ozone in the patch failed. |
   | -0 | checkstyle | 27 | hadoop-ozone: The patch generated 2 new + 0 
unchanged - 0 fixed = 2 total (was 0) |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 1 | The patch has no whitespace issues. |
   | -1 | shadedclient | 32 | patch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 44 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 23 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | +1 | unit | 233 | hadoop-hdds in the patch passed. |
   | -1 | unit | 22 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 29 | The patch does not generate ASF License warnings. |
   | | | 1693 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1491 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle cc |
   | uname | Linux 702b37690e30 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/2/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/2/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/2/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/2/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/2/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/2/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | cc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/2/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/2/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/2/artifact/out/diff-checkstyle-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/2/artifact/out/patch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/2/artifact/out/patch-findbugs-hadoop-ozone.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/2/artifact/out/patch-unit-hadoop-ozone.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/2/testReport/ |
   | Max. process+thread count | 511 (vs. ulimit of 5500) |
   | modules | C: hadoop-ozone/common hadoop-ozone/ozone-manager U: 
hadoop-ozone |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/2/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


[GitHub] [hadoop] dineshchitlangia commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo structure to be saved in deletedTable

2019-09-20 Thread GitBox
dineshchitlangia commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo 
structure to be saved in deletedTable
URL: https://github.com/apache/hadoop/pull/1491#issuecomment-533761076
 
 
   /restest


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] dineshchitlangia removed a comment on issue #1491: HDDS-2161. Create RepeatedKeyInfo structure to be saved in deletedTable

2019-09-20 Thread GitBox
dineshchitlangia removed a comment on issue #1491: HDDS-2161. Create 
RepeatedKeyInfo structure to be saved in deletedTable
URL: https://github.com/apache/hadoop/pull/1491#issuecomment-533761076
 
 
   /restest


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] dineshchitlangia commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo structure to be saved in deletedTable

2019-09-20 Thread GitBox
dineshchitlangia commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo 
structure to be saved in deletedTable
URL: https://github.com/apache/hadoop/pull/1491#issuecomment-533761103
 
 
   /retest


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on issue #1493: HDDS-2163. Add 'Replication factor' to the output of list keys

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1493: HDDS-2163. Add 'Replication factor' to 
the output of list keys
URL: https://github.com/apache/hadoop/pull/1493#issuecomment-533760793
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 84 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 27 | Maven dependency ordering for branch |
   | -1 | mvninstall | 30 | hadoop-ozone in trunk failed. |
   | -1 | compile | 21 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 64 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | shadedclient | 119 | branch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 51 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 185 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 22 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 13 | Maven dependency ordering for patch |
   | -1 | mvninstall | 29 | hadoop-ozone in the patch failed. |
   | -1 | compile | 20 | hadoop-ozone in the patch failed. |
   | -1 | javac | 20 | hadoop-ozone in the patch failed. |
   | +1 | checkstyle | 50 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 30 | patch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 47 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 21 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | +1 | unit | 266 | hadoop-hdds in the patch passed. |
   | -1 | unit | 24 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 26 | The patch does not generate ASF License warnings. |
   | | | 1855 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1493/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1493 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 5109e31b4ad3 4.15.0-54-generic #58-Ubuntu SMP Mon Jun 24 
10:55:24 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1493/1/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1493/1/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1493/1/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1493/1/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1493/1/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1493/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1493/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1493/1/artifact/out/patch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1493/1/artifact/out/patch-findbugs-hadoop-ozone.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1493/1/artifact/out/patch-unit-hadoop-ozone.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1493/1/testReport/ |
   | Max. process+thread count | 436 (vs. ulimit of 5500) |
   | modules | C: hadoop-ozone/client hadoop-ozone/s3gateway U: hadoop-ozone |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1493/1/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org

[GitHub] [hadoop] hadoop-yetus commented on issue #1478: HDFS-14856 Fetch file ACLs while mounting external store

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1478: HDFS-14856 Fetch file ACLs while 
mounting external store
URL: https://github.com/apache/hadoop/pull/1478#issuecomment-533760591
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 35 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 2 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 24 | Maven dependency ordering for branch |
   | +1 | mvninstall | 1254 | trunk passed |
   | +1 | compile | 1153 | trunk passed |
   | +1 | checkstyle | 160 | trunk passed |
   | +1 | mvnsite | 114 | trunk passed |
   | -1 | shadedclient | 370 | branch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 117 | trunk passed |
   | 0 | spotbugs | 46 | Used deprecated FindBugs config; considering switching 
to SpotBugs. |
   | +1 | findbugs | 242 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 24 | Maven dependency ordering for patch |
   | +1 | mvninstall | 102 | the patch passed |
   | +1 | compile | 1127 | the patch passed |
   | +1 | javac | 1127 | the patch passed |
   | -0 | checkstyle | 158 | root: The patch generated 10 new + 452 unchanged - 
1 fixed = 462 total (was 453) |
   | +1 | mvnsite | 115 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | xml | 1 | The patch has no ill-formed XML file. |
   | -1 | shadedclient | 50 | patch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 121 | the patch passed |
   | +1 | findbugs | 269 | the patch passed |
   ||| _ Other Tests _ |
   | -1 | unit | 5828 | hadoop-hdfs in the patch failed. |
   | +1 | unit | 41 | hadoop-fs2img in the patch passed. |
   | +1 | asflicense | 50 | The patch does not generate ASF License warnings. |
   | | | 11403 | |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | hadoop.hdfs.server.datanode.TestDirectoryScanner |
   |   | hadoop.hdfs.TestFileChecksum |
   |   | hadoop.hdfs.TestLeaseRecovery2 |
   |   | hadoop.hdfs.TestErasureCodingPoliciesWithRandomECPolicy |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1478/3/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1478 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle xml |
   | uname | Linux 7f059105e6b8 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1478/3/artifact/out/diff-checkstyle-root.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1478/3/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1478/3/testReport/ |
   | Max. process+thread count | 3376 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs hadoop-tools/hadoop-fs2img U: 
. |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1478/3/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on issue #1478: HDFS-14856 Fetch file ACLs while mounting external store

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1478: HDFS-14856 Fetch file ACLs while 
mounting external store
URL: https://github.com/apache/hadoop/pull/1478#issuecomment-533760485
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 33 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 1 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 2 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 59 | Maven dependency ordering for branch |
   | +1 | mvninstall | 1287 | trunk passed |
   | +1 | compile | 1158 | trunk passed |
   | +1 | checkstyle | 211 | trunk passed |
   | +1 | mvnsite | 114 | trunk passed |
   | -1 | shadedclient | 422 | branch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 115 | trunk passed |
   | 0 | spotbugs | 47 | Used deprecated FindBugs config; considering switching 
to SpotBugs. |
   | +1 | findbugs | 244 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 24 | Maven dependency ordering for patch |
   | +1 | mvninstall | 92 | the patch passed |
   | +1 | compile | 1141 | the patch passed |
   | +1 | javac | 1141 | the patch passed |
   | -0 | checkstyle | 169 | root: The patch generated 9 new + 452 unchanged - 
1 fixed = 461 total (was 453) |
   | +1 | mvnsite | 110 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | xml | 2 | The patch has no ill-formed XML file. |
   | -1 | shadedclient | 52 | patch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 115 | the patch passed |
   | +1 | findbugs | 264 | the patch passed |
   ||| _ Other Tests _ |
   | -1 | unit | 5827 | hadoop-hdfs in the patch failed. |
   | +1 | unit | 61 | hadoop-fs2img in the patch passed. |
   | +1 | asflicense | 55 | The patch does not generate ASF License warnings. |
   | | | 11538 | |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | 
hadoop.hdfs.server.datanode.TestDataNodeVolumeFailureReporting |
   |   | hadoop.hdfs.TestFileChecksumCompositeCrc |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1478/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1478 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle xml |
   | uname | Linux 0ffddb4b3ead 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1478/2/artifact/out/diff-checkstyle-root.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1478/2/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1478/2/testReport/ |
   | Max. process+thread count | 2883 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs hadoop-tools/hadoop-fs2img U: 
. |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1478/2/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] jeffsaremi commented on a change in pull request #1480: HDFS-14857 FS operations fail in HA mode: DataNode fails to connect to NameNode

2019-09-20 Thread GitBox
jeffsaremi commented on a change in pull request #1480: HDFS-14857 FS 
operations fail in HA mode: DataNode fails to connect to NameNode
URL: https://github.com/apache/hadoop/pull/1480#discussion_r326843506
 
 

 ##
 File path: 
hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/server/namenode/ha/AbstractNNFailoverProxyProvider.java
 ##
 @@ -200,6 +201,24 @@ public HAServiceState getCachedState() {
 return proxies;
   }
 
+  /**
+   * Resets the NameNode proxy addresses in case they're stale
+   */
+  protected void resetProxyAddresses(List<NNProxyInfo<T>> proxies) {
 
 Review comment:
   changed as suggested
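   
   For readers following the thread, a minimal, self-contained sketch of the idea
   under discussion: re-resolving a cached socket address so a NameNode that came
   back on a new IP becomes reachable again after failover. The class and method
   names below are illustrative only and are not part of the patch.
   
   ```java
   import java.net.InetSocketAddress;
   
   /** Illustrative only: re-resolve a possibly stale NameNode address. */
   public final class StaleAddressExample {
   
     private StaleAddressExample() { }
   
     /**
      * Returns a freshly resolved copy of {@code cached}. Constructing a new
      * InetSocketAddress from the host name re-runs DNS resolution, so if the
      * NameNode host now maps to a different IP, the returned address carries
      * the new IP while the cached copy still points at the old, unreachable one.
      */
     public static InetSocketAddress refresh(InetSocketAddress cached) {
       return new InetSocketAddress(cached.getHostName(), cached.getPort());
     }
   }
   ```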


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] jeffsaremi commented on a change in pull request #1480: HDFS-14857 FS operations fail in HA mode: DataNode fails to connect to NameNode

2019-09-20 Thread GitBox
jeffsaremi commented on a change in pull request #1480: HDFS-14857 FS 
operations fail in HA mode: DataNode fails to connect to NameNode
URL: https://github.com/apache/hadoop/pull/1480#discussion_r326843503
 
 

 ##
 File path: 
hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/server/namenode/ha/ConfiguredFailoverProxyProvider.java
 ##
 @@ -36,6 +36,8 @@
 public class ConfiguredFailoverProxyProvider extends
 AbstractNNFailoverProxyProvider {
 
+  public static final String RESET_PROXY_ON_FAILOVER = 
"dfs.client.failover.proxy.provider.reset-proxy-on-failure";
 
 Review comment:
   removed this setting


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] jeffsaremi commented on a change in pull request #1480: HDFS-14857 FS operations fail in HA mode: DataNode fails to connect to NameNode

2019-09-20 Thread GitBox
jeffsaremi commented on a change in pull request #1480: HDFS-14857 FS 
operations fail in HA mode: DataNode fails to connect to NameNode
URL: https://github.com/apache/hadoop/pull/1480#discussion_r326843514
 
 

 ##
 File path: 
hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/server/namenode/ha/ConfiguredFailoverProxyProvider.java
 ##
 @@ -62,6 +64,11 @@ public ConfiguredFailoverProxyProvider(Configuration conf, 
URI uri,
 
   @Override
   public  void performFailover(T currentProxy) {
+if(conf.getBoolean(RESET_PROXY_ON_FAILOVER, false)) {
 
 Review comment:
   added one 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] dineshchitlangia commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo structure to be saved in deletedTable

2019-09-20 Thread GitBox
dineshchitlangia commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo 
structure to be saved in deletedTable
URL: https://github.com/apache/hadoop/pull/1491#issuecomment-533759741
 
 
   /retest


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16591) S3A ITest*MRjob failures

2019-09-20 Thread Siddharth Seth (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16591?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Siddharth Seth updated HADOOP-16591:

Status: Patch Available  (was: Open)

> S3A ITest*MRjob failures
> 
>
> Key: HADOOP-16591
> URL: https://issues.apache.org/jira/browse/HADOOP-16591
> Project: Hadoop Common
>  Issue Type: Test
>  Components: fs/s3
>Reporter: Siddharth Seth
>Assignee: Siddharth Seth
>Priority: Major
>
> ITest*MRJob fail with a FileNotFoundException
> {code}
> [ERROR]   
> ITestMagicCommitMRJob>AbstractITCommitMRJob.testMRJob:146->AbstractFSContractTestBase.assertIsDirectory:327
>  » FileNotFound
> [ERROR]   
> ITestDirectoryCommitMRJob>AbstractITCommitMRJob.testMRJob:146->AbstractFSContractTestBase.assertIsDirectory:327
>  » FileNotFound
> [ERROR]   
> ITestPartitionCommitMRJob>AbstractITCommitMRJob.testMRJob:146->AbstractFSContractTestBase.assertIsDirectory:327
>  » FileNotFound
> [ERROR]   
> ITestStagingCommitMRJob>AbstractITCommitMRJob.testMRJob:146->AbstractFSContractTestBase.assertIsDirectory:327
>  » FileNotFound
> {code}
> Details here: 
> https://issues.apache.org/jira/browse/HADOOP-16207?focusedCommentId=16933718=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-16933718
> Creating a separate jira since HADOOP-16207 already has a patch which is 
> trying to parallelize the test runs.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] vivekratnavel commented on issue #1493: HDDS-2163. Add 'Replication factor' to the output of list keys

2019-09-20 Thread GitBox
vivekratnavel commented on issue #1493: HDDS-2163. Add 'Replication factor' to 
the output of list keys
URL: https://github.com/apache/hadoop/pull/1493#issuecomment-533758900
 
 
   @bharatviswa504 @anuengineer @elek Please review


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] vivekratnavel commented on issue #1493: HDDS-2163. Add 'Replication factor' to the output of list keys

2019-09-20 Thread GitBox
vivekratnavel commented on issue #1493: HDDS-2163. Add 'Replication factor' to 
the output of list keys
URL: https://github.com/apache/hadoop/pull/1493#issuecomment-533758842
 
 
   /label ozone


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] vivekratnavel opened a new pull request #1493: HDDS-2163. Add 'Replication factor' to the output of list keys

2019-09-20 Thread GitBox
vivekratnavel opened a new pull request #1493: HDDS-2163. Add 'Replication 
factor' to the output of list keys
URL: https://github.com/apache/hadoop/pull/1493
 
 
   This patch adds replication factor to the output of list keys.
   
   ```
   bash-4.2$ ozone sh key list /vol1/bucket1
   {
 "volumeName" : "vol1",
 "bucketName" : "bucket1",
 "name" : "dockerfile",
 "dataSize" : 876,
 "creationTime" : 1569022651963,
 "modificationTime" : 1569022654752,
 "replicationType" : "RATIS",
 "replicationFactor" : 1
   }
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on issue #1492: MAPREDUCE-7241. FileInputFormat listStatus oom caused by lots of unwanted block infos

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1492: MAPREDUCE-7241. FileInputFormat 
listStatus oom caused by lots of unwanted block infos
URL: https://github.com/apache/hadoop/pull/1492#issuecomment-533756725
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 37 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | +1 | mvninstall | 1065 | trunk passed |
   | +1 | compile | 35 | trunk passed |
   | +1 | checkstyle | 32 | trunk passed |
   | +1 | mvnsite | 38 | trunk passed |
   | -1 | shadedclient | 113 | branch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 24 | trunk passed |
   | 0 | spotbugs | 71 | Used deprecated FindBugs config; considering switching 
to SpotBugs. |
   | +1 | findbugs | 69 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | +1 | mvninstall | 32 | the patch passed |
   | +1 | compile | 28 | the patch passed |
   | +1 | javac | 28 | the patch passed |
   | -0 | checkstyle | 25 | 
hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core: 
The patch generated 1 new + 146 unchanged - 0 fixed = 147 total (was 146) |
   | +1 | mvnsite | 31 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 28 | patch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 17 | the patch passed |
   | +1 | findbugs | 71 | the patch passed |
   ||| _ Other Tests _ |
   | +1 | unit | 322 | hadoop-mapreduce-client-core in the patch passed. |
   | +1 | asflicense | 25 | The patch does not generate ASF License warnings. |
   | | | 2081 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1492/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1492 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux f5f75b4ab6b5 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1492/1/artifact/out/diff-checkstyle-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-core.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1492/1/testReport/ |
   | Max. process+thread count | 1607 (vs. ulimit of 5500) |
   | modules | C: 
hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core 
U: 
hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1492/1/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-20 Thread GitBox
xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r326841135
 
 

 ##
 File path: 
hadoop-hdds/server-scm/src/main/java/org/apache/hadoop/hdds/scm/block/BlockManagerImpl.java
 ##
 @@ -117,6 +121,22 @@ public BlockManagerImpl(final Configuration conf,
 scm.getScmNodeManager(), scm.getEventQueue(), svcInterval,
 serviceTimeout, conf);
 safeModePrecheck = new SafeModePrecheck(conf);
+
+long heartbeatInterval =  conf.getTimeDuration(
+HddsConfigKeys.HDDS_HEARTBEAT_INTERVAL,
+HddsConfigKeys.HDDS_COMMAND_STATUS_REPORT_INTERVAL,
 
 Review comment:
   Should this be HDDS_HEARTBEAT_INTERVAL_DEFAULT?
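   
   For context, a sketch of what the suggestion appears to mean: the second
   argument of getTimeDuration should be the interval's default value rather than
   the HDDS_COMMAND_STATUS_REPORT_INTERVAL key. This is a hypothetical rewrite of
   the quoted lines, not the committed code, and it assumes
   HDDS_HEARTBEAT_INTERVAL_DEFAULT is the matching default constant.
   
   ```java
   // Hypothetical rewrite of the quoted snippet; HDDS_HEARTBEAT_INTERVAL_DEFAULT
   // is the constant named in the comment above, assumed to hold the default value.
   long heartbeatInterval = conf.getTimeDuration(
       HddsConfigKeys.HDDS_HEARTBEAT_INTERVAL,
       HddsConfigKeys.HDDS_HEARTBEAT_INTERVAL_DEFAULT,
       TimeUnit.MILLISECONDS);
   ```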


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-20 Thread GitBox
xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r326840996
 
 

 ##
 File path: 
hadoop-hdds/container-service/src/main/java/org/apache/hadoop/ozone/container/common/statemachine/commandhandler/CreatePipelineCommandHandler.java
 ##
 @@ -0,0 +1,226 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with this
+ * work for additional information regarding copyright ownership.  The ASF
+ * licenses this file to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ * 
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * 
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * License for the specific language governing permissions and limitations 
under
+ * the License.
+ */
+package org.apache.hadoop.ozone.container.common.statemachine.commandhandler;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.hdds.protocol.DatanodeDetails;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.CreatePipelineCommandProto;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.SCMCommandProto;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.CreatePipelineACKProto;
+import org.apache.hadoop.hdds.ratis.RatisHelper;
+import org.apache.hadoop.hdds.scm.ScmConfigKeys;
+import org.apache.hadoop.hdds.scm.pipeline.PipelineID;
+import org.apache.hadoop.hdds.security.x509.SecurityConfig;
+import org.apache.hadoop.io.MultipleIOException;
+import org.apache.hadoop.ozone.container.common.statemachine
+.SCMConnectionManager;
+import org.apache.hadoop.ozone.container.common.statemachine.StateContext;
+import org.apache.hadoop.ozone.container.ozoneimpl.OzoneContainer;
+import org.apache.hadoop.ozone.protocol.commands.CommandStatus;
+import org.apache.hadoop.ozone.protocol.commands.CreatePipelineCommand;
+import org.apache.hadoop.ozone.protocol.commands.CreatePipelineCommandStatus;
+import org.apache.hadoop.ozone.protocol.commands.SCMCommand;
+import org.apache.hadoop.util.Time;
+import org.apache.ratis.client.RaftClient;
+import org.apache.ratis.grpc.GrpcTlsConfig;
+import org.apache.ratis.protocol.NotLeaderException;
+import org.apache.ratis.protocol.RaftClientReply;
+import org.apache.ratis.protocol.RaftGroup;
+import org.apache.ratis.protocol.RaftGroupId;
+import org.apache.ratis.protocol.RaftPeer;
+import org.apache.ratis.retry.RetryPolicy;
+import org.apache.ratis.rpc.SupportedRpcType;
+import org.apache.ratis.util.TimeDuration;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.List;
+import java.util.concurrent.RejectedExecutionException;
+import java.util.function.Consumer;
+import java.util.stream.Collectors;
+
+/**
+ * Handler for create pipeline command received from SCM.
+ */
+public class CreatePipelineCommandHandler implements CommandHandler {
+
+  private static final Logger LOG =
+  LoggerFactory.getLogger(CreatePipelineCommandHandler.class);
+
+  private int invocationCount;
+  private long totalTime;
+
+  /**
+   * Constructs a createPipelineCommand handler.
+   */
+  public CreatePipelineCommandHandler() {
+  }
+
+  /**
+   * Handles a given SCM command.
+   *
+   * @param command   - SCM Command
+   * @param ozoneContainer- Ozone Container.
+   * @param context   - Current Context.
+   * @param connectionManager - The SCMs that we are talking to.
+   */
+  @Override
+  public void handle(SCMCommand command, OzoneContainer ozoneContainer,
+  StateContext context, SCMConnectionManager connectionManager) {
+invocationCount++;
+final long startTime = Time.monotonicNow();
+final DatanodeDetails dn = context.getParent()
+.getDatanodeDetails();
+final CreatePipelineCommandProto createCommand =
+((CreatePipelineCommand)command).getProto();
+final PipelineID pipelineID = PipelineID.getFromProtobuf(
+createCommand.getPipelineID());
+Collection<DatanodeDetails> peers =
+createCommand.getDatanodeList().stream()
+.map(DatanodeDetails::getFromProtoBuf)
+.collect(Collectors.toList());
+
+final CreatePipelineACKProto createPipelineACK =
+CreatePipelineACKProto.newBuilder()
+.setPipelineID(createCommand.getPipelineID())
+.setDatanodeUUID(dn.getUuidString()).build();
+boolean success = 

[GitHub] [hadoop] xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-20 Thread GitBox
xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r326840990
 
 

 ##
 File path: 
hadoop-hdds/container-service/src/main/java/org/apache/hadoop/ozone/container/common/statemachine/commandhandler/CreatePipelineCommandHandler.java
 ##
 @@ -0,0 +1,226 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with this
+ * work for additional information regarding copyright ownership.  The ASF
+ * licenses this file to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ * 
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * 
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * License for the specific language governing permissions and limitations 
under
+ * the License.
+ */
+package org.apache.hadoop.ozone.container.common.statemachine.commandhandler;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.hdds.protocol.DatanodeDetails;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.CreatePipelineCommandProto;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.SCMCommandProto;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.CreatePipelineACKProto;
+import org.apache.hadoop.hdds.ratis.RatisHelper;
+import org.apache.hadoop.hdds.scm.ScmConfigKeys;
+import org.apache.hadoop.hdds.scm.pipeline.PipelineID;
+import org.apache.hadoop.hdds.security.x509.SecurityConfig;
+import org.apache.hadoop.io.MultipleIOException;
+import org.apache.hadoop.ozone.container.common.statemachine
+.SCMConnectionManager;
+import org.apache.hadoop.ozone.container.common.statemachine.StateContext;
+import org.apache.hadoop.ozone.container.ozoneimpl.OzoneContainer;
+import org.apache.hadoop.ozone.protocol.commands.CommandStatus;
+import org.apache.hadoop.ozone.protocol.commands.CreatePipelineCommand;
+import org.apache.hadoop.ozone.protocol.commands.CreatePipelineCommandStatus;
+import org.apache.hadoop.ozone.protocol.commands.SCMCommand;
+import org.apache.hadoop.util.Time;
+import org.apache.ratis.client.RaftClient;
+import org.apache.ratis.grpc.GrpcTlsConfig;
+import org.apache.ratis.protocol.NotLeaderException;
+import org.apache.ratis.protocol.RaftClientReply;
+import org.apache.ratis.protocol.RaftGroup;
+import org.apache.ratis.protocol.RaftGroupId;
+import org.apache.ratis.protocol.RaftPeer;
+import org.apache.ratis.retry.RetryPolicy;
+import org.apache.ratis.rpc.SupportedRpcType;
+import org.apache.ratis.util.TimeDuration;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.List;
+import java.util.concurrent.RejectedExecutionException;
+import java.util.function.Consumer;
+import java.util.stream.Collectors;
+
+/**
+ * Handler for create pipeline command received from SCM.
+ */
+public class CreatePipelineCommandHandler implements CommandHandler {
+
+  private static final Logger LOG =
+  LoggerFactory.getLogger(CreatePipelineCommandHandler.class);
+
+  private int invocationCount;
+  private long totalTime;
+
+  /**
+   * Constructs a createPipelineCommand handler.
+   */
+  public CreatePipelineCommandHandler() {
+  }
+
+  /**
+   * Handles a given SCM command.
+   *
+   * @param command   - SCM Command
+   * @param ozoneContainer- Ozone Container.
+   * @param context   - Current Context.
+   * @param connectionManager - The SCMs that we are talking to.
+   */
+  @Override
+  public void handle(SCMCommand command, OzoneContainer ozoneContainer,
+  StateContext context, SCMConnectionManager connectionManager) {
+invocationCount++;
+final long startTime = Time.monotonicNow();
+final DatanodeDetails dn = context.getParent()
+.getDatanodeDetails();
+final CreatePipelineCommandProto createCommand =
+((CreatePipelineCommand)command).getProto();
+final PipelineID pipelineID = PipelineID.getFromProtobuf(
+createCommand.getPipelineID());
+Collection<DatanodeDetails> peers =
+createCommand.getDatanodeList().stream()
+.map(DatanodeDetails::getFromProtoBuf)
+.collect(Collectors.toList());
+
+final CreatePipelineACKProto createPipelineACK =
+CreatePipelineACKProto.newBuilder()
+.setPipelineID(createCommand.getPipelineID())
+.setDatanodeUUID(dn.getUuidString()).build();
+boolean success = 

[GitHub] [hadoop] xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-20 Thread GitBox
xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r326840925
 
 

 ##
 File path: 
hadoop-hdds/container-service/src/main/java/org/apache/hadoop/ozone/container/common/statemachine/commandhandler/CreatePipelineCommandHandler.java
 ##
 @@ -0,0 +1,226 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with this
+ * work for additional information regarding copyright ownership.  The ASF
+ * licenses this file to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ * 
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * 
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * License for the specific language governing permissions and limitations 
under
+ * the License.
+ */
+package org.apache.hadoop.ozone.container.common.statemachine.commandhandler;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.hdds.protocol.DatanodeDetails;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.CreatePipelineCommandProto;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.SCMCommandProto;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.CreatePipelineACKProto;
+import org.apache.hadoop.hdds.ratis.RatisHelper;
+import org.apache.hadoop.hdds.scm.ScmConfigKeys;
+import org.apache.hadoop.hdds.scm.pipeline.PipelineID;
+import org.apache.hadoop.hdds.security.x509.SecurityConfig;
+import org.apache.hadoop.io.MultipleIOException;
+import org.apache.hadoop.ozone.container.common.statemachine
+.SCMConnectionManager;
+import org.apache.hadoop.ozone.container.common.statemachine.StateContext;
+import org.apache.hadoop.ozone.container.ozoneimpl.OzoneContainer;
+import org.apache.hadoop.ozone.protocol.commands.CommandStatus;
+import org.apache.hadoop.ozone.protocol.commands.CreatePipelineCommand;
+import org.apache.hadoop.ozone.protocol.commands.CreatePipelineCommandStatus;
+import org.apache.hadoop.ozone.protocol.commands.SCMCommand;
+import org.apache.hadoop.util.Time;
+import org.apache.ratis.client.RaftClient;
+import org.apache.ratis.grpc.GrpcTlsConfig;
+import org.apache.ratis.protocol.NotLeaderException;
+import org.apache.ratis.protocol.RaftClientReply;
+import org.apache.ratis.protocol.RaftGroup;
+import org.apache.ratis.protocol.RaftGroupId;
+import org.apache.ratis.protocol.RaftPeer;
+import org.apache.ratis.retry.RetryPolicy;
+import org.apache.ratis.rpc.SupportedRpcType;
+import org.apache.ratis.util.TimeDuration;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.List;
+import java.util.concurrent.RejectedExecutionException;
+import java.util.function.Consumer;
+import java.util.stream.Collectors;
+
+/**
+ * Handler for create pipeline command received from SCM.
+ */
+public class CreatePipelineCommandHandler implements CommandHandler {
+
+  private static final Logger LOG =
+  LoggerFactory.getLogger(CreatePipelineCommandHandler.class);
+
+  private int invocationCount;
+  private long totalTime;
+
+  /**
+   * Constructs a createPipelineCommand handler.
+   */
+  public CreatePipelineCommandHandler() {
+  }
+
+  /**
+   * Handles a given SCM command.
+   *
+   * @param command   - SCM Command
+   * @param ozoneContainer- Ozone Container.
+   * @param context   - Current Context.
+   * @param connectionManager - The SCMs that we are talking to.
+   */
+  @Override
+  public void handle(SCMCommand command, OzoneContainer ozoneContainer,
+  StateContext context, SCMConnectionManager connectionManager) {
+invocationCount++;
+final long startTime = Time.monotonicNow();
+final DatanodeDetails dn = context.getParent()
+.getDatanodeDetails();
+final CreatePipelineCommandProto createCommand =
+((CreatePipelineCommand)command).getProto();
+final PipelineID pipelineID = PipelineID.getFromProtobuf(
+createCommand.getPipelineID());
+Collection<DatanodeDetails> peers =
+createCommand.getDatanodeList().stream()
+.map(DatanodeDetails::getFromProtoBuf)
+.collect(Collectors.toList());
+
+final CreatePipelineACKProto createPipelineACK =
+CreatePipelineACKProto.newBuilder()
+.setPipelineID(createCommand.getPipelineID())
+.setDatanodeUUID(dn.getUuidString()).build();
+boolean success = 

[GitHub] [hadoop] xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-20 Thread GitBox
xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r326840829
 
 

 ##
 File path: 
hadoop-hdds/container-service/src/main/java/org/apache/hadoop/ozone/container/common/statemachine/commandhandler/ClosePipelineCommandHandler.java
 ##
 @@ -0,0 +1,159 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with this
+ * work for additional information regarding copyright ownership.  The ASF
+ * licenses this file to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ * 
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * 
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * License for the specific language governing permissions and limitations 
under
+ * the License.
+ */
+package org.apache.hadoop.ozone.container.common.statemachine.commandhandler;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.hdds.protocol.DatanodeDetails;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.ClosePipelineCommandProto;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.SCMCommandProto;
+import org.apache.hadoop.hdds.ratis.RatisHelper;
+import org.apache.hadoop.hdds.scm.ScmConfigKeys;
+import org.apache.hadoop.hdds.scm.pipeline.PipelineID;
+import org.apache.hadoop.hdds.security.x509.SecurityConfig;
+import org.apache.hadoop.ozone.container.common.statemachine
+.SCMConnectionManager;
+import org.apache.hadoop.ozone.container.common.statemachine.StateContext;
+import org.apache.hadoop.ozone.container.ozoneimpl.OzoneContainer;
+import org.apache.hadoop.ozone.protocol.commands.ClosePipelineCommand;
+import org.apache.hadoop.ozone.protocol.commands.SCMCommand;
+import org.apache.hadoop.util.Time;
+import org.apache.ratis.client.RaftClient;
+import org.apache.ratis.grpc.GrpcTlsConfig;
+import org.apache.ratis.protocol.RaftGroupId;
+import org.apache.ratis.protocol.RaftPeer;
+import org.apache.ratis.retry.RetryPolicy;
+import org.apache.ratis.rpc.SupportedRpcType;
+import org.apache.ratis.util.TimeDuration;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+
+/**
+ * Handler for close pipeline command received from SCM.
+ */
+public class ClosePipelineCommandHandler implements CommandHandler {
+
+  private static final Logger LOG =
+  LoggerFactory.getLogger(ClosePipelineCommandHandler.class);
+
+  private int invocationCount;
+  private long totalTime;
+
+  /**
+   * Constructs a closePipelineCommand handler.
+   */
+  public ClosePipelineCommandHandler() {
+  }
+
+  /**
+   * Handles a given SCM command.
+   *
+   * @param command   - SCM Command
+   * @param ozoneContainer- Ozone Container.
+   * @param context   - Current Context.
+   * @param connectionManager - The SCMs that we are talking to.
+   */
+  @Override
+  public void handle(SCMCommand command, OzoneContainer ozoneContainer,
+  StateContext context, SCMConnectionManager connectionManager) {
+invocationCount++;
+final long startTime = Time.monotonicNow();
+final DatanodeDetails datanode = context.getParent()
+.getDatanodeDetails();
+final ClosePipelineCommandProto closeCommand =
+((ClosePipelineCommand)command).getProto();
+final PipelineID pipelineID = PipelineID.getFromProtobuf(
+closeCommand.getPipelineID());
+
+try {
+  destroyPipeline(datanode, pipelineID, context.getParent().getConf());
+  LOG.info("Close Pipeline #{} command on datanode #{}.", pipelineID,
+  datanode.getUuidString());
+} catch (IOException e) {
+  LOG.error("Can't close pipeline #{}", pipelineID, e);
+} finally {
+  long endTime = Time.monotonicNow();
+  totalTime += endTime - startTime;
+}
+  }
+
+  /**
+   * Returns the command type that this command handler handles.
+   *
+   * @return Type
+   */
+  @Override
+  public SCMCommandProto.Type getCommandType() {
+return SCMCommandProto.Type.closePipelineCommand;
+  }
+
+  /**
+   * Returns number of times this handler has been invoked.
+   *
+   * @return int
+   */
+  @Override
+  public int getInvocationCount() {
+return invocationCount;
+  }
+
+  /**
+   * Returns the average time this function takes to run.
+   *
+   * @return long
+   */
+  @Override
+  public long getAverageRunTime() {
+if (invocationCount > 0) {
+  return totalTime / invocationCount;
+}
+return 0;
+  }
+
+  /**
+   * Destroy pipeline on this datanode.

[GitHub] [hadoop] xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-20 Thread GitBox
xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r326840609
 
 

 ##
 File path: 
hadoop-hdds/container-service/src/main/java/org/apache/hadoop/ozone/container/common/statemachine/commandhandler/ClosePipelineCommandHandler.java
 ##
 @@ -0,0 +1,159 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with this
+ * work for additional information regarding copyright ownership.  The ASF
+ * licenses this file to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ * 
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * 
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * License for the specific language governing permissions and limitations 
under
+ * the License.
+ */
+package org.apache.hadoop.ozone.container.common.statemachine.commandhandler;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.hdds.protocol.DatanodeDetails;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.ClosePipelineCommandProto;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.SCMCommandProto;
+import org.apache.hadoop.hdds.ratis.RatisHelper;
+import org.apache.hadoop.hdds.scm.ScmConfigKeys;
+import org.apache.hadoop.hdds.scm.pipeline.PipelineID;
+import org.apache.hadoop.hdds.security.x509.SecurityConfig;
+import org.apache.hadoop.ozone.container.common.statemachine
+.SCMConnectionManager;
+import org.apache.hadoop.ozone.container.common.statemachine.StateContext;
+import org.apache.hadoop.ozone.container.ozoneimpl.OzoneContainer;
+import org.apache.hadoop.ozone.protocol.commands.ClosePipelineCommand;
+import org.apache.hadoop.ozone.protocol.commands.SCMCommand;
+import org.apache.hadoop.util.Time;
+import org.apache.ratis.client.RaftClient;
+import org.apache.ratis.grpc.GrpcTlsConfig;
+import org.apache.ratis.protocol.RaftGroupId;
+import org.apache.ratis.protocol.RaftPeer;
+import org.apache.ratis.retry.RetryPolicy;
+import org.apache.ratis.rpc.SupportedRpcType;
+import org.apache.ratis.util.TimeDuration;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+
+/**
+ * Handler for close pipeline command received from SCM.
+ */
+public class ClosePipelineCommandHandler implements CommandHandler {
+
+  private static final Logger LOG =
+  LoggerFactory.getLogger(ClosePipelineCommandHandler.class);
+
+  private int invocationCount;
 
 Review comment:
   Can we change this to AtomicLong?
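   
   Not part of the patch; a small self-contained sketch of the pattern being
   suggested, on the assumption that handle() may be invoked from more than one
   thread, in which case plain int/long increments are not atomic.
   
   ```java
   import java.util.concurrent.atomic.AtomicLong;
   
   /** Illustrative only: thread-safe invocation metrics for a command handler. */
   final class HandlerMetricsSketch {
     private final AtomicLong invocationCount = new AtomicLong();
     private final AtomicLong totalTimeMs = new AtomicLong();
   
     /** Record one handled command and how long it took. */
     void record(long elapsedMs) {
       invocationCount.incrementAndGet();
       totalTimeMs.addAndGet(elapsedMs);
     }
   
     /** Average run time in milliseconds, or 0 if nothing has been handled yet. */
     long averageRunTimeMs() {
       long count = invocationCount.get();
       return count > 0 ? totalTimeMs.get() / count : 0;
     }
   }
   ```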





[GitHub] [hadoop] xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-20 Thread GitBox
xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r326840609
 
 

 ##
 File path: 
hadoop-hdds/container-service/src/main/java/org/apache/hadoop/ozone/container/common/statemachine/commandhandler/ClosePipelineCommandHandler.java
 ##
 @@ -0,0 +1,159 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with this
+ * work for additional information regarding copyright ownership.  The ASF
+ * licenses this file to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ * 
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * 
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * License for the specific language governing permissions and limitations 
under
+ * the License.
+ */
+package org.apache.hadoop.ozone.container.common.statemachine.commandhandler;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.hdds.protocol.DatanodeDetails;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.ClosePipelineCommandProto;
+import org.apache.hadoop.hdds.protocol.proto.
+StorageContainerDatanodeProtocolProtos.SCMCommandProto;
+import org.apache.hadoop.hdds.ratis.RatisHelper;
+import org.apache.hadoop.hdds.scm.ScmConfigKeys;
+import org.apache.hadoop.hdds.scm.pipeline.PipelineID;
+import org.apache.hadoop.hdds.security.x509.SecurityConfig;
+import org.apache.hadoop.ozone.container.common.statemachine
+.SCMConnectionManager;
+import org.apache.hadoop.ozone.container.common.statemachine.StateContext;
+import org.apache.hadoop.ozone.container.ozoneimpl.OzoneContainer;
+import org.apache.hadoop.ozone.protocol.commands.ClosePipelineCommand;
+import org.apache.hadoop.ozone.protocol.commands.SCMCommand;
+import org.apache.hadoop.util.Time;
+import org.apache.ratis.client.RaftClient;
+import org.apache.ratis.grpc.GrpcTlsConfig;
+import org.apache.ratis.protocol.RaftGroupId;
+import org.apache.ratis.protocol.RaftPeer;
+import org.apache.ratis.retry.RetryPolicy;
+import org.apache.ratis.rpc.SupportedRpcType;
+import org.apache.ratis.util.TimeDuration;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+
+/**
+ * Handler for close pipeline command received from SCM.
+ */
+public class ClosePipelineCommandHandler implements CommandHandler {
+
+  private static final Logger LOG =
+  LoggerFactory.getLogger(ClosePipelineCommandHandler.class);
+
+  private int invocationCount;
 
 Review comment:
   Should this be a long?





[GitHub] [hadoop] xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-20 Thread GitBox
xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r326840553
 
 

 ##
 File path: hadoop-hdds/common/src/main/resources/ozone-default.xml
 ##
 @@ -1388,6 +1379,25 @@
 
   
 
+  
+hdds.scm.safemode.pipeline.creation
+true
+HDDS,SCM,OPERATION
+
+  Boolean value to enable background pipeline creation in SCM safe mode.
+
+  
+
+  
+hdds.scm.safemode.min.pipeline
 
 Review comment:
   Same as mentioned earlier, can we consolidate the criteria for pipelines to
   exit safe mode?
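   For context, SCM code would typically read these keys through the Hadoop
   Configuration API; a hypothetical fragment (the default for the
   min.pipeline key is assumed here, since its value is cut off in the diff
   above):
   
       // Hypothetical sketch of a safe-mode exit rule reading the new keys.
       boolean createPipelinesInSafeMode =
           conf.getBoolean("hdds.scm.safemode.pipeline.creation", true);
       int minPipelines =
           conf.getInt("hdds.scm.safemode.min.pipeline", 1); // default assumed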





[GitHub] [hadoop] xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-20 Thread GitBox
xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r326840379
 
 

 ##
 File path: hadoop-hdds/common/src/main/resources/ozone-default.xml
 ##
 @@ -310,15 +310,6 @@
   datanode periodically send container report to SCM. Unit could be
   defined with postfix (ns,ms,s,m,h,d)
   
-  
-hdds.command.status.report.interval
 
 Review comment:
   Is this a dead key being removed? Should we also remove the corresponding
   constants defined in HddsConfigKeys.java?





[GitHub] [hadoop] xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-20 Thread GitBox
xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r326840379
 
 

 ##
 File path: hadoop-hdds/common/src/main/resources/ozone-default.xml
 ##
 @@ -310,15 +310,6 @@
   datanode periodically send container report to SCM. Unit could be
   defined with postfix (ns,ms,s,m,h,d)
   
-  
-hdds.command.status.report.interval
 
 Review comment:
   Is this a dead key being removed?





[GitHub] [hadoop] xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS pipeline creation and destroy through heartbea…

2019-09-20 Thread GitBox
xiaoyuyao commented on a change in pull request #1469: HDDS-2034. Async RATIS 
pipeline creation and destroy through heartbea…
URL: https://github.com/apache/hadoop/pull/1469#discussion_r326840253
 
 

 ##
 File path: 
hadoop-hdds/common/src/main/java/org/apache/hadoop/hdds/HddsConfigKeys.java
 ##
 @@ -84,21 +84,29 @@
   public static final String HDDS_SCM_SAFEMODE_PIPELINE_AVAILABILITY_CHECK =
   "hdds.scm.safemode.pipeline-availability.check";
   public static final boolean
-  HDDS_SCM_SAFEMODE_PIPELINE_AVAILABILITY_CHECK_DEFAULT = false;
+  HDDS_SCM_SAFEMODE_PIPELINE_AVAILABILITY_CHECK_DEFAULT = true;
+
+  public static final String HDDS_SCM_SAFEMODE_PIPELINE_CREATION =
+  "hdds.scm.safemode.pipeline.creation";
+  public static final boolean
+  HDDS_SCM_SAFEMODE_PIPELINE_CREATION_DEFAULT = true;
 
   // % of containers which should have at least one reported replica
   // before SCM comes out of safe mode.
   public static final String HDDS_SCM_SAFEMODE_THRESHOLD_PCT =
   "hdds.scm.safemode.threshold.pct";
   public static final double HDDS_SCM_SAFEMODE_THRESHOLD_PCT_DEFAULT = 0.99;
 
-
   // percentage of healthy pipelines, where all 3 datanodes are reported in the
   // pipeline.
   public static final String HDDS_SCM_SAFEMODE_HEALTHY_PIPELINE_THRESHOLD_PCT =
   "hdds.scm.safemode.healthy.pipelie.pct";
   public static final double
   HDDS_SCM_SAFEMODE_HEALTHY_PIPELINE_THRESHOLD_PCT_DEFAULT = 0.10;
 
 Review comment:
   Can we reuse HDDS_SCM_SAFEMODE_HEALTHY_PIPELINE_THRESHOLD_PCT_DEFAULT or 
consolidate the criteria? 





[GitHub] [hadoop] dengzhhu653 opened a new pull request #1492: MAPREDUCE-7241. FileInputFormat listStatus oom caused by lots of unwanted block infos

2019-09-20 Thread GitBox
dengzhhu653 opened a new pull request #1492: MAPREDUCE-7241. FileInputFormat 
listStatus oom caused by lots of unwanted block infos
URL: https://github.com/apache/hadoop/pull/1492
 
 
   FileInputFormat may cause an OOM problem when scanning a large number of
   files during status listing, as described in
   https://issues.apache.org/jira/browse/MAPREDUCE-7241. This patch reduces the
   memory usage, allowing MR to scan more files with a smaller memory footprint.
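   As a rough illustration of the general idea only (not the actual patch):
   the BlockLocation arrays carried by each LocatedFileStatus can dominate the
   heap when millions of files are listed, so one way to shrink the footprint
   is to keep just the plain FileStatus fields when locations are not needed.
   
       import java.io.IOException;
       import org.apache.hadoop.fs.FileStatus;
       import org.apache.hadoop.fs.LocatedFileStatus;
   
       // Hypothetical helper: drop block-location data from a listing entry.
       static FileStatus withoutBlockLocations(LocatedFileStatus located)
           throws IOException {
         // The FileStatus copy constructor copies path, length, permissions,
         // etc., but not the BlockLocation array held by LocatedFileStatus.
         return new FileStatus(located);
       }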
   





[GitHub] [hadoop] xiaoyuyao commented on a change in pull request #1488: HDDS-2159. Fix Race condition in ProfileServlet#pid.

2019-09-20 Thread GitBox
xiaoyuyao commented on a change in pull request #1488: HDDS-2159. Fix Race 
condition in ProfileServlet#pid.
URL: https://github.com/apache/hadoop/pull/1488#discussion_r326838670
 
 

 ##
 File path: 
hadoop-hdds/framework/src/main/java/org/apache/hadoop/hdds/server/ProfileServlet.java
 ##
 @@ -208,11 +208,11 @@ protected void doGet(final HttpServletRequest req,
   return;
 }
 // if pid is explicitly specified, use it else default to current process
-pid = getInteger(req, "pid", pid);
+Integer processId = getInteger(req, "pid", pid);
 
 // if pid is not specified in query param and if current process pid
 
 Review comment:
   Should we update the comment? It seems we are not trying to determine the
   current process pid here.
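   For reference, the race-free pattern the patch moves toward is to copy the
   shared servlet field into a request-local variable once and use only the
   local copy afterwards, since a single servlet instance serves concurrent
   requests; a minimal hypothetical fragment:
   
       // Read the shared default once; the rest of this request must use
       // processId, never the mutable servlet field.
       Integer processId = getInteger(req, "pid", pid);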





[GitHub] [hadoop] xiaoyuyao commented on a change in pull request #1431: HDDS-1569 Support creating multiple pipelines with same datanode

2019-09-20 Thread GitBox
xiaoyuyao commented on a change in pull request #1431: HDDS-1569 Support 
creating multiple pipelines with same datanode
URL: https://github.com/apache/hadoop/pull/1431#discussion_r326837652
 
 

 ##
 File path: 
hadoop-hdds/server-scm/src/main/java/org/apache/hadoop/hdds/scm/pipeline/SCMPipelineManager.java
 ##
 @@ -158,6 +158,8 @@ public synchronized Pipeline createPipeline(
   throw idEx;
 } catch (IOException ex) {
   metrics.incNumPipelineCreationFailed();
+  LOG.error("Pipeline creation failed due to exception: "
 
 Review comment:
   This can be changed to: LOG.error("Pipeline creation failed", ex);
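   For context, SLF4J prints the full stack trace when the exception is passed
   as the final argument, so concatenating the exception into the message is
   unnecessary; for example (the {} placeholder argument is illustrative):
   
       // Logs the message plus the complete stack trace of ex.
       LOG.error("Pipeline creation failed", ex);
       // Parameterized form; the throwable still goes last.
       LOG.error("Pipeline creation failed for pipeline {}", pipelineId, ex);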

   
   





[GitHub] [hadoop] xiaoyuyao commented on a change in pull request #1431: HDDS-1569 Support creating multiple pipelines with same datanode

2019-09-20 Thread GitBox
xiaoyuyao commented on a change in pull request #1431: HDDS-1569 Support 
creating multiple pipelines with same datanode
URL: https://github.com/apache/hadoop/pull/1431#discussion_r326837288
 
 

 ##
 File path: 
hadoop-hdds/server-scm/src/main/java/org/apache/hadoop/hdds/scm/pipeline/PipelinePlacementPolicy.java
 ##
 @@ -80,7 +81,19 @@ public PipelinePlacementPolicy(
*/
   @VisibleForTesting
   boolean meetCriteria(DatanodeDetails datanodeDetails, long heavyNodeLimit) {
-    return (nodeManager.getPipelinesCount(datanodeDetails) <= heavyNodeLimit);
+    if (heavyNodeLimit == 0) {
+      // no limit applied.
+      return true;
+    }
+    boolean meet = (nodeManager.getPipelinesCount(datanodeDetails)
+        <= heavyNodeLimit);
+    if (!meet) {
+      LOG.info("Pipeline Placement: Doesn't meet the criteria to be viable " +
+          "node: " + datanodeDetails.getUuid().toString() + " Heaviness: " +
 
 Review comment:
   NIT: "Doesn't meet the criteria to be viable node" => "Can't place more
   pipelines on a heavy datanode"





[GitHub] [hadoop] xiaoyuyao commented on a change in pull request #1431: HDDS-1569 Support creating multiple pipelines with same datanode

2019-09-20 Thread GitBox
xiaoyuyao commented on a change in pull request #1431: HDDS-1569 Support 
creating multiple pipelines with same datanode
URL: https://github.com/apache/hadoop/pull/1431#discussion_r326836211
 
 

 ##
 File path: 
hadoop-hdds/server-scm/src/main/java/org/apache/hadoop/hdds/scm/node/states/Node2PipelineMap.java
 ##
 @@ -71,6 +71,10 @@ public synchronized void addPipeline(Pipeline pipeline) {
   UUID dnId = details.getUuid();
   dn2ObjectMap.computeIfAbsent(dnId, k -> ConcurrentHashMap.newKeySet())
   .add(pipeline.getId());
+  dn2ObjectMap.computeIfPresent(dnId, (k, v) -> {
 
 Review comment:
   Line 73 should already be able to add(pipeline.getId()) if the map entry
   exists; lines 74-77 seem redundant.
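   In other words, computeIfAbsent already returns the existing set when the
   key is present (or creates and registers a new one), so the whole update
   can stay a single statement:
   
       // The follow-up computeIfPresent block is unnecessary.
       dn2ObjectMap.computeIfAbsent(dnId, k -> ConcurrentHashMap.newKeySet())
           .add(pipeline.getId());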





[GitHub] [hadoop] hadoop-yetus commented on issue #1399: HADOOP-16543: Cached DNS name resolution error

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1399: HADOOP-16543: Cached DNS name resolution 
error
URL: https://github.com/apache/hadoop/pull/1399#issuecomment-533745609
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 135 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 1 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 2 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 21 | Maven dependency ordering for branch |
   | +1 | mvninstall | 1507 | trunk passed |
   | +1 | compile | 752 | trunk passed |
   | +1 | checkstyle | 103 | trunk passed |
   | +1 | mvnsite | 213 | trunk passed |
   | -1 | shadedclient | 397 | branch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 187 | trunk passed |
   | 0 | spotbugs | 34 | Used deprecated FindBugs config; considering switching 
to SpotBugs. |
   | 0 | findbugs | 34 | 
branch/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-site no findbugs output file 
(findbugsXml.xml) |
   | -0 | patch | 90 | Used diff version of patch file. Binary files and 
potentially other changes not applied. Please rebase and squash commits if 
necessary. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 18 | Maven dependency ordering for patch |
   | -1 | mvninstall | 32 | hadoop-yarn-common in the patch failed. |
   | -1 | mvninstall | 28 | hadoop-yarn-client in the patch failed. |
   | -1 | compile | 55 | hadoop-yarn in the patch failed. |
   | -1 | javac | 55 | hadoop-yarn in the patch failed. |
   | -0 | checkstyle | 84 | hadoop-yarn-project/hadoop-yarn: The patch 
generated 15 new + 214 unchanged - 0 fixed = 229 total (was 214) |
   | -1 | mvnsite | 32 | hadoop-yarn-common in the patch failed. |
   | -1 | mvnsite | 29 | hadoop-yarn-client in the patch failed. |
   | -1 | whitespace | 0 | The patch has 8 line(s) that end in whitespace. Use 
git apply --whitespace=fix <>. Refer 
https://git-scm.com/docs/git-apply |
   | +1 | xml | 2 | The patch has no ill-formed XML file. |
   | -1 | shadedclient | 38 | patch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 52 | hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common 
generated 4 new + 4190 unchanged - 0 fixed = 4194 total (was 4190) |
   | -1 | findbugs | 29 | hadoop-yarn-common in the patch failed. |
   | -1 | findbugs | 27 | hadoop-yarn-client in the patch failed. |
   | 0 | findbugs | 17 | hadoop-yarn-project/hadoop-yarn/hadoop-yarn-site has 
no data from findbugs |
   ||| _ Other Tests _ |
   | +1 | unit | 56 | hadoop-yarn-api in the patch passed. |
   | -1 | unit | 32 | hadoop-yarn-common in the patch failed. |
   | -1 | unit | 32 | hadoop-yarn-client in the patch failed. |
   | +1 | unit | 15 | hadoop-yarn-site in the patch passed. |
   | +1 | asflicense | 29 | The patch does not generate ASF License warnings. |
   | | | 4557 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=18.09.7 Server=18.09.7 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1399/10/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1399 |
   | JIRA Issue | HADOOP-16543 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle xml |
   | uname | Linux ce709c0d40b8 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1399/10/artifact/out/patch-mvninstall-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1399/10/artifact/out/patch-mvninstall-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-client.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1399/10/artifact/out/patch-compile-hadoop-yarn-project_hadoop-yarn.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1399/10/artifact/out/patch-compile-hadoop-yarn-project_hadoop-yarn.txt
 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1399/10/artifact/out/diff-checkstyle-hadoop-yarn-project_hadoop-yarn.txt
 |
   | mvnsite | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1399/10/artifact/out/patch-mvnsite-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt
 |
   | mvnsite | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1399/10/artifact/out/patch-mvnsite-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-client.txt
 |
   | whitespace | 

[jira] [Commented] (HADOOP-16543) Cached DNS name resolution error

2019-09-20 Thread Hadoop QA (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16543?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16934835#comment-16934835
 ] 

Hadoop QA commented on HADOOP-16543:


| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
| {color:blue}0{color} | {color:blue} reexec {color} | {color:blue}  2m 
15s{color} | {color:blue} Docker mode activated. {color} |
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} dupname {color} | {color:green}  0m  
1s{color} | {color:green} No case conflicting files found. {color} |
| {color:green}+1{color} | {color:green} @author {color} | {color:green}  0m  
0s{color} | {color:green} The patch does not contain any @author tags. {color} |
| {color:green}+1{color} | {color:green} test4tests {color} | {color:green}  0m 
 0s{color} | {color:green} The patch appears to include 2 new or modified test 
files. {color} |
|| || || || {color:brown} trunk Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m 
21s{color} | {color:blue} Maven dependency ordering for branch {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 25m 
 7s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 12m 
32s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  1m 
43s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green}  3m 
33s{color} | {color:green} trunk passed {color} |
| {color:red}-1{color} | {color:red} shadedclient {color} | {color:red}  6m 
37s{color} | {color:red} branch has errors when building and testing our client 
artifacts. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  3m  
7s{color} | {color:green} trunk passed {color} |
| {color:blue}0{color} | {color:blue} spotbugs {color} | {color:blue}  0m 
34s{color} | {color:blue} Used deprecated FindBugs config; considering 
switching to SpotBugs. {color} |
| {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue}  0m 
34s{color} | {color:blue} 
branch/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-site no findbugs output file 
(findbugsXml.xml) {color} |
| {color:orange}-0{color} | {color:orange} patch {color} | {color:orange}  1m 
30s{color} | {color:orange} Used diff version of patch file. Binary files and 
potentially other changes not applied. Please rebase and squash commits if 
necessary. {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue}  0m 
18s{color} | {color:blue} Maven dependency ordering for patch {color} |
| {color:red}-1{color} | {color:red} mvninstall {color} | {color:red}  0m 
32s{color} | {color:red} hadoop-yarn-common in the patch failed. {color} |
| {color:red}-1{color} | {color:red} mvninstall {color} | {color:red}  0m 
28s{color} | {color:red} hadoop-yarn-client in the patch failed. {color} |
| {color:red}-1{color} | {color:red} compile {color} | {color:red}  0m 
55s{color} | {color:red} hadoop-yarn in the patch failed. {color} |
| {color:red}-1{color} | {color:red} javac {color} | {color:red}  0m 55s{color} 
| {color:red} hadoop-yarn in the patch failed. {color} |
| {color:orange}-0{color} | {color:orange} checkstyle {color} | {color:orange}  
1m 24s{color} | {color:orange} hadoop-yarn-project/hadoop-yarn: The patch 
generated 15 new + 214 unchanged - 0 fixed = 229 total (was 214) {color} |
| {color:red}-1{color} | {color:red} mvnsite {color} | {color:red}  0m 
32s{color} | {color:red} hadoop-yarn-common in the patch failed. {color} |
| {color:red}-1{color} | {color:red} mvnsite {color} | {color:red}  0m 
29s{color} | {color:red} hadoop-yarn-client in the patch failed. {color} |
| {color:red}-1{color} | {color:red} whitespace {color} | {color:red}  0m  
0s{color} | {color:red} The patch has 8 line(s) that end in whitespace. Use git 
apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply 
{color} |
| {color:green}+1{color} | {color:green} xml {color} | {color:green}  0m  
2s{color} | {color:green} The patch has no ill-formed XML file. {color} |
| {color:red}-1{color} | {color:red} shadedclient {color} | {color:red}  0m 
38s{color} | {color:red} patch has errors when building and testing our client 
artifacts. {color} |
| {color:red}-1{color} | {color:red} javadoc {color} | {color:red}  0m 
52s{color} | {color:red} hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common 
generated 4 new + 4190 unchanged - 0 fixed = 4194 total (was 4190) {color} |
| {color:red}-1{color} | {color:red} findbugs {color} | {color:red}  0m 
29s{color} | {color:red} hadoop-yarn-common in the patch failed. {color} |
| {color:red}-1{color} | {color:red} findbugs {color} | {color:red}  0m 
27s{color} | 

[GitHub] [hadoop] ashvina commented on a change in pull request #1478: HDFS-14856 Fetch file ACLs while mounting external store

2019-09-20 Thread GitBox
ashvina commented on a change in pull request #1478: HDFS-14856 Fetch file ACLs 
while mounting external store
URL: https://github.com/apache/hadoop/pull/1478#discussion_r326831610
 
 

 ##
 File path: 
hadoop-tools/hadoop-fs2img/src/main/java/org/apache/hadoop/hdfs/server/namenode/TreePath.java
 ##
 @@ -141,7 +158,11 @@ INode toFile(UGIResolver ugi, BlockResolver blk,
 "Exact path handle not supported by filesystem " + fs.toString());
   }
 }
-// TODO: storage policy should be configurable per path; use BlockResolver
+if (aclStatus != null) {
+  throw new UnsupportedOperationException(
+  "Acls not supported by ImageWriter");
+}
+//TODO: storage policy should be configurable per path; use BlockResolver
 
 Review comment:
   This is an existing comment, will need to build some context to improve it.





[GitHub] [hadoop] ashvina commented on a change in pull request #1478: HDFS-14856 Fetch file ACLs while mounting external store

2019-09-20 Thread GitBox
ashvina commented on a change in pull request #1478: HDFS-14856 Fetch file ACLs 
while mounting external store
URL: https://github.com/apache/hadoop/pull/1478#discussion_r326831492
 
 

 ##
 File path: 
hadoop-tools/hadoop-fs2img/src/main/java/org/apache/hadoop/hdfs/server/namenode/FSTreeWalk.java
 ##
 @@ -71,14 +95,24 @@ private FSTreeIterator() {
 }
 
 FSTreeIterator(TreePath p) {
+  AclStatus acls = null;
+  Path remotePath = p.getFileStatus().getPath();
+  try {
+acls = getAclStatus(fs, remotePath);
+  } catch (IOException e) {
+LOG.warn(
+"Got exception when trying to get remote acls for path {} : {}",
+remotePath, e.getMessage());
+  }
   getPendingQueue().addFirst(
-  new TreePath(p.getFileStatus(), p.getParentId(), this, fs));
+  new TreePath(p.getFileStatus(), p.getParentId(), this, fs, acls));
 }
 
 FSTreeIterator(Path p) throws IOException {
   try {
 FileStatus s = fs.getFileStatus(root);
-getPendingQueue().addFirst(new TreePath(s, -1L, this, fs));
+AclStatus acls = getAclStatus(fs, s.getPath());
 
 Review comment:
   Good catch. The scanner will not fail if the remote FS does not support
   ACLs. All other IOExceptions now fail the operation in all paths. Thanks!
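   For illustration, one common way to implement that distinction (a
   hypothetical sketch, not necessarily what the patch does) is to swallow
   only the "ACLs not supported" case and let every other failure propagate:
   
       // Hypothetical helper used while scanning the remote tree.
       private AclStatus getAclStatus(FileSystem fs, Path path)
           throws IOException {
         try {
           return fs.getAclStatus(path);
         } catch (UnsupportedOperationException e) {
           // Remote FS has no ACL support; continue the scan without ACLs.
           return null;
         }
       }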





[GitHub] [hadoop] hadoop-yetus commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo structure to be saved in deletedTable

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo 
structure to be saved in deletedTable
URL: https://github.com/apache/hadoop/pull/1491#issuecomment-533736737
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 71 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 2 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 25 | Maven dependency ordering for branch |
   | -1 | mvninstall | 30 | hadoop-ozone in trunk failed. |
   | -1 | compile | 21 | hadoop-ozone in trunk failed. |
   | -0 | checkstyle | 34 | The patch fails to run checkstyle in hadoop-ozone |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | shadedclient | 107 | branch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 46 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 156 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 24 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 15 | Maven dependency ordering for patch |
   | -1 | mvninstall | 30 | hadoop-ozone in the patch failed. |
   | -1 | compile | 22 | hadoop-ozone in the patch failed. |
   | -1 | cc | 22 | hadoop-ozone in the patch failed. |
   | -1 | javac | 22 | hadoop-ozone in the patch failed. |
   | -0 | checkstyle | 26 | The patch fails to run checkstyle in hadoop-ozone |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 32 | patch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 46 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 24 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 253 | hadoop-hdds in the patch failed. |
   | -1 | unit | 25 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 28 | The patch does not generate ASF License warnings. |
   | | | 1739 | |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | 
hadoop.hdds.scm.container.placement.algorithms.TestSCMContainerPlacementRackAware
 |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.2 Server=19.03.2 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1491 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle cc |
   | uname | Linux e35d020848de 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out//home/jenkins/jenkins-slave/workspace/hadoop-multibranch_PR-1491/out/maven-branch-checkstyle-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | cc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out//home/jenkins/jenkins-slave/workspace/hadoop-multibranch_PR-1491/out/maven-patch-checkstyle-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/patch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/patch-findbugs-hadoop-ozone.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1491/1/artifact/out/patch-unit-hadoop-hdds.txt
 |
   | unit | 

[GitHub] [hadoop] goiri commented on a change in pull request #1399: HADOOP-16543: Cached DNS name resolution error

2019-09-20 Thread GitBox
goiri commented on a change in pull request #1399: HADOOP-16543: Cached DNS 
name resolution error
URL: https://github.com/apache/hadoop/pull/1399#discussion_r326824345
 
 

 ##
 File path: 
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/src/test/java/org/apache/hadoop/yarn/client/TestRMFailoverProxyProvider.java
 ##
 @@ -0,0 +1,320 @@
+
+
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ * 
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * 
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.yarn.client;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.io.retry.FailoverProxyProvider;
+import org.apache.hadoop.yarn.api.ApplicationClientProtocol;
+import org.apache.hadoop.yarn.client.api.YarnClient;
+import org.apache.hadoop.yarn.conf.YarnConfiguration;
+import org.apache.hadoop.yarn.exceptions.YarnException;
+import org.junit.Before;
+import org.junit.Test;
+
+import java.io.Closeable;
+import java.io.IOException;
+import java.lang.reflect.InvocationHandler;
+import java.lang.reflect.Proxy;
+import java.net.InetSocketAddress;
+
+import static org.junit.Assert.*;
+import static org.mockito.Mockito.any;
+import static org.mockito.Mockito.eq;
+import static org.mockito.Mockito.times;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.verify;
+import static org.mockito.Mockito.when;
+
+/**
+ * Unit tests for {@link ConfiguredRMFailoverProxyProvider} and
+ * {@link AutoRefreshRMFailoverProxyProvider}.
+ */
+public class TestRMFailoverProxyProvider {
+  private Configuration conf;
+
+  @Before
+  public void setUp() throws IOException, YarnException {
+conf = new YarnConfiguration();
+conf.setClass(YarnConfiguration.CLIENT_FAILOVER_NO_HA_PROXY_PROVIDER,
+ConfiguredRMFailoverProxyProvider.class, 
RMFailoverProxyProvider.class);
+conf.setBoolean(YarnConfiguration.RM_HA_ENABLED, true);
+  }
+
+  /**
+   * Test that the {@link ConfiguredRMFailoverProxyProvider}
+   * will loop through its set of proxies when
+   * and {@link ConfiguredRMFailoverProxyProvider#performFailover(Object)}
+   * gets called.
+   */
+  @Test
+  public void testFailoverChange() throws Exception {
+class MockProxy extends Proxy implements Closeable {
+  protected MockProxy(InvocationHandler h) {
+super(h);
+  }
+
+  @Override
+  public void close() throws IOException {
+  }
+}
+
+//Adjusting the YARN Conf
+conf.set(YarnConfiguration.RM_HA_IDS, "rm0, rm1");
+
+// Create two proxies and mock a RMProxy
+Proxy mockProxy2 = new MockProxy((proxy, method, args) -> null);
+Proxy mockProxy1 = new MockProxy((proxy, method, args) -> null);
+
+Class protocol = ApplicationClientProtocol.class;
+RMProxy mockRMProxy = mock(RMProxy.class);
+ConfiguredRMFailoverProxyProvider<ApplicationClientProtocol> fpp =
+new ConfiguredRMFailoverProxyProvider<ApplicationClientProtocol>();
+
+// generate two address with different ports.
+// Default port of yarn RM
+int port1 = 8032;
+int port2 = 8031;
+InetSocketAddress mockAdd1 = new InetSocketAddress(port1);
+InetSocketAddress mockAdd2 = new InetSocketAddress(port2);
+
+// Mock RMProxy methods
+when(mockRMProxy.getRMAddress(any(YarnConfiguration.class),
+any(Class.class))).thenReturn(mockAdd1);
+when(mockRMProxy.getProxy(any(YarnConfiguration.class),
+any(Class.class), eq(mockAdd1))).thenReturn(mockProxy1);
+
+// Initialize failover proxy provider and get proxy from it.
+fpp.init(conf, mockRMProxy, protocol);
+FailoverProxyProvider.ProxyInfo<ApplicationClientProtocol> actualProxy1 = fpp.getProxy();
+assertEquals(
+"ConfiguredRMFailoverProxyProvider doesn't generate " +
+"expected proxy",
+mockProxy1, actualProxy1.proxy);
+
+// Invoke fpp.getProxy() multiple times and
+// validate the returned proxy is always mockProxy1
+actualProxy1 = fpp.getProxy();
+assertEquals(
+"ConfiguredRMFailoverProxyProvider doesn't generate " +
+"expected proxy",
+mockProxy1, actualProxy1.proxy);
+actualProxy1 = fpp.getProxy();
+assertEquals(
+"ConfiguredRMFailoverProxyProvider doesn't generate " +
+"expected proxy",
+mockProxy1, actualProxy1.proxy);
+
+// verify that mockRMProxy.getProxy() is invoked once 

[GitHub] [hadoop] anuengineer commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo structure to be saved in deletedTable

2019-09-20 Thread GitBox
anuengineer commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo 
structure to be saved in deletedTable
URL: https://github.com/apache/hadoop/pull/1491#issuecomment-533732606
 
 
   +1, pending Jenkins. I think you don't need the change in KeymanagerImpl. 
But no harm in doing that change. 





[GitHub] [hadoop] anuengineer commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo structure to be saved in deletedTable

2019-09-20 Thread GitBox
anuengineer commented on issue #1491: HDDS-2161. Create RepeatedKeyInfo 
structure to be saved in deletedTable
URL: https://github.com/apache/hadoop/pull/1491#issuecomment-533732216
 
 
   /label ozone





[GitHub] [hadoop] hadoop-yetus commented on issue #1489: HDDS-2019. Handle Set DtService of token in S3Gateway for OM HA.

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1489: HDDS-2019. Handle Set DtService of token 
in S3Gateway for OM HA.
URL: https://github.com/apache/hadoop/pull/1489#issuecomment-533728099
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 46 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | -1 | mvninstall | 35 | hadoop-ozone in trunk failed. |
   | -1 | compile | 24 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 82 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | shadedclient | 145 | branch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 56 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 226 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 25 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | -1 | mvninstall | 33 | hadoop-ozone in the patch failed. |
   | -1 | compile | 22 | hadoop-ozone in the patch failed. |
   | -1 | javac | 22 | hadoop-ozone in the patch failed. |
   | -0 | checkstyle | 30 | hadoop-ozone: The patch generated 1 new + 0 
unchanged - 0 fixed = 1 total (was 0) |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 32 | patch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 56 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 26 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | +1 | unit | 266 | hadoop-hdds in the patch passed. |
   | -1 | unit | 25 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 28 | The patch does not generate ASF License warnings. |
   | | | 2058 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1489 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 1395f9604b97 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/diff-checkstyle-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/patch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/patch-findbugs-hadoop-ozone.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/artifact/out/patch-unit-hadoop-ozone.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/testReport/ |
   | Max. process+thread count | 478 (vs. ulimit of 5500) |
   | modules | C: hadoop-ozone/s3gateway U: hadoop-ozone/s3gateway |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/2/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


 

[GitHub] [hadoop] dineshchitlangia opened a new pull request #1491: HDDS-2161. Create RepeatedKeyInfo structure to be saved in deletedTable

2019-09-20 Thread GitBox
dineshchitlangia opened a new pull request #1491: HDDS-2161. Create 
RepeatedKeyInfo structure to be saved in deletedTable
URL: https://github.com/apache/hadoop/pull/1491
 
 
   /label ozone
   





[GitHub] [hadoop] hadoop-yetus commented on issue #1489: HDDS-2019. Handle Set DtService of token in S3Gateway for OM HA.

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1489: HDDS-2019. Handle Set DtService of token 
in S3Gateway for OM HA.
URL: https://github.com/apache/hadoop/pull/1489#issuecomment-533725569
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 75 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | -1 | mvninstall | 30 | hadoop-ozone in trunk failed. |
   | -1 | compile | 19 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 59 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | shadedclient | 108 | branch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 47 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 174 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 20 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | -1 | mvninstall | 28 | hadoop-ozone in the patch failed. |
   | -1 | compile | 19 | hadoop-ozone in the patch failed. |
   | -1 | javac | 19 | hadoop-ozone in the patch failed. |
   | -0 | checkstyle | 25 | hadoop-ozone: The patch generated 1 new + 0 
unchanged - 0 fixed = 1 total (was 0) |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 1 | The patch has no whitespace issues. |
   | -1 | shadedclient | 28 | patch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 47 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 21 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | +1 | unit | 276 | hadoop-hdds in the patch passed. |
   | -1 | unit | 22 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 24 | The patch does not generate ASF License warnings. |
   | | | 1735 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.2 Server=19.03.2 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1489 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux addcf29dbc03 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/diff-checkstyle-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/patch-javadoc-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/patch-findbugs-hadoop-ozone.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/artifact/out/patch-unit-hadoop-ozone.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/testReport/ |
   | Max. process+thread count | 429 (vs. ulimit of 5500) |
   | modules | C: hadoop-ozone/s3gateway U: hadoop-ozone/s3gateway |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1489/1/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


 

[GitHub] [hadoop] hadoop-yetus commented on issue #1490: HDDS-2160. Add acceptance test for ozonesecure-mr compose. Contribute…

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1490: HDDS-2160. Add acceptance test for 
ozonesecure-mr compose. Contribute…
URL: https://github.com/apache/hadoop/pull/1490#issuecomment-533724849
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 153 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | 0 | shelldocs | 0 | Shelldocs was not available. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | -1 | mvninstall | 36 | hadoop-ozone in trunk failed. |
   | -1 | compile | 25 | hadoop-ozone in trunk failed. |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | shadedclient | 38 | branch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 59 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | -1 | mvninstall | 35 | hadoop-ozone in the patch failed. |
   | -1 | compile | 27 | hadoop-ozone in the patch failed. |
   | -1 | javac | 27 | hadoop-ozone in the patch failed. |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | shellcheck | 1 | There were no new shellcheck issues. |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 33 | patch has errors when building and testing our 
client artifacts. |
   | -1 | javadoc | 60 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | +1 | unit | 309 | hadoop-hdds in the patch passed. |
   | -1 | unit | 31 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 37 | The patch does not generate ASF License warnings. |
   | | | 1531 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=18.09.7 Server=18.09.7 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1490 |
   | Optional Tests | dupname asflicense mvnsite unit shellcheck shelldocs 
compile javac javadoc mvninstall shadedclient |
   | uname | Linux 3b6e16cad131 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / d7d6ec8 |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/branch-javadoc-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javadoc | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/patch-javadoc-hadoop-ozone.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/artifact/out/patch-unit-hadoop-ozone.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/testReport/ |
   | Max. process+thread count | 423 (vs. ulimit of 5500) |
   | modules | C: hadoop-ozone/dist U: hadoop-ozone/dist |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1490/1/console |
   | versions | git=2.7.4 maven=3.3.9 shellcheck=0.4.6 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   





[GitHub] [hadoop] xiaoyuyao opened a new pull request #1490: HDDS-2160. Add acceptance test for ozonesecure-mr compose. Contribute…

2019-09-20 Thread GitBox
xiaoyuyao opened a new pull request #1490: HDDS-2160. Add acceptance test for 
ozonesecure-mr compose. Contribute…
URL: https://github.com/apache/hadoop/pull/1490
 
 
   





[GitHub] [hadoop] bharatviswa504 opened a new pull request #1489: HDDS-2019. Handle Set DtService of token in S3Gateway for OM HA.

2019-09-20 Thread GitBox
bharatviswa504 opened a new pull request #1489: HDDS-2019. Handle Set DtService 
of token in S3Gateway for OM HA.
URL: https://github.com/apache/hadoop/pull/1489
 
 
   Some work is still needed on the server side to add a secure-om-s3-ha 
cluster, as some fields like keytab do not yet support parameters with a 
serviceid and node id.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] anuengineer merged pull request #1445: HDDS-2128. Make ozone sh command work with OM HA service ids

2019-09-20 Thread GitBox
anuengineer merged pull request #1445: HDDS-2128. Make ozone sh command work 
with OM HA service ids
URL: https://github.com/apache/hadoop/pull/1445
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] anuengineer merged pull request #1485: HDDS-2157. checkstyle: print filenames relative to project root

2019-09-20 Thread GitBox
anuengineer merged pull request #1485: HDDS-2157. checkstyle: print filenames 
relative to project root
URL: https://github.com/apache/hadoop/pull/1485
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on issue #1488: HDDS-2159. Fix Race condition in ProfileServlet#pid.

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1488: HDDS-2159. Fix Race condition in 
ProfileServlet#pid.
URL: https://github.com/apache/hadoop/pull/1488#issuecomment-533705368
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 36 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | -1 | mvninstall | 28 | hadoop-ozone in trunk failed. |
   | -1 | compile | 22 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 51 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | shadedclient | 99 | branch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 144 | trunk passed |
   | 0 | spotbugs | 158 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 25 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | -1 | mvninstall | 30 | hadoop-ozone in the patch failed. |
   | -1 | compile | 21 | hadoop-ozone in the patch failed. |
   | -1 | javac | 21 | hadoop-ozone in the patch failed. |
   | +1 | checkstyle | 50 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 31 | patch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 146 | the patch passed |
   | -1 | findbugs | 23 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 228 | hadoop-hdds in the patch failed. |
   | -1 | unit | 26 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 28 | The patch does not generate ASF License warnings. |
   | | | 1687 | |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | 
hadoop.hdds.scm.container.placement.algorithms.TestSCMContainerPlacementRackAware
 |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1488 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux e5ff34138d70 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 3f223be |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/patch-findbugs-hadoop-ozone.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/patch-unit-hadoop-hdds.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/artifact/out/patch-unit-hadoop-ozone.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/testReport/ |
   | Max. process+thread count | 487 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdds/framework U: hadoop-hdds/framework |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1488/1/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, 

[GitHub] [hadoop] hadoop-yetus commented on issue #1442: HADOOP-16570. S3A committers leak threads on job/task commit.

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1442: HADOOP-16570. S3A committers leak 
threads on job/task commit.
URL: https://github.com/apache/hadoop/pull/1442#issuecomment-533697723
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 83 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 4 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | +1 | mvninstall | 1310 | trunk passed |
   | +1 | compile | 35 | trunk passed |
   | +1 | checkstyle | 28 | trunk passed |
   | +1 | mvnsite | 41 | trunk passed |
   | -1 | shadedclient | 109 | branch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 29 | trunk passed |
   | 0 | spotbugs | 71 | Used deprecated FindBugs config; considering switching 
to SpotBugs. |
   | +1 | findbugs | 68 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | +1 | mvninstall | 38 | the patch passed |
   | +1 | compile | 32 | the patch passed |
   | +1 | javac | 32 | the patch passed |
   | -0 | checkstyle | 22 | hadoop-tools/hadoop-aws: The patch generated 10 new 
+ 40 unchanged - 0 fixed = 50 total (was 40) |
   | +1 | mvnsite | 34 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 32 | patch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 25 | the patch passed |
   | +1 | findbugs | 77 | the patch passed |
   ||| _ Other Tests _ |
   | -1 | unit | 72 | hadoop-aws in the patch failed. |
   | +1 | asflicense | 24 | The patch does not generate ASF License warnings. |
   | | | 2125 | |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | hadoop.fs.s3a.commit.staging.TestStagingCommitter |
   |   | hadoop.fs.s3a.commit.staging.TestStagingPartitionedJobCommit |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.2 Server=19.03.2 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/4/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1442 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux b164b6d10a2f 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 3f223be |
   | Default Java | 1.8.0_222 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/4/artifact/out/diff-checkstyle-hadoop-tools_hadoop-aws.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/4/artifact/out/patch-unit-hadoop-tools_hadoop-aws.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/4/testReport/ |
   | Max. process+thread count | 327 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/4/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on issue #1487: HADOOP-16591 Fix S3A ITest*MRjob failures.

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1487: HADOOP-16591 Fix S3A ITest*MRjob 
failures.
URL: https://github.com/apache/hadoop/pull/1487#issuecomment-533696736
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 79 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | +1 | mvninstall | 1189 | trunk passed |
   | +1 | compile | 31 | trunk passed |
   | +1 | checkstyle | 24 | trunk passed |
   | +1 | mvnsite | 37 | trunk passed |
   | -1 | shadedclient | 98 | branch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 24 | trunk passed |
   | 0 | spotbugs | 60 | Used deprecated FindBugs config; considering switching 
to SpotBugs. |
   | +1 | findbugs | 58 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | +1 | mvninstall | 32 | the patch passed |
   | +1 | compile | 26 | the patch passed |
   | +1 | javac | 26 | the patch passed |
   | +1 | checkstyle | 17 | the patch passed |
   | +1 | mvnsite | 30 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 27 | patch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 20 | the patch passed |
   | +1 | findbugs | 60 | the patch passed |
   ||| _ Other Tests _ |
   | +1 | unit | 78 | hadoop-aws in the patch passed. |
   | +1 | asflicense | 23 | The patch does not generate ASF License warnings. |
   | | | 1927 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.2 Server=19.03.2 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1487/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1487 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 732c1198d1c9 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 3f223be |
   | Default Java | 1.8.0_222 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1487/1/testReport/ |
   | Max. process+thread count | 305 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1487/1/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on issue #1486: HDDS-2158. Fixing Json Injection Issue in JsonUtils.

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1486: HDDS-2158. Fixing Json Injection Issue 
in JsonUtils.
URL: https://github.com/apache/hadoop/pull/1486#issuecomment-533695825
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 75 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | 0 | mvndep | 22 | Maven dependency ordering for branch |
   | -1 | mvninstall | 28 | hadoop-ozone in trunk failed. |
   | -1 | compile | 19 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 48 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | shadedclient | 91 | branch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 150 | trunk passed |
   | 0 | spotbugs | 172 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | -1 | findbugs | 21 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 21 | Maven dependency ordering for patch |
   | -1 | mvninstall | 29 | hadoop-ozone in the patch failed. |
   | -1 | compile | 19 | hadoop-ozone in the patch failed. |
   | -1 | javac | 19 | hadoop-ozone in the patch failed. |
   | -0 | checkstyle | 23 | hadoop-hdds: The patch generated 1 new + 0 
unchanged - 0 fixed = 1 total (was 0) |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 28 | patch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 149 | the patch passed |
   | -1 | findbugs | 21 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | +1 | unit | 265 | hadoop-hdds in the patch passed. |
   | -1 | unit | 22 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 24 | The patch does not generate ASF License warnings. |
   | | | 1807 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.2 Server=19.03.2 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1486 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 414a6d781e05 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 3f223be |
   | Default Java | 1.8.0_222 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/branch-compile-hadoop-ozone.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/branch-findbugs-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | compile | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | javac | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/patch-compile-hadoop-ozone.txt
 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/diff-checkstyle-hadoop-hdds.txt
 |
   | findbugs | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/patch-findbugs-hadoop-ozone.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/artifact/out/patch-unit-hadoop-ozone.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/testReport/ |
   | Max. process+thread count | 483 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdds/common hadoop-hdds/tools 
hadoop-ozone/ozone-manager U: . |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1486/1/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [hadoop] hanishakoneru opened a new pull request #1488: HDDS-2159. Fix Race condition in ProfileServlet#pid.

2019-09-20 Thread GitBox
hanishakoneru opened a new pull request #1488: HDDS-2159. Fix Race condition in 
ProfileServlet#pid.
URL: https://github.com/apache/hadoop/pull/1488
 
 
   There is a race condition in ProfileServlet: the servlet member field pid 
should not be used for local assignment, since concurrent requests can then 
race on the same field.
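   A minimal, hypothetical sketch of the race being described (the class and 
method names below are invented for illustration; this is not the actual 
ProfileServlet code):
   
```java
// Hypothetical sketch only; not the actual ProfileServlet implementation.
public class ProfileServletSketch {

  // Shared member field: concurrent requests overwrite each other's value.
  private volatile String pid;

  // Problematic pattern: a per-request value is stored in the member field.
  public void handleRequestShared(String requestedPid) {
    pid = requestedPid;   // request B can overwrite this before request A reads it
    profile(pid);         // request A may then profile the pid submitted by request B
  }

  // Safer pattern: keep the per-request value in a local variable.
  public void handleRequestLocal(String requestedPid) {
    final String localPid = requestedPid;
    profile(localPid);
  }

  private void profile(String p) {
    System.out.println("profiling pid " + p);
  }
}
```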


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] jeffsaremi commented on a change in pull request #1480: HDFS-14857 FS operations fail in HA mode: DataNode fails to connect to NameNode

2019-09-20 Thread GitBox
jeffsaremi commented on a change in pull request #1480: HDFS-14857 FS 
operations fail in HA mode: DataNode fails to connect to NameNode
URL: https://github.com/apache/hadoop/pull/1480#discussion_r326781148
 
 

 ##
 File path: 
hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/server/namenode/ha/AbstractNNFailoverProxyProvider.java
 ##
 @@ -200,6 +201,24 @@ public HAServiceState getCachedState() {
 return proxies;
   }
 
+  /**
+   * Resets the NameNode proxy addresses in case they're stale
+   */
+  protected void resetProxyAddresses(List<NNProxyInfo<T>> proxies) {
 
 Review comment:
   Yes, this is a sensible suggestion. I'll work on incorporating that.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16590) IBM Java has deprecated OS login module classes and OS principal classes.

2019-09-20 Thread Hadoop QA (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16934718#comment-16934718
 ] 

Hadoop QA commented on HADOOP-16590:


| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
| {color:blue}0{color} | {color:blue} reexec {color} | {color:blue}  0m 
37s{color} | {color:blue} Docker mode activated. {color} |
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} dupname {color} | {color:green}  0m  
0s{color} | {color:green} No case conflicting files found. {color} |
| {color:green}+1{color} | {color:green} @author {color} | {color:green}  0m  
0s{color} | {color:green} The patch does not contain any @author tags. {color} |
| {color:red}-1{color} | {color:red} test4tests {color} | {color:red}  0m  
0s{color} | {color:red} The patch doesn't appear to include any new or modified 
tests. Please justify why no new tests are needed for this patch. Also please 
list what manual steps were performed to verify this patch. {color} |
|| || || || {color:brown} trunk Compile Tests {color} ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 17m 
47s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 16m 
30s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  0m 
52s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green}  1m 
24s{color} | {color:green} trunk passed {color} |
| {color:red}-1{color} | {color:red} shadedclient {color} | {color:red}  3m 
15s{color} | {color:red} branch has errors when building and testing our client 
artifacts. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  1m 
33s{color} | {color:green} trunk passed {color} |
| {color:blue}0{color} | {color:blue} spotbugs {color} | {color:blue}  2m  
5s{color} | {color:blue} Used deprecated FindBugs config; considering switching 
to SpotBugs. {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green}  2m  
4s{color} | {color:green} trunk passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  0m 
48s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 15m 
35s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 15m 
35s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  0m 
51s{color} | {color:green} hadoop-common-project/hadoop-common: The patch 
generated 0 new + 76 unchanged - 3 fixed = 76 total (was 79) {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green}  1m 
22s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green}  0m 
 0s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:red}-1{color} | {color:red} shadedclient {color} | {color:red}  0m 
50s{color} | {color:red} patch has errors when building and testing our client 
artifacts. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  1m 
36s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green}  2m  
9s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} unit {color} | {color:green}  9m  
7s{color} | {color:green} hadoop-common in the patch passed. {color} |
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green}  0m 
51s{color} | {color:green} The patch does not generate ASF License warnings. 
{color} |
| {color:black}{color} | {color:black} {color} | {color:black} 78m 43s{color} | 
{color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1484/1/artifact/out/Dockerfile
 |
| GITHUB PR | https://github.com/apache/hadoop/pull/1484 |
| JIRA Issue | HADOOP-16590 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite 
unit shadedclient findbugs checkstyle |
| uname | Linux 1853247e5ab8 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | personality/hadoop.sh |
| git revision | trunk / b3173e1 |
| Default Java | 1.8.0_222 |
|  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1484/1/testReport/ |
| Max. process+thread count | 1373 (vs. ulimit of 5500) |
| modules | C: 

[GitHub] [hadoop] hadoop-yetus commented on issue #1484: [HADOOP-16590] - LoginModule Classes and Principal Classes deprecated in IBM Java.

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1484: [HADOOP-16590] - LoginModule Classes and 
Principal Classes deprecated in IBM Java.
URL: https://github.com/apache/hadoop/pull/1484#issuecomment-533689091
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 37 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | +1 | mvninstall | 1067 | trunk passed |
   | +1 | compile | 990 | trunk passed |
   | +1 | checkstyle | 52 | trunk passed |
   | +1 | mvnsite | 84 | trunk passed |
   | -1 | shadedclient | 195 | branch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 93 | trunk passed |
   | 0 | spotbugs | 125 | Used deprecated FindBugs config; considering 
switching to SpotBugs. |
   | +1 | findbugs | 124 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | +1 | mvninstall | 48 | the patch passed |
   | +1 | compile | 935 | the patch passed |
   | +1 | javac | 935 | the patch passed |
   | +1 | checkstyle | 51 | hadoop-common-project/hadoop-common: The patch 
generated 0 new + 76 unchanged - 3 fixed = 76 total (was 79) |
   | +1 | mvnsite | 82 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 50 | patch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 96 | the patch passed |
   | +1 | findbugs | 129 | the patch passed |
   ||| _ Other Tests _ |
   | +1 | unit | 547 | hadoop-common in the patch passed. |
   | +1 | asflicense | 51 | The patch does not generate ASF License warnings. |
   | | | 4723 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1484/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1484 |
   | JIRA Issue | HADOOP-16590 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 1853247e5ab8 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / b3173e1 |
   | Default Java | 1.8.0_222 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1484/1/testReport/ |
   | Max. process+thread count | 1373 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common U: 
hadoop-common-project/hadoop-common |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1484/1/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] vinayakumarb commented on issue #1482: HADOOP-16589. [pb-upgrade] Update docker image to make 3.7.1 protoc as default

2019-09-20 Thread GitBox
vinayakumarb commented on issue #1482: HADOOP-16589. [pb-upgrade] Update docker 
image to make 3.7.1 protoc as default
URL: https://github.com/apache/hadoop/pull/1482#issuecomment-533689284
 
 
   Since the changes were detected in the parent module, unit tests were 
triggered for all modules.
The Jenkins run was aborted after 5 hrs.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] ayushtkn commented on issue #1482: HADOOP-16589. [pb-upgrade] Update docker image to make 3.7.1 protoc as default

2019-09-20 Thread GitBox
ayushtkn commented on issue #1482: HADOOP-16589. [pb-upgrade] Update docker 
image to make 3.7.1 protoc as default
URL: https://github.com/apache/hadoop/pull/1482#issuecomment-533688181
 
 
   /rebuild


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] ayushtkn commented on issue #1482: HADOOP-16589. [pb-upgrade] Update docker image to make 3.7.1 protoc as default

2019-09-20 Thread GitBox
ayushtkn commented on issue #1482: HADOOP-16589. [pb-upgrade] Update docker 
image to make 3.7.1 protoc as default
URL: https://github.com/apache/hadoop/pull/1482#issuecomment-533688094
 
 
   The build seems to have failed weirdly:
   
   > Error when executing cleanup post condition:
   org.jenkinsci.plugins.workflow.steps.FlowInterruptedException
at 
org.jenkinsci.plugins.workflow.steps.TimeoutStepExecution.cancel(TimeoutStepExecution.java:160)
at 
org.jenkinsci.plugins.workflow.steps.TimeoutStepExecution.access$100(TimeoutStepExecution.java:38)
at 
org.jenkinsci.plugins.workflow.steps.TimeoutStepExecution$1.run(TimeoutStepExecution.java:134)
at 
jenkins.security.ImpersonatingScheduledExecutorService$1.run(ImpersonatingScheduledExecutorService.java:58)
at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
   
   I guess you need to retrigger once again.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16579) Upgrade to Apache Curator 4.2.0 in Hadoop

2019-09-20 Thread Steve Loughran (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16579?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16934708#comment-16934708
 ] 

Steve Loughran commented on HADOOP-16579:
-

hadoop trunk is on guava 27 and jackson 2.9.x, and both should be kept high. 
swift is new, as may be resteasy

> Upgrade to Apache Curator 4.2.0 in Hadoop
> -
>
> Key: HADOOP-16579
> URL: https://issues.apache.org/jira/browse/HADOOP-16579
> Project: Hadoop Common
>  Issue Type: Improvement
>Reporter: Mate Szalay-Beko
>Priority: Major
>
> Currently in Hadoop we are using [ZooKeeper version 
> 3.4.13|https://github.com/apache/hadoop/blob/7f9073132dcc9db157a6792635d2ed099f2ef0d2/hadoop-project/pom.xml#L90].
>  ZooKeeper 3.5.5 is the latest stable Apache ZooKeeper release. It contains 
> many new features (including SSL related improvements which can be very 
> important for production use; see [the release 
> notes|https://zookeeper.apache.org/doc/r3.5.5/releasenotes.html]).
> Apache Curator is a high level ZooKeeper client library, that makes it easier 
> to use the low level ZooKeeper API. Currently [in Hadoop we are using Curator 
> 2.13.0|https://github.com/apache/hadoop/blob/7f9073132dcc9db157a6792635d2ed099f2ef0d2/hadoop-project/pom.xml#L91]
>  and [in Ozone we use Curator 
> 2.12.0|https://github.com/apache/hadoop/blob/7f9073132dcc9db157a6792635d2ed099f2ef0d2/pom.ozone.xml#L146].
> Curator 2.x is supporting only the ZooKeeper 3.4.x releases, while Curator 
> 3.x is compatible only with the new ZooKeeper 3.5.x releases. Fortunately, 
> the latest Curator 4.x versions are compatible with both ZooKeeper 3.4.x and 
> 3.5.x. (see [the relevant Curator 
> page|https://curator.apache.org/zk-compatibility.html]). Many Apache projects 
> have already migrated to Curator 4 (like HBase, Phoenix, Druid, etc.), other 
> components are doing it right now (e.g. Hive).
> *The aims of this task are* to:
>  - change Curator version in Hadoop to the latest stable 4.x version 
> (currently 4.2.0)
>  - also make sure we don't have multiple ZooKeeper versions in the classpath 
> to avoid runtime problems (it is 
> [recommended|https://curator.apache.org/zk-compatibility.html] to exclude the 
> ZooKeeper which come with Curator, so that there will be only a single 
> ZooKeeper version used runtime in Hadoop)
> In this ticket we still don't want to change the default ZooKeeper version in 
> Hadoop, we only want to make it possible for the community to be able to 
> build / use Hadoop with the new ZooKeeper (e.g. if they need to secure the 
> ZooKeeper communication with SSL, what is only supported in the new ZooKeeper 
> version). Upgrading to Curator 4.x should keep Hadoop to be compatible with 
> both ZooKeeper 3.4 and 3.5.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-13363) Upgrade protobuf from 2.5.0 to something newer

2019-09-20 Thread Steve Loughran (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-13363?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16934705#comment-16934705
 ] 

Steve Loughran commented on HADOOP-13363:
-

> Since I'm the originator of this JIRA issue, I can't get away from it.

 

[~aw]

 Before you sign up for the ASF developer protection program -- where you are 
given a false identity and can live the rest of your life (happily?) fielding 
Lotus Notes support calls as the IT team for an organisation still debating 
when to make the switch to Windows Millennium -- have you considered just 
editing the reporter field and putting someone else's name in instead?

> Upgrade protobuf from 2.5.0 to something newer
> --
>
> Key: HADOOP-13363
> URL: https://issues.apache.org/jira/browse/HADOOP-13363
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.0.0-alpha1, 3.0.0-alpha2
>Reporter: Allen Wittenauer
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: security
> Attachments: HADOOP-13363.001.patch, HADOOP-13363.002.patch, 
> HADOOP-13363.003.patch, HADOOP-13363.004.patch, HADOOP-13363.005.patch
>
>
> Standard protobuf 2.5.0 does not work properly on many platforms.  (See, for 
> example, https://gist.github.com/BennettSmith/7111094 ).  In order for us to 
> avoid crazy work arounds in the build environment and the fact that 2.5.0 is 
> starting to slowly disappear as a standard install-able package for even 
> Linux/x86, we need to either upgrade or self bundle or something else.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on issue #1485: HDDS-2157. checkstyle: print filenames relative to project root

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1485: HDDS-2157. checkstyle: print filenames 
relative to project root
URL: https://github.com/apache/hadoop/pull/1485#issuecomment-533681882
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 105 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | 0 | shelldocs | 0 | Shelldocs was not available. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | -1 | test4tests | 0 | The patch doesn't appear to include any new or 
modified tests.  Please justify why no new tests are needed for this patch. 
Also please list what manual steps were performed to verify this patch. |
   ||| _ trunk Compile Tests _ |
   | -1 | mvninstall | 34 | hadoop-ozone in trunk failed. |
   | +1 | mvnsite | 0 | trunk passed |
   | -1 | shadedclient | 35 | branch has errors when building and testing our 
client artifacts. |
   ||| _ Patch Compile Tests _ |
   | -1 | mvninstall | 32 | hadoop-ozone in the patch failed. |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | shellcheck | 0 | There were no new shellcheck issues. |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 28 | patch has errors when building and testing our 
client artifacts. |
   ||| _ Other Tests _ |
   | +1 | unit | 57 | hadoop-hdds in the patch passed. |
   | -1 | unit | 25 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 30 | The patch does not generate ASF License warnings. |
   | | | 659 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=18.09.7 Server=18.09.7 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1485/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1485 |
   | Optional Tests | dupname asflicense mvnsite unit shellcheck shelldocs |
   | uname | Linux feb834285bcb 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 3f223be |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1485/1/artifact/out/branch-mvninstall-hadoop-ozone.txt
 |
   | mvninstall | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1485/1/artifact/out/patch-mvninstall-hadoop-ozone.txt
 |
   | unit | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1485/1/artifact/out/patch-unit-hadoop-ozone.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1485/1/testReport/ |
   | Max. process+thread count | 50 (vs. ulimit of 5500) |
   | modules | C: hadoop-ozone U: hadoop-ozone |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1485/1/console |
   | versions | git=2.7.4 maven=3.3.9 shellcheck=0.4.6 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] steveloughran commented on issue #1332: HADOOP-16445. Allow separate custom signing algorithms for S3 and DDB

2019-09-20 Thread GitBox
steveloughran commented on issue #1332: HADOOP-16445. Allow separate custom 
signing algorithms for S3 and DDB
URL: https://github.com/apache/hadoop/pull/1332#issuecomment-533678354
 
 
   +1


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] sidseth commented on issue #1332: HADOOP-16445. Allow separate custom signing algorithms for S3 and DDB

2019-09-20 Thread GitBox
sidseth commented on issue #1332: HADOOP-16445. Allow separate custom signing 
algorithms for S3 and DDB
URL: https://github.com/apache/hadoop/pull/1332#issuecomment-533677794
 
 
   /retest


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] sidseth commented on issue #1332: HADOOP-16445. Allow separate custom signing algorithms for S3 and DDB

2019-09-20 Thread GitBox
sidseth commented on issue #1332: HADOOP-16445. Allow separate custom signing 
algorithms for S3 and DDB
URL: https://github.com/apache/hadoop/pull/1332#issuecomment-533677639
 
 
   Also, the *MRJob tests pass after applying HADOOP-16591 or setting configs 
in auth-keys.xml


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Comment Edited] (HADOOP-16207) Fix ITestDirectoryCommitMRJob.testMRJob

2019-09-20 Thread Siddharth Seth (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16207?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16934667#comment-16934667
 ] 

Siddharth Seth edited comment on HADOOP-16207 at 9/20/19 7:13 PM:
--

-Attached a simple patch which fixes just the test failures. Doesn't do 
anything with parallelism, changing dir names to be different across tests etc. 
Can submit this in a separate jira, if this one is being used for parallelizing 
the tests.-
Switched to https://issues.apache.org/jira/browse/HADOOP-16591


was (Author: sseth):
Attached a simple patch which fixes just the test failures. Doesn't do anything 
with parallelism, changing dir names to be different across tests etc. Can 
submit this in a separate jira, if this one is being used for parallelizing the 
tests.

> Fix ITestDirectoryCommitMRJob.testMRJob
> ---
>
> Key: HADOOP-16207
> URL: https://issues.apache.org/jira/browse/HADOOP-16207
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3, test
>Affects Versions: 3.3.0
>Reporter: Steve Loughran
>Assignee: Steve Loughran
>Priority: Critical
>
> Reported failure of {{ITestDirectoryCommitMRJob}} in validation runs of 
> HADOOP-16186; assertIsDirectory with s3guard enabled and a parallel test run: 
> Path "is recorded as deleted by S3Guard"
> {code}
> waitForConsistency();
> assertIsDirectory(outputPath) /* here */
> {code}
> The file is there but there's a tombstone. Possibilities
> * some race condition with another test
> * tombstones aren't timing out
> * committers aren't creating that base dir in a way which cleans up S3Guard's 
> tombstones. 
> Remember: we do have to delete that dest dir before the committer runs unless 
> overwrite==true, so at the start of the run there will be a tombstone. It 
> should be overwritten by a success.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16207) Fix ITestDirectoryCommitMRJob.testMRJob

2019-09-20 Thread Siddharth Seth (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16207?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Siddharth Seth updated HADOOP-16207:

Attachment: (was: HADOOP-16207.fixtestsonly.txt)

> Fix ITestDirectoryCommitMRJob.testMRJob
> ---
>
> Key: HADOOP-16207
> URL: https://issues.apache.org/jira/browse/HADOOP-16207
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3, test
>Affects Versions: 3.3.0
>Reporter: Steve Loughran
>Assignee: Steve Loughran
>Priority: Critical
>
> Reported failure of {{ITestDirectoryCommitMRJob}} in validation runs of 
> HADOOP-16186; assertIsDirectory with s3guard enabled and a parallel test run: 
> Path "is recorded as deleted by S3Guard"
> {code}
> waitForConsistency();
> assertIsDirectory(outputPath) /* here */
> {code}
> The file is there but there's a tombstone. Possibilities
> * some race condition with another test
> * tombstones aren't timing out
> * committers aren't creating that base dir in a way which cleans up S3Guard's 
> tombstones. 
> Remember: we do have to delete that dest dir before the committer runs unless 
> overwrite==true, so at the start of the run there will be a tombstone. It 
> should be overwritten by a success.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] sidseth commented on issue #1487: HADOOP-16591 Fix S3A ITest*MRjob failures.

2019-09-20 Thread GitBox
sidseth commented on issue #1487: HADOOP-16591 Fix S3A ITest*MRjob failures.
URL: https://github.com/apache/hadoop/pull/1487#issuecomment-533676732
 
 
   Test command run: mvn -T 1C verify -Dparallel-tests -DtestsThreadCount=12 
-Ds3guard -Dauth -Ddynamo
   Region: us-east-2
   There are the usual flaky test failures. Otherwise, all tests succeed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] sidseth opened a new pull request #1487: HADOOP-16591 Fix S3A ITest*MRjob failures.

2019-09-20 Thread GitBox
sidseth opened a new pull request #1487: HADOOP-16591 Fix S3A ITest*MRjob 
failures.
URL: https://github.com/apache/hadoop/pull/1487
 
 
   ## NOTICE
   
   Please create an issue in ASF JIRA before opening a pull request,
   and you need to set the title of the pull request which starts with
   the corresponding JIRA issue number. (e.g. HADOOP-X. Fix a typo in YYY.)
   For more details, please see 
https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-16591) S3A ITest*MRjob failures

2019-09-20 Thread Siddharth Seth (Jira)
Siddharth Seth created HADOOP-16591:
---

 Summary: S3A ITest*MRjob failures
 Key: HADOOP-16591
 URL: https://issues.apache.org/jira/browse/HADOOP-16591
 Project: Hadoop Common
  Issue Type: Test
  Components: fs/s3
Reporter: Siddharth Seth
Assignee: Siddharth Seth


ITest*MRJob fail with a FileNotFoundException
{code}
[ERROR]   
ITestMagicCommitMRJob>AbstractITCommitMRJob.testMRJob:146->AbstractFSContractTestBase.assertIsDirectory:327
 » FileNotFound
[ERROR]   
ITestDirectoryCommitMRJob>AbstractITCommitMRJob.testMRJob:146->AbstractFSContractTestBase.assertIsDirectory:327
 » FileNotFound
[ERROR]   
ITestPartitionCommitMRJob>AbstractITCommitMRJob.testMRJob:146->AbstractFSContractTestBase.assertIsDirectory:327
 » FileNotFound
[ERROR]   
ITestStagingCommitMRJob>AbstractITCommitMRJob.testMRJob:146->AbstractFSContractTestBase.assertIsDirectory:327
 » FileNotFound
{code}
Details here: 
https://issues.apache.org/jira/browse/HADOOP-16207?focusedCommentId=16933718=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-16933718

Creating a separate jira since HADOOP-16207 already has a patch which is trying 
to parallelize the test runs.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] ashvina commented on a change in pull request #1478: HDFS-14856 Fetch file ACLs while mounting external store

2019-09-20 Thread GitBox
ashvina commented on a change in pull request #1478: HDFS-14856 Fetch file ACLs 
while mounting external store
URL: https://github.com/apache/hadoop/pull/1478#discussion_r326765094
 
 

 ##
 File path: 
hadoop-tools/hadoop-fs2img/src/main/java/org/apache/hadoop/hdfs/server/namenode/TreePath.java
 ##
 @@ -159,13 +180,17 @@ INode toFile(UGIResolver ugi, BlockResolver blk,
 
   INode toDirectory(UGIResolver ugi) {
 final FileStatus s = getFileStatus();
-ugi.addUser(s.getOwner());
-ugi.addGroup(s.getGroup());
+final AclStatus aclStatus = getAclStatus();
+long permissions = ugi.getPermissionsProto(s, aclStatus);
 INodeDirectory.Builder b = INodeDirectory.newBuilder()
 .setModificationTime(s.getModificationTime())
 .setNsQuota(DEFAULT_NAMESPACE_QUOTA)
 .setDsQuota(DEFAULT_STORAGE_SPACE_QUOTA)
-.setPermission(ugi.resolve(s));
+.setPermission(permissions);
+if (aclStatus != null) {
+  throw new UnsupportedOperationException(
 
 Review comment:
   Yes. I am proposing to fix the scanner in the PR, which is independent of 
the `ImageWriter`. The consumer of the scanner can choose to ignore the ACL.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hanishakoneru opened a new pull request #1486: HDDS-2158. Fixing Json Injection Issue in JsonUtils.

2019-09-20 Thread GitBox
hanishakoneru opened a new pull request #1486: HDDS-2158. Fixing Json Injection 
Issue in JsonUtils.
URL: https://github.com/apache/hadoop/pull/1486
 
 
   JsonUtils#toJsonStringWithDefaultPrettyPrinter() does not validate the JSON 
string before serializing it, which could result in JSON injection.
   This method is mostly used along with JsonUtils#toJsonString(), first 
converting the object to a String and then back to an object. This can be 
avoided by writing the object directly through the pretty printer.
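   A rough sketch of that change, assuming a Jackson ObjectMapper underneath 
(the class name below is invented; this is not the actual patch):
   
```java
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.ObjectWriter;

public final class JsonUtilsSketch {
  private static final ObjectMapper MAPPER = new ObjectMapper();
  private static final ObjectWriter PRETTY_WRITER =
      MAPPER.writerWithDefaultPrettyPrinter();

  private JsonUtilsSketch() {
  }

  // Serialize the object directly with the pretty printer instead of first
  // producing a JSON string and re-parsing it.
  public static String toJsonStringWithDefaultPrettyPrinter(Object obj)
      throws JsonProcessingException {
    return PRETTY_WRITER.writeValueAsString(obj);
  }
}
```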


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on issue #1332: HADOOP-16445. Allow separate custom signing algorithms for S3 and DDB

2019-09-20 Thread GitBox
hadoop-yetus commented on issue #1332: HADOOP-16445. Allow separate custom 
signing algorithms for S3 and DDB
URL: https://github.com/apache/hadoop/pull/1332#issuecomment-533673862
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 39 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 1 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 4 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | +1 | mvninstall | 1041 | trunk passed |
   | +1 | compile | 36 | trunk passed |
   | +1 | checkstyle | 29 | trunk passed |
   | +1 | mvnsite | 40 | trunk passed |
   | -1 | shadedclient | 110 | branch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 31 | trunk passed |
   | 0 | spotbugs | 60 | Used deprecated FindBugs config; considering switching 
to SpotBugs. |
   | +1 | findbugs | 57 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | +1 | mvninstall | 34 | the patch passed |
   | +1 | compile | 28 | the patch passed |
   | +1 | javac | 28 | the patch passed |
   | +1 | checkstyle | 21 | hadoop-tools/hadoop-aws: The patch generated 0 new 
+ 42 unchanged - 1 fixed = 42 total (was 43) |
   | +1 | mvnsite | 32 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | -1 | shadedclient | 29 | patch has errors when building and testing our 
client artifacts. |
   | +1 | javadoc | 24 | the patch passed |
   | +1 | findbugs | 59 | the patch passed |
   ||| _ Other Tests _ |
   | +1 | unit | 82 | hadoop-aws in the patch passed. |
   | +1 | asflicense | 26 | The patch does not generate ASF License warnings. |
   | | | 1815 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1332/11/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1332 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux e64d223753e4 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / b3173e1 |
   | Default Java | 1.8.0_222 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1332/11/testReport/ |
   | Max. process+thread count | 342 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1332/11/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   





[GitHub] [hadoop] shanyu commented on a change in pull request #1480: HDFS-14857 FS operations fail in HA mode: DataNode fails to connect to NameNode

2019-09-20 Thread GitBox
shanyu commented on a change in pull request #1480: HDFS-14857 FS operations 
fail in HA mode: DataNode fails to connect to NameNode
URL: https://github.com/apache/hadoop/pull/1480#discussion_r326763525
 
 

 ##
 File path: 
hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/server/namenode/ha/AbstractNNFailoverProxyProvider.java
 ##
 @@ -200,6 +201,24 @@ public HAServiceState getCachedState() {
 return proxies;
   }
 
+  /**
+   * Resets the NameNode proxy addresses in case they're stale
+   */
+  protected void resetProxyAddresses(List<NNProxyInfo<T>> proxies) {
 
 Review comment:
   Instead of resetting all proxies, we can just reset the one that 
currentProxyIndex points to after incrementing it, right?
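
   A rough, self-contained sketch of that alternative; the `proxies`, 
`createProxy`, and `currentProxyIndex` members below are hypothetical 
stand-ins for the real provider internals, not the PR's code:

```java
import java.util.List;
import java.util.function.IntFunction;

// Illustrative only: rebuild just the entry the incremented index points at,
// instead of re-resolving every proxy on failover.
final class ProxyList<T> {
  private final List<T> proxies;            // hypothetical proxy entries
  private final IntFunction<T> createProxy; // hypothetical factory that re-resolves one address
  private int currentProxyIndex;

  ProxyList(List<T> proxies, IntFunction<T> createProxy) {
    this.proxies = proxies;
    this.createProxy = createProxy;
  }

  synchronized T performFailover() {
    currentProxyIndex = (currentProxyIndex + 1) % proxies.size();
    // Recreate only the entry we are about to use.
    proxies.set(currentProxyIndex, createProxy.apply(currentProxyIndex));
    return proxies.get(currentProxyIndex);
  }
}
```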





[GitHub] [hadoop] jeffsaremi commented on a change in pull request #1480: HDFS-14857 FS operations fail in HA mode: DataNode fails to connect to NameNode

2019-09-20 Thread GitBox
jeffsaremi commented on a change in pull request #1480: HDFS-14857 FS 
operations fail in HA mode: DataNode fails to connect to NameNode
URL: https://github.com/apache/hadoop/pull/1480#discussion_r326763136
 
 

 ##
 File path: 
hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/server/namenode/ha/ConfiguredFailoverProxyProvider.java
 ##
 @@ -62,6 +64,11 @@ public ConfiguredFailoverProxyProvider(Configuration conf, 
URI uri,
 
   @Override
   public  void performFailover(T currentProxy) {
+if(conf.getBoolean(RESET_PROXY_ON_FAILOVER, false)) {
 
 Review comment:
   I'm trying to add one as soon as I can figure out how to mock a DNS so that 
NetUtils.createSocketAddr() can return what I want it to.
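
   One way to sidestep the DNS-mocking problem in a unit test (a sketch of an 
approach, not what this PR does) is to put address resolution behind a small 
injectable interface so the test can hand back whatever `InetSocketAddress` it 
likes; the `AddrResolver`/`FakeResolver` names below are hypothetical:

```java
import java.net.InetSocketAddress;

// Sketch: production code would delegate to NetUtils.createSocketAddr(),
// while a test supplies a fake resolver that returns a chosen address.
interface AddrResolver {
  InetSocketAddress resolve(String target);
}

final class FakeResolver implements AddrResolver {
  private final InetSocketAddress answer;

  FakeResolver(String host, int port) {
    // Unresolved on purpose: no real DNS lookup happens in the test.
    this.answer = InetSocketAddress.createUnresolved(host, port);
  }

  @Override
  public InetSocketAddress resolve(String target) {
    return answer;
  }
}
```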





[GitHub] [hadoop] ashvina commented on a change in pull request #1478: HDFS-14856 Fetch file ACLs while mounting external store

2019-09-20 Thread GitBox
ashvina commented on a change in pull request #1478: HDFS-14856 Fetch file ACLs 
while mounting external store
URL: https://github.com/apache/hadoop/pull/1478#discussion_r326762306
 
 

 ##
 File path: 
hadoop-tools/hadoop-fs2img/src/main/java/org/apache/hadoop/hdfs/server/namenode/FSTreeWalk.java
 ##
 @@ -71,14 +95,24 @@ private FSTreeIterator() {
 }
 
 FSTreeIterator(TreePath p) {
+  AclStatus acls = null;
+  Path remotePath = p.getFileStatus().getPath();
+  try {
+acls = getAclStatus(fs, remotePath);
+  } catch (IOException e) {
+LOG.warn(
+"Got exception when trying to get remote acls for path {} : {}",
+remotePath, e.getMessage());
+  }
   getPendingQueue().addFirst(
-  new TreePath(p.getFileStatus(), p.getParentId(), this, fs));
+  new TreePath(p.getFileStatus(), p.getParentId(), this, fs, acls));
 
 Review comment:
   The ACL can be `null`.
   The `IOException` comes from the `FileSystem` API: `AclStatus 
getAclStatus(Path path) throws IOException`.
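
   For reference, the defensive pattern in the diff can be written as a small 
self-contained helper; `FileSystem.getAclStatus(Path)` is the real API, while 
the `AclFetcher`/`getAclsOrNull` wrapper names are just for illustration:

```java
import java.io.IOException;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.AclStatus;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

final class AclFetcher {
  private static final Logger LOG = LoggerFactory.getLogger(AclFetcher.class);

  private AclFetcher() { }

  // Fetch the remote ACLs, but treat a failure (or a store that does not
  // support ACLs) as "no ACLs" rather than aborting the tree walk.
  static AclStatus getAclsOrNull(FileSystem fs, Path path) {
    try {
      return fs.getAclStatus(path);
    } catch (IOException | UnsupportedOperationException e) {
      LOG.warn("Could not fetch ACLs for {}: {}", path, e.getMessage());
      return null;
    }
  }
}
```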





[GitHub] [hadoop] steveloughran commented on issue #1442: HADOOP-16570. S3A committers leak threads on job/task commit.

2019-09-20 Thread GitBox
steveloughran commented on issue #1442: HADOOP-16570. S3A committers leak 
threads on job/task commit.
URL: https://github.com/apache/hadoop/pull/1442#issuecomment-533671857
 
 
   This patch is about to get bigger and more complicated: when the scale 
tests get really big, you discover that trying to keep a list of all pending 
commits in the job committer is just a way to see OOM exception traces. I'm 
going to have to be more incremental about loading and committing files, as 
sketched below.
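
   A minimal sketch of that incremental idea, using hypothetical 
`loadPendingset`/`CommitSink` helpers rather than the real committer API: load 
one pendingset at a time, commit its uploads, and let it go out of scope 
before loading the next, so the full set of pending commits is never held in 
memory at once.

```java
import java.io.IOException;
import java.util.List;
import java.util.function.Function;

// Illustrative only: process pendingset files one at a time instead of
// materializing every pending commit before committing any of them.
final class IncrementalCommitter<P, C> {

  interface CommitSink<C> {
    void commit(C singleCommit) throws IOException;
  }

  void commitJob(List<P> pendingsetFiles,
                 Function<P, List<C>> loadPendingset,
                 CommitSink<C> sink) throws IOException {
    for (P file : pendingsetFiles) {
      // Only this file's commits are in memory at any point in time.
      List<C> commits = loadPendingset.apply(file);
      for (C commit : commits) {
        sink.commit(commit);
      }
      // 'commits' goes out of scope here and can be GC'd before the next load.
    }
  }
}
```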
   
   





[GitHub] [hadoop] goiri commented on a change in pull request #1480: HDFS-14857 FS operations fail in HA mode: DataNode fails to connect to NameNode

2019-09-20 Thread GitBox
goiri commented on a change in pull request #1480: HDFS-14857 FS operations 
fail in HA mode: DataNode fails to connect to NameNode
URL: https://github.com/apache/hadoop/pull/1480#discussion_r326462143
 
 

 ##
 File path: 
hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/server/namenode/ha/ConfiguredFailoverProxyProvider.java
 ##
 @@ -62,6 +64,11 @@ public ConfiguredFailoverProxyProvider(Configuration conf, 
URI uri,
 
   @Override
   public  void performFailover(T currentProxy) {
+if(conf.getBoolean(RESET_PROXY_ON_FAILOVER, false)) {
 
 Review comment:
   Can we have a unit test?





[GitHub] [hadoop] goiri commented on a change in pull request #1480: HDFS-14857 FS operations fail in HA mode: DataNode fails to connect to NameNode

2019-09-20 Thread GitBox
goiri commented on a change in pull request #1480: HDFS-14857 FS operations 
fail in HA mode: DataNode fails to connect to NameNode
URL: https://github.com/apache/hadoop/pull/1480#discussion_r326462034
 
 

 ##
 File path: 
hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/server/namenode/ha/ConfiguredFailoverProxyProvider.java
 ##
 @@ -36,6 +36,8 @@
public class ConfiguredFailoverProxyProvider<T> extends
AbstractNNFailoverProxyProvider<T> {
 
+  public static final String RESET_PROXY_ON_FAILOVER = 
"dfs.client.failover.proxy.provider.reset-proxy-on-failure";
 
 Review comment:
   This should go to the config file.





[GitHub] [hadoop] goiri commented on a change in pull request #1478: HDFS-14856 Fetch file ACLs while mounting external store

2019-09-20 Thread GitBox
goiri commented on a change in pull request #1478: HDFS-14856 Fetch file ACLs 
while mounting external store
URL: https://github.com/apache/hadoop/pull/1478#discussion_r326761613
 
 

 ##
 File path: 
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/DFSConfigKeys.java
 ##
 @@ -379,6 +379,8 @@
   public static final String DFS_PROVIDED_ALIASMAP_TEXT_WRITE_DIR_DEFAULT = 
"file:///tmp/";
 
   public static final String DFS_PROVIDED_ALIASMAP_LEVELDB_PATH = 
"dfs.provided.aliasmap.leveldb.path";
+  public static final String DFS_NAMENODE_MOUNT_ACLS_ENABLED = 
"dfs.namenode.mount.acls.enabled";
 
 Review comment:
   Better than before for sure.
   Not sure about the final policy for distinguishing NameNode keys from 
non-NameNode ones.
   It should be consistent.





[GitHub] [hadoop] hadoop-yetus removed a comment on issue #1442: HADOOP-16570. S3A committers leak threads on job/task commit.

2019-09-20 Thread GitBox
hadoop-yetus removed a comment on issue #1442: HADOOP-16570. S3A committers 
leak threads on job/task commit.
URL: https://github.com/apache/hadoop/pull/1442#issuecomment-532857835
 
 
   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 71 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 3 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | +1 | mvninstall | 1068 | trunk passed |
   | +1 | compile | 34 | trunk passed |
   | +1 | checkstyle | 26 | trunk passed |
   | +1 | mvnsite | 39 | trunk passed |
   | +1 | shadedclient | 732 | branch has no errors when building and testing 
our client artifacts. |
   | +1 | javadoc | 28 | trunk passed |
   | 0 | spotbugs | 60 | Used deprecated FindBugs config; considering switching 
to SpotBugs. |
   | +1 | findbugs | 58 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | +1 | mvninstall | 35 | the patch passed |
   | +1 | compile | 27 | the patch passed |
   | +1 | javac | 27 | the patch passed |
   | -0 | checkstyle | 20 | hadoop-tools/hadoop-aws: The patch generated 2 new 
+ 39 unchanged - 0 fixed = 41 total (was 39) |
   | +1 | mvnsite | 37 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 745 | patch has no errors when building and testing 
our client artifacts. |
   | +1 | javadoc | 27 | the patch passed |
   | +1 | findbugs | 64 | the patch passed |
   ||| _ Other Tests _ |
   | +1 | unit | 83 | hadoop-aws in the patch passed. |
   | +1 | asflicense | 35 | The patch does not generate ASF License warnings. |
   | | | 3228 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1442 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 310d4f645932 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 5db32b8 |
   | Default Java | 1.8.0_222 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/2/artifact/out/diff-checkstyle-hadoop-tools_hadoop-aws.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/2/testReport/ |
   | Max. process+thread count | 412 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/2/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   





[GitHub] [hadoop] hadoop-yetus removed a comment on issue #1442: HADOOP-16570. S3A committers leak threads on job/task commit.

2019-09-20 Thread GitBox
hadoop-yetus removed a comment on issue #1442: HADOOP-16570. S3A committers 
leak threads on job/task commit.
URL: https://github.com/apache/hadoop/pull/1442#issuecomment-533064098
 
 
   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | 0 | reexec | 33 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 3 new or modified test 
files. |
   ||| _ trunk Compile Tests _ |
   | +1 | mvninstall | 1284 | trunk passed |
   | +1 | compile | 34 | trunk passed |
   | +1 | checkstyle | 24 | trunk passed |
   | +1 | mvnsite | 41 | trunk passed |
   | +1 | shadedclient | 854 | branch has no errors when building and testing 
our client artifacts. |
   | +1 | javadoc | 27 | trunk passed |
   | 0 | spotbugs | 66 | Used deprecated FindBugs config; considering switching 
to SpotBugs. |
   | +1 | findbugs | 64 | trunk passed |
   ||| _ Patch Compile Tests _ |
   | +1 | mvninstall | 37 | the patch passed |
   | +1 | compile | 31 | the patch passed |
   | +1 | javac | 31 | the patch passed |
   | -0 | checkstyle | 20 | hadoop-tools/hadoop-aws: The patch generated 2 new 
+ 39 unchanged - 0 fixed = 41 total (was 39) |
   | +1 | mvnsite | 36 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 844 | patch has no errors when building and testing 
our client artifacts. |
   | +1 | javadoc | 23 | the patch passed |
   | +1 | findbugs | 66 | the patch passed |
   ||| _ Other Tests _ |
   | +1 | unit | 82 | hadoop-aws in the patch passed. |
   | +1 | asflicense | 27 | The patch does not generate ASF License warnings. |
   | | | 3618 | |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | Client=19.03.1 Server=19.03.1 base: 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/3/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1442 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 71fbb1294e97 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 
16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 1029060 |
   | Default Java | 1.8.0_222 |
   | checkstyle | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/3/artifact/out/diff-checkstyle-hadoop-tools_hadoop-aws.txt
 |
   |  Test Results | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/3/testReport/ |
   | Max. process+thread count | 307 (vs. ulimit of 5500) |
   | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
   | Console output | 
https://builds.apache.org/job/hadoop-multibranch/job/PR-1442/3/console |
   | versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   





[GitHub] [hadoop] steveloughran commented on a change in pull request #1442: HADOOP-16570. S3A committers leak threads on job/task commit.

2019-09-20 Thread GitBox
steveloughran commented on a change in pull request #1442: HADOOP-16570. S3A 
committers leak threads on job/task commit.
URL: https://github.com/apache/hadoop/pull/1442#discussion_r326761254
 
 

 ##
 File path: 
hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/commit/AbstractS3ACommitter.java
 ##
 @@ -742,6 +754,27 @@ protected final synchronized ExecutorService 
buildThreadPool(
 return threadPool;
   }
 
+  /**
+   * Destroy any thread pool; wait for it to finish,
+   * but don't overreact if it doesn't finish in time.
+   */
+  protected final synchronized void destroyThreadPool() {
 
 Review comment:
   No real reason




