[jira] [Work logged] (HADOOP-18302) Remove WhiteBox in hadoop-common module.

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18302?focusedWorklogId=794046&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-794046
 ]

ASF GitHub Bot logged work on HADOOP-18302:
---

Author: ASF GitHub Bot
Created on: 22/Jul/22 04:13
Start Date: 22/Jul/22 04:13
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on PR #4457:
URL: https://github.com/apache/hadoop/pull/4457#issuecomment-1192161273

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|:--------|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 48s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 7 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 20s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  27m 24s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  26m  5s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |  22m 45s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   1m 37s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   2m 53s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 24s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m  1s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   4m 10s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  22m 25s |  |  branch has no errors 
when building and testing our client artifacts.  |
   | -0 :warning: |  patch  |  22m 57s |  |  Used diff version of patch file. 
Binary files and potentially other changes not applied. Please rebase and 
squash commits if necessary.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 33s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 32s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  23m 29s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |  23m 29s |  |  
root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 with JDK Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 generated 0 new + 2863 unchanged - 17 
fixed = 2863 total (was 2880)  |
   | +1 :green_heart: |  compile  |  20m 53s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |  20m 53s |  |  
root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 generated 0 new + 2660 unchanged - 
17 fixed = 2660 total (was 2677)  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   1m 50s |  |  hadoop-common-project: 
The patch generated 0 new + 380 unchanged - 233 fixed = 380 total (was 613)  |
   | +1 :green_heart: |  mvnsite  |   3m 13s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m 55s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m 38s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   4m 47s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  21m 58s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  18m 38s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   1m 35s |  |  hadoop-nfs in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   1m 35s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 239m  7s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4457/15/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4457 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 3e1eef207db6 4.15.0-58-generic 

[GitHub] [hadoop] hadoop-yetus commented on pull request #4457: HADOOP-18302. Remove WhiteBox in hadoop-common module.

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4457:
URL: https://github.com/apache/hadoop/pull/4457#issuecomment-1192161273

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|:--------|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 48s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 7 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 20s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  27m 24s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  26m  5s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |  22m 45s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   1m 37s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   2m 53s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 24s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m  1s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   4m 10s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  22m 25s |  |  branch has no errors 
when building and testing our client artifacts.  |
   | -0 :warning: |  patch  |  22m 57s |  |  Used diff version of patch file. 
Binary files and potentially other changes not applied. Please rebase and 
squash commits if necessary.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 33s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 32s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  23m 29s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |  23m 29s |  |  
root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 with JDK Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 generated 0 new + 2863 unchanged - 17 
fixed = 2863 total (was 2880)  |
   | +1 :green_heart: |  compile  |  20m 53s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |  20m 53s |  |  
root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 generated 0 new + 2660 unchanged - 
17 fixed = 2660 total (was 2677)  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   1m 50s |  |  hadoop-common-project: 
The patch generated 0 new + 380 unchanged - 233 fixed = 380 total (was 613)  |
   | +1 :green_heart: |  mvnsite  |   3m 13s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m 55s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m 38s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   4m 47s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  21m 58s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  18m 38s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   1m 35s |  |  hadoop-nfs in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   1m 35s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 239m  7s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4457/15/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4457 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 3e1eef207db6 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 3ad32d1bf1dc282877fccf02f91ecf1dd7a62909 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 

[jira] [Work logged] (HADOOP-18340) deleteOnExit does not work with S3AFileSystem

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18340?focusedWorklogId=794044&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-794044
 ]

ASF GitHub Bot logged work on HADOOP-18340:
---

Author: ASF GitHub Bot
Created on: 22/Jul/22 03:41
Start Date: 22/Jul/22 03:41
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on PR #4608:
URL: https://github.com/apache/hadoop/pull/4608#issuecomment-1192147355

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|:--------|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 54s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 19s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  28m 50s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  25m 25s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |  21m 54s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   5m 21s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   3m 29s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 24s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m  6s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   4m 37s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  24m 56s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 34s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 42s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  24m 29s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |  24m 29s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  21m 57s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |  21m 57s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   4m 27s | 
[/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4608/1/artifact/out/results-checkstyle-root.txt)
 |  root: The patch generated 3 new + 70 unchanged - 0 fixed = 73 total (was 
70)  |
   | +1 :green_heart: |  mvnsite  |   3m 11s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m 17s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m  7s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   4m 51s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  24m 38s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  18m 37s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   3m  8s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   1m 20s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 252m 55s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4608/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4608 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 422779c483bf 4.15.0-166-generic #174-Ubuntu SMP Wed Dec 8 
19:07:44 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 6e4a99265d8935a2c67e47b5f5e3a364d87c4f77 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 

[GitHub] [hadoop] hadoop-yetus commented on pull request #4608: HADOOP-18340 deleteOnExit does not work with S3AFileSystem

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4608:
URL: https://github.com/apache/hadoop/pull/4608#issuecomment-1192147355

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|:--------|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 54s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 19s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  28m 50s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  25m 25s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |  21m 54s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   5m 21s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   3m 29s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 24s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m  6s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   4m 37s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  24m 56s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 34s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 42s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  24m 29s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |  24m 29s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  21m 57s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |  21m 57s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   4m 27s | 
[/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4608/1/artifact/out/results-checkstyle-root.txt)
 |  root: The patch generated 3 new + 70 unchanged - 0 fixed = 73 total (was 
70)  |
   | +1 :green_heart: |  mvnsite  |   3m 11s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m 17s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m  7s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   4m 51s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  24m 38s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  18m 37s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   3m  8s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   1m 20s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 252m 55s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4608/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4608 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 422779c483bf 4.15.0-166-generic #174-Ubuntu SMP Wed Dec 8 
19:07:44 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 6e4a99265d8935a2c67e47b5f5e3a364d87c4f77 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 
/usr/lib/jvm/java-8-openjdk-amd64:Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4608/1/testReport/ |
   | Max. process+thread count | 3137 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws 
U: . |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4608/1/console 

[jira] [Updated] (HADOOP-18348) Echo java process's parent pid to the pid file intermediate state

2022-07-21 Thread jiangrui (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18348?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

jiangrui updated HADOOP-18348:
--
Fix Version/s: 3.3.3
   3.3.2
   3.3.1
Affects Version/s: 3.3.3
   3.3.2
   3.3.1
   Status: Patch Available  (was: Open)

> Echo java process's parent pid to the pid file intermediate state
> -
>
> Key: HADOOP-18348
> URL: https://issues.apache.org/jira/browse/HADOOP-18348
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: common
>Affects Versions: 3.3.3, 3.3.2, 3.3.1
>Reporter: jiangrui
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.3.3, 3.3.2, 3.3.1
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> In the hadoop-functions.sh file there are hadoop_start_daemon and 
> hadoop_start_daemon_wrapper functions.
> hadoop_start_daemon_wrapper invokes hadoop_start_daemon and puts it in the 
> background.
>  
> In the hadoop_start_daemon function, echo $$ > pidfile causes this scenario:
> because hadoop_start_daemon runs in a subshell (it is put in the background 
> with an ampersand), $$ expands to the process ID of the original shell, not 
> the subshell.
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on pull request #4601: HDFS-16467. Ensure Protobuf generated headers are included first

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4601:
URL: https://github.com/apache/hadoop/pull/4601#issuecomment-1192122166

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|:--------|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 43s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 7 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  23m 19s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   4m 21s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |   4m 19s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  mvnsite  |   0m 39s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  53m 58s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 22s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   3m 53s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  cc  |   3m 53s |  |  the patch passed  |
   | +1 :green_heart: |  golang  |   3m 54s |  |  the patch passed  |
   | +1 :green_heart: |  javac  |   3m 53s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   4m  0s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  cc  |   4m  0s |  |  the patch passed  |
   | +1 :green_heart: |  golang  |   4m  0s |  |  the patch passed  |
   | +1 :green_heart: |  javac  |   4m  0s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  mvnsite  |   0m 24s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  21m  8s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  34m 31s |  |  hadoop-hdfs-native-client in 
the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 39s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 122m  1s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4601/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4601 |
   | Optional Tests | dupname asflicense compile cc mvnsite javac unit 
codespell detsecrets golang |
   | uname | Linux 4017d1e252ac 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 6845f0c7d9631c56b92ded7adc539c0de23a1045 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 
/usr/lib/jvm/java-8-openjdk-amd64:Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4601/2/testReport/ |
   | Max. process+thread count | 726 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: 
hadoop-hdfs-project/hadoop-hdfs-native-client |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4601/2/console |
   | versions | git=2.25.1 maven=3.6.3 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on pull request #4426: YARN-10883. [Router] Router Audit Log Add Client IP Address.

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4426:
URL: https://github.com/apache/hadoop/pull/4426#issuecomment-1192098960

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|:--------|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 48s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  40m 44s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 41s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |   0m 37s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   0m 38s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 42s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 47s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 34s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m  9s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  23m 25s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 26s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 28s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |   0m 28s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 25s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 25s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   0m 18s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   0m 27s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 25s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 22s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   0m 55s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  22m 52s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   3m 11s |  |  hadoop-yarn-server-router in 
the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 41s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 102m 18s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4426/4/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4426 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 5c32f9110705 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 
17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / a4960439068617952dcb0074704cdfd24fec67e3 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 
/usr/lib/jvm/java-8-openjdk-amd64:Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4426/4/testReport/ |
   | Max. process+thread count | 1453 (vs. ulimit of 5500) |
   | modules | C: 
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router U: 
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4426/4/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   



[GitHub] [hadoop] hadoop-yetus commented on pull request #4510: YARN-11203. Fix typo in hadoop-yarn-server-router module.

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4510:
URL: https://github.com/apache/hadoop/pull/4510#issuecomment-1192097577

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|:--------|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 56s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 13 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  41m 42s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 42s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |   0m 39s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   0m 39s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 44s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 47s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 35s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 11s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  23m 40s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 27s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 29s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |   0m 29s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 25s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 25s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   0m 19s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   0m 27s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 24s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 22s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   0m 55s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  22m 53s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   3m 13s |  |  hadoop-yarn-server-router in 
the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 43s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 104m  6s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4510/4/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4510 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux e31075ba4cc8 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 
17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 23ffbd867a3c8c1a0c2d4643b5f3c3675ce93fd8 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 
/usr/lib/jvm/java-8-openjdk-amd64:Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4510/4/testReport/ |
   | Max. process+thread count | 1542 (vs. ulimit of 5500) |
   | modules | C: 
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router U: 
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4510/4/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   



[GitHub] [hadoop] goiri merged pull request #4375: HDFS-16605. Improve Code With Lambda in hadoop-hdfs-rbf moudle.

2022-07-21 Thread GitBox


goiri merged PR #4375:
URL: https://github.com/apache/hadoop/pull/4375





[GitHub] [hadoop] hadoop-yetus commented on pull request #4601: HDFS-16467. Ensure Protobuf generated headers are included first

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4601:
URL: https://github.com/apache/hadoop/pull/4601#issuecomment-1192063612

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|:--------|:--------:|:-------:|
   | +0 :ok: |  reexec  |  20m 54s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 7 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  25m 19s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   3m 33s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 45s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  55m 44s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 24s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   3m 16s |  |  the patch passed  |
   | +1 :green_heart: |  cc  |   3m 16s |  |  the patch passed  |
   | +1 :green_heart: |  golang  |   3m 16s |  |  the patch passed  |
   | +1 :green_heart: |  javac  |   3m 16s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  mvnsite  |   0m 28s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  26m 34s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  33m 51s |  |  hadoop-hdfs-native-client in 
the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 44s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 144m 29s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4601/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4601 |
   | Optional Tests | dupname asflicense compile cc mvnsite javac unit 
codespell detsecrets golang |
   | uname | Linux 2e12dc8bfcea 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 6845f0c7d9631c56b92ded7adc539c0de23a1045 |
   | Default Java | Debian-11.0.15+10-post-Debian-1deb10u1 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4601/2/testReport/ |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: 
hadoop-hdfs-project/hadoop-hdfs-native-client |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4601/2/console |
   | versions | git=2.20.1 maven=3.6.0 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   





[GitHub] [hadoop] slfan1989 commented on pull request #4403: MAPREDUCE-7385. improve JobEndNotifier#httpNotification With recommended methods

2022-07-21 Thread GitBox


slfan1989 commented on PR #4403:
URL: https://github.com/apache/hadoop/pull/4403#issuecomment-1192050710

   @ferhui @Hexiaoqiao Can you help review this PR? Thank you very much!
   
   





[GitHub] [hadoop] slfan1989 commented on pull request #4406: HDFS-16619. Fix HttpHeaders.Values And HttpHeaders.Names Deprecated Import

2022-07-21 Thread GitBox


slfan1989 commented on PR #4406:
URL: https://github.com/apache/hadoop/pull/4406#issuecomment-1192045850

   @ferhui Can you help review this PR? It is a relatively simple change; I 
hope it can replace the deprecated imports. Thank you very much!
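
   For context only (this is not taken from the PR): in Netty 4.1 the 
HttpHeaders.Names and HttpHeaders.Values constants named in the title are 
deprecated in favour of HttpHeaderNames and HttpHeaderValues. A minimal, 
hypothetical sketch of that style of replacement (the class shown here is 
illustrative, not a Hadoop class):

   ```java
   // Illustrative only; the exact imports touched by PR #4406 may differ.
   import io.netty.handler.codec.http.DefaultHttpHeaders;
   import io.netty.handler.codec.http.HttpHeaderNames;
   import io.netty.handler.codec.http.HttpHeaderValues;
   import io.netty.handler.codec.http.HttpHeaders;

   public class HeaderSketch {
     static HttpHeaders jsonHeaders() {
       // Previously these constants would have come from the deprecated
       // HttpHeaders.Names / HttpHeaders.Values nested classes.
       return new DefaultHttpHeaders()
           .set(HttpHeaderNames.CONTENT_TYPE, HttpHeaderValues.APPLICATION_JSON);
     }
   }
   ```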





[GitHub] [hadoop] slfan1989 commented on pull request #4529: HDFS-16648. Add isDebugEnabled check for debug logs in some classes

2022-07-21 Thread GitBox


slfan1989 commented on PR #4529:
URL: https://github.com/apache/hadoop/pull/4529#issuecomment-1192041169

   @ZanderXu After completing this PR, could you help summarize the precautions 
for using the logger, so that other contributors can refer to them later?
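
   Not part of the PR itself, but as one reference point for such a summary, 
the usual SLF4J guidance behind the PR title can be sketched as follows (the 
class and method names here are illustrative only):

   ```java
   import org.slf4j.Logger;
   import org.slf4j.LoggerFactory;

   public class DebugLogSketch {
     private static final Logger LOG = LoggerFactory.getLogger(DebugLogSketch.class);

     void report(Object block) {
       // Parameterized logging defers message construction until DEBUG is
       // enabled, so cheap arguments need no guard.
       LOG.debug("Processing block {}", block);

       // Guard only when building the argument itself is expensive.
       if (LOG.isDebugEnabled()) {
         LOG.debug("Block detail: {}", expensiveSummary(block));
       }
     }

     private String expensiveSummary(Object block) {
       return String.valueOf(block); // stands in for a costly computation
     }
   }
   ```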





[GitHub] [hadoop] slfan1989 commented on pull request #4375: HDFS-16605. Improve Code With Lambda in hadoop-hdfs-rbf moudle.

2022-07-21 Thread GitBox


slfan1989 commented on PR #4375:
URL: https://github.com/apache/hadoop/pull/4375#issuecomment-1192038685

   @goiri Can you help merge this PR into the trunk branch? Other contributors 
will continue this optimization in the hadoop-hdfs module later. Thank you 
very much!





[GitHub] [hadoop] slfan1989 commented on pull request #4375: HDFS-16605. Improve Code With Lambda in hadoop-hdfs-rbf moudle.

2022-07-21 Thread GitBox


slfan1989 commented on PR #4375:
URL: https://github.com/apache/hadoop/pull/4375#issuecomment-1192038471

   @goiri Can you help merge this PR into the trunk branch? Other contributors 
will continue this optimization in the hadoop-hdfs module later. Thank you 
very much!





[GitHub] [hadoop] slfan1989 commented on a diff in pull request #4594: YARN-6572. Refactoring Router services to use common util classes for pipeline creations.

2022-07-21 Thread GitBox


slfan1989 commented on code in PR #4594:
URL: https://github.com/apache/hadoop/pull/4594#discussion_r927180189


##
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router/src/main/java/org/apache/hadoop/yarn/server/router/clientrm/RouterClientRMService.java:
##
@@ -19,11 +19,9 @@
 package org.apache.hadoop.yarn.server.router.clientrm;
 
 import java.io.IOException;
+import java.lang.reflect.InvocationTargetException;

Review Comment:
   I will fix it.






[GitHub] [hadoop] slfan1989 commented on pull request #4540: YARN-11160. Support getResourceProfiles, getResourceProfile API's for Federation

2022-07-21 Thread GitBox


slfan1989 commented on PR #4540:
URL: https://github.com/apache/hadoop/pull/4540#issuecomment-1192031586

   @goiri Thank you very much!





[jira] [Commented] (HADOOP-18340) deleteOnExit does not work with S3AFileSystem

2022-07-21 Thread Huaxiang Sun (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18340?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17569719#comment-17569719
 ] 

Huaxiang Sun commented on HADOOP-18340:
---

I posted a patch. The _parallelise the delete_ part is not implemented in this 
patch, as I think it needs more thought. [~ste...@apache.org], when you get a 
chance, could you take a look at the patch? Thanks.

> deleteOnExit does not work with S3AFileSystem
> -
>
> Key: HADOOP-18340
> URL: https://issues.apache.org/jira/browse/HADOOP-18340
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs/s3
>Affects Versions: 3.3.3
>Reporter: Huaxiang Sun
>Priority: Minor
>  Labels: pull-request-available
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> When deleteOnExit is set on some paths, they are not removed when the file 
> system object is closed. The following exception is logged at INFO level when 
> the failure is reported.
> {code:java}
> 2022-07-15 19:29:12,552 [main] INFO  fs.FileSystem 
> (FileSystem.java:processDeleteOnExit(1810)) - Ignoring failure to 
> deleteOnExit for path /file, exception {}
> java.io.IOException: s3a://mock-bucket: FileSystem is closed!
>         at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.checkNotClosed(S3AFileSystem.java:3887)
>         at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2333)
>         at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2355)
>         at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.exists(S3AFileSystem.java:4402)
>         at 
> org.apache.hadoop.fs.FileSystem.processDeleteOnExit(FileSystem.java:1805)
>         at org.apache.hadoop.fs.FileSystem.close(FileSystem.java:2669)
>         at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.close(S3AFileSystem.java:3830)
>         at 
> org.apache.hadoop.fs.s3a.TestS3AGetFileStatus.testFile(TestS3AGetFileStatus.java:87)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
>         at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>         at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
>         at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>         at 
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>         at 
> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>         at 
> org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:258)
>         at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>         at 
> org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
>         at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
>         at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
>         at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
>         at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
>         at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
>         at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
>         at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
>         at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
>         at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>         at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
>         at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
>         at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
>         at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
>         at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
>         at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:384)
>         at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:345)
>         at 
> org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:126)
>         at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:418)
>  {code}




[jira] [Updated] (HADOOP-18340) deleteOnExit does not work with S3AFileSystem

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18340?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot updated HADOOP-18340:

Labels: pull-request-available  (was: )

> deleteOnExit does not work with S3AFileSystem
> -
>
> Key: HADOOP-18340
> URL: https://issues.apache.org/jira/browse/HADOOP-18340
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs/s3
>Affects Versions: 3.3.3
>Reporter: Huaxiang Sun
>Priority: Minor
>  Labels: pull-request-available
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> When deleteOnExit is set on some paths, they are not removed when the file 
> system object is closed. The following exception is logged at INFO level when 
> the failure is reported.
> {code:java}
> 2022-07-15 19:29:12,552 [main] INFO  fs.FileSystem 
> (FileSystem.java:processDeleteOnExit(1810)) - Ignoring failure to 
> deleteOnExit for path /file, exception {}
> java.io.IOException: s3a://mock-bucket: FileSystem is closed!
>         at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.checkNotClosed(S3AFileSystem.java:3887)
>         at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2333)
>         at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2355)
>         at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.exists(S3AFileSystem.java:4402)
>         at 
> org.apache.hadoop.fs.FileSystem.processDeleteOnExit(FileSystem.java:1805)
>         at org.apache.hadoop.fs.FileSystem.close(FileSystem.java:2669)
>         at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.close(S3AFileSystem.java:3830)
>         at 
> org.apache.hadoop.fs.s3a.TestS3AGetFileStatus.testFile(TestS3AGetFileStatus.java:87)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
>         at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>         at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
>         at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>         at 
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>         at 
> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>         at 
> org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:258)
>         at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>         at 
> org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
>         at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
>         at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
>         at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
>         at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
>         at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
>         at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
>         at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
>         at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
>         at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>         at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
>         at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
>         at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
>         at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
>         at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
>         at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:384)
>         at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:345)
>         at 
> org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:126)
>         at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:418)
>  {code}




[jira] [Work logged] (HADOOP-18340) deleteOnExit does not work with S3AFileSystem

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18340?focusedWorklogId=793971=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793971
 ]

ASF GitHub Bot logged work on HADOOP-18340:
---

Author: ASF GitHub Bot
Created on: 21/Jul/22 23:27
Start Date: 21/Jul/22 23:27
Worklog Time Spent: 10m 
  Work Description: huaxiangsun opened a new pull request, #4608:
URL: https://github.com/apache/hadoop/pull/4608

   
   
   ### Description of PR
   processDeleteOnExit() is overridden in S3AFileSystem: it skips the exists() 
check and deletes the objects without checking whether the FileSystem is closed. 
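
   For context, a minimal sketch of the scenario (the bucket name and path below 
are placeholders, not taken from this PR):

   ```java
   import java.net.URI;
   import org.apache.hadoop.conf.Configuration;
   import org.apache.hadoop.fs.FileSystem;
   import org.apache.hadoop.fs.Path;

   public class DeleteOnExitRepro {
     public static void main(String[] args) throws Exception {
       Configuration conf = new Configuration();
       // Placeholder bucket; a real run needs S3A credentials configured.
       FileSystem fs = FileSystem.get(new URI("s3a://my-test-bucket/"), conf);
       Path tmp = new Path("/tmp/delete-on-exit-demo");
       fs.create(tmp).close();   // write a small object
       fs.deleteOnExit(tmp);     // register the path for deletion at close()
       fs.close();               // close() runs processDeleteOnExit(); without the
                                 // fix the exists() probe fails with
                                 // "FileSystem is closed!" and the object remains
     }
   }
   ```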
   
   ### How was this patch tested?
   A new unit test case was added, and all unit tests under 
hadoop-tools/hadoop-aws passed.
   mvn -Dparallel-tests clean test
   
   Ran the S3A integration tests against the us-west-2 region; there were a few 
failures/errors. Running trunk without the patch produces the same 
errors/failures, so they are not caused by the patch and are probably due to 
misconfiguration (I could not figure it out).
   mvn -Dparallel-tests clean verify
   
   The result is
   `
   Tests | Errors | Failures | Skipped | Success Rate | Time
   

Issue Time Tracking
---

Worklog Id: (was: 793971)
Remaining Estimate: 0h
Time Spent: 10m

> deleteOnExit does not work with S3AFileSystem
> -
>
> Key: HADOOP-18340
> URL: https://issues.apache.org/jira/browse/HADOOP-18340
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs/s3
>Affects Versions: 3.3.3
>Reporter: Huaxiang Sun
>Priority: Minor
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> When deleteOnExit is set on some paths, they are not removed when the file 
> system object is closed. The following exception is logged when the failure 
> is reported in the info log.
> {code:java}
> 2022-07-15 19:29:12,552 [main] INFO  fs.FileSystem 
> (FileSystem.java:processDeleteOnExit(1810)) - Ignoring failure to 
> deleteOnExit for path /file, exception {}
> java.io.IOException: s3a://mock-bucket: FileSystem is closed!
>         at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.checkNotClosed(S3AFileSystem.java:3887)
>         at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2333)
>         at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2355)
>         at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.exists(S3AFileSystem.java:4402)
>         at 
> org.apache.hadoop.fs.FileSystem.processDeleteOnExit(FileSystem.java:1805)
>         at org.apache.hadoop.fs.FileSystem.close(FileSystem.java:2669)
>         at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.close(S3AFileSystem.java:3830)
>         at 
> org.apache.hadoop.fs.s3a.TestS3AGetFileStatus.testFile(TestS3AGetFileStatus.java:87)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
>         at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>         at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
>         at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>         at 
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>         at 
> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>         at 
> org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:258)
>         at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>         at 
> org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
>         at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
>         at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
>         at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
>         at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
>         at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
>         at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
>         at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
>         at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
>         at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>         at 

[GitHub] [hadoop] huaxiangsun opened a new pull request, #4608: HADOOP-18340 deleteOnExit does not work with S3AFileSystem

2022-07-21 Thread GitBox


huaxiangsun opened a new pull request, #4608:
URL: https://github.com/apache/hadoop/pull/4608

   
   
   ### Description of PR
   processDeleteOnExit() is overridden in S3AFileSystem: it skips the exists() 
check and deletes the objects without checking whether the FileSystem is closed. 
   
   ### How was this patch tested?
   A new unit test case was added, and all unit tests under 
hadoop-tools/hadoop-aws passed.
   mvn -Dparallel-tests clean test
   
   Ran the S3A integration tests against the us-west-2 region; there were a few 
failures/errors. Running trunk without the patch produces the same 
errors/failures, so they are not caused by the patch and are probably due to 
misconfiguration (I could not figure it out).
   mvn -Dparallel-tests clean verify
   
   The result is
   `
   Tests | Errors | Failures | Skipped | Success Rate | Time
   -- | -- | -- | -- | -- | --
   1252 | 6 | 1 | 270 | 77.875% | 3,627.473
   
   `
   The errors are 
   `
   
org.apache.hadoop.fs.s3a.auth.delegation.ITestDelegatedMRJob#testCommonCrawlLookup[1]
 + [ Detail ] | 0.324
   
     | s3a://hbase-test-data/fork-0001/test: getFileStatus on 
s3a://hbase-test-data/fork-0001/test: 
com.amazonaws.services.s3.model.AmazonS3Exception: The AWS Access Key Id you 
provided does not exist in our records. (Service: Amazon S3; Status Code: 403; 
Error Code: InvalidAccessKeyId; Request ID: XJ3DCCR6Q7SXTJDW; S3 Extended 
Request ID: 
fRmP3m1lThWxhj3s9VkSNEtuBz1JeBWYw65aRajrSg/H7IN+muB7d8PavSeqJ2urvLZtguTbnlc=; 
Proxy: null), S3 Extended Request ID: 
fRmP3m1lThWxhj3s9VkSNEtuBz1JeBWYw65aRajrSg/H7IN+muB7d8PavSeqJ2urvLZtguTbnlc=:InvalidAccessKeyId
 |  
     |   |  
     | testJobSubmissionCollectsTokens[1] + [ Detail ] | 0.329
     | s3a://hbase-test-data/fork-0001/test: getFileStatus on 
s3a://hbase-test-data/fork-0001/test: 
com.amazonaws.services.s3.model.AmazonS3Exception: The AWS Access Key Id you 
provided does not exist in our records. (Service: Amazon S3; Status Code: 403; 
Error Code: InvalidAccessKeyId; Request ID: XJ35WHK7X6EMP9B6; S3 Extended 
Request ID: 
rtWEsDYcGqNiaoKy2D5EQQqN+O7MbYe1bYbiSmkF+FOz9/wb6+t+dQooqj7ppCSCZMBgC3PeEw4=; 
Proxy: null), S3 Extended Request ID: 
rtWEsDYcGqNiaoKy2D5EQQqN+O7MbYe1bYbiSmkF+FOz9/wb6+t+dQooqj7ppCSCZMBgC3PeEw4=:InvalidAccessKeyId
   `
   
   `
   
org.apache.hadoop.fs.s3a.ITestS3AEndpointRegion#testBlankRegionTriggersSDKResolution
 + [ Detail ] | 2.817
   
     | [Client region name] expected:<"[mars-north]-2"> but was:<"[us-west]-2">
   `
   and 
   `
   org.apache.hadoop.fs.s3a.ITestS3ATemporaryCredentials#testSTS 
     | : request session credentials: 
com.amazonaws.services.securitytoken.model.AWSSecurityTokenServiceException: 
Cannot call GetSessionToken with session credentials (Service: 
AWSSecurityTokenService; Status Code: 403; Error Code: AccessDenied; Request 
ID: d935696d-7d98-4a1c-825f-22bf3d28ed9d; Proxy: null):AccessDenied
   `
   
   ### For code changes:
   
   - [ ] Does the title of this PR start with the corresponding JIRA issue id 
(e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] goiri merged pull request #4488: HDFS-16640. RBF: Show datanode IP list when click DN histogram in Router

2022-07-21 Thread GitBox


goiri merged PR #4488:
URL: https://github.com/apache/hadoop/pull/4488


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] goiri commented on a diff in pull request #4597: HDFS-16671. RBF: RouterRpcFairnessPolicyController supports configurable permit acquire timeout

2022-07-21 Thread GitBox


goiri commented on code in PR #4597:
URL: https://github.com/apache/hadoop/pull/4597#discussion_r927170619


##
hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/fairness/AbstractRouterRpcFairnessPolicyController.java:
##
@@ -42,15 +45,22 @@
   /** Hash table to hold semaphore for each configured name service. */
   private Map permits;
 
+  private long acquireTimeoutMs = DFS_ROUTER_FAIRNESS_ACQUIRE_TIMEOUT_DEFAULT;
+
   public void init(Configuration conf) {
 this.permits = new HashMap<>();
+long timeoutMs = 
conf.getTimeDuration(DFS_ROUTER_FAIRNESS_ACQUIRE_TIMEOUT_MS,

Review Comment:
   If we use getTimeDuration(), we don't need the unit suffix in the key 
DFS_ROUTER_FAIRNESS_ACQUIRE_TIMEOUT_MS.
   We should set the default in the XML to "1s".
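
   A small sketch of that suggestion (the key and default names below are 
illustrative, not the exact constants in the patch):

   ```java
   import java.util.concurrent.TimeUnit;
   import org.apache.hadoop.conf.Configuration;

   public class AcquireTimeoutConfigSketch {
     // Illustrative key/default; the real constants live in RBFConfigKeys.
     static final String ACQUIRE_TIMEOUT_KEY =
         "dfs.federation.router.fairness.acquire.timeout";
     static final long ACQUIRE_TIMEOUT_DEFAULT_MS = 1000L;   // i.e. "1s" in the XML

     public static void main(String[] args) {
       Configuration conf = new Configuration();
       // getTimeDuration() accepts values like "1s" or "500ms" (or a bare number
       // in the requested unit), so the key itself no longer needs a ".ms" suffix.
       long timeoutMs = conf.getTimeDuration(
           ACQUIRE_TIMEOUT_KEY, ACQUIRE_TIMEOUT_DEFAULT_MS, TimeUnit.MILLISECONDS);
       System.out.println("acquire timeout = " + timeoutMs + " ms");
     }
   }
   ```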



##
hadoop-hdfs-project/hadoop-hdfs-rbf/src/test/java/org/apache/hadoop/hdfs/server/federation/fairness/TestRouterRpcFairnessPolicyController.java:
##
@@ -83,6 +85,29 @@ public void testHandlerAllocationPreconfigured() {
 
assertFalse(routerRpcFairnessPolicyController.acquirePermit(CONCURRENT_NS));
   }
 
+  @Test
+  public void testAcquireTimeout() {
+Configuration conf = createConf(40);
+conf.setInt(DFS_ROUTER_FAIR_HANDLER_COUNT_KEY_PREFIX + "ns1", 30);
+conf.setLong(DFS_ROUTER_FAIRNESS_ACQUIRE_TIMEOUT_MS, 100);

Review Comment:
   setTimeDuration(DFS_ROUTER_FAIRNESS_ACQUIRE_TIMEOUT, 1, TimeUnit.SECONDS)
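
   Roughly, the test-side counterpart (the key name is illustrative):

   ```java
   import java.util.concurrent.TimeUnit;
   import org.apache.hadoop.conf.Configuration;

   public class AcquireTimeoutTestSketch {
     static final String ACQUIRE_TIMEOUT_KEY =
         "dfs.federation.router.fairness.acquire.timeout";   // illustrative key

     public static void main(String[] args) {
       Configuration conf = new Configuration();
       // Set the duration with an explicit unit instead of a raw millisecond long;
       // Configuration stores it as "1s".
       conf.setTimeDuration(ACQUIRE_TIMEOUT_KEY, 1, TimeUnit.SECONDS);
       System.out.println(conf.get(ACQUIRE_TIMEOUT_KEY));   // prints 1s
     }
   }
   ```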



##
hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/resources/hdfs-rbf-default.xml:
##
@@ -723,6 +723,14 @@
 
   
 
+  
+dfs.federation.router.fairness.acquire.timeout.ms
+1000

Review Comment:
   Use "1s" as the value and drop the ".ms" suffix from the key name.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] ZanderXu commented on a diff in pull request #4606: HDFS-16678. RBF should supports disable getNodeUsage() in RBFMetrics

2022-07-21 Thread GitBox


ZanderXu commented on code in PR #4606:
URL: https://github.com/apache/hadoop/pull/4606#discussion_r927154231


##
hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/metrics/RBFMetrics.java:
##
@@ -544,28 +548,30 @@ public String getNodeUsage() {
 
 final Map> info = new HashMap<>();
 try {
-  RouterRpcServer rpcServer = this.router.getRpcServer();
-  DatanodeInfo[] live = rpcServer.getDatanodeReport(
-  DatanodeReportType.LIVE, false, timeOut);
-
-  if (live.length > 0) {
-float totalDfsUsed = 0;
-float[] usages = new float[live.length];
-int i = 0;
-for (DatanodeInfo dn : live) {
-  usages[i++] = dn.getDfsUsedPercent();
-  totalDfsUsed += dn.getDfsUsedPercent();
-}
-totalDfsUsed /= live.length;
-Arrays.sort(usages);
-median = usages[usages.length / 2];
-max = usages[usages.length - 1];
-min = usages[0];
-
-for (i = 0; i < usages.length; i++) {
-  dev += (usages[i] - totalDfsUsed) * (usages[i] - totalDfsUsed);
+  if (this.enableGetDNUsage) {
+RouterRpcServer rpcServer = this.router.getRpcServer();
+DatanodeInfo[] live = rpcServer.getDatanodeReport(
+DatanodeReportType.LIVE, false, timeOut);
+
+if (live.length > 0) {
+  float totalDfsUsed = 0;
+  float[] usages = new float[live.length];
+  int i = 0;
+  for (DatanodeInfo dn : live) {
+usages[i++] = dn.getDfsUsedPercent();

Review Comment:
   Yes, `rpcServer.getDatanodeReport()` is expensive. As the number of DNs or 
downstream nameservices in the cluster increases, it becomes more and more 
expensive, for example with 10,000+ DNs, 50,000+ DNs, 20+ NSs, or 50+ NSs.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] ZanderXu commented on a diff in pull request #4606: HDFS-16678. RBF should supports disable getNodeUsage() in RBFMetrics

2022-07-21 Thread GitBox


ZanderXu commented on code in PR #4606:
URL: https://github.com/apache/hadoop/pull/4606#discussion_r927153919


##
hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/router/RBFConfigKeys.java:
##
@@ -315,6 +315,9 @@ public class RBFConfigKeys extends 
CommonConfigurationKeysPublic {
   FEDERATION_ROUTER_PREFIX + "dn-report.cache-expire";
   public static final long DN_REPORT_CACHE_EXPIRE_MS_DEFAULT =
   TimeUnit.SECONDS.toMillis(10);
+  public static final String DFS_ROUTER_ENABLE_GET_DN_USAGE_KEY =
+  FEDERATION_ROUTER_PREFIX + "enable.get.dn.usage";
+  public static final boolean DFS_ROUTER_ENABLE_GET_DN_USAGE_DEFAULT = true;

Review Comment:
   Copy, I will do it.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-18354) upgrade reload4j due to XXE vulnerability

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18354?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot updated HADOOP-18354:

Labels: pull-request-available  (was: )

> upgrade reload4j due to XXE vulnerability
> -
>
> Key: HADOOP-18354
> URL: https://issues.apache.org/jira/browse/HADOOP-18354
> Project: Hadoop Common
>  Issue Type: Improvement
>Reporter: PJ Fanning
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> https://github.com/qos-ch/reload4j/issues/53 fixed in reload4j 1.2.22



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Work logged] (HADOOP-18354) upgrade reload4j due to XXE vulnerability

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18354?focusedWorklogId=793958=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793958
 ]

ASF GitHub Bot logged work on HADOOP-18354:
---

Author: ASF GitHub Bot
Created on: 21/Jul/22 22:36
Start Date: 21/Jul/22 22:36
Worklog Time Spent: 10m 
  Work Description: pjfanning opened a new pull request, #4607:
URL: https://github.com/apache/hadoop/pull/4607

   
   
   ### Description of PR
   
   XXE issue in reload4j (probably not very exploitable)
   
   
   ### How was this patch tested?
   
   
   ### For code changes:
   
   - [X] Does the title of this PR start with the corresponding JIRA issue id 
(e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   




Issue Time Tracking
---

Worklog Id: (was: 793958)
Remaining Estimate: 0h
Time Spent: 10m

> upgrade reload4j due to XXE vulnerability
> -
>
> Key: HADOOP-18354
> URL: https://issues.apache.org/jira/browse/HADOOP-18354
> Project: Hadoop Common
>  Issue Type: Improvement
>Reporter: PJ Fanning
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> https://github.com/qos-ch/reload4j/issues/53 fixed in reload4j 1.2.22



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] pjfanning opened a new pull request, #4607: HADOOP-18354: reload4j 1.2.22

2022-07-21 Thread GitBox


pjfanning opened a new pull request, #4607:
URL: https://github.com/apache/hadoop/pull/4607

   
   
   ### Description of PR
   
   XXE issue in reload4j (probably not very exploitable)
   
   
   ### How was this patch tested?
   
   
   ### For code changes:
   
   - [X] Does the title of this PR start with the corresponding JIRA issue id 
(e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-18354) upgrade reload4j due to XXE vulnerability

2022-07-21 Thread PJ Fanning (Jira)
PJ Fanning created HADOOP-18354:
---

 Summary: upgrade reload4j due to XXE vulnerability
 Key: HADOOP-18354
 URL: https://issues.apache.org/jira/browse/HADOOP-18354
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: PJ Fanning


https://github.com/qos-ch/reload4j/issues/53 fixed in reload4j 1.2.22



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] jojochuang commented on a diff in pull request #4155: HDFS-16533. COMPOSITE_CRC failed between replicated file and striped …

2022-07-21 Thread GitBox


jojochuang commented on code in PR #4155:
URL: https://github.com/apache/hadoop/pull/4155#discussion_r927145405


##
hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java:
##
@@ -303,7 +303,8 @@ FileChecksum makeCompositeCrcResult() throws IOException {
   byte[] blockChecksumBytes = blockChecksumBuf.getData();
 
   long sumBlockLengths = 0;
-  for (int i = 0; i < locatedBlocks.size() - 1; ++i) {
+  int i = 0;

Review Comment:
   is this change necessary?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on pull request #4601: HDFS-16467. Ensure Protobuf generated headers are included first

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4601:
URL: https://github.com/apache/hadoop/pull/4601#issuecomment-1191987874

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  21m 36s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 7 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  22m 42s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   4m 10s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   1m  4s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  47m 29s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 32s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   3m 45s |  |  the patch passed  |
   | +1 :green_heart: |  cc  |   3m 45s |  |  the patch passed  |
   | +1 :green_heart: |  golang  |   3m 45s |  |  the patch passed  |
   | +1 :green_heart: |  javac  |   3m 45s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  mvnsite  |   0m 36s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  19m  6s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  33m 11s |  |  hadoop-hdfs-native-client in 
the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 57s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 129m 50s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4601/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4601 |
   | Optional Tests | dupname asflicense compile cc mvnsite javac unit 
codespell detsecrets golang |
   | uname | Linux 61d15e62e08c 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 6845f0c7d9631c56b92ded7adc539c0de23a1045 |
   | Default Java | Red Hat, Inc.-1.8.0_312-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4601/2/testReport/ |
   | Max. process+thread count | 553 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: 
hadoop-hdfs-project/hadoop-hdfs-native-client |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4601/2/console |
   | versions | git=2.27.0 maven=3.6.3 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] goiri commented on a diff in pull request #4531: HDFS-13274. RBF: Extend RouterRpcClient to use multiple sockets

2022-07-21 Thread GitBox


goiri commented on code in PR #4531:
URL: https://github.com/apache/hadoop/pull/4531#discussion_r927072119


##
hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/router/RBFConfigKeys.java:
##
@@ -135,6 +135,13 @@ public class RBFConfigKeys extends 
CommonConfigurationKeysPublic {
   FEDERATION_ROUTER_PREFIX + "connection.clean.ms";
   public static final long DFS_ROUTER_NAMENODE_CONNECTION_CLEAN_MS_DEFAULT =
   TimeUnit.SECONDS.toMillis(10);
+  public static final String DFS_ROUTER_NAMENODE_ENABLE_MULTIPLE_SOCKET_KEY =
+  FEDERATION_ROUTER_PREFIX + "enable.multiple.socket";
+  public static final boolean 
DFS_ROUTER_NAMENODE_ENABLE_MULTIPLE_SOCKET_DEFAULT

Review Comment:
   Actually this fits in one line.



##
hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/resources/hdfs-rbf-default.xml:
##
@@ -134,6 +134,30 @@
 
   
 
+  
+dfs.federation.router.enable.multiple.socket
+false
+
+  If enable multiple downstream socket or not. If true, ConnectionPool
+  will use a new socket when creating a new connection for the same user,
+  and RouterRPCClient will get a better throughput. It's best used with
+  dfs.federation.router.max.concurrency.per.connection together to get
+  a better throughput with fewer sockets.
+
+  
+
+  
+dfs.federation.router.max.concurrency.per.connection
+1
+
+  The maximum number of requests that a connection can handle concurrently.
+  It's best used with dfs.federation.router.enable.multiple.socket 
together.

Review Comment:
   If dfs.federation.router.enable.multiple.socket=false, does it even take 
effect?
   Can we also give an example or a rule of thumb for what value would be good 
(moderate is a little ambiguous)?
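
   For reference, a sketch of how the two settings would be set together (the 
values are only placeholders, not a recommendation):

   ```java
   import org.apache.hadoop.conf.Configuration;

   public class RouterMultiSocketConfigSketch {
     public static void main(String[] args) {
       Configuration conf = new Configuration();
       // The documentation text above says the two keys are meant to be tuned
       // together: multiple sockets per user, each connection carrying several
       // concurrent requests.
       conf.setBoolean("dfs.federation.router.enable.multiple.socket", true);
       // Placeholder value; the open question above is what a good setting is.
       conf.setInt("dfs.federation.router.max.concurrency.per.connection", 10);
     }
   }
   ```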



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Work logged] (HADOOP-18190) s3a prefetching streams to collect iostats on prefetching operations

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18190?focusedWorklogId=793925=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793925
 ]

ASF GitHub Bot logged work on HADOOP-18190:
---

Author: ASF GitHub Bot
Created on: 21/Jul/22 20:23
Start Date: 21/Jul/22 20:23
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on PR #4458:
URL: https://github.com/apache/hadoop/pull/4458#issuecomment-1191898786

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   1m 26s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 6 new or modified test files.  |
    _ feature-HADOOP-18028-s3a-prefetch Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 17s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  31m 49s |  |  
feature-HADOOP-18028-s3a-prefetch passed  |
   | +1 :green_heart: |  compile  |  28m  0s |  |  
feature-HADOOP-18028-s3a-prefetch passed with JDK Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |  23m 31s |  |  
feature-HADOOP-18028-s3a-prefetch passed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   5m 26s |  |  
feature-HADOOP-18028-s3a-prefetch passed  |
   | +1 :green_heart: |  mvnsite  |   3m 34s |  |  
feature-HADOOP-18028-s3a-prefetch passed  |
   | +1 :green_heart: |  javadoc  |   2m 37s |  |  
feature-HADOOP-18028-s3a-prefetch passed with JDK Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m 10s |  |  
feature-HADOOP-18028-s3a-prefetch passed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   5m  5s |  |  
feature-HADOOP-18028-s3a-prefetch passed  |
   | +1 :green_heart: |  shadedclient  |  26m  4s |  |  branch has no errors 
when building and testing our client artifacts.  |
   | -0 :warning: |  patch  |  26m 34s |  |  Used diff version of patch file. 
Binary files and potentially other changes not applied. Please rebase and 
squash commits if necessary.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 30s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   2m  2s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  26m 54s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |  26m 54s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  22m 43s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |  22m 43s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   4m 31s | 
[/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4458/9/artifact/out/results-checkstyle-root.txt)
 |  root: The patch generated 8 new + 1 unchanged - 0 fixed = 9 total (was 1)  |
   | +1 :green_heart: |  mvnsite  |   3m 27s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m 21s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m  7s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | -1 :x: |  spotbugs  |   2m  8s | 
[/new-spotbugs-hadoop-tools_hadoop-aws.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4458/9/artifact/out/new-spotbugs-hadoop-tools_hadoop-aws.html)
 |  hadoop-tools/hadoop-aws generated 1 new + 0 unchanged - 0 fixed = 1 total 
(was 0)  |
   | +1 :green_heart: |  shadedclient  |  25m 30s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  18m 14s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   3m 12s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   1m 18s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 266m 57s |  |  |
   
   
   | Reason | Tests |
   |---:|:--|
   | SpotBugs | module:hadoop-tools/hadoop-aws |
   |  |  Possible null pointer dereference of tracker in 
org.apache.hadoop.fs.common.CachingBlockManager.readBlock(BufferData, boolean, 
BufferData$State[])  Dereferenced at 

[GitHub] [hadoop] hadoop-yetus commented on pull request #4458: HADOOP-18190. Adds iostats for prefetching

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4458:
URL: https://github.com/apache/hadoop/pull/4458#issuecomment-1191898786

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   1m 26s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 6 new or modified test files.  |
    _ feature-HADOOP-18028-s3a-prefetch Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 17s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  31m 49s |  |  
feature-HADOOP-18028-s3a-prefetch passed  |
   | +1 :green_heart: |  compile  |  28m  0s |  |  
feature-HADOOP-18028-s3a-prefetch passed with JDK Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |  23m 31s |  |  
feature-HADOOP-18028-s3a-prefetch passed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   5m 26s |  |  
feature-HADOOP-18028-s3a-prefetch passed  |
   | +1 :green_heart: |  mvnsite  |   3m 34s |  |  
feature-HADOOP-18028-s3a-prefetch passed  |
   | +1 :green_heart: |  javadoc  |   2m 37s |  |  
feature-HADOOP-18028-s3a-prefetch passed with JDK Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m 10s |  |  
feature-HADOOP-18028-s3a-prefetch passed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   5m  5s |  |  
feature-HADOOP-18028-s3a-prefetch passed  |
   | +1 :green_heart: |  shadedclient  |  26m  4s |  |  branch has no errors 
when building and testing our client artifacts.  |
   | -0 :warning: |  patch  |  26m 34s |  |  Used diff version of patch file. 
Binary files and potentially other changes not applied. Please rebase and 
squash commits if necessary.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 30s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   2m  2s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  26m 54s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |  26m 54s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  22m 43s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |  22m 43s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   4m 31s | 
[/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4458/9/artifact/out/results-checkstyle-root.txt)
 |  root: The patch generated 8 new + 1 unchanged - 0 fixed = 9 total (was 1)  |
   | +1 :green_heart: |  mvnsite  |   3m 27s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m 21s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m  7s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | -1 :x: |  spotbugs  |   2m  8s | 
[/new-spotbugs-hadoop-tools_hadoop-aws.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4458/9/artifact/out/new-spotbugs-hadoop-tools_hadoop-aws.html)
 |  hadoop-tools/hadoop-aws generated 1 new + 0 unchanged - 0 fixed = 1 total 
(was 0)  |
   | +1 :green_heart: |  shadedclient  |  25m 30s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  18m 14s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   3m 12s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   1m 18s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 266m 57s |  |  |
   
   
   | Reason | Tests |
   |---:|:--|
   | SpotBugs | module:hadoop-tools/hadoop-aws |
   |  |  Possible null pointer dereference of tracker in 
org.apache.hadoop.fs.common.CachingBlockManager.readBlock(BufferData, boolean, 
BufferData$State[])  Dereferenced at CachingBlockManager.java:tracker in 
org.apache.hadoop.fs.common.CachingBlockManager.readBlock(BufferData, boolean, 
BufferData$State[])  Dereferenced at CachingBlockManager.java:[line 365] |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4458/9/artifact/out/Dockerfile
 |
   | GITHUB PR | 

[GitHub] [hadoop] hadoop-yetus commented on pull request #4601: HDFS-16467. Ensure Protobuf generated headers are included first

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4601:
URL: https://github.com/apache/hadoop/pull/4601#issuecomment-1191884859

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 38s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 7 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  39m 55s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   3m 59s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 50s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  63m 48s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 25s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   3m 38s |  |  the patch passed  |
   | +1 :green_heart: |  cc  |   3m 38s |  |  the patch passed  |
   | +1 :green_heart: |  golang  |   3m 38s |  |  the patch passed  |
   | +1 :green_heart: |  javac  |   3m 38s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  mvnsite  |   0m 28s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  18m 59s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  32m 41s |  |  hadoop-hdfs-native-client in 
the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 54s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 124m 18s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4601/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4601 |
   | Optional Tests | dupname asflicense compile cc mvnsite javac unit 
codespell detsecrets golang |
   | uname | Linux abcc7e8d04f2 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 6845f0c7d9631c56b92ded7adc539c0de23a1045 |
   | Default Java | Red Hat, Inc.-1.8.0_332-b09 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4601/2/testReport/ |
   | Max. process+thread count | 560 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: 
hadoop-hdfs-project/hadoop-hdfs-native-client |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4601/2/console |
   | versions | git=2.9.5 maven=3.6.3 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Work logged] (HADOOP-18190) s3a prefetching streams to collect iostats on prefetching operations

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18190?focusedWorklogId=793918=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793918
 ]

ASF GitHub Bot logged work on HADOOP-18190:
---

Author: ASF GitHub Bot
Created on: 21/Jul/22 19:38
Start Date: 21/Jul/22 19:38
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on PR #4458:
URL: https://github.com/apache/hadoop/pull/4458#issuecomment-1191862585

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  21m  2s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 6 new or modified test files.  |
    _ feature-HADOOP-18028-s3a-prefetch Compile Tests _ |
   | +0 :ok: |  mvndep  |  14m 45s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  28m 20s |  |  
feature-HADOOP-18028-s3a-prefetch passed  |
   | +1 :green_heart: |  compile  |  26m 41s |  |  
feature-HADOOP-18028-s3a-prefetch passed with JDK Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |  23m 53s |  |  
feature-HADOOP-18028-s3a-prefetch passed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   4m 45s |  |  
feature-HADOOP-18028-s3a-prefetch passed  |
   | +1 :green_heart: |  mvnsite  |   4m 29s |  |  
feature-HADOOP-18028-s3a-prefetch passed  |
   | +1 :green_heart: |  javadoc  |   2m 39s |  |  
feature-HADOOP-18028-s3a-prefetch passed with JDK Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m 14s |  |  
feature-HADOOP-18028-s3a-prefetch passed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   5m 16s |  |  
feature-HADOOP-18028-s3a-prefetch passed  |
   | +1 :green_heart: |  shadedclient  |  26m 17s |  |  branch has no errors 
when building and testing our client artifacts.  |
   | -0 :warning: |  patch  |  26m 45s |  |  Used diff version of patch file. 
Binary files and potentially other changes not applied. Please rebase and 
squash commits if necessary.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 32s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 48s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  27m 28s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |  27m 28s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  23m  2s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |  23m  2s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   4m 43s | 
[/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4458/8/artifact/out/results-checkstyle-root.txt)
 |  root: The patch generated 8 new + 1 unchanged - 0 fixed = 9 total (was 1)  |
   | +1 :green_heart: |  mvnsite  |   4m 23s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m 24s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m 13s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | -1 :x: |  spotbugs  |   1m 53s | 
[/new-spotbugs-hadoop-tools_hadoop-aws.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4458/8/artifact/out/new-spotbugs-hadoop-tools_hadoop-aws.html)
 |  hadoop-tools/hadoop-aws generated 1 new + 0 unchanged - 0 fixed = 1 total 
(was 0)  |
   | +1 :green_heart: |  shadedclient  |  27m 19s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  20m 44s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   3m 22s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   1m 19s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 288m 57s |  |  |
   
   
   | Reason | Tests |
   |---:|:--|
   | SpotBugs | module:hadoop-tools/hadoop-aws |
   |  |  Possible null pointer dereference of tracker in 
org.apache.hadoop.fs.common.CachingBlockManager.readBlock(BufferData, boolean, 
BufferData$State[])  Dereferenced at 

[GitHub] [hadoop] hadoop-yetus commented on pull request #4458: HADOOP-18190. Adds iostats for prefetching

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4458:
URL: https://github.com/apache/hadoop/pull/4458#issuecomment-1191862585

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  21m  2s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 6 new or modified test files.  |
    _ feature-HADOOP-18028-s3a-prefetch Compile Tests _ |
   | +0 :ok: |  mvndep  |  14m 45s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  28m 20s |  |  
feature-HADOOP-18028-s3a-prefetch passed  |
   | +1 :green_heart: |  compile  |  26m 41s |  |  
feature-HADOOP-18028-s3a-prefetch passed with JDK Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |  23m 53s |  |  
feature-HADOOP-18028-s3a-prefetch passed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   4m 45s |  |  
feature-HADOOP-18028-s3a-prefetch passed  |
   | +1 :green_heart: |  mvnsite  |   4m 29s |  |  
feature-HADOOP-18028-s3a-prefetch passed  |
   | +1 :green_heart: |  javadoc  |   2m 39s |  |  
feature-HADOOP-18028-s3a-prefetch passed with JDK Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m 14s |  |  
feature-HADOOP-18028-s3a-prefetch passed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   5m 16s |  |  
feature-HADOOP-18028-s3a-prefetch passed  |
   | +1 :green_heart: |  shadedclient  |  26m 17s |  |  branch has no errors 
when building and testing our client artifacts.  |
   | -0 :warning: |  patch  |  26m 45s |  |  Used diff version of patch file. 
Binary files and potentially other changes not applied. Please rebase and 
squash commits if necessary.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 32s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 48s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  27m 28s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |  27m 28s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  23m  2s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |  23m  2s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   4m 43s | 
[/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4458/8/artifact/out/results-checkstyle-root.txt)
 |  root: The patch generated 8 new + 1 unchanged - 0 fixed = 9 total (was 1)  |
   | +1 :green_heart: |  mvnsite  |   4m 23s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m 24s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m 13s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | -1 :x: |  spotbugs  |   1m 53s | 
[/new-spotbugs-hadoop-tools_hadoop-aws.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4458/8/artifact/out/new-spotbugs-hadoop-tools_hadoop-aws.html)
 |  hadoop-tools/hadoop-aws generated 1 new + 0 unchanged - 0 fixed = 1 total 
(was 0)  |
   | +1 :green_heart: |  shadedclient  |  27m 19s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  20m 44s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   3m 22s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   1m 19s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 288m 57s |  |  |
   
   
   | Reason | Tests |
   |---:|:--|
   | SpotBugs | module:hadoop-tools/hadoop-aws |
   |  |  Possible null pointer dereference of tracker in 
org.apache.hadoop.fs.common.CachingBlockManager.readBlock(BufferData, boolean, 
BufferData$State[])  Dereferenced at CachingBlockManager.java:tracker in 
org.apache.hadoop.fs.common.CachingBlockManager.readBlock(BufferData, boolean, 
BufferData$State[])  Dereferenced at CachingBlockManager.java:[line 365] |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4458/8/artifact/out/Dockerfile
 |
   | GITHUB PR | 

[GitHub] [hadoop] goiri merged pull request #4540: YARN-11160. Support getResourceProfiles, getResourceProfile API's for Federation

2022-07-21 Thread GitBox


goiri merged PR #4540:
URL: https://github.com/apache/hadoop/pull/4540


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] goiri commented on a diff in pull request #4594: YARN-6572. Refactoring Router services to use common util classes for pipeline creations.

2022-07-21 Thread GitBox


goiri commented on code in PR #4594:
URL: https://github.com/apache/hadoop/pull/4594#discussion_r926998010


##
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router/src/main/java/org/apache/hadoop/yarn/server/router/clientrm/RouterClientRMService.java:
##
@@ -19,11 +19,9 @@
 package org.apache.hadoop.yarn.server.router.clientrm;
 
 import java.io.IOException;
+import java.lang.reflect.InvocationTargetException;

Review Comment:
   import java.lang.reflect.InvocationTargetException;:8: Unused import - 
java.lang.reflect.InvocationTargetException. [UnusedImports]



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on pull request #4603: YARN-10793. Upgrade Junit from 4 to 5 in hadoop-yarn-server-applicationhistoryservice

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4603:
URL: https://github.com/apache/hadoop/pull/4603#issuecomment-1191779140

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 46s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  1s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 20 new or modified test files.  |
    _ trunk Compile Tests _ |
   | -1 :x: |  mvninstall  |   0m 31s | 
[/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/4/artifact/out/branch-mvninstall-root.txt)
 |  root in trunk failed.  |
   | -1 :x: |  compile  |   0m 32s | 
[/branch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/4/artifact/out/branch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt)
 |  hadoop-yarn-server-applicationhistoryservice in trunk failed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.  |
   | -1 :x: |  compile  |   0m 31s | 
[/branch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/4/artifact/out/branch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  hadoop-yarn-server-applicationhistoryservice in trunk failed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | -0 :warning: |  checkstyle  |   0m 29s | 
[/buildtool-branch-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/4/artifact/out/buildtool-branch-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice.txt)
 |  The patch fails to run checkstyle in 
hadoop-yarn-server-applicationhistoryservice  |
   | -1 :x: |  mvnsite  |   0m 31s | 
[/branch-mvnsite-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/4/artifact/out/branch-mvnsite-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice.txt)
 |  hadoop-yarn-server-applicationhistoryservice in trunk failed.  |
   | -1 :x: |  javadoc  |   0m 31s | 
[/branch-javadoc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/4/artifact/out/branch-javadoc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt)
 |  hadoop-yarn-server-applicationhistoryservice in trunk failed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.  |
   | -1 :x: |  javadoc  |   0m 31s | 
[/branch-javadoc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/4/artifact/out/branch-javadoc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  hadoop-yarn-server-applicationhistoryservice in trunk failed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | -1 :x: |  spotbugs  |   0m 31s | 
[/branch-spotbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/4/artifact/out/branch-spotbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice.txt)
 |  hadoop-yarn-server-applicationhistoryservice in trunk failed.  |
   | +1 :green_heart: |  shadedclient  |   3m 39s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | -1 :x: |  

[GitHub] [hadoop] goiri commented on a diff in pull request #4606: HDFS-16678. RBF should supports disable getNodeUsage() in RBFMetrics

2022-07-21 Thread GitBox


goiri commented on code in PR #4606:
URL: https://github.com/apache/hadoop/pull/4606#discussion_r926936268


##
hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/metrics/RBFMetrics.java:
##
@@ -544,28 +548,30 @@ public String getNodeUsage() {
 
  final Map<String, Map<String, Object>> info = new HashMap<>();
 try {
-  RouterRpcServer rpcServer = this.router.getRpcServer();
-  DatanodeInfo[] live = rpcServer.getDatanodeReport(
-  DatanodeReportType.LIVE, false, timeOut);
-
-  if (live.length > 0) {
-float totalDfsUsed = 0;
-float[] usages = new float[live.length];
-int i = 0;
-for (DatanodeInfo dn : live) {
-  usages[i++] = dn.getDfsUsedPercent();
-  totalDfsUsed += dn.getDfsUsedPercent();
-}
-totalDfsUsed /= live.length;
-Arrays.sort(usages);
-median = usages[usages.length / 2];
-max = usages[usages.length - 1];
-min = usages[0];
-
-for (i = 0; i < usages.length; i++) {
-  dev += (usages[i] - totalDfsUsed) * (usages[i] - totalDfsUsed);
+  if (this.enableGetDNUsage) {

Review Comment:
   I would do:
   ```
   DatanodeInfo[] live = null;
   if (this.enableGetDNUsage) {
     RouterRpcServer rpcServer = this.router.getRpcServer();
     live = rpcServer.getDatanodeReport(DatanodeReportType.LIVE, false, timeOut);
   } else {
     LOG.debug("Getting information is disabled."); // similar message
   }
   if (live != null && live.length > 0) {
   ```



##
hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/metrics/RBFMetrics.java:
##
@@ -544,28 +548,30 @@ public String getNodeUsage() {
 
  final Map<String, Map<String, Object>> info = new HashMap<>();
 try {
-  RouterRpcServer rpcServer = this.router.getRpcServer();
-  DatanodeInfo[] live = rpcServer.getDatanodeReport(
-  DatanodeReportType.LIVE, false, timeOut);
-
-  if (live.length > 0) {
-float totalDfsUsed = 0;
-float[] usages = new float[live.length];
-int i = 0;
-for (DatanodeInfo dn : live) {
-  usages[i++] = dn.getDfsUsedPercent();
-  totalDfsUsed += dn.getDfsUsedPercent();
-}
-totalDfsUsed /= live.length;
-Arrays.sort(usages);
-median = usages[usages.length / 2];
-max = usages[usages.length - 1];
-min = usages[0];
-
-for (i = 0; i < usages.length; i++) {
-  dev += (usages[i] - totalDfsUsed) * (usages[i] - totalDfsUsed);
+  if (this.enableGetDNUsage) {
+RouterRpcServer rpcServer = this.router.getRpcServer();
+DatanodeInfo[] live = rpcServer.getDatanodeReport(
+DatanodeReportType.LIVE, false, timeOut);
+
+if (live.length > 0) {
+  float totalDfsUsed = 0;
+  float[] usages = new float[live.length];
+  int i = 0;
+  for (DatanodeInfo dn : live) {
+usages[i++] = dn.getDfsUsedPercent();

Review Comment:
   What is the expensive part of this whole block? this?



##
hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/metrics/RBFMetrics.java:
##
@@ -544,28 +548,30 @@ public String getNodeUsage() {
 
  final Map<String, Map<String, Object>> info = new HashMap<>();
 try {
-  RouterRpcServer rpcServer = this.router.getRpcServer();
-  DatanodeInfo[] live = rpcServer.getDatanodeReport(
-  DatanodeReportType.LIVE, false, timeOut);
-
-  if (live.length > 0) {
-float totalDfsUsed = 0;
-float[] usages = new float[live.length];
-int i = 0;
-for (DatanodeInfo dn : live) {
-  usages[i++] = dn.getDfsUsedPercent();
-  totalDfsUsed += dn.getDfsUsedPercent();
-}
-totalDfsUsed /= live.length;
-Arrays.sort(usages);
-median = usages[usages.length / 2];
-max = usages[usages.length - 1];
-min = usages[0];
-
-for (i = 0; i < usages.length; i++) {
-  dev += (usages[i] - totalDfsUsed) * (usages[i] - totalDfsUsed);
+  if (this.enableGetDNUsage) {
+RouterRpcServer rpcServer = this.router.getRpcServer();
+DatanodeInfo[] live = rpcServer.getDatanodeReport(
+DatanodeReportType.LIVE, false, timeOut);
+
+if (live.length > 0) {
+  float totalDfsUsed = 0;
+  float[] usages = new float[live.length];
+  int i = 0;
+  for (DatanodeInfo dn : live) {
+usages[i++] = dn.getDfsUsedPercent();
+totalDfsUsed += dn.getDfsUsedPercent();
+  }
+  totalDfsUsed /= live.length;
+  Arrays.sort(usages);
+  median = usages[usages.length / 2];
+  max = usages[usages.length - 1];
+  min = usages[0];
+
+  for (i = 0; i < usages.length; i++) {
+dev += (usages[i] - totalDfsUsed) * (usages[i] - totalDfsUsed);
+  }
+  dev = (float) Math.sqrt(dev / 

[GitHub] [hadoop] hadoop-yetus commented on pull request #4603: YARN-10793. Upgrade Junit from 4 to 5 in hadoop-yarn-server-applicationhistoryservice

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4603:
URL: https://github.com/apache/hadoop/pull/4603#issuecomment-1191753286

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 43s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 20 new or modified test files.  |
    _ trunk Compile Tests _ |
   | -1 :x: |  mvninstall  |  21m 10s | 
[/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/3/artifact/out/branch-mvninstall-root.txt)
 |  root in trunk failed.  |
   | -1 :x: |  compile  |   0m 33s | 
[/branch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/3/artifact/out/branch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt)
 |  hadoop-yarn-server-applicationhistoryservice in trunk failed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.  |
   | +1 :green_heart: |  compile  |   0m 38s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   0m 40s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 47s |  |  trunk passed  |
   | -1 :x: |  javadoc  |   0m 41s | 
[/branch-javadoc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/3/artifact/out/branch-javadoc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt)
 |  hadoop-yarn-server-applicationhistoryservice in trunk failed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.  |
   | -1 :x: |  javadoc  |   0m 41s | 
[/branch-javadoc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/3/artifact/out/branch-javadoc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt)
 |  hadoop-yarn-server-applicationhistoryservice in trunk failed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.  |
   | -1 :x: |  spotbugs  |   0m 43s | 
[/branch-spotbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/3/artifact/out/branch-spotbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice.txt)
 |  hadoop-yarn-server-applicationhistoryservice in trunk failed.  |
   | -1 :x: |  shadedclient  |  11m 50s |  |  branch has errors when building 
and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 33s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 37s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | -1 :x: |  javac  |   0m 37s | 
[/results-compile-javac-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/3/artifact/out/results-compile-javac-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt)
 |  
hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1
 with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 generated 11 new + 0 
unchanged - 0 fixed = 11 total (was 0)  |
   | +1 :green_heart: |  compile  |   0m 32s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 32s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  

[GitHub] [hadoop] hadoop-yetus commented on pull request #4602: HDFS-16673. Fix usage of chown

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4602:
URL: https://github.com/apache/hadoop/pull/4602#issuecomment-1191727321

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |  13m  8s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  markdownlint  |   0m  0s |  |  markdownlint was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 35s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  25m 16s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  23m 22s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |  20m 53s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   4m 21s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   4m 28s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   3m 32s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   3m 43s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   7m 12s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  24m 28s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 34s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   2m 29s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  22m 32s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |  22m 32s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  20m 53s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |  20m 53s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   4m 15s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   4m 27s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   3m 24s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   3m 32s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   7m 23s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  24m 37s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  18m 39s |  |  hadoop-common in the patch 
passed.  |
   | -1 :x: |  unit  | 264m 28s | 
[/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4602/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt)
 |  hadoop-hdfs in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   1m 57s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 525m 52s |  |  |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | 
hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs |
   |   | hadoop.hdfs.TestDecommissionWithStripedBackoffMonitor |
   |   | hadoop.cli.TestHDFSCLI |
   |   | hadoop.hdfs.TestMaintenanceState |
   |   | hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4602/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4602 |
   | Optional Tests | dupname asflicense mvnsite codespell detsecrets 
markdownlint compile javac javadoc mvninstall unit shadedclient spotbugs 
checkstyle |
   | uname | Linux 8502e5bf1122 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 
23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 281c285c3880d48308b45a3070f6116f219cc053 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 

[GitHub] [hadoop] hadoop-yetus commented on pull request #4606: HDFS-16678. RBF should supports disable getNodeUsage() in RBFMetrics

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4606:
URL: https://github.com/apache/hadoop/pull/4606#issuecomment-1191727346

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   1m 21s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  1s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  43m  1s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 53s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |   0m 47s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   0m 41s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 53s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 59s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   1m  9s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 40s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  24m 13s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 38s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 41s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |   0m 41s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 36s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 36s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   0m 22s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   0m 39s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 37s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 56s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 26s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  23m 34s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  39m 50s |  |  hadoop-hdfs-rbf in the patch 
passed.  |
   | +1 :green_heart: |  asflicense  |   0m 44s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 147m 35s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4606/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4606 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint |
   | uname | Linux 843009c68092 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 
17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 4de60fb87f2f25087acab7b90d75cd0e20be622d |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 
/usr/lib/jvm/java-8-openjdk-amd64:Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4606/1/testReport/ |
   | Max. process+thread count | 2025 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs-rbf U: 
hadoop-hdfs-project/hadoop-hdfs-rbf |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4606/1/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message 

[GitHub] [hadoop] hadoop-yetus commented on pull request #4531: HDFS-13274. RBF: Extend RouterRpcClient to use multiple sockets

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4531:
URL: https://github.com/apache/hadoop/pull/4531#issuecomment-1191666286

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 59s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +0 :ok: |  markdownlint  |   0m  0s |  |  markdownlint was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  38m 38s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   1m  2s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |   0m 58s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   0m 52s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   1m  2s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   1m  8s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   1m 14s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 49s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  21m 25s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 42s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 44s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |   0m 44s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 40s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 40s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   0m 26s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   0m 44s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 41s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 58s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 27s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  20m 35s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  35m 39s |  |  hadoop-hdfs-rbf in the patch 
passed.  |
   | +1 :green_heart: |  asflicense  |   0m 50s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 134m 45s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4531/6/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4531 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint 
markdownlint |
   | uname | Linux 81f28fd5f6f7 4.15.0-65-generic #74-Ubuntu SMP Tue Sep 17 
17:06:04 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / b2dabf627d663a0f0b55c175a165e27eaeda8fb5 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 
/usr/lib/jvm/java-8-openjdk-amd64:Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4531/6/testReport/ |
   | Max. process+thread count | 2205 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs-rbf U: 
hadoop-hdfs-project/hadoop-hdfs-rbf |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4531/6/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To 

[jira] [Work logged] (HADOOP-17461) Add thread-level IOStatistics Context

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17461?focusedWorklogId=793785=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793785
 ]

ASF GitHub Bot logged work on HADOOP-17461:
---

Author: ASF GitHub Bot
Created on: 21/Jul/22 14:57
Start Date: 21/Jul/22 14:57
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on PR #4566:
URL: https://github.com/apache/hadoop/pull/4566#issuecomment-1191589460

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 48s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 6 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 17s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  28m 15s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  25m 14s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |  21m 50s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   4m 34s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   3m 12s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 24s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m  3s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   4m 39s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  24m 28s |  |  branch has no errors 
when building and testing our client artifacts.  |
   | -0 :warning: |  patch  |  24m 55s |  |  Used diff version of patch file. 
Binary files and potentially other changes not applied. Please rebase and 
squash commits if necessary.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 25s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 42s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  24m 19s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |  24m 19s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  21m 51s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |  21m 51s |  |  the patch passed  |
   | -1 :x: |  blanks  |   0m  0s | 
[/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4566/5/artifact/out/blanks-eol.txt)
 |  The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix 
<>. Refer https://git-scm.com/docs/git-apply  |
   | +1 :green_heart: |  checkstyle  |   4m 26s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   3m 10s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m 16s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m  5s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   4m 51s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  25m 56s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  20m  2s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   3m 16s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   1m 16s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 252m 47s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4566/5/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4566 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 0af63939de89 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 
17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / ac6a3fa248e00111cbeb94c6b767b7c2b80641c7 |
   | Default 

[jira] [Resolved] (HADOOP-18221) stream warns Not all bytes were read from the S3ObjectInputStream when closed

2022-07-21 Thread Ahmar Suhail (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18221?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ahmar Suhail resolved HADOOP-18221.
---
Resolution: Fixed

> stream warns Not all bytes were read from the S3ObjectInputStream when closed
> -
>
> Key: HADOOP-18221
> URL: https://issues.apache.org/jira/browse/HADOOP-18221
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Ahmar Suhail
>Assignee: Ahmar Suhail
>Priority: Minor
>  Labels: pull-request-available
>  Time Spent: 1h 40m
>  Remaining Estimate: 0h
>
> Issue: [https://github.com/aws/aws-sdk-java/issues/1211] has resurfaced in 
> the prefetching stream when it is closed before the reading of blocks is 
> complete. This can be fixed by draining the stream before closing it.
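
As a rough illustration of the drain-before-close idea above (a minimal sketch against a plain InputStream; the helper name and buffer size are illustrative, not the actual S3A prefetching code):

```
import java.io.IOException;
import java.io.InputStream;

public final class DrainBeforeClose {

  /** Read and discard any remaining bytes, then close the stream. */
  static void drainAndClose(InputStream in) throws IOException {
    byte[] scratch = new byte[8192];
    try {
      // Reading to EOF means the SDK sees a fully consumed stream and
      // does not warn that bytes were left unread.
      while (in.read(scratch) != -1) {
        // discard
      }
    } finally {
      in.close();
    }
  }
}
```

Draining to end-of-stream also generally lets the underlying HTTP connection be returned to the pool instead of being aborted.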



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on pull request #4566: HADOOP-17461. IOStatisticsContext + committer integration

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4566:
URL: https://github.com/apache/hadoop/pull/4566#issuecomment-1191589460

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 48s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 6 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 17s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  28m 15s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  25m 14s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |  21m 50s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   4m 34s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   3m 12s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 24s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m  3s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   4m 39s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  24m 28s |  |  branch has no errors 
when building and testing our client artifacts.  |
   | -0 :warning: |  patch  |  24m 55s |  |  Used diff version of patch file. 
Binary files and potentially other changes not applied. Please rebase and 
squash commits if necessary.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 25s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 42s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  24m 19s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |  24m 19s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  21m 51s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |  21m 51s |  |  the patch passed  |
   | -1 :x: |  blanks  |   0m  0s | 
[/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4566/5/artifact/out/blanks-eol.txt)
 |  The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix 
<>. Refer https://git-scm.com/docs/git-apply  |
   | +1 :green_heart: |  checkstyle  |   4m 26s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   3m 10s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m 16s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m  5s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   4m 51s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  25m 56s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  20m  2s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   3m 16s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   1m 16s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 252m 47s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4566/5/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4566 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 0af63939de89 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 
17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / ac6a3fa248e00111cbeb94c6b767b7c2b80641c7 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 
/usr/lib/jvm/java-8-openjdk-amd64:Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4566/5/testReport/ |
   | Max. process+thread count | 2457 (vs. ulimit of 5500) |
  

[jira] [Assigned] (HADOOP-18230) Use async drain threshold to decide b/w async and sync draining

2022-07-21 Thread Ahmar Suhail (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ahmar Suhail reassigned HADOOP-18230:
-

Assignee: Ahmar Suhail

> Use async drain threshold to decide b/w async and sync draining
> ---
>
> Key: HADOOP-18230
> URL: https://issues.apache.org/jira/browse/HADOOP-18230
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Ahmar Suhail
>Assignee: Ahmar Suhail
>Priority: Minor
>
> [https://github.com/apache/hadoop/pull/4294] introduces changes to drain the 
> stream asynchronously. [https://github.com/apache/hadoop/pull/2584] 
> introduced ASYNC_DRAIN_THRESHOLD; use this value to decide when to drain 
> asynchronously. This can be done once the prefetching branch has been rebased.
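
A hedged sketch of that decision logic: ASYNC_DRAIN_THRESHOLD is the constant introduced by the linked PR, while the method shape, parameter names, and executor below are assumptions made for illustration only:

```
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;

final class DrainDecision {

  /**
   * Drain the remaining bytes inline when the remainder is small,
   * otherwise hand the work to a background executor so close()
   * returns quickly.
   */
  static void drain(long bytesRemaining, long asyncDrainThreshold,
      Runnable drainTask, ExecutorService executor) {
    if (bytesRemaining <= asyncDrainThreshold) {
      // Small remainder: cheap to drain on the calling thread.
      drainTask.run();
    } else {
      // Large remainder: drain asynchronously in the background.
      CompletableFuture.runAsync(drainTask, executor);
    }
  }
}
```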



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-18247) Tests in ITestS3AOpenCost are failing

2022-07-21 Thread Ahmar Suhail (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18247?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ahmar Suhail resolved HADOOP-18247.
---
Resolution: Fixed

> Tests in ITestS3AOpenCost are failing
> -
>
> Key: HADOOP-18247
> URL: https://issues.apache.org/jira/browse/HADOOP-18247
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Ahmar Suhail
>Assignee: Ahmar Suhail
>Priority: Minor
>
> After rebasing, when prefetching is enabled, testOpenFileLongerLength and 
> testOpenFileShorterLength fail.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on pull request #4603: YARN-10793. Upgrade Junit from 4 to 5 in hadoop-yarn-server-applicationhistoryservice

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4603:
URL: https://github.com/apache/hadoop/pull/4603#issuecomment-1191578098

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 58s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 20 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  42m  5s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 43s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |   0m 40s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   0m 42s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 44s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 48s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 37s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 16s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  23m 54s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 29s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 30s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |   0m 30s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 28s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 28s |  |  the patch passed  |
   | -1 :x: |  blanks  |   0m  0s | 
[/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/2/artifact/out/blanks-eol.txt)
 |  The patch has 17 line(s) that end in blanks. Use git apply --whitespace=fix 
<>. Refer https://git-scm.com/docs/git-apply  |
   | -0 :warning: |  checkstyle  |   0m 21s | 
[/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/2/artifact/out/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice.txt)
 |  
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-applicationhistoryservice:
 The patch generated 11 new + 135 unchanged - 145 fixed = 146 total (was 280)  |
   | +1 :green_heart: |  mvnsite  |   0m 30s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 26s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 24s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m  3s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  25m 12s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   6m  5s |  |  
hadoop-yarn-server-applicationhistoryservice in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   1m  4s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 110m 45s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/2/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4603 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle |
   | uname | Linux 9c00db5709c7 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 
17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / efa78138a0457f1a887615e298bfe14809a37697 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 
/usr/lib/jvm/java-8-openjdk-amd64:Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  

[jira] [Work logged] (HADOOP-17461) Add thread-level IOStatistics Context

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17461?focusedWorklogId=793770=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793770
 ]

ASF GitHub Bot logged work on HADOOP-17461:
---

Author: ASF GitHub Bot
Created on: 21/Jul/22 14:35
Start Date: 21/Jul/22 14:35
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on PR #4566:
URL: https://github.com/apache/hadoop/pull/4566#issuecomment-1191563524

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 43s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 6 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 46s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  25m 29s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  23m 11s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |  20m 54s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   4m 19s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   3m 45s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 57s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m 27s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   5m  0s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  22m  8s |  |  branch has no errors 
when building and testing our client artifacts.  |
   | -0 :warning: |  patch  |  22m 40s |  |  Used diff version of patch file. 
Binary files and potentially other changes not applied. Please rebase and 
squash commits if necessary.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 31s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 46s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  22m 23s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |  22m 23s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  20m 59s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |  20m 59s |  |  the patch passed  |
   | -1 :x: |  blanks  |   0m  0s | 
[/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4566/4/artifact/out/blanks-eol.txt)
 |  The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix 
<>. Refer https://git-scm.com/docs/git-apply  |
   | +1 :green_heart: |  checkstyle  |   4m 14s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   3m 40s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m 53s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m 25s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   5m  7s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  22m 40s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  18m 45s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   3m 13s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   1m 37s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 242m 35s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4566/4/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4566 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 16cb6c89e5fb 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 
23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 40ca262a7183799ae076cef877511d597a1da3c3 |
   | Default 

[GitHub] [hadoop] hadoop-yetus commented on pull request #4566: HADOOP-17461. IOStatisticsContext + committer integration

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4566:
URL: https://github.com/apache/hadoop/pull/4566#issuecomment-1191563524

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 43s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 6 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 46s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  25m 29s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  23m 11s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |  20m 54s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   4m 19s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   3m 45s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 57s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m 27s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   5m  0s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  22m  8s |  |  branch has no errors 
when building and testing our client artifacts.  |
   | -0 :warning: |  patch  |  22m 40s |  |  Used diff version of patch file. 
Binary files and potentially other changes not applied. Please rebase and 
squash commits if necessary.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 31s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 46s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  22m 23s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |  22m 23s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  20m 59s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |  20m 59s |  |  the patch passed  |
   | -1 :x: |  blanks  |   0m  0s | 
[/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4566/4/artifact/out/blanks-eol.txt)
 |  The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix 
<>. Refer https://git-scm.com/docs/git-apply  |
   | +1 :green_heart: |  checkstyle  |   4m 14s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   3m 40s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   2m 53s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   2m 25s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   5m  7s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  22m 40s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  18m 45s |  |  hadoop-common in the patch 
passed.  |
   | +1 :green_heart: |  unit  |   3m 13s |  |  hadoop-aws in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   1m 37s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 242m 35s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4566/4/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4566 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 16cb6c89e5fb 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 
23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 40ca262a7183799ae076cef877511d597a1da3c3 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 
/usr/lib/jvm/java-8-openjdk-amd64:Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4566/4/testReport/ |
   | Max. process+thread count | 1544 (vs. ulimit of 5500) |
   

[GitHub] [hadoop] ZanderXu opened a new pull request, #4606: HDFS-16678. RBF should supports disable getNodeUsage() in RBFMetrics

2022-07-21 Thread GitBox


ZanderXu opened a new pull request, #4606:
URL: https://github.com/apache/hadoop/pull/4606

   ### Description of PR
   In our prod environment, we try to collect RBF metrics every 15s through 
jmx_exporter, and we found that the collection task often failed.
   
   After tracing, we found that the collection task is blocked in 
getNodeUsage() in RBFMetrics, because it collects every datanode's usage from 
the downstream nameservices.
   
   This is a very expensive and largely redundant operation, because in most 
scenarios each downstream nameservice contains almost the same DNs. If needed, 
we can get the usage data from any one nameservice rather than from RBF.
   
   So I feel that RBF should support disabling getNodeUsage() in RBFMetrics.
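   
   As a rough illustration of such a switch (the flag and the Supplier shape 
below are assumptions for this sketch, not the actual HDFS-16678 change):
   
   ```
   import java.util.function.Supplier;

   /**
    * Illustrative only: gate an expensive metric behind a config flag.
    * The flag name and Supplier shape are assumptions for this sketch,
    * not the actual RBFMetrics / HDFS-16678 code.
    */
   final class GatedNodeUsage {

     private final boolean enableGetDNUsage;
     private final Supplier<String> expensiveReport;

     GatedNodeUsage(boolean enableGetDNUsage, Supplier<String> expensiveReport) {
       this.enableGetDNUsage = enableGetDNUsage;
       this.expensiveReport = expensiveReport;
     }

     /** Returns the usage JSON, or a cheap placeholder when disabled. */
     String nodeUsage() {
       if (!enableGetDNUsage) {
         // Disabled: skip the cross-nameservice datanode report entirely.
         return "{}";
       }
       return expensiveReport.get();
     }
   }
   ```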
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on pull request #4604: HDFS-16674 : Improve TestDFSIO to support more filesystems

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4604:
URL: https://github.com/apache/hadoop/pull/4604#issuecomment-1191543433

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 36s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  38m 30s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 56s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |   0m 53s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   0m 59s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 59s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 57s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 45s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 20s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  20m 51s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 38s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 38s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |   0m 38s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 36s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 36s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   0m 34s | 
[/results-checkstyle-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-jobclient.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4604/1/artifact/out/results-checkstyle-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-jobclient.txt)
 |  
hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient:
 The patch generated 4 new + 42 unchanged - 0 fixed = 46 total (was 42)  |
   | +1 :green_heart: |  mvnsite  |   0m 37s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 28s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 27s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   0m 59s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  20m  7s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  | 141m 43s |  |  
hadoop-mapreduce-client-jobclient in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   1m  6s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 236m 31s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4604/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4604 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 8129d5771b31 4.15.0-156-generic #163-Ubuntu SMP Thu Aug 19 
23:31:58 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 7c1f2c7c65f60e76e68a929fde9631584d75955c |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 
/usr/lib/jvm/java-8-openjdk-amd64:Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4604/1/testReport/ |
   | Max. process+thread count | 1351 (vs. ulimit of 5500) |
   | modules | C: 
hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient
 U: 

[jira] [Updated] (HADOOP-17705) S3A to add option fs.s3a.endpoint.region to set AWS region

2022-07-21 Thread Steve Loughran (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17705?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Loughran updated HADOOP-17705:

Description: 
Currently, the AWS region is inferred from the endpoint URL, by assuming that 
the 2nd component after the "." delimiter is the region. This doesn't work for 
private links, where the default falls back to us-east-1, causing authorization 
issues w.r.t. the private link.

The option fs.s3a.endpoint.region allows the region to be set explicitly.
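For example, to pin the signing region when talking to a VPC/private-link 
endpoint (the region value below is a placeholder):

{code:xml}
<property>
  <name>fs.s3a.endpoint.region</name>
  <value>eu-west-1</value>
</property>
{code}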

h2. How to set the S3 region on older Hadoop releases

For anyone who needs to set the signing region on older versions of the S3A 
client: *you do not need this feature*. Instead, just provide a custom 
endpoint-to-region mapping JSON file

# Download the default region mapping file 
[awssdk_config_default.json|https://github.com/aws/aws-sdk-java/blob/master/aws-java-sdk-core/src/main/resources/com/amazonaws/internal/config/awssdk_config_default.json]
# Add a new regular expression to map the endpoint/hostname to the target 
region (see the example entry after this list)
# Save the file as {{/etc/hadoop/awssdk_config_override.json}}
# Verify that basic hadoop fs -ls commands work
# Copy the file to the rest of the cluster.
# There should be no need to restart any services
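For illustration, an added mapping entry might look roughly like the following 
(the hostname pattern and region are placeholders; check the exact field names 
against the downloaded default file):

{code:json}
{
  "hostRegexToRegionMappings" : [
    {
      "hostNameRegex" : "(.+\\.)?example-private-link\\.eu-west-1\\.vpce\\.amazonaws\\.com",
      "regionName" : "eu-west-1"
    }
  ]
}
{code}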


  was:
Currently, the AWS region is inferred from the endpoint URL, by assuming that 
the 2nd component after the "." delimiter is the region. This doesn't work for 
private links, where the default falls back to us-east-1, causing authorization 
issues w.r.t. the private link.

The option fs.s3a.endpoint.region allows the region to be set explicitly.

h2. How to set the S3 region on older Hadoop releases

For anyone who needs to set the signing region on older versions of the S3A 
client: *you do not need this feature*. Instead, just provide a custom 
endpoint-to-region mapping JSON file

# Download the default region mapping file 
[awssdk_config_default.json|https://github.com/aws/aws-sdk-java/blob/master/aws-java-sdk-core/src/main/resources/com/amazonaws/internal/config/awssdk_config_default.json]
# Add a new regular expression to map the endpoint/hostname to the target region
# Save the file as /etc/hadoop/awssdk_config_override.json 
# Verify that basic hadoop fs -ls commands work
# Copy the file to the rest of the cluster.
# There should be no need to restart any services



> S3A to add option fs.s3a.endpoint.region to set AWS region
> --
>
> Key: HADOOP-17705
> URL: https://issues.apache.org/jira/browse/HADOOP-17705
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3
>Reporter: Mehakmeet Singh
>Assignee: Mehakmeet Singh
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.3.2
>
>  Time Spent: 3h
>  Remaining Estimate: 0h
>
> Currently, the AWS region is inferred from the endpoint URL, by assuming that 
> the 2nd component after the "." delimiter is the region. This doesn't work for 
> private links, where the default falls back to us-east-1, causing 
> authorization issues w.r.t. the private link.
> The option fs.s3a.endpoint.region allows the region to be set explicitly.
> h2. How to set the S3 region on older Hadoop releases
> For anyone who needs to set the signing region on older versions of the S3A 
> client: *you do not need this feature*. Instead, just provide a custom 
> endpoint-to-region mapping JSON file
> # Download the default region mapping file 
> [awssdk_config_default.json|https://github.com/aws/aws-sdk-java/blob/master/aws-java-sdk-core/src/main/resources/com/amazonaws/internal/config/awssdk_config_default.json]
> # Add a new regular expression to map the endpoint/hostname to the target 
> region
> # Save the file as {{/etc/hadoop/awssdk_config_override.json}}
> # Verify that basic hadoop fs -ls commands work
> # Copy the file to the rest of the cluster.
> # There should be no need to restart any services



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-17705) S3A to add option fs.s3a.endpoint.region to set AWS region

2022-07-21 Thread Steve Loughran (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17705?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Loughran updated HADOOP-17705:

Description: 
Currently, the AWS region is inferred from the endpoint URL, by assuming that 
the 2nd component after the "." delimiter is the region. This doesn't work for 
private links, where the default falls back to us-east-1, causing authorization 
issues w.r.t. the private link.

The option fs.s3a.endpoint.region allows the region to be set explicitly.

h2. How to set the S3 region on older Hadoop releases

For anyone who needs to set the signing region on older versions of the S3A 
client: *you do not need this feature*. Instead, just provide a custom 
endpoint-to-region mapping JSON file

# Download the default region mapping file 
[awssdk_config_default.json|https://github.com/aws/aws-sdk-java/blob/master/aws-java-sdk-core/src/main/resources/com/amazonaws/internal/config/awssdk_config_default.json]
# Add a new regular expression to map the endpoint/hostname to the target region
# Save the file as /etc/hadoop/awssdk_config_override.json 
# Verify that basic hadoop fs -ls commands work
# Copy the file to the rest of the cluster.
# There should be no need to restart any services


  was:
Currently, the AWS region is inferred from the endpoint URL, by assuming that 
the 2nd component after the "." delimiter is the region. This doesn't work for 
private links, where the default falls back to us-east-1, causing authorization 
issues w.r.t. the private link.

The option fs.s3a.endpoint.region allows the region to be set explicitly.


> S3A to add option fs.s3a.endpoint.region to set AWS region
> --
>
> Key: HADOOP-17705
> URL: https://issues.apache.org/jira/browse/HADOOP-17705
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3
>Reporter: Mehakmeet Singh
>Assignee: Mehakmeet Singh
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.3.2
>
>  Time Spent: 3h
>  Remaining Estimate: 0h
>
> Currently, the AWS region is inferred from the endpoint URL, by assuming that 
> the 2nd component after the "." delimiter is the region. This doesn't work for 
> private links, where the default falls back to us-east-1, causing 
> authorization issues w.r.t. the private link.
> The option fs.s3a.endpoint.region allows the region to be set explicitly.
> h2. How to set the S3 region on older Hadoop releases
> For anyone who needs to set the signing region on older versions of the S3A 
> client: *you do not need this feature*. Instead, just provide a custom 
> endpoint-to-region mapping JSON file
> # Download the default region mapping file 
> [awssdk_config_default.json|https://github.com/aws/aws-sdk-java/blob/master/aws-java-sdk-core/src/main/resources/com/amazonaws/internal/config/awssdk_config_default.json]
> # Add a new regular expression to map the endpoint/hostname to the target 
> region
> # Save the file as /etc/hadoop/awssdk_config_override.json 
> # Verify that basic hadoop fs -ls commands work
> # Copy the file to the rest of the cluster.
> # There should be no need to restart any services



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on pull request #4531: HDFS-13274. RBF: Extend RouterRpcClient to use multiple sockets

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4531:
URL: https://github.com/apache/hadoop/pull/4531#issuecomment-1191461484

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   1m 22s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +0 :ok: |  markdownlint  |   0m  0s |  |  markdownlint was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  39m 17s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   1m  2s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |   0m 59s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   0m 52s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   1m  4s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   1m  9s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   1m 14s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 46s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  21m 13s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 42s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 43s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |   0m 43s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 40s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 40s |  |  the patch passed  |
   | -1 :x: |  blanks  |   0m  0s | 
[/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4531/5/artifact/out/blanks-eol.txt)
 |  The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix 
<>. Refer https://git-scm.com/docs/git-apply  |
   | +1 :green_heart: |  checkstyle  |   0m 26s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   0m 43s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 41s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   1m  1s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 31s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  20m 52s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  36m 12s |  |  hadoop-hdfs-rbf in the patch 
passed.  |
   | +1 :green_heart: |  asflicense  |   0m 53s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 136m 32s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4531/5/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4531 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint 
markdownlint |
   | uname | Linux 100e3f4ef007 4.15.0-65-generic #74-Ubuntu SMP Tue Sep 17 
17:06:04 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 072786b7b8daf2e7ec79951d015c619b5ac15e0b |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 
/usr/lib/jvm/java-8-openjdk-amd64:Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4531/5/testReport/ |
   | Max. process+thread count | 2212 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs-rbf U: 
hadoop-hdfs-project/hadoop-hdfs-rbf |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4531/5/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | 

[GitHub] [hadoop] Neilxzn commented on pull request #4605: HDFS-16677. Add OP_SWAP_BLOCK_LIST as an operation code in FSEditLogOpCodes.

2022-07-21 Thread GitBox


Neilxzn commented on PR #4605:
URL: https://github.com/apache/hadoop/pull/4605#issuecomment-1191454939

   @ayushtkn @jojochuang Hi, would you take some time to review it?  


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] Neilxzn opened a new pull request, #4605: HDFS-16677. Add OP_SWAP_BLOCK_LIST as an operation code in FSEditLogOpCodes.

2022-07-21 Thread GitBox


Neilxzn opened a new pull request, #4605:
URL: https://github.com/apache/hadoop/pull/4605

   
   
   ### Description of PR
   https://issues.apache.org/jira/browse/HDFS-16677
   
   Sub-task of https://issues.apache.org/jira/browse/HDFS-14978
   
   ### How was this patch tested?
   
   Add unit test testSwapBlockListEditLog.
   ### For code changes:
   Add swapBlockListEditLog and Feature.SWAP_BLOCK_LIST to NameNodeLayoutVersion.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] Neilxzn closed pull request #4590: HDFS-15006. Add OP_SWAP_BLOCK_LIST as an operation code in FSEditLogOpCodes.

2022-07-21 Thread GitBox


Neilxzn closed pull request #4590: HDFS-15006. Add OP_SWAP_BLOCK_LIST as an 
operation code in FSEditLogOpCodes.
URL: https://github.com/apache/hadoop/pull/4590


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] szilard-nemeth closed pull request #4599: YARN-11211. QueueMetrics leaks Configuration objects when validation …

2022-07-21 Thread GitBox


szilard-nemeth closed pull request #4599: YARN-11211. QueueMetrics leaks 
Configuration objects when validation …
URL: https://github.com/apache/hadoop/pull/4599


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] ferhui commented on a diff in pull request #4529: HDFS-16648. Add isDebugEnabled check for debug logs in some classes

2022-07-21 Thread GitBox


ferhui commented on code in PR #4529:
URL: https://github.com/apache/hadoop/pull/4529#discussion_r926587594


##
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/qjournal/client/QuorumJournalManager.java:
##
@@ -479,10 +479,9 @@ public void recoverUnfinalizedSegments() throws 
IOException {
 LOG.info("Successfully started new epoch " + loggers.getEpoch());
 
 if (LOG.isDebugEnabled()) {

Review Comment:
   Yes, reasonable!



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on pull request #4603: YARN-10793. Upgrade Junit from 4 to 5 in hadoop-yarn-server-applicationhistoryservice

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4603:
URL: https://github.com/apache/hadoop/pull/4603#issuecomment-1191392283

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   1m  2s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 20 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  41m  9s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   0m 42s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |   0m 39s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   0m 42s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   0m 47s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 48s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 37s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 17s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  23m 32s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 29s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 31s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |   0m 31s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 27s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 27s |  |  the patch passed  |
   | -1 :x: |  blanks  |   0m  1s | 
[/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/1/artifact/out/blanks-eol.txt)
 |  The patch has 18 line(s) that end in blanks. Use git apply --whitespace=fix 
<>. Refer https://git-scm.com/docs/git-apply  |
   | -0 :warning: |  checkstyle  |   0m 21s | 
[/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/1/artifact/out/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-applicationhistoryservice.txt)
 |  
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-applicationhistoryservice:
 The patch generated 40 new + 142 unchanged - 138 fixed = 182 total (was 280)  |
   | +1 :green_heart: |  mvnsite  |   0m 30s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 27s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 24s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m  1s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  23m  6s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   4m 38s |  |  
hadoop-yarn-server-applicationhistoryservice in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 42s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 105m 43s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4603/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4603 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle |
   | uname | Linux fc26ac2cfe23 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 
17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 0b7f6823866a084b483f303a916f094979ed71b3 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 
/usr/lib/jvm/java-8-openjdk-amd64:Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  

[GitHub] [hadoop] ZanderXu commented on a diff in pull request #4529: HDFS-16648. Add isDebugEnabled check for debug logs in some classes

2022-07-21 Thread GitBox


ZanderXu commented on code in PR #4529:
URL: https://github.com/apache/hadoop/pull/4529#discussion_r926573319


##
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/qjournal/client/QuorumJournalManager.java:
##
@@ -479,10 +479,9 @@ public void recoverUnfinalizedSegments() throws 
IOException {
 LOG.info("Successfully started new epoch " + loggers.getEpoch());
 
 if (LOG.isDebugEnabled()) {

Review Comment:
   There is string concatenation in `QuorumCall.mapToString(resps)`, so we need 
to keep this check.
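   A minimal self-contained sketch of the trade-off being discussed; 
`buildExpensiveSummary()` is a hypothetical stand-in for 
`QuorumCall.mapToString(resps)`:
   
   ```java
   import org.slf4j.Logger;
   import org.slf4j.LoggerFactory;
   
   public class DebugGuardExample {
     private static final Logger LOG =
         LoggerFactory.getLogger(DebugGuardExample.class);
   
     // stands in for an argument that is expensive to build
     static String buildExpensiveSummary() {
       return "...lots of concatenated response text...";
     }
   
     public static void main(String[] args) {
       // parameterized logging defers message formatting, but the argument
       // expression is still evaluated before debug() is invoked
       LOG.debug("responses: {}", buildExpensiveSummary());
   
       // the isDebugEnabled() guard skips the expensive call entirely when
       // DEBUG is off, which is why the guard is kept in this case
       if (LOG.isDebugEnabled()) {
         LOG.debug("responses: {}", buildExpensiveSummary());
       }
     }
   }
   ```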



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] ZanderXu commented on a diff in pull request #4529: HDFS-16648. Add isDebugEnabled check for debug logs in some classes

2022-07-21 Thread GitBox


ZanderXu commented on code in PR #4529:
URL: https://github.com/apache/hadoop/pull/4529#discussion_r926573133


##
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/net/NetworkTopology.java:
##
@@ -539,10 +538,12 @@ protected Node chooseRandom(final String scope, String 
excludedScope,
 netlock.readLock().unlock();
   }
 }
-LOG.debug("Choosing random from {} available nodes on node {},"
-+ " scope={}, excludedScope={}, excludeNodes={}. numOfDatanodes={}.",
-availableNodes, innerNode, scope, excludedScope, excludedNodes,
-numOfDatanodes);
+if (LOG.isDebugEnabled()) {

Review Comment:
   There is string concatenation here, so we need to keep this check.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] ferhui commented on a diff in pull request #4529: HDFS-16648. Add isDebugEnabled check for debug logs in some classes

2022-07-21 Thread GitBox


ferhui commented on code in PR #4529:
URL: https://github.com/apache/hadoop/pull/4529#discussion_r926566222


##
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/net/NetworkTopology.java:
##
@@ -539,10 +538,12 @@ protected Node chooseRandom(final String scope, String 
excludedScope,
 netlock.readLock().unlock();
   }
 }
-LOG.debug("Choosing random from {} available nodes on node {},"
-+ " scope={}, excludedScope={}, excludeNodes={}. numOfDatanodes={}.",
-availableNodes, innerNode, scope, excludedScope, excludedNodes,
-numOfDatanodes);
+if (LOG.isDebugEnabled()) {

Review Comment:
   Here



##
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/qjournal/client/QuorumJournalManager.java:
##
@@ -479,10 +479,9 @@ public void recoverUnfinalizedSegments() throws 
IOException {
 LOG.info("Successfully started new epoch " + loggers.getEpoch());
 
 if (LOG.isDebugEnabled()) {

Review Comment:
   and here



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] slfan1989 commented on pull request #4540: YARN-11160. Support getResourceProfiles, getResourceProfile API's for Federation

2022-07-21 Thread GitBox


slfan1989 commented on PR #4540:
URL: https://github.com/apache/hadoop/pull/4540#issuecomment-1191348416

   @goiri Can you help merge this PR to the trunk branch? Thank you very much! I 
will follow up with YARN-11161.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Work logged] (HADOOP-17461) Add thread-level IOStatistics Context

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17461?focusedWorklogId=793670=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793670
 ]

ASF GitHub Bot logged work on HADOOP-17461:
---

Author: ASF GitHub Bot
Created on: 21/Jul/22 10:38
Start Date: 21/Jul/22 10:38
Worklog Time Spent: 10m 
  Work Description: steveloughran commented on PR #4352:
URL: https://github.com/apache/hadoop/pull/4352#issuecomment-1191326991

   This is my draft commit message btw
   
   
   
   
   Adds a new IOStatistics class IOStatisticsContext.
   
   This is the active collector of thread-level statistics for
   the current thread.
   
   The S3A Filesystem's input and output streams, and listing
   operations, all update this context when close() is called on
   them (and not before!), so there is effectively automatic
   aggregation of all IO statistics performed by a single thread.
   
   The IOStatisticsContext of a thread can be retrieved and
   cached for invocation in other threads. Holding such a
   reference also ensures that the context will not be garbage
   collected.
   
   To collect statistics on a thread:
   
   1. Retrieve the active context with a call to
  IOStatisticsContext.getCurrentIOStatisticsContext()
   2. Call IOStatisticsContext.reset() to reset all statistics
   3. Call getIOStatistics() on it for the latest values, or
  snapshot() for a snapshot of them.
 
   To instrument filesystem objects for thread-level
   IOStatistics
   
   1. Cache the current IOStatisticsContext context or just its
  aggregator in the object constructor.
   2. In the close() operation, aggregate() the object's own statistics.
   3. Pass the context into worker threads performing work
  on behalf of this thread, through
 IOStatisticsContext.setThreadIOStatisticsContext();
 set it to null afterwards.
   
   TaskPool does the context propagation and reset
   automatically.
   
   Contributed by Mehakmeet Singh
   
   ---
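   A hedged sketch of the "collect statistics on a thread" steps above, using 
the API names from this PR (they may still change before merge):
   
   ```java
   import org.apache.hadoop.fs.statistics.IOStatisticsContext;
   import org.apache.hadoop.fs.statistics.IOStatisticsSnapshot;
   
   import static org.apache.hadoop.fs.statistics.IOStatisticsLogging.ioStatisticsToPrettyString;
   
   public class ThreadIOStatsExample {
     public static void main(String[] args) {
       // 1. the active collector for the current thread
       IOStatisticsContext context =
           IOStatisticsContext.getCurrentIOStatisticsContext();
       // 2. start this unit of work from zero
       context.reset();
   
       // ... perform S3A reads/listings on this thread; their statistics are
       // aggregated into the context when the streams/iterators are closed ...
   
       // 3. take a point-in-time copy of the aggregated statistics
       IOStatisticsSnapshot snapshot = context.snapshot();
       System.out.println(ioStatisticsToPrettyString(snapshot));
     }
   }
   ```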




Issue Time Tracking
---

Worklog Id: (was: 793670)
Time Spent: 7h 40m  (was: 7.5h)

> Add thread-level IOStatistics Context
> -
>
> Key: HADOOP-17461
> URL: https://issues.apache.org/jira/browse/HADOOP-17461
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs, fs/azure, fs/s3
>Affects Versions: 3.3.1
>Reporter: Steve Loughran
>Assignee: Mehakmeet Singh
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 7h 40m
>  Remaining Estimate: 0h
>
> For effective reporting of the iostatistics of individual worker threads, we 
> need a thread-level context which IO components update.
> * this context needs to be passed into background threads performing work on 
> behalf of a task.
> * IO Components (streams, iterators, filesystems) need to update this 
> context's statistics as they perform work
> * Without double counting anything.
> I imagine a ThreadLocal IOStatisticContext which will be updated in the 
> FileSystem API Calls. This context MUST be passed into the background threads 
> used by a task, so that IO is correctly aggregated.
> I don't want streams, listIterators  to do the updating as there is more 
> risk of double counting. However, we need to see their statistics if we want 
> to know things like "bytes discarded in backwards seeks". And I don't want to 
> be updating a shared context object on every read() call.
> If all we want is store IO (HEAD, GET, DELETE, list performance etc) then the 
> FS is sufficient. 
> If we do want the stream-specific detail, then I propose
> * caching the context in the constructor
> * updating it only in close() or unbuffer() (as we do from S3AInputStream to 
> S3AInstrumentation)
> * excluding those we know the FS already collects.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] steveloughran commented on pull request #4352: HADOOP-17461. Thread-level IOStatistics in S3A

2022-07-21 Thread GitBox


steveloughran commented on PR #4352:
URL: https://github.com/apache/hadoop/pull/4352#issuecomment-1191326991

   This is my draft commit message btw
   
   
   
   
   Adds a new IOStatistics class IOStatisticsContext.
   
   This is the active collector of thread-level statistics for
   the current thread.
   
   The S3A Filesystem's input and output streams, and listing
   operations, all update this context when close() is called on
   them (and not before!), so there is effectively automatic
   aggregation of all IO statistics performed by a single thread.
   
   The IOStatisticsContext of a thread can be retrieved and
   cached for invocation in other threads. Holding such a
   reference also ensures that the context will not be garbage
   collected.
   
   To collect statistics on a thread:
   
   1. Retrieve the active context with a call to
  IOStatisticsContext.getCurrentIOStatisticsContext()
   2. Call IOStatisticsContext.reset() to reset all statistics
   3. Call getIOStatistics() on it for the latest values, or
  snapshot() for a snapshot of them.
 
   To instrument filesystem objects for thread-level
   IOStatistics
   
   1. Cache the current IOStatisticsContext context or just its
  aggregator in the object constructor.
   2. In the close() operation, aggregate() the object's own statistics.
   3. Pass the context into worker threads performing work
  on behalf of this thread, through
 IOStatisticsContext.setThreadIOStatisticsContext();
 set it to null afterwards.
   
   TaskPool does the context propagation and reset
   automatically.
   
   Contributed by Mehakmeet Singh
   
   ---


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Work logged] (HADOOP-17461) Add thread-level IOStatistics Context

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17461?focusedWorklogId=793665=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793665
 ]

ASF GitHub Bot logged work on HADOOP-17461:
---

Author: ASF GitHub Bot
Created on: 21/Jul/22 10:33
Start Date: 21/Jul/22 10:33
Worklog Time Spent: 10m 
  Work Description: steveloughran commented on PR #4352:
URL: https://github.com/apache/hadoop/pull/4352#issuecomment-1191322172

   I was writing a commit message with instructions on use, realised that the 
task pools' sharing meant we needed a way to reset the context, then that the 
pool should do this... adding that, I concluded that the TaskPool should 
automatically pick up and propagate the context, which it now does.
   
   No new tests, not tested at all...




Issue Time Tracking
---

Worklog Id: (was: 793665)
Time Spent: 7.5h  (was: 7h 20m)

> Add thread-level IOStatistics Context
> -
>
> Key: HADOOP-17461
> URL: https://issues.apache.org/jira/browse/HADOOP-17461
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs, fs/azure, fs/s3
>Affects Versions: 3.3.1
>Reporter: Steve Loughran
>Assignee: Mehakmeet Singh
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 7.5h
>  Remaining Estimate: 0h
>
> For effective reporting of the iostatistics of individual worker threads, we 
> need a thread-level context which IO components update.
> * this context needs to be passed into background threads performing work on 
> behalf of a task.
> * IO Components (streams, iterators, filesystems) need to update this 
> context's statistics as they perform work
> * Without double counting anything.
> I imagine a ThreadLocal IOStatisticContext which will be updated in the 
> FileSystem API Calls. This context MUST be passed into the background threads 
> used by a task, so that IO is correctly aggregated.
> I don't want streams, listIterators  to do the updating as there is more 
> risk of double counting. However, we need to see their statistics if we want 
> to know things like "bytes discarded in backwards seeks". And I don't want to 
> be updating a shared context object on every read() call.
> If all we want is store IO (HEAD, GET, DELETE, list performance etc) then the 
> FS is sufficient. 
> If we do want the stream-specific detail, then I propose
> * caching the context in the constructor
> * updating it only in close() or unbuffer() (as we do from S3AInputStream to 
> S3AInstrumentation)
> * excluding those we know the FS already collects.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] steveloughran commented on pull request #4352: HADOOP-17461. Thread-level IOStatistics in S3A

2022-07-21 Thread GitBox


steveloughran commented on PR #4352:
URL: https://github.com/apache/hadoop/pull/4352#issuecomment-1191322172

   I was writing a commit message with instructions on use, realised that the 
task pools' sharing meant we needed a way to reset the context, then that the 
pool should do this... adding that, I concluded that the TaskPool should 
automatically pick up and propagate the context, which it now does.
   
   No new tests, not tested at all...


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Work logged] (HADOOP-18333) hadoop-client-runtime impact by CVE-2022-2047 due to shaded jetty

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18333?focusedWorklogId=793664=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793664
 ]

ASF GitHub Bot logged work on HADOOP-18333:
---

Author: ASF GitHub Bot
Created on: 21/Jul/22 10:23
Start Date: 21/Jul/22 10:23
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on PR #4600:
URL: https://github.com/apache/hadoop/pull/4600#issuecomment-1191312830

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 51s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +0 :ok: |  shelldocs  |   0m  0s |  |  Shelldocs was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
    _ branch-3.3 Compile Tests _ |
   | +0 :ok: |  mvndep  |  14m 41s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  27m  0s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  compile  |  18m 51s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  checkstyle  |   3m 18s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  mvnsite  |  21m  1s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  javadoc  |   7m  4s |  |  branch-3.3 passed  |
   | +0 :ok: |  spotbugs  |   0m 27s |  |  branch/hadoop-project no spotbugs 
output file (spotbugsXml.xml)  |
   | +1 :green_heart: |  shadedclient  |  59m 31s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 56s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |  25m 42s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  18m 12s |  |  the patch passed  |
   | +1 :green_heart: |  javac  |  18m 12s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   3m 12s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |  20m 33s |  |  the patch passed  |
   | +1 :green_heart: |  shellcheck  |   0m  0s |  |  No new issues.  |
   | +1 :green_heart: |  javadoc  |   6m 57s |  |  the patch passed  |
   | +0 :ok: |  spotbugs  |   0m 25s |  |  hadoop-project has no data from 
spotbugs  |
   | +1 :green_heart: |  shadedclient  |  60m  6s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | -1 :x: |  unit  | 773m 58s | 
[/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4600/1/artifact/out/patch-unit-root.txt)
 |  root in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   2m 14s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 1074m 26s |  |  |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | hadoop.hdfs.server.federation.router.TestRouterRpc |
   |   | hadoop.hdfs.server.federation.router.TestRouterRpcMultiDestination |
   |   | hadoop.hdfs.server.namenode.snapshot.TestRandomOpsWithSnapshots |
   |   | hadoop.hdfs.server.namenode.snapshot.TestSnapshot |
   |   | hadoop.hdfs.server.blockmanagement.TestBlockStatsMXBean |
   |   | hadoop.hdfs.server.namenode.snapshot.TestSnapshotRename |
   |   | hadoop.hdfs.tools.TestDFSAdmin |
   |   | hadoop.hdfs.server.namenode.TestSecureNameNode |
   |   | hadoop.hdfs.server.blockmanagement.TestUnderReplicatedBlocks |
   |   | hadoop.hdfs.server.namenode.TestSecurityTokenEditLog |
   |   | hadoop.hdfs.server.namenode.TestFileTruncate |
   |   | hadoop.hdfs.server.namenode.TestFSImage |
   |   | hadoop.hdfs.server.namenode.TestFsck |
   |   | hadoop.hdfs.server.namenode.TestBlockPlacementPolicyRackFaultTolerant |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4600/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4600 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint 
shellcheck shelldocs |
   | uname | Linux 498b721f9712 4.15.0-175-generic 

[GitHub] [hadoop] hadoop-yetus commented on pull request #4600: HADOOP-18333. Upgrade jetty version to 9.4.48.v20220622

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4600:
URL: https://github.com/apache/hadoop/pull/4600#issuecomment-1191312830

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 51s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +0 :ok: |  shelldocs  |   0m  0s |  |  Shelldocs was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
    _ branch-3.3 Compile Tests _ |
   | +0 :ok: |  mvndep  |  14m 41s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  27m  0s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  compile  |  18m 51s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  checkstyle  |   3m 18s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  mvnsite  |  21m  1s |  |  branch-3.3 passed  |
   | +1 :green_heart: |  javadoc  |   7m  4s |  |  branch-3.3 passed  |
   | +0 :ok: |  spotbugs  |   0m 27s |  |  branch/hadoop-project no spotbugs 
output file (spotbugsXml.xml)  |
   | +1 :green_heart: |  shadedclient  |  59m 31s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 56s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |  25m 42s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  18m 12s |  |  the patch passed  |
   | +1 :green_heart: |  javac  |  18m 12s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   3m 12s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |  20m 33s |  |  the patch passed  |
   | +1 :green_heart: |  shellcheck  |   0m  0s |  |  No new issues.  |
   | +1 :green_heart: |  javadoc  |   6m 57s |  |  the patch passed  |
   | +0 :ok: |  spotbugs  |   0m 25s |  |  hadoop-project has no data from 
spotbugs  |
   | +1 :green_heart: |  shadedclient  |  60m  6s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | -1 :x: |  unit  | 773m 58s | 
[/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4600/1/artifact/out/patch-unit-root.txt)
 |  root in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   2m 14s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 1074m 26s |  |  |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | hadoop.hdfs.server.federation.router.TestRouterRpc |
   |   | hadoop.hdfs.server.federation.router.TestRouterRpcMultiDestination |
   |   | hadoop.hdfs.server.namenode.snapshot.TestRandomOpsWithSnapshots |
   |   | hadoop.hdfs.server.namenode.snapshot.TestSnapshot |
   |   | hadoop.hdfs.server.blockmanagement.TestBlockStatsMXBean |
   |   | hadoop.hdfs.server.namenode.snapshot.TestSnapshotRename |
   |   | hadoop.hdfs.tools.TestDFSAdmin |
   |   | hadoop.hdfs.server.namenode.TestSecureNameNode |
   |   | hadoop.hdfs.server.blockmanagement.TestUnderReplicatedBlocks |
   |   | hadoop.hdfs.server.namenode.TestSecurityTokenEditLog |
   |   | hadoop.hdfs.server.namenode.TestFileTruncate |
   |   | hadoop.hdfs.server.namenode.TestFSImage |
   |   | hadoop.hdfs.server.namenode.TestFsck |
   |   | hadoop.hdfs.server.namenode.TestBlockPlacementPolicyRackFaultTolerant |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4600/1/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4600 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint 
shellcheck shelldocs |
   | uname | Linux 498b721f9712 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 
17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | branch-3.3 / 9d6326071acb9e4859a0be5d5bccb0bce2948ffd |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4600/1/testReport/ |
   | Max. process+thread count | 

[GitHub] [hadoop] gvijay452 opened a new pull request, #4604: HDFS-16674 : Improve TestDFSIO to support more filesystems

2022-07-21 Thread GitBox


gvijay452 opened a new pull request, #4604:
URL: https://github.com/apache/hadoop/pull/4604

   
   
   ### Description of PR
   Added support for more filesystems. This lets the same TestDFSIO be used for 
testing filesystems other than HDFS.
   
   ### How was this patch tested?
   Tested with the OCI filesystem after building the jar with the new changes.
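   For reference, a hedged example of pointing the benchmark at a non-HDFS 
store by overriding its base directory (the s3a URI is a placeholder; check 
`TestDFSIO -help` for the exact options of your build):
   
   ```bash
   hadoop jar hadoop-mapreduce-client-jobclient-*-tests.jar TestDFSIO \
     -Dtest.build.data=s3a://example-bucket/benchmarks/TestDFSIO \
     -write -nrFiles 8 -size 128MB
   ```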
   
   ### For code changes:
   
   - [ ] Does the title or this PR starts with the corresponding JIRA issue id 
(e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Work logged] (HADOOP-17461) Add thread-level IOStatistics Context

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17461?focusedWorklogId=793659=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793659
 ]

ASF GitHub Bot logged work on HADOOP-17461:
---

Author: ASF GitHub Bot
Created on: 21/Jul/22 10:14
Start Date: 21/Jul/22 10:14
Worklog Time Spent: 10m 
  Work Description: steveloughran commented on code in PR #4352:
URL: https://github.com/apache/hadoop/pull/4352#discussion_r926503951


##
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/statistics/impl/IOStatisticsContextIntegration.java:
##
@@ -0,0 +1,151 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ *  or more contributor license agreements.  See the NOTICE file
+ *  distributed with this work for additional information
+ *  regarding copyright ownership.  The ASF licenses this file
+ *  to you under the Apache License, Version 2.0 (the
+ *  "License"); you may not use this file except in compliance
+ *  with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ *  Unless required by applicable law or agreed to in writing, software
+ *  distributed under the License is distributed on an "AS IS" BASIS,
+ *  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ *  See the License for the specific language governing permissions and
+ *  limitations under the License.
+ */
+
+package org.apache.hadoop.fs.statistics.impl;
+
+import java.lang.ref.WeakReference;
+import java.util.concurrent.atomic.AtomicLong;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import org.apache.hadoop.classification.VisibleForTesting;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.impl.WeakReferenceThreadMap;
+import org.apache.hadoop.fs.statistics.IOStatisticsContext;
+
+import static 
org.apache.hadoop.fs.CommonConfigurationKeys.THREAD_LEVEL_IOSTATISTICS_ENABLED;
+import static 
org.apache.hadoop.fs.CommonConfigurationKeys.THREAD_LEVEL_IOSTATISTICS_ENABLED_DEFAULT;
+
+/**
+ * A utility class for IOStatisticsContext which helps in creating and
+ * getting the current active context. Static methods in this class allow
+ * callers to get the current context and start aggregating IOStatistics.
+ *
+ * A static initializer works out whether the feature to collect
+ * thread-level IOStatistics is enabled and selects the corresponding
+ * implementation class.
+ *
+ * A weak-reference thread map keeps track of the per-thread contexts so
+ * that long-lived references do not leak memory; the references are
+ * cleaned up at GC.
+ */
+public final class IOStatisticsContextIntegration {
+
+  private static final Logger LOG =
+  LoggerFactory.getLogger(IOStatisticsContextIntegration.class);
+
+  /**
+   * Is thread-level IO Statistics enabled?
+   */
+  private static final boolean IS_THREAD_IOSTATS_ENABLED;
+
+  /**
+   * ID for next instance to create.
+   */
+  public static final AtomicLong INSTANCE_ID = new AtomicLong(1);
+
+  /**
+   * Active IOStatistics Context containing different worker thread's
+   * statistics. Weak Reference so that it gets cleaned up during GC and we
+   * avoid any memory leak issues due to long lived references.
+   */
+  private static final WeakReferenceThreadMap<IOStatisticsContext>
+  ACTIVE_IOSTATS_CONTEXT =
+  new WeakReferenceThreadMap<>(
+  IOStatisticsContextIntegration::createNewInstance,
+  IOStatisticsContextIntegration::referenceLostContext
+  );
+
+  static {
+// Work out if the current context has thread level IOStatistics enabled.
+final Configuration configuration = new Configuration();
+IS_THREAD_IOSTATS_ENABLED =
+configuration.getBoolean(THREAD_LEVEL_IOSTATISTICS_ENABLED,
+THREAD_LEVEL_IOSTATISTICS_ENABLED_DEFAULT);
+  }
+
+  /**
+   * Private constructor for a utility class to be used in IOStatisticsContext.
+   */
+  private IOStatisticsContextIntegration() {}
+
+  /**
+   * Creating a new IOStatisticsContext instance for a FS to be used.
+   * @param key Thread ID that represents which thread the context belongs to.
+   * @return an instance of IOStatisticsContext.
+   */
+  private static IOStatisticsContext createNewInstance(Long key) {
+return new IOStatisticsContextImpl(key, INSTANCE_ID.getAndIncrement());
+  }
+
+  /**
+   * In case of reference loss for IOStatisticsContext.
+   * @param key ThreadID.
+   */
+  private static void referenceLostContext(Long key) {
+LOG.debug("Reference lost for threadID for the context: {}", key);
+  }
+
+  /**
+   * Get the current thread's IOStatisticsContext instance. If no instance is
+   * present for this thread ID, create one using the factory.
+   * @return instance of IOStatisticsContext.
+   */
+  public static IOStatisticsContext getCurrentIOStatisticsContext() {
+return IS_THREAD_IOSTATS_ENABLED
+? ACTIVE_IOSTATS_CONTEXT.getForCurrentThread()
+: EmptyIOStatisticsContextImpl.getInstance();
+  }
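
For context, a minimal usage sketch (hypothetical worker method; it assumes the 
getCurrentIOStatisticsContext() entry point shown in the diff above, and that 
IOStatisticsContext exposes its aggregate via getIOStatistics()) of how a worker 
thread would pick up and report its thread-level statistics:

```java
import org.apache.hadoop.fs.statistics.IOStatisticsContext;
import org.apache.hadoop.fs.statistics.impl.IOStatisticsContextIntegration;

public class WorkerExample {

  public void runTask() {
    // Each thread gets (or lazily creates) its own context; when the feature
    // is disabled, an empty no-op context is returned instead.
    IOStatisticsContext context =
        IOStatisticsContextIntegration.getCurrentIOStatisticsContext();

    // ... perform filesystem I/O on this thread ...

    // The statistics aggregated for this thread can then be inspected/logged.
    System.out.println(context.getIOStatistics());
  }
}
```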

[GitHub] [hadoop] steveloughran commented on a diff in pull request #4352: HADOOP-17461. Thread-level IOStatistics in S3A

2022-07-21 Thread GitBox


steveloughran commented on code in PR #4352:
URL: https://github.com/apache/hadoop/pull/4352#discussion_r926503951


##
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/statistics/impl/IOStatisticsContextIntegration.java:
##
@@ -0,0 +1,151 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ *  or more contributor license agreements.  See the NOTICE file
+ *  distributed with this work for additional information
+ *  regarding copyright ownership.  The ASF licenses this file
+ *  to you under the Apache License, Version 2.0 (the
+ *  "License"); you may not use this file except in compliance
+ *  with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ *  Unless required by applicable law or agreed to in writing, software
+ *  distributed under the License is distributed on an "AS IS" BASIS,
+ *  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ *  See the License for the specific language governing permissions and
+ *  limitations under the License.
+ */
+
+package org.apache.hadoop.fs.statistics.impl;
+
+import java.lang.ref.WeakReference;
+import java.util.concurrent.atomic.AtomicLong;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import org.apache.hadoop.classification.VisibleForTesting;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.impl.WeakReferenceThreadMap;
+import org.apache.hadoop.fs.statistics.IOStatisticsContext;
+
+import static 
org.apache.hadoop.fs.CommonConfigurationKeys.THREAD_LEVEL_IOSTATISTICS_ENABLED;
+import static 
org.apache.hadoop.fs.CommonConfigurationKeys.THREAD_LEVEL_IOSTATISTICS_ENABLED_DEFAULT;
+
+/**
+ * A utility class for IOStatisticsContext which helps in creating and
+ * getting the current active context. Static methods in this class allow
+ * callers to get the current context and start aggregating IOStatistics.
+ *
+ * A static initializer works out whether the feature to collect
+ * thread-level IOStatistics is enabled and selects the corresponding
+ * implementation class.
+ *
+ * A weak-reference thread map keeps track of the per-thread contexts so
+ * that long-lived references do not leak memory; the references are
+ * cleaned up at GC.
+ */
+public final class IOStatisticsContextIntegration {
+
+  private static final Logger LOG =
+  LoggerFactory.getLogger(IOStatisticsContextIntegration.class);
+
+  /**
+   * Is thread-level IO Statistics enabled?
+   */
+  private static final boolean IS_THREAD_IOSTATS_ENABLED;
+
+  /**
+   * ID for next instance to create.
+   */
+  public static final AtomicLong INSTANCE_ID = new AtomicLong(1);
+
+  /**
+   * Active IOStatistics Context containing different worker thread's
+   * statistics. Weak Reference so that it gets cleaned up during GC and we
+   * avoid any memory leak issues due to long lived references.
+   */
+  private static final WeakReferenceThreadMap<IOStatisticsContext>
+  ACTIVE_IOSTATS_CONTEXT =
+  new WeakReferenceThreadMap<>(
+  IOStatisticsContextIntegration::createNewInstance,
+  IOStatisticsContextIntegration::referenceLostContext
+  );
+
+  static {
+// Work out if the current context has thread level IOStatistics enabled.
+final Configuration configuration = new Configuration();
+IS_THREAD_IOSTATS_ENABLED =
+configuration.getBoolean(THREAD_LEVEL_IOSTATISTICS_ENABLED,
+THREAD_LEVEL_IOSTATISTICS_ENABLED_DEFAULT);
+  }
+
+  /**
+   * Private constructor for a utility class to be used in IOStatisticsContext.
+   */
+  private IOStatisticsContextIntegration() {}
+
+  /**
+   * Creating a new IOStatisticsContext instance for a FS to be used.
+   * @param key Thread ID that represents which thread the context belongs to.
+   * @return an instance of IOStatisticsContext.
+   */
+  private static IOStatisticsContext createNewInstance(Long key) {
+return new IOStatisticsContextImpl(key, INSTANCE_ID.getAndIncrement());
+  }
+
+  /**
+   * In case of reference loss for IOStatisticsContext.
+   * @param key ThreadID.
+   */
+  private static void referenceLostContext(Long key) {
+LOG.debug("Reference lost for threadID for the context: {}", key);
+  }
+
+  /**
+   * Get the current thread's IOStatisticsContext instance. If no instance is
+   * present for this thread ID, create one using the factory.
+   * @return instance of IOStatisticsContext.
+   */
+  public static IOStatisticsContext getCurrentIOStatisticsContext() {
+return IS_THREAD_IOSTATS_ENABLED
+? ACTIVE_IOSTATS_CONTEXT.getForCurrentThread()
+: EmptyIOStatisticsContextImpl.getInstance();
+  }
+
+  /**
+   * Set the IOStatisticsContext for the current thread.
+   * @param statisticsContext IOStatistics context instance for the
+   * current thread.
+   */
+  public static void setThreadIOStatisticsContext(
+  IOStatisticsContext statisticsContext) {
+if 

[jira] [Work logged] (HADOOP-17461) Add thread-level IOStatistics Context

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17461?focusedWorklogId=793658=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793658
 ]

ASF GitHub Bot logged work on HADOOP-17461:
---

Author: ASF GitHub Bot
Created on: 21/Jul/22 10:12
Start Date: 21/Jul/22 10:12
Worklog Time Spent: 10m 
  Work Description: steveloughran commented on code in PR #4352:
URL: https://github.com/apache/hadoop/pull/4352#discussion_r926485594


##
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/statistics/impl/IOStatisticsContextIntegration.java:
##
@@ -0,0 +1,151 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ *  or more contributor license agreements.  See the NOTICE file
+ *  distributed with this work for additional information
+ *  regarding copyright ownership.  The ASF licenses this file
+ *  to you under the Apache License, Version 2.0 (the
+ *  "License"); you may not use this file except in compliance
+ *  with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ *  Unless required by applicable law or agreed to in writing, software
+ *  distributed under the License is distributed on an "AS IS" BASIS,
+ *  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ *  See the License for the specific language governing permissions and
+ *  limitations under the License.
+ */
+
+package org.apache.hadoop.fs.statistics.impl;
+
+import java.lang.ref.WeakReference;
+import java.util.concurrent.atomic.AtomicLong;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import org.apache.hadoop.classification.VisibleForTesting;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.impl.WeakReferenceThreadMap;
+import org.apache.hadoop.fs.statistics.IOStatisticsContext;
+
+import static 
org.apache.hadoop.fs.CommonConfigurationKeys.THREAD_LEVEL_IOSTATISTICS_ENABLED;
+import static 
org.apache.hadoop.fs.CommonConfigurationKeys.THREAD_LEVEL_IOSTATISTICS_ENABLED_DEFAULT;
+
+/**
+ * A utility class for IOStatisticsContext which helps in creating and
+ * getting the current active context. Static methods in this class allow
+ * callers to get the current context and start aggregating IOStatistics.
+ *
+ * A static initializer works out whether the feature to collect
+ * thread-level IOStatistics is enabled and selects the corresponding
+ * implementation class.
+ *
+ * A weak-reference thread map keeps track of the per-thread contexts so
+ * that long-lived references do not leak memory; the references are
+ * cleaned up at GC.
+ */
+public final class IOStatisticsContextIntegration {
+
+  private static final Logger LOG =
+  LoggerFactory.getLogger(IOStatisticsContextIntegration.class);
+
+  /**
+   * Is thread-level IO Statistics enabled?
+   */
+  private static final boolean IS_THREAD_IOSTATS_ENABLED;
+
+  /**
+   * ID for next instance to create.
+   */
+  public static final AtomicLong INSTANCE_ID = new AtomicLong(1);
+
+  /**
+   * Active IOStatistics Context containing different worker thread's
+   * statistics. Weak Reference so that it gets cleaned up during GC and we
+   * avoid any memory leak issues due to long lived references.
+   */
+  private static final WeakReferenceThreadMap<IOStatisticsContext>
+  ACTIVE_IOSTATS_CONTEXT =
+  new WeakReferenceThreadMap<>(
+  IOStatisticsContextIntegration::createNewInstance,
+  IOStatisticsContextIntegration::referenceLostContext
+  );
+
+  static {
+// Work out if the current context has thread level IOStatistics enabled.
+final Configuration configuration = new Configuration();
+IS_THREAD_IOSTATS_ENABLED =
+configuration.getBoolean(THREAD_LEVEL_IOSTATISTICS_ENABLED,
+THREAD_LEVEL_IOSTATISTICS_ENABLED_DEFAULT);
+  }
+
+  /**
+   * Private constructor for a utility class to be used in IOStatisticsContext.
+   */
+  private IOStatisticsContextIntegration() {}
+
+  /**
+   * Creating a new IOStatisticsContext instance for a FS to be used.
+   * @param key Thread ID that represents which thread the context belongs to.
+   * @return an instance of IOStatisticsContext.
+   */
+  private static IOStatisticsContext createNewInstance(Long key) {
+return new IOStatisticsContextImpl(key, INSTANCE_ID.getAndIncrement());
+  }
+
+  /**
+   * In case of reference loss for IOStatisticsContext.
+   * @param key ThreadID.
+   */
+  private static void referenceLostContext(Long key) {
+LOG.debug("Reference lost for threadID for the context: {}", key);
+  }
+
+  /**
+   * Get the current thread's IOStatisticsContext instance. If no instance is
+   * present for this thread ID, create one using the factory.
+   * @return instance of IOStatisticsContext.
+   */
+  public static IOStatisticsContext getCurrentIOStatisticsContext() {
+return IS_THREAD_IOSTATS_ENABLED
+? ACTIVE_IOSTATS_CONTEXT.getForCurrentThread()
+: EmptyIOStatisticsContextImpl.getInstance();
+  }

[GitHub] [hadoop] steveloughran commented on a diff in pull request #4352: HADOOP-17461. Thread-level IOStatistics in S3A

2022-07-21 Thread GitBox


steveloughran commented on code in PR #4352:
URL: https://github.com/apache/hadoop/pull/4352#discussion_r926485594


##
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/statistics/impl/IOStatisticsContextIntegration.java:
##
@@ -0,0 +1,151 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ *  or more contributor license agreements.  See the NOTICE file
+ *  distributed with this work for additional information
+ *  regarding copyright ownership.  The ASF licenses this file
+ *  to you under the Apache License, Version 2.0 (the
+ *  "License"); you may not use this file except in compliance
+ *  with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ *  Unless required by applicable law or agreed to in writing, software
+ *  distributed under the License is distributed on an "AS IS" BASIS,
+ *  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ *  See the License for the specific language governing permissions and
+ *  limitations under the License.
+ */
+
+package org.apache.hadoop.fs.statistics.impl;
+
+import java.lang.ref.WeakReference;
+import java.util.concurrent.atomic.AtomicLong;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import org.apache.hadoop.classification.VisibleForTesting;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.impl.WeakReferenceThreadMap;
+import org.apache.hadoop.fs.statistics.IOStatisticsContext;
+
+import static 
org.apache.hadoop.fs.CommonConfigurationKeys.THREAD_LEVEL_IOSTATISTICS_ENABLED;
+import static 
org.apache.hadoop.fs.CommonConfigurationKeys.THREAD_LEVEL_IOSTATISTICS_ENABLED_DEFAULT;
+
+/**
+ * A utility class for IOStatisticsContext which helps in creating and
+ * getting the current active context. Static methods in this class allow
+ * callers to get the current context and start aggregating IOStatistics.
+ *
+ * A static initializer works out whether the feature to collect
+ * thread-level IOStatistics is enabled and selects the corresponding
+ * implementation class.
+ *
+ * A weak-reference thread map keeps track of the per-thread contexts so
+ * that long-lived references do not leak memory; the references are
+ * cleaned up at GC.
+ */
+public final class IOStatisticsContextIntegration {
+
+  private static final Logger LOG =
+  LoggerFactory.getLogger(IOStatisticsContextIntegration.class);
+
+  /**
+   * Is thread-level IO Statistics enabled?
+   */
+  private static final boolean IS_THREAD_IOSTATS_ENABLED;
+
+  /**
+   * ID for next instance to create.
+   */
+  public static final AtomicLong INSTANCE_ID = new AtomicLong(1);
+
+  /**
+   * Active IOStatistics Context containing different worker thread's
+   * statistics. Weak Reference so that it gets cleaned up during GC and we
+   * avoid any memory leak issues due to long lived references.
+   */
+  private static final WeakReferenceThreadMap<IOStatisticsContext>
+  ACTIVE_IOSTATS_CONTEXT =
+  new WeakReferenceThreadMap<>(
+  IOStatisticsContextIntegration::createNewInstance,
+  IOStatisticsContextIntegration::referenceLostContext
+  );
+
+  static {
+// Work out if the current context has thread level IOStatistics enabled.
+final Configuration configuration = new Configuration();
+IS_THREAD_IOSTATS_ENABLED =
+configuration.getBoolean(THREAD_LEVEL_IOSTATISTICS_ENABLED,
+THREAD_LEVEL_IOSTATISTICS_ENABLED_DEFAULT);
+  }
+
+  /**
+   * Private constructor for a utility class to be used in IOStatisticsContext.
+   */
+  private IOStatisticsContextIntegration() {}
+
+  /**
+   * Creating a new IOStatisticsContext instance for a FS to be used.
+   * @param key Thread ID that represents which thread the context belongs to.
+   * @return an instance of IOStatisticsContext.
+   */
+  private static IOStatisticsContext createNewInstance(Long key) {
+return new IOStatisticsContextImpl(key, INSTANCE_ID.getAndIncrement());
+  }
+
+  /**
+   * In case of reference loss for IOStatisticsContext.
+   * @param key ThreadID.
+   */
+  private static void referenceLostContext(Long key) {
+LOG.debug("Reference lost for threadID for the context: {}", key);
+  }
+
+  /**
+   * Get the current thread's IOStatisticsContext instance. If no instance is
+   * present for this thread ID, create one using the factory.
+   * @return instance of IOStatisticsContext.
+   */
+  public static IOStatisticsContext getCurrentIOStatisticsContext() {
+return IS_THREAD_IOSTATS_ENABLED
+? ACTIVE_IOSTATS_CONTEXT.getForCurrentThread()
+: EmptyIOStatisticsContextImpl.getInstance();
+  }
+
+  /**
+   * Set the IOStatisticsContext for the current thread.
+   * @param statisticsContext IOStatistics context instance for the
+   * current thread.
+   */
+  public static void setThreadIOStatisticsContext(
+  IOStatisticsContext statisticsContext) {
+if 

[jira] [Assigned] (HADOOP-16318) Upgrade JUnit from 4 to 5 in hadoop security

2022-07-21 Thread groot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16318?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

groot reassigned HADOOP-16318:
--

Assignee: groot  (was: Kei Kori)

> Upgrade JUnit from 4 to 5 in hadoop security
> 
>
> Key: HADOOP-16318
> URL: https://issues.apache.org/jira/browse/HADOOP-16318
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: test
>Reporter: Ajay Kumar
>Assignee: groot
>Priority: Major
> Attachments: HDFS-12433.001.patch, HDFS-12433.002.patch
>
>
> Upgrade JUnit from 4 to 5 in hadoop security  (org.apache.hadoop.security)
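
For readers following these JUnit migrations, a minimal before/after sketch 
(hypothetical test class, not taken from any Hadoop patch) of the typical 
changes when moving a test from JUnit 4 to JUnit 5:

```java
// JUnit 4 (before):
//   import org.junit.Test;
//   import static org.junit.Assert.assertEquals;

// JUnit 5 / Jupiter (after):
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

public class TestUserGroupMapping {   // hypothetical test class

  @Test
  public void testMapping() {
    // In JUnit 5 the assertion message moves to the last argument,
    // and @Before/@After become @BeforeEach/@AfterEach.
    assertEquals("expected-group", resolveGroup("user"), "group should match");
  }

  private String resolveGroup(String user) {
    return "expected-group";
  }
}
```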



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on pull request #4529: HDFS-16648. Add isDebugEnabled check for debug logs in some classes

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4529:
URL: https://github.com/apache/hadoop/pull/4529#issuecomment-1191299605

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m  0s |  |  Docker mode activated.  |
   | -1 :x: |  patch  |   0m 19s |  |  
https://github.com/apache/hadoop/pull/4529 does not apply to trunk. Rebase 
required? Wrong Branch? See 
https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute for help.  
|
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | GITHUB PR | https://github.com/apache/hadoop/pull/4529 |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4529/5/console |
   | versions | git=2.17.1 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] ashutoshcipher opened a new pull request, #4603: YARN-10793. Upgrade Junit from 4 to 5 in hadoop-yarn-server-applicationhistoryservice

2022-07-21 Thread GitBox


ashutoshcipher opened a new pull request, #4603:
URL: https://github.com/apache/hadoop/pull/4603

   ### Description of PR
   
   Upgrade Junit from 4 to 5 in hadoop-yarn-server-applicationhistoryservice
   
   JIRA - YARN-10793
   
   ### How was this patch tested?
   
   Ran the tests locally as well.
   
   
   ### For code changes:
   
   - [X] Does the title of this PR start with the corresponding JIRA issue id 
(e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] ZanderXu commented on a diff in pull request #4529: HDFS-16648. Add isDebugEnabled check for debug logs in some classes

2022-07-21 Thread GitBox


ZanderXu commented on code in PR #4529:
URL: https://github.com/apache/hadoop/pull/4529#discussion_r926497566


##
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java:
##
@@ -3152,15 +3152,17 @@ public void run() {
   IOUtils.cleanupWithLogger(LOG, traceScope);
   if (call != null) {
 updateMetrics(call, startTimeNanos, connDropped);
-ProcessingDetails.LOG.debug(

Review Comment:
   Thanks @ferhui for your review. I have updated this patch; please review it 
again. Thanks
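
As background on the pattern this PR title refers to, a minimal sketch 
(hypothetical class and log message, not the actual Server.java change) of 
guarding an expensive debug log with isDebugEnabled():

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class DebugGuardExample {

  private static final Logger LOG =
      LoggerFactory.getLogger(DebugGuardExample.class);

  void logProcessingDetails(long startTimeNanos, boolean connDropped) {
    // The guard skips argument computation (string building, arithmetic)
    // entirely when debug logging is disabled.
    if (LOG.isDebugEnabled()) {
      LOG.debug("Processed call in {} ns, connDropped={}",
          System.nanoTime() - startTimeNanos, connDropped);
    }
  }
}
```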



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Work logged] (HADOOP-18330) S3AFileSystem removes Path when calling createS3Client

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18330?focusedWorklogId=793623=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793623
 ]

ASF GitHub Bot logged work on HADOOP-18330:
---

Author: ASF GitHub Bot
Created on: 21/Jul/22 09:16
Start Date: 21/Jul/22 09:16
Worklog Time Spent: 10m 
  Work Description: steveloughran merged PR #4572:
URL: https://github.com/apache/hadoop/pull/4572




Issue Time Tracking
---

Worklog Id: (was: 793623)
Time Spent: 3h 50m  (was: 3h 40m)

> S3AFileSystem removes Path when calling createS3Client
> --
>
> Key: HADOOP-18330
> URL: https://issues.apache.org/jira/browse/HADOOP-18330
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs/s3
>Affects Versions: 3.3.0, 3.3.1, 3.3.2, 3.3.3
>Reporter: Ashutosh Pant
>Assignee: Ashutosh Pant
>Priority: Minor
>  Labels: pull-request-available
>  Time Spent: 3h 50m
>  Remaining Estimate: 0h
>
> When using Hadoop and Spark to read/write data from an S3 bucket such as 
> s3a://bucket/path with a custom credentials provider, the path is removed 
> from the s3a URI and the credentials provider fails because the full path 
> is gone.
> In Spark 3.2 it was invoked as s3 = 
> ReflectionUtils.newInstance(s3ClientFactoryClass, conf)
> .createS3Client(name, bucket, credentials); 
> but in Spark 3.3.3 it is invoked as s3 = 
> ReflectionUtils.newInstance(s3ClientFactoryClass, 
> conf).createS3Client(getUri(), parameters);
> and getUri() removes the path from the s3a URI.
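
To illustrate the reported behaviour, a minimal sketch (plain java.net.URI, 
hypothetical bucket and path) of how reducing a full s3a URI to scheme + 
authority, which is what a FileSystem URI amounts to, drops the path that a 
custom credentials provider may rely on:

```java
import java.net.URI;

public class S3aUriPathExample {
  public static void main(String[] args) {
    // Full URI as supplied by the user, including the object path.
    URI full = URI.create("s3a://bucket/path/to/data");

    // A FileSystem URI is reduced to scheme + authority, so the path is lost.
    URI fsUri = URI.create(full.getScheme() + "://" + full.getAuthority());

    System.out.println(full);   // s3a://bucket/path/to/data
    System.out.println(fsUri);  // s3a://bucket
  }
}
```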



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org




[jira] [Work logged] (HADOOP-12007) GzipCodec native CodecPool leaks memory

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-12007?focusedWorklogId=793618=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793618
 ]

ASF GitHub Bot logged work on HADOOP-12007:
---

Author: ASF GitHub Bot
Created on: 21/Jul/22 09:04
Start Date: 21/Jul/22 09:04
Worklog Time Spent: 10m 
  Work Description: hadoop-yetus commented on PR #4585:
URL: https://github.com/apache/hadoop/pull/4585#issuecomment-1191234486

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 53s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m  3s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  28m  7s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  25m 12s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |  21m 49s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   4m 31s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |  20m  1s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   8m 34s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   7m 29s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |  41m 41s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  58m 11s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 40s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |  27m 50s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  26m 23s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |  26m 23s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  23m 35s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |  23m 35s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   4m 14s | 
[/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4585/6/artifact/out/results-checkstyle-root.txt)
 |  root: The patch generated 6 new + 12 unchanged - 0 fixed = 18 total (was 
12)  |
   | +1 :green_heart: |  mvnsite  |  21m 16s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   8m 50s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   8m  9s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |  46m 14s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  60m 33s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | -1 :x: |  unit  | 1012m 42s | 
[/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4585/6/artifact/out/patch-unit-root.txt)
 |  root in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   2m  3s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 1405m 50s |  |  |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | hadoop.hdfs.server.mover.TestMover |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4585/6/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4585 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint |
   | uname | Linux 46cfe907cfef 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 
17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 

[GitHub] [hadoop] hadoop-yetus commented on pull request #4585: HADOOP-12007. GzipCodec native CodecPool leaks memory

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4585:
URL: https://github.com/apache/hadoop/pull/4585#issuecomment-1191234486

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 53s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  xmllint  |   0m  0s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m  3s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  28m  7s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  25m 12s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |  21m 49s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   4m 31s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |  20m  1s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   8m 34s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   7m 29s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |  41m 41s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  58m 11s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 40s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |  27m 50s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  26m 23s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |  26m 23s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  23m 35s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |  23m 35s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | -0 :warning: |  checkstyle  |   4m 14s | 
[/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4585/6/artifact/out/results-checkstyle-root.txt)
 |  root: The patch generated 6 new + 12 unchanged - 0 fixed = 18 total (was 
12)  |
   | +1 :green_heart: |  mvnsite  |  21m 16s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   8m 50s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   8m  9s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |  46m 14s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  60m 33s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | -1 :x: |  unit  | 1012m 42s | 
[/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4585/6/artifact/out/patch-unit-root.txt)
 |  root in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   2m  3s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 1405m 50s |  |  |
   
   
   | Reason | Tests |
   |---:|:--|
   | Failed junit tests | hadoop.hdfs.server.mover.TestMover |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4585/6/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4585 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint |
   | uname | Linux 46cfe907cfef 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 
17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / c44808b40a0fd5ddf900a48490a4984ddcdf5e4d |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 
/usr/lib/jvm/java-8-openjdk-amd64:Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4585/6/testReport/ |
   | Max. process+thread 

[GitHub] [hadoop] GuoPhilipse opened a new pull request, #4602: HDFS-16673. Fix usage of chown

2022-07-21 Thread GitBox


GuoPhilipse opened a new pull request, #4602:
URL: https://github.com/apache/hadoop/pull/4602

   JIRA: HDFS-16673.
   Actually, the `chown` command can be used by the owner of the files or by 
the super user, so we need to correct the doc.
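
For reference, a minimal sketch (hypothetical path, user and group; assumes 
default HDFS permission checking) of the FileSystem.setOwner() call that backs 
`hdfs dfs -chown`, where the NameNode decides whether the caller is allowed to 
make the change:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ChownExample {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    try (FileSystem fs = FileSystem.get(conf)) {
      // Equivalent of `hdfs dfs -chown newuser:newgroup /tmp/data`; the
      // permission check is enforced server-side by the NameNode.
      fs.setOwner(new Path("/tmp/data"), "newuser", "newgroup");
    }
  }
}
```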
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] slfan1989 commented on pull request #4529: HDFS-16648. Add isDebugEnabled check for debug logs in some classes

2022-07-21 Thread GitBox


slfan1989 commented on PR #4529:
URL: https://github.com/apache/hadoop/pull/4529#issuecomment-1191116918

   > @slfan1989 ping. Can you push this PR forward?
   
   This PR looks good; let's wait for opinions from the other reviewers. Thanks 
for your contribution!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] hadoop-yetus commented on pull request #4595: YARN-11158. Support getDelegationToken, renewDelegationToken, cancelDelegationToken API's for Federation

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4595:
URL: https://github.com/apache/hadoop/pull/4595#issuecomment-119083

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 49s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +0 :ok: |  buf  |   0m  1s |  |  buf was not available.  |
   | +0 :ok: |  buf  |   0m  1s |  |  buf was not available.  |
   | +0 :ok: |  xmllint  |   0m  1s |  |  xmllint was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to 
include 1 new or modified test files.  |
    _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  15m 28s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  28m 27s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  25m 16s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |  21m 52s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   4m 32s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   5m 32s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   4m 41s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   4m  6s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   8m 33s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  24m 18s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 26s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   2m 54s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  24m 23s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  cc  |  24m 23s |  |  the patch passed  |
   | +1 :green_heart: |  javac  |  24m 23s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  21m 52s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  cc  |  21m 52s |  |  the patch passed  |
   | +1 :green_heart: |  javac  |  21m 52s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   4m 25s |  |  root: The patch generated 
0 new + 169 unchanged - 2 fixed = 169 total (was 171)  |
   | +1 :green_heart: |  mvnsite  |   5m 30s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   4m 30s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   4m  7s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | -1 :x: |  spotbugs  |   1m 29s | 
[/new-spotbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4595/3/artifact/out/new-spotbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router.html)
 |  
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router 
generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0)  |
   | +1 :green_heart: |  shadedclient  |  24m 34s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |  18m 27s |  |  hadoop-common in the patch 
passed.  |
   | -1 :x: |  unit  |   1m 30s | 
[/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-api.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4595/3/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-api.txt)
 |  hadoop-yarn-api in the patch passed.  |
   | +1 :green_heart: |  unit  |   3m 13s |  |  hadoop-yarn-server-common in 
the patch passed.  |
   | -1 :x: |  unit  |   2m 31s | 
[/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4595/3/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router.txt)
 |  hadoop-yarn-server-router in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   1m 16s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 277m 46s |  |  |
   
   
   | Reason | Tests |
   |---:|:--|
   | SpotBugs | 

[GitHub] [hadoop] hadoop-yetus commented on pull request #4026: MAPREDUCE-7372 MapReduce set permission too late in copyJar method

2022-07-21 Thread GitBox


hadoop-yetus commented on PR #4026:
URL: https://github.com/apache/hadoop/pull/4026#issuecomment-1191095038

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |::|--:|:|::|:---:|
   | +0 :ok: |  reexec  |   0m 35s |  |  Docker mode activated.  |
    _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files 
found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  
|
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain 
any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
    _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  38m 28s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |   1m  2s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  compile  |   0m 58s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  checkstyle  |   1m  1s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   1m  5s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   0m 55s |  |  trunk passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 44s |  |  trunk passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 52s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  21m 19s |  |  branch has no errors 
when building and testing our client artifacts.  |
    _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 42s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 45s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javac  |   0m 45s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 41s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  javac  |   0m 41s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks 
issues.  |
   | +1 :green_heart: |  checkstyle  |   0m 35s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   0m 45s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   0m 27s |  |  the patch passed with JDK 
Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 27s |  |  the patch passed with JDK 
Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07  |
   | +1 :green_heart: |  spotbugs  |   1m 31s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  20m 22s |  |  patch has no errors 
when building and testing our client artifacts.  |
    _ Other Tests _ |
   | +1 :green_heart: |  unit  |   7m  2s |  |  hadoop-mapreduce-client-core in 
the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 51s |  |  The patch does not 
generate ASF License warnings.  |
   |  |   | 102m 36s |  |  |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4026/5/artifact/out/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/hadoop/pull/4026 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall 
mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 69fbadfad1af 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 
17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 684b38e4f858d21dcd87021f519f0c41537c08a7 |
   | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private 
Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 
/usr/lib/jvm/java-8-openjdk-amd64:Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4026/5/testReport/ |
   | Max. process+thread count | 1605 (vs. ulimit of 5500) |
   | modules | C: 
hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core 
U: 
hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core |
   | Console output | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4026/5/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.

[jira] [Work logged] (HADOOP-18333) hadoop-client-runtime impact by CVE-2022-2047 due to shaded jetty

2022-07-21 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18333?focusedWorklogId=793551=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793551
 ]

ASF GitHub Bot logged work on HADOOP-18333:
---

Author: ASF GitHub Bot
Created on: 21/Jul/22 06:23
Start Date: 21/Jul/22 06:23
Worklog Time Spent: 10m 
  Work Description: ashutoshcipher commented on PR #4553:
URL: https://github.com/apache/hadoop/pull/4553#issuecomment-1191088686

   Thanks @jojochuang for reviewing and merging.




Issue Time Tracking
---

Worklog Id: (was: 793551)
Time Spent: 1h 40m  (was: 1.5h)

> hadoop-client-runtime impact by CVE-2022-2047 due to shaded jetty
> -
>
> Key: HADOOP-18333
> URL: https://issues.apache.org/jira/browse/HADOOP-18333
> Project: Hadoop Common
>  Issue Type: Improvement
>Affects Versions: 3.3.3
>Reporter: phoebe chen
>Assignee: groot
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 1h 40m
>  Remaining Estimate: 0h
>
> CVE-2022-2047 was recently reported for Eclipse Jetty and impacts versions 
> 9.4.0 through 9.4.46.
> The latest 3.3.3 release of hadoop-client-runtime shades Jetty 
> 9.4.43.v20210629, which is impacted.
> On trunk, Jetty is at version 9.4.44.v20210927, which is still impacted.
> The Jetty version needs to be upgraded.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[GitHub] [hadoop] ashutoshcipher commented on pull request #4553: HADOOP-18333.Upgrade jetty version to 9.4.48.v20220622

2022-07-21 Thread GitBox


ashutoshcipher commented on PR #4553:
URL: https://github.com/apache/hadoop/pull/4553#issuecomment-1191088686

   Thanks @jojochuang for reviewing and merging.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org