[GitHub] [hadoop] hadoop-yetus commented on pull request #4488: HDFS-16640. RBF: Show datanode IP list when click DN histogram in Router
hadoop-yetus commented on PR #4488: URL: https://github.com/apache/hadoop/pull/4488#issuecomment-1165211795

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 52s | | Docker mode activated. |
|||| | _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 0s | | xmllint was not available. |
| +0 :ok: | jshint | 0m 0s | | jshint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
|||| | _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 15m 6s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 28m 16s | | trunk passed |
| +1 :green_heart: | shadedclient | 66m 3s | | branch has no errors when building and testing our client artifacts. |
|||| | _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 24s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 2m 2s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | shadedclient | 22m 1s | | patch has no errors when building and testing our client artifacts. |
|||| | _ Other Tests _ |
| +1 :green_heart: | asflicense | 0m 41s | | The patch does not generate ASF License warnings. |
| | | | 93m 53s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4488/3/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4488 |
| Optional Tests | dupname asflicense shadedclient codespell detsecrets xmllint jshint |
| uname | Linux f4567394809c 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 550cf7145ea36948de7ada1e48f3c534d1760850 |
| Max. process+thread count | 578 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs hadoop-hdfs-project/hadoop-hdfs-rbf U: hadoop-hdfs-project |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4488/3/console |
| versions | git=2.25.1 maven=3.6.3 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |

This message was automatically generated.

--
This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #4488: HDFS-16640. RBF: Show datanode IP list when click DN histogram in Router
hadoop-yetus commented on PR #4488: URL: https://github.com/apache/hadoop/pull/4488#issuecomment-1165207837

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 46s | | Docker mode activated. |
|||| | _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 0s | | xmllint was not available. |
| +0 :ok: | jshint | 0m 0s | | jshint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
|||| | _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 15m 21s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 27m 0s | | trunk passed |
| +1 :green_heart: | shadedclient | 62m 4s | | branch has no errors when building and testing our client artifacts. |
|||| | _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 32s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 2m 3s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | shadedclient | 19m 21s | | patch has no errors when building and testing our client artifacts. |
|||| | _ Other Tests _ |
| +1 :green_heart: | asflicense | 0m 51s | | The patch does not generate ASF License warnings. |
| | | | 87m 35s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4488/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4488 |
| Optional Tests | dupname asflicense shadedclient codespell detsecrets xmllint jshint |
| uname | Linux bbd2d63a1905 4.15.0-169-generic #177-Ubuntu SMP Thu Feb 3 10:50:38 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 550cf7145ea36948de7ada1e48f3c534d1760850 |
| Max. process+thread count | 558 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs hadoop-hdfs-project/hadoop-hdfs-rbf U: hadoop-hdfs-project |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4488/2/console |
| versions | git=2.25.1 maven=3.6.3 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |

This message was automatically generated.
[GitHub] [hadoop] wzhallright commented on pull request #4488: HDFS-16640. RBF: Show datanode IP list when click DN histogram in Router
wzhallright commented on PR #4488: URL: https://github.com/apache/hadoop/pull/4488#issuecomment-1165167793

@ayushtkn Added histogram-hostip.js; the screenshots after the change are below. Please take a review, thanks!

RBF:
![image](https://user-images.githubusercontent.com/32935220/175459305-1e721bb3-1747-413e-8518-0ea1a482b1f0.png)

NN:
![image](https://user-images.githubusercontent.com/32935220/175459363-605b0ca4-b8a4-42dc-b79d-bcf6ae5fb0d4.png)
[jira] [Work logged] (HADOOP-18044) Hadoop - Upgrade to JQuery 3.6.0
[ https://issues.apache.org/jira/browse/HADOOP-18044?focusedWorklogId=784405&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784405 ]

ASF GitHub Bot logged work on HADOOP-18044:
-------------------------------------------
Author: ASF GitHub Bot
Created on: 24/Jun/22 03:26
Start Date: 24/Jun/22 03:26
Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on PR #4495: URL: https://github.com/apache/hadoop/pull/4495#issuecomment-1165155367

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 10m 55s | | Docker mode activated. |
|||| | _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | jshint | 0m 0s | | jshint was not available. |
| +0 :ok: | shelldocs | 0m 1s | | Shelldocs was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| | _ branch-3.3.4 Compile Tests _ |
| +0 :ok: | mvndep | 5m 0s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 31m 23s | | branch-3.3.4 passed |
| +1 :green_heart: | compile | 18m 8s | | branch-3.3.4 passed |
| +1 :green_heart: | mvnsite | 25m 7s | | branch-3.3.4 passed |
| +1 :green_heart: | javadoc | 8m 12s | | branch-3.3.4 passed |
| +1 :green_heart: | shadedclient | 31m 15s | | branch has no errors when building and testing our client artifacts. |
|||| | _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 34s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 24m 35s | | the patch passed |
| +1 :green_heart: | compile | 17m 23s | | the patch passed |
| +1 :green_heart: | javac | 17m 23s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | mvnsite | 20m 54s | | the patch passed |
| +1 :green_heart: | shellcheck | 0m 0s | | No new issues. |
| +1 :green_heart: | xml | 0m 2s | | The patch has no ill-formed XML file. |
| +1 :green_heart: | javadoc | 8m 3s | | the patch passed |
| +1 :green_heart: | shadedclient | 32m 48s | | patch has no errors when building and testing our client artifacts. |
|||| | _ Other Tests _ |
| -1 :x: | unit | 749m 42s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4495/1/artifact/out/patch-unit-root.txt) | root in the patch passed. |
| +1 :green_heart: | asflicense | 2m 53s | | The patch does not generate ASF License warnings. |
| | | | 973m 36s | | |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.yarn.sls.TestReservationSystemInvariants |
| | hadoop.hdfs.server.namenode.ha.TestHAAppend |
| | hadoop.hdfs.server.datanode.TestBPOfferService |
| | hadoop.yarn.csi.client.TestCsiClient |
| | hadoop.yarn.server.resourcemanager.TestRMEmbeddedElector |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4495/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4495 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell xml jshint shellcheck shelldocs |
| uname | Linux 1da82b6adfcb 4.15.0-65-generic #74-Ubuntu SMP Tue Sep 17 17:06:04 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | branch-3.3.4 / 4c22960836eebe082c01ea2545f9cea5eeb06526 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4495/1/testReport/ |
| Max. process+thread count | 2741 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs hadoop-hdfs-project/hadoop-hdfs-rbf hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common . U: . |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4495/1/console |
| versions | git=2.17.1 maven=3.6.0 shellcheck=0.4.6 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.

Issue Time Tracking
-------------------
Worklog Id:
[GitHub] [hadoop] hadoop-yetus commented on pull request #4406: HDFS-16619. Fix HttpHeaders.Values And HttpHeaders.Names Deprecated Import
hadoop-yetus commented on PR #4406: URL: https://github.com/apache/hadoop/pull/4406#issuecomment-1165151975

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 46s | | Docker mode activated. |
|||| | _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. |
|||| | _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 37m 41s | | trunk passed |
| +1 :green_heart: | compile | 1m 38s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 1m 35s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 1m 25s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 45s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 24s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 1m 50s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 41s | | trunk passed |
| +1 :green_heart: | shadedclient | 23m 2s | | branch has no errors when building and testing our client artifacts. |
| -0 :warning: | patch | 23m 29s | | Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary. |
|||| | _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 23s | | the patch passed |
| +1 :green_heart: | compile | 1m 28s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javac | 1m 28s | | hadoop-hdfs-project_hadoop-hdfs-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 generated 0 new + 911 unchanged - 26 fixed = 911 total (was 937) |
| +1 :green_heart: | compile | 1m 21s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 1m 21s | | hadoop-hdfs-project_hadoop-hdfs-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 generated 0 new + 890 unchanged - 26 fixed = 890 total (was 916) |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 0s | | the patch passed |
| +1 :green_heart: | mvnsite | 1m 23s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 59s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 1m 30s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 20s | | the patch passed |
| +1 :green_heart: | shadedclient | 22m 28s | | patch has no errors when building and testing our client artifacts. |
|||| | _ Other Tests _ |
| +1 :green_heart: | unit | 236m 26s | | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | asflicense | 1m 16s | | The patch does not generate ASF License warnings. |
| | | | 345m 49s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4406/10/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4406 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux 743797d72837 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 38082b886a768a6834f85ef5114a96969b2ed1d8 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4406/10/testReport/ |
| Max. process+thread count | 3523 (vs. ulimit of 5500) |
| modules | C:
[GitHub] [hadoop] wzhallright commented on pull request #4488: HDFS-16640. RBF: Show datanode IP list when click DN histogram in Router
wzhallright commented on PR #4488: URL: https://github.com/apache/hadoop/pull/4488#issuecomment-1165134491

> I'm not good at JS... But I can try to modify it.
[jira] [Work logged] (HADOOP-18303) Remove shading exclusion of javax.ws.rs-api from hadoop-client-runtime
[ https://issues.apache.org/jira/browse/HADOOP-18303?focusedWorklogId=784395&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784395 ]

ASF GitHub Bot logged work on HADOOP-18303:
-------------------------------------------
Author: ASF GitHub Bot
Created on: 24/Jun/22 02:13
Start Date: 24/Jun/22 02:13
Worklog Time Spent: 10m

Work Description: virajjasani commented on PR #4461: URL: https://github.com/apache/hadoop/pull/4461#issuecomment-1165109458

Even if HADOOP-18033 had not excluded javax.ws.rs-api from shading, as [mentioned here](https://github.com/apache/hadoop/pull/4461#pullrequestreview-1012678699), we would anyway have caused subtle problems by having only one version of the common class in the final client jar (as opposed to having different classes for both javax.ws.rs-api and jsr311-api).

Issue Time Tracking
-------------------
Worklog Id: (was: 784395)
Time Spent: 3h 10m (was: 3h)

> Remove shading exclusion of javax.ws.rs-api from hadoop-client-runtime
> ----------------------------------------------------------------------
>
> Key: HADOOP-18303
> URL: https://issues.apache.org/jira/browse/HADOOP-18303
> Project: Hadoop Common
> Issue Type: Bug
> Reporter: Viraj Jasani
> Assignee: Viraj Jasani
> Priority: Critical
> Labels: pull-request-available
> Time Spent: 3h 10m
> Remaining Estimate: 0h
>
> As part of HADOOP-18033, we excluded shading of javax.ws.rs-api from both
> hadoop-client-runtime and hadoop-client-minicluster. This has caused issues
> for downstream projects, e.g.
> https://github.com/apache/incubator-kyuubi/issues/2904 and the discussions there.
> We should put the shading back in hadoop-client-runtime to fix CNFE issues
> for downstream projects.
> cc [~ayushsaxena] [~pan3793]

--
This message was sent by Atlassian Jira (v8.20.7#820007)

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
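The "only one version of a common class" problem discussed above is ordinary JVM class-loading behaviour: whichever jar on the classpath supplies a fully-qualified class name first wins for every caller, and if no jar supplies it at all, downstream code hits a ClassNotFoundException. A small stand-alone probe can illustrate the failure mode downstream projects see when a shaded client jar omits javax.ws.rs-api; this is an illustrative sketch (the `present` helper is hypothetical, not Hadoop code), run here with no JAX-RS jar on the classpath:

```java
// Probe whether a class is visible to this classloader, the way downstream
// code discovers a ClassNotFoundException at runtime when a shaded client
// jar no longer bundles javax.ws.rs-api.
public class ShadePresenceProbe {
    static boolean present(String fqcn) {
        try {
            // initialize=false: only resolve the class, don't run static init
            Class.forName(fqcn, false, ShadePresenceProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // java.util.List ships with the JDK, so it is always visible...
        System.out.println("java.util.List present: " + present("java.util.List"));
        // ...while a JAX-RS class is visible only if some jar supplies it.
        System.out.println("javax.ws.rs.core.MediaType present: "
                + present("javax.ws.rs.core.MediaType"));
    }
}
```

With no JAX-RS artifact on the classpath, the second probe prints `false`, which is exactly the CNFE situation the PR restores shading to avoid.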
[jira] [Work logged] (HADOOP-18303) Remove shading exclusion of javax.ws.rs-api from hadoop-client-runtime
[ https://issues.apache.org/jira/browse/HADOOP-18303?focusedWorklogId=784392&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784392 ]

ASF GitHub Bot logged work on HADOOP-18303:
-------------------------------------------
Author: ASF GitHub Bot
Created on: 24/Jun/22 02:08
Start Date: 24/Jun/22 02:08
Worklog Time Spent: 10m

Work Description: virajjasani commented on PR #4461: URL: https://github.com/apache/hadoop/pull/4461#issuecomment-1165107308

While HADOOP-18033 was released in 3.3.2, HADOOP-18178 has also made its way into 3.3.3.

Issue Time Tracking
-------------------
Worklog Id: (was: 784392)
Time Spent: 3h (was: 2h 50m)
[jira] [Commented] (HADOOP-18290) Fix some compatibility issues with 3.3.3 release notes
[ https://issues.apache.org/jira/browse/HADOOP-18290?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17558284#comment-17558284 ]

JiangHua Zhu commented on HADOOP-18290:
---------------------------------------

I appreciate your suggestion, [~ste...@apache.org]. Automating builds is very important work; thank you very much for your work.

> Fix some compatibility issues with 3.3.3 release notes
> ------------------------------------------------------
>
> Key: HADOOP-18290
> URL: https://issues.apache.org/jira/browse/HADOOP-18290
> Project: Hadoop Common
> Issue Type: Improvement
> Components: build, documentation
> Affects Versions: 3.3.3
> Reporter: JiangHua Zhu
> Priority: Major
> Attachments: image-2022-06-14-10-27-23-027.png, image-2022-06-14-10-28-53-822.png
>
> 3.3.3 release notes:
> https://hadoop.apache.org/docs/r3.3.3/hadoop-project-dist/hadoop-common/release/3.3.3/RELEASENOTES.3.3.3.html
> There are some compatibility issues here, e.g.:
> !image-2022-06-14-10-27-23-027.png!
> I think this is happening due to a syntax issue.
> It would be more appropriate to change it to this:
> !image-2022-06-14-10-28-53-822.png!
[jira] [Work logged] (HADOOP-18254) Add in configuration option to enable prefetching
[ https://issues.apache.org/jira/browse/HADOOP-18254?focusedWorklogId=784390&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784390 ]

ASF GitHub Bot logged work on HADOOP-18254:
-------------------------------------------
Author: ASF GitHub Bot
Created on: 24/Jun/22 01:55
Start Date: 24/Jun/22 01:55
Worklog Time Spent: 10m

Work Description: ashutoshcipher commented on PR #4469: URL: https://github.com/apache/hadoop/pull/4469#issuecomment-1165100507

LGTM. +1 (non-binding)

Issue Time Tracking
-------------------
Worklog Id: (was: 784390)
Time Spent: 1h (was: 50m)

> Add in configuration option to enable prefetching
> -------------------------------------------------
>
> Key: HADOOP-18254
> URL: https://issues.apache.org/jira/browse/HADOOP-18254
> Project: Hadoop Common
> Issue Type: Sub-task
> Reporter: Ahmar Suhail
> Assignee: Ahmar Suhail
> Priority: Minor
> Labels: pull-request-available
> Time Spent: 1h
> Remaining Estimate: 0h
>
> Currently prefetching is enabled by default; we should instead add a
> config option to enable it.
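The sub-task above flips a feature from on-by-default to opt-in behind a configuration key. The opt-in lookup pattern can be sketched in plain Java; note that the key name `fs.s3a.prefetch.enabled` and the Map-based stand-in for a `Configuration#getBoolean`-style lookup are assumptions for illustration, not the actual patch:

```java
import java.util.Map;

public class PrefetchToggleDemo {
    // Hypothetical stand-in for a Configuration#getBoolean(key, default) lookup.
    static boolean getBoolean(Map<String, String> conf, String key, boolean dflt) {
        String v = conf.get(key);
        return v == null ? dflt : Boolean.parseBoolean(v);
    }

    public static void main(String[] args) {
        // Key name assumed from the HADOOP-18254 discussion.
        String key = "fs.s3a.prefetch.enabled";

        // Default false makes the feature opt-in: unset means disabled.
        Map<String, String> unset = Map.of();
        System.out.println("unset -> " + getBoolean(unset, key, false));

        // Users who want prefetching set the key explicitly.
        Map<String, String> enabled = Map.of(key, "true");
        System.out.println("enabled -> " + getBoolean(enabled, key, false));
    }
}
```

The design point is simply that the default passed at the call site, not the key's presence, decides the out-of-the-box behaviour.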
[GitHub] [hadoop] ashutoshcipher commented on a diff in pull request #4248: MAPREDUCE-7370. Parallelize MultipleOutputs#close call
ashutoshcipher commented on code in PR #4248: URL: https://github.com/apache/hadoop/pull/4248#discussion_r905654385

## hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/lib/output/MultipleOutputs.java: ##

@@ -570,8 +570,14 @@ public void setStatus(String status) {
    */
   @SuppressWarnings("unchecked")
   public void close() throws IOException, InterruptedException {
-    for (RecordWriter writer : recordWriters.values()) {
-      writer.close(context);
-    }
+    recordWriters.values().parallelStream().forEach(writer -> {

Review Comment:

Thanks @cnauroth and @steveloughran for the comments. I will make the required changes. Thanks.
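The diff under review replaces a serial close loop with a parallel stream. One wrinkle such a change has to handle is that `RecordWriter#close` throws checked exceptions, which a `forEach` lambda cannot propagate directly. A minimal stand-alone sketch of one way to deal with this (attempt every close, collect failures, then rethrow) is shown below; `Writer` and `closeAll` are simplified illustrative stand-ins, not Hadoop's actual API or the patch itself:

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class CloseAllDemo {
    // Stand-in for a RecordWriter whose close() may throw a checked exception.
    interface Writer {
        void close() throws IOException;
    }

    // Close every writer in parallel, attempting all of them even if some
    // fail, then rethrow the first failure with the rest attached as suppressed.
    static void closeAll(List<Writer> writers) throws IOException {
        List<Exception> failures = Collections.synchronizedList(new ArrayList<>());
        writers.parallelStream().forEach(w -> {
            try {
                w.close();
            } catch (Exception e) {
                failures.add(e); // don't let one failure skip the remaining writers
            }
        });
        if (!failures.isEmpty()) {
            IOException first = new IOException("close failed", failures.get(0));
            for (int i = 1; i < failures.size(); i++) {
                first.addSuppressed(failures.get(i));
            }
            throw first;
        }
    }

    public static void main(String[] args) {
        // All writers close cleanly.
        List<Writer> ok = List.of(() -> { }, () -> { });
        try {
            closeAll(ok);
            System.out.println("all closed");
        } catch (IOException e) {
            System.out.println("unexpected: " + e);
        }

        // One writer fails; the others are still closed and the error surfaces.
        List<Writer> bad = List.of(() -> { }, () -> { throw new IOException("disk full"); });
        try {
            closeAll(bad);
        } catch (IOException e) {
            System.out.println("caught: " + e.getCause().getMessage());
        }
    }
}
```

Collecting failures into a synchronized list matters because the lambda bodies run concurrently; rethrowing after the loop preserves the original contract that `close` reports errors to the caller.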
[jira] [Work logged] (HADOOP-18309) Upgrade bundled Tomcat to 8.5.76 or higher
[ https://issues.apache.org/jira/browse/HADOOP-18309?focusedWorklogId=784386&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784386 ]

ASF GitHub Bot logged work on HADOOP-18309:
-------------------------------------------
Author: ASF GitHub Bot
Created on: 24/Jun/22 00:12
Start Date: 24/Jun/22 00:12
Worklog Time Spent: 10m

Work Description: aajisaka commented on PR #4479: URL: https://github.com/apache/hadoop/pull/4479#issuecomment-1165034064

Thank you @ashutoshcipher for the contribution, and thank you @iwasakims for testing and merging the PR.

Issue Time Tracking
-------------------
Worklog Id: (was: 784386)
Time Spent: 40m (was: 0.5h)

> Upgrade bundled Tomcat to 8.5.76 or higher
> ------------------------------------------
>
> Key: HADOOP-18309
> URL: https://issues.apache.org/jira/browse/HADOOP-18309
> Project: Hadoop Common
> Issue Type: Improvement
> Components: httpfs, kms
> Affects Versions: 2.10.1, 2.10.2
> Reporter: Ashutosh Gupta
> Assignee: Ashutosh Gupta
> Priority: Major
> Labels: pull-request-available
> Fix For: 2.10.3
> Time Spent: 40m
> Remaining Estimate: 0h
>
> Currently we are using 8.5.75, which is affected by CVE-2022-25762.
> More details: https://lists.apache.org/thread/qzkqh2819x6zsmj7vwdf14ng2fdgckw7
> Let's upgrade to 8.5.76 or higher.
[GitHub] [hadoop] aajisaka commented on pull request #4479: HADOOP-18309. Upgrade bundled Tomcat to 8.5.81
aajisaka commented on PR #4479: URL: https://github.com/apache/hadoop/pull/4479#issuecomment-1165034064 Thank you @ashutoshcipher for the contribution and thank you @iwasakims for testing and merging the PR.
[jira] [Work logged] (HADOOP-18303) Remove shading exclusion of javax.ws.rs-api from hadoop-client-runtime
[ https://issues.apache.org/jira/browse/HADOOP-18303?focusedWorklogId=784376&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784376 ] ASF GitHub Bot logged work on HADOOP-18303: --- Author: ASF GitHub Bot Created on: 23/Jun/22 23:13 Start Date: 23/Jun/22 23:13 Worklog Time Spent: 10m Work Description: ayushtkn commented on PR #4461: URL: https://github.com/apache/hadoop/pull/4461#issuecomment-116433 Yeps, if people let us do that, that jira is already in a released version Issue Time Tracking --- Worklog Id: (was: 784376) Time Spent: 2h 50m (was: 2h 40m) > Remove shading exclusion of javax.ws.rs-api from hadoop-client-runtime > -- > > Key: HADOOP-18303 > URL: https://issues.apache.org/jira/browse/HADOOP-18303 > Project: Hadoop Common > Issue Type: Bug >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Critical > Labels: pull-request-available > Time Spent: 2h 50m > Remaining Estimate: 0h > > As part of HADOOP-18033, we have excluded shading of javax.ws.rs-api from > both hadoop-client-runtime and hadoop-client-minicluster. This has caused > issues for downstream users; see > [https://github.com/apache/incubator-kyuubi/issues/2904] for more discussion. > We should put the shading back in hadoop-client-runtime to fix CNFE issues > for downstream users. > cc [~ayushsaxena] [~pan3793]
[GitHub] [hadoop] ayushtkn commented on pull request #4461: HADOOP-18303. Remove shading exclusion of javax.ws.rs-api from hadoop-client-runtime
ayushtkn commented on PR #4461: URL: https://github.com/apache/hadoop/pull/4461#issuecomment-116433 Yeps, if people let us do that, that jira is already in a released version
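The fix discussed in this thread is to relocate the javax.ws.rs classes again when building hadoop-client-runtime. For readers unfamiliar with how that works, a maven-shade-plugin relocation looks roughly like the fragment below; the shaded prefix shown is an assumption for illustration and should be checked against the actual hadoop-client-runtime pom rather than copied verbatim.

```
<!-- Sketch of a maven-shade-plugin relocation that moves javax.ws.rs
     back under a shaded package; "org.apache.hadoop.shaded" is assumed
     here, verify against the real hadoop-client-runtime pom. -->
<relocations>
  <relocation>
    <pattern>javax.ws.rs</pattern>
    <shadedPattern>org.apache.hadoop.shaded.javax.ws.rs</shadedPattern>
  </relocation>
</relocations>
```

Removing such a relocation is what exposes the unshaded classes (and causes the downstream ClassNotFoundException reports when they are absent from the classpath).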
[jira] [Work logged] (HADOOP-18310) Add option and make 400 bad request retryable
[ https://issues.apache.org/jira/browse/HADOOP-18310?focusedWorklogId=784372&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784372 ] ASF GitHub Bot logged work on HADOOP-18310: --- Author: ASF GitHub Bot Created on: 23/Jun/22 22:46 Start Date: 23/Jun/22 22:46 Worklog Time Spent: 10m Work Description: mukund-thakur commented on PR #4483: URL: https://github.com/apache/hadoop/pull/4483#issuecomment-1164984485 Do we really need to introduce this config? Seems like overkill. I think 400 Bad Request is supposed to be non-retryable. Issue Time Tracking --- Worklog Id: (was: 784372) Time Spent: 1h 50m (was: 1h 40m) > Add option and make 400 bad request retryable > - > > Key: HADOOP-18310 > URL: https://issues.apache.org/jira/browse/HADOOP-18310 > Project: Hadoop Common > Issue Type: Bug > Components: fs/s3 >Affects Versions: 3.3.4 >Reporter: Tak-Lon (Stephen) Wu >Priority: Major > Labels: pull-request-available > Time Spent: 1h 50m > Remaining Estimate: 0h > > When using a customized credential provider via > fs.s3a.aws.credentials.provider, e.g. > org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider, an expired > credential from the pluggable provider causes S3 to return a 400 Bad Request > error. The current S3ARetryPolicy fails immediately and does not retry at > the S3A level. > A recent HBase use case found that this exception could cause a Region > Server to be abandoned immediately, without retry, when the file system is > opening a file or S3AInputStream is reopening one. Especially in the > S3AInputStream cases, there is no good way to retry outside of the file > system semantics (an ongoing stream that fails is considered to be in an > irreparable state), so this patch adds an optional flag for retrying in S3A. > {code} > Caused by: com.amazonaws.services.s3.model.AmazonS3Exception: The provided > token has expired.
(Service: Amazon S3; Status Code: 400; Error Code: > ExpiredToken; Request ID: XYZ; S3 Extended Request ID: ABC; Proxy: null), S3 > Extended Request ID: 123 > at > com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1862) > at > com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleServiceErrorResponse(AmazonHttpClient.java:1415) > at > com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1384) > at > com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1154) > at > com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:811) > at > com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:779) > at > com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:753) > at > com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:713) > at > com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:695) > at > com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:559) > at > com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:539) > at > com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5453) > at > com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5400) > at > com.amazonaws.services.s3.AmazonS3Client.getObject(AmazonS3Client.java:1524) > at > org.apache.hadoop.fs.s3a.S3AFileSystem$InputStreamCallbacksImpl.getObject(S3AFileSystem.java:1506) > at > org.apache.hadoop.fs.s3a.S3AInputStream.lambda$reopen$0(S3AInputStream.java:217) > at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:117) > ... 35 more > {code}
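The policy question debated in this thread, whether a 400 Bad Request should ever be retried, can be illustrated with a generic retry helper. This is a hedged sketch, not Hadoop's actual S3ARetryPolicy: the `StatusException` class, the `withRetries` helper, and its `retry400` flag are all invented for illustration, and real code would add backoff between attempts.

```java
import java.util.Set;
import java.util.function.Supplier;

public class RetryOn400 {

    /** Illustrative exception carrying an HTTP-style status code. */
    static class StatusException extends RuntimeException {
        final int status;
        StatusException(int status, String msg) {
            super(msg);
            this.status = status;
        }
    }

    /**
     * Retry a call up to maxAttempts times. Server-side errors (500/503)
     * are always treated as transient; 400 Bad Request joins the retryable
     * set only when the opt-in flag is enabled, mirroring the idea of
     * giving an expired pluggable credential a chance to refresh.
     */
    static <T> T withRetries(Supplier<T> call, int maxAttempts, boolean retry400) {
        Set<Integer> retryable = retry400 ? Set.of(400, 500, 503) : Set.of(500, 503);
        StatusException last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return call.get();
            } catch (StatusException e) {
                if (!retryable.contains(e.status)) {
                    throw e; // fail fast on non-retryable statuses
                }
                last = e;
            }
        }
        throw last; // attempts exhausted
    }

    public static void main(String[] args) {
        int[] attempts = {0};
        String r = withRetries(() -> {
            if (++attempts[0] < 2) {
                throw new StatusException(400, "The provided token has expired.");
            }
            return "reopened";
        }, 3, true);
        System.out.println(r + " after " + attempts[0] + " attempts");
    }
}
```

With the flag off, a 400 surfaces immediately, which matches the reviewer's point that Bad Request is normally a terminal client error; with it on, the 400 is treated like a transient failure.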
[GitHub] [hadoop] mukund-thakur commented on pull request #4483: HADOOP-18310 Add option and make 400 bad request retryable
mukund-thakur commented on PR #4483: URL: https://github.com/apache/hadoop/pull/4483#issuecomment-1164984485 Do we really need to introduce this config? Seems like overkill. I think 400 Bad Request is supposed to be non-retryable.
[jira] [Work logged] (HADOOP-18306) Warnings should not be shown on cli console when linux user not present on client
[ https://issues.apache.org/jira/browse/HADOOP-18306?focusedWorklogId=784363=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784363 ] ASF GitHub Bot logged work on HADOOP-18306: --- Author: ASF GitHub Bot Created on: 23/Jun/22 22:22 Start Date: 23/Jun/22 22:22 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on PR #4474: URL: https://github.com/apache/hadoop/pull/4474#issuecomment-1164971077 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 55s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 40m 24s | | trunk passed | | +1 :green_heart: | compile | 25m 10s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 21m 37s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 31s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 58s | | trunk passed | | +1 :green_heart: | javadoc | 1m 31s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 1m 4s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 6s | | trunk passed | | +1 :green_heart: | shadedclient | 25m 52s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 6s | | the patch passed | | +1 :green_heart: | compile | 24m 13s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javac | 24m 13s | | the patch passed | | +1 :green_heart: | compile | 21m 45s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 21m 45s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 24s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 56s | | the patch passed | | +1 :green_heart: | javadoc | 1m 22s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 1m 4s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 3s | | the patch passed | | +1 :green_heart: | shadedclient | 25m 37s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 18m 17s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 1m 16s | | The patch does not generate ASF License warnings. 
| | | | 224m 51s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4474/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4474 | | JIRA Issue | HADOOP-18306 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 472d91b4539c 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 370df9b21f54462fd0da45aa1a5cf2c87bbd0757 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4474/2/testReport/ | | Max. process+thread count | 1803 (vs. ulimit of 5500) | | modules | C:
[jira] [Work logged] (HADOOP-18215) Enhance WritableName to be able to return aliases for classes that use serializers
[ https://issues.apache.org/jira/browse/HADOOP-18215?focusedWorklogId=784364=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784364 ] ASF GitHub Bot logged work on HADOOP-18215: --- Author: ASF GitHub Bot Created on: 23/Jun/22 22:22 Start Date: 23/Jun/22 22:22 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on PR #4215: URL: https://github.com/apache/hadoop/pull/4215#issuecomment-1164971314 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 39s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 37m 50s | | trunk passed | | +1 :green_heart: | compile | 23m 15s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 20m 39s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 48s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 14s | | trunk passed | | +1 :green_heart: | javadoc | 1m 36s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 1m 22s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 11s | | trunk passed | | +1 :green_heart: | shadedclient | 23m 34s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 7s | | the patch passed | | +1 :green_heart: | compile | 22m 23s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javac | 22m 23s | | the patch passed | | +1 :green_heart: | compile | 20m 41s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 20m 41s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 39s | | the patch passed | | +1 :green_heart: | mvnsite | 2m 9s | | the patch passed | | +1 :green_heart: | javadoc | 1m 38s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 1m 20s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 8s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 27s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 18m 47s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 1m 35s | | The patch does not generate ASF License warnings. 
| | | | 215m 49s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/6/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4215 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 0cb34f79ee1f 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / adf3de0112c97cb637b72c19aacddb0662bdee4d | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/6/testReport/ | | Max. process+thread count | 2466 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/6/console |
[GitHub] [hadoop] hadoop-yetus commented on pull request #4215: HADOOP-18215. Enhance WritableName to be able to return aliases for classes that use serializers
hadoop-yetus commented on PR #4215: URL: https://github.com/apache/hadoop/pull/4215#issuecomment-1164971314 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 39s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 37m 50s | | trunk passed | | +1 :green_heart: | compile | 23m 15s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 20m 39s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 48s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 14s | | trunk passed | | +1 :green_heart: | javadoc | 1m 36s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 1m 22s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 11s | | trunk passed | | +1 :green_heart: | shadedclient | 23m 34s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 7s | | the patch passed | | +1 :green_heart: | compile | 22m 23s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javac | 22m 23s | | the patch passed | | +1 :green_heart: | compile | 20m 41s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 20m 41s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 39s | | the patch passed | | +1 :green_heart: | mvnsite | 2m 9s | | the patch passed | | +1 :green_heart: | javadoc | 1m 38s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 1m 20s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 8s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 27s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 18m 47s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 1m 35s | | The patch does not generate ASF License warnings. 
| | | | 215m 49s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/6/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4215 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 0cb34f79ee1f 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / adf3de0112c97cb637b72c19aacddb0662bdee4d | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/6/testReport/ | | Max. process+thread count | 2466 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/6/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For
[GitHub] [hadoop] hadoop-yetus commented on pull request #4474: HADOOP-18306: Warnings should not be shown on cli console when linux user not present on client
hadoop-yetus commented on PR #4474: URL: https://github.com/apache/hadoop/pull/4474#issuecomment-1164971077 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 55s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 40m 24s | | trunk passed | | +1 :green_heart: | compile | 25m 10s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 21m 37s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 31s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 58s | | trunk passed | | +1 :green_heart: | javadoc | 1m 31s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 1m 4s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 6s | | trunk passed | | +1 :green_heart: | shadedclient | 25m 52s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 6s | | the patch passed | | +1 :green_heart: | compile | 24m 13s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javac | 24m 13s | | the patch passed | | +1 :green_heart: | compile | 21m 45s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 21m 45s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 24s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 56s | | the patch passed | | +1 :green_heart: | javadoc | 1m 22s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 1m 4s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 3s | | the patch passed | | +1 :green_heart: | shadedclient | 25m 37s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 18m 17s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 1m 16s | | The patch does not generate ASF License warnings. 
| | | | 224m 51s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4474/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4474 | | JIRA Issue | HADOOP-18306 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 472d91b4539c 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 370df9b21f54462fd0da45aa1a5cf2c87bbd0757 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4474/2/testReport/ | | Max. process+thread count | 1803 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4474/2/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the
[GitHub] [hadoop] slfan1989 commented on pull request #4464: YARN-11169. Support moveApplicationAcrossQueues, getQueueInfo API's for Federation.
slfan1989 commented on PR #4464: URL: https://github.com/apache/hadoop/pull/4464#issuecomment-1164896098 @goiri Please help me to review the code. Thank you very much!
[GitHub] [hadoop] hadoop-yetus commented on pull request #4155: HDFS-16533. COMPOSITE_CRC failed between replicated file and striped …
hadoop-yetus commented on PR #4155: URL: https://github.com/apache/hadoop/pull/4155#issuecomment-1164882298 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 39s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 52s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 25m 59s | | trunk passed | | +1 :green_heart: | compile | 6m 17s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 6m 0s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 37s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 9s | | trunk passed | | +1 :green_heart: | javadoc | 2m 27s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 2m 52s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 6m 30s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 55s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 31s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 2m 20s | | the patch passed | | +1 :green_heart: | compile | 6m 0s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javac | 6m 0s | | the patch passed | | +1 :green_heart: | compile | 5m 44s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 5m 44s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 15s | | the patch passed | | +1 :green_heart: | mvnsite | 2m 30s | | the patch passed | | +1 :green_heart: | javadoc | 1m 46s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 2m 19s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 6m 9s | | the patch passed | | +1 :green_heart: | shadedclient | 22m 37s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 2m 40s | | hadoop-hdfs-client in the patch passed. | | +1 :green_heart: | unit | 237m 20s | | hadoop-hdfs in the patch passed. | | +1 :green_heart: | asflicense | 1m 17s | | The patch does not generate ASF License warnings. 
| | | | 384m 46s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4155/4/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4155 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 7bdf891ec394 4.15.0-156-generic #163-Ubuntu SMP Thu Aug 19 23:31:58 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 91fd09d2c3759c8cbdb376a2caa22f64ddd41875 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4155/4/testReport/ | | Max. process+thread count | 3092 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs-client hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4155/4/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was
[jira] [Work logged] (HADOOP-18303) Remove shading exclusion of javax.ws.rs-api from hadoop-client-runtime
[ https://issues.apache.org/jira/browse/HADOOP-18303?focusedWorklogId=784342&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784342 ] ASF GitHub Bot logged work on HADOOP-18303: --- Author: ASF GitHub Bot Created on: 23/Jun/22 19:37 Start Date: 23/Jun/22 19:37 Worklog Time Spent: 10m Work Description: sunchao commented on PR #4461: URL: https://github.com/apache/hadoop/pull/4461#issuecomment-1164795939 In that case should we consider reverting it for now until we are ready to upgrade both jersey and jackson together at some point? Issue Time Tracking --- Worklog Id: (was: 784342) Time Spent: 2h 40m (was: 2.5h) > Remove shading exclusion of javax.ws.rs-api from hadoop-client-runtime > -- > > Key: HADOOP-18303 > URL: https://issues.apache.org/jira/browse/HADOOP-18303 > Project: Hadoop Common > Issue Type: Bug >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Critical > Labels: pull-request-available > Time Spent: 2h 40m > Remaining Estimate: 0h > > As part of HADOOP-18033, we have excluded shading of javax.ws.rs-api from > both hadoop-client-runtime and hadoop-client-minicluster. This has caused > issues for downstream users; see > [https://github.com/apache/incubator-kyuubi/issues/2904] for more discussion. > We should put the shading back in hadoop-client-runtime to fix CNFE issues > for downstream users. > cc [~ayushsaxena] [~pan3793]
[GitHub] [hadoop] sunchao commented on pull request #4461: HADOOP-18303. Remove shading exclusion of javax.ws.rs-api from hadoop-client-runtime
sunchao commented on PR #4461: URL: https://github.com/apache/hadoop/pull/4461#issuecomment-1164795939 In that case should we consider reverting it for now until we are ready to upgrade both jersey and jackson together at some point?
[jira] [Work logged] (HADOOP-18044) Hadoop - Upgrade to JQuery 3.6.0
[ https://issues.apache.org/jira/browse/HADOOP-18044?focusedWorklogId=784341=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784341 ] ASF GitHub Bot logged work on HADOOP-18044: --- Author: ASF GitHub Bot Created on: 23/Jun/22 19:23 Start Date: 23/Jun/22 19:23 Worklog Time Spent: 10m Work Description: steveloughran commented on PR #4495: URL: https://github.com/apache/hadoop/pull/4495#issuecomment-1164783134 merged. doesn't need yetus to validate it fully as its a cherrypick and i'd verified this branch built correctly Issue Time Tracking --- Worklog Id: (was: 784341) Time Spent: 2h 10m (was: 2h) > Hadoop - Upgrade to JQuery 3.6.0 > > > Key: HADOOP-18044 > URL: https://issues.apache.org/jira/browse/HADOOP-18044 > Project: Hadoop Common > Issue Type: Improvement >Reporter: Yuan Luo >Assignee: Yuan Luo >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0, 3.3.4 > > Time Spent: 2h 10m > Remaining Estimate: 0h > > jQuery 3.6.0 has been released few months ago - > [http://blog.jquery.com/2021/03/02/jquery-3-6-0-released/ > |http://blog.jquery.com/2021/03/02/jquery-3-6-0-released/,] > We can upgrade jquery-3.5.1.min.js to jquery-3.6.0.min.js in hadoop project.
[GitHub] [hadoop] steveloughran commented on pull request #4495: HADOOP-18044. Hadoop - Upgrade to jQuery 3.6.0 (#3791)
steveloughran commented on PR #4495: URL: https://github.com/apache/hadoop/pull/4495#issuecomment-1164783134 merged. doesn't need yetus to validate it fully as its a cherrypick and i'd verified this branch built correctly
[jira] [Work logged] (HADOOP-18044) Hadoop - Upgrade to JQuery 3.6.0
[ https://issues.apache.org/jira/browse/HADOOP-18044?focusedWorklogId=784340=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784340 ] ASF GitHub Bot logged work on HADOOP-18044: --- Author: ASF GitHub Bot Created on: 23/Jun/22 19:22 Start Date: 23/Jun/22 19:22 Worklog Time Spent: 10m Work Description: steveloughran merged PR #4495: URL: https://github.com/apache/hadoop/pull/4495 Issue Time Tracking --- Worklog Id: (was: 784340) Time Spent: 2h (was: 1h 50m) > Hadoop - Upgrade to JQuery 3.6.0 > > > Key: HADOOP-18044 > URL: https://issues.apache.org/jira/browse/HADOOP-18044 > Project: Hadoop Common > Issue Type: Improvement >Reporter: Yuan Luo >Assignee: Yuan Luo >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0, 3.3.4 > > Time Spent: 2h > Remaining Estimate: 0h > > jQuery 3.6.0 has been released few months ago - > [http://blog.jquery.com/2021/03/02/jquery-3-6-0-released/ > |http://blog.jquery.com/2021/03/02/jquery-3-6-0-released/,] > We can upgrade jquery-3.5.1.min.js to jquery-3.6.0.min.js in hadoop project.
[GitHub] [hadoop] steveloughran merged pull request #4495: HADOOP-18044. Hadoop - Upgrade to jQuery 3.6.0 (#3791)
steveloughran merged PR #4495: URL: https://github.com/apache/hadoop/pull/4495
[jira] [Work logged] (HADOOP-18215) Enhance WritableName to be able to return aliases for classes that use serializers
[ https://issues.apache.org/jira/browse/HADOOP-18215?focusedWorklogId=784339=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784339 ] ASF GitHub Bot logged work on HADOOP-18215: --- Author: ASF GitHub Bot Created on: 23/Jun/22 19:21 Start Date: 23/Jun/22 19:21 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on PR #4215: URL: https://github.com/apache/hadoop/pull/4215#issuecomment-1164781759 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 44s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 38m 41s | | trunk passed | | +1 :green_heart: | compile | 23m 6s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 21m 1s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 17s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 41s | | trunk passed | | +1 :green_heart: | javadoc | 1m 12s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 0m 46s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 44s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 33s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 59s | | the patch passed | | +1 :green_heart: | compile | 23m 18s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javac | 23m 18s | | the patch passed | | +1 :green_heart: | compile | 21m 8s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 21m 8s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 9s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 39s | | the patch passed | | +1 :green_heart: | javadoc | 1m 4s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 0m 49s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 37s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 7s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 18m 35s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 1m 6s | | The patch does not generate ASF License warnings. 
| | | | 209m 44s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/5/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4215 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 85b4da9d0004 4.15.0-169-generic #177-Ubuntu SMP Thu Feb 3 10:50:38 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 52fc693a714ad62aff9bc4b729f2f52200553786 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/5/testReport/ | | Max. process+thread count | 1549 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/5/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated.
[jira] [Work logged] (HADOOP-18303) Remove shading exclusion of javax.ws.rs-api from hadoop-client-runtime
[ https://issues.apache.org/jira/browse/HADOOP-18303?focusedWorklogId=784338=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784338 ] ASF GitHub Bot logged work on HADOOP-18303: --- Author: ASF GitHub Bot Created on: 23/Jun/22 19:21 Start Date: 23/Jun/22 19:21 Worklog Time Spent: 10m Work Description: ayushtkn commented on PR #4461: URL: https://github.com/apache/hadoop/pull/4461#issuecomment-1164781602 Both shading and the issue that I am talking about are due to HADOOP-18033 Jackson upgrade added a rs-api jar which is not shaded and which all cause conflicts in Tez jsr311 jar already present Issue Time Tracking --- Worklog Id: (was: 784338) Time Spent: 2.5h (was: 2h 20m) > Remove shading exclusion of javax.ws.rs-api from hadoop-client-runtime > -- > > Key: HADOOP-18303 > URL: https://issues.apache.org/jira/browse/HADOOP-18303 > Project: Hadoop Common > Issue Type: Bug >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Critical > Labels: pull-request-available > Time Spent: 2.5h > Remaining Estimate: 0h > > As part of HADOOP-18033, we have excluded shading of javax.ws.rs-api from > both hadoop-client-runtime and hadoop-client-minicluster. This has caused > issues for downstreamers e.g. > [https://github.com/apache/incubator-kyuubi/issues/2904], more discussions. > We should put the shading back in hadoop-client-runtime to fix CNFE issues > for downstreamers. > cc [~ayushsaxena] [~pan3793]
[GitHub] [hadoop] hadoop-yetus commented on pull request #4215: HADOOP-18215. Enhance WritableName to be able to return aliases for classes that use serializers
hadoop-yetus commented on PR #4215: URL: https://github.com/apache/hadoop/pull/4215#issuecomment-1164781759 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 44s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 38m 41s | | trunk passed | | +1 :green_heart: | compile | 23m 6s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 21m 1s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 17s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 41s | | trunk passed | | +1 :green_heart: | javadoc | 1m 12s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 0m 46s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 44s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 33s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 59s | | the patch passed | | +1 :green_heart: | compile | 23m 18s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javac | 23m 18s | | the patch passed | | +1 :green_heart: | compile | 21m 8s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 21m 8s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 9s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 39s | | the patch passed | | +1 :green_heart: | javadoc | 1m 4s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 0m 49s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 37s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 7s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 18m 35s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 1m 6s | | The patch does not generate ASF License warnings. 
| | | | 209m 44s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/5/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4215 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 85b4da9d0004 4.15.0-169-generic #177-Ubuntu SMP Thu Feb 3 10:50:38 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 52fc693a714ad62aff9bc4b729f2f52200553786 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/5/testReport/ | | Max. process+thread count | 1549 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/5/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated.
[GitHub] [hadoop] ayushtkn commented on pull request #4461: HADOOP-18303. Remove shading exclusion of javax.ws.rs-api from hadoop-client-runtime
ayushtkn commented on PR #4461: URL: https://github.com/apache/hadoop/pull/4461#issuecomment-1164781602 Both shading and the issue that I am talking about are due to HADOOP-18033 Jackson upgrade added a rs-api jar which is not shaded and which all cause conflicts in Tez jsr311 jar already present
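[Editor's note] The failure mode ayushtkn describes is two jars (the unshaded javax.ws.rs-api jar and Tez's jsr311 jar) both providing `javax.ws.rs.*` classes on the same classpath. That kind of collision can be confirmed mechanically; the following standalone sketch (not part of Hadoop, jar paths are whatever you pass on the command line) lists every class file that appears in more than one jar:

```python
import sys
import zipfile
from collections import defaultdict

def duplicate_classes(jar_paths):
    """Return {class entry: [jars]} for .class entries found in 2+ jars."""
    owners = defaultdict(list)
    for jar in jar_paths:
        with zipfile.ZipFile(jar) as zf:
            for name in zf.namelist():
                if name.endswith(".class"):
                    owners[name].append(jar)
    return {cls: jars for cls, jars in owners.items() if len(jars) > 1}

if __name__ == "__main__":
    # e.g. python dupcheck.py javax.ws.rs-api-2.1.1.jar jsr311-api-1.1.1.jar
    for cls, jars in sorted(duplicate_classes(sys.argv[1:]).items()):
        print(f"{cls} provided by: {', '.join(jars)}")
```

Which copy the JVM loads then depends on classpath order, which is why such conflicts surface intermittently downstream.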
[jira] [Work logged] (HADOOP-18311) Upgrade dependencies to address several CVEs
[ https://issues.apache.org/jira/browse/HADOOP-18311?focusedWorklogId=784335=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784335 ] ASF GitHub Bot logged work on HADOOP-18311: --- Author: ASF GitHub Bot Created on: 23/Jun/22 18:59 Start Date: 23/Jun/22 18:59 Worklog Time Spent: 10m Work Description: steveloughran commented on PR #4491: URL: https://github.com/apache/hadoop/pull/4491#issuecomment-1164763274 sorry, i confused jetty with jersey. don't know how jetty is on branch 3.3. it is not quite as bad as that jersey thing. Issue Time Tracking --- Worklog Id: (was: 784335) Time Spent: 1h 10m (was: 1h) > Upgrade dependencies to address several CVEs > > > Key: HADOOP-18311 > URL: https://issues.apache.org/jira/browse/HADOOP-18311 > Project: Hadoop Common > Issue Type: Improvement > Components: common >Affects Versions: 3.3.3, 3.3.4 >Reporter: Steve Vaughan >Priority: Major > Labels: pull-request-available > Fix For: 3.3.4 > > Time Spent: 1h 10m > Remaining Estimate: 0h > > The following CVEs can be addressed by upgrading dependencies within the > build. This includes a replacement of HTrace with a noop implementation. > * CVE-2018-7489 > * CVE-2020-10663 > * CVE-2020-28491 > * CVE-2020-35490 > * CVE-2020-35491 > * CVE-2020-36518 > * PRISMA-2021-0182 > This addresses all of the CVEs from 3.3.3 except for ones that would require > upgrading Netty to 4.x. I'll be submitting a pull request for 3.3.4.
[GitHub] [hadoop] steveloughran commented on pull request #4491: HADOOP-18311. Upgrade dependencies to address several CVEs
steveloughran commented on PR #4491: URL: https://github.com/apache/hadoop/pull/4491#issuecomment-1164763274 sorry, i confused jetty with jersey. don't know how jetty is on branch 3.3. it is not quite as bad as that jersey thing.
[jira] [Work logged] (HADOOP-18303) Remove shading exclusion of javax.ws.rs-api from hadoop-client-runtime
[ https://issues.apache.org/jira/browse/HADOOP-18303?focusedWorklogId=784332=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784332 ] ASF GitHub Bot logged work on HADOOP-18303: --- Author: ASF GitHub Bot Created on: 23/Jun/22 18:47 Start Date: 23/Jun/22 18:47 Worklog Time Spent: 10m Work Description: sunchao commented on PR #4461: URL: https://github.com/apache/hadoop/pull/4461#issuecomment-1164752328 > @sunchao It doesn't, that is what I was saying, even if we sort the shading issue, the Tez issue is gonna stay because it doesn't use the shaded Jar and gets both these jars in classpath and things mess up. I see. This looks like a separate issue from this PR, is that correct? is this because of https://issues.apache.org/jira/browse/HADOOP-18033? The issue seems tricky though. I wonder if we can shade jersey 2 in `hadoop-thirdparty` and then update Hadoop to use that. Issue Time Tracking --- Worklog Id: (was: 784332) Time Spent: 2h 20m (was: 2h 10m) > Remove shading exclusion of javax.ws.rs-api from hadoop-client-runtime > -- > > Key: HADOOP-18303 > URL: https://issues.apache.org/jira/browse/HADOOP-18303 > Project: Hadoop Common > Issue Type: Bug >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Critical > Labels: pull-request-available > Time Spent: 2h 20m > Remaining Estimate: 0h > > As part of HADOOP-18033, we have excluded shading of javax.ws.rs-api from > both hadoop-client-runtime and hadoop-client-minicluster. This has caused > issues for downstreamers e.g. > [https://github.com/apache/incubator-kyuubi/issues/2904], more discussions. > We should put the shading back in hadoop-client-runtime to fix CNFE issues > for downstreamers. > cc [~ayushsaxena] [~pan3793]
[GitHub] [hadoop] sunchao commented on pull request #4461: HADOOP-18303. Remove shading exclusion of javax.ws.rs-api from hadoop-client-runtime
sunchao commented on PR #4461: URL: https://github.com/apache/hadoop/pull/4461#issuecomment-1164752328 > @sunchao It doesn't, that is what I was saying, even if we sort the shading issue, the Tez issue is gonna stay because it doesn't use the shaded Jar and gets both these jars in classpath and things mess up. I see. This looks like a separate issue from this PR, is that correct? is this because of https://issues.apache.org/jira/browse/HADOOP-18033? The issue seems tricky though. I wonder if we can shade jersey 2 in `hadoop-thirdparty` and then update Hadoop to use that.
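[Editor's note] The "shading exclusion" discussed in this thread is a maven-shade-plugin artifactSet filter. The fragment below is a hedged illustration of the mechanism only, not a verbatim copy of the hadoop-client-runtime pom: an exclusion like this keeps javax.ws.rs-api out of the shaded uber-jar, and removing the `<exclude>` re-bundles those classes, which is what avoids the downstream ClassNotFoundException.

```xml
<!-- Illustrative maven-shade-plugin fragment (assumed shape, not Hadoop's
     actual pom). Deleting the <exclude> below puts javax.ws.rs-api back
     into the shaded client jar. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <artifactSet>
      <excludes>
        <exclude>javax.ws.rs:javax.ws.rs-api</exclude>
      </excludes>
    </artifactSet>
  </configuration>
</plugin>
```

Consumers of the unshaded Hadoop artifacts (such as Tez, per the comment above) are unaffected by this knob, which is why the classpath conflict there has to be resolved separately.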
[jira] [Work logged] (HADOOP-18215) Enhance WritableName to be able to return aliases for classes that use serializers
[ https://issues.apache.org/jira/browse/HADOOP-18215?focusedWorklogId=784308=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784308 ] ASF GitHub Bot logged work on HADOOP-18215: --- Author: ASF GitHub Bot Created on: 23/Jun/22 16:49 Start Date: 23/Jun/22 16:49 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on PR #4215: URL: https://github.com/apache/hadoop/pull/4215#issuecomment-1164645318 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 49s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | -1 :x: | mvninstall | 37m 16s | [/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/branch-mvninstall-root.txt) | root in trunk failed. | | -1 :x: | compile | 0m 54s | [/branch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/branch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | root in trunk failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. | | -1 :x: | compile | 0m 54s | [/branch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/branch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | root in trunk failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. 
| | -0 :warning: | checkstyle | 0m 51s | [/buildtool-branch-checkstyle-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/buildtool-branch-checkstyle-hadoop-common-project_hadoop-common.txt) | The patch fails to run checkstyle in hadoop-common | | -1 :x: | mvnsite | 0m 51s | [/branch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in trunk failed. | | -1 :x: | javadoc | 0m 58s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-common in trunk failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. | | -1 :x: | javadoc | 0m 47s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-common in trunk failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. | | -1 :x: | spotbugs | 0m 52s | [/branch-spotbugs-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/branch-spotbugs-hadoop-common-project_hadoop-common.txt) | hadoop-common in trunk failed. | | +1 :green_heart: | shadedclient | 6m 30s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | -1 :x: | mvninstall | 0m 33s | [/patch-mvninstall-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/patch-mvninstall-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | -1 :x: | compile | 0m 33s | [/patch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/patch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | root in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. | | -1 :x: | javac | 0m 33s |
[GitHub] [hadoop] hadoop-yetus commented on pull request #4215: HADOOP-18215. Enhance WritableName to be able to return aliases for classes that use serializers
hadoop-yetus commented on PR #4215: URL: https://github.com/apache/hadoop/pull/4215#issuecomment-1164645318 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 49s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | -1 :x: | mvninstall | 37m 16s | [/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/branch-mvninstall-root.txt) | root in trunk failed. | | -1 :x: | compile | 0m 54s | [/branch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/branch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | root in trunk failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. | | -1 :x: | compile | 0m 54s | [/branch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/branch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | root in trunk failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. 
| | -0 :warning: | checkstyle | 0m 51s | [/buildtool-branch-checkstyle-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/buildtool-branch-checkstyle-hadoop-common-project_hadoop-common.txt) | The patch fails to run checkstyle in hadoop-common | | -1 :x: | mvnsite | 0m 51s | [/branch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in trunk failed. | | -1 :x: | javadoc | 0m 58s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-common in trunk failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. | | -1 :x: | javadoc | 0m 47s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-common in trunk failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. | | -1 :x: | spotbugs | 0m 52s | [/branch-spotbugs-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/branch-spotbugs-hadoop-common-project_hadoop-common.txt) | hadoop-common in trunk failed. | | +1 :green_heart: | shadedclient | 6m 30s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | -1 :x: | mvninstall | 0m 33s | [/patch-mvninstall-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/patch-mvninstall-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | -1 :x: | compile | 0m 33s | [/patch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/patch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | root in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. | | -1 :x: | javac | 0m 33s | [/patch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/patch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | root in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. | | -1 :x: | compile | 0m 33s | [/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4215/4/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | root in the
[GitHub] [hadoop] hadoop-yetus commented on pull request #4450: YARN-11183. Federation: Remove outdated ApplicationHomeSubCluster in …
hadoop-yetus commented on PR #4450: URL: https://github.com/apache/hadoop/pull/4450#issuecomment-1164643178 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 15s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +0 :ok: | buf | 0m 0s | | buf was not available. | | +0 :ok: | buf | 0m 0s | | buf was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 4 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 42s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 27m 52s | | trunk passed | | +1 :green_heart: | compile | 11m 6s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 9m 12s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 2m 5s | | trunk passed | | +1 :green_heart: | mvnsite | 8m 16s | | trunk passed | | +1 :green_heart: | javadoc | 6m 15s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 5m 39s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 19m 31s | | trunk passed | | +1 :green_heart: | shadedclient | 23m 18s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 32s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 7m 32s | | the patch passed | | +1 :green_heart: | compile | 10m 37s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | cc | 10m 37s | | the patch passed | | +1 :green_heart: | javac | 10m 37s | | the patch passed | | +1 :green_heart: | compile | 9m 1s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | cc | 9m 1s | | the patch passed | | +1 :green_heart: | javac | 9m 1s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 1m 43s | [/results-checkstyle-hadoop-yarn-project_hadoop-yarn.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4450/3/artifact/out/results-checkstyle-hadoop-yarn-project_hadoop-yarn.txt) | hadoop-yarn-project/hadoop-yarn: The patch generated 1 new + 87 unchanged - 0 fixed = 88 total (was 87) | | +1 :green_heart: | mvnsite | 7m 39s | | the patch passed | | +1 :green_heart: | javadoc | 5m 30s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 4m 57s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 19m 30s | | the patch passed | | +1 :green_heart: | shadedclient | 21m 39s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 234m 53s | [/patch-unit-hadoop-yarn-project_hadoop-yarn.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4450/3/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn.txt) | hadoop-yarn in the patch passed. | | +1 :green_heart: | unit | 5m 46s | | hadoop-yarn-common in the patch passed. | | +1 :green_heart: | unit | 3m 51s | | hadoop-yarn-server-common in the patch passed. 
| | -1 :x: | unit | 60m 22s | [/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4450/3/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt) | hadoop-yarn-server-resourcemanager in the patch passed. | | -1 :x: | asflicense | 1m 36s | [/results-asflicense.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4450/3/artifact/out/results-asflicense.txt) | The patch generated 12 ASF License warnings. | | | | 525m 53s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.yarn.server.resourcemanager.webapp.TestRMWebServicesReservation | | |
[GitHub] [hadoop] steveloughran commented on a diff in pull request #4248: MAPREDUCE-7370. Parallelize MultipleOutputs#close call
steveloughran commented on code in PR #4248: URL: https://github.com/apache/hadoop/pull/4248#discussion_r905241976

## hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/lib/output/MultipleOutputs.java:

@@ -570,8 +570,14 @@ public void setStatus(String status) {
    */
   @SuppressWarnings("unchecked")
   public void close() throws IOException, InterruptedException {
-    for (RecordWriter writer : recordWriters.values()) {
-      writer.close(context);
-    }
+    recordWriters.values().parallelStream().forEach(writer -> {

Review Comment: This is probably true. Looking at IOUtils, our own cleanupWithLogger() method catches all throwables, but closeSocket() only swallows IOEs. We must assume a lot of other code is similar, so raised IOEs must stay as IOEs.

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
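The constraint raised in the review (a checked IOException cannot escape a Stream lambda directly, yet callers must still see an IOException) is commonly handled by wrapping inside the lambda and unwrapping at the call site. A minimal sketch of that pattern, with hypothetical names rather than the actual patch in PR #4248:

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.List;

public class ParallelCloseSketch {
  /** Hypothetical stand-in for the RecordWriter being closed. */
  public interface Writer {
    void close() throws IOException;
  }

  /** Closes all writers in parallel while letting IOExceptions surface as IOExceptions. */
  public static void closeAll(List<Writer> writers) throws IOException {
    try {
      writers.parallelStream().forEach(w -> {
        try {
          w.close();
        } catch (IOException e) {
          // A checked exception cannot cross the lambda boundary, so wrap it...
          throw new UncheckedIOException(e);
        }
      });
    } catch (UncheckedIOException e) {
      // ...and unwrap it here so callers still get a plain IOException.
      throw e.getCause();
    }
  }
}
```

Note that with a parallel stream only one failure is propagated to the caller; other failures from concurrently closing writers are lost unless collected explicitly.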
[GitHub] [hadoop] PrabhuJoseph merged pull request #4487: YARN-9874.Remove unnecessary LevelDb write call in LeveldbConfigurationStore#confirmMutation
PrabhuJoseph merged PR #4487: URL: https://github.com/apache/hadoop/pull/4487 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [hadoop] PrabhuJoseph merged pull request #4486: YARN-10320.Replace FSDataInputStream#read with readFully in Log Aggregation
PrabhuJoseph merged PR #4486: URL: https://github.com/apache/hadoop/pull/4486 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [hadoop] PrabhuJoseph commented on pull request #4486: YARN-10320.Replace FSDataInputStream#read with readFully in Log Aggregation
PrabhuJoseph commented on PR #4486: URL: https://github.com/apache/hadoop/pull/4486#issuecomment-1164625426 Latest Patch looks good, +1. Will commit it. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
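The motivation behind YARN-10320 is the read/readFully contract: InputStream.read(byte[], int, int) may return fewer bytes than requested, so a single call can silently under-fill the buffer, whereas readFully loops until the buffer is filled and fails loudly on early end-of-stream. A minimal sketch of that contract (a hypothetical helper, not Hadoop's FSDataInputStream implementation):

```java
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

public class ReadFullySketch {
  /** Reads exactly len bytes into buf at off, or throws EOFException. */
  public static void readFully(InputStream in, byte[] buf, int off, int len)
      throws IOException {
    int total = 0;
    while (total < len) {
      // A single read() is allowed to return fewer bytes than requested,
      // so keep reading until the requested range is filled.
      int n = in.read(buf, off + total, len - total);
      if (n < 0) {
        throw new EOFException("stream ended after " + total + " of " + len + " bytes");
      }
      total += n;
    }
  }
}
```

Code that assumes a single read() fills the buffer works by accident on some streams and breaks on others, which is exactly the class of bug the patch removes from log aggregation.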
[GitHub] [hadoop] hadoop-yetus commented on pull request #4464: YARN-11169. Support moveApplicationAcrossQueues, getQueueInfo API's for Federation.
hadoop-yetus commented on PR #4464: URL: https://github.com/apache/hadoop/pull/4464#issuecomment-1164612891 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 1s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 1s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 3 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 37m 28s | | trunk passed | | +1 :green_heart: | compile | 0m 50s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 0m 47s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 0m 48s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 53s | | trunk passed | | +1 :green_heart: | javadoc | 0m 57s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 0m 46s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 1m 20s | | trunk passed | | +1 :green_heart: | shadedclient | 20m 41s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 32s | | the patch passed | | +1 :green_heart: | compile | 0m 32s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javac | 0m 32s | | the patch passed | | +1 :green_heart: | compile | 0m 29s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 0m 29s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 0m 24s | [/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4464/6/artifact/out/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router.txt) | hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router: The patch generated 3 new + 0 unchanged - 0 fixed = 3 total (was 0) | | +1 :green_heart: | mvnsite | 0m 33s | | the patch passed | | +1 :green_heart: | javadoc | 0m 28s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 0m 27s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 0m 58s | | the patch passed | | +1 :green_heart: | shadedclient | 20m 13s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 3m 12s | | hadoop-yarn-server-router in the patch passed. | | +1 :green_heart: | asflicense | 0m 52s | | The patch does not generate ASF License warnings. 
| | | | 96m 11s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4464/6/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4464 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint | | uname | Linux a73b32a6a271 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / ffc67ce3294a4fc52ebaf3d766a9e50020bbdd9b | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4464/6/testReport/ | | Max. process+thread count | 1359 (vs. ulimit of 5500) | | modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-router U:
[GitHub] [hadoop] hadoop-yetus commented on pull request #4493: [Do not commit] Testing cross platform builds
hadoop-yetus commented on PR #4493: URL: https://github.com/apache/hadoop/pull/4493#issuecomment-1164571030 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 53s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 21m 33s | | trunk passed | | +1 :green_heart: | compile | 4m 3s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 4m 3s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | mvnsite | 0m 48s | | trunk passed | | +1 :green_heart: | shadedclient | 50m 11s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 21s | | the patch passed | | +1 :green_heart: | compile | 4m 15s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | cc | 4m 15s | | the patch passed | | +1 :green_heart: | golang | 4m 15s | | the patch passed | | +1 :green_heart: | javac | 4m 15s | | the patch passed | | +1 :green_heart: | compile | 4m 9s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | cc | 4m 9s | | the patch passed | | +1 :green_heart: | golang | 4m 9s | | the patch passed | | +1 :green_heart: | javac | 4m 9s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. 
| | +1 :green_heart: | mvnsite | 0m 28s | | the patch passed | | +1 :green_heart: | shadedclient | 21m 18s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 33m 3s | | hadoop-hdfs-native-client in the patch passed. | | +1 :green_heart: | asflicense | 0m 49s | | The patch does not generate ASF License warnings. | | | | 117m 49s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4493/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4493 | | Optional Tests | dupname asflicense compile cc mvnsite javac unit codespell detsecrets golang | | uname | Linux f6b9c1a52ebe 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 58248bea83dc91b5a2e2c64941acbcb6a058514d | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4493/1/testReport/ | | Max. process+thread count | 558 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: hadoop-hdfs-project/hadoop-hdfs-native-client | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4493/1/console | | versions | git=2.25.1 maven=3.6.3 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[jira] [Commented] (HADOOP-18246) Remove lower limit on s3a prefetching/caching block size
[ https://issues.apache.org/jira/browse/HADOOP-18246?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17558139#comment-17558139 ] Daniel Carl Jones commented on HADOOP-18246: I think it'd be reasonable to allow a block size as small as 1 byte - that may not be performant or safe for production, but it is up to the user's discretion.

> Remove lower limit on s3a prefetching/caching block size
>
> Key: HADOOP-18246
> URL: https://issues.apache.org/jira/browse/HADOOP-18246
> Project: Hadoop Common
> Issue Type: Sub-task
> Reporter: Daniel Carl Jones
> Assignee: Daniel Carl Jones
> Priority: Minor
>
> The minimum allowed block size currently is {{PREFETCH_BLOCK_DEFAULT_SIZE}} (8MB).
> {code:java}
> this.prefetchBlockSize = intOption(
>     conf, PREFETCH_BLOCK_SIZE_KEY,
>     PREFETCH_BLOCK_DEFAULT_SIZE, PREFETCH_BLOCK_DEFAULT_SIZE);
> {code}
> [https://github.com/apache/hadoop/blob/3aa03e0eb95bbcb066144706e06509f0e0549196/hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3AFileSystem.java#L487-L488]
> Why is this the case and should we lower or remove it?

-- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
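The quoted snippet passes the same constant as both the default and the minimum, which is what makes 8MB the effective floor. A hypothetical sketch of what an intOption-style helper with a minimum check can look like, using a plain Map in place of Hadoop's Configuration; the real S3A helper may validate or clamp differently:

```java
import java.util.Map;

public class IntOptionSketch {
  /**
   * Reads an int option from a key/value config, falling back to defVal
   * when the key is absent, and rejecting values below min.
   */
  public static int intOption(Map<String, String> conf, String key,
                              int defVal, int min) {
    String raw = conf.get(key);
    int v = (raw == null) ? defVal : Integer.parseInt(raw.trim());
    if (v < min) {
      // With defVal == min, as in the quoted snippet, this makes the
      // default the lowest value a user can configure.
      throw new IllegalArgumentException(
          "Value of " + key + " must be at least " + min + ", got " + v);
    }
    return v;
  }
}
```

Removing the lower limit, as the ticket proposes, amounts to passing a smaller min (down to 1) rather than reusing the default.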
[jira] [Assigned] (HADOOP-18246) Remove lower limit on s3a prefetching/caching block size
[ https://issues.apache.org/jira/browse/HADOOP-18246?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Daniel Carl Jones reassigned HADOOP-18246: -- Assignee: Daniel Carl Jones
[GitHub] [hadoop] ZanderXu commented on pull request #4480: HDFS-16638. Add isDebugEnabled check for debug blockLogs in BlockManager
ZanderXu commented on PR #4480: URL: https://github.com/apache/hadoop/pull/4480#issuecomment-1164537396 You can also refer to [logging_performance](https://www.slf4j.org/faq.html#logging_performance). -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [hadoop] ZanderXu commented on pull request #4480: HDFS-16638. Add isDebugEnabled check for debug blockLogs in BlockManager
ZanderXu commented on PR #4480: URL: https://github.com/apache/hadoop/pull/4480#issuecomment-1164531835 Thanks @cxzl25 for your comment. [HDFS-14103](https://issues.apache.org/jira/browse/HDFS-14103) gets rid of LOG.isDebugEnabled() calls by using the SLF4J API; can you take a look? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
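For context on the discussion above: with SLF4J's parameterized {} logging, argument formatting is deferred until the level is known to be enabled, so an explicit isDebugEnabled() guard is only worthwhile when building an argument is itself expensive. The same lazy-evaluation idea can be shown self-contained with java.util.logging's Supplier overloads from the JDK (a sketch illustrating the principle, not the Hadoop BlockManager code):

```java
import java.util.function.Supplier;
import java.util.logging.Level;
import java.util.logging.Logger;

public class LazyLoggingSketch {
  private static final Logger LOG = Logger.getLogger("LazyLoggingSketch");
  static int expensiveCalls = 0;

  /** Pretend-expensive message construction; counts how often it runs. */
  static String expensiveDescription() {
    expensiveCalls++;
    return "block report with many entries";
  }

  /** Returns how many times the message supplier was actually evaluated. */
  public static int run() {
    Supplier<String> msg = () -> "processing " + expensiveDescription();
    LOG.setLevel(Level.INFO);  // FINE (the debug-equivalent level) is disabled
    LOG.fine(msg);             // level check happens first: supplier never runs
    LOG.setLevel(Level.FINE);  // now enable the debug-equivalent level
    LOG.fine(msg);             // supplier evaluated exactly once
    return expensiveCalls;
  }

  public static void main(String[] args) {
    System.out.println(run());
  }
}
```

The logger performs the level check before invoking the supplier, so the cost of building the message is only paid when it will actually be logged.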
[jira] [Work logged] (HADOOP-13126) Add Brotli compression codec
[ https://issues.apache.org/jira/browse/HADOOP-13126?focusedWorklogId=784249=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784249 ] ASF GitHub Bot logged work on HADOOP-13126: --- Author: ASF GitHub Bot Created on: 23/Jun/22 14:38 Start Date: 23/Jun/22 14:38 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on PR #2723: URL: https://github.com/apache/hadoop/pull/2723#issuecomment-1164492256 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 38s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 3 new or modified test files. 
| _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 47s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 25m 0s | | trunk passed | | +1 :green_heart: | compile | 23m 12s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 20m 33s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 4m 27s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 26s | | trunk passed | | +1 :green_heart: | javadoc | 3m 1s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 2m 33s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +0 :ok: | spotbugs | 1m 26s | | branch/hadoop-project no spotbugs output file (spotbugsXml.xml) | | +1 :green_heart: | shadedclient | 23m 24s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 44s | | Maven dependency ordering for patch | | -1 :x: | mvninstall | 0m 28s | [/patch-mvninstall-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2723/2/artifact/out/patch-mvninstall-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | -1 :x: | compile | 0m 53s | [/patch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2723/2/artifact/out/patch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | root in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. 
| | -1 :x: | javac | 0m 53s | [/patch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2723/2/artifact/out/patch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | root in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. | | -1 :x: | compile | 0m 47s | [/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2723/2/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | root in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. | | -1 :x: | javac | 0m 47s | [/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2723/2/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | root in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 4m 26s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2723/2/artifact/out/results-checkstyle-root.txt) | root: The patch generated 2 new + 0 unchanged - 0 fixed = 2 total (was 0) | | -1 :x: | mvnsite | 0m 32s | [/patch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2723/2/artifact/out/patch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | -1 :x: | javadoc | 0m 29s |
[GitHub] [hadoop] hadoop-yetus commented on pull request #2723: HADOOP-13126 Add BrotliCodec based on Brotli4j library
hadoop-yetus commented on PR #2723: URL: https://github.com/apache/hadoop/pull/2723#issuecomment-1164492256 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 38s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 3 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 47s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 25m 0s | | trunk passed | | +1 :green_heart: | compile | 23m 12s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 20m 33s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 4m 27s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 26s | | trunk passed | | +1 :green_heart: | javadoc | 3m 1s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 2m 33s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +0 :ok: | spotbugs | 1m 26s | | branch/hadoop-project no spotbugs output file (spotbugsXml.xml) | | +1 :green_heart: | shadedclient | 23m 24s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 44s | | Maven dependency ordering for patch | | -1 :x: | mvninstall | 0m 28s | [/patch-mvninstall-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2723/2/artifact/out/patch-mvninstall-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | -1 :x: | compile | 0m 53s | [/patch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2723/2/artifact/out/patch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | root in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. | | -1 :x: | javac | 0m 53s | [/patch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2723/2/artifact/out/patch-compile-root-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | root in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. | | -1 :x: | compile | 0m 47s | [/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2723/2/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | root in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. | | -1 :x: | javac | 0m 47s | [/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2723/2/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | root in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. 
| | -0 :warning: | checkstyle | 4m 26s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2723/2/artifact/out/results-checkstyle-root.txt) | root: The patch generated 2 new + 0 unchanged - 0 fixed = 2 total (was 0) | | -1 :x: | mvnsite | 0m 32s | [/patch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2723/2/artifact/out/patch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | -1 :x: | javadoc | 0m 29s | [/patch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2723/2/artifact/out/patch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-common in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. | | -1 :x: | javadoc | 0m 28s |
[jira] [Work logged] (HADOOP-18304) Improve S3A committers documentation clarity
[ https://issues.apache.org/jira/browse/HADOOP-18304?focusedWorklogId=784246=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784246 ] ASF GitHub Bot logged work on HADOOP-18304: --- Author: ASF GitHub Bot Created on: 23/Jun/22 14:34 Start Date: 23/Jun/22 14:34 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on PR #4478: URL: https://github.com/apache/hadoop/pull/4478#issuecomment-1164488192 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 20s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | _ trunk Compile Tests _ | | -1 :x: | mvninstall | 4m 13s | [/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4478/4/artifact/out/branch-mvninstall-root.txt) | root in trunk failed. | | +1 :green_heart: | mvnsite | 4m 7s | | trunk passed | | +1 :green_heart: | shadedclient | 36m 24s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 36s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 0m 37s | | the patch passed | | +1 :green_heart: | shadedclient | 22m 48s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 0m 45s | | The patch does not generate ASF License warnings. 
| | | | 63m 47s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4478/4/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4478 | | Optional Tests | dupname asflicense mvnsite codespell detsecrets markdownlint | | uname | Linux dbf5984762f5 4.15.0-166-generic #174-Ubuntu SMP Wed Dec 8 19:07:44 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / cd97bad7950f99e1611887b8e8d4c93e75c8bf7d | | Max. process+thread count | 608 (vs. ulimit of 5500) | | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4478/4/console | | versions | git=2.25.1 maven=3.6.3 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. Issue Time Tracking --- Worklog Id: (was: 784246) Time Spent: 2.5h (was: 2h 20m) > Improve S3A committers documentation clarity > > > Key: HADOOP-18304 > URL: https://issues.apache.org/jira/browse/HADOOP-18304 > Project: Hadoop Common > Issue Type: Sub-task > Components: documentation >Reporter: Daniel Carl Jones >Assignee: Daniel Carl Jones >Priority: Trivial > Labels: pull-request-available > Time Spent: 2.5h > Remaining Estimate: 0h > > I recently was learning more about the S3A committers. I'm hoping to provide > some improvements as someone who has recently read [this > documentation|https://github.com/apache/hadoop/blob/1f157f802d2d6142d21482eaa86baf1bef458ed4/hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md#L495] > without fully understanding prior. > For instance, referencing different components more explicitly and adding > pre-requisite info. 
-- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] cxzl25 commented on pull request #4480: HDFS-16638. Add isDebugEnabled check for debug blockLogs in BlockManager
cxzl25 commented on PR #4480: URL: https://github.com/apache/hadoop/pull/4480#issuecomment-1164455182 > Thanks @cxzl25 for your patch. I have a question about it and am looking forward to your feedback. Can this change improve NameNode performance? I see that some places deliberately remove this check and use log.info("{}", XXX) instead. > > Or can you run a performance test for it, with and without this check? We have some JIRAs that use the `isDebugEnabled` check. Initially, I found that the `Removing stale replica` log does needless string splicing, and that `NameNode.blockStateChangeLog` can be replaced by `blockLog`, because this debug-level log is hit many times every day. While there, I fixed the other occurrences as well. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
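The tradeoff being debated above — an explicit `isDebugEnabled()` guard versus parameterized `log.debug("{}", x)` — can be sketched in isolation. Parameterized logging defers only the string formatting; an argument that is itself a method call is still evaluated eagerly. A guard (or a lazy supplier) skips that evaluation entirely. A minimal illustration using `java.util.logging` rather than Hadoop's SLF4J loggers, with `expensiveDetail()` standing in for the costly string splicing the comment mentions:

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.logging.Level;
import java.util.logging.Logger;

class LazyLoggingDemo {
    static final Logger LOG = Logger.getLogger("demo");
    static final AtomicInteger calls = new AtomicInteger();

    // Stands in for the costly message construction under discussion.
    static String expensiveDetail() {
        calls.incrementAndGet();
        return "replica state dump";
    }

    public static void main(String[] args) {
        LOG.setLevel(Level.INFO); // FINE ("debug") is disabled

        // Eager concatenation: expensiveDetail() still runs, even though
        // the resulting message is discarded because FINE is not loggable.
        LOG.fine("Removing stale replica: " + expensiveDetail());

        // Lazy form: the supplier is only invoked if FINE is enabled,
        // so expensiveDetail() is never called here.
        LOG.fine(() -> "Removing stale replica: " + expensiveDetail());

        System.out.println("evaluations=" + calls.get()); // prints: evaluations=1
    }
}
```

Whether the saved evaluation is measurable for NameNode throughput depends on how hot the call site is, which is exactly the performance question the reviewer raises.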
[jira] [Commented] (HADOOP-17725) Improve error message for token providers in ABFS
[ https://issues.apache.org/jira/browse/HADOOP-17725?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17558102#comment-17558102 ] Carl commented on HADOOP-17725: --- The changes related to this ticket force users to set optional fields and prevent them from relying on Azure's metadata service to set those fields automatically. Both [~vjasani]'s PR (https://github.com/apache/hadoop/pull/3788) and mine (https://github.com/apache/hadoop/pull/4262) fix it. Can we get one of these merged, please? > Improve error message for token providers in ABFS > - > > Key: HADOOP-17725 > URL: https://issues.apache.org/jira/browse/HADOOP-17725 > Project: Hadoop Common > Issue Type: Improvement > Components: fs/azure, hadoop-thirdparty >Affects Versions: 3.3.0 >Reporter: Ivan Sadikov >Assignee: Viraj Jasani >Priority: Major > Labels: pull-request-available > Fix For: 3.3.2 > > Time Spent: 8h > Remaining Estimate: 0h > > It would be good to improve error messages for token providers in ABFS. > Currently, when a configuration key is not found or mistyped, the error is > not very clear on what went wrong. It would be good to indicate that the key > was required but not found in Hadoop configuration when creating a token > provider. 
> For example, when running the following code:
> {code:java}
> import org.apache.hadoop.conf._
> import org.apache.hadoop.fs._
> val conf = new Configuration()
> conf.set("fs.azure.account.auth.type", "OAuth")
> conf.set("fs.azure.account.oauth.provider.type", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
> conf.set("fs.azure.account.oauth2.client.id", "my-client-id")
> // conf.set("fs.azure.account.oauth2.client.secret.my-account.dfs.core.windows.net", "my-secret")
> conf.set("fs.azure.account.oauth2.client.endpoint", "my-endpoint")
> val path = new Path("abfss://contai...@my-account.dfs.core.windows.net/")
> val fs = path.getFileSystem(conf)
> fs.getFileStatus(path)
> {code}
> The following exception is thrown:
> {code:java}
> TokenAccessProviderException: Unable to load OAuth token provider class.
> ...
> Caused by: UncheckedExecutionException: java.lang.NullPointerException: clientSecret
> ...
> Caused by: NullPointerException: clientSecret
> {code}
> which does not tell what configuration key was not loaded.
>
> IMHO, it would be good if the exception was something like this:
> {code:java}
> TokenAccessProviderException: Unable to load OAuth token provider class.
> ...
> Caused by: ConfigurationPropertyNotFoundException: Configuration property fs.azure.account.oauth2.client.secret not found.
> {code}
-- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
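The fix the reporter sketches at the end — surfacing the missing key's name instead of an opaque `NullPointerException` — amounts to validating required keys at lookup time. A minimal, self-contained sketch; `RequiredConfDemo`, `getRequired`, and the nested exception class are illustrative stand-ins, not the real Hadoop `Configuration` or ABFS types:

```java
import java.util.HashMap;
import java.util.Map;

class RequiredConfDemo {
    // Stand-in for the ConfigurationPropertyNotFoundException the
    // reporter proposes; not the real ABFS class.
    static class ConfigurationPropertyNotFoundException extends RuntimeException {
        ConfigurationPropertyNotFoundException(String key) {
            super("Configuration property " + key + " not found.");
        }
    }

    // Look up a required key, failing with the key name rather than
    // letting a later NullPointerException hide which key was missing.
    static String getRequired(Map<String, String> conf, String key) {
        String value = conf.get(key);
        if (value == null) {
            throw new ConfigurationPropertyNotFoundException(key);
        }
        return value;
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put("fs.azure.account.oauth2.client.id", "my-client-id");
        // client.secret deliberately left unset, as in the report
        try {
            getRequired(conf, "fs.azure.account.oauth2.client.secret");
        } catch (ConfigurationPropertyNotFoundException e) {
            // prints: Configuration property fs.azure.account.oauth2.client.secret not found.
            System.out.println(e.getMessage());
        }
    }
}
```

The message deliberately matches the wording proposed in the report, so the failing key can be copied straight into a site configuration.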
[GitHub] [hadoop] hadoop-yetus commented on pull request #4493: [Do not commit] Testing cross platform builds
hadoop-yetus commented on PR #4493: URL: https://github.com/apache/hadoop/pull/4493#issuecomment-1164424317 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 21m 24s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 24m 56s | | trunk passed | | +1 :green_heart: | compile | 3m 35s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 44s | | trunk passed | | +1 :green_heart: | shadedclient | 55m 22s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 23s | | the patch passed | | +1 :green_heart: | compile | 3m 13s | | the patch passed | | +1 :green_heart: | cc | 3m 13s | | the patch passed | | +1 :green_heart: | golang | 3m 13s | | the patch passed | | +1 :green_heart: | javac | 3m 13s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 0m 26s | | the patch passed | | +1 :green_heart: | shadedclient | 26m 18s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 32m 12s | | hadoop-hdfs-native-client in the patch passed. | | +1 :green_heart: | asflicense | 0m 48s | | The patch does not generate ASF License warnings. 
| | | | 142m 51s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4493/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4493 | | Optional Tests | dupname asflicense compile cc mvnsite javac unit codespell detsecrets golang | | uname | Linux 7cb79aa5e88d 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 58248bea83dc91b5a2e2c64941acbcb6a058514d | | Default Java | Debian-11.0.15+10-post-Debian-1deb10u1 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4493/1/testReport/ | | modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: hadoop-hdfs-project/hadoop-hdfs-native-client | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4493/1/console | | versions | git=2.20.1 maven=3.6.0 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18311) Upgrade dependencies to address several CVEs
[ https://issues.apache.org/jira/browse/HADOOP-18311?focusedWorklogId=784220&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784220 ] ASF GitHub Bot logged work on HADOOP-18311: --- Author: ASF GitHub Bot Created on: 23/Jun/22 13:35 Start Date: 23/Jun/22 13:35 Worklog Time Spent: 10m Work Description: steveloughran commented on PR #4491: URL: https://github.com/apache/hadoop/pull/4491#issuecomment-1164417288 (you are going to hate me here. sorry) First, please let's not have "update a few dependency" patches. It is not a useful title, and updating multiple dependencies simultaneously makes it a lot harder to identify problems through git bisect and makes the changes harder to roll back and cherry-pick. Second, we must not have anything in this release which isn't already in branch-3.3 and so has been stabilising there in the uses other developers have been making of that branch. Finally, I am scared of any and all last-minute updates of dependencies, as the blast radius of a change of a few digits in a number in a POM file can have a dramatic impact on a project two hops away. That's why I believe the default decision on any last-minute dependency update should be "no". This is worth bearing in mind as I intend to share release manager responsibilities with Mukund on the branch-3.3 feature release this summer, and refusing last-minute changes is going to be my default action, especially when it comes to jar updates. Get those changes in and stabilising now!

## jetty
-1 to the jetty update because I'm scared of what will break. the hadoop.next release will upgrade to jetty 2 and shade it.

## Htrace
-1 to htrace as it was fixed in this branch by #3520
```
9e2936f8d1f HADOOP-17424. Replace HTrace with No-Op tracer (#3520)
```
If this is not the case then we have a serious issue which needs to be fixed across all the recent branches. file a critical hadoop JIRA and we can go from there.

## Zookeeper
-1 until/unless in branch-3.3. Interesting one there: trunk is on 3.6.3 after HADOOP-17612. Upgrade Zookeeper to 3.6.3 and Curator to 5.2.0 #3241. For any change there, an increment on 3.5.x is lower risk and may not need a matching curator increment, but that'd still need qualification for the branch-3.3 release; why don't we cherrypick #3241 and follow-ons?

## AWS SDK
-1 to updating the AWS SDK except as a standalone cherrypick of our branch-3.3 patch #3864 with full requalification
```
d8ab84275e0 - HADOOP-18068. upgrade AWS SDK to 1.12.132 (#3864)
```
The SDK is covered in HADOOP-18068; any backporting should just be a cherrypick. But as with most AWS SDK updates, it caused a regression (HADOOP-18085). Anyone proposing it as a backport has to
1. Run the full hadoop-aws integration test suite with `-Dscale` and declare which endpoint they ran against.
2. look at the section "Qualifying an AWS SDK Update" and treat the instructions there as a MUST not a MAY https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/testing.html#Qualifying_an_AWS_SDK_Update
3. note that instruction 1 there is "Don’t make this a last minute action."

I have encountered other cases where people have been updating this SDK dependency without raising it with me. Yes, tools do highlight Jackson serialisation issues which exist in the shaded Jackson dependency. However, the AWS SDK does not use those bits of Jackson. And, because nothing else uses those bits of Jackson in this library precisely because they are shaded, the risk is not actually manifest in the S3A connector. Given this fact and the qualification process I don't want to include it. If you really want this in, create a single PR cherry picking HADOOP-18068, and all follow-on fixes which are applicable to this branch, say which AWS endpoint you ran the hadoop-aws test suites against. And do the entire SDK update qualification covered in the testing doc. I will then merge the chain of commits one by one. This should be safe because we have actually been using this in branch 3.3+ and other than the regression in tests there have been no adverse consequences. It MUST be the exact version we have been using (1.12.132) as no later release has been validated. Issue Time Tracking --- Worklog Id: (was: 784220) Time Spent: 1h (was: 50m) > Upgrade dependencies to address several CVEs > > > Key: HADOOP-18311 > URL: https://issues.apache.org/jira/browse/HADOOP-18311 > Project: Hadoop Common > Issue Type: Improvement > Components: common >Affects Versions: 3.3.3, 3.3.4 >Reporter:
[jira] [Work logged] (HADOOP-18044) Hadoop - Upgrade to JQuery 3.6.0
[ https://issues.apache.org/jira/browse/HADOOP-18044?focusedWorklogId=784214&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784214 ] ASF GitHub Bot logged work on HADOOP-18044: --- Author: ASF GitHub Bot Created on: 23/Jun/22 13:20 Start Date: 23/Jun/22 13:20 Worklog Time Spent: 10m Work Description: ashutoshcipher commented on PR #4495: URL: https://github.com/apache/hadoop/pull/4495#issuecomment-1164401081 LGTM +1 (Jenkins pending) Issue Time Tracking --- Worklog Id: (was: 784214) Time Spent: 1h 50m (was: 1h 40m) > Hadoop - Upgrade to JQuery 3.6.0 > > > Key: HADOOP-18044 > URL: https://issues.apache.org/jira/browse/HADOOP-18044 > Project: Hadoop Common > Issue Type: Improvement >Reporter: Yuan Luo >Assignee: Yuan Luo >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0, 3.3.4 > > Time Spent: 1h 50m > Remaining Estimate: 0h > > jQuery 3.6.0 was released a few months ago - > http://blog.jquery.com/2021/03/02/jquery-3-6-0-released/ > We can upgrade jquery-3.5.1.min.js to jquery-3.6.0.min.js in the hadoop project. -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18304) Improve S3A committers documentation clarity
[ https://issues.apache.org/jira/browse/HADOOP-18304?focusedWorklogId=784210=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784210 ] ASF GitHub Bot logged work on HADOOP-18304: --- Author: ASF GitHub Bot Created on: 23/Jun/22 13:15 Start Date: 23/Jun/22 13:15 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on PR #4478: URL: https://github.com/apache/hadoop/pull/4478#issuecomment-1164395425 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 57s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 49m 4s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 35s | | trunk passed | | -1 :x: | shadedclient | 86m 56s | | branch has errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 49s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 0m 52s | | the patch passed | | -1 :x: | shadedclient | 5m 10s | | patch has errors when building and testing our client artifacts. | _ Other Tests _ | | +0 :ok: | asflicense | 0m 34s | | ASF License check generated no output? 
| | | | 96m 21s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4478/3/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4478 | | Optional Tests | dupname asflicense mvnsite codespell detsecrets markdownlint | | uname | Linux e68a2eb56e51 4.15.0-166-generic #174-Ubuntu SMP Wed Dec 8 19:07:44 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / cd600503542751a72b7b21028ef08b26e41e6580 | | Max. process+thread count | 546 (vs. ulimit of 5500) | | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4478/3/console | | versions | git=2.25.1 maven=3.6.3 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. Issue Time Tracking --- Worklog Id: (was: 784210) Time Spent: 2h 20m (was: 2h 10m) > Improve S3A committers documentation clarity > > > Key: HADOOP-18304 > URL: https://issues.apache.org/jira/browse/HADOOP-18304 > Project: Hadoop Common > Issue Type: Sub-task > Components: documentation >Reporter: Daniel Carl Jones >Assignee: Daniel Carl Jones >Priority: Trivial > Labels: pull-request-available > Time Spent: 2h 20m > Remaining Estimate: 0h > > I recently was learning more about the S3A committers. I'm hoping to provide > some improvements as someone who has recently read [this > documentation|https://github.com/apache/hadoop/blob/1f157f802d2d6142d21482eaa86baf1bef458ed4/hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md#L495] > without fully understanding prior. > For instance, referencing different components more explicitly and adding > pre-requisite info. 
-- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-18305) Release Hadoop 3.3.4: minor update of hadoop-3.3.3
[ https://issues.apache.org/jira/browse/HADOOP-18305?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17558054#comment-17558054 ] Steve Loughran commented on HADOOP-18305: - HADOOP-18044 is now part of the changes of this release > Release Hadoop 3.3.4: minor update of hadoop-3.3.3 > -- > > Key: HADOOP-18305 > URL: https://issues.apache.org/jira/browse/HADOOP-18305 > Project: Hadoop Common > Issue Type: Task > Components: build >Affects Versions: 3.3.3 >Reporter: Steve Loughran >Assignee: Steve Loughran >Priority: Major > Labels: pull-request-available > Time Spent: 1h 10m > Remaining Estimate: 0h > > Create a Hadoop 3.3.4 release with > * critical fixes > * ARM artifacts as well as the intel ones -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-18044) Hadoop - Upgrade to JQuery 3.6.0
[ https://issues.apache.org/jira/browse/HADOOP-18044?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Steve Loughran updated HADOOP-18044: Fix Version/s: 3.3.4 (was: 3.3.9) > Hadoop - Upgrade to JQuery 3.6.0 > > > Key: HADOOP-18044 > URL: https://issues.apache.org/jira/browse/HADOOP-18044 > Project: Hadoop Common > Issue Type: Improvement >Reporter: Yuan Luo >Assignee: Yuan Luo >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0, 3.3.4 > > Time Spent: 1h 40m > Remaining Estimate: 0h > > jQuery 3.6.0 was released a few months ago - > http://blog.jquery.com/2021/03/02/jquery-3-6-0-released/ > We can upgrade jquery-3.5.1.min.js to jquery-3.6.0.min.js in the hadoop project. -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #4494: MAPREDUCE-7392. Fix bug for GzipCodec in native task
hadoop-yetus commented on PR #4494: URL: https://github.com/apache/hadoop/pull/4494#issuecomment-1164353778 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 37m 14s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 38m 38s | | trunk passed | | +1 :green_heart: | compile | 1m 13s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 51s | | trunk passed | | +1 :green_heart: | shadedclient | 59m 28s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 29s | | the patch passed | | +1 :green_heart: | compile | 0m 56s | | the patch passed | | +1 :green_heart: | cc | 0m 56s | | the patch passed | | +1 :green_heart: | golang | 0m 56s | | the patch passed | | +1 :green_heart: | javac | 0m 56s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 0m 30s | | the patch passed | | +1 :green_heart: | shadedclient | 18m 41s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 3m 5s | | hadoop-mapreduce-client-nativetask in the patch passed. | | +1 :green_heart: | asflicense | 0m 52s | | The patch does not generate ASF License warnings. 
| | | | 122m 27s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4494/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4494 | | Optional Tests | dupname asflicense compile cc mvnsite javac unit codespell detsecrets golang | | uname | Linux 81f90ef871cc 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / ca83e4f6621384e23a14781c45cc0a72e3d29b48 | | Default Java | Red Hat, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4494/1/testReport/ | | Max. process+thread count | 717 (vs. ulimit of 5500) | | modules | C: hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask U: hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4494/1/console | | versions | git=2.9.5 maven=3.6.3 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-13126) Add Brotli compression codec
[ https://issues.apache.org/jira/browse/HADOOP-13126?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17558037#comment-17558037 ] Martin Tzvetanov Grigorov commented on HADOOP-13126:

[~ste...@apache.org] Rebased my branch to latest trunk. Hopefully Hadoop QA will like it this time.

> Add Brotli compression codec
>
> Key: HADOOP-13126
> URL: https://issues.apache.org/jira/browse/HADOOP-13126
> Project: Hadoop Common
> Issue Type: Improvement
> Components: io
> Affects Versions: 2.7.2
> Reporter: Ryan Blue
> Assignee: Ryan Blue
> Priority: Major
> Labels: pull-request-available
> Attachments: HADOOP-13126.1.patch, HADOOP-13126.2.patch, HADOOP-13126.3.patch, HADOOP-13126.4.patch, HADOOP-13126.5.patch
>
> Time Spent: 20m
> Remaining Estimate: 0h
>
> I've been testing [Brotli|https://github.com/google/brotli/], a new compression library based on LZ77 from Google. Google's [brotli benchmarks|https://cran.r-project.org/web/packages/brotli/vignettes/brotli-2015-09-22.pdf] look really good and we're also seeing a significant improvement in compression size, compression speed, or both.
> {code:title=Brotli preliminary test results}
> [blue@work Downloads]$ time parquet from test.parquet -o test.snappy.parquet --compression-codec snappy --overwrite
> real    1m17.106s
> user    1m30.804s
> sys     0m4.404s
> [blue@work Downloads]$ time parquet from test.parquet -o test.br.parquet --compression-codec brotli --overwrite
> real    1m16.640s
> user    1m24.244s
> sys     0m6.412s
> [blue@work Downloads]$ time parquet from test.parquet -o test.gz.parquet --compression-codec gzip --overwrite
> real    3m39.496s
> user    3m48.736s
> sys     0m3.880s
> [blue@work Downloads]$ ls -l
> -rw-r--r-- 1 blue blue 1068821936 May 10 11:06 test.br.parquet
> -rw-r--r-- 1 blue blue 1421601880 May 10 11:10 test.gz.parquet
> -rw-r--r-- 1 blue blue 2265950833 May 10 10:30 test.snappy.parquet
> {code}
> Brotli, at quality 1, is as fast as snappy and ends up smaller than gzip-9.
> Another test resulted in a slightly larger Brotli file than gzip produced, but Brotli was 4x faster. I'd like to get this compression codec into Hadoop.
> [Brotli is licensed with the MIT license|https://github.com/google/brotli/blob/master/LICENSE], and the [JNI library jbrotli is ALv2|https://github.com/MeteoGroup/jbrotli/blob/master/LICENSE].
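The benchmark above measures the usual codec trade-off: higher compression levels buy smaller output at the cost of time. Since Brotli itself requires the jbrotli JNI library, here is a self-contained sketch of the same measurement methodology using the JDK's built-in Deflater (the gzip algorithm) at its fastest and strongest settings. This is illustrative only and is not code from the patch; the input data is made up for the demonstration.

```java
import java.util.zip.Deflater;

public class CodecTradeoffDemo {
    /** Compress data at the given level and return the compressed size in bytes. */
    static int compressedSize(byte[] data, int level) {
        Deflater deflater = new Deflater(level);
        deflater.setInput(data);
        deflater.finish();
        byte[] out = new byte[data.length * 2 + 64];
        int total = 0;
        while (!deflater.finished()) {
            total += deflater.deflate(out); // count output bytes; content discarded
        }
        deflater.end();
        return total;
    }

    public static void main(String[] args) {
        // Moderately compressible input: repeated text with some variation.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 5000; i++) sb.append("record-").append(i % 97).append(';');
        byte[] data = sb.toString().getBytes();

        long t1 = System.nanoTime();
        int fast = compressedSize(data, Deflater.BEST_SPEED);       // level 1
        long t2 = System.nanoTime();
        int best = compressedSize(data, Deflater.BEST_COMPRESSION); // level 9
        long t3 = System.nanoTime();

        System.out.printf("level 1: %d bytes in %d us%n", fast, (t2 - t1) / 1000);
        System.out.printf("level 9: %d bytes in %d us%n", best, (t3 - t2) / 1000);
    }
}
```

The same timing-plus-size comparison, run against snappy, gzip, and Brotli as in the JIRA description, is what motivates adding a codec that wins on both axes at once.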
[GitHub] [hadoop] hadoop-yetus commented on pull request #4486: YARN-10320. Replace FSDataInputStream#read with readFully in Log Aggregation
hadoop-yetus commented on PR #4486: URL: https://github.com/apache/hadoop/pull/4486#issuecomment-1164319691 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 51s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 42m 47s | | trunk passed | | +1 :green_heart: | compile | 1m 2s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 1m 0s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 0m 49s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 58s | | trunk passed | | +1 :green_heart: | javadoc | 1m 5s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 0m 56s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 5s | | trunk passed | | +1 :green_heart: | shadedclient | 24m 10s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 43s | | the patch passed | | +1 :green_heart: | compile | 0m 46s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javac | 0m 46s | | the patch passed | | +1 :green_heart: | compile | 0m 41s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 0m 41s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 28s | | the patch passed | | +1 :green_heart: | mvnsite | 0m 44s | | the patch passed | | +1 :green_heart: | javadoc | 0m 44s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 0m 42s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 1m 47s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 54s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 4m 44s | | hadoop-yarn-common in the patch passed. | | +1 :green_heart: | asflicense | 0m 42s | | The patch does not generate ASF License warnings. 
| | | | 111m 44s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4486/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4486 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 62b100924735 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 8da8aedacdfee5f6df280c40de31352a3084bfc6 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4486/2/testReport/ | | Max. process+thread count | 600 (vs. ulimit of 5500) | | modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common U: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4486/2/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the
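The motivation for YARN-10320 above — `read(byte[])` may return fewer bytes than requested, while `readFully` keeps reading until the buffer is filled — can be demonstrated with plain `java.io` streams. This is a sketch: `ChunkedStream` is a hypothetical stand-in for a network or HDFS stream that delivers short reads, not anything in the Hadoop codebase.

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadFullyDemo {
    /** Hypothetical stream returning at most 3 bytes per read() call,
     *  mimicking a stream that delivers short reads. */
    static class ChunkedStream extends InputStream {
        private final ByteArrayInputStream delegate;
        ChunkedStream(byte[] data) { this.delegate = new ByteArrayInputStream(data); }
        @Override public int read() { return delegate.read(); }
        @Override public int read(byte[] b, int off, int len) {
            return delegate.read(b, off, Math.min(len, 3)); // short read
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[10];
        for (int i = 0; i < data.length; i++) data[i] = (byte) i;

        // A single read() may fill only part of the buffer...
        byte[] buf = new byte[10];
        int n = new DataInputStream(new ChunkedStream(data)).read(buf);
        System.out.println("read() returned " + n + " bytes");

        // ...whereas readFully() loops until the buffer is full (or throws EOFException).
        byte[] full = new byte[10];
        new DataInputStream(new ChunkedStream(data)).readFully(full);
        System.out.println("readFully filled all " + full.length + " bytes");
    }
}
```

Code that treats a short `read()` as a full buffer silently processes truncated data, which is why the log-aggregation reader switches to `readFully`.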
[jira] [Work logged] (HADOOP-18304) Improve S3A committers documentation clarity
[ https://issues.apache.org/jira/browse/HADOOP-18304?focusedWorklogId=784159=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784159 ] ASF GitHub Bot logged work on HADOOP-18304: --- Author: ASF GitHub Bot Created on: 23/Jun/22 11:41 Start Date: 23/Jun/22 11:41 Worklog Time Spent: 10m Work Description: dannycjones commented on PR #4478: URL: https://github.com/apache/hadoop/pull/4478#issuecomment-1164304062 @ahmarsuhail, I've updated based on your feedback. Thanks for reviewing it with such detail. I've put the changes for things like missing `.` in separate commit cd600503542751a72b7b21028ef08b26e41e6580 in case we don't want to touch too many lines unnecessarily. Issue Time Tracking --- Worklog Id: (was: 784159) Time Spent: 2h 10m (was: 2h) > Improve S3A committers documentation clarity > > > Key: HADOOP-18304 > URL: https://issues.apache.org/jira/browse/HADOOP-18304 > Project: Hadoop Common > Issue Type: Sub-task > Components: documentation >Reporter: Daniel Carl Jones >Assignee: Daniel Carl Jones >Priority: Trivial > Labels: pull-request-available > Time Spent: 2h 10m > Remaining Estimate: 0h > > I recently was learning more about the S3A committers. I'm hoping to provide > some improvements as someone who has recently read [this > documentation|https://github.com/apache/hadoop/blob/1f157f802d2d6142d21482eaa86baf1bef458ed4/hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md#L495] > without fully understanding prior. > For instance, referencing different components more explicitly and adding > pre-requisite info. -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] dannycjones commented on pull request #4478: HADOOP-18304. Improve user-facing S3A committers documentation
dannycjones commented on PR #4478: URL: https://github.com/apache/hadoop/pull/4478#issuecomment-1164304062 @ahmarsuhail, I've updated based on your feedback. Thanks for reviewing it with such detail. I've put the changes for things like missing `.` in separate commit cd600503542751a72b7b21028ef08b26e41e6580 in case we don't want to touch too many lines unnecessarily. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18258) Merging of S3A Audit Logs
[ https://issues.apache.org/jira/browse/HADOOP-18258?focusedWorklogId=784156=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784156 ] ASF GitHub Bot logged work on HADOOP-18258: --- Author: ASF GitHub Bot Created on: 23/Jun/22 11:35 Start Date: 23/Jun/22 11:35 Worklog Time Spent: 10m Work Description: sravanigadey commented on code in PR #4383: URL: https://github.com/apache/hadoop/pull/4383#discussion_r904912305 ## hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/audit/AuditTool.java: ## @@ -0,0 +1,308 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package org.apache.hadoop.fs.s3a.audit; + +import java.io.Closeable; +import java.io.EOFException; +import java.io.File; +import java.io.IOException; +import java.io.PrintWriter; +import java.net.URI; +import java.net.URISyntaxException; +import java.util.ArrayList; +import java.util.Arrays; +import java.util.List; + +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import org.apache.commons.io.FileUtils; +import org.apache.hadoop.classification.VisibleForTesting; +import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.conf.Configured; +import org.apache.hadoop.fs.FSDataInputStream; +import org.apache.hadoop.fs.FileStatus; +import org.apache.hadoop.fs.FileSystem; +import org.apache.hadoop.fs.FilterFileSystem; +import org.apache.hadoop.fs.LocatedFileStatus; +import org.apache.hadoop.fs.Path; +import org.apache.hadoop.fs.RemoteIterator; +import org.apache.hadoop.fs.s3a.S3AFileSystem; +import org.apache.hadoop.util.ExitUtil; +import org.apache.hadoop.util.Tool; +import org.apache.hadoop.util.ToolRunner; + +import static org.apache.hadoop.service.launcher.LauncherExitCodes.EXIT_COMMAND_ARGUMENT_ERROR; +import static org.apache.hadoop.service.launcher.LauncherExitCodes.EXIT_SERVICE_UNAVAILABLE; +import static org.apache.hadoop.service.launcher.LauncherExitCodes.EXIT_SUCCESS; + +/** + * AuditTool is a Command Line Interface. + * i.e, it's functionality is to parse the merged audit log file. + * and generate avro file. 
+ */ +public class AuditTool extends Configured implements Tool, Closeable { Review Comment: done Issue Time Tracking --- Worklog Id: (was: 784156) Time Spent: 6h 50m (was: 6h 40m) > Merging of S3A Audit Logs > - > > Key: HADOOP-18258 > URL: https://issues.apache.org/jira/browse/HADOOP-18258 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Reporter: Sravani Gadey >Assignee: Sravani Gadey >Priority: Major > Labels: pull-request-available > Time Spent: 6h 50m > Remaining Estimate: 0h > > Merging audit log files containing huge number of audit logs collected from a > job like Hive or Spark job containing various S3 requests like list, head, > get and put requests. -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] sravanigadey commented on a diff in pull request #4383: HADOOP-18258. Merging of S3A Audit Logs
sravanigadey commented on code in PR #4383: URL: https://github.com/apache/hadoop/pull/4383#discussion_r904912305 ## hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/audit/AuditTool.java: ## @@ -0,0 +1,308 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package org.apache.hadoop.fs.s3a.audit; + +import java.io.Closeable; +import java.io.EOFException; +import java.io.File; +import java.io.IOException; +import java.io.PrintWriter; +import java.net.URI; +import java.net.URISyntaxException; +import java.util.ArrayList; +import java.util.Arrays; +import java.util.List; + +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import org.apache.commons.io.FileUtils; +import org.apache.hadoop.classification.VisibleForTesting; +import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.conf.Configured; +import org.apache.hadoop.fs.FSDataInputStream; +import org.apache.hadoop.fs.FileStatus; +import org.apache.hadoop.fs.FileSystem; +import org.apache.hadoop.fs.FilterFileSystem; +import org.apache.hadoop.fs.LocatedFileStatus; +import org.apache.hadoop.fs.Path; +import org.apache.hadoop.fs.RemoteIterator; +import org.apache.hadoop.fs.s3a.S3AFileSystem; +import org.apache.hadoop.util.ExitUtil; +import org.apache.hadoop.util.Tool; +import org.apache.hadoop.util.ToolRunner; + +import static org.apache.hadoop.service.launcher.LauncherExitCodes.EXIT_COMMAND_ARGUMENT_ERROR; +import static org.apache.hadoop.service.launcher.LauncherExitCodes.EXIT_SERVICE_UNAVAILABLE; +import static org.apache.hadoop.service.launcher.LauncherExitCodes.EXIT_SUCCESS; + +/** + * AuditTool is a Command Line Interface. + * i.e, it's functionality is to parse the merged audit log file. + * and generate avro file. + */ +public class AuditTool extends Configured implements Tool, Closeable { Review Comment: done -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
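The AuditTool class in the review above follows Hadoop's standard CLI pattern: a Tool whose run(String[]) returns a launcher exit code, invoked through ToolRunner. A dependency-free sketch of that pattern follows; the interface, the runner, and the exit-code values are simplified stand-ins for Hadoop's Tool/ToolRunner/LauncherExitCodes, not the actual API, and ToyAuditTool is a hypothetical example class.

```java
public class MiniToolRunner {
    // Simplified stand-ins for Hadoop's LauncherExitCodes constants (values assumed).
    static final int EXIT_SUCCESS = 0;
    static final int EXIT_COMMAND_ARGUMENT_ERROR = 1;

    /** Minimal analogue of org.apache.hadoop.util.Tool. */
    interface Tool {
        int run(String[] args) throws Exception;
    }

    /** A toy "audit tool": expects exactly one argument, the merged log path. */
    static class ToyAuditTool implements Tool {
        @Override public int run(String[] args) {
            if (args.length != 1) {
                System.err.println("usage: ToyAuditTool <audit-log-path>");
                return EXIT_COMMAND_ARGUMENT_ERROR;
            }
            System.out.println("would parse merged audit log: " + args[0]);
            return EXIT_SUCCESS;
        }
    }

    /** Minimal analogue of ToolRunner.run(): invoke the tool and surface its exit code. */
    static int run(Tool tool, String[] args) throws Exception {
        return tool.run(args);
    }

    public static void main(String[] args) throws Exception {
        System.exit(run(new ToyAuditTool(), args));
    }
}
```

Keeping argument validation inside run() and returning a distinct code per failure mode is what lets shell scripts driving the real tool branch on $?.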
[jira] [Work logged] (HADOOP-18304) Improve S3A committers documentation clarity
[ https://issues.apache.org/jira/browse/HADOOP-18304?focusedWorklogId=784154=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784154 ] ASF GitHub Bot logged work on HADOOP-18304: --- Author: ASF GitHub Bot Created on: 23/Jun/22 11:34 Start Date: 23/Jun/22 11:34 Worklog Time Spent: 10m Work Description: dannycjones commented on code in PR #4478: URL: https://github.com/apache/hadoop/pull/4478#discussion_r904911780 ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: ## @@ -474,7 +466,7 @@ files which do not contain relevant data. What the partitioned committer does is, where the tooling permits, allows callers to add data to an existing partitioned layout*. Review Comment: for this one, i'm not sure if the `*` is referring to something. I will remove for now. Happy to add it back Issue Time Tracking --- Worklog Id: (was: 784154) Time Spent: 2h (was: 1h 50m) > Improve S3A committers documentation clarity > > > Key: HADOOP-18304 > URL: https://issues.apache.org/jira/browse/HADOOP-18304 > Project: Hadoop Common > Issue Type: Sub-task > Components: documentation >Reporter: Daniel Carl Jones >Assignee: Daniel Carl Jones >Priority: Trivial > Labels: pull-request-available > Time Spent: 2h > Remaining Estimate: 0h > > I recently was learning more about the S3A committers. I'm hoping to provide > some improvements as someone who has recently read [this > documentation|https://github.com/apache/hadoop/blob/1f157f802d2d6142d21482eaa86baf1bef458ed4/hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md#L495] > without fully understanding prior. > For instance, referencing different components more explicitly and adding > pre-requisite info. -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] dannycjones commented on a diff in pull request #4478: HADOOP-18304. Improve user-facing S3A committers documentation
dannycjones commented on code in PR #4478: URL: https://github.com/apache/hadoop/pull/4478#discussion_r904911780 ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: ## @@ -474,7 +466,7 @@ files which do not contain relevant data. What the partitioned committer does is, where the tooling permits, allows callers to add data to an existing partitioned layout*. Review Comment: for this one, i'm not sure if the `*` is referring to something. I will remove for now. Happy to add it back -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18304) Improve S3A committers documentation clarity
[ https://issues.apache.org/jira/browse/HADOOP-18304?focusedWorklogId=784149=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784149 ] ASF GitHub Bot logged work on HADOOP-18304: --- Author: ASF GitHub Bot Created on: 23/Jun/22 11:26 Start Date: 23/Jun/22 11:26 Worklog Time Spent: 10m Work Description: dannycjones commented on code in PR #4478: URL: https://github.com/apache/hadoop/pull/4478#discussion_r904905036 ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: ## @@ -165,6 +165,7 @@ that the network has partitioned and that they must abort their work. That's "essentially" it. When working with HDFS and similar filesystems, Review Comment: I agree - I will actually remove redundant "whether the output is to be *committed* or *aborted*" to hopefully make it clearer. Issue Time Tracking --- Worklog Id: (was: 784149) Time Spent: 1h 50m (was: 1h 40m) > Improve S3A committers documentation clarity > > > Key: HADOOP-18304 > URL: https://issues.apache.org/jira/browse/HADOOP-18304 > Project: Hadoop Common > Issue Type: Sub-task > Components: documentation >Reporter: Daniel Carl Jones >Assignee: Daniel Carl Jones >Priority: Trivial > Labels: pull-request-available > Time Spent: 1h 50m > Remaining Estimate: 0h > > I recently was learning more about the S3A committers. I'm hoping to provide > some improvements as someone who has recently read [this > documentation|https://github.com/apache/hadoop/blob/1f157f802d2d6142d21482eaa86baf1bef458ed4/hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md#L495] > without fully understanding prior. > For instance, referencing different components more explicitly and adding > pre-requisite info. -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] dannycjones commented on a diff in pull request #4478: HADOOP-18304. Improve user-facing S3A committers documentation
dannycjones commented on code in PR #4478: URL: https://github.com/apache/hadoop/pull/4478#discussion_r904905036 ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: ## @@ -165,6 +165,7 @@ that the network has partitioned and that they must abort their work. That's "essentially" it. When working with HDFS and similar filesystems, Review Comment: I agree - I will actually remove redundant "whether the output is to be *committed* or *aborted*" to hopefully make it clearer. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #4493: [Do not commit] Testing cross platform builds
hadoop-yetus commented on PR #4493: URL: https://github.com/apache/hadoop/pull/4493#issuecomment-1164285346 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 21m 41s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 1s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 22m 31s | | trunk passed | | +1 :green_heart: | compile | 4m 10s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 5s | | trunk passed | | +1 :green_heart: | shadedclient | 47m 6s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 32s | | the patch passed | | +1 :green_heart: | compile | 3m 45s | | the patch passed | | +1 :green_heart: | cc | 3m 45s | | the patch passed | | +1 :green_heart: | golang | 3m 45s | | the patch passed | | +1 :green_heart: | javac | 3m 45s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 0m 37s | | the patch passed | | +1 :green_heart: | shadedclient | 19m 23s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 33m 7s | | hadoop-hdfs-native-client in the patch passed. | | +1 :green_heart: | asflicense | 1m 1s | | The patch does not generate ASF License warnings. 
| | | | 130m 1s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4493/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4493 | | Optional Tests | dupname asflicense compile cc mvnsite javac unit codespell detsecrets golang | | uname | Linux d9a74e5eec3f 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 58248bea83dc91b5a2e2c64941acbcb6a058514d | | Default Java | Red Hat, Inc.-1.8.0_312-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4493/1/testReport/ | | Max. process+thread count | 556 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: hadoop-hdfs-project/hadoop-hdfs-native-client | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4493/1/console | | versions | git=2.27.0 maven=3.6.3 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-17461) Add thread-level IOStatistics Context
[ https://issues.apache.org/jira/browse/HADOOP-17461?focusedWorklogId=784144=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784144 ] ASF GitHub Bot logged work on HADOOP-17461: --- Author: ASF GitHub Bot Created on: 23/Jun/22 11:18 Start Date: 23/Jun/22 11:18 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on PR #4352: URL: https://github.com/apache/hadoop/pull/4352#issuecomment-1164284637 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 2m 18s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 4 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 15m 10s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 26m 24s | | trunk passed | | +1 :green_heart: | compile | 23m 45s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 21m 44s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 4m 5s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 25s | | trunk passed | | +1 :green_heart: | javadoc | 2m 31s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 2m 16s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 5m 2s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 54s | | branch has no errors when building and testing our client artifacts. 
| | -0 :warning: | patch | 23m 22s | | Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary. | _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 32s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 54s | | the patch passed | | +1 :green_heart: | compile | 24m 42s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javac | 24m 42s | | the patch passed | | +1 :green_heart: | compile | 21m 7s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 21m 7s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 4m 24s | | the patch passed | | +1 :green_heart: | mvnsite | 3m 27s | | the patch passed | | +1 :green_heart: | javadoc | 2m 22s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | -1 :x: | javadoc | 1m 0s | [/results-javadoc-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4352/5/artifact/out/results-javadoc-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 generated 2 new + 37 unchanged - 0 fixed = 39 total (was 37) | | +1 :green_heart: | spotbugs | 4m 56s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 6s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 19m 5s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 2m 56s | | hadoop-aws in the patch passed. | | +1 :green_heart: | asflicense | 1m 27s | | The patch does not generate ASF License warnings. 
| | | | 246m 42s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4352/5/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4352 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 4b91d0b6f8a6 4.15.0-112-generic #113-Ubuntu SMP
[jira] [Work logged] (HADOOP-18044) Hadoop - Upgrade to JQuery 3.6.0
[ https://issues.apache.org/jira/browse/HADOOP-18044?focusedWorklogId=784139=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784139 ] ASF GitHub Bot logged work on HADOOP-18044: --- Author: ASF GitHub Bot Created on: 23/Jun/22 11:12 Start Date: 23/Jun/22 11:12 Worklog Time Spent: 10m Work Description: steveloughran opened a new pull request, #4495: URL: https://github.com/apache/hadoop/pull/4495 Co-authored-by: luoyuan (cherry picked from commit e2d620192aa0b712d05e4092eb63ef2ccdcd8220) Issue Time Tracking --- Worklog Id: (was: 784139) Time Spent: 1h 40m (was: 1.5h) > Hadoop - Upgrade to JQuery 3.6.0 > > > Key: HADOOP-18044 > URL: https://issues.apache.org/jira/browse/HADOOP-18044 > Project: Hadoop Common > Issue Type: Improvement >Reporter: Yuan Luo >Assignee: Yuan Luo >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0, 3.3.9 > > Time Spent: 1h 40m > Remaining Estimate: 0h > > jQuery 3.6.0 has been released few months ago - > [http://blog.jquery.com/2021/03/02/jquery-3-6-0-released/ > |http://blog.jquery.com/2021/03/02/jquery-3-6-0-released/,] > We can upgrade jquery-3.5.1.min.js to jquery-3.6.0.min.js in hadoop project. -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18304) Improve S3A committers documentation clarity
[ https://issues.apache.org/jira/browse/HADOOP-18304?focusedWorklogId=784140=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784140 ] ASF GitHub Bot logged work on HADOOP-18304: --- Author: ASF GitHub Bot Created on: 23/Jun/22 11:12 Start Date: 23/Jun/22 11:12 Worklog Time Spent: 10m Work Description: dannycjones commented on code in PR #4478: URL: https://github.com/apache/hadoop/pull/4478#discussion_r904894123 ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: ## @@ -283,40 +281,37 @@ new data to an existing partitioned directory tree is a common operation. ``` -**replace** : when the job is committed (and not before), delete files in +The _Directory Committer_ uses the entire directory tree for conflict resolution. +For this committer, the behavior of each conflict mode is shown below: + Review Comment: I will add it above the XML example. Issue Time Tracking --- Worklog Id: (was: 784140) Time Spent: 1h 40m (was: 1.5h) > Improve S3A committers documentation clarity > > > Key: HADOOP-18304 > URL: https://issues.apache.org/jira/browse/HADOOP-18304 > Project: Hadoop Common > Issue Type: Sub-task > Components: documentation >Reporter: Daniel Carl Jones >Assignee: Daniel Carl Jones >Priority: Trivial > Labels: pull-request-available > Time Spent: 1h 40m > Remaining Estimate: 0h > > I recently was learning more about the S3A committers. I'm hoping to provide > some improvements as someone who has recently read [this > documentation|https://github.com/apache/hadoop/blob/1f157f802d2d6142d21482eaa86baf1bef458ed4/hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md#L495] > without fully understanding prior. > For instance, referencing different components more explicitly and adding > pre-requisite info. -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18304) Improve S3A committers documentation clarity
[ https://issues.apache.org/jira/browse/HADOOP-18304?focusedWorklogId=784138=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784138 ] ASF GitHub Bot logged work on HADOOP-18304: --- Author: ASF GitHub Bot Created on: 23/Jun/22 11:11 Start Date: 23/Jun/22 11:11 Worklog Time Spent: 10m Work Description: dannycjones commented on code in PR #4478: URL: https://github.com/apache/hadoop/pull/4478#discussion_r904893346 ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: ## @@ -492,18 +484,19 @@ was written. With the policy of `append`, the new file would be added to the existing set of files. -### Notes +### Notes on using Staging Committers 1. A deep partition tree can itself be a performance problem in S3 and the s3a client, -or, more specifically. a problem with applications which use recursive directory tree +or more specifically a problem with applications which use recursive directory tree walks to work with data. 1. The outcome if you have more than one job trying simultaneously to write data to the same destination with any policy other than "append" is undefined. 1. In the `append` operation, there is no check for conflict with file names. -If, in the example above, the file `log-20170228.avro` already existed, -it would be overridden. Set `fs.s3a.committer.staging.unique-filenames` to `true` +If the file `log-20170228.avro` in the example above already existed, it would be overwritten. + + Set `fs.s3a.committer.staging.unique-filenames` to `true` Review Comment: Using the indentation like this I believe allows you to put the sentence on a new line but still part of the previous point. That being said, it is not obvious from the markdown and I cannot test the output HTML so I'll revert. 
Issue Time Tracking --- Worklog Id: (was: 784138) Time Spent: 1.5h (was: 1h 20m) > Improve S3A committers documentation clarity > > > Key: HADOOP-18304 > URL: https://issues.apache.org/jira/browse/HADOOP-18304 > Project: Hadoop Common > Issue Type: Sub-task > Components: documentation >Reporter: Daniel Carl Jones >Assignee: Daniel Carl Jones >Priority: Trivial > Labels: pull-request-available > Time Spent: 1.5h > Remaining Estimate: 0h > > I recently was learning more about the S3A committers. I'm hoping to provide > some improvements as someone who has recently read [this > documentation|https://github.com/apache/hadoop/blob/1f157f802d2d6142d21482eaa86baf1bef458ed4/hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md#L495] > without fully understanding prior. > For instance, referencing different components more explicitly and adding > pre-requisite info. -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18304) Improve S3A committers documentation clarity
[ https://issues.apache.org/jira/browse/HADOOP-18304?focusedWorklogId=784137=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784137 ] ASF GitHub Bot logged work on HADOOP-18304: --- Author: ASF GitHub Bot Created on: 23/Jun/22 11:07 Start Date: 23/Jun/22 11:07 Worklog Time Spent: 10m Work Description: dannycjones commented on code in PR #4478: URL: https://github.com/apache/hadoop/pull/4478#discussion_r904889810 ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: ## @@ -650,7 +639,14 @@ Conflict management is left to the execution engine itself. Review Comment: Good call out, I had a look and I think that option should actually be just `fs.s3a.committer.uuid` - no `staging`. I'll update it. The code in `AbstractS3ACommitter` checks for `fs.s3a.committer.uuid` and the Spark UUID before failing fast. https://github.com/apache/hadoop/blob/e199da3fae1c82e87f88c8c50f6a96c6515e2edd/hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/commit/AbstractS3ACommitter.java#L1341-L1360 Issue Time Tracking --- Worklog Id: (was: 784137) Time Spent: 1h 20m (was: 1h 10m) > Improve S3A committers documentation clarity > > > Key: HADOOP-18304 > URL: https://issues.apache.org/jira/browse/HADOOP-18304 > Project: Hadoop Common > Issue Type: Sub-task > Components: documentation >Reporter: Daniel Carl Jones >Assignee: Daniel Carl Jones >Priority: Trivial > Labels: pull-request-available > Time Spent: 1h 20m > Remaining Estimate: 0h > > I recently was learning more about the S3A committers. I'm hoping to provide > some improvements as someone who has recently read [this > documentation|https://github.com/apache/hadoop/blob/1f157f802d2d6142d21482eaa86baf1bef458ed4/hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md#L495] > without fully understanding prior. > For instance, referencing different components more explicitly and adding > pre-requisite info. 
[GitHub] [hadoop] ZanderXu commented on pull request #4480: HDFS-16638. Add isDebugEnabled check for debug blockLogs in BlockManager
ZanderXu commented on PR #4480: URL: https://github.com/apache/hadoop/pull/4480#issuecomment-1164270092 Thanks @cxzl25 for your patch. I have a question about it and look forward to your feedback. Does this change actually improve NameNode performance? I see that some places deliberately removed this guard and replaced it with parameterized logging, e.g. log.info("{}", XXX). Could you run a performance test comparing the two approaches, with and without the guard?
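[Editorial illustration of the trade-off discussed above: an explicit isDebugEnabled() guard skips evaluating expensive log arguments entirely, while SLF4J-style parameterized logging only defers string concatenation — the argument expression is still evaluated before the call. A minimal self-contained sketch with a toy logger (not Hadoop's actual code, which uses SLF4J):]

```java
// Toy logger demonstrating why a guard can matter: method arguments are
// evaluated before the method is entered, even if the logger then discards them.
public class GuardDemo {
    static int evaluations = 0;

    // Stands in for a costly computation, e.g. building a large block report string.
    static String expensiveReport() {
        evaluations++;
        return "block report";
    }

    static final boolean DEBUG_ENABLED = false;

    static void debug(String fmt, Object arg) {
        if (DEBUG_ENABLED) {
            System.out.println(fmt.replace("{}", String.valueOf(arg)));
        }
    }

    public static void main(String[] args) {
        // Parameterized call: concatenation is deferred, but expensiveReport()
        // still runs even though debug logging is off.
        debug("report: {}", expensiveReport());

        // Guarded call: the argument is never evaluated when debug is off.
        if (DEBUG_ENABLED) {
            debug("report: {}", expensiveReport());
        }

        System.out.println(evaluations); // 1 -- only the unguarded call paid the cost
    }
}
```

[So the guard only pays off when the argument itself is expensive to compute; for cheap arguments, parameterized logging alone is enough, which is presumably why some call sites dropped the check.]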
[jira] [Work logged] (HADOOP-18304) Improve S3A committers documentation clarity
[ https://issues.apache.org/jira/browse/HADOOP-18304?focusedWorklogId=784134=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784134 ] ASF GitHub Bot logged work on HADOOP-18304: --- Author: ASF GitHub Bot Created on: 23/Jun/22 10:59 Start Date: 23/Jun/22 10:59 Worklog Time Spent: 10m Work Description: dannycjones commented on code in PR #4478: URL: https://github.com/apache/hadoop/pull/4478#discussion_r904883423 ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: ## @@ -530,18 +527,22 @@ performance. ### Enabling the committer +Set the committer used by S3A's committer factory to `magic`: + ```xml Review Comment: It's a good question. I do worry we duplicate this in a lot of places, not sure what is the best option. I plan to leave it out of the scope of this patch though. Issue Time Tracking --- Worklog Id: (was: 784134) Time Spent: 1h 10m (was: 1h) > Improve S3A committers documentation clarity > > > Key: HADOOP-18304 > URL: https://issues.apache.org/jira/browse/HADOOP-18304 > Project: Hadoop Common > Issue Type: Sub-task > Components: documentation >Reporter: Daniel Carl Jones >Assignee: Daniel Carl Jones >Priority: Trivial > Labels: pull-request-available > Time Spent: 1h 10m > Remaining Estimate: 0h > > I recently was learning more about the S3A committers. I'm hoping to provide > some improvements as someone who has recently read [this > documentation|https://github.com/apache/hadoop/blob/1f157f802d2d6142d21482eaa86baf1bef458ed4/hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md#L495] > without fully understanding prior. > For instance, referencing different components more explicitly and adding > pre-requisite info. -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
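[For context on the setting discussed above (the XML body itself is elided in the quoted diff): per the hadoop-aws committers documentation, the committer factory is selected with the `fs.s3a.committer.name` option, so the "Enabling the committer" example presumably looks roughly like this core-site.xml fragment:]

```xml
<property>
  <name>fs.s3a.committer.name</name>
  <value>magic</value>
</property>
```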
[GitHub] [hadoop] cfg1234 opened a new pull request, #4494: fix bug for MAPREDUCE-7392 bug for GzipCodec in native task
cfg1234 opened a new pull request, #4494: URL: https://github.com/apache/hadoop/pull/4494 I found that inflateReset is not called after inflate() returns Z_STREAM_END. Once Z_STREAM_END has been returned, the next inflate() call neither consumes any input data nor produces any output data, so the loop cannot exit (the bug is triggered by certain specially formed compressed data). This patch fixes the problem.
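[Editorial sketch of the pattern the PR describes — resetting the decompressor once stream end is reached so the read loop can make progress on trailing input instead of spinning. This uses java.util.zip's Inflater, whose reset() plays the role of zlib's inflateReset(); it is an illustration of the general idea, not the native-task code itself:]

```java
import java.util.Arrays;
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class InflateResetDemo {

    // Compress one small zlib stream.
    static byte[] zip(String s) {
        Deflater d = new Deflater();
        d.setInput(s.getBytes());
        d.finish();
        byte[] buf = new byte[256];
        int n = d.deflate(buf);
        d.end();
        return Arrays.copyOf(buf, n);
    }

    // Decode a buffer holding one or more concatenated zlib streams.
    static String inflateConcat(byte[] input) {
        Inflater inf = new Inflater();
        inf.setInput(input);
        StringBuilder out = new StringBuilder();
        byte[] buf = new byte[64];
        try {
            while (true) {
                int got = inf.inflate(buf);
                if (got > 0) {
                    out.append(new String(buf, 0, got));
                } else if (inf.finished()) {
                    // Stream end reached. Without the reset below, every further
                    // inflate() call would return 0 bytes forever and the loop
                    // would spin on the remaining input -- the bug described above.
                    int rem = inf.getRemaining();
                    if (rem == 0) break;                 // input fully consumed
                    byte[] rest = Arrays.copyOfRange(input, input.length - rem, input.length);
                    inf.reset();                         // analogue of zlib's inflateReset()
                    inf.setInput(rest);
                } else {
                    break;                               // needs more input; none left
                }
            }
        } catch (DataFormatException e) {
            throw new RuntimeException("corrupt stream", e);
        } finally {
            inf.end();
        }
        return out.toString();
    }

    public static void main(String[] args) {
        byte[] a = zip("hello ");
        byte[] b = zip("world");
        byte[] both = new byte[a.length + b.length];
        System.arraycopy(a, 0, both, 0, a.length);
        System.arraycopy(b, 0, both, a.length, b.length);
        System.out.println(inflateConcat(both)); // hello world
    }
}
```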
[GitHub] [hadoop] ayushtkn commented on pull request #4488: HDFS-16640. RBF: Show datanode IP list when click DN histogram in Router
ayushtkn commented on PR #4488: URL: https://github.com/apache/hadoop/pull/4488#issuecomment-1164231208 >it would be good to extract some of this common code. Not an expert with JavaScript either, but I feel this should be possible. If we extract all these common methods into some other JavaScript file, say Utils.js, and then include that in both dfsHealth.html and FederationHealth.html before these .js files, like: ```
[jira] [Work logged] (HADOOP-18304) Improve S3A committers documentation clarity
[ https://issues.apache.org/jira/browse/HADOOP-18304?focusedWorklogId=784113=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784113 ] ASF GitHub Bot logged work on HADOOP-18304: --- Author: ASF GitHub Bot Created on: 23/Jun/22 10:14 Start Date: 23/Jun/22 10:14 Worklog Time Spent: 10m Work Description: ahmarsuhail commented on code in PR #4478: URL: https://github.com/apache/hadoop/pull/4478#discussion_r902757105 ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: ## @@ -88,17 +88,17 @@ proportional to the amount of data created. It still can't handle task failure. loss or corruption of generated data** -To address these problems there is now explicit support in the `hadop-aws` -module for committing work to Amazon S3 via the S3A filesystem client, -*the S3A Committers* +To address these problems there is now explicit support in the `hadoop-aws` +module for committing work to Amazon S3 via the S3A filesystem client: +*the S3A Committers*. For safe, as well as high-performance output of work to S3, -we need use "a committer" explicitly written to work with S3, treating it as -an object store with special features. +we need to use "a committer" explicitly written to work with S3, +treating it as an object store with special features. -### Background : Hadoop's "Commit Protocol" +### Background: Hadoop's "Commit Protocol" How exactly is work written to its final destination? That is accomplished by a "commit protocol" between the workers and the job manager. Review Comment: line 112 has a typo. The job has "workers", which are processes which work _with_ the actual data and write the results. ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: ## @@ -88,17 +88,17 @@ proportional to the amount of data created. It still can't handle task failure. loss or corruption of generated data** Review Comment: On line 84, change *. 
to * ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: ## @@ -165,6 +165,7 @@ that the network has partitioned and that they must abort their work. That's "essentially" it. When working with HDFS and similar filesystems, Review Comment: two full stops on line 146, no full stop on 109, 147. ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: ## @@ -283,40 +281,37 @@ new data to an existing partitioned directory tree is a common operation. ``` -**replace** : when the job is committed (and not before), delete files in +The _Directory Committer_ uses the entire directory tree for conflict resolution. +For this committer, the behavior of each conflict mode is shown below: + Review Comment: is there a default mode? if yes can we say which one that is here ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: ## @@ -530,18 +527,22 @@ performance. ### Enabling the committer +Set the committer used by S3A's committer factory to `magic`: + ```xml Review Comment: unrelated, but do you know why these configs aren't listed in https://github.com/apache/hadoop/blob/trunk/hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/index.md#general-s3a-client-configuration? there are other properties (eg: delegation token config) that isn't there either. Wondering if it's useful to have them all in one place so it's easy to see everything that's available ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: ## @@ -696,10 +692,9 @@ The magic committer recognizes when files are created under paths with `__magic/ and redirects the upload to a different location, adding the information needed to complete the upload Review Comment: full stop on line 675 ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: ## @@ -650,7 +639,14 @@ Conflict management is left to the execution engine itself. Review Comment: full stop/question mark on line 632. 
little confusing to read currently. also i don't see a config option for ` fs.s3a.committer.staging.uuid` in staging committer options list ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: ## @@ -88,17 +88,17 @@ proportional to the amount of data created. It still can't handle task failure. loss or corruption of generated data** -To address these problems there is now explicit support in the `hadop-aws` -module for committing work to Amazon S3 via the S3A filesystem client, -*the S3A Committers* +To address these problems there is now explicit support in the `hadoop-aws` +module for committing work to Amazon S3 via the S3A filesystem client: +*the S3A Committers*. For safe, as well as high-performance output of work to
[GitHub] [hadoop] ahmarsuhail commented on a diff in pull request #4478: HADOOP-18304. Improve user-facing S3A committers documentation
ahmarsuhail commented on code in PR #4478:
URL: https://github.com/apache/hadoop/pull/4478#discussion_r902757105

## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md:

```diff
@@ -88,17 +88,17 @@ proportional to the amount of data created. It still can't handle task failure.
 loss or corruption of generated data**
 
-To address these problems there is now explicit support in the `hadop-aws`
-module for committing work to Amazon S3 via the S3A filesystem client,
-*the S3A Committers*
+To address these problems there is now explicit support in the `hadoop-aws`
+module for committing work to Amazon S3 via the S3A filesystem client:
+*the S3A Committers*.
 
 For safe, as well as high-performance output of work to S3,
-we need use "a committer" explicitly written to work with S3, treating it as
-an object store with special features.
+we need to use "a committer" explicitly written to work with S3,
+treating it as an object store with special features.
 
-### Background : Hadoop's "Commit Protocol"
+### Background: Hadoop's "Commit Protocol"
 
 How exactly is work written to its final destination? That is accomplished by
 a "commit protocol" between the workers and the job manager.
```

Review Comment: Line 112 has a typo. The job has "workers", which are processes which work _with_ the actual data and write the results.

## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: @@ -88,17 +88,17 @@

Review Comment: On line 84, change `*.` to `*`

## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: @@ -165,6 +165,7 @@ that the network has partitioned and that they must abort their work. That's "essentially" it. When working with HDFS and similar filesystems,

Review Comment: Two full stops on line 146; no full stop on lines 109 and 147.

## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md:

````diff
@@ -283,40 +281,37 @@ new data to an existing partitioned directory tree is a common operation.
 ```
 
-**replace** : when the job is committed (and not before), delete files in
+The _Directory Committer_ uses the entire directory tree for conflict resolution.
+For this committer, the behavior of each conflict mode is shown below:
+
````

Review Comment: Is there a default mode? If yes, can we say which one that is here?

## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md:

````diff
@@ -530,18 +527,22 @@ performance.
 
 ### Enabling the committer
 
+Set the committer used by S3A's committer factory to `magic`:
+
 ```xml
````

Review Comment: Unrelated, but do you know why these configs aren't listed in https://github.com/apache/hadoop/blob/trunk/hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/index.md#general-s3a-client-configuration? There are other properties (e.g. delegation token config) that aren't there either. Wondering if it's useful to have them all in one place, so it's easy to see everything that's available.

## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: @@ -696,10 +692,9 @@ The magic committer recognizes when files are created under paths with `__magic/` and redirects the upload to a different location, adding the information needed to complete the upload

Review Comment: Full stop on line 675.

## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/committers.md: @@ -650,7 +639,14 @@ Conflict management is left to the execution engine itself.

Review Comment: Full stop/question mark on line 632; a little confusing to read currently. Also, I don't see a config option for `fs.s3a.committer.staging.uuid` in the staging committer options list.
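For context, committer selection and conflict resolution in the S3A docs under review are driven by Hadoop configuration properties. A minimal illustrative sketch, assuming the property names documented for the S3A committers (`fs.s3a.committer.name`, `fs.s3a.committer.staging.conflict-mode`); the values shown are examples, not recommendations:

```xml
<!-- Illustrative core-site.xml / job configuration fragment.
     Property names follow the S3A committer documentation;
     values here are examples only. -->
<configuration>
  <property>
    <name>fs.s3a.committer.name</name>
    <!-- one of: file, directory, partitioned, magic -->
    <value>directory</value>
  </property>
  <property>
    <name>fs.s3a.committer.staging.conflict-mode</name>
    <!-- one of: fail, append, replace -->
    <value>fail</value>
  </property>
</configuration>
```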
[GitHub] [hadoop] ashutoshcipher commented on pull request #4486: YARN-10320. Replace FSDataInputStream#read with readFully in Log Aggregation
ashutoshcipher commented on PR #4486:
URL: https://github.com/apache/hadoop/pull/4486#issuecomment-1164222830

> Thanks @ashutoshcipher for the patch. There is one more reference to read() at 804. Can you validate that as well?
>
> `int actualLength = checksumFileInputStream.read(b);`

Thanks for pointing it out @PrabhuJoseph - I have made the changes. Thanks.

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
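The point behind YARN-10320 is that a single `InputStream.read(byte[])` call may legally return fewer bytes than requested, whereas a `readFully`-style loop keeps reading until the buffer is filled or EOF is reached. A minimal self-contained sketch of that loop (a hypothetical demo class, not Hadoop's actual `FSDataInputStream` implementation):

```java
import java.io.ByteArrayInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical demo class illustrating why readFully is preferred when a
// fixed number of bytes (e.g. a checksum) is expected from a stream.
public class ReadFullyDemo {

    // Loop until the buffer is completely filled; a single read() call
    // is allowed to return fewer bytes than requested.
    static void readFully(InputStream in, byte[] buf) throws IOException {
        int off = 0;
        while (off < buf.length) {
            int n = in.read(buf, off, buf.length - off);
            if (n < 0) {
                throw new EOFException(
                    "stream ended after " + off + " of " + buf.length + " bytes");
            }
            off += n;
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] checksum = {1, 2, 3, 4};
        byte[] b = new byte[checksum.length];
        readFully(new ByteArrayInputStream(checksum), b);
        System.out.println(b[0] + " " + b[3]); // prints "1 4"
    }
}
```

A bare `checksumFileInputStream.read(b)` would compile and usually work on local files, but can silently return a short read over a network filesystem, which is exactly the bug class this PR removes.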
[jira] [Work logged] (HADOOP-18292) s3a storage class reduced redundancy breaks s3 select tests
[ https://issues.apache.org/jira/browse/HADOOP-18292?focusedWorklogId=784109&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-784109 ]

ASF GitHub Bot logged work on HADOOP-18292:
---
Author: ASF GitHub Bot
Created on: 23/Jun/22 10:01
Start Date: 23/Jun/22 10:01
Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on PR #4489:
URL: https://github.com/apache/hadoop/pull/4489#issuecomment-1164217128

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 49s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 40m 21s | | trunk passed |
| +1 :green_heart: | compile | 0m 53s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 0m 45s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 0m 42s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 53s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 37s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 41s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 28s | | trunk passed |
| +1 :green_heart: | shadedclient | 23m 35s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 35s | | the patch passed |
| +1 :green_heart: | compile | 0m 41s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javac | 0m 41s | | the patch passed |
| +1 :green_heart: | compile | 0m 34s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 0m 34s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 23s | | the patch passed |
| +1 :green_heart: | mvnsite | 0m 37s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 19s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 27s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 13s | | the patch passed |
| +1 :green_heart: | shadedclient | 23m 25s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 2m 45s | | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 0m 42s | | The patch does not generate ASF License warnings. |
| | | 103m 29s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4489/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4489 |
| Optional Tests | dupname asflicense mvnsite codespell detsecrets markdownlint compile javac javadoc mvninstall unit shadedclient spotbugs checkstyle |
| uname | Linux cf4b3013172d 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 1ec4163bc2c6352db11101bde8df8ee49f1490a7 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4489/2/testReport/ |
| Max. process+thread count | 582 (vs. ulimit of 5500) |
| modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
| Console output |