[jira] [Commented] (HADOOP-19156) ZooKeeper based state stores use different ZK address configs
[ https://issues.apache.org/jira/browse/HADOOP-19156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17841638#comment-17841638 ]

ASF GitHub Bot commented on HADOOP-19156:
-----------------------------------------

hadoop-yetus commented on PR #6767:
URL: https://github.com/apache/hadoop/pull/6767#issuecomment-2081478842

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| | _ Prechecks _ | | | |
| +1 :green_heart: | dupname | 0m 03s | | No case conflicting files found. |
| +0 :ok: | spotbugs | 0m 01s | | spotbugs executables are not available. |
| +0 :ok: | codespell | 0m 01s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 01s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 01s | | xmllint was not available. |
| +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 00s | | The patch appears to include 12 new or modified test files. |
| | _ trunk Compile Tests _ | | | |
| +0 :ok: | mvndep | 2m 45s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 88m 30s | | trunk passed |
| +1 :green_heart: | compile | 38m 38s | | trunk passed |
| +1 :green_heart: | checkstyle | 5m 48s | | trunk passed |
| -1 :x: | mvnsite | 4m 18s | [/branch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6767/3/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in trunk failed. |
| +1 :green_heart: | javadoc | 28m 55s | | trunk passed |
| +1 :green_heart: | shadedclient | 197m 28s | | branch has no errors when building and testing our client artifacts. |
| | _ Patch Compile Tests _ | | | |
| +0 :ok: | mvndep | 2m 21s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 20m 24s | | the patch passed |
| +1 :green_heart: | compile | 35m 50s | | the patch passed |
| +1 :green_heart: | javac | 35m 50s | | the patch passed |
| +1 :green_heart: | blanks | 0m 01s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 5m 57s | | the patch passed |
| -1 :x: | mvnsite | 4m 21s | [/patch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6767/3/artifact/out/patch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. |
| +1 :green_heart: | javadoc | 28m 41s | | the patch passed |
| +1 :green_heart: | shadedclient | 203m 25s | | patch has no errors when building and testing our client artifacts. |
| | _ Other Tests _ | | | |
| +1 :green_heart: | asflicense | 6m 13s | | The patch does not generate ASF License warnings. |
| | | 602m 49s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| GITHUB PR | https://github.com/apache/hadoop/pull/6767 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint |
| uname | MINGW64_NT-10.0-17763 f0463bf0a8fe 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys |
| Build tool | maven |
| Personality | /c/hadoop/dev-support/bin/hadoop.sh |
| git revision | trunk / 689c74a2c9a5db059cc89a9ff523cbdc545eeb97 |
| Default Java | Azul Systems, Inc.-1.8.0_332-b09 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6767/3/testReport/ |
| modules | C: hadoop-common-project/hadoop-common hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager hadoop-hdfs-project/hadoop-hdfs-rbf U: . |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6767/3/console |
| versions | git=2.44.0.windows.1 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |

This message was automatically generated.

> ZooKeeper based state stores use different ZK address configs
> -------------------------------------------------------------
>
>                 Key: HADOOP-19156
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19156
>             Project: Hadoop Common
>          Issue Type: Improvement
>            Reporter: liu bin
>            Priority: Major
>              Labels: pull-request-available
>
> Currently, the Zookeeper-based state stores of RM, YARN Federation, and HDFS
> Federation use the same ZK address
[jira] [Commented] (HADOOP-19156) ZooKeeper based state stores use different ZK address configs
[ https://issues.apache.org/jira/browse/HADOOP-19156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17841628#comment-17841628 ]

ASF GitHub Bot commented on HADOOP-19156:
-----------------------------------------

hadoop-yetus commented on PR #6767:
URL: https://github.com/apache/hadoop/pull/6767#issuecomment-2081429516

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 51s | | Docker mode activated. |
| | _ Prechecks _ | | | |
| +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 0s | | xmllint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 12 new or modified test files. |
| | _ trunk Compile Tests _ | | | |
| +0 :ok: | mvndep | 14m 27s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 36m 31s | | trunk passed |
| +1 :green_heart: | compile | 19m 57s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | compile | 18m 37s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | checkstyle | 4m 41s | | trunk passed |
| +1 :green_heart: | mvnsite | 6m 34s | | trunk passed |
| +1 :green_heart: | javadoc | 5m 52s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | javadoc | 5m 10s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | spotbugs | 11m 37s | | trunk passed |
| +1 :green_heart: | shadedclient | 40m 9s | | branch has no errors when building and testing our client artifacts. |
| | _ Patch Compile Tests _ | | | |
| +0 :ok: | mvndep | 0m 32s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 4m 0s | | the patch passed |
| +1 :green_heart: | compile | 18m 55s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | javac | 18m 55s | | the patch passed |
| +1 :green_heart: | compile | 18m 8s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | javac | 18m 8s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 4m 41s | | the patch passed |
| +1 :green_heart: | mvnsite | 6m 27s | | the patch passed |
| +1 :green_heart: | javadoc | 5m 46s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | javadoc | 5m 13s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | spotbugs | 12m 53s | | the patch passed |
| +1 :green_heart: | shadedclient | 39m 38s | | patch has no errors when building and testing our client artifacts. |
| | _ Other Tests _ | | | |
| +1 :green_heart: | unit | 20m 58s | | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 1m 9s | | hadoop-yarn-api in the patch passed. |
| +1 :green_heart: | unit | 5m 22s | | hadoop-yarn-common in the patch passed. |
| +1 :green_heart: | unit | 4m 9s | | hadoop-yarn-server-common in the patch passed. |
| +1 :green_heart: | unit | 109m 10s | | hadoop-yarn-server-resourcemanager in the patch passed. |
| -1 :x: | unit | 34m 51s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs-rbf.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6767/3/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-rbf.txt) | hadoop-hdfs-rbf in the patch failed. |
| +1 :green_heart: | asflicense | 0m 59s | | The patch does not generate ASF License warnings. |
| | | 464m 49s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.44 ServerAPI=1.44 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6767/3/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/6767 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint |
| uname | Linux 2fb3f408608c 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision |
[jira] [Commented] (HADOOP-19159) Fix hadoop-aws document for fs.s3a.committer.abort.pending.uploads
[ https://issues.apache.org/jira/browse/HADOOP-19159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17841576#comment-17841576 ]

ASF GitHub Bot commented on HADOOP-19159:
-----------------------------------------

hadoop-yetus commented on PR #6778:
URL: https://github.com/apache/hadoop/pull/6778#issuecomment-2081329778

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 19m 22s | | Docker mode activated. |
| | _ Prechecks _ | | | |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| | _ trunk Compile Tests _ | | | |
| +1 :green_heart: | mvninstall | 50m 43s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 46s | | trunk passed |
| +1 :green_heart: | shadedclient | 90m 20s | | branch has no errors when building and testing our client artifacts. |
| | _ Patch Compile Tests _ | | | |
| +1 :green_heart: | mvninstall | 0m 30s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | mvnsite | 0m 33s | | the patch passed |
| +1 :green_heart: | shadedclient | 38m 46s | | patch has no errors when building and testing our client artifacts. |
| | _ Other Tests _ | | | |
| +1 :green_heart: | asflicense | 0m 37s | | The patch does not generate ASF License warnings. |
| | | 154m 34s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.45 ServerAPI=1.45 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6778/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/6778 |
| Optional Tests | dupname asflicense mvnsite codespell detsecrets markdownlint |
| uname | Linux 36acb33c0a21 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 799a3b4266164328bcc6277f976ab6d2cb63235b |
| Max. process+thread count | 527 (vs. ulimit of 5500) |
| modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6778/1/console |
| versions | git=2.25.1 maven=3.6.3 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |

This message was automatically generated.

> Fix hadoop-aws document for fs.s3a.committer.abort.pending.uploads
> ------------------------------------------------------------------
>
>                 Key: HADOOP-19159
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19159
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: documentation
>            Reporter: Xi Chen
>            Priority: Minor
>              Labels: pull-request-available
>
> The description of `fs.s3a.committer.abort.pending.uploads` in the
> _Concurrent Jobs writing to the same destination_ section is not entirely correct.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-19159) Fix hadoop-aws document for fs.s3a.committer.abort.pending.uploads
[ https://issues.apache.org/jira/browse/HADOOP-19159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17841563#comment-17841563 ]

ASF GitHub Bot commented on HADOOP-19159:
-----------------------------------------

jshmchenxi commented on PR #6778:
URL: https://github.com/apache/hadoop/pull/6778#issuecomment-2081296075

cc @steveloughran Please take a look, thanks!

> Fix hadoop-aws document for fs.s3a.committer.abort.pending.uploads
> ------------------------------------------------------------------
>
>                 Key: HADOOP-19159
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19159
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: documentation
>            Reporter: Xi Chen
>            Priority: Minor
>              Labels: pull-request-available
>
> The description of `fs.s3a.committer.abort.pending.uploads` in the
> _Concurrent Jobs writing to the same destination_ section is not entirely correct.
[jira] [Updated] (HADOOP-19159) Fix hadoop-aws document for fs.s3a.committer.abort.pending.uploads
[ https://issues.apache.org/jira/browse/HADOOP-19159?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ASF GitHub Bot updated HADOOP-19159:
------------------------------------
    Labels: pull-request-available  (was: )

> Fix hadoop-aws document for fs.s3a.committer.abort.pending.uploads
> ------------------------------------------------------------------
>
>                 Key: HADOOP-19159
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19159
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: documentation
>            Reporter: Xi Chen
>            Priority: Minor
>              Labels: pull-request-available
>
> The description of `fs.s3a.committer.abort.pending.uploads` in the
> _Concurrent Jobs writing to the same destination_ section is not entirely correct.
[jira] [Commented] (HADOOP-19159) Fix hadoop-aws document for fs.s3a.committer.abort.pending.uploads
[ https://issues.apache.org/jira/browse/HADOOP-19159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17841561#comment-17841561 ]

ASF GitHub Bot commented on HADOOP-19159:
-----------------------------------------

jshmchenxi opened a new pull request, #6778:
URL: https://github.com/apache/hadoop/pull/6778

### Description of PR

The description of `fs.s3a.committer.abort.pending.uploads` in the section `Concurrent Jobs writing to the same destination` is not entirely correct. Its default value is `true`.

### How was this patch tested?

Minor document change, not involving tests.

### For code changes:

- [ ] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
- [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
- [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?

> Fix hadoop-aws document for fs.s3a.committer.abort.pending.uploads
> ------------------------------------------------------------------
>
>                 Key: HADOOP-19159
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19159
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: documentation
>            Reporter: Xi Chen
>            Priority: Minor
>
> The description of `fs.s3a.committer.abort.pending.uploads` in the
> _Concurrent Jobs writing to the same destination_ section is not entirely correct.
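As a quick reference for the property this PR documents: per the PR description its default is `true`, so it could be set explicitly in `core-site.xml` like the sketch below. The `<description>` wording here is illustrative, not the official hadoop-aws text.

```xml
<property>
  <name>fs.s3a.committer.abort.pending.uploads</name>
  <!-- default is true: job commit aborts all pending multipart uploads
       under the destination; a concurrent-jobs setup writing to the same
       destination would typically set this to false -->
  <value>true</value>
</property>
```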
[jira] [Commented] (HADOOP-19134) use StringBuilder instead of StringBuffer
[ https://issues.apache.org/jira/browse/HADOOP-19134?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17841393#comment-17841393 ]

ASF GitHub Bot commented on HADOOP-19134:
-----------------------------------------

hadoop-yetus commented on PR #6692:
URL: https://github.com/apache/hadoop/pull/6692#issuecomment-2080325011

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 31s | | Docker mode activated. |
| | _ Prechecks _ | | | |
| +1 :green_heart: | dupname | 0m 2s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 56 new or modified test files. |
| | _ trunk Compile Tests _ | | | |
| +0 :ok: | mvndep | 14m 20s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 32m 9s | | trunk passed |
| +1 :green_heart: | compile | 17m 26s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | compile | 16m 30s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | checkstyle | 4m 48s | | trunk passed |
| +1 :green_heart: | mvnsite | 20m 4s | | trunk passed |
| +1 :green_heart: | javadoc | 17m 42s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | javadoc | 17m 37s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | spotbugs | 31m 4s | | trunk passed |
| +1 :green_heart: | shadedclient | 34m 24s | | branch has no errors when building and testing our client artifacts. |
| | _ Patch Compile Tests _ | | | |
| +0 :ok: | mvndep | 0m 35s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 11m 45s | | the patch passed |
| +1 :green_heart: | compile | 16m 58s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | javac | 16m 58s | | the patch passed |
| +1 :green_heart: | compile | 16m 19s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | javac | 16m 19s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 4m 54s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6692/5/artifact/out/results-checkstyle-root.txt) | root: The patch generated 7 new + 3898 unchanged - 62 fixed = 3905 total (was 3960) |
| +1 :green_heart: | mvnsite | 20m 11s | | the patch passed |
| +1 :green_heart: | javadoc | 17m 44s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | javadoc | 17m 38s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | spotbugs | 35m 42s | | the patch passed |
| +1 :green_heart: | shadedclient | 34m 29s | | patch has no errors when building and testing our client artifacts. |
| | _ Other Tests _ | | | |
| +1 :green_heart: | unit | 19m 38s | | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 3m 54s | | hadoop-kms in the patch passed. |
| +1 :green_heart: | unit | 229m 0s | | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | unit | 1m 27s | | hadoop-yarn-api in the patch passed. |
| +1 :green_heart: | unit | 6m 7s | | hadoop-yarn-common in the patch passed. |
| +1 :green_heart: | unit | 4m 1s | | hadoop-yarn-server-common in the patch passed. |
| +1 :green_heart: | unit | 106m 1s | | hadoop-yarn-server-resourcemanager in the patch passed. |
| +1 :green_heart: | unit | 24m 46s | | hadoop-yarn-server-nodemanager in the patch passed. |
| +1 :green_heart: | unit | 28m 51s | | hadoop-yarn-client in the patch passed. |
| +1 :green_heart: | unit | 7m 51s | | hadoop-mapreduce-client-core in the patch passed. |
| +1 :green_heart: | unit | 1m 41s | | hadoop-mapreduce-client-common in the patch passed. |
| +1 :green_heart: | unit | 9m 6s | | hadoop-mapreduce-client-app in the patch passed. |
| +1 :green_heart: | unit | 4m 40s | | hadoop-mapreduce-client-hs in the patch passed. |
| +1 :green_heart: | unit | 127m 45s | | hadoop-mapreduce-client-jobclient in the patch passed. |
| +1 :green_heart: | unit | 24m
[jira] [Commented] (HADOOP-19158) S3A: Support ByteBufferPositionedReadable through vector IO
[ https://issues.apache.org/jira/browse/HADOOP-19158?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17841340#comment-17841340 ]

ASF GitHub Bot commented on HADOOP-19158:
-----------------------------------------

hadoop-yetus commented on PR #6773:
URL: https://github.com/apache/hadoop/pull/6773#issuecomment-2079947516

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 11m 50s | | Docker mode activated. |
| | _ Prechecks _ | | | |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
| | _ trunk Compile Tests _ | | | |
| +0 :ok: | mvndep | 14m 39s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 32m 13s | | trunk passed |
| +1 :green_heart: | compile | 17m 30s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | compile | 16m 26s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | checkstyle | 4m 24s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 41s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 55s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | javadoc | 1m 45s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | spotbugs | 3m 55s | | trunk passed |
| +1 :green_heart: | shadedclient | 34m 16s | | branch has no errors when building and testing our client artifacts. |
| | _ Patch Compile Tests _ | | | |
| +0 :ok: | mvndep | 0m 34s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 25s | | the patch passed |
| +1 :green_heart: | compile | 16m 56s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | javac | 16m 56s | | the patch passed |
| +1 :green_heart: | compile | 15m 53s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | javac | 15m 53s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 4m 13s | | the patch passed |
| +1 :green_heart: | mvnsite | 2m 36s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 51s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | javadoc | 1m 46s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | spotbugs | 4m 19s | | the patch passed |
| +1 :green_heart: | shadedclient | 34m 13s | | patch has no errors when building and testing our client artifacts. |
| | _ Other Tests _ | | | |
| +1 :green_heart: | unit | 19m 26s | | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 3m 19s | | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 1m 5s | | The patch does not generate ASF License warnings. |
| | | 258m 58s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.45 ServerAPI=1.45 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6773/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/6773 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux fac2d1cbde16 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 30f890203d1d43cda82d8182714803e099a4a65f |
| Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6773/2/testReport/ |
| Max. process+thread count | 1254 (vs. ulimit of 5500) |
| modules | C:
[jira] [Commented] (HADOOP-18679) Add API for bulk/paged object deletion
[ https://issues.apache.org/jira/browse/HADOOP-18679?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17841319#comment-17841319 ]

ASF GitHub Bot commented on HADOOP-18679:
-----------------------------------------

steveloughran commented on PR #6726:
URL: https://github.com/apache/hadoop/pull/6726#issuecomment-2079798916

iceberg poc pr https://github.com/apache/iceberg/pull/10233

> Add API for bulk/paged object deletion
> --------------------------------------
>
>                 Key: HADOOP-18679
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18679
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>    Affects Versions: 3.3.5
>            Reporter: Steve Loughran
>            Priority: Major
>              Labels: pull-request-available
>
> iceberg and hbase could benefit from being able to give a list of individual
> files to delete - files which may be scattered round the bucket for better
> read performance.
> Add some new optional interface for an object store which allows a caller to
> submit a list of paths to files to delete, where the expectation is
> * if a path is a file: delete
> * if a path is a dir, outcome undefined
> For s3 that'd let us build these into DeleteRequest objects, and submit,
> without any probes first.
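The contract sketched in the issue — submit a list of file paths, delete them in one batch, no per-path existence probes first — can be illustrated with a toy in-memory store. This is a hypothetical sketch, not the actual Hadoop API; all names below are invented for illustration. A real S3 implementation would pack the keys into `DeleteObjects` requests instead.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Hypothetical interface mirroring the proposal: the caller hands over a
// list of paths expected to be files; directory paths are undefined.
interface BulkDeleteSketch {
  /** Delete every path that names a file; returns the paths removed. */
  List<String> bulkDelete(List<String> paths);
}

// Toy in-memory "bucket" standing in for an object store.
class InMemoryStore implements BulkDeleteSketch {
  private final Set<String> objects = new HashSet<>();

  void put(String key) { objects.add(key); }
  boolean exists(String key) { return objects.contains(key); }

  @Override
  public List<String> bulkDelete(List<String> paths) {
    List<String> deleted = new ArrayList<>();
    for (String p : paths) {
      // remove() doubles as the check: no separate HEAD probe per key
      if (objects.remove(p)) {
        deleted.add(p);
      }
    }
    return deleted;
  }
}

class BulkDeleteDemo {
  public static void main(String[] args) {
    InMemoryStore store = new InMemoryStore();
    store.put("warehouse/db/t1/data-0001.orc");
    store.put("warehouse/db/t1/data-0002.orc");
    // one call covers files scattered across the bucket, plus a miss
    List<String> deleted = store.bulkDelete(
        Arrays.asList("warehouse/db/t1/data-0001.orc", "no/such/file"));
    System.out.println(deleted); // [warehouse/db/t1/data-0001.orc]
  }
}
```

The point of the batch shape is that a caller like Iceberg can issue one request per page of keys rather than one probe-plus-delete round trip per file.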
[jira] [Commented] (HADOOP-18717) Move CodecPool getCompressor/getDecompressor logs to DEBUG
[ https://issues.apache.org/jira/browse/HADOOP-18717?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17841314#comment-17841314 ]

ASF GitHub Bot commented on HADOOP-18717:
-----------------------------------------

steveloughran commented on code in PR #6445:
URL: https://github.com/apache/hadoop/pull/6445#discussion_r1581316396


##########
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/compress/CodecPool.java:
##########
@@ -150,7 +150,9 @@ public static Compressor getCompressor(CompressionCodec codec, Configuration con
     Compressor compressor = borrow(compressorPool, codec.getCompressorType());
     if (compressor == null) {
       compressor = codec.createCompressor();
-      LOG.info("Got brand-new compressor ["+codec.getDefaultExtension()+"]");
+      if(LOG.isDebugEnabled()) {

Review Comment:
   actually we can move to full slf4j here, so
   ```
   LOG.debug("Got brand-new compressor [{}]", codec.getDefaultExtension());
   ```
   I've looked at the codecs and apart from one case, `PassthroughCodec`, this is all a low cost invocation so no need to wrap. but that does log at info. so there is a good argument for guarding the logging.

   but: can you add a space between the `if` and the `(`

> Move CodecPool getCompressor/getDecompressor logs to DEBUG
> ----------------------------------------------------------
>
>                 Key: HADOOP-18717
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18717
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: common
>    Affects Versions: 3.3.6
>            Reporter: Claire McGinty
>            Priority: Trivial
>              Labels: pull-request-available
>
> The "Got brand new compressor|decompressor" logs in CodecPool[0] can be quite
> noisy when reading thousands of blocks and aren't that illuminating for the
> end user. I'd like to propose moving them from log.info to log.debug if
> there's no objection.
>
> [0]
> https://github.com/apache/hadoop/blob/b737869e01fe3334b948a38fe3835e48873bf3a6/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/compress/CodecPool.java#L149-L195
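The trade-off in that review — slf4j's `{}` parameters defer message formatting, but the argument expression is still evaluated before the call, so an `isDebugEnabled()` guard only pays off when computing the argument itself is costly — can be shown with a tiny stand-in logger. All names here are illustrative; this is not Hadoop's or slf4j's code.

```java
// Demonstrates that parameterized logging defers formatting but NOT
// argument evaluation, while a level guard skips the call entirely.
public class CodecPoolLoggingDemo {
  static int expensiveCalls = 0;

  // stand-in for codec.getDefaultExtension(), counting invocations
  static String defaultExtension() {
    expensiveCalls++;
    return ".gz";
  }

  // minimal slf4j-style logger: formats only when the level is enabled
  static class Logger {
    final boolean debugEnabled;
    Logger(boolean debugEnabled) { this.debugEnabled = debugEnabled; }
    boolean isDebugEnabled() { return debugEnabled; }
    void debug(String fmt, Object arg) {
      if (debugEnabled) {
        System.out.println(fmt.replace("{}", String.valueOf(arg)));
      }
    }
  }

  public static void main(String[] args) {
    Logger log = new Logger(false); // debug disabled

    // parameterized form: no string is built, but the argument
    // expression still runs before debug() is entered
    log.debug("Got brand-new compressor [{}]", defaultExtension());
    System.out.println(expensiveCalls); // prints 1

    // guarded form: nothing runs at all when the level is off —
    // worth it only when the argument is expensive to compute
    if (log.isDebugEnabled()) {
      log.debug("Got brand-new compressor [{}]", defaultExtension());
    }
    System.out.println(expensiveCalls); // still prints 1
  }
}
```

This matches the review's reasoning: since `getDefaultExtension()` is cheap in every codec except `PassthroughCodec`, the `{}` form alone is usually enough, and the guard is a judgment call.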
[jira] [Commented] (HADOOP-18516) [ABFS]: Support fixed SAS token config in addition to Custom SASTokenProvider Implementation
[ https://issues.apache.org/jira/browse/HADOOP-18516?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841312#comment-17841312 ] ASF GitHub Bot commented on HADOOP-18516: - steveloughran commented on code in PR #6552: URL: https://github.com/apache/hadoop/pull/6552#discussion_r1581305976 ## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/AbfsConfiguration.java: ## @@ -976,33 +977,60 @@ public AccessTokenProvider getTokenProvider() throws TokenAccessProviderExceptio } } + /** + * Returns the SASTokenProvider implementation to be used to generate SAS token. + * Users can choose between a custom implementation of {@link SASTokenProvider} + * or an in house implementation {@link FixedSASTokenProvider}. + * For Custom implementation "fs.azure.sas.token.provider.type" needs to be provided. + * For Fixed SAS Token use "fs.azure.sas.fixed.token" needs to be provided. + * In case both are provided, Preference will be given to Custom implementation. + * Avoid using a custom tokenProvider implementation just to read the configured + * fixed token, as this could create confusion. Also,implementing the SASTokenProvider + * requires relying on the raw configurations. It is more stable to depend on + * the AbfsConfiguration with which a filesystem is initialized, and eliminate + * chances of dynamic modifications and spurious situations. 
+ * @return sasTokenProvider object based on configurations provided + * @throws AzureBlobFileSystemException + */ public SASTokenProvider getSASTokenProvider() throws AzureBlobFileSystemException { AuthType authType = getEnum(FS_AZURE_ACCOUNT_AUTH_TYPE_PROPERTY_NAME, AuthType.SharedKey); if (authType != AuthType.SAS) { throw new SASTokenProviderException(String.format( -"Invalid auth type: %s is being used, expecting SAS", authType)); + "Invalid auth type: %s is being used, expecting SAS.", authType)); } try { - String configKey = FS_AZURE_SAS_TOKEN_PROVIDER_TYPE; - Class sasTokenProviderClass = - getTokenProviderClass(authType, configKey, null, - SASTokenProvider.class); - - Preconditions.checkArgument(sasTokenProviderClass != null, - String.format("The configuration value for \"%s\" is invalid.", configKey)); - - SASTokenProvider sasTokenProvider = ReflectionUtils - .newInstance(sasTokenProviderClass, rawConfig); - Preconditions.checkArgument(sasTokenProvider != null, - String.format("Failed to initialize %s", sasTokenProviderClass)); - - LOG.trace("Initializing {}", sasTokenProviderClass.getName()); - sasTokenProvider.initialize(rawConfig, accountName); - LOG.trace("{} init complete", sasTokenProviderClass.getName()); - return sasTokenProvider; + Class customSasTokenProviderImplementation = + getTokenProviderClass(authType, FS_AZURE_SAS_TOKEN_PROVIDER_TYPE, + null, SASTokenProvider.class); + String configuredFixedToken = this.rawConfig.get(FS_AZURE_SAS_FIXED_TOKEN, + null); + + Preconditions.checkArgument( + customSasTokenProviderImplementation != null || configuredFixedToken != null, + "At least one of the \"%s\" and \"%s\" must be set.", + FS_AZURE_SAS_TOKEN_PROVIDER_TYPE, FS_AZURE_SAS_FIXED_TOKEN); + + // Prefer Custom SASTokenProvider Implementation if configured. 
+ if (customSasTokenProviderImplementation != null) { +LOG.trace("Using Custom SASTokenProvider implementation because it is given precedence when it is set."); +SASTokenProvider sasTokenProvider = ReflectionUtils.newInstance( Review Comment: it's needed to support dynamic loading of the class provided in the configuration file. ## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/services/FixedSASTokenProvider.java: ## @@ -0,0 +1,46 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package org.apache.hadoop.fs.azurebfs.services; + +import java.io.IOException; + +import
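The precedence rule under review (a configured custom provider class wins over a configured fixed token, and at least one of the two must be set) can be sketched in isolation. This is a simplified stand-in for illustration only, not the AbfsConfiguration code itself:

```java
// Simplified sketch of the selection rule: a configured SASTokenProvider class
// takes precedence over a configured fixed SAS token; configuring neither is an error.
public class SasProviderChoice {

    // Returns which mechanism would be used, given the two (nullable) settings.
    static String choose(String providerClassName, String fixedToken) {
        if (providerClassName == null && fixedToken == null) {
            throw new IllegalArgumentException(
                "At least one of the provider class and the fixed token must be set.");
        }
        // Prefer the custom SASTokenProvider implementation if configured.
        if (providerClassName != null) {
            return "custom:" + providerClassName;
        }
        // Otherwise fall back to a FixedSASTokenProvider over the configured token.
        return "fixed";
    }
}
```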
[jira] [Commented] (HADOOP-18516) [ABFS]: Support fixed SAS token config in addition to Custom SASTokenProvider Implementation
[ https://issues.apache.org/jira/browse/HADOOP-18516?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841305#comment-17841305 ] ASF GitHub Bot commented on HADOOP-18516: - steveloughran commented on code in PR #6552: URL: https://github.com/apache/hadoop/pull/6552#discussion_r1581300456 ## hadoop-tools/hadoop-azure/src/test/java/org/apache/hadoop/fs/azurebfs/ITestAzureBlobFileSystemChooseSAS.java: ## @@ -0,0 +1,145 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ +package org.apache.hadoop.fs.azurebfs; + +import org.apache.hadoop.fs.FileSystem; +import org.apache.hadoop.fs.Path; +import org.apache.hadoop.fs.azurebfs.contracts.exceptions.AzureBlobFileSystemException; +import org.apache.hadoop.fs.azurebfs.contracts.exceptions.SASTokenProviderException; +import org.apache.hadoop.fs.azurebfs.contracts.exceptions.TokenAccessProviderException; +import org.apache.hadoop.fs.azurebfs.services.AuthType; +import org.apache.hadoop.fs.azurebfs.utils.AccountSASGenerator; +import org.apache.hadoop.fs.azurebfs.utils.Base64; +import org.apache.hadoop.fs.azurebfs.utils.TracingContext; +import org.junit.Assume; +import org.junit.Test; + +import java.io.IOException; + +import static org.apache.hadoop.fs.azurebfs.constants.ConfigurationKeys.FS_AZURE_SAS_FIXED_TOKEN; +import static org.apache.hadoop.fs.azurebfs.constants.ConfigurationKeys.FS_AZURE_SAS_TOKEN_PROVIDER_TYPE; +import static org.apache.hadoop.test.LambdaTestUtils.intercept; + +public class ITestAzureBlobFileSystemChooseSAS extends AbstractAbfsIntegrationTest{ + + private String accountSAS; + + public ITestAzureBlobFileSystemChooseSAS() throws Exception { +// The test uses shared key to create a random filesystem and then creates another +// instance of this filesystem using SAS authorization. 
+Assume.assumeTrue(this.getAuthType() == AuthType.SharedKey); + } + + private void generateAccountSAS() throws AzureBlobFileSystemException { +final String accountKey = getConfiguration().getStorageAccountKey(); +AccountSASGenerator configAccountSASGenerator = new AccountSASGenerator(Base64.decode(accountKey)); +accountSAS = configAccountSASGenerator.getAccountSAS(getAccountName()); + } + + @Override + public void setup() throws Exception { +createFilesystemForSASTests(); +super.setup(); +// obtaining an account SAS token from in-built generator to set as configuration for testing filesystem level operations +generateAccountSAS(); + } + + /** + * Tests the scenario where both the token provider class and a fixed token are configured: + * whether the correct choice is made (precedence given to token provider class), and the chosen SAS Token works as expected + * @throws Exception + */ + @Test + public void testBothProviderFixedTokenConfigured() throws Exception { +AbfsConfiguration testAbfsConfig = getConfiguration(); + +// configuring a SASTokenProvider class: this provides a user delegation SAS +// user delegation SAS Provider is set +// This easily distinguishes between results of filesystem level and blob level operations to ensure correct SAS is chosen, +// when both a provider class and fixed token is configured. 
+testAbfsConfig.set(FS_AZURE_SAS_TOKEN_PROVIDER_TYPE, "org.apache.hadoop.fs.azurebfs.extensions.MockDelegationSASTokenProvider"); + +// configuring the fixed SAS token +testAbfsConfig.set(FS_AZURE_SAS_FIXED_TOKEN, accountSAS); + +// creating a new fs instance with the updated configs +AzureBlobFileSystem newTestFs = (AzureBlobFileSystem) FileSystem.newInstance(testAbfsConfig.getRawConfiguration()); + +// testing a file system level operation +TracingContext tracingContext = getTestTracingContext(newTestFs, true); +// expected to fail in the ideal case, as delegation SAS will be chosen, provider class is given preference when both are configured +// this expectation is because filesystem level operations are beyond the scope of Delegation SAS Token +intercept(SASTokenProviderException.class, +() -> { +
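The intercept(...) idiom used in this test comes from Hadoop's LambdaTestUtils; its core behavior can be approximated in a few lines. This is a hedged, minimal re-implementation for illustration, not the real utility:

```java
import java.util.concurrent.Callable;

// Minimal sketch of the intercept pattern: run an action that is expected to
// throw, return the exception if it is of the expected type, and fail otherwise.
public class Intercept {
    static <E extends Exception> E intercept(Class<E> clazz, Callable<?> action) throws Exception {
        try {
            Object result = action.call();
            // AssertionError is not an Exception, so it escapes the catch below.
            throw new AssertionError("Expected " + clazz.getName()
                + " to be thrown, but got result: " + result);
        } catch (Exception e) {
            if (clazz.isInstance(e)) {
                return clazz.cast(e);   // the expected failure
            }
            throw e;                    // an unexpected failure propagates
        }
    }
}
```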
[jira] [Commented] (HADOOP-18679) Add API for bulk/paged object deletion
[ https://issues.apache.org/jira/browse/HADOOP-18679?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841304#comment-17841304 ] ASF GitHub Bot commented on HADOOP-18679: - steveloughran commented on code in PR #6726: URL: https://github.com/apache/hadoop/pull/6726#discussion_r1581288128 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/DefaultBulkDeleteOperation.java: ## @@ -17,61 +17,86 @@ */ package org.apache.hadoop.fs; +import java.io.FileNotFoundException; import java.io.IOException; import java.util.ArrayList; import java.util.Collection; import java.util.List; import java.util.Map; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + import org.apache.hadoop.util.functional.Tuples; import static java.util.Objects.requireNonNull; import static org.apache.hadoop.fs.BulkDeleteUtils.validateBulkDeletePaths; -import static org.apache.hadoop.util.Preconditions.checkArgument; /** * Default implementation of the {@link BulkDelete} interface. */ public class DefaultBulkDeleteOperation implements BulkDelete { -private final int pageSize; +private static Logger LOG = LoggerFactory.getLogger(DefaultBulkDeleteOperation.class); + +/** Default page size for bulk delete. */ +private static final int DEFAULT_PAGE_SIZE = 1; +/** Base path for the bulk delete operation. */ private final Path basePath; +/** Delegate File system make actual delete calls. 
*/ private final FileSystem fs; -public DefaultBulkDeleteOperation(int pageSize, - Path basePath, +public DefaultBulkDeleteOperation(Path basePath, FileSystem fs) { -checkArgument(pageSize == 1, "Page size must be equal to 1"); -this.pageSize = pageSize; this.basePath = requireNonNull(basePath); this.fs = fs; } @Override public int pageSize() { -return pageSize; +return DEFAULT_PAGE_SIZE; } @Override public Path basePath() { return basePath; } +/** + * {@inheritDoc} + */ @Override public List> bulkDelete(Collection paths) throws IOException, IllegalArgumentException { -validateBulkDeletePaths(paths, pageSize, basePath); +validateBulkDeletePaths(paths, DEFAULT_PAGE_SIZE, basePath); List> result = new ArrayList<>(); -// this for loop doesn't make sense as pageSize must be 1. -for (Path path : paths) { +if (!paths.isEmpty()) { +// As the page size is always 1, this should be the only one +// path in the collection. +Path pathToDelete = paths.iterator().next(); try { -fs.delete(path, false); -// What to do if this return false? -// I think we should add the path to the result list with value "Not Deleted". -} catch (IOException e) { -result.add(Tuples.pair(path, e.toString())); +boolean deleted = fs.delete(pathToDelete, false); +if (deleted) { +return result; +} else { +try { +FileStatus fileStatus = fs.getFileStatus(pathToDelete); +if (fileStatus.isDirectory()) { +result.add(Tuples.pair(pathToDelete, "Path is a directory")); +} +} catch (FileNotFoundException e) { +// Ignore FNFE and don't add to the result list. 
+LOG.debug("Couldn't delete {} - does not exist: {}", pathToDelete, e.toString()); +} catch (Exception e) { +LOG.debug("Couldn't delete {} - exception occurred: {}", pathToDelete, e.toString()); +result.add(Tuples.pair(pathToDelete, e.toString())); +} +} +} catch (Exception ex) { Review Comment: make this an IOException ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/DefaultBulkDeleteOperation.java: ## @@ -17,61 +17,86 @@ */ package org.apache.hadoop.fs; +import java.io.FileNotFoundException; import java.io.IOException; import java.util.ArrayList; import java.util.Collection; import java.util.List; import java.util.Map; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + import org.apache.hadoop.util.functional.Tuples; import static java.util.Objects.requireNonNull; import static org.apache.hadoop.fs.BulkDeleteUtils.validateBulkDeletePaths; -import static org.apache.hadoop.util.Preconditions.checkArgument; /** * Default implementation of the {@link BulkDelete} interface. */ public class DefaultBulkDeleteOperation implements BulkDelete {
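Since the default implementation pins the page size to 1, the contract being reviewed reduces to: try a non-recursive delete of the single path in the page, return an empty list on success (including the path not existing), and a single (path, reason) pair otherwise. A toy, set-backed sketch of that contract (the sets stand in for a real FileSystem; this is not the Hadoop code):

```java
import java.util.AbstractMap;
import java.util.ArrayList;
import java.util.Collection;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Toy model of the page-size-1 bulk delete. Success, or a missing path,
// yields an empty result list; a directory is reported but not deleted.
public class TinyBulkDelete {
    final Set<String> files = new HashSet<>();
    final Set<String> dirs = new HashSet<>();

    List<Map.Entry<String, String>> bulkDelete(Collection<String> paths) {
        List<Map.Entry<String, String>> result = new ArrayList<>();
        if (paths.isEmpty()) {
            return result;
        }
        // Page size is fixed at 1, so only the single path in the page matters.
        String path = paths.iterator().next();
        if (files.remove(path)) {
            return result;                         // deleted successfully
        }
        if (dirs.contains(path)) {
            // Mirrors the reviewed code: a directory is reported, not deleted.
            result.add(new AbstractMap.SimpleEntry<>(path, "Path is a directory"));
        }
        // A nonexistent path is treated as success and not added to the result.
        return result;
    }
}
```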
[jira] [Commented] (HADOOP-19058) [JDK-17] Fix UT Failures in hadoop common, hdfs, yarn
[ https://issues.apache.org/jira/browse/HADOOP-19058?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841237#comment-17841237 ] ASF GitHub Bot commented on HADOOP-19058: - steveloughran commented on code in PR #6531: URL: https://github.com/apache/hadoop/pull/6531#discussion_r1481361766 ## hadoop-project/pom.xml: ## @@ -168,7 +168,18 @@ [3.3.0,) [JDK-17] Fix UT Failures in hadoop common, hdfs, yarn > - > > Key: HADOOP-19058 > URL: https://issues.apache.org/jira/browse/HADOOP-19058 > Project: Hadoop Common > Issue Type: Sub-task >Reporter: Bilwa S T >Assignee: Bilwa S T >Priority: Major > Labels: pull-request-available > > Most of the UT's failed with below exception: > Caused by: java.lang.ExceptionInInitializerError: Exception > java.lang.reflect.InaccessibleObjectException: Unable to make protected final > java.lang.Class > java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int) throws > java.lang.ClassFormatError accessible: module java.base does not "opens > java.lang" to unnamed module @d13f7c [in thread "Time-limited test"] -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
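The InaccessibleObjectException quoted above is the standard JDK 17 module-access failure: reflective access to java.base internals is denied unless the package is explicitly opened. The usual workaround is to open the offending packages to the unnamed module in the test JVM's argLine. A hedged sketch of a surefire configuration (the exact --add-opens list is an assumption here and depends on which packages the failing tests actually touch):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <argLine>
      --add-opens java.base/java.lang=ALL-UNNAMED
      --add-opens java.base/java.util=ALL-UNNAMED
    </argLine>
  </configuration>
</plugin>
```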
[jira] [Commented] (HADOOP-19152) Do not hard code security providers.
[ https://issues.apache.org/jira/browse/HADOOP-19152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841234#comment-17841234 ] ASF GitHub Bot commented on HADOOP-19152: - steveloughran commented on code in PR #6739: URL: https://github.com/apache/hadoop/pull/6739#discussion_r1581033141 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/crypto/CryptoUtils.java: ## @@ -55,15 +58,18 @@ public static String getJceProvider(Configuration conf) { CommonConfigurationKeysPublic.HADOOP_SECURITY_CRYPTO_JCE_PROVIDER_AUTO_ADD_KEY, CommonConfigurationKeysPublic.HADOOP_SECURITY_CRYPTO_JCE_PROVIDER_AUTO_ADD_DEFAULT); -// For backward compatible, auto-add BOUNCY_CASTLE_PROVIDER_CLASS. -if (autoAdd && !provider.isEmpty()) { +// For backward compatible, auto-add BOUNCY_CASTLE_PROVIDER_CLASS when the provider is "BC". +if (autoAdd && PROVIDER_NAME.equals(provider)) { try { // Use reflection in order to avoid statically loading the class. final Class clazz = Class.forName(BOUNCY_CASTLE_PROVIDER_CLASS); -final Field provider_name = clazz.getField("PROVIDER_NAME"); -if (provider.equals(provider_name.get(null))) { +final Field providerName = clazz.getField("PROVIDER_NAME"); Review Comment: I dont think this is needed any more. If it is, use the constant `PROVIDER_NAME_FIELD`, but really, given we know what string we are looking for, no need to ask for the field or check it again ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/crypto/CryptoUtils.java: ## @@ -40,6 +41,8 @@ public class CryptoUtils { = "org.bouncycastle.jce.provider.BouncyCastleProvider"; private static final String PROVIDER_NAME_FIELD = "PROVIDER_NAME"; + static final String PROVIDER_NAME = "BC"; Review Comment: make private add a javadoc, and give it a name like BOUNCY_CASTLE_PROVIDER_NAME > Do not hard code security providers. 
> > > Key: HADOOP-19152 > URL: https://issues.apache.org/jira/browse/HADOOP-19152 > Project: Hadoop Common > Issue Type: Improvement > Components: security >Reporter: Tsz-wo Sze >Assignee: Tsz-wo Sze >Priority: Major > Labels: pull-request-available > > In order to support different security providers in different clusters, we > should not hard code a provider in our code. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
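The code under review loads BouncyCastleProvider reflectively so there is no compile-time dependency on it. The same pattern can be shown with a stand-in provider; DemoProvider below is hypothetical, used only because BouncyCastle is not assumed on the classpath:

```java
import java.security.Provider;
import java.security.Security;

public class ProviderLoader {

    // Hypothetical stand-in for a third-party provider such as BouncyCastleProvider.
    public static class DemoProvider extends Provider {
        public DemoProvider() {
            super("DEMO", 1.0, "illustrative provider, not a real JCE implementation");
        }
    }

    // Mirrors the reviewed pattern: resolve the provider class by name at runtime
    // and register it with the JCA, instead of hard-coding a class reference.
    static Provider addProviderByName(String className) throws Exception {
        Class<?> clazz = Class.forName(className);
        Provider provider = (Provider) clazz.getDeclaredConstructor().newInstance();
        Security.addProvider(provider);
        return provider;
    }
}
```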
[jira] [Commented] (HADOOP-19136) Upgrade commons-io to 2.16.1
[ https://issues.apache.org/jira/browse/HADOOP-19136?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841173#comment-17841173 ] ASF GitHub Bot commented on HADOOP-19136: - hadoop-yetus commented on PR #6704: URL: https://github.com/apache/hadoop/pull/6704#issuecomment-2079208576 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 47s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 1s | | xmllint was not available. | | +0 :ok: | shelldocs | 0m 1s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 15m 2s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 37m 38s | | trunk passed | | +1 :green_heart: | compile | 19m 15s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 17m 34s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | mvnsite | 23m 43s | | trunk passed | | +1 :green_heart: | javadoc | 8m 57s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 7m 53s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | shadedclient | 54m 56s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 35s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 35m 46s | | the patch passed | | +1 :green_heart: | compile | 18m 40s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 18m 40s | | the patch passed | | +1 :green_heart: | compile | 17m 27s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 17m 27s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 17m 36s | | the patch passed | | +1 :green_heart: | shellcheck | 0m 0s | | No new issues. | | +1 :green_heart: | javadoc | 8m 53s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 7m 42s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | shadedclient | 55m 24s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 853m 30s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6704/5/artifact/out/patch-unit-root.txt) | root in the patch passed. | | +1 :green_heart: | asflicense | 1m 36s | | The patch does not generate ASF License warnings. 
| | | | 1176m 2s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.hdfs.server.datanode.TestLargeBlockReport | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.44 ServerAPI=1.44 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6704/5/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/6704 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint shellcheck shelldocs | | uname | Linux 95ceb5885b5c 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 9965c286709d6e5880b91e656088ef7766ea3b94 | | Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6704/5/testReport/ |
[jira] [Commented] (HADOOP-18717) Move CodecPool getCompressor/getDecompressor logs to DEBUG
[ https://issues.apache.org/jira/browse/HADOOP-18717?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841156#comment-17841156 ] ASF GitHub Bot commented on HADOOP-18717: - hadoop-yetus commented on PR #6445: URL: https://github.com/apache/hadoop/pull/6445#issuecomment-2079166908 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 00s | | No case conflicting files found. | | +0 :ok: | spotbugs | 0m 01s | | spotbugs executables are not available. | | +0 :ok: | codespell | 0m 01s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 01s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 00s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 128m 09s | | trunk passed | | +1 :green_heart: | compile | 60m 23s | | trunk passed | | +1 :green_heart: | checkstyle | 7m 22s | | trunk passed | | -1 :x: | mvnsite | 6m 48s | [/branch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6445/1/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in trunk failed. | | +1 :green_heart: | javadoc | 7m 14s | | trunk passed | | +1 :green_heart: | shadedclient | 212m 33s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 7m 34s | | the patch passed | | +1 :green_heart: | compile | 57m 43s | | the patch passed | | +1 :green_heart: | javac | 57m 43s | | the patch passed | | +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. 
| | +1 :green_heart: | checkstyle | 7m 10s | | the patch passed | | -1 :x: | mvnsite | 6m 48s | [/patch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6445/1/artifact/out/patch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | +1 :green_heart: | javadoc | 7m 22s | | the patch passed | | +1 :green_heart: | shadedclient | 228m 46s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 9m 30s | | The patch does not generate ASF License warnings. | | | | 715m 46s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6445 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | MINGW64_NT-10.0-17763 2356a482a111 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / d3b62d5b9c5b601b5ec2b8baa6e900c12c6aa372 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6445/1/testReport/ | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6445/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. 
> Move CodecPool getCompressor/getDecompressor logs to DEBUG > -- > > Key: HADOOP-18717 > URL: https://issues.apache.org/jira/browse/HADOOP-18717 > Project: Hadoop Common > Issue Type: Improvement > Components: common >Affects Versions: 3.3.6 >Reporter: Claire McGinty >Priority: Trivial > Labels: pull-request-available > > The "Got brand new compressor|decompressor" logs in CodecPool[0] can be quite > noisy when reading thousands of blocks and aren't that illuminating for the > end user. I'd like to propose moving them from log.info to log.debug if > there's no objection. > > [0] >
[jira] [Commented] (HADOOP-19156) ZooKeeper based state stores use different ZK address configs
[ https://issues.apache.org/jira/browse/HADOOP-19156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841139#comment-17841139 ] ASF GitHub Bot commented on HADOOP-19156: - hadoop-yetus commented on PR #6767: URL: https://github.com/apache/hadoop/pull/6767#issuecomment-2079146708 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 25m 38s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 12 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 32s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 37m 23s | | trunk passed | | +1 :green_heart: | compile | 19m 8s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 18m 40s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 4m 40s | | trunk passed | | +1 :green_heart: | mvnsite | 6m 31s | | trunk passed | | +1 :green_heart: | javadoc | 5m 48s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 5m 9s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 11m 43s | | trunk passed | | +1 :green_heart: | shadedclient | 40m 3s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 32s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 4m 0s | | the patch passed | | +1 :green_heart: | compile | 18m 16s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 18m 16s | | the patch passed | | +1 :green_heart: | compile | 17m 45s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 17m 45s | | the patch passed | | -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6767/2/artifact/out/blanks-eol.txt) | The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply | | +1 :green_heart: | checkstyle | 4m 36s | | the patch passed | | +1 :green_heart: | mvnsite | 6m 23s | | the patch passed | | +1 :green_heart: | javadoc | 5m 48s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 5m 7s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 12m 48s | | the patch passed | | +1 :green_heart: | shadedclient | 40m 8s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 20m 56s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 1m 12s | | hadoop-yarn-api in the patch passed. | | +1 :green_heart: | unit | 5m 42s | | hadoop-yarn-common in the patch passed. | | +1 :green_heart: | unit | 4m 8s | | hadoop-yarn-server-common in the patch passed. | | +1 :green_heart: | unit | 108m 48s | | hadoop-yarn-server-resourcemanager in the patch passed. | | +1 :green_heart: | unit | 33m 27s | | hadoop-hdfs-rbf in the patch passed. | | +1 :green_heart: | asflicense | 1m 1s | | The patch does not generate ASF License warnings. 
| | | | 487m 14s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.44 ServerAPI=1.44 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6767/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/6767 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint | | uname | Linux 35c89bc7ab28 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh |
[jira] [Commented] (HADOOP-19029) Migrate abstract permission tests to AssertJ
[ https://issues.apache.org/jira/browse/HADOOP-19029?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841131#comment-17841131 ] ASF GitHub Bot commented on HADOOP-19029: - hadoop-yetus commented on PR #6418: URL: https://github.com/apache/hadoop/pull/6418#issuecomment-2079062284 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 00s | | No case conflicting files found. | | +0 :ok: | spotbugs | 0m 01s | | spotbugs executables are not available. | | +0 :ok: | codespell | 0m 01s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 01s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 00s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 91m 07s | | trunk passed | | +1 :green_heart: | compile | 40m 08s | | trunk passed | | +1 :green_heart: | checkstyle | 4m 39s | | trunk passed | | -1 :x: | mvnsite | 4m 21s | [/branch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6418/1/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in trunk failed. | | +1 :green_heart: | javadoc | 4m 46s | | trunk passed | | +1 :green_heart: | shadedclient | 149m 00s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 5m 06s | | the patch passed | | +1 :green_heart: | compile | 38m 27s | | the patch passed | | +1 :green_heart: | javac | 38m 27s | | the patch passed | | +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. 
| | +1 :green_heart: | checkstyle | 4m 39s | | the patch passed | | -1 :x: | mvnsite | 4m 28s | [/patch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6418/1/artifact/out/patch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | +1 :green_heart: | javadoc | 4m 49s | | the patch passed | | +1 :green_heart: | shadedclient | 161m 13s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 5m 38s | | The patch does not generate ASF License warnings. | | | | 498m 28s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6418 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | MINGW64_NT-10.0-17763 c28cbc285662 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / 6e74a1f0377a3ebfba6eafbf1be482714c146e38 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6418/1/testReport/ | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6418/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. 
> Migrate abstract permission tests to AssertJ > > > Key: HADOOP-19029 > URL: https://issues.apache.org/jira/browse/HADOOP-19029 > Project: Hadoop Common > Issue Type: Improvement >Reporter: Zhaobo Huang >Priority: Major > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-18679) Add API for bulk/paged object deletion
[ https://issues.apache.org/jira/browse/HADOOP-18679?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841095#comment-17841095 ] ASF GitHub Bot commented on HADOOP-18679: - hadoop-yetus commented on PR #6726: URL: https://github.com/apache/hadoop/pull/6726#issuecomment-2078874481 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 22m 31s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 10 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 58s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 39m 38s | | trunk passed | | +1 :green_heart: | compile | 23m 30s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 20m 16s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 5m 52s | | trunk passed | | +1 :green_heart: | mvnsite | 5m 23s | | trunk passed | | +1 :green_heart: | javadoc | 3m 59s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 4m 23s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 8m 57s | | trunk passed | | +1 :green_heart: | shadedclient | 43m 25s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 37s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 3m 23s | | the patch passed | | +1 :green_heart: | compile | 22m 41s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 22m 41s | | the patch passed | | +1 :green_heart: | compile | 21m 12s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 21m 12s | | the patch passed | | -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6726/5/artifact/out/blanks-eol.txt) | The patch has 8 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply | | -0 :warning: | checkstyle | 5m 34s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6726/5/artifact/out/results-checkstyle-root.txt) | root: The patch generated 409 new + 106 unchanged - 0 fixed = 515 total (was 106) | | +1 :green_heart: | mvnsite | 5m 35s | | the patch passed | | -1 :x: | javadoc | 1m 52s | [/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6726/5/artifact/out/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1.txt) | hadoop-common-project_hadoop-common-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 generated 2 new + 0 unchanged - 0 fixed = 2 total (was 0) | | -1 :x: | javadoc | 0m 43s | [/results-javadoc-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6726/5/artifact/out/results-javadoc-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt) | 
hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 generated 3 new + 0 unchanged - 0 fixed = 3 total (was 0) | | +1 :green_heart: | spotbugs | 9m 37s | | the patch passed | | +1 :green_heart: | shadedclient | 40m 41s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 5m 45s | | hadoop-common in the patch passed. | | -1 :x: | unit | 275m 3s |
[jira] [Commented] (HADOOP-18679) Add API for bulk/paged object deletion
[ https://issues.apache.org/jira/browse/HADOOP-18679?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841074#comment-17841074 ] ASF GitHub Bot commented on HADOOP-18679: - hadoop-yetus commented on PR #6726: URL: https://github.com/apache/hadoop/pull/6726#issuecomment-2078773861 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 18m 18s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 1s | | xmllint was not available. | | +0 :ok: | markdownlint | 0m 1s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 10 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 48s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 36m 32s | | trunk passed | | +1 :green_heart: | compile | 19m 45s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 18m 7s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 4m 54s | | trunk passed | | +1 :green_heart: | mvnsite | 5m 9s | | trunk passed | | +1 :green_heart: | javadoc | 3m 59s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 4m 22s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 8m 49s | | trunk passed | | +1 :green_heart: | shadedclient | 39m 33s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 57s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 3m 14s | | the patch passed | | +1 :green_heart: | compile | 18m 46s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 18m 46s | | the patch passed | | +1 :green_heart: | compile | 17m 56s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 17m 56s | | the patch passed | | -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6726/4/artifact/out/blanks-eol.txt) | The patch has 8 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply | | -0 :warning: | checkstyle | 4m 49s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6726/4/artifact/out/results-checkstyle-root.txt) | root: The patch generated 409 new + 106 unchanged - 0 fixed = 515 total (was 106) | | +1 :green_heart: | mvnsite | 5m 5s | | the patch passed | | -1 :x: | javadoc | 1m 11s | [/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6726/4/artifact/out/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1.txt) | hadoop-common-project_hadoop-common-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 generated 2 new + 0 unchanged - 0 fixed = 2 total (was 0) | | -1 :x: | javadoc | 0m 47s | [/results-javadoc-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6726/4/artifact/out/results-javadoc-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt) | 
hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 generated 3 new + 0 unchanged - 0 fixed = 3 total (was 0) | | +1 :green_heart: | spotbugs | 9m 24s | | the patch passed | | +1 :green_heart: | shadedclient | 40m 5s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 5m 45s | | hadoop-common in the patch passed. | | -1 :x: | unit | 267m 9s |
[jira] [Commented] (HADOOP-19073) WASB: Fix connection leak in FolderRenamePending
[ https://issues.apache.org/jira/browse/HADOOP-19073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841059#comment-17841059 ] ASF GitHub Bot commented on HADOOP-19073: - xuzifu666 commented on PR #6534: URL: https://github.com/apache/hadoop/pull/6534#issuecomment-2078699796 @steveloughran could you give the final review or close the pr? Thanks > WASB: Fix connection leak in FolderRenamePending > > > Key: HADOOP-19073 > URL: https://issues.apache.org/jira/browse/HADOOP-19073 > Project: Hadoop Common > Issue Type: Bug > Components: fs/azure >Affects Versions: 3.3.6 >Reporter: xy >Priority: Major > Labels: pull-request-available > > Fix connection leak in FolderRenamePending in getting bytes -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
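The leak described above (a stream opened while getting the rename-pending bytes but not closed on every path) is generically fixed with try-with-resources, which guarantees `close()` even when a read throws. A minimal sketch of that pattern, using a hypothetical `readAllBytes` helper rather than the actual `FolderRenamePending` code:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamLeakSketch {
    // Hypothetical helper illustrating the fix pattern: try-with-resources
    // closes the stream on all paths, including when read() throws, so the
    // underlying connection cannot leak.
    static byte[] readAllBytes(InputStream in) throws IOException {
        try (InputStream s = in;
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = s.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return out.toByteArray();
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] bytes = readAllBytes(new ByteArrayInputStream("rename-pending".getBytes()));
        System.out.println(bytes.length); // prints 14
    }
}
```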
[jira] [Commented] (HADOOP-18938) S3A region logic to handle vpce and non standard endpoints
[ https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841054#comment-17841054 ] ASF GitHub Bot commented on HADOOP-18938: - hadoop-yetus commented on PR #6466: URL: https://github.com/apache/hadoop/pull/6466#issuecomment-2078647702 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 01s | | No case conflicting files found. | | +0 :ok: | spotbugs | 0m 00s | | spotbugs executables are not available. | | +0 :ok: | codespell | 0m 00s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 00s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 01s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 00s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 108m 16s | | trunk passed | | +1 :green_heart: | compile | 5m 41s | | trunk passed | | +1 :green_heart: | checkstyle | 5m 21s | | trunk passed | | +1 :green_heart: | mvnsite | 6m 00s | | trunk passed | | +1 :green_heart: | javadoc | 5m 40s | | trunk passed | | +1 :green_heart: | shadedclient | 173m 14s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 4m 11s | | the patch passed | | +1 :green_heart: | compile | 2m 42s | | the patch passed | | +1 :green_heart: | javac | 2m 42s | | the patch passed | | +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 2m 25s | | the patch passed | | +1 :green_heart: | mvnsite | 2m 52s | | the patch passed | | +1 :green_heart: | javadoc | 2m 34s | | the patch passed | | +1 :green_heart: | shadedclient | 191m 32s | | patch has no errors when building and testing our client artifacts. 
| _ Other Tests _ | | +1 :green_heart: | asflicense | 6m 49s | | The patch does not generate ASF License warnings. | | | | 501m 31s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6466 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | MINGW64_NT-10.0-17763 c92fbd9e5cae 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / 97360ba71f24df4cfc2d44f2f05c1bee0129a968 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6466/1/testReport/ | | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6466/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > S3A region logic to handle vpce and non standard endpoints > --- > > Key: HADOOP-18938 > URL: https://issues.apache.org/jira/browse/HADOOP-18938 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.4.0 >Reporter: Ahmar Suhail >Priority: Major > Labels: pull-request-available > > For non-standard endpoints such as VPCE the region parsing added in > HADOOP-18908 doesn't work. This is expected as that logic is only meant to be > used for standard endpoints. > If you are using a non-standard endpoint, check if a region is also provided, > else fail fast. > Also update documentation to explain the region and endpoint behaviour with > SDK V2. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
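In configuration terms, the fail-fast rule described in the issue means: when `fs.s3a.endpoint` points at a VPCE or other non-standard hostname, the region cannot be parsed from the endpoint, so it must be supplied explicitly via `fs.s3a.endpoint.region`. A hedged sketch (the VPCE hostname below is a placeholder, not a real endpoint):

```xml
<!-- Region cannot be inferred from a VPCE hostname, so set it explicitly. -->
<property>
  <name>fs.s3a.endpoint</name>
  <value>vpce-example.s3.us-east-1.vpce.amazonaws.com</value>
</property>
<property>
  <name>fs.s3a.endpoint.region</name>
  <value>us-east-1</value>
</property>
```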
[jira] [Commented] (HADOOP-18682) Move hadoop docker scripts under the main source code
[ https://issues.apache.org/jira/browse/HADOOP-18682?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841050#comment-17841050 ] ASF GitHub Bot commented on HADOOP-18682: - hadoop-yetus commented on PR #6483: URL: https://github.com/apache/hadoop/pull/6483#issuecomment-2078631754 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 02s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 02s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 02s | | detect-secrets was not available. | | +0 :ok: | markdownlint | 0m 02s | | markdownlint was not available. | | +0 :ok: | xmllint | 0m 00s | | xmllint was not available. | | +0 :ok: | shellcheck | 0m 01s | | Shellcheck was not available. | | +0 :ok: | shelldocs | 0m 01s | | Shelldocs was not available. | | +0 :ok: | yamllint | 0m 01s | | yamllint was not available. | | +0 :ok: | hadolint | 0m 01s | | hadolint was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 00s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 4m 56s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 108m 15s | | trunk passed | | +1 :green_heart: | compile | 47m 52s | | trunk passed | | -1 :x: | mvnsite | 28m 48s | [/branch-mvnsite-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6483/1/artifact/out/branch-mvnsite-root.txt) | root in trunk failed. | | +1 :green_heart: | javadoc | 19m 14s | | trunk passed | | +1 :green_heart: | shadedclient | 390m 28s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 5m 06s | | Maven dependency ordering for patch | | -1 :x: | mvninstall | 3m 08s | [/patch-mvninstall-hadoop-dist.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6483/1/artifact/out/patch-mvninstall-hadoop-dist.txt) | hadoop-dist in the patch failed. | | +1 :green_heart: | compile | 48m 07s | | the patch passed | | +1 :green_heart: | javac | 48m 07s | | the patch passed | | +1 :green_heart: | blanks | 0m 01s | | The patch has no blanks issues. | | -1 :x: | mvnsite | 28m 00s | [/patch-mvnsite-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6483/1/artifact/out/patch-mvnsite-root.txt) | root in the patch failed. | | +1 :green_heart: | javadoc | 19m 56s | | the patch passed | | +1 :green_heart: | shadedclient | 232m 42s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 7m 47s | | The patch does not generate ASF License warnings. | | | | 813m 26s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6483 | | Optional Tests | dupname asflicense mvnsite codespell detsecrets markdownlint compile javac javadoc mvninstall unit shadedclient xmllint shellcheck shelldocs yamllint hadolint | | uname | MINGW64_NT-10.0-17763 34cb8b2b35a4 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / 03d600f0a51e9cefa7282856215c77e63196583b | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6483/1/testReport/ | | modules | C: hadoop-common-project/hadoop-common hadoop-dist . U: . 
| | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6483/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > Move hadoop docker scripts under the main source code > - > > Key: HADOOP-18682 > URL: https://issues.apache.org/jira/browse/HADOOP-18682 > Project: Hadoop Common > Issue Type: Improvement >Reporter: Ayush Saxena >Assignee: Christos Bisias >Priority: Major > Labels: pull-request-available > > Exploratory: > Coming from
[jira] [Commented] (HADOOP-19134) use StringBuilder instead of StringBuffer
[ https://issues.apache.org/jira/browse/HADOOP-19134?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841045#comment-17841045 ] ASF GitHub Bot commented on HADOOP-19134: - hadoop-yetus commented on PR #6692: URL: https://github.com/apache/hadoop/pull/6692#issuecomment-2078615455 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 29s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 2s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 56 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 50s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 31m 53s | | trunk passed | | +1 :green_heart: | compile | 16m 57s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 15m 36s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 4m 55s | | trunk passed | | +1 :green_heart: | mvnsite | 18m 52s | | trunk passed | | +1 :green_heart: | javadoc | 16m 31s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 16m 28s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 29m 55s | | trunk passed | | +1 :green_heart: | shadedclient | 33m 37s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 32s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 11m 15s | | the patch passed | | +1 :green_heart: | compile | 16m 34s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 16m 34s | | the patch passed | | +1 :green_heart: | compile | 15m 37s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 15m 37s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 4m 43s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6692/4/artifact/out/results-checkstyle-root.txt) | root: The patch generated 3 new + 3956 unchanged - 6 fixed = 3959 total (was 3962) | | +1 :green_heart: | mvnsite | 18m 52s | | the patch passed | | +1 :green_heart: | javadoc | 16m 37s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 16m 29s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 34m 21s | | the patch passed | | +1 :green_heart: | shadedclient | 34m 25s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 19m 19s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 3m 48s | | hadoop-kms in the patch passed. | | -1 :x: | unit | 227m 46s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6692/4/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch failed. | | +1 :green_heart: | unit | 1m 20s | | hadoop-yarn-api in the patch passed. | | +1 :green_heart: | unit | 5m 59s | | hadoop-yarn-common in the patch passed. | | +1 :green_heart: | unit | 3m 55s | | hadoop-yarn-server-common in the patch passed. 
| | +1 :green_heart: | unit | 105m 42s | | hadoop-yarn-server-resourcemanager in the patch passed. | | +1 :green_heart: | unit | 24m 59s | | hadoop-yarn-server-nodemanager in the patch passed. | | +1 :green_heart: | unit | 28m 44s | | hadoop-yarn-client in the patch passed. | | +1 :green_heart: | unit | 7m 47s | | hadoop-mapreduce-client-core in the patch passed. | | +1 :green_heart: | unit | 1m 37s | | hadoop-mapreduce-client-common in the patch passed. | | +1 :green_heart: | unit | 9m 9s | | hadoop-mapreduce-client-app in the patch passed. | | +1 :green_heart: | unit | 4m 35s | | hadoop-mapreduce-client-hs in
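The change HADOOP-19134 makes is mechanical: `StringBuffer` synchronizes every `append()`, which is wasted cost when the buffer never escapes a single thread, while `StringBuilder` offers the same API without the locking. A minimal illustration of the preferred class (a hypothetical helper, not code from the patch):

```java
public class BuilderSketch {
    // StringBuilder is the unsynchronized drop-in replacement for
    // StringBuffer; for single-threaded string assembly it avoids
    // pointless monitor acquisition on every call.
    static String joinWithBuilder(String[] parts, char sep) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < parts.length; i++) {
            if (i > 0) {
                sb.append(sep);
            }
            sb.append(parts[i]);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(joinWithBuilder(new String[] {"a", "b", "c"}, ',')); // prints a,b,c
    }
}
```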
[jira] [Commented] (HADOOP-19139) [ABFS]: No GetPathStatus call for opening AbfsInputStream
[ https://issues.apache.org/jira/browse/HADOOP-19139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841037#comment-17841037 ] ASF GitHub Bot commented on HADOOP-19139: - saxenapranav commented on PR #6699: URL: https://github.com/apache/hadoop/pull/6699#issuecomment-2078578293 @steveloughran , @mukund-thakur , @mehakmeet , requesting your kind review please. Thanks! > [ABFS]: No GetPathStatus call for opening AbfsInputStream > - > > Key: HADOOP-19139 > URL: https://issues.apache.org/jira/browse/HADOOP-19139 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/azure >Reporter: Pranav Saxena >Assignee: Pranav Saxena >Priority: Major > Labels: pull-request-available > > The read API returns the contentLen and eTag of the path, and that > information is reused in subsequent calls on the same inputStream, so prior > knowledge of the eTag is of little importance. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-19036) Standardize Maven Initialization Across Operating Systems
[ https://issues.apache.org/jira/browse/HADOOP-19036?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841013#comment-17841013 ] ASF GitHub Bot commented on HADOOP-19036: - hadoop-yetus commented on PR #6444: URL: https://github.com/apache/hadoop/pull/6444#issuecomment-2078467162 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | -1 :x: | patch | 0m 55s | | https://github.com/apache/hadoop/pull/6444 does not apply to trunk. Rebase required? Wrong Branch? See https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute for help. | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6444 | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6444/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > Standardize Maven Initialization Across Operating Systems > - > > Key: HADOOP-19036 > URL: https://issues.apache.org/jira/browse/HADOOP-19036 > Project: Hadoop Common > Issue Type: Improvement > Components: build >Affects Versions: 3.4.0, 3.5.0 >Reporter: Shilun Fan >Assignee: Shilun Fan >Priority: Major > Labels: pull-request-available > > The differences in initializing Maven for various operating systems in the > build scripts are as follows: > - For Ubuntu and Debian, Maven is installed from the apt repository. > - For CentOS 7 and CentOS 8, Maven is downloaded remotely. > - The Maven version used on Windows is inconsistent with other operating > systems. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-19073) WASB: Fix connection leak in FolderRenamePending
[ https://issues.apache.org/jira/browse/HADOOP-19073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17841002#comment-17841002 ] ASF GitHub Bot commented on HADOOP-19073: - xuzifu666 commented on PR #6534: URL: https://github.com/apache/hadoop/pull/6534#issuecomment-2078294578 > > @xuzifu666 yes. > > > > 1. Which azure region did you run all the wasb tests against > > 2. and what were your maven command line arguments used? > > 3. did any tests fail? > > @steveloughran Hi, a network connection for running the Azure tests isn't available to me, so they are hard for me to execute, but the PR fixes what looks like an obvious connection leak. Could you help confirm it? Thanks anyway ~ cc @steveloughran > WASB: Fix connection leak in FolderRenamePending > > > Key: HADOOP-19073 > URL: https://issues.apache.org/jira/browse/HADOOP-19073 > Project: Hadoop Common > Issue Type: Bug > Components: fs/azure >Affects Versions: 3.3.6 >Reporter: xy >Priority: Major > Labels: pull-request-available > > Fix connection leak in FolderRenamePending in getting bytes -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-19131) Assist reflection IO with WrappedOperations class
[ https://issues.apache.org/jira/browse/HADOOP-19131?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840994#comment-17840994 ] ASF GitHub Bot commented on HADOOP-19131: - hadoop-yetus commented on PR #6686: URL: https://github.com/apache/hadoop/pull/6686#issuecomment-2078260697 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 31s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 52s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 32m 30s | | trunk passed | | +1 :green_heart: | compile | 19m 11s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 17m 53s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 4m 32s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 36s | | trunk passed | | +1 :green_heart: | javadoc | 1m 44s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 1m 37s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 3m 55s | | trunk passed | | +1 :green_heart: | shadedclient | 34m 22s | | branch has no errors when building and testing our client artifacts. | | -0 :warning: | patch | 34m 49s | | Used diff version of patch file. Binary files and potentially other changes not applied. 
Please rebase and squash commits if necessary. | _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 35s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 33s | | the patch passed | | +1 :green_heart: | compile | 19m 47s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 19m 47s | | the patch passed | | +1 :green_heart: | compile | 19m 10s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 19m 10s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 4m 20s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6686/8/artifact/out/results-checkstyle-root.txt) | root: The patch generated 12 new + 26 unchanged - 0 fixed = 38 total (was 26) | | +1 :green_heart: | mvnsite | 2m 32s | | the patch passed | | -1 :x: | javadoc | 1m 6s | [/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6686/8/artifact/out/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1.txt) | hadoop-common in the patch failed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1. | | +1 :green_heart: | javadoc | 1m 33s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | -1 :x: | spotbugs | 2m 54s | [/new-spotbugs-hadoop-common-project_hadoop-common.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6686/8/artifact/out/new-spotbugs-hadoop-common-project_hadoop-common.html) | hadoop-common-project/hadoop-common generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) | | +1 :green_heart: | shadedclient | 35m 10s | | patch has no errors when building and testing our client artifacts. 
| _ Other Tests _ | | +1 :green_heart: | unit | 20m 6s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 3m 11s | | hadoop-aws in the patch passed. | | +1 :green_heart: | asflicense | 1m 2s | | The patch does not generate ASF License warnings. | | | | 256m 2s | | | | Reason | Tests | |---:|:--| | SpotBugs | module:hadoop-common-project/hadoop-common | | | Unchecked/unconfirmed cast from Throwable to Exception in org.apache.hadoop.io.wrappedio.DynMethods.throwIfInstance(Throwable, Class) At
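The SpotBugs warning reported above (unchecked/unconfirmed cast from Throwable to Exception in `DynMethods.throwIfInstance`) is typically silenced by confirming the type before casting. A sketch of the guarded shape, assuming a hypothetical signature since the actual `DynMethods` code is not shown here:

```java
public class ThrowIfSketch {
    // Guarding the cast with isInstance() before cast() confirms the type,
    // which is the usual way to satisfy SpotBugs' unconfirmed-cast check.
    // Hypothetical signature; the real DynMethods.throwIfInstance may differ.
    static <E extends Throwable> void throwIfInstance(Throwable t, Class<E> clazz) throws E {
        if (clazz.isInstance(t)) {
            throw clazz.cast(t);
        }
    }

    public static void main(String[] args) {
        try {
            throwIfInstance(new IllegalStateException("boom"), RuntimeException.class);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // prints boom
        }
        // A non-matching Throwable is simply ignored, no cast attempted:
        throwIfInstance(new Error("ignored"), RuntimeException.class);
        System.out.println("done");
    }
}
```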
[jira] [Commented] (HADOOP-19131) Assist reflection IO with WrappedOperations class
[ https://issues.apache.org/jira/browse/HADOOP-19131?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840993#comment-17840993 ] ASF GitHub Bot commented on HADOOP-19131: - hadoop-yetus commented on PR #6686: URL: https://github.com/apache/hadoop/pull/6686#issuecomment-2078260263 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 30s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 50s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 32m 42s | | trunk passed | | +1 :green_heart: | compile | 18m 59s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 17m 48s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 4m 35s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 37s | | trunk passed | | +1 :green_heart: | javadoc | 1m 43s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 1m 32s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 3m 58s | | trunk passed | | +1 :green_heart: | shadedclient | 34m 56s | | branch has no errors when building and testing our client artifacts. | | -0 :warning: | patch | 35m 20s | | Used diff version of patch file. Binary files and potentially other changes not applied. 
Please rebase and squash commits if necessary. | _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 33s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 31s | | the patch passed | | +1 :green_heart: | compile | 20m 3s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 20m 3s | | the patch passed | | +1 :green_heart: | compile | 19m 16s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 19m 16s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 4m 33s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6686/7/artifact/out/results-checkstyle-root.txt) | root: The patch generated 12 new + 26 unchanged - 0 fixed = 38 total (was 26) | | +1 :green_heart: | mvnsite | 2m 38s | | the patch passed | | -1 :x: | javadoc | 1m 7s | [/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6686/7/artifact/out/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1.txt) | hadoop-common in the patch failed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1. | | +1 :green_heart: | javadoc | 1m 36s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | -1 :x: | spotbugs | 2m 56s | [/new-spotbugs-hadoop-common-project_hadoop-common.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6686/7/artifact/out/new-spotbugs-hadoop-common-project_hadoop-common.html) | hadoop-common-project/hadoop-common generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) | | +1 :green_heart: | shadedclient | 34m 53s | | patch has no errors when building and testing our client artifacts. 
| _ Other Tests _ | | +1 :green_heart: | unit | 20m 5s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 3m 11s | | hadoop-aws in the patch passed. | | +1 :green_heart: | asflicense | 1m 5s | | The patch does not generate ASF License warnings. | | | | 256m 45s | | | | Reason | Tests | |---:|:--| | SpotBugs | module:hadoop-common-project/hadoop-common | | | Unchecked/unconfirmed cast from Throwable to Exception in org.apache.hadoop.io.wrappedio.DynMethods.throwIfInstance(Throwable, Class) At
[jira] [Commented] (HADOOP-19037) S3A: ITestS3AConfiguration failing with region problems
[ https://issues.apache.org/jira/browse/HADOOP-19037?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840991#comment-17840991 ] ASF GitHub Bot commented on HADOOP-19037: - hadoop-yetus commented on PR #6486: URL: https://github.com/apache/hadoop/pull/6486#issuecomment-2078256387 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 00s | | No case conflicting files found. | | +0 :ok: | spotbugs | 0m 00s | | spotbugs executables are not available. | | +0 :ok: | codespell | 0m 00s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 00s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 01s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 91m 56s | | trunk passed | | +1 :green_heart: | compile | 5m 27s | | trunk passed | | +1 :green_heart: | checkstyle | 4m 58s | | trunk passed | | +1 :green_heart: | mvnsite | 5m 21s | | trunk passed | | +1 :green_heart: | javadoc | 5m 05s | | trunk passed | | +1 :green_heart: | shadedclient | 154m 04s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 3m 06s | | the patch passed | | +1 :green_heart: | compile | 2m 20s | | the patch passed | | +1 :green_heart: | javac | 2m 20s | | the patch passed | | +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 2m 08s | | the patch passed | | +1 :green_heart: | mvnsite | 2m 31s | | the patch passed | | +1 :green_heart: | javadoc | 2m 18s | | the patch passed | | +1 :green_heart: | shadedclient | 160m 20s | | patch has no errors when building and testing our client artifacts. 
| _ Other Tests _ | | +1 :green_heart: | asflicense | 5m 34s | | The patch does not generate ASF License warnings. | | | | 429m 52s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6486 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | MINGW64_NT-10.0-17763 6d23b3a20b4d 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / e6bfaef37b083914607b14ca2ea863d060a186fb | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6486/1/testReport/ | | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6486/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > S3A: S3A: ITestS3AConfiguration failing with region problems > > > Key: HADOOP-19037 > URL: https://issues.apache.org/jira/browse/HADOOP-19037 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3, test >Affects Versions: 3.4.0 >Reporter: Steve Loughran >Priority: Major > Labels: pull-request-available > > After commented out the default region in my ~/.aws/config [default} profile, > test ITestS3AConfiguration. testS3SpecificSignerOverride() fails > {code} > [ERROR] > testS3SpecificSignerOverride(org.apache.hadoop.fs.s3a.ITestS3AConfiguration) > Time elapsed: 0.054 s <<< ERROR! 
> software.amazon.awssdk.core.exception.SdkClientException: Unable to load > region from any of the providers in the chain > software.amazon.awssdk.regions.providers.DefaultAwsRegionProviderChain@12c626f8: > > [software.amazon.awssdk.regions.providers.SystemSettingsRegionProvider@ae63559: > Unable to load region from system settings. Region must be specified either > via environment variable (AWS_REGION) or system property (aws.region)., > software.amazon.awssdk.regions.providers.AwsProfileRegionProvider@6e6cfd4c: > No region provided in profile: default, > software.amazon.awssdk.regions.providers.InstanceProfileRegionProvider@139147de: > EC2 Metadata is disabled. Unable to retrieve region information
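The chain in the stack trace above is the AWS SDK v2 `DefaultAwsRegionProviderChain` falling through its providers in order: system settings (`AWS_REGION` / `aws.region`), the `~/.aws/config` profile, then EC2 instance metadata. Rather than depending on that chain, an S3A deployment can pin the region explicitly through `fs.s3a.endpoint.region`; a minimal configuration sketch (the region value here is only an example):

```xml
<property>
  <name>fs.s3a.endpoint.region</name>
  <value>us-east-1</value>
</property>
```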
[jira] [Commented] (HADOOP-19158) S3A: Support ByteBufferPositionedReadable through vector IO
[ https://issues.apache.org/jira/browse/HADOOP-19158?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840990#comment-17840990 ] ASF GitHub Bot commented on HADOOP-19158: - hadoop-yetus commented on PR #6773: URL: https://github.com/apache/hadoop/pull/6773#issuecomment-2078254536 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 31s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 15m 2s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 31m 48s | | trunk passed | | +1 :green_heart: | compile | 17m 34s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 16m 17s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 4m 18s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 38s | | trunk passed | | +1 :green_heart: | javadoc | 1m 55s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 1m 45s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 3m 55s | | trunk passed | | +1 :green_heart: | shadedclient | 34m 20s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 34s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 28s | | the patch passed | | +1 :green_heart: | compile | 16m 49s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 16m 49s | | the patch passed | | +1 :green_heart: | compile | 16m 19s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 16m 19s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 4m 14s | | the patch passed | | +1 :green_heart: | mvnsite | 2m 38s | | the patch passed | | +1 :green_heart: | javadoc | 1m 52s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 1m 40s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 4m 16s | | the patch passed | | +1 :green_heart: | shadedclient | 34m 41s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 19m 37s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 3m 16s | | hadoop-aws in the patch passed. | | +1 :green_heart: | asflicense | 1m 6s | | The patch does not generate ASF License warnings. 
| | | | 246m 12s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.45 ServerAPI=1.45 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6773/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/6773 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 981a65420e64 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 54f54eaa19f0ed3631cea3e145ca8193c93bad2c | | Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6773/1/testReport/ | | Max. process+thread count | 1254 (vs. ulimit of 5500) | | modules | C:
[jira] [Commented] (HADOOP-18679) Add API for bulk/paged object deletion
[ https://issues.apache.org/jira/browse/HADOOP-18679?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840989#comment-17840989 ] ASF GitHub Bot commented on HADOOP-18679: - mukund-thakur commented on code in PR #6726: URL: https://github.com/apache/hadoop/pull/6726#discussion_r1580173720 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/DefalutBulkDeleteSource.java: ## @@ -0,0 +1,38 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.hadoop.fs; + +import java.io.IOException; + +/** + * Default implementation of {@link BulkDeleteSource}. + */ +public class DefalutBulkDeleteSource implements BulkDeleteSource { + +private final FileSystem fs; Review Comment: javadoc ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/BulkDeleteUtils.java: ## @@ -0,0 +1,54 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. 
The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.hadoop.fs; + +import java.util.Collection; + +import static java.util.Objects.requireNonNull; +import static org.apache.hadoop.util.Preconditions.checkArgument; + +/** + * Utility class for bulk delete operations. + */ +public final class BulkDeleteUtils { + +private BulkDeleteUtils() { +} + +public static void validateBulkDeletePaths(Collection paths, int pageSize, Path basePath) { +requireNonNull(paths); +checkArgument(paths.size() <= pageSize, +"Number of paths (%d) is larger than the page size (%d)", paths.size(), pageSize); +paths.forEach(p -> { +checkArgument(p.isAbsolute(), "Path %s is not absolute", p); +checkArgument(validatePathIsUnderParent(p, basePath), +"Path %s is not under the base path %s", p, basePath); +}); +} + +public static boolean validatePathIsUnderParent(Path p, Path basePath) { Review Comment: javadoc > Add API for bulk/paged object deletion > -- > > Key: HADOOP-18679 > URL: https://issues.apache.org/jira/browse/HADOOP-18679 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.3.5 >Reporter: Steve Loughran >Priority: Major > Labels: pull-request-available > > iceberg and hbase could benefit from being able to give a list of individual > files to delete -files which may be scattered round the bucket for better > read peformance. 
> Add some new optional interface for an object store which allows a caller to > submit a list of paths to files to delete, where > the expectation is > * if a path is a file: delete > * if a path is a dir, outcome undefined > For s3 that'd let us build these into DeleteRequest objects, and submit, > without any probes first. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
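The validation contract described above — a page-size cap, absolute paths only, every path under a base path — can be sketched independently of the Hadoop API. This is a simplified Python stand-in using plain string paths, not the actual `BulkDeleteUtils` code:

```python
def path_is_under(path, base):
    """True when an absolute path equals or lies under the base path.
    Note the trailing-slash check: a plain prefix test would wrongly
    accept /database/a as being under /data."""
    if base == "/":
        return path.startswith("/")
    return path == base or path.startswith(base + "/")

def validate_bulk_delete_paths(paths, page_size, base_path):
    """Reject a bulk-delete page that breaks the documented contract."""
    if paths is None:
        raise ValueError("paths is null")
    if len(paths) > page_size:
        raise ValueError(
            f"Number of paths ({len(paths)}) is larger than the page size ({page_size})")
    for p in paths:
        if not p.startswith("/"):
            raise ValueError(f"Path {p} is not absolute")
        if not path_is_under(p, base_path):
            raise ValueError(f"Path {p} is not under the base path {base_path}")
```

A store-specific implementation would then translate each validated page into one batched request (for S3, a single `DeleteObjects` call) without per-path existence probes first.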
[jira] [Commented] (HADOOP-19152) Do not hard code security providers.
[ https://issues.apache.org/jira/browse/HADOOP-19152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840946#comment-17840946 ] ASF GitHub Bot commented on HADOOP-19152: - hadoop-yetus commented on PR #6739: URL: https://github.com/apache/hadoop/pull/6739#issuecomment-2078071797 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 55s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 1s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 3 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 45m 9s | | trunk passed | | +1 :green_heart: | compile | 17m 31s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 16m 19s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 1m 18s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 42s | | trunk passed | | +1 :green_heart: | javadoc | 1m 17s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 0m 55s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 2m 34s | | trunk passed | | +1 :green_heart: | shadedclient | 35m 29s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 55s | | the patch passed | | +1 :green_heart: | compile | 16m 44s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 16m 44s | | the patch passed | | +1 :green_heart: | compile | 15m 59s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 15m 59s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 15s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 42s | | the patch passed | | +1 :green_heart: | javadoc | 1m 10s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 0m 54s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 2m 43s | | the patch passed | | +1 :green_heart: | shadedclient | 35m 33s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 19m 40s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 1m 4s | | The patch does not generate ASF License warnings. 
| | | | 224m 52s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.45 ServerAPI=1.45 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6739/7/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/6739 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle | | uname | Linux 44a3394253ca 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / a1411f11ea8c85b47d37e8d5fa53920b3824bb62 | | Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6739/7/testReport/ | | Max. process+thread count | 1263 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6739/7/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
[jira] [Commented] (HADOOP-19013) fs.getXattrs(path) for S3FS doesn't have x-amz-server-side-encryption-aws-kms-key-id header.
[ https://issues.apache.org/jira/browse/HADOOP-19013?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840901#comment-17840901 ] ASF GitHub Bot commented on HADOOP-19013: - steveloughran commented on code in PR #6646: URL: https://github.com/apache/hadoop/pull/6646#discussion_r1534430622 ## hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/EncryptionTestUtils.java: ## @@ -111,4 +119,27 @@ public static void assertEncrypted(S3AFileSystem fs, } } + /** + * Assert that a path is encrypted with right encryption settings. + * @param fs filesystem. + * @param path path + * @param algorithm encryption algorithm. + * @param kmsKey full kms key if present. + * @throws IOException any IOE. + */ + public static void validateEncryptionFileAttributes(S3AFileSystem fs, +Path path, +String algorithm, +Optional kmsKey) throws IOException { +Map xAttrs = fs.getXAttrs(path); + Assertions.assertThat(HeaderProcessing.decodeBytes(xAttrs.get(XA_SERVER_SIDE_ENCRYPTION))) Review Comment: assert that the .get isn't null(), you can use .extracting to chain ``` assertThat(xAttrs.get(XA_SERVER_SIDE_ENCRYPTION)) .describedAs(...) .isNotNull() .extracting(HeaderProcessing::decodeBytes()) .isEqualTo(...) ## hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/EncryptionTestUtils.java: ## @@ -111,4 +119,27 @@ public static void assertEncrypted(S3AFileSystem fs, } } + /** + * Assert that a path is encrypted with right encryption settings. + * @param fs filesystem. + * @param path path + * @param algorithm encryption algorithm. + * @param kmsKey full kms key if present. + * @throws IOException any IOE. 
+ */ + public static void validateEncryptionFileAttributes(S3AFileSystem fs, +Path path, +String algorithm, +Optional kmsKey) throws IOException { +Map xAttrs = fs.getXAttrs(path); + Assertions.assertThat(HeaderProcessing.decodeBytes(xAttrs.get(XA_SERVER_SIDE_ENCRYPTION))) +.describedAs("Server side encryption algorithm must match") +.isEqualTo(algorithm); +Assertions.assertThat(xAttrs.containsKey(XA_ENCRYPTION_KEY_ID)) Review Comment: there's a specific assertion on a map containing a value ## hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/ITestS3AEncryptionWithDefaultS3Settings.java: ## @@ -97,6 +104,21 @@ protected void assertEncrypted(Path path) throws IOException { EncryptionTestUtils.assertEncrypted(fs, path, SSE_KMS, kmsKey); } + @Test + public void testEncryptionFileAttributes() throws Exception { +Path path = path(createFilename(1024)); Review Comment: add a describe() for the logs ## hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/ITestS3AEncryptionSSEKMSDefaultKey.java: ## @@ -19,12 +19,20 @@ package org.apache.hadoop.fs.s3a; import java.io.IOException; +import java.util.Optional; +import org.apache.hadoop.fs.contract.ContractTestUtils; Review Comment: wrong location > fs.getXattrs(path) for S3FS doesn't have > x-amz-server-side-encryption-aws-kms-key-id header. > > > Key: HADOOP-19013 > URL: https://issues.apache.org/jira/browse/HADOOP-19013 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.3.6 >Reporter: Mukund Thakur >Assignee: Mukund Thakur >Priority: Major > Labels: pull-request-available > > Once a path while uploading has been encrypted with SSE-KMS with a key id and > then later when we try to read the attributes of the same file, it doesn't > contain the key id information as an attribute. should we add it? 
[jira] [Commented] (HADOOP-19158) S3A: Support ByteBufferPositionedReadable through vector IO
[ https://issues.apache.org/jira/browse/HADOOP-19158?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840900#comment-17840900 ] ASF GitHub Bot commented on HADOOP-19158: - steveloughran opened a new pull request, #6773: URL: https://github.com/apache/hadoop/pull/6773 HADOOP-19158. ### How was this patch tested? no tests; would need to move the hdfs one up to a contract test. ### For code changes: - [X] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files? > S3A: Support ByteBufferPositionedReadable through vector IO > --- > > Key: HADOOP-19158 > URL: https://issues.apache.org/jira/browse/HADOOP-19158 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs, fs/s3 >Affects Versions: 3.4.0 >Reporter: Steve Loughran >Assignee: Steve Loughran >Priority: Major > > Make it easy for any stream with vector io to support > {{ByteBufferPositionedReadable}} > Specifically, {{ByteBufferPositionedReadable.readFully()}} > is exactly a single range read so is easy to read. > the simpler read() call which can return less isn't part of the vector API. > Proposed: invoke the readFully() but convert an EOFException to -1 -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-19158) S3A: Support ByteBufferPositionedReadable through vector IO
[ https://issues.apache.org/jira/browse/HADOOP-19158?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated HADOOP-19158: Labels: pull-request-available (was: ) > S3A: Support ByteBufferPositionedReadable through vector IO > --- > > Key: HADOOP-19158 > URL: https://issues.apache.org/jira/browse/HADOOP-19158 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs, fs/s3 >Affects Versions: 3.4.0 >Reporter: Steve Loughran >Assignee: Steve Loughran >Priority: Major > Labels: pull-request-available > > Make it easy for any stream with vector io to support > {{ByteBufferPositionedReadable}} > Specifically, {{ByteBufferPositionedReadable.readFully()}} > is exactly a single range read so is easy to read. > the simpler read() call which can return less isn't part of the vector API. > Proposed: invoke the readFully() but convert an EOFException to -1 -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
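The proposal quoted above — implement the simpler `read()` by delegating to `readFully()` and converting end-of-file into -1 — can be illustrated with plain Python file objects. This sketches only the control flow, not the S3A vector-IO code:

```python
import io

def read_fully(f, position, length):
    """Read exactly `length` bytes at `position`, or raise EOFError.
    This mirrors the all-or-nothing contract of
    ByteBufferPositionedReadable.readFully(): a single range read."""
    f.seek(position)
    data = bytearray()
    while len(data) < length:
        chunk = f.read(length - len(data))
        if not chunk:  # hit EOF before the full range was read
            raise EOFError("EOF before reading %d bytes" % length)
        data.extend(chunk)
    return bytes(data)

def read_or_minus_one(f, position, length):
    """The simpler read() call: same single-range read, but because the
    vector API has no short-read variant, EOF is mapped to -1."""
    try:
        return read_fully(f, position, length)
    except EOFError:
        return -1
```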
[jira] [Commented] (HADOOP-19058) [JDK-17] Fix UT Failures in hadoop common, hdfs, yarn
[ https://issues.apache.org/jira/browse/HADOOP-19058?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840889#comment-17840889 ] ASF GitHub Bot commented on HADOOP-19058: - hadoop-yetus commented on PR #6531: URL: https://github.com/apache/hadoop/pull/6531#issuecomment-2077803597 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 00s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 01s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 01s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 01s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 00s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 92m 58s | | trunk passed | | +1 :green_heart: | compile | 4m 29s | | trunk passed | | +1 :green_heart: | mvnsite | 4m 34s | | trunk passed | | +1 :green_heart: | javadoc | 4m 33s | | trunk passed | | +1 :green_heart: | shadedclient | 246m 17s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 2m 01s | | the patch passed | | +1 :green_heart: | compile | 1m 55s | | the patch passed | | +1 :green_heart: | javac | 1m 55s | | the patch passed | | +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 1m 59s | | the patch passed | | +1 :green_heart: | javadoc | 1m 57s | | the patch passed | | +1 :green_heart: | shadedclient | 152m 30s | | patch has no errors when building and testing our client artifacts. 
| _ Other Tests _ | | +1 :green_heart: | asflicense | 5m 29s | | The patch does not generate ASF License warnings. | | | | 418m 45s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6531 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint | | uname | MINGW64_NT-10.0-17763 fef11fae28ac 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / fcb51d70221c64831ab78e20abb3d48f59bbf506 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6531/1/testReport/ | | modules | C: hadoop-project U: hadoop-project | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6531/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > [JDK-17] Fix UT Failures in hadoop common, hdfs, yarn > - > > Key: HADOOP-19058 > URL: https://issues.apache.org/jira/browse/HADOOP-19058 > Project: Hadoop Common > Issue Type: Sub-task >Reporter: Bilwa S T >Assignee: Bilwa S T >Priority: Major > Labels: pull-request-available > > Most of the UT's failed with below exception: > Caused by: java.lang.ExceptionInInitializerError: Exception > java.lang.reflect.InaccessibleObjectException: Unable to make protected final > java.lang.Class > java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int) throws > java.lang.ClassFormatError accessible: module java.base does not "opens > java.lang" to unnamed module @d13f7c [in thread "Time-limited test"] -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
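The `InaccessibleObjectException` quoted in the issue is the JDK 9+ module system refusing reflective access to `java.lang.ClassLoader`; on JDK 17 such tests generally need the forked JVM started with `--add-opens`. A hedged Maven Surefire fragment (the exact set of opened packages depends on which tests fail):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <argLine>--add-opens java.base/java.lang=ALL-UNNAMED</argLine>
  </configuration>
</plugin>
```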
[jira] [Commented] (HADOOP-19012) Use CRC tables to speed up galoisFieldMultiply in CrcUtil
[ https://issues.apache.org/jira/browse/HADOOP-19012?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840848#comment-17840848 ] ASF GitHub Bot commented on HADOOP-19012: - hadoop-yetus commented on PR #6542: URL: https://github.com/apache/hadoop/pull/6542#issuecomment-2077552234 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 01s | | No case conflicting files found. | | +0 :ok: | spotbugs | 0m 01s | | spotbugs executables are not available. | | +0 :ok: | codespell | 0m 01s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 01s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 00s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 109m 29s | | trunk passed | | +1 :green_heart: | compile | 47m 20s | | trunk passed | | +1 :green_heart: | checkstyle | 5m 35s | | trunk passed | | -1 :x: | mvnsite | 5m 04s | [/branch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6542/1/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in trunk failed. | | +1 :green_heart: | javadoc | 5m 32s | | trunk passed | | +1 :green_heart: | shadedclient | 178m 00s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 6m 20s | | the patch passed | | +1 :green_heart: | compile | 45m 48s | | the patch passed | | +1 :green_heart: | javac | 45m 48s | | the patch passed | | +1 :green_heart: | blanks | 0m 01s | | The patch has no blanks issues. 
| | +1 :green_heart: | checkstyle | 5m 24s | | the patch passed | | -1 :x: | mvnsite | 5m 23s | [/patch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6542/1/artifact/out/patch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | +1 :green_heart: | javadoc | 5m 34s | | the patch passed | | +1 :green_heart: | shadedclient | 189m 18s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 6m 34s | | The patch does not generate ASF License warnings. | | | | 591m 41s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6542 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | MINGW64_NT-10.0-17763 f3105eaa8c09 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / 63fb429f68529ca934a84ac44e40eb4ce6e1bf70 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6542/1/testReport/ | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6542/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. 
> Use CRC tables to speed up galoisFieldMultiply in CrcUtil > - > > Key: HADOOP-19012 > URL: https://issues.apache.org/jira/browse/HADOOP-19012 > Project: Hadoop Common > Issue Type: Improvement > Components: util >Reporter: Tsz-wo Sze >Assignee: Tsz-wo Sze >Priority: Major > Labels: pull-request-available > > CrcUtil.galoisFieldMultiply(p, q, m) supports multiplying two polynomials p, > q modulo any modulus polynomial m over GF(2). Since the method is used for > CRC calculations, the modulus polynomial m is restricted to either the > GZIP_POLYNOMIAL or the CASTAGNOLI_POLYNOMIAL. We may use CRC tables in > PureJavaCrc32/PureJavaCrc32C to speed up the computation. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
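The shift-and-xor baseline that the proposed CRC tables would replace can be sketched as follows. This is a minimal illustration, not Hadoop's actual CrcUtil code: it uses a non-reflected, MSB-first 32-bit representation for simplicity, while CrcUtil and PureJavaCrc32/PureJavaCrc32C work with the reflected polynomial forms. A table-driven variant would process several bits of q per step using precomputed multiples, which is exactly how the PureJavaCrc32 tables accelerate CRC updates.

```java
/**
 * Sketch (not CrcUtil itself): schoolbook GF(2) polynomial multiplication
 * modulo m via Horner's rule. Bit 31 holds the x^31 coefficient; m holds
 * the low 32 bits of a degree-32 modulus, so an overflow out of bit 31 is
 * reduced by XOR-ing m.
 */
public final class Gf2Multiply {

  /** p * x mod m over GF(2). */
  static int timesX(int p, int m) {
    boolean carry = (p & 0x80000000) != 0; // x^31 term would shift to x^32
    p <<= 1;
    return carry ? p ^ m : p;              // reduce the x^32 term by m
  }

  /** p * q mod m over GF(2), consuming q's terms from high degree down. */
  static int multiply(int p, int q, int m) {
    int product = 0;
    for (int i = 0; i < 32; i++) {
      product = timesX(product, m);        // Horner step: product *= x
      if ((q & (0x80000000 >>> i)) != 0) { // term x^(31-i) present in q?
        product ^= p;
      }
    }
    return product;
  }

  public static void main(String[] args) {
    final int m = 0x04C11DB7; // CRC-32 (gzip) polynomial, MSB-first form
    // x * x = x^2
    System.out.println(Integer.toHexString(multiply(0x2, 0x2, m))); // 4
    // multiplying by 1 is the identity
    System.out.println(Integer.toHexString(multiply(0x12345678, 0x1, m))); // 12345678
  }
}
```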
[jira] [Commented] (HADOOP-19073) WASB: Fix connection leak in FolderRenamePending
[ https://issues.apache.org/jira/browse/HADOOP-19073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840825#comment-17840825 ] ASF GitHub Bot commented on HADOOP-19073: - hadoop-yetus commented on PR #6534: URL: https://github.com/apache/hadoop/pull/6534#issuecomment-2077324557 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 00s | | No case conflicting files found. | | +0 :ok: | spotbugs | 0m 00s | | spotbugs executables are not available. | | +0 :ok: | codespell | 0m 01s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 01s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 00s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 86m 45s | | trunk passed | | +1 :green_heart: | compile | 4m 35s | | trunk passed | | +1 :green_heart: | checkstyle | 4m 20s | | trunk passed | | +1 :green_heart: | mvnsite | 4m 45s | | trunk passed | | +1 :green_heart: | javadoc | 4m 26s | | trunk passed | | +1 :green_heart: | shadedclient | 140m 16s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 2m 33s | | the patch passed | | +1 :green_heart: | compile | 2m 06s | | the patch passed | | +1 :green_heart: | javac | 2m 06s | | the patch passed | | +1 :green_heart: | blanks | 0m 01s | | The patch has no blanks issues. 
| | +1 :green_heart: | checkstyle | 1m 56s | | the patch passed | | +1 :green_heart: | mvnsite | 2m 13s | | the patch passed | | +1 :green_heart: | javadoc | 2m 01s | | the patch passed | | +1 :green_heart: | shadedclient | 150m 23s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 5m 41s | | The patch does not generate ASF License warnings. | | | | 399m 51s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6534 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | MINGW64_NT-10.0-17763 2ea128c33854 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / 649c40fe96cc84f17ff0d97949fc2b55b46772e9 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6534/1/testReport/ | | modules | C: hadoop-tools/hadoop-azure U: hadoop-tools/hadoop-azure | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6534/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > WASB: Fix connection leak in FolderRenamePending > > > Key: HADOOP-19073 > URL: https://issues.apache.org/jira/browse/HADOOP-19073 > Project: Hadoop Common > Issue Type: Bug > Components: fs/azure >Affects Versions: 3.3.6 >Reporter: xy >Priority: Major > Labels: pull-request-available > > Fix connection leak in FolderRenamePending in getting bytes -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
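The general shape of this kind of fix — making sure a stream opened to read bytes is closed on every path, including the exception path — can be sketched with plain java.io. This is a generic illustration under stated assumptions: the real FolderRenamePending reads from the Azure store, and the helper and interface below are hypothetical stand-ins.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public final class ReadBytesExample {

  /** Stand-in for "something that opens a connection-backed stream". */
  interface StreamSource {
    InputStream open() throws IOException;
  }

  /**
   * Hypothetical helper: read exactly len bytes from a freshly opened
   * stream. try-with-resources closes the stream (releasing its
   * underlying connection) on both the success and the failure path,
   * which is the property a leak fix like this restores.
   */
  static byte[] readBytes(StreamSource source, int len) throws IOException {
    byte[] bytes = new byte[len];
    try (InputStream in = source.open()) {   // closed automatically
      int off = 0;
      while (off < len) {
        int n = in.read(bytes, off, len - off);
        if (n < 0) {
          throw new IOException("stream ended after " + off + " bytes");
        }
        off += n;
      }
    }
    return bytes;
  }

  public static void main(String[] args) throws IOException {
    byte[] data = "rename-pending".getBytes("UTF-8");
    byte[] read = readBytes(() -> new ByteArrayInputStream(data), data.length);
    System.out.println(new String(read, "UTF-8")); // rename-pending
  }
}
```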
[jira] [Commented] (HADOOP-19139) [ABFS]: No GetPathStatus call for opening AbfsInputStream
[ https://issues.apache.org/jira/browse/HADOOP-19139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840818#comment-17840818 ] ASF GitHub Bot commented on HADOOP-19139: - hadoop-yetus commented on PR #6699: URL: https://github.com/apache/hadoop/pull/6699#issuecomment-2077298894 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 46s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +0 :ok: | markdownlint | 0m 1s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 13 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 28s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 37m 27s | | trunk passed | | +1 :green_heart: | compile | 19m 16s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 17m 26s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 4m 46s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 30s | | trunk passed | | +1 :green_heart: | javadoc | 1m 59s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 1m 33s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 3m 41s | | trunk passed | | +1 :green_heart: | shadedclient | 39m 24s | | branch has no errors when building and testing our client artifacts. | | -0 :warning: | patch | 39m 50s | | Used diff version of patch file. Binary files and potentially other changes not applied. 
Please rebase and squash commits if necessary. | _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 31s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 25s | | the patch passed | | +1 :green_heart: | compile | 18m 21s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 18m 21s | | the patch passed | | +1 :green_heart: | compile | 17m 24s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 17m 24s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 4m 38s | | the patch passed | | +1 :green_heart: | mvnsite | 2m 32s | | the patch passed | | +1 :green_heart: | javadoc | 1m 51s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 1m 34s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 4m 5s | | the patch passed | | +1 :green_heart: | shadedclient | 39m 49s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 19m 23s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 2m 39s | | hadoop-azure in the patch passed. | | +1 :green_heart: | asflicense | 1m 0s | | The patch does not generate ASF License warnings. 
| | | | 265m 40s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.44 ServerAPI=1.44 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6699/58/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/6699 | | Optional Tests | dupname asflicense mvnsite codespell detsecrets markdownlint compile javac javadoc mvninstall unit shadedclient spotbugs checkstyle | | uname | Linux c45f04e834ae 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / ffefdb36395fcb38ef06d842a8608cb8fc53f945 | | Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Test Results |
[jira] [Commented] (HADOOP-19072) S3A: expand optimisations on stores with "fs.s3a.create.performance"
[ https://issues.apache.org/jira/browse/HADOOP-19072?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840791#comment-17840791 ] ASF GitHub Bot commented on HADOOP-19072: - hadoop-yetus commented on PR #6543: URL: https://github.com/apache/hadoop/pull/6543#issuecomment-2077150464 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 02s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 02s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 02s | | detect-secrets was not available. | | +0 :ok: | markdownlint | 0m 02s | | markdownlint was not available. | | +0 :ok: | spotbugs | 0m 01s | | spotbugs executables are not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 00s | | The patch appears to include 6 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 2m 16s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 90m 13s | | trunk passed | | +1 :green_heart: | compile | 39m 39s | | trunk passed | | +1 :green_heart: | checkstyle | 5m 56s | | trunk passed | | -1 :x: | mvnsite | 4m 24s | [/branch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6543/1/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in trunk failed. | | +1 :green_heart: | javadoc | 9m 25s | | trunk passed | | +1 :green_heart: | shadedclient | 161m 37s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 2m 18s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 8m 37s | | the patch passed | | +1 :green_heart: | compile | 37m 56s | | the patch passed | | +1 :green_heart: | javac | 37m 56s | | the patch passed | | +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 6m 00s | | the patch passed | | -1 :x: | mvnsite | 4m 30s | [/patch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6543/1/artifact/out/patch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | +1 :green_heart: | javadoc | 9m 22s | | the patch passed | | +1 :green_heart: | shadedclient | 171m 17s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 8m 21s | | The patch does not generate ASF License warnings. | | | | 528m 37s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6543 | | Optional Tests | dupname asflicense mvnsite codespell detsecrets markdownlint compile javac javadoc mvninstall unit shadedclient spotbugs checkstyle | | uname | MINGW64_NT-10.0-17763 48a1e573d3ea 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / b7e4ede34ba1fb2094ba5363c0ec07cd763e5336 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6543/1/testReport/ | | modules | C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws U: . | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6543/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. 
> S3A: expand optimisations on stores with "fs.s3a.create.performance" > > > Key: HADOOP-19072 > URL: https://issues.apache.org/jira/browse/HADOOP-19072 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.4.0 >Reporter: Steve Loughran >Assignee: Viraj Jasani >Priority: Major > Labels: pull-request-available > > on an s3a store with fs.s3a.create.performance set, speed up other operations > * mkdir to skip parent directory check: just do a HEAD to see if there's a > file at the target location -- This message was sent by Atlassian Jira (v8.20.10#820010)
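For context, the flag named in the title is an ordinary S3A configuration option; enabling it cluster-wide would be the usual core-site.xml entry. The value shown is for illustration only — the option trades correctness checks for speed, so consult the S3A documentation before turning it on.

```xml
<property>
  <name>fs.s3a.create.performance</name>
  <value>true</value>
</property>
```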
[jira] [Commented] (HADOOP-19134) use StringBuilder instead of StringBuffer
[ https://issues.apache.org/jira/browse/HADOOP-19134?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840774#comment-17840774 ] ASF GitHub Bot commented on HADOOP-19134: - hadoop-yetus commented on PR #6692: URL: https://github.com/apache/hadoop/pull/6692#issuecomment-2077039728 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 33s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 3s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 56 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 59s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 32m 41s | | trunk passed | | +1 :green_heart: | compile | 17m 36s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 16m 9s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 4m 47s | | trunk passed | | +1 :green_heart: | mvnsite | 20m 8s | | trunk passed | | +1 :green_heart: | javadoc | 17m 47s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 17m 40s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 31m 13s | | trunk passed | | +1 :green_heart: | shadedclient | 34m 26s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 34s | | Maven dependency ordering for patch | | -1 :x: | mvninstall | 0m 30s | [/patch-mvninstall-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6692/2/artifact/out/patch-mvninstall-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt) | hadoop-yarn-common in the patch failed. | | +1 :green_heart: | compile | 16m 53s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 16m 53s | | the patch passed | | -1 :x: | compile | 6m 50s | [/patch-compile-root-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6692/2/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt) | root in the patch failed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06. | | -1 :x: | javac | 6m 50s | [/patch-compile-root-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6692/2/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt) | root in the patch failed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06. | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 5m 14s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6692/2/artifact/out/results-checkstyle-root.txt) | root: The patch generated 3 new + 3954 unchanged - 6 fixed = 3957 total (was 3960) | | -1 :x: | mvnsite | 0m 57s | [/patch-mvnsite-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6692/2/artifact/out/patch-mvnsite-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt) | hadoop-yarn-common in the patch failed. 
| | +1 :green_heart: | javadoc | 17m 46s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 17m 36s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | -1 :x: | spotbugs | 0m 52s | [/patch-spotbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6692/2/artifact/out/patch-spotbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt) | hadoop-yarn-common in the patch failed. | | -1 :x: | shadedclient | 9m 18s | | patch has errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 19m 35s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 3m 52s | | hadoop-kms in the patch
[jira] [Commented] (HADOOP-19134) use StringBuilder instead of StringBuffer
[ https://issues.apache.org/jira/browse/HADOOP-19134?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840771#comment-17840771 ] ASF GitHub Bot commented on HADOOP-19134: - hadoop-yetus commented on PR #6692: URL: https://github.com/apache/hadoop/pull/6692#issuecomment-2077013292 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 29s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 2s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 56 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 53s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 32m 10s | | trunk passed | | +1 :green_heart: | compile | 17m 6s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 15m 38s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 4m 48s | | trunk passed | | +1 :green_heart: | mvnsite | 19m 5s | | trunk passed | | +1 :green_heart: | javadoc | 16m 42s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 16m 41s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 29m 47s | | trunk passed | | +1 :green_heart: | shadedclient | 34m 2s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 33s | | Maven dependency ordering for patch | | -1 :x: | mvninstall | 0m 29s | [/patch-mvninstall-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6692/3/artifact/out/patch-mvninstall-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt) | hadoop-yarn-common in the patch failed. | | +1 :green_heart: | compile | 16m 32s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 16m 32s | | the patch passed | | -1 :x: | compile | 6m 49s | [/patch-compile-root-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6692/3/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt) | root in the patch failed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06. | | -1 :x: | javac | 6m 49s | [/patch-compile-root-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6692/3/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt) | root in the patch failed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06. | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 5m 13s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6692/3/artifact/out/results-checkstyle-root.txt) | root: The patch generated 3 new + 3956 unchanged - 6 fixed = 3959 total (was 3962) | | -1 :x: | mvnsite | 0m 54s | [/patch-mvnsite-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6692/3/artifact/out/patch-mvnsite-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt) | hadoop-yarn-common in the patch failed. 
| | +1 :green_heart: | javadoc | 16m 35s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 16m 32s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | -1 :x: | spotbugs | 0m 49s | [/patch-spotbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6692/3/artifact/out/patch-spotbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt) | hadoop-yarn-common in the patch failed. | | -1 :x: | shadedclient | 9m 3s | | patch has errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 19m 21s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 3m 47s | | hadoop-kms in the patch
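The substitution this issue performs is mechanical: StringBuffer synchronizes every method call, while StringBuilder exposes the same API without locking, which is safe wherever the buffer never escapes a single thread (the common case for method-local buffers). A minimal before/after sketch, with illustrative method names:

```java
public final class BuilderExample {

  // Before: every append() takes StringBuffer's monitor, for no benefit
  // when the buffer is method-local.
  static String joinBuffer(String[] parts) {
    StringBuffer sb = new StringBuffer();
    for (String p : parts) {
      sb.append(p).append(',');
    }
    return sb.toString();
  }

  // After: identical call sites, no synchronization overhead.
  static String joinBuilder(String[] parts) {
    StringBuilder sb = new StringBuilder();
    for (String p : parts) {
      sb.append(p).append(',');
    }
    return sb.toString();
  }

  public static void main(String[] args) {
    String[] parts = {"a", "b"};
    System.out.println(joinBuilder(parts)); // a,b,
  }
}
```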
[jira] [Commented] (HADOOP-18516) [ABFS]: Support fixed SAS token config in addition to Custom SASTokenProvider Implementation
[ https://issues.apache.org/jira/browse/HADOOP-18516?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840759#comment-17840759 ] ASF GitHub Bot commented on HADOOP-18516: - hadoop-yetus commented on PR #6552: URL: https://github.com/apache/hadoop/pull/6552#issuecomment-2076944140 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 03s | | No case conflicting files found. | | +0 :ok: | spotbugs | 0m 00s | | spotbugs executables are not available. | | +0 :ok: | codespell | 0m 00s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 00s | | detect-secrets was not available. | | +0 :ok: | markdownlint | 0m 01s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 00s | | The patch appears to include 6 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 123m 47s | | trunk passed | | +1 :green_heart: | compile | 7m 35s | | trunk passed | | +1 :green_heart: | checkstyle | 7m 10s | | trunk passed | | +1 :green_heart: | mvnsite | 7m 53s | | trunk passed | | +1 :green_heart: | javadoc | 7m 18s | | trunk passed | | +1 :green_heart: | shadedclient | 216m 15s | | branch has no errors when building and testing our client artifacts. | | -0 :warning: | patch | 220m 11s | | Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary. | _ Patch Compile Tests _ | | -1 :x: | mvninstall | 3m 54s | [/patch-mvninstall-hadoop-tools_hadoop-azure.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6552/1/artifact/out/patch-mvninstall-hadoop-tools_hadoop-azure.txt) | hadoop-azure in the patch failed. 
| | -1 :x: | compile | 3m 32s | [/patch-compile-hadoop-tools_hadoop-azure.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6552/1/artifact/out/patch-compile-hadoop-tools_hadoop-azure.txt) | hadoop-azure in the patch failed. | | -1 :x: | javac | 3m 32s | [/patch-compile-hadoop-tools_hadoop-azure.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6552/1/artifact/out/patch-compile-hadoop-tools_hadoop-azure.txt) | hadoop-azure in the patch failed. | | +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 3m 12s | | the patch passed | | -1 :x: | mvnsite | 3m 29s | [/patch-mvnsite-hadoop-tools_hadoop-azure.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6552/1/artifact/out/patch-mvnsite-hadoop-tools_hadoop-azure.txt) | hadoop-azure in the patch failed. | | +1 :green_heart: | javadoc | 3m 20s | | the patch passed | | -1 :x: | shadedclient | 221m 00s | | patch has errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 8m 14s | | The patch does not generate ASF License warnings. 
| | | | 596m 44s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6552 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets markdownlint | | uname | MINGW64_NT-10.0-17763 2dbfdbef420f 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / d06fe415497f1ca98787119b5f7e8680afed2547 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6552/1/testReport/ | | modules | C: hadoop-tools/hadoop-azure U: hadoop-tools/hadoop-azure | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6552/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > [ABFS]: Support fixed SAS token config in addition to Custom SASTokenProvider > Implementation > > > Key: HADOOP-18516 > URL: https://issues.apache.org/jira/browse/HADOOP-18516 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/azure >Affects Versions: 3.4.0 >Reporter:
[jira] [Commented] (HADOOP-19071) Update maven-surefire-plugin from 3.0.0 to 3.2.5
[ https://issues.apache.org/jira/browse/HADOOP-19071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840753#comment-17840753 ] ASF GitHub Bot commented on HADOOP-19071: - hadoop-yetus commented on PR #6545: URL: https://github.com/apache/hadoop/pull/6545#issuecomment-2076895892 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 05s | | Precommit patch detected. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 01s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 01s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 01s | | detect-secrets was not available. | | +0 :ok: | shellcheck | 0m 01s | | Shellcheck was not available. | | +0 :ok: | shelldocs | 0m 01s | | Shelldocs was not available. | | +0 :ok: | xmllint | 0m 00s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 01s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 00s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ branch-3.4 Compile Tests _ | | +1 :green_heart: | mvninstall | 91m 46s | | branch-3.4 passed | | +1 :green_heart: | compile | 4m 32s | | branch-3.4 passed | | +1 :green_heart: | mvnsite | 4m 21s | | branch-3.4 passed | | +1 :green_heart: | javadoc | 4m 25s | | branch-3.4 passed | | +1 :green_heart: | shadedclient | 239m 15s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 2m 04s | | the patch passed | | +1 :green_heart: | compile | 1m 58s | | the patch passed | | +1 :green_heart: | javac | 1m 58s | | the patch passed | | +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. 
| | +1 :green_heart: | mvnsite | 1m 58s | | the patch passed | | +1 :green_heart: | javadoc | 1m 52s | | the patch passed | | +1 :green_heart: | shadedclient | 150m 41s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 4m 32s | | hadoop-project in the patch passed. | | -1 :x: | asflicense | 5m 23s | [/results-asflicense.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6545/1/artifact/out/results-asflicense.txt) | The patch generated 1 ASF License warnings. | | | | 422m 00s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6545 | | Optional Tests | dupname asflicense mvnsite unit codespell detsecrets shellcheck shelldocs compile javac javadoc mvninstall shadedclient xmllint | | uname | MINGW64_NT-10.0-17763 e57f08a91a03 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/out/precommit/personality/provided.sh | | git revision | branch-3.4 / 1f110c1da42f607b2454a13d68074dc88879874b | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6545/1/testReport/ | | modules | C: hadoop-project U: hadoop-project | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6545/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. 
> Update maven-surefire-plugin from 3.0.0 to 3.2.5 > - > > Key: HADOOP-19071 > URL: https://issues.apache.org/jira/browse/HADOOP-19071 > Project: Hadoop Common > Issue Type: Sub-task > Components: build, common >Affects Versions: 3.4.0, 3.5.0 >Reporter: Shilun Fan >Assignee: Shilun Fan >Priority: Major > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
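The change itself is the usual pluginManagement version bump. Hadoop actually routes the version through a property in hadoop-project/pom.xml; the fragment below inlines it for brevity, as an illustration only.

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>3.2.5</version>
</plugin>
```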
[jira] [Commented] (HADOOP-19139) [ABFS]: No GetPathStatus call for opening AbfsInputStream
[ https://issues.apache.org/jira/browse/HADOOP-19139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840736#comment-17840736 ] ASF GitHub Bot commented on HADOOP-19139: - anmolanmol1234 commented on code in PR #6699: URL: https://github.com/apache/hadoop/pull/6699#discussion_r1579209761

## hadoop-tools/hadoop-azure/src/test/java/org/apache/hadoop/fs/azurebfs/ITestAzureBlobFileSystemAuthorization.java:
```
@@ -328,13 +328,13 @@ private void executeOp(Path reqPath, AzureBlobFileSystem fs,
         fs.open(reqPath);
         break;
       case Open:
-        InputStream is = fs.open(reqPath);
-        if (getConfiguration().getHeadOptimizationForInputStream()) {
-          try {
-            is.read();
-          } catch (IOException ex) {
-            is.close();
-            throw (IOException) ex.getCause();
+        try (InputStream is = fs.open(reqPath)) {
+          if (getConfiguration().isInputStreamLazyOptimizationEnabled()) {
+            try {
+              is.read();
+            } catch (IOException ex) {
+              throw (IOException) ex.getCause();
+            }
```

Review Comment: okay, missed that part, makes sense

> [ABFS]: No GetPathStatus call for opening AbfsInputStream
> ---------------------------------------------------------
>
>                 Key: HADOOP-19139
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19139
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/azure
>            Reporter: Pranav Saxena
>            Assignee: Pranav Saxena
>            Priority: Major
>              Labels: pull-request-available
>
> Read API gives contentLen and etag of the path. This information would be
> used in future calls on that inputStream. Prior information of eTag is of not
> much importance.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
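The review above swaps a manual `is.close()` in the catch block for a try-with-resources statement. A minimal standalone sketch of why that is equivalent and safer (hypothetical `TrackingStream` class for illustration; this is not the ABFS test code):

```java
import java.io.IOException;
import java.io.InputStream;

public class TryWithResourcesDemo {
    // Hypothetical stream: read() fails with a wrapped cause, and we
    // record whether close() ran.
    public static class TrackingStream extends InputStream {
        public boolean closed = false;
        @Override public int read() throws IOException {
            throw new IOException("read failed", new IOException("root cause"));
        }
        @Override public void close() { closed = true; }
    }

    // Mirrors the reviewed pattern: the resource block closes the stream on
    // every exit path, so the catch block no longer needs an explicit close().
    public static String readAndUnwrap(TrackingStream s) {
        try (InputStream in = s) {
            in.read();
            return "ok";
        } catch (IOException ex) {
            // Rethrow/inspect the cause, as the patch does.
            return ex.getCause().getMessage();
        }
    }

    public static void main(String[] args) {
        TrackingStream s = new TrackingStream();
        System.out.println(readAndUnwrap(s)); // prints "root cause"
        System.out.println(s.closed);         // prints "true"
    }
}
```

The stream is closed before the catch block runs, even when `read()` throws, which is exactly the guarantee the reviewer was asking about.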
[jira] [Commented] (HADOOP-19139) [ABFS]: No GetPathStatus call for opening AbfsInputStream
[ https://issues.apache.org/jira/browse/HADOOP-19139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840733#comment-17840733 ] ASF GitHub Bot commented on HADOOP-19139: - saxenapranav commented on code in PR #6699: URL: https://github.com/apache/hadoop/pull/6699#discussion_r1579189592

## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/services/AbfsInputStream.java:
```
@@ -376,32 +439,48 @@ private int readLastBlock(final byte[] b, final int off, final int len)
     // data need to be copied to user buffer from index bCursor,
     // AbfsInutStream buffer is going to contain data from last block start. In
     // that case bCursor will be set to fCursor - lastBlockStart
-    long lastBlockStart = max(0, contentLength - footerReadSize);
+    if (!fileStatusInformationPresent.get()) {
+      long lastBlockStart = max(0, (fCursor + len) - footerReadSize);
+      bCursor = (int) (fCursor - lastBlockStart);
+      return optimisedRead(b, off, len, lastBlockStart, min(fCursor + len, footerReadSize), true);
+    }
+    long lastBlockStart = max(0, getContentLength() - footerReadSize);
     bCursor = (int) (fCursor - lastBlockStart);
     // 0 if contentlength is < buffersize
-    long actualLenToRead = min(footerReadSize, contentLength);
-    return optimisedRead(b, off, len, lastBlockStart, actualLenToRead);
+    long actualLenToRead = min(footerReadSize, getContentLength());
+    return optimisedRead(b, off, len, lastBlockStart, actualLenToRead, false);
   }

   private int optimisedRead(final byte[] b, final int off, final int len,
-      final long readFrom, final long actualLen) throws IOException {
+      final long readFrom, final long actualLen,
+      final boolean isReadWithoutContentLengthInformation) throws IOException {
     fCursor = readFrom;
     int totalBytesRead = 0;
     int lastBytesRead = 0;
     try {
       buffer = new byte[bufferSize];
+      boolean fileStatusInformationPresentBeforeRead = fileStatusInformationPresent.get();
       for (int i = 0;
-          i < MAX_OPTIMIZED_READ_ATTEMPTS && fCursor < contentLength; i++) {
+          i < MAX_OPTIMIZED_READ_ATTEMPTS && (!fileStatusInformationPresent.get()
+              || fCursor < getContentLength()); i++) {
         lastBytesRead = readInternal(fCursor, buffer, limit,
             (int) actualLen - limit, true);
         if (lastBytesRead > 0) {
           totalBytesRead += lastBytesRead;
+          boolean shouldBreak = !fileStatusInformationPresentBeforeRead
+              && totalBytesRead == (int) actualLen;
           limit += lastBytesRead;
           fCursor += lastBytesRead;
           fCursorAfterLastRead = fCursor;
+          if (shouldBreak) {
+            break;
+          }
         }
       }
     } catch (IOException e) {
+      if (isNonRetriableOptimizedReadException(e)) {
+        throw e;
```

Review Comment: Added.

> [ABFS]: No GetPathStatus call for opening AbfsInputStream
> ---------------------------------------------------------
>
>                 Key: HADOOP-19139
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19139
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/azure
>            Reporter: Pranav Saxena
>            Assignee: Pranav Saxena
>            Priority: Major
>              Labels: pull-request-available
>
> Read API gives contentLen and etag of the path. This information would be
> used in future calls on that inputStream. Prior information of eTag is of not
> much importance.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-19139) [ABFS]: No GetPathStatus call for opening AbfsInputStream
[ https://issues.apache.org/jira/browse/HADOOP-19139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840732#comment-17840732 ] ASF GitHub Bot commented on HADOOP-19139: - saxenapranav commented on code in PR #6699: URL: https://github.com/apache/hadoop/pull/6699#discussion_r1579186349

## hadoop-tools/hadoop-azure/src/test/java/org/apache/hadoop/fs/azurebfs/ITestAbfsNetworkStatistics.java:
```
@@ -231,7 +237,17 @@ public void testAbfsHttpResponseStatistics() throws IOException {
       // 1 read request = 1 connection and 1 get response
       expectedConnectionsMade++;
       expectedGetResponses++;
-      expectedBytesReceived += bytesWrittenToFile;
+      if (!getConfiguration().getHeadOptimizationForInputStream()) {
+        expectedBytesReceived += bytesWrittenToFile;
+      } else {
+        /*
+         * With head optimization enabled, the abfsInputStream is not aware
+         * of the contentLength and hence, it would only read data for which the range
+         * is provided. With the first remote call done, the inputStream will get
+         * aware of the contentLength and would be able to use it for further reads.
+         */
+        expectedBytesReceived += 1;
```

Review Comment: At this point, the inputStream is at position 0 and the read request from the application is 1 byte. If the read-full-file optimization is enabled, the inputStream would attempt to read the first readBuffer block of the file, which would read the whole file, as the fileContentLength is smaller than the readBuffer size.

> [ABFS]: No GetPathStatus call for opening AbfsInputStream
> ---------------------------------------------------------
>
>                 Key: HADOOP-19139
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19139
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/azure
>            Reporter: Pranav Saxena
>            Assignee: Pranav Saxena
>            Priority: Major
>              Labels: pull-request-available
>
> Read API gives contentLen and etag of the path. This information would be
> used in future calls on that inputStream. Prior information of eTag is of not
> much importance.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
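The accounting rule discussed in this thread can be sketched as a small pure function (hypothetical names, not the hadoop-azure test code): when the content length is known up front, the optimized read pulls the whole small file into the buffer; when it is not, only the requested range can be asked for on the first remote call.

```java
public class ExpectedBytesSketch {
    // Sketch of the test's expectation under an assumed head-optimization
    // flag; the real ITestAbfsNetworkStatistics logic lives in the patch.
    public static long expectedBytesReceived(boolean headOptimizationEnabled,
                                             long bytesWrittenToFile,
                                             long requestedRangeLen) {
        if (!headOptimizationEnabled) {
            // Content length known: the optimized read fetches the whole
            // (smaller-than-buffer) file on the first GET.
            return bytesWrittenToFile;
        }
        // Content length unknown: only the requested range (1 byte in the
        // test) is fetched on the first remote call.
        return requestedRangeLen;
    }

    public static void main(String[] args) {
        System.out.println(expectedBytesReceived(false, 4096, 1)); // 4096
        System.out.println(expectedBytesReceived(true, 4096, 1));  // 1
    }
}
```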
[jira] [Commented] (HADOOP-19139) [ABFS]: No GetPathStatus call for opening AbfsInputStream
[ https://issues.apache.org/jira/browse/HADOOP-19139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840729#comment-17840729 ] ASF GitHub Bot commented on HADOOP-19139: - saxenapranav commented on code in PR #6699: URL: https://github.com/apache/hadoop/pull/6699#discussion_r1579180763

## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/services/AbfsInputStream.java:
```
@@ -376,32 +439,48 @@ private int readLastBlock(final byte[] b, final int off, final int len)
     // data need to be copied to user buffer from index bCursor,
     // AbfsInutStream buffer is going to contain data from last block start. In
     // that case bCursor will be set to fCursor - lastBlockStart
-    long lastBlockStart = max(0, contentLength - footerReadSize);
+    if (!fileStatusInformationPresent.get()) {
+      long lastBlockStart = max(0, (fCursor + len) - footerReadSize);
+      bCursor = (int) (fCursor - lastBlockStart);
+      return optimisedRead(b, off, len, lastBlockStart, min(fCursor + len, footerReadSize), true);
+    }
+    long lastBlockStart = max(0, getContentLength() - footerReadSize);
     bCursor = (int) (fCursor - lastBlockStart);
     // 0 if contentlength is < buffersize
-    long actualLenToRead = min(footerReadSize, contentLength);
-    return optimisedRead(b, off, len, lastBlockStart, actualLenToRead);
+    long actualLenToRead = min(footerReadSize, getContentLength());
+    return optimisedRead(b, off, len, lastBlockStart, actualLenToRead, false);
   }

   private int optimisedRead(final byte[] b, final int off, final int len,
-      final long readFrom, final long actualLen) throws IOException {
+      final long readFrom, final long actualLen,
+      final boolean isReadWithoutContentLengthInformation) throws IOException {
     fCursor = readFrom;
     int totalBytesRead = 0;
     int lastBytesRead = 0;
     try {
       buffer = new byte[bufferSize];
+      boolean fileStatusInformationPresentBeforeRead = fileStatusInformationPresent.get();
       for (int i = 0;
-          i < MAX_OPTIMIZED_READ_ATTEMPTS && fCursor < contentLength; i++) {
+          i < MAX_OPTIMIZED_READ_ATTEMPTS && (!fileStatusInformationPresent.get()
+              || fCursor < getContentLength()); i++) {
         lastBytesRead = readInternal(fCursor, buffer, limit,
             (int) actualLen - limit, true);
         if (lastBytesRead > 0) {
           totalBytesRead += lastBytesRead;
+          boolean shouldBreak = !fileStatusInformationPresentBeforeRead
+              && totalBytesRead == (int) actualLen;
           limit += lastBytesRead;
           fCursor += lastBytesRead;
           fCursorAfterLastRead = fCursor;
+          if (shouldBreak) {
+            break;
+          }
         }
       }
     } catch (IOException e) {
+      if (isNonRetriableOptimizedReadException(e)) {
+        throw e;
```

Review Comment: adding:
```
/*
 * FileNotFoundException in AbfsInputStream read can happen only in case of
 * lazy optimization enabled. In such case, the contentLength is not known
 * before opening the inputStream, and the first read can give a
 * FileNotFoundException. If this exception is raised, it has to be
 * thrown back to the application and not make a readOneBlock call.
 */
```

> [ABFS]: No GetPathStatus call for opening AbfsInputStream
> ---------------------------------------------------------
>
>                 Key: HADOOP-19139
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19139
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/azure
>            Reporter: Pranav Saxena
>            Assignee: Pranav Saxena
>            Priority: Major
>              Labels: pull-request-available
>
> Read API gives contentLen and etag of the path. This information would be
> used in future calls on that inputStream. Prior information of eTag is of not
> much importance.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
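The policy debated in this thread can be sketched as a small predicate plus a dispatch decision. The names below (`isNonRetriable`, `onOptimisedReadFailure`) are illustrative stand-ins, not the actual `AbfsInputStream` API; the real `isNonRetriableOptimizedReadException` check may cover more cases.

```java
import java.io.FileNotFoundException;
import java.io.IOException;

public class NonRetriableReadSketch {
    // Hypothetical stand-in for the patch's
    // isNonRetriableOptimizedReadException: a missing path cannot succeed on
    // the fallback read path either, so retrying is pointless.
    public static boolean isNonRetriable(IOException e) {
        return e instanceof FileNotFoundException;
    }

    // Sketch of the catch-block decision: surface a non-retriable failure to
    // the application, otherwise fall back to the plain one-block read.
    public static String onOptimisedReadFailure(IOException e) {
        return isNonRetriable(e) ? "rethrow" : "fallback-to-readOneBlock";
    }

    public static void main(String[] args) {
        System.out.println(onOptimisedReadFailure(new FileNotFoundException("no such path")));
        System.out.println(onOptimisedReadFailure(new IOException("transient failure")));
    }
}
```

This matches the rationale given above: with lazy optimization an inputStream can be opened on a non-existing path, so a FileNotFoundException from the first optimized read must fail the stream rather than trigger the readOneBlock fallback.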
[jira] [Commented] (HADOOP-19139) [ABFS]: No GetPathStatus call for opening AbfsInputStream
[ https://issues.apache.org/jira/browse/HADOOP-19139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840724#comment-17840724 ] ASF GitHub Bot commented on HADOOP-19139: - saxenapranav commented on code in PR #6699: URL: https://github.com/apache/hadoop/pull/6699#discussion_r1579170762

## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/services/AbfsInputStream.java:
```
@@ -376,32 +439,48 @@ private int readLastBlock(final byte[] b, final int off, final int len)
     // data need to be copied to user buffer from index bCursor,
     // AbfsInutStream buffer is going to contain data from last block start. In
     // that case bCursor will be set to fCursor - lastBlockStart
-    long lastBlockStart = max(0, contentLength - footerReadSize);
+    if (!fileStatusInformationPresent.get()) {
+      long lastBlockStart = max(0, (fCursor + len) - footerReadSize);
+      bCursor = (int) (fCursor - lastBlockStart);
+      return optimisedRead(b, off, len, lastBlockStart, min(fCursor + len, footerReadSize), true);
+    }
+    long lastBlockStart = max(0, getContentLength() - footerReadSize);
     bCursor = (int) (fCursor - lastBlockStart);
     // 0 if contentlength is < buffersize
-    long actualLenToRead = min(footerReadSize, contentLength);
-    return optimisedRead(b, off, len, lastBlockStart, actualLenToRead);
+    long actualLenToRead = min(footerReadSize, getContentLength());
+    return optimisedRead(b, off, len, lastBlockStart, actualLenToRead, false);
   }

   private int optimisedRead(final byte[] b, final int off, final int len,
-      final long readFrom, final long actualLen) throws IOException {
+      final long readFrom, final long actualLen,
+      final boolean isReadWithoutContentLengthInformation) throws IOException {
     fCursor = readFrom;
     int totalBytesRead = 0;
     int lastBytesRead = 0;
     try {
       buffer = new byte[bufferSize];
+      boolean fileStatusInformationPresentBeforeRead = fileStatusInformationPresent.get();
       for (int i = 0;
-          i < MAX_OPTIMIZED_READ_ATTEMPTS && fCursor < contentLength; i++) {
+          i < MAX_OPTIMIZED_READ_ATTEMPTS && (!fileStatusInformationPresent.get()
+              || fCursor < getContentLength()); i++) {
         lastBytesRead = readInternal(fCursor, buffer, limit,
             (int) actualLen - limit, true);
         if (lastBytesRead > 0) {
           totalBytesRead += lastBytesRead;
+          boolean shouldBreak = !fileStatusInformationPresentBeforeRead
+              && totalBytesRead == (int) actualLen;
           limit += lastBytesRead;
           fCursor += lastBytesRead;
           fCursorAfterLastRead = fCursor;
+          if (shouldBreak) {
+            break;
+          }
         }
       }
     } catch (IOException e) {
+      if (isNonRetriableOptimizedReadException(e)) {
+        throw e;
```

Review Comment: So, this is the case where lazy optimization is on and the inputStream is not aware of the contentLength. On the first read, it could go into an optimized block and try to read. In the non-lazy case, on trunk, if an IOException is raised in optimisedRead, it falls back to readOneBlock. In the non-lazy case an inputStream can never be created for a non-existing path; in the lazy case it can, and optimisedRead can be attempted on it. When optimisedRead fails with FileNotFoundException, the inputStream should fail and not try readOneBlock. I will add a comment for better code understanding in the future.

> [ABFS]: No GetPathStatus call for opening AbfsInputStream
> ---------------------------------------------------------
>
>                 Key: HADOOP-19139
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19139
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/azure
>            Reporter: Pranav Saxena
>            Assignee: Pranav Saxena
>            Priority: Major
>              Labels: pull-request-available
>
> Read API gives contentLen and etag of the path. This information would be
> used in future calls on that inputStream. Prior information of eTag is of not
> much importance.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-19137) [ABFS]:Extra getAcl call while calling the very first API of FileSystem
[ https://issues.apache.org/jira/browse/HADOOP-19137?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840673#comment-17840673 ] ASF GitHub Bot commented on HADOOP-19137: - saxenapranav commented on code in PR #6752: URL: https://github.com/apache/hadoop/pull/6752#discussion_r1578968010

## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/AzureBlobFileSystemStore.java:
```
@@ -373,7 +372,21 @@ public boolean getIsNamespaceEnabled(TracingContext tracingContext)
           + " getAcl server call", e);
     }
-    isNamespaceEnabled = Trilean.getTrilean(NamespaceUtil.isNamespaceEnabled(client, tracingContext));
+    try {
+      LOG.debug("Get root ACL status");
```

Review Comment: This class was actually introduced in the CPK PR: https://github.com/apache/hadoop/pull/6221/files#diff-aed851febeaa85ad9b0c254c00ebd21200b84d36bc99255a344db89319aab0b0. The reason for adding this class was that we wanted to check namespace information in both AbfsClient and the store. This is no longer required, as we only need the information in the store. The reason for removing this class is to restore the state that existed before the CPK merge.

> [ABFS]:Extra getAcl call while calling the very first API of FileSystem
> -----------------------------------------------------------------------
>
>                 Key: HADOOP-19137
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19137
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/azure
>    Affects Versions: 3.4.0
>            Reporter: Pranav Saxena
>            Assignee: Pranav Saxena
>            Priority: Major
>              Labels: pull-request-available
>
> Store doesn't flow in the namespace information to the client.
> In https://github.com/apache/hadoop/pull/6221, getIsNamespaceEnabled is added
> in client methods which checks if namespace information is there or not, and
> if not there, it will make getAcl call and set the field. Once the field is
> set, it would be used in future getIsNamespaceEnabled method calls for a
> given AbfsClient.
> Since, CPK both global and encryptionContext are only for hns account, the
> fix that is proposed is that we would fail fs init if its non-hns account and
> cpk config is given.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
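The pattern described in the issue (probe namespace support once via getAcl, cache the answer for all later `getIsNamespaceEnabled` calls) can be sketched generically. This is a simplified illustration, not the `AzureBlobFileSystemStore` code; `NamespaceCheckSketch` and `remoteProbe` are hypothetical names, with the three-valued state modeled after Hadoop's `Trilean`.

```java
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.Supplier;

public class NamespaceCheckSketch {
    // Three-valued state like Hadoop's Trilean: the answer starts UNKNOWN
    // and is pinned after a single remote probe.
    public enum Trilean { TRUE, FALSE, UNKNOWN }

    private final AtomicReference<Trilean> cached =
        new AtomicReference<>(Trilean.UNKNOWN);
    private final Supplier<Boolean> remoteProbe; // e.g. a getAcl call on root
    public int remoteCalls = 0;                  // instrumentation for the demo

    public NamespaceCheckSketch(Supplier<Boolean> remoteProbe) {
        this.remoteProbe = remoteProbe;
    }

    public boolean isNamespaceEnabled() {
        if (cached.get() == Trilean.UNKNOWN) {   // only the first caller pays the RPC
            remoteCalls++;
            cached.set(remoteProbe.get() ? Trilean.TRUE : Trilean.FALSE);
        }
        return cached.get() == Trilean.TRUE;
    }

    public static void main(String[] args) {
        NamespaceCheckSketch store = new NamespaceCheckSketch(() -> true);
        System.out.println(store.isNamespaceEnabled()); // probe happens here
        System.out.println(store.isNamespaceEnabled()); // served from cache
        System.out.println(store.remoteCalls);          // 1
    }
}
```

Keeping the cached state in one place (the store) rather than in both the store and the client is exactly what the review comment argues for.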
[jira] [Commented] (HADOOP-19137) [ABFS]:Extra getAcl call while calling the very first API of FileSystem
[ https://issues.apache.org/jira/browse/HADOOP-19137?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840674#comment-17840674 ] ASF GitHub Bot commented on HADOOP-19137: - saxenapranav commented on code in PR #6752: URL: https://github.com/apache/hadoop/pull/6752#discussion_r1578968010

## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/AzureBlobFileSystemStore.java:
```
@@ -373,7 +372,21 @@ public boolean getIsNamespaceEnabled(TracingContext tracingContext)
           + " getAcl server call", e);
     }
-    isNamespaceEnabled = Trilean.getTrilean(NamespaceUtil.isNamespaceEnabled(client, tracingContext));
+    try {
+      LOG.debug("Get root ACL status");
```

Review Comment: This class was actually introduced in the CPK PR: https://github.com/apache/hadoop/pull/6221/files#diff-aed851febeaa85ad9b0c254c00ebd21200b84d36bc99255a344db89319aab0b0. The reason for adding this class was that we wanted to check namespace information in both AbfsClient and the store. This is no longer required, as we only need the information in the store. The reason for removing this class is to restore the state that existed before the CPK merge; there is also no longer any need for this util method, as it is used in only one place.

> [ABFS]:Extra getAcl call while calling the very first API of FileSystem
> -----------------------------------------------------------------------
>
>                 Key: HADOOP-19137
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19137
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/azure
>    Affects Versions: 3.4.0
>            Reporter: Pranav Saxena
>            Assignee: Pranav Saxena
>            Priority: Major
>              Labels: pull-request-available
>
> Store doesn't flow in the namespace information to the client.
> In https://github.com/apache/hadoop/pull/6221, getIsNamespaceEnabled is added
> in client methods which checks if namespace information is there or not, and
> if not there, it will make getAcl call and set the field. Once the field is
> set, it would be used in future getIsNamespaceEnabled method calls for a
> given AbfsClient.
> Since, CPK both global and encryptionContext are only for hns account, the
> fix that is proposed is that we would fail fs init if its non-hns account and
> cpk config is given.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-19137) [ABFS]:Extra getAcl call while calling the very first API of FileSystem
[ https://issues.apache.org/jira/browse/HADOOP-19137?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840671#comment-17840671 ] ASF GitHub Bot commented on HADOOP-19137: - anmolanmol1234 commented on code in PR #6752: URL: https://github.com/apache/hadoop/pull/6752#discussion_r1578943627

## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/AzureBlobFileSystemStore.java:
```
@@ -373,7 +372,21 @@ public boolean getIsNamespaceEnabled(TracingContext tracingContext)
           + " getAcl server call", e);
     }
-    isNamespaceEnabled = Trilean.getTrilean(NamespaceUtil.isNamespaceEnabled(client, tracingContext));
+    try {
+      LOG.debug("Get root ACL status");
```

Review Comment: Why are we removing the NamespaceUtil class? In my view we should not increase the diff.

> [ABFS]:Extra getAcl call while calling the very first API of FileSystem
> -----------------------------------------------------------------------
>
>                 Key: HADOOP-19137
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19137
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/azure
>    Affects Versions: 3.4.0
>            Reporter: Pranav Saxena
>            Assignee: Pranav Saxena
>            Priority: Major
>              Labels: pull-request-available
>
> Store doesn't flow in the namespace information to the client.
> In https://github.com/apache/hadoop/pull/3440, getIsNamespaceEnabled is added
> in client methods which checks if namespace information is there or not, and
> if not there, it will make getAcl call and set the field. Once the field is
> set, it would be used in future getIsNamespaceEnabled method calls for a
> given AbfsClient.
> Since, CPK both global and encryptionContext are only for hns account, the
> fix that is proposed is that we would fail fs init if its non-hns account and
> cpk config is given.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-19139) [ABFS]: No GetPathStatus call for opening AbfsInputStream
[ https://issues.apache.org/jira/browse/HADOOP-19139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840669#comment-17840669 ] ASF GitHub Bot commented on HADOOP-19139: - saxenapranav commented on code in PR #6699: URL: https://github.com/apache/hadoop/pull/6699#discussion_r1578942351

## hadoop-tools/hadoop-azure/src/test/java/org/apache/hadoop/fs/azurebfs/ITestAzureBlobFileSystemAuthorization.java:
```
@@ -328,13 +328,13 @@ private void executeOp(Path reqPath, AzureBlobFileSystem fs,
         fs.open(reqPath);
         break;
       case Open:
-        InputStream is = fs.open(reqPath);
-        if (getConfiguration().getHeadOptimizationForInputStream()) {
-          try {
-            is.read();
-          } catch (IOException ex) {
-            is.close();
-            throw (IOException) ex.getCause();
+        try (InputStream is = fs.open(reqPath)) {
+          if (getConfiguration().isInputStreamLazyOptimizationEnabled()) {
+            try {
+              is.read();
+            } catch (IOException ex) {
+              throw (IOException) ex.getCause();
+            }
```

Review Comment: The inputStream is opened with a try-with-resources statement; at the end of the block, the inputStream is closed (even when an exception is raised).

> [ABFS]: No GetPathStatus call for opening AbfsInputStream
> ---------------------------------------------------------
>
>                 Key: HADOOP-19139
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19139
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/azure
>            Reporter: Pranav Saxena
>            Assignee: Pranav Saxena
>            Priority: Major
>              Labels: pull-request-available
>
> Read API gives contentLen and etag of the path. This information would be
> used in future calls on that inputStream. Prior information of eTag is of not
> much importance.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-19139) [ABFS]: No GetPathStatus call for opening AbfsInputStream
[ https://issues.apache.org/jira/browse/HADOOP-19139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840665#comment-17840665 ] ASF GitHub Bot commented on HADOOP-19139: - anmolanmol1234 commented on code in PR #6699: URL: https://github.com/apache/hadoop/pull/6699#discussion_r1578933240

## hadoop-tools/hadoop-azure/src/test/java/org/apache/hadoop/fs/azurebfs/ITestAbfsNetworkStatistics.java:
```
@@ -231,7 +237,17 @@ public void testAbfsHttpResponseStatistics() throws IOException {
       // 1 read request = 1 connection and 1 get response
       expectedConnectionsMade++;
       expectedGetResponses++;
-      expectedBytesReceived += bytesWrittenToFile;
+      if (!getConfiguration().getHeadOptimizationForInputStream()) {
+        expectedBytesReceived += bytesWrittenToFile;
+      } else {
+        /*
+         * With head optimization enabled, the abfsInputStream is not aware
+         * of the contentLength and hence, it would only read data for which the range
+         * is provided. With the first remote call done, the inputStream will get
+         * aware of the contentLength and would be able to use it for further reads.
+         */
+        expectedBytesReceived += 1;
```

Review Comment: That should mean +1 for the operation; why would the bytes received be 1? Bytes received should be equal to the range it read, right?

## hadoop-tools/hadoop-azure/src/main/java/org/apache/hadoop/fs/azurebfs/services/AbfsInputStream.java:
```
@@ -376,32 +439,48 @@ private int readLastBlock(final byte[] b, final int off, final int len)
     // data need to be copied to user buffer from index bCursor,
     // AbfsInutStream buffer is going to contain data from last block start. In
     // that case bCursor will be set to fCursor - lastBlockStart
-    long lastBlockStart = max(0, contentLength - footerReadSize);
+    if (!fileStatusInformationPresent.get()) {
+      long lastBlockStart = max(0, (fCursor + len) - footerReadSize);
+      bCursor = (int) (fCursor - lastBlockStart);
+      return optimisedRead(b, off, len, lastBlockStart, min(fCursor + len, footerReadSize), true);
+    }
+    long lastBlockStart = max(0, getContentLength() - footerReadSize);
     bCursor = (int) (fCursor - lastBlockStart);
     // 0 if contentlength is < buffersize
-    long actualLenToRead = min(footerReadSize, contentLength);
-    return optimisedRead(b, off, len, lastBlockStart, actualLenToRead);
+    long actualLenToRead = min(footerReadSize, getContentLength());
+    return optimisedRead(b, off, len, lastBlockStart, actualLenToRead, false);
   }

   private int optimisedRead(final byte[] b, final int off, final int len,
-      final long readFrom, final long actualLen) throws IOException {
+      final long readFrom, final long actualLen,
+      final boolean isReadWithoutContentLengthInformation) throws IOException {
     fCursor = readFrom;
     int totalBytesRead = 0;
     int lastBytesRead = 0;
     try {
       buffer = new byte[bufferSize];
+      boolean fileStatusInformationPresentBeforeRead = fileStatusInformationPresent.get();
       for (int i = 0;
-          i < MAX_OPTIMIZED_READ_ATTEMPTS && fCursor < contentLength; i++) {
+          i < MAX_OPTIMIZED_READ_ATTEMPTS && (!fileStatusInformationPresent.get()
+              || fCursor < getContentLength()); i++) {
         lastBytesRead = readInternal(fCursor, buffer, limit,
             (int) actualLen - limit, true);
         if (lastBytesRead > 0) {
           totalBytesRead += lastBytesRead;
+          boolean shouldBreak = !fileStatusInformationPresentBeforeRead
+              && totalBytesRead == (int) actualLen;
           limit += lastBytesRead;
           fCursor += lastBytesRead;
           fCursorAfterLastRead = fCursor;
+          if (shouldBreak) {
+            break;
+          }
         }
       }
     } catch (IOException e) {
+      if (isNonRetriableOptimizedReadException(e)) {
+        throw e;
```

Review Comment: Can you explain this a bit more, on the exception only for FileNotFound being thrown?

## hadoop-tools/hadoop-azure/src/test/java/org/apache/hadoop/fs/azurebfs/ITestAzureBlobFileSystemAuthorization.java:
```
@@ -328,13 +328,13 @@ private void executeOp(Path reqPath, AzureBlobFileSystem fs,
         fs.open(reqPath);
         break;
       case Open:
-        InputStream is = fs.open(reqPath);
-        if (getConfiguration().getHeadOptimizationForInputStream()) {
-          try {
-            is.read();
-          } catch (IOException ex) {
-            is.close();
-            throw (IOException) ex.getCause();
+        try (InputStream is = fs.open(reqPath)) {
+          if (getConfiguration().isInputStreamLazyOptimizationEnabled()) {
+            try {
+              is.read();
+            } catch (IOException ex) {
+              throw (IOException) ex.getCause();
+            }
```

Review Comment: finally block to close the input stream
[jira] [Commented] (HADOOP-19103) Add logic for verifying that the STS URL is in the correct format
[ https://issues.apache.org/jira/browse/HADOOP-19103?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840614#comment-17840614 ] ASF GitHub Bot commented on HADOOP-19103: - hadoop-yetus commented on PR #6615: URL: https://github.com/apache/hadoop/pull/6615#issuecomment-2076140834 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 00s | | No case conflicting files found. | | +0 :ok: | spotbugs | 0m 01s | | spotbugs executables are not available. | | +0 :ok: | codespell | 0m 01s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 01s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 00s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 106m 34s | | trunk passed | | +1 :green_heart: | compile | 5m 51s | | trunk passed | | +1 :green_heart: | checkstyle | 5m 17s | | trunk passed | | +1 :green_heart: | mvnsite | 5m 45s | | trunk passed | | +1 :green_heart: | javadoc | 5m 24s | | trunk passed | | +1 :green_heart: | shadedclient | 168m 41s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 3m 37s | | the patch passed | | +1 :green_heart: | compile | 2m 40s | | the patch passed | | +1 :green_heart: | javac | 2m 40s | | the patch passed | | +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 2m 24s | | the patch passed | | +1 :green_heart: | mvnsite | 2m 49s | | the patch passed | | +1 :green_heart: | javadoc | 2m 31s | | the patch passed | | +1 :green_heart: | shadedclient | 184m 37s | | patch has no errors when building and testing our client artifacts. 
| _ Other Tests _ | | +1 :green_heart: | asflicense | 6m 13s | | The patch does not generate ASF License warnings. | | | | 487m 07s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6615 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | MINGW64_NT-10.0-17763 634fe718ba01 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / f7bfc5e8adb758bda80818d15721fc271fc0d7ff | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6615/1/testReport/ | | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6615/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > Add logic for verifying that the STS URL is in the correct format > - > > Key: HADOOP-19103 > URL: https://issues.apache.org/jira/browse/HADOOP-19103 > Project: Hadoop Common > Issue Type: Bug > Components: fs/s3 >Reporter: Narayanan Venkateswaran >Priority: Minor > Labels: pull-request-available > > * At present an invalid URL can be supplied as an STS endpoint. It will > attempt to create an STSClient with it and then fail with, > {quote}java.net.UnknownHostException: request session credentials: > software.amazon.awssdk.core.exception.SdkClientException: Received an > UnknownHostException when attempting to interact with a service. See cause > for the exact endpoint that is failing to resolve. 
If this is happening on an > endpoint that previously worked, there may be a network connectivity issue or > your DNS cache could be storing endpoints for too long.: > software.amazon.awssdk.core.exception.SdkClientException: Received an > UnknownHostException when attempting to interact with a service. See cause > for the exact endpoint that is failing to resolve. If this is happening on an > endpoint that previously worked, there may be a network connectivity issue or > your DNS cache could be storing endpoints for too long.: https > {quote} * This is inefficient. An invalid URL can be parsed much
[jira] [Commented] (HADOOP-19152) Do not hard code security providers.
[ https://issues.apache.org/jira/browse/HADOOP-19152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840606#comment-17840606 ] ASF GitHub Bot commented on HADOOP-19152: - hadoop-yetus commented on PR #6739: URL: https://github.com/apache/hadoop/pull/6739#issuecomment-2076112558 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 53s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 3 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 44m 55s | | trunk passed | | +1 :green_heart: | compile | 17m 28s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 16m 13s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 1m 19s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 42s | | trunk passed | | +1 :green_heart: | javadoc | 1m 18s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 0m 51s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 2m 33s | | trunk passed | | +1 :green_heart: | shadedclient | 35m 9s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 57s | | the patch passed | | +1 :green_heart: | compile | 16m 43s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 16m 43s | | the patch passed | | +1 :green_heart: | compile | 16m 12s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 16m 12s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 1m 12s | [/results-checkstyle-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6739/6/artifact/out/results-checkstyle-hadoop-common-project_hadoop-common.txt) | hadoop-common-project/hadoop-common: The patch generated 8 new + 124 unchanged - 7 fixed = 132 total (was 131) | | +1 :green_heart: | mvnsite | 1m 40s | | the patch passed | | +1 :green_heart: | javadoc | 1m 10s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 0m 55s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 2m 45s | | the patch passed | | +1 :green_heart: | shadedclient | 36m 5s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 19m 47s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 1m 3s | | The patch does not generate ASF License warnings. 
| | | | 224m 54s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.45 ServerAPI=1.45 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6739/6/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/6739 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle | | uname | Linux 7bfec095725f 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / d1daa7ade1a1f83d6851c42743a146f4b9aa52dc | | Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6739/6/testReport/ | | Max. process+thread count | 2067 (vs. ulimit of 5500) | | modules | C:
[jira] [Commented] (HADOOP-18184) s3a prefetching stream to support unbuffer()
[ https://issues.apache.org/jira/browse/HADOOP-18184?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840581#comment-17840581 ] ASF GitHub Bot commented on HADOOP-18184: - hadoop-yetus commented on PR #5832: URL: https://github.com/apache/hadoop/pull/5832#issuecomment-2075868288 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 7m 9s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 28 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 7s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 19m 54s | | trunk passed | | +1 :green_heart: | compile | 9m 2s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 8m 9s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 2m 3s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 35s | | trunk passed | | +1 :green_heart: | javadoc | 1m 15s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 1m 10s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 2m 20s | | trunk passed | | +1 :green_heart: | shadedclient | 20m 51s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 23s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 0m 50s | | the patch passed | | +1 :green_heart: | compile | 8m 29s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 8m 29s | | the patch passed | | +1 :green_heart: | compile | 8m 21s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 8m 21s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 2m 3s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5832/21/artifact/out/results-checkstyle-root.txt) | root: The patch generated 39 new + 9 unchanged - 0 fixed = 48 total (was 9) | | +1 :green_heart: | mvnsite | 1m 35s | | the patch passed | | +1 :green_heart: | javadoc | 1m 10s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | -1 :x: | javadoc | 0m 32s | [/results-javadoc-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5832/21/artifact/out/results-javadoc-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt) | hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) | | +1 :green_heart: | spotbugs | 2m 29s | | the patch passed | | +1 :green_heart: | shadedclient | 20m 52s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 16m 24s | | hadoop-common in the patch passed. 
| | -1 :x: | unit | 2m 33s | [/patch-unit-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5832/21/artifact/out/patch-unit-hadoop-tools_hadoop-aws.txt) | hadoop-aws in the patch failed. | | +1 :green_heart: | asflicense | 0m 42s | | The patch does not generate ASF License warnings. | | | | 159m 14s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.fs.s3a.prefetch.TestS3ACachingBlockManager | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.45 ServerAPI=1.45 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5832/21/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/5832 | | Optional Tests | dupname asflicense
[jira] [Commented] (HADOOP-19128) Unified use of placeholder for log calling
[ https://issues.apache.org/jira/browse/HADOOP-19128?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840580#comment-17840580 ] ASF GitHub Bot commented on HADOOP-19128: - hadoop-yetus commented on PR #6680: URL: https://github.com/apache/hadoop/pull/6680#issuecomment-2075863421 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 12s | | No case conflicting files found. | | +0 :ok: | spotbugs | 0m 01s | | spotbugs executables are not available. | | +0 :ok: | codespell | 0m 01s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 01s | | detect-secrets was not available. | | +0 :ok: | markdownlint | 0m 01s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 00s | | The patch appears to include 9 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 2m 24s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 89m 44s | | trunk passed | | +1 :green_heart: | compile | 40m 14s | | trunk passed | | +1 :green_heart: | checkstyle | 6m 09s | | trunk passed | | -1 :x: | mvnsite | 4m 29s | [/branch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6680/1/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in trunk failed. | | +1 :green_heart: | javadoc | 74m 41s | | trunk passed | | +1 :green_heart: | shadedclient | 325m 01s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 2m 17s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 73m 34s | | the patch passed | | +1 :green_heart: | compile | 40m 26s | | the patch passed | | +1 :green_heart: | javac | 40m 26s | | the patch passed | | +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 7m 08s | | the patch passed | | -1 :x: | mvnsite | 4m 31s | [/patch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6680/1/artifact/out/patch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | +1 :green_heart: | javadoc | 73m 24s | | the patch passed | | +1 :green_heart: | shadedclient | 336m 00s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 5m 47s | | The patch does not generate ASF License warnings. | | | | 924m 09s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6680 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets markdownlint | | uname | MINGW64_NT-10.0-17763 100c2ae9c020 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / 68e98b00ea98c640a3a7ffcf07d10300e1812ad0 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6680/1/testReport/ | | modules | C: hadoop-common-project/hadoop-common hadoop-common-project/hadoop-kms hadoop-hdfs-project/hadoop-hdfs-client hadoop-hdfs-project/hadoop-hdfs hadoop-hdfs-project/hadoop-hdfs-native-client hadoop-hdfs-project/hadoop-hdfs-nfs 
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-applicationhistoryservice hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient hadoop-tools/hadoop-distcp hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-sharedcachemanager hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/hadoop-yarn-applications-distributedshell hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/hadoop-yarn-applications-unmanaged-am-launcher hadoop-yarn-project/hadoop-yarn/hadoop-yarn-site hadoop-mapreduce-project/hadoop-mapreduce-examples hadoop-tools/hadoop-streaming U: . | | Console output |
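For context on what HADOOP-19128 ("Unified use of placeholder for log calling") standardizes: SLF4J-style `{}` placeholders defer message formatting until a log statement is actually emitted, unlike string concatenation, which always builds the message. The tiny formatter below is a self-contained stand-in for the SLF4J substitution behavior (the real code simply calls `LOG.info("...", args)`); `PlaceholderDemo` and its `format` helper are illustrative names, not Hadoop APIs.

```java
// Sketch of the "{}" placeholder pattern unified by HADOOP-19128.
// format() is a minimal stand-in for SLF4J's message substitution.
public class PlaceholderDemo {
    // Substitute each "{}" in the template with the next argument, in order.
    static String format(String template, Object... args) {
        StringBuilder out = new StringBuilder();
        int from = 0;
        for (Object arg : args) {
            int at = template.indexOf("{}", from);
            if (at < 0) {
                break; // more arguments than placeholders; ignore the extras
            }
            out.append(template, from, at).append(arg);
            from = at + 2;
        }
        return out.append(template.substring(from)).toString();
    }

    public static void main(String[] args) {
        // Discouraged: concatenation builds the string even when the level is off.
        //   LOG.debug("Read " + bytes + " bytes from " + path);
        // Preferred: placeholders, formatted only if the message is emitted.
        System.out.println(format("Read {} bytes from {}", 4096, "/tmp/f"));
    }
}
```

The performance point is that with a disabled log level, the placeholder form skips formatting entirely, which is why such cleanups are applied tree-wide.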
[jira] [Commented] (HADOOP-19013) fs.getXattrs(path) for S3FS doesn't have x-amz-server-side-encryption-aws-kms-key-id header.
[ https://issues.apache.org/jira/browse/HADOOP-19013?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840579#comment-17840579 ] ASF GitHub Bot commented on HADOOP-19013: - hadoop-yetus commented on PR #6646: URL: https://github.com/apache/hadoop/pull/6646#issuecomment-2075856297 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 01s | | No case conflicting files found. | | +0 :ok: | spotbugs | 0m 01s | | spotbugs executables are not available. | | +0 :ok: | codespell | 0m 01s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 01s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 00s | | The patch appears to include 3 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 94m 18s | | trunk passed | | +1 :green_heart: | compile | 5m 05s | | trunk passed | | +1 :green_heart: | checkstyle | 4m 27s | | trunk passed | | +1 :green_heart: | mvnsite | 4m 58s | | trunk passed | | +1 :green_heart: | javadoc | 4m 35s | | trunk passed | | +1 :green_heart: | shadedclient | 147m 03s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 2m 49s | | the patch passed | | +1 :green_heart: | compile | 2m 13s | | the patch passed | | +1 :green_heart: | javac | 2m 12s | | the patch passed | | +1 :green_heart: | blanks | 0m 01s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 59s | | the patch passed | | +1 :green_heart: | mvnsite | 2m 22s | | the patch passed | | +1 :green_heart: | javadoc | 2m 08s | | the patch passed | | +1 :green_heart: | shadedclient | 157m 13s | | patch has no errors when building and testing our client artifacts. 
| _ Other Tests _ | | +1 :green_heart: | asflicense | 5m 34s | | The patch does not generate ASF License warnings. | | | | 421m 47s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6646 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | MINGW64_NT-10.0-17763 56bb0c536f33 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / 786d2f8bb310cb16348a6e370d39a4ce725fc6e2 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6646/1/testReport/ | | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6646/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > fs.getXattrs(path) for S3FS doesn't have > x-amz-server-side-encryption-aws-kms-key-id header. > > > Key: HADOOP-19013 > URL: https://issues.apache.org/jira/browse/HADOOP-19013 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.3.6 >Reporter: Mukund Thakur >Assignee: Mukund Thakur >Priority: Major > Labels: pull-request-available > > Once a path has been uploaded encrypted with SSE-KMS using a key id, and > later we read the attributes of the same file, they don't > contain the key id information as an attribute. Should we add it? -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-19074) Transitive dependencies with CVEs in Hadoop distro
[ https://issues.apache.org/jira/browse/HADOOP-19074?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840569#comment-17840569 ] ASF GitHub Bot commented on HADOOP-19074: - hadoop-yetus commented on PR #6586: URL: https://github.com/apache/hadoop/pull/6586#issuecomment-2075787815 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | -1 :x: | patch | 0m 54s | | https://github.com/apache/hadoop/pull/6586 does not apply to trunk. Rebase required? Wrong Branch? See https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute for help. | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6586 | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6586/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > Transitive dependencies with CVEs in Hadoop distro > -- > > Key: HADOOP-19074 > URL: https://issues.apache.org/jira/browse/HADOOP-19074 > Project: Hadoop Common > Issue Type: Improvement > Components: build >Affects Versions: 3.4.0 >Reporter: Prathap Sagar S >Priority: Major > Labels: pull-request-available > Attachments: HADOOP_CVE_LIST.xlsx > > > Our ongoing security scans are turning up several long-standing CVEs, even in > the most recent version of Hadoop, which is making it difficult for us to use > Hadoop in our ecosystem. A comprehensive list of all the long-standing CVEs > and the JARs holding them is attached. I'm asking for community assistance to > address these high-risk vulnerabilities as soon as possible. 
> > |Vulnerability ID|Severity|Package name|Package version|Package type|Package > path|Package suggested fix| > |CVE-2023-2976|High|com.google.guava:guava|30.1.1-jre|java|/hadoop-3.4.0/share/hadoop/common/lib/hadoop-shaded-guava-1.1.1.jar|v32.0.0-android| > |CVE-2023-2976|High|com.google.guava:guava|30.1.1-jre|java|/hadoop-3.4.0/share/hadoop/client/hadoop-client-runtime-3.4.0-SNAPSHOT.jar|v32.0.0-android| > |CVE-2023-2976|High|com.google.guava:guava|12.0.1|java|/hadoop-3.4.0/share/hadoop/yarn/timelineservice/lib/guava-12.0.1.jar|v32.0.0-android| > |CVE-2023-2976|High|com.google.guava:guava|27.0-jre|java|/hadoop-3.4.0/share/hadoop/hdfs/lib/guava-27.0-jre.jar|v32.0.0-android| > |CVE-2023-2976|High|com.google.guava:guava|27.0-jre|java|/hadoop-3.4.0/share/hadoop/common/lib/guava-27.0-jre.jar|v32.0.0-android| > |CVE-2023-2976|High|com.google.guava:guava|30.1.1-jre|java|/hadoop-3.4.0/share/hadoop/hdfs/lib/hadoop-shaded-guava-1.1.1.jar|v32.0.0-android| > |CVE-2022-25647|High|com.google.code.gson:gson|2.8.5|java|/hadoop-3.4.0/share/hadoop/yarn/timelineservice/lib/hbase-shaded-gson-3.0.0.jar|v2.8.9| > |CVE-2022-3171|High|com.google.protobuf:protobuf-java|3.7.1|java|/hadoop-3.4.0/share/hadoop/client/hadoop-client-runtime-3.4.0-SNAPSHOT.jar|v3.16.3| > |CVE-2022-3171|High|com.google.protobuf:protobuf-java|2.5.0|java|/hadoop-3.4.0/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar|v3.16.3| > |CVE-2022-3171|High|com.google.protobuf:protobuf-java|3.7.1|java|/hadoop-3.4.0/share/hadoop/common/lib/hadoop-shaded-guava-1.1.1.jar|v3.16.3| > |CVE-2022-3171|High|com.google.protobuf:protobuf-java|3.7.1|java|/hadoop-3.4.0/share/hadoop/common/lib/hadoop-shaded-protobuf_3_7-1.1.1.jar|v3.16.3| > |CVE-2022-3509|High|com.google.protobuf:protobuf-java|2.5.0|java|/hadoop-3.4.0/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar|v3.16.3| > |CVE-2022-3509|High|com.google.protobuf:protobuf-java|3.7.1|java|/hadoop-3.4.0/share/hadoop/client/hadoop-client-runtime-3.4.0-SNAPSHOT.jar|v3.16.3| > 
|CVE-2022-3509|High|com.google.protobuf:protobuf-java|3.7.1|java|/hadoop-3.4.0/share/hadoop/hdfs/lib/hadoop-shaded-protobuf_3_7-1.1.1.jar|v3.16.3| > |CVE-2022-3509|High|com.google.protobuf:protobuf-java|3.7.1|java|/hadoop-3.4.0/share/hadoop/common/lib/hadoop-shaded-protobuf_3_7-1.1.1.jar|v3.16.3| > |CVE-2022-3510|High|com.google.protobuf:protobuf-java|3.7.1|java|/hadoop-3.4.0/share/hadoop/hdfs/lib/hadoop-shaded-protobuf_3_7-1.1.1.jar|v3.16.3| > |CVE-2022-3510|High|com.google.protobuf:protobuf-java|3.7.1|java|/hadoop-3.4.0/share/hadoop/common/lib/hadoop-shaded-protobuf_3_7-1.1.1.jar|v3.16.3| > |CVE-2022-3510|High|com.google.protobuf:protobuf-java|3.7.1|java|/hadoop-3.4.0/share/hadoop/client/hadoop-client-runtime-3.4.0-SNAPSHOT.jar|v3.16.3| > |CVE-2022-3510|High|com.google.protobuf:protobuf-java|2.5.0|java|/hadoop-3.4.0/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar|v3.16.3| >
[jira] [Commented] (HADOOP-19152) Do not hard code security providers.
[ https://issues.apache.org/jira/browse/HADOOP-19152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840562#comment-17840562 ] ASF GitHub Bot commented on HADOOP-19152: - szetszwo commented on code in PR #6739: URL: https://github.com/apache/hadoop/pull/6739#discussion_r1578474234 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/crypto/CryptoUtils.java: ## @@ -0,0 +1,81 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.hadoop.crypto; + +import org.apache.hadoop.classification.InterfaceAudience; +import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.fs.CommonConfigurationKeysPublic; +import org.apache.hadoop.fs.store.LogExactlyOnce; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.lang.reflect.Field; +import java.security.Provider; +import java.security.Security; + +/** Utility methods for the crypto related features. 
*/ +@InterfaceAudience.Private +public class CryptoUtils { + static final Logger LOG = LoggerFactory.getLogger(CryptoUtils.class); + private static final LogExactlyOnce LOG_FAILED_TO_LOAD_CLASS = new LogExactlyOnce(LOG); + private static final LogExactlyOnce LOG_FAILED_TO_GET_FIELD = new LogExactlyOnce(LOG); + private static final LogExactlyOnce LOG_FAILED_TO_ADD_PROVIDER = new LogExactlyOnce(LOG); + + private static final String BOUNCY_CASTLE_PROVIDER_CLASS + = "org.bouncycastle.jce.provider.BouncyCastleProvider"; + private static final String PROVIDER_NAME_FIELD = "PROVIDER_NAME"; + + /** + * Get the security provider value specified in + * {@link CommonConfigurationKeysPublic#HADOOP_SECURITY_CRYPTO_JCE_PROVIDER_KEY} + * from the given conf. + * + * @param conf the configuration + * @return the configured provider, if there is any; otherwise, return an empty string. + */ + public static String getJceProvider(Configuration conf) { +final String provider = conf.getTrimmed( +CommonConfigurationKeysPublic.HADOOP_SECURITY_CRYPTO_JCE_PROVIDER_KEY, ""); +final boolean autoAdd = conf.getBoolean( + CommonConfigurationKeysPublic.HADOOP_SECURITY_CRYPTO_JCE_PROVIDER_AUTO_ADD_KEY, + CommonConfigurationKeysPublic.HADOOP_SECURITY_CRYPTO_JCE_PROVIDER_AUTO_ADD_DEFAULT); + +// For backward compatible, auto-add BOUNCY_CASTLE_PROVIDER_CLASS. +if (autoAdd && !provider.isEmpty()) { + try { +// Use reflection in order to avoid statically loading the class. +final Class clazz = Class.forName(BOUNCY_CASTLE_PROVIDER_CLASS); Review Comment: Sure, checking "BC" sounds good. > Do not hard code security providers. > > > Key: HADOOP-19152 > URL: https://issues.apache.org/jira/browse/HADOOP-19152 > Project: Hadoop Common > Issue Type: Improvement > Components: security >Reporter: Tsz-wo Sze >Assignee: Tsz-wo Sze >Priority: Major > Labels: pull-request-available > > In order to support different security providers in different clusters, we > should not hard code a provider in our code. 
-- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
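The review thread above settles on reflection so that hadoop-common carries no static dependency on BouncyCastle. A minimal self-contained sketch of that technique follows: resolve a JCE provider class by name at runtime and fall back gracefully when it is absent. `JceProviderLoader` and `addProviderIfAvailable` are illustrative names for this sketch, not the names in the actual patch, though the BouncyCastle class constant mirrors the quoted code.

```java
import java.security.Provider;
import java.security.Security;

// Sketch of reflection-based JCE provider loading (per HADOOP-19152):
// no compile-time or static-initialization dependency on the provider class.
public class JceProviderLoader {
    static final String BOUNCY_CASTLE_PROVIDER_CLASS =
        "org.bouncycastle.jce.provider.BouncyCastleProvider";

    /**
     * Try to instantiate and register the named JCE provider.
     * @return the provider name if registered; null if the class is absent
     *         or cannot be instantiated (callers fall back to JCE defaults).
     */
    static String addProviderIfAvailable(String className) {
        try {
            // Class.forName defers loading to runtime, so the jar is optional.
            final Class<?> clazz = Class.forName(className);
            final Provider provider =
                (Provider) clazz.getDeclaredConstructor().newInstance();
            Security.addProvider(provider);
            return provider.getName();
        } catch (ClassNotFoundException e) {
            return null; // provider jar not on the classpath
        } catch (ReflectiveOperationException e) {
            return null; // present but not instantiable; treat the same way
        }
    }

    public static void main(String[] args) {
        String name = addProviderIfAvailable(BOUNCY_CASTLE_PROVIDER_CLASS);
        System.out.println(name == null
            ? "BouncyCastle not available; using default providers"
            : "Registered provider: " + name);
    }
}
```

Whether BouncyCastle is actually registered depends entirely on the runtime classpath, which is exactly the flexibility the issue asks for.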
[jira] [Commented] (HADOOP-19085) Compatibility Benchmark over HCFS Implementations
[ https://issues.apache.org/jira/browse/HADOOP-19085?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840561#comment-17840561 ] ASF GitHub Bot commented on HADOOP-19085: - hadoop-yetus commented on PR #6602: URL: https://github.com/apache/hadoop/pull/6602#issuecomment-2075764416 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | -1 :x: | patch | 1m 04s | | https://github.com/apache/hadoop/pull/6602 does not apply to trunk. Rebase required? Wrong Branch? See https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute for help. | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6602 | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6602/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > Compatibility Benchmark over HCFS Implementations > - > > Key: HADOOP-19085 > URL: https://issues.apache.org/jira/browse/HADOOP-19085 > Project: Hadoop Common > Issue Type: New Feature > Components: fs, test >Affects Versions: 3.4.0 >Reporter: Han Liu >Assignee: Han Liu >Priority: Major > Labels: pull-request-available > Fix For: 3.5.0 > > Attachments: HADOOP-19085.001.patch, HDFS Compatibility Benchmark > Design.pdf > > > {*}Background:{*}Hadoop-Compatible File System (HCFS) is a core concept in the > big data storage ecosystem, providing unified interfaces and generally clear > semantics, and has become the de facto standard for industry storage systems > to follow and conform with. There have been a series of HCFS implementations > in Hadoop, such as S3AFileSystem for Amazon's S3 Object Store, WASB for > Microsoft's Azure Blob Storage and the OSS connector for Alibaba Cloud Object > Storage, and more from storage service providers on their own. 
> {*}Problems:{*}However, as indicated by introduction.md, there is no formal > suite to do compatibility assessment of a file system for all such HCFS > implementations. Thus, whether the functionality is well accomplished and > meets the core compatibility expectations mainly relies on the service provider's > own report. Meanwhile, Hadoop is also developing, and new features are > continuously contributed to HCFS interfaces for existing implementations to > follow and update, in which case Hadoop also needs a tool to quickly assess > whether these features are supported or not for a specific HCFS implementation. > Besides, the known hadoop command line tool or hdfs shell is used to directly > interact with an HCFS storage system, where most commands correspond to > specific HCFS interfaces and work well. Still, there are cases that are > complicated and may not work, like the expunge command. To check such commands > for an HCFS, we also need an approach to figure them out. > {*}Proposal:{*}Accordingly, we propose to define a formal HCFS compatibility > benchmark and provide a corresponding tool to do the compatibility assessment > for an HCFS storage system. The benchmark and tool should consider both HCFS > interfaces and hdfs shell commands. Different scenarios require different > kinds of compatibilities. For such consideration, we could define different > suites in the benchmark. > *Benefits:* We intend the benchmark and tool to be useful for both storage > providers and storage users. For end users, it can be used to evaluate the > compatibility level and determine if the storage system in question is > suitable for the required scenarios. For storage providers, it helps to > quickly generate an objective and reliable report about core functions of > the storage service. As an instance, if the HCFS got a 100% on a suite named > 'tpcds', it is demonstrated that all functions needed by a tpcds program have > been well achieved. 
It is also a guide indicating how storage service > abilities can map to HCFS interfaces, such as storage class on S3. > Any thoughts? Comments and feedback are mostly welcomed. Thanks in advance. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-19152) Do not hard code security providers.
[ https://issues.apache.org/jira/browse/HADOOP-19152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840560#comment-17840560 ] ASF GitHub Bot commented on HADOOP-19152: - szetszwo commented on code in PR #6739: URL: https://github.com/apache/hadoop/pull/6739#discussion_r1578470256 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/crypto/CryptoUtils.java: ## @@ -0,0 +1,71 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.hadoop.crypto; + +import org.apache.hadoop.classification.InterfaceAudience; +import org.apache.hadoop.conf.Configuration; Review Comment: Is there a way to set the import ordering in IDEs such as IntelliJ? We should not manually sort the imports. > Do not hard code security providers. > > > Key: HADOOP-19152 > URL: https://issues.apache.org/jira/browse/HADOOP-19152 > Project: Hadoop Common > Issue Type: Improvement > Components: security >Reporter: Tsz-wo Sze >Assignee: Tsz-wo Sze >Priority: Major > Labels: pull-request-available > > In order to support different security providers in different clusters, we > should not hard code a provider in our code. 
-- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-19134) use StringBuilder instead of StringBuffer
[ https://issues.apache.org/jira/browse/HADOOP-19134?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840554#comment-17840554 ] ASF GitHub Bot commented on HADOOP-19134: - hadoop-yetus commented on PR #6692: URL: https://github.com/apache/hadoop/pull/6692#issuecomment-2075737382 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 22s | | No case conflicting files found. | | +0 :ok: | spotbugs | 0m 01s | | spotbugs executables are not available. | | +0 :ok: | codespell | 0m 01s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 01s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 00s | | The patch appears to include 55 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 3m 06s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 108m 51s | | trunk passed | | +1 :green_heart: | compile | 47m 49s | | trunk passed | | +1 :green_heart: | checkstyle | 7m 17s | | trunk passed | | -1 :x: | mvnsite | 5m 17s | [/branch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6692/1/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in trunk failed. | | +1 :green_heart: | javadoc | 95m 17s | | trunk passed | | +1 :green_heart: | shadedclient | 373m 49s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 2m 48s | | Maven dependency ordering for patch | | -1 :x: | mvninstall | 3m 14s | [/patch-mvninstall-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6692/1/artifact/out/patch-mvninstall-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt) | hadoop-yarn-common in the patch failed. | | -1 :x: | compile | 29m 37s | [/patch-compile-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6692/1/artifact/out/patch-compile-root.txt) | root in the patch failed. | | -1 :x: | javac | 29m 37s | [/patch-compile-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6692/1/artifact/out/patch-compile-root.txt) | root in the patch failed. | | +1 :green_heart: | blanks | 0m 01s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 6m 06s | | the patch passed | | -1 :x: | mvnsite | 4m 06s | [/patch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6692/1/artifact/out/patch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | -1 :x: | mvnsite | 4m 28s | [/patch-mvnsite-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6692/1/artifact/out/patch-mvnsite-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt) | hadoop-yarn-common in the patch failed. | | +1 :green_heart: | javadoc | 74m 51s | | the patch passed | | -1 :x: | shadedclient | 220m 04s | | patch has errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 5m 14s | | The patch does not generate ASF License warnings. 
| | | | 872m 01s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6692 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | MINGW64_NT-10.0-17763 15c8e8eb8429 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / 45ab789535db417d71f833a5655fb49e1e4086f3 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6692/1/testReport/ | | modules | C: hadoop-common-project/hadoop-common hadoop-common-project/hadoop-kms hadoop-hdfs-project/hadoop-hdfs hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager
[jira] [Commented] (HADOOP-19156) ZooKeeper based state stores use different ZK address configs
[ https://issues.apache.org/jira/browse/HADOOP-19156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840519#comment-17840519 ] ASF GitHub Bot commented on HADOOP-19156: - hadoop-yetus commented on PR #6767: URL: https://github.com/apache/hadoop/pull/6767#issuecomment-2075430805 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 50s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 12 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 14s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 35m 6s | | trunk passed | | +1 :green_heart: | compile | 21m 17s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 19m 4s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 5m 0s | | trunk passed | | +1 :green_heart: | mvnsite | 6m 23s | | trunk passed | | +1 :green_heart: | javadoc | 5m 28s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 4m 41s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 11m 37s | | trunk passed | | +1 :green_heart: | shadedclient | 40m 19s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 31s | | Maven dependency ordering for patch | | -1 :x: | mvninstall | 0m 29s | [/patch-mvninstall-hadoop-hdfs-project_hadoop-hdfs-rbf.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6767/1/artifact/out/patch-mvninstall-hadoop-hdfs-project_hadoop-hdfs-rbf.txt) | hadoop-hdfs-rbf in the patch failed. | | -1 :x: | compile | 11m 50s | [/patch-compile-root-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6767/1/artifact/out/patch-compile-root-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1.txt) | root in the patch failed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1. | | -1 :x: | javac | 11m 50s | [/patch-compile-root-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6767/1/artifact/out/patch-compile-root-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1.txt) | root in the patch failed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1. | | -1 :x: | compile | 11m 8s | [/patch-compile-root-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6767/1/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt) | root in the patch failed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06. | | -1 :x: | javac | 11m 8s | [/patch-compile-root-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6767/1/artifact/out/patch-compile-root-jdkPrivateBuild-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06.txt) | root in the patch failed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06. | | -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6767/1/artifact/out/blanks-eol.txt) | The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <>. 
Refer https://git-scm.com/docs/git-apply | | +1 :green_heart: | checkstyle | 4m 26s | | the patch passed | | -1 :x: | mvnsite | 0m 38s | [/patch-mvnsite-hadoop-hdfs-project_hadoop-hdfs-rbf.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6767/1/artifact/out/patch-mvnsite-hadoop-hdfs-project_hadoop-hdfs-rbf.txt) | hadoop-hdfs-rbf in the patch failed. | | -1 :x: | javadoc | 0m 54s | [/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6767/1/artifact/out/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1.txt) |
[jira] [Commented] (HADOOP-19136) Upgrade commons-io to 2.16.1
[ https://issues.apache.org/jira/browse/HADOOP-19136?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840477#comment-17840477 ] ASF GitHub Bot commented on HADOOP-19136: - hadoop-yetus commented on PR #6704: URL: https://github.com/apache/hadoop/pull/6704#issuecomment-2075179751 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 00s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 00s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 00s | | detect-secrets was not available. | | +0 :ok: | shellcheck | 0m 00s | | Shellcheck was not available. | | +0 :ok: | shelldocs | 0m 00s | | Shelldocs was not available. | | +0 :ok: | xmllint | 0m 01s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 00s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 2m 11s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 86m 50s | | trunk passed | | +1 :green_heart: | compile | 37m 47s | | trunk passed | | -1 :x: | mvnsite | 22m 35s | [/branch-mvnsite-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6704/1/artifact/out/branch-mvnsite-root.txt) | root in trunk failed. | | +1 :green_heart: | javadoc | 15m 52s | | trunk passed | | +1 :green_heart: | shadedclient | 306m 04s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 2m 12s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 86m 56s | | the patch passed | | +1 :green_heart: | compile | 39m 40s | | the patch passed | | +1 :green_heart: | javac | 39m 40s | | the patch passed | | +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. | | -1 :x: | mvnsite | 22m 21s | [/patch-mvnsite-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6704/1/artifact/out/patch-mvnsite-root.txt) | root in the patch failed. | | +1 :green_heart: | javadoc | 14m 40s | | the patch passed | | +1 :green_heart: | shadedclient | 183m 35s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 6m 03s | | The patch does not generate ASF License warnings. | | | | 634m 41s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6704 | | Optional Tests | dupname asflicense codespell detsecrets shellcheck shelldocs compile javac javadoc mvninstall mvnsite unit shadedclient xmllint | | uname | MINGW64_NT-10.0-17763 cca86c1ae4b8 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / 6c7dae2c79174c6d64465cd503cf694bb438c6b6 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6704/1/testReport/ | | modules | C: hadoop-project . U: . | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6704/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. 
> Upgrade commons-io to 2.16.1 > > > Key: HADOOP-19136 > URL: https://issues.apache.org/jira/browse/HADOOP-19136 > Project: Hadoop Common > Issue Type: Improvement > Components: common >Affects Versions: 3.4.1 >Reporter: Shilun Fan >Assignee: Shilun Fan >Priority: Major > Labels: pull-request-available > > commons-io can be upgraded from 2.14.0 to 2.16.0, try to upgrade. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-18184) s3a prefetching stream to support unbuffer()
[ https://issues.apache.org/jira/browse/HADOOP-18184?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840473#comment-17840473 ] ASF GitHub Bot commented on HADOOP-18184: - steveloughran commented on PR #5832: URL: https://github.com/apache/hadoop/pull/5832#issuecomment-2075157859 There's some race condition with the list add/evict causing intermittent failures of one of the tests. Looks like the failure condition is * block has just been evicted * read() says block is in cache * read is attempted * read fails with FNFE. Suspect there's some kind of list update issue. Improved logging but not yet fixed this. ``` 2024-04-24 15:57:22,711 [setup] DEBUG s3a.S3AFileSystem (S3AFileSystem.java:initializeClass(5724)) - Initialize S3A class 2024-04-24 15:57:22,734 [setup] DEBUG s3a.S3ATestUtils (S3ATestUtils.java:removeBucketOverrides(914)) - Removing option fs.s3a.bucket.stevel-london.directory.marker.retention; was keep 2024-04-24 15:57:22,757 [setup] DEBUG s3a.S3AFileSystem (S3AFileSystem.java:initializeClass(5724)) - Initialize S3A class 2024-04-24 15:57:22,771 [setup] DEBUG s3a.S3AFileSystem (S3AFileSystem.java:initialize(549)) - Initializing S3AFileSystem for stevel-london 2024-04-24 15:57:22,774 [setup] DEBUG s3a.S3AUtils (S3AUtils.java:propagateBucketOptions(1103)) - Propagating entries under fs.s3a.bucket.stevel-london. 
2024-04-24 15:57:22,777 [setup] DEBUG s3a.S3AUtils (S3AUtils.java:propagateBucketOptions(1124)) - Updating fs.s3a.versioned.store from [core-site.xml] 2024-04-24 15:57:22,777 [setup] DEBUG s3a.S3AUtils (S3AUtils.java:propagateBucketOptions(1124)) - Updating fs.s3a.encryption.algorithm from [core-site.xml] 2024-04-24 15:57:22,777 [setup] DEBUG s3a.S3AUtils (S3AUtils.java:propagateBucketOptions(1124)) - Updating fs.s3a.endpoint from [core-site.xml] 2024-04-24 15:57:22,777 [setup] DEBUG s3a.S3AUtils (S3AUtils.java:propagateBucketOptions(1124)) - Updating fs.s3a.encryption.key from [core-site.xml] 2024-04-24 15:57:22,777 [setup] DEBUG s3a.S3AUtils (S3AUtils.java:propagateBucketOptions(1124)) - Updating fs.s3a.change.detection.source from [core-site.xml] 2024-04-24 15:57:22,778 [setup] DEBUG s3a.S3AUtils (S3AUtils.java:maybeIsolateClassloader(1708)) - Configuration classloader set to S3AFileSystem classloader: sun.misc.Launcher$AppClassLoader@18b4aac2 2024-04-24 15:57:22,783 [setup] DEBUG s3a.S3AUtils (S3AUtils.java:buildEncryptionSecrets(1477)) - Using SSE-KMS with key of length 75 ending with 3 2024-04-24 15:57:22,784 [setup] DEBUG s3a.S3ARetryPolicy (S3ARetryPolicy.java:(145)) - Retrying on recoverable AWS failures 3 times with an initial interval of 500ms 2024-04-24 15:57:22,911 [setup] DEBUG s3a.S3AInstrumentation (S3AInstrumentation.java:getMetricsSystem(254)) - Metrics system inited org.apache.hadoop.metrics2.impl.MetricsSystemImpl@6425ab69 2024-04-24 15:57:22,917 [setup] DEBUG s3a.S3AFileSystem (S3AFileSystem.java:initialize(605)) - Client Side Encryption enabled: false 2024-04-24 15:57:22,917 [setup] DEBUG s3a.S3AUtils (S3AUtils.java:intOption(909)) - Value of fs.s3a.paging.maximum is 5000 2024-04-24 15:57:22,917 [setup] DEBUG s3a.S3AUtils (S3AUtils.java:longBytesOption(952)) - Value of fs.s3a.block.size is 33554432 2024-04-24 15:57:22,918 [setup] DEBUG s3a.S3AUtils (S3AUtils.java:longBytesOption(952)) - Value of fs.s3a.prefetch.block.size is 131072 2024-04-24 
15:57:22,918 [setup] DEBUG s3a.S3AUtils (S3AUtils.java:intOption(909)) - Value of fs.s3a.prefetch.block.count is 8 2024-04-24 15:57:22,918 [setup] DEBUG s3a.S3AUtils (S3AUtils.java:intOption(909)) - Value of fs.s3a.max.total.tasks is 32 2024-04-24 15:57:22,920 [setup] DEBUG impl.ConfigurationHelper (ConfigurationHelper.java:getDuration(80)) - Duration of fs.s3a.threads.keepalivetime = PT1M 2024-04-24 15:57:22,920 [setup] DEBUG s3a.S3AUtils (S3AUtils.java:intOption(909)) - Value of fs.s3a.executor.capacity is 16 2024-04-24 15:57:22,937 [setup] DEBUG auth.SignerManager (SignerManager.java:initCustomSigners(68)) - No custom signers specified 2024-04-24 15:57:22,940 [setup] DEBUG audit.AuditIntegration (AuditIntegration.java:createAndInitAuditor(109)) - Auditor class is class org.apache.hadoop.fs.s3a.audit.impl.LoggingAuditor 2024-04-24 15:57:22,943 [setup] DEBUG impl.ActiveAuditManagerS3A (ActiveAuditManagerS3A.java:serviceInit(199)) - Audit manager initialized with audit service LoggingAuditor{ID='0d643328-91f6-4da7-acae-86fd72161299', headerEnabled=true, rejectOutOfSpan=true, isMultipartUploadEnabled=true} 2024-04-24 15:57:22,943 [setup] DEBUG impl.ActiveAuditManagerS3A (ActiveAuditManagerS3A.java:serviceStart(212)) - Started audit service LoggingAuditor{ID='0d643328-91f6-4da7-acae-86fd72161299', headerEnabled=true, rejectOutOfSpan=true, isMultipartUploadEnabled=true} 2024-04-24 15:57:22,943 [setup] DEBUG
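The evict-then-read failure described in the comment above is a check-then-act race: the cache membership check and the subsequent read are not atomic, so an eviction can land between them. A minimal, deterministic Java sketch of that interleaving (the latches force the ordering; the map and all names here are illustrative, not the actual S3A prefetching code):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class CheckThenActRace {
    public static void main(String[] args) throws Exception {
        Map<Integer, byte[]> cache = new ConcurrentHashMap<>();
        cache.put(1, new byte[16]);                    // block 1 is cached
        CountDownLatch checked = new CountDownLatch(1);
        CountDownLatch evicted = new CountDownLatch(1);
        ExecutorService pool = Executors.newSingleThreadExecutor();

        Future<String> reader = pool.submit(() -> {
            // 1. read() sees the block listed in the cache...
            boolean inCache = cache.containsKey(1);
            checked.countDown();
            evicted.await();                           // eviction fires in this window
            // 2. ...but the actual read now misses, like the FNFE in the test
            byte[] data = cache.get(1);
            return inCache && data == null ? "stale-hit" : "ok";
        });

        checked.await();
        cache.remove(1);                               // eviction between check and read
        evicted.countDown();
        System.out.println(reader.get());              // prints "stale-hit"
        pool.shutdown();
    }
}
```

One common remedy is to treat the failed read as a cache miss and fall back to a remote fetch; whether that is the right fix here is what the PR discussion is still working out.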
[jira] [Commented] (HADOOP-18958) UserGroupInformation debug log improve
[ https://issues.apache.org/jira/browse/HADOOP-18958?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840421#comment-17840421 ] ASF GitHub Bot commented on HADOOP-18958: - hiwangzhihui commented on PR #6255: URL: https://github.com/apache/hadoop/pull/6255#issuecomment-2074872335 @steveloughran Thank you for your review! I agree with your view. The message "java.lang.Exception" can confuse troubleshooting by suggesting that an exception has actually occurred. The stack-trace printing needs to be adjusted. > UserGroupInformation debug log improve > -- > > Key: HADOOP-18958 > URL: https://issues.apache.org/jira/browse/HADOOP-18958 > Project: Hadoop Common > Issue Type: Improvement > Components: common >Affects Versions: 3.3.0, 3.3.5 >Reporter: wangzhihui >Priority: Minor > Labels: pull-request-available > Attachments: 20231029-122825-1.jpeg, 20231029-122825.jpeg, > 20231030-143525.jpeg, image-2023-10-29-09-47-56-489.png, > image-2023-10-30-14-35-11-161.png > > Original Estimate: 1h > Remaining Estimate: 1h > > Using "new Exception()" to print the call stack of the doAs method in > the UserGroupInformation class prints a meaningless exception message and > too many call-stack frames, which is not conducive to > troubleshooting. > *example:* > !20231029-122825.jpeg|width=991,height=548! > > *improved result*: > > !image-2023-10-29-09-47-56-489.png|width=1099,height=156! > !20231030-143525.jpeg|width=572,height=674!
[jira] [Commented] (HADOOP-19156) ZooKeeper based state stores use different ZK address configs
[ https://issues.apache.org/jira/browse/HADOOP-19156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840372#comment-17840372 ] ASF GitHub Bot commented on HADOOP-19156: - liubin101 opened a new pull request, #6767: URL: https://github.com/apache/hadoop/pull/6767 ### Description of PR Currently, the Zookeeper-based state stores of RM, YARN Federation, and HDFS Federation use the same ZK address config `hadoop.zk.address`. But in our production environment, we hope that different services can use different ZKs to avoid mutual influence. This jira adds separate ZK address configs for each service. ### How was this patch tested? unit test ### For code changes: - [ ] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files? > ZooKeeper based state stores use different ZK address configs > - > > Key: HADOOP-19156 > URL: https://issues.apache.org/jira/browse/HADOOP-19156 > Project: Hadoop Common > Issue Type: Improvement >Reporter: liu bin >Priority: Major > > Currently, the Zookeeper-based state stores of RM, YARN Federation, and HDFS > Federation use the same ZK address config `{{{}hadoop.zk.address`{}}}. But in > our production environment, we hope that different services can use different > ZKs to avoid mutual influence. > This jira adds separate ZK address configs for each service. 
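To make the proposal concrete, a hypothetical configuration sketch of per-service overrides on top of the shared default. The per-service property names below are placeholders for illustration only; the actual keys are whatever the patch defines:

```xml
<!-- Shared default, as today -->
<property>
  <name>hadoop.zk.address</name>
  <value>zk-shared-1:2181,zk-shared-2:2181</value>
</property>

<!-- Hypothetical per-service override: a service-specific key that,
     when set, takes precedence over hadoop.zk.address for that service -->
<property>
  <name>example.rm.state-store.zk.address</name>
  <value>zk-rm-1:2181,zk-rm-2:2181</value>
</property>
```

With a fallback scheme like this, clusters that are happy with one ZooKeeper ensemble keep the single shared config, while larger deployments can isolate the RM, YARN Federation, and HDFS Federation state stores from each other.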
[jira] [Updated] (HADOOP-19156) ZooKeeper based state stores use different ZK address configs
[ https://issues.apache.org/jira/browse/HADOOP-19156?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated HADOOP-19156: Labels: pull-request-available (was: ) > ZooKeeper based state stores use different ZK address configs > - > > Key: HADOOP-19156 > URL: https://issues.apache.org/jira/browse/HADOOP-19156 > Project: Hadoop Common > Issue Type: Improvement >Reporter: liu bin >Priority: Major > Labels: pull-request-available > > Currently, the ZooKeeper-based state stores of RM, YARN Federation, and HDFS > Federation use the same ZK address config {{hadoop.zk.address}}. But in > our production environment, we hope that different services can use different > ZKs to avoid mutual influence. > This jira adds separate ZK address configs for each service.
[jira] [Updated] (HADOOP-19155) Fix TestZKSignerSecretProvider failing unit test
[ https://issues.apache.org/jira/browse/HADOOP-19155?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated HADOOP-19155: Labels: pull-request-available (was: ) > Fix TestZKSignerSecretProvider failing unit test > > > Key: HADOOP-19155 > URL: https://issues.apache.org/jira/browse/HADOOP-19155 > Project: Hadoop Common > Issue Type: Test > Components: auth >Affects Versions: 3.4.0 >Reporter: kuper >Priority: Minor > Labels: pull-request-available > Attachments: 企业微信截图_4436de68-18c5-43bf-9382-4d9a853f7ef0.png, > 企业微信截图_ab901a4a-c0d4-4a20-a595-057cf648c30c.png, > 企业微信截图_fa5e7d54-b3a8-4ca3-8d4a-25fe493b4eb1.png > > > * The TestZKSignerSecretProvider and TestRandomSignerSecretProvider unit > tests fail occasionally. > * The cause is that the rollSecret method of the MockZKSignerSecretProvider > class is {{synchronized}}. > * Sometimes the verify(secretProvider, timeout(timeout).atLeastOnce()).rollSecret() > call has to wait because the RolloverSignerSecretProvider scheduler thread > takes the lock first, and the verification times out.
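The locking interaction in the last bullet can be reproduced with plain `java.util.concurrent`: when one thread is inside a `synchronized` method, a second caller blocks on the monitor until its deadline expires, which is exactly how a `verify(..., timeout(...))` check runs out of time. A self-contained sketch (all names are illustrative, not the real test code):

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class SynchronizedTimeoutDemo {
    static final CountDownLatch lockHeld = new CountDownLatch(1);

    // Stand-in for MockZKSignerSecretProvider.rollSecret(): synchronized,
    // so only one thread may be inside at a time.
    static synchronized void rollSecret(long holdMillis) {
        lockHeld.countDown();                            // signal: lock acquired
        try { Thread.sleep(holdMillis); } catch (InterruptedException ignored) { }
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        pool.submit(() -> rollSecret(2000));             // scheduler thread holds the lock
        lockHeld.await();                                // wait until it is really inside

        // The "verify" caller now needs the same monitor but gives up after
        // 500 ms, just like verify(secretProvider, timeout(500)) in the test.
        Future<?> verify = pool.submit(() -> rollSecret(0));
        String result;
        try {
            verify.get(500, TimeUnit.MILLISECONDS);
            result = "verified";
        } catch (TimeoutException e) {
            result = "timed out waiting for lock";
        }
        System.out.println(result);                      // prints "timed out waiting for lock"
        pool.shutdownNow();
    }
}
```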
[jira] [Commented] (HADOOP-19155) Fix TestZKSignerSecretProvider failing unit test
[ https://issues.apache.org/jira/browse/HADOOP-19155?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840289#comment-17840289 ] ASF GitHub Bot commented on HADOOP-19155: - hadoop-yetus commented on PR #6766: URL: https://github.com/apache/hadoop/pull/6766#issuecomment-2074049822 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 19s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 33m 21s | | trunk passed | | +1 :green_heart: | compile | 9m 3s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 8m 10s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 0m 27s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 31s | | trunk passed | | +1 :green_heart: | javadoc | 0m 31s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 0m 29s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 0m 43s | | trunk passed | | +1 :green_heart: | shadedclient | 20m 30s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 13s | | the patch passed | | +1 :green_heart: | compile | 8m 39s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 8m 39s | | the patch passed | | +1 :green_heart: | compile | 8m 13s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 8m 13s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 22s | | the patch passed | | +1 :green_heart: | mvnsite | 0m 29s | | the patch passed | | +1 :green_heart: | javadoc | 0m 28s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 0m 29s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 0m 47s | | the patch passed | | +1 :green_heart: | shadedclient | 20m 36s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 2m 58s | | hadoop-auth in the patch passed. | | +1 :green_heart: | asflicense | 0m 42s | | The patch does not generate ASF License warnings. 
| | | | 122m 43s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.45 ServerAPI=1.45 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6766/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/6766 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 1220939ff8ec 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / b314e098ce8b4dba13d525af7d63f164fc544da8 | | Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6766/1/testReport/ | | Max. process+thread count | 551 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-auth U: hadoop-common-project/hadoop-auth | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6766/1/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > Fix TestZKSignerSecretProvider
[jira] [Commented] (HADOOP-19145) Software Architecture Document
[ https://issues.apache.org/jira/browse/HADOOP-19145?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840288#comment-17840288 ] ASF GitHub Bot commented on HADOOP-19145: - hadoop-yetus commented on PR #6712: URL: https://github.com/apache/hadoop/pull/6712#issuecomment-2074047351 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 02s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 02s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 02s | | detect-secrets was not available. | | +0 :ok: | shellcheck | 0m 02s | | Shellcheck was not available. | | +0 :ok: | shelldocs | 0m 02s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | _ trunk Compile Tests _ | | +1 :green_heart: | shadedclient | 143m 23s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | -1 :x: | blanks | 0m 00s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6712/1/artifact/out/blanks-eol.txt) | The patch has 15 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply | | -1 :x: | blanks | 0m 00s | [/blanks-tabs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6712/1/artifact/out/blanks-tabs.txt) | The patch 207 line(s) with tabs. | | +1 :green_heart: | shadedclient | 144m 04s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 6m 04s | | The patch does not generate ASF License warnings. 
| | | | 303m 53s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6712 | | Optional Tests | dupname asflicense codespell detsecrets shellcheck shelldocs | | uname | MINGW64_NT-10.0-17763 bfa26e3136b3 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / b1c2ce36766ded323b1d581936976160d0a1ce32 | | modules | C: . U: . | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6712/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > Software Architecture Document > -- > > Key: HADOOP-19145 > URL: https://issues.apache.org/jira/browse/HADOOP-19145 > Project: Hadoop Common > Issue Type: Improvement > Components: documentation >Reporter: Levon Khorasandzhian >Priority: Major > Labels: architecture, docuentation, pull-request-available, > software-engineering > Attachments: Apache_Hadoop_SAD.pdf > > Original Estimate: 3h > Remaining Estimate: 3h > > We (GitHub @lkhorasandzhian & @vacherkasskiy) have prepared features for > documentation. This attached Software Architecture Document is very useful > for new contributors and developers to get acquainted with enormous system in > a short time. Currently it's only in Russian, but if you're interested in > such files we can translate it in English. > There are no changes in code, only adding new documentation files. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-19146) noaa-cors-pds bucket access with global endpoint fails
[ https://issues.apache.org/jira/browse/HADOOP-19146?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840287#comment-17840287 ] ASF GitHub Bot commented on HADOOP-19146: - hadoop-yetus commented on PR #6723: URL: https://github.com/apache/hadoop/pull/6723#issuecomment-2074047003 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 01s | | No case conflicting files found. | | +0 :ok: | spotbugs | 0m 01s | | spotbugs executables are not available. | | +0 :ok: | codespell | 0m 01s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 01s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 00s | | The patch appears to include 8 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 109m 14s | | trunk passed | | +1 :green_heart: | compile | 5m 54s | | trunk passed | | +1 :green_heart: | checkstyle | 5m 30s | | trunk passed | | +1 :green_heart: | mvnsite | 5m 57s | | trunk passed | | +1 :green_heart: | javadoc | 5m 43s | | trunk passed | | +1 :green_heart: | shadedclient | 175m 11s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 4m 18s | | the patch passed | | +1 :green_heart: | compile | 2m 40s | | the patch passed | | +1 :green_heart: | javac | 2m 40s | | the patch passed | | +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 2m 28s | | the patch passed | | +1 :green_heart: | mvnsite | 2m 52s | | the patch passed | | +1 :green_heart: | javadoc | 2m 37s | | the patch passed | | +1 :green_heart: | shadedclient | 190m 48s | | patch has no errors when building and testing our client artifacts. 
| _ Other Tests _ | | +1 :green_heart: | asflicense | 8m 28s | | The patch does not generate ASF License warnings. | | | | 505m 38s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6723 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | MINGW64_NT-10.0-17763 bbcbad86ec94 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / 60bd00b49f12f28f2033ffe0a0946be73f29ecfa | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6723/1/testReport/ | | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6723/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > noaa-cors-pds bucket access with global endpoint fails > -- > > Key: HADOOP-19146 > URL: https://issues.apache.org/jira/browse/HADOOP-19146 > Project: Hadoop Common > Issue Type: Improvement > Components: fs/s3, test >Affects Versions: 3.4.0 >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Major > Labels: pull-request-available > > All tests accessing noaa-cors-pds use us-east-1 region, as configured at > bucket level. If global endpoint is configured (e.g. us-west-2), they fail to > access to bucket. > > Sample error: > {code:java} > org.apache.hadoop.fs.s3a.AWSRedirectException: Received permanent redirect > response to region [us-east-1]. 
This likely indicates that the S3 region > configured in fs.s3a.endpoint.region does not match the AWS region containing > the bucket.: null (Service: S3, Status Code: 301, Request ID: > PMRWMQC9S91CNEJR, Extended Request ID: > 6Xrg9thLiZXffBM9rbSCRgBqwTxdLAzm6OzWk9qYJz1kGex3TVfdiMtqJ+G4vaYCyjkqL8cteKI/NuPBQu5A0Q==) > at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:253) > at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:155) > at > org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:4041) > at >
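The redirect above is the classic S3A symptom of a region mismatch. The usual remedy is a per-bucket override that pins the test bucket to its real region while the global default (e.g. us-west-2) stays untouched. A hedged sketch using S3A's standard `fs.s3a.bucket.NAME.option` per-bucket configuration pattern:

```xml
<!-- Hedged sketch: pin noaa-cors-pds to us-east-1 without changing the
     global fs.s3a.endpoint.region used by every other bucket. -->
<property>
  <name>fs.s3a.bucket.noaa-cors-pds.endpoint.region</name>
  <value>us-east-1</value>
</property>
```

When a filesystem instance for that bucket is created, S3A strips the `fs.s3a.bucket.noaa-cors-pds.` prefix and applies the option to that bucket only, which avoids the 301 permanent-redirect response described in the stack trace.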
[jira] [Commented] (HADOOP-18679) Add API for bulk/paged object deletion
[ https://issues.apache.org/jira/browse/HADOOP-18679?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840274#comment-17840274 ] ASF GitHub Bot commented on HADOOP-18679: - hadoop-yetus commented on PR #6726: URL: https://github.com/apache/hadoop/pull/6726#issuecomment-2073964537 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 05s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 01s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 01s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 01s | | xmllint was not available. | | +0 :ok: | spotbugs | 0m 01s | | spotbugs executables are not available. | | +0 :ok: | markdownlint | 0m 01s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 00s | | The patch appears to include 6 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 2m 31s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 91m 09s | | trunk passed | | +1 :green_heart: | compile | 40m 32s | | trunk passed | | +1 :green_heart: | checkstyle | 6m 09s | | trunk passed | | -1 :x: | mvnsite | 4m 42s | [/branch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6726/1/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in trunk failed. | | +1 :green_heart: | javadoc | 14m 12s | | trunk passed | | +1 :green_heart: | shadedclient | 171m 45s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 2m 24s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 11m 00s | | the patch passed | | +1 :green_heart: | compile | 38m 33s | | the patch passed | | +1 :green_heart: | javac | 38m 33s | | the patch passed | | -1 :x: | blanks | 0m 00s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6726/1/artifact/out/blanks-eol.txt) | The patch has 5 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply | | +1 :green_heart: | checkstyle | 6m 31s | | the patch passed | | -1 :x: | mvnsite | 4m 36s | [/patch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6726/1/artifact/out/patch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | +1 :green_heart: | javadoc | 14m 19s | | the patch passed | | +1 :green_heart: | shadedclient | 185m 02s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | asflicense | 5m 46s | [/results-asflicense.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6726/1/artifact/out/results-asflicense.txt) | The patch generated 1 ASF License warnings. 
| | | | 555m 55s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6726 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle markdownlint | | uname | MINGW64_NT-10.0-17763 cfb6e8c364ad 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / 741542703607b954851f005514b12af61a98afb6 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6726/1/testReport/ | | modules | C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws hadoop-tools/hadoop-azure U: . | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6726/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > Add API for bulk/paged object deletion > -- > > Key: HADOOP-18679 > URL: https://issues.apache.org/jira/browse/HADOOP-18679 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.3.5 >Reporter: Steve Loughran >
[jira] [Commented] (HADOOP-19147) Update ISA-L to 2.31.0 in the build image
[ https://issues.apache.org/jira/browse/HADOOP-19147?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840243#comment-17840243 ] ASF GitHub Bot commented on HADOOP-19147: - hadoop-yetus commented on PR #6729: URL: https://github.com/apache/hadoop/pull/6729#issuecomment-2073593403 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 00s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 00s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 00s | | detect-secrets was not available. | | +0 :ok: | shellcheck | 0m 00s | | Shellcheck was not available. | | +0 :ok: | shelldocs | 0m 00s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 00s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 2m 21s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 89m 42s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 00s | | trunk passed | | +1 :green_heart: | shadedclient | 226m 17s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +0 :ok: | mvndep | 2m 19s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 0m 00s | | the patch passed | | +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 0m 00s | | the patch passed | | +1 :green_heart: | shadedclient | 140m 11s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 5m 16s | | The patch does not generate ASF License warnings. 
| | | | 384m 00s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6729 | | Optional Tests | dupname asflicense mvnsite unit codespell detsecrets shellcheck shelldocs | | uname | MINGW64_NT-10.0-17763 3803c25cc61e 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / 88246bd7ae0145a56b3d7237aa1d6c474a900e48 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6729/1/testReport/ | | modules | C: U: | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6729/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > Update ISA-L to 2.31.0 in the build image > - > > Key: HADOOP-19147 > URL: https://issues.apache.org/jira/browse/HADOOP-19147 > Project: Hadoop Common > Issue Type: Task >Reporter: Takanobu Asanuma >Assignee: Takanobu Asanuma >Priority: Major > Labels: pull-request-available > > Intel ISA-L has several improvements in version 2.31.0. Let's update ISA-L in > our build image to this version. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-19131) Assist reflection IO with WrappedOperations class
[ https://issues.apache.org/jira/browse/HADOOP-19131?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840242#comment-17840242 ] ASF GitHub Bot commented on HADOOP-19131: - hadoop-yetus commented on PR #6686: URL: https://github.com/apache/hadoop/pull/6686#issuecomment-2073589561 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 30s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 50s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 35m 7s | | trunk passed | | +1 :green_heart: | compile | 18m 44s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 16m 16s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 4m 18s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 30s | | trunk passed | | +1 :green_heart: | javadoc | 1m 45s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 1m 33s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 3m 42s | | trunk passed | | +1 :green_heart: | shadedclient | 43m 3s | | branch has no errors when building and testing our client artifacts. | | -0 :warning: | patch | 43m 30s | | Used diff version of patch file. Binary files and potentially other changes not applied. 
Please rebase and squash commits if necessary. | _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 33s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 31s | | the patch passed | | +1 :green_heart: | compile | 17m 17s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 17m 17s | | the patch passed | | +1 :green_heart: | compile | 16m 35s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 16m 35s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 4m 41s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6686/6/artifact/out/results-checkstyle-root.txt) | root: The patch generated 12 new + 16 unchanged - 0 fixed = 28 total (was 16) | | +1 :green_heart: | mvnsite | 2m 36s | | the patch passed | | -1 :x: | javadoc | 1m 9s | [/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6686/6/artifact/out/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1.txt) | hadoop-common in the patch failed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1. | | +1 :green_heart: | javadoc | 1m 38s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | -1 :x: | spotbugs | 2m 59s | [/new-spotbugs-hadoop-common-project_hadoop-common.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6686/6/artifact/out/new-spotbugs-hadoop-common-project_hadoop-common.html) | hadoop-common-project/hadoop-common generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) | | +1 :green_heart: | shadedclient | 34m 40s | | patch has no errors when building and testing our client artifacts. 
| _ Other Tests _ | | +1 :green_heart: | unit | 19m 48s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 3m 9s | | hadoop-aws in the patch passed. | | +1 :green_heart: | asflicense | 1m 2s | | The patch does not generate ASF License warnings. | | | | 259m 29s | | | | Reason | Tests | |---:|:--| | SpotBugs | module:hadoop-common-project/hadoop-common | | | Unchecked/unconfirmed cast from Throwable to Exception in org.apache.hadoop.io.wrappedio.DynMethods.throwIfInstance(Throwable, Class) At
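The SpotBugs finding above flags an unchecked cast from `Throwable` to a generic exception type in `DynMethods.throwIfInstance(Throwable, Class)`. A common way to clear that warning, sketched here under the assumption that the method simply rethrows when the type matches (the method body is not shown in the report), is to route the cast through `Class.cast` so it is checked rather than erased:

```java
// Hedged sketch of the checked-cast pattern: Class.isInstance plus Class.cast
// replace an unchecked (E) cast, so SpotBugs can see the cast is verified.
public final class ThrowIfInstanceSketch {

  private ThrowIfInstanceSketch() {
    // utility class: no instances
  }

  static <E extends Throwable> void throwIfInstance(Throwable t, Class<E> clazz) throws E {
    if (clazz.isInstance(t)) {
      // checked cast: clazz.cast(t) can only succeed because of the guard above
      throw clazz.cast(t);
    }
    // not an instance of E: fall through and let the caller keep handling t
  }

  public static void main(String[] args) throws Exception {
    // non-matching type: nothing is thrown
    throwIfInstance(new RuntimeException("ignored"), java.io.IOException.class);

    // matching type: the throwable is rethrown as its declared exception type
    try {
      throwIfInstance(new java.io.IOException("boom"), java.io.IOException.class);
      throw new AssertionError("should have rethrown");
    } catch (java.io.IOException expected) {
      if (!"boom".equals(expected.getMessage())) {
        throw new AssertionError("unexpected message: " + expected.getMessage());
      }
      System.out.println("rethrown: " + expected.getMessage());
    }
  }
}
```

The generic `throws E` clause means callers get a compile-time-typed rethrow instead of a raw `Throwable`, which is the usual motivation for this helper shape.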
[jira] [Commented] (HADOOP-19140) [ABFS, S3A] Add IORateLimiter api to hadoop common
[ https://issues.apache.org/jira/browse/HADOOP-19140?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840237#comment-17840237 ] ASF GitHub Bot commented on HADOOP-19140: - hadoop-yetus commented on PR #6703: URL: https://github.com/apache/hadoop/pull/6703#issuecomment-2073517148 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 31s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 46m 17s | | trunk passed | | +1 :green_heart: | compile | 17m 52s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 17m 12s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 1m 15s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 39s | | trunk passed | | +1 :green_heart: | javadoc | 1m 14s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 0m 50s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 2m 35s | | trunk passed | | +1 :green_heart: | shadedclient | 38m 40s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 55s | | the patch passed | | +1 :green_heart: | compile | 16m 46s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 16m 46s | | the patch passed | | +1 :green_heart: | compile | 16m 7s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 16m 7s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 14s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 34s | | the patch passed | | +1 :green_heart: | javadoc | 1m 4s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 0m 49s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 2m 52s | | the patch passed | | +1 :green_heart: | shadedclient | 38m 43s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 19m 54s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 0m 58s | | The patch does not generate ASF License warnings. 
| | | | 232m 44s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.45 ServerAPI=1.45 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6703/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/6703 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 1e9683e47802 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / d2e146e4180311a52a94240922e3daf8f94ec8bd | | Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6703/2/testReport/ | | Max. process+thread count | 2038 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6703/2/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > [ABFS, S3A] Add
[jira] [Commented] (HADOOP-19152) Do not hard code security providers.
[ https://issues.apache.org/jira/browse/HADOOP-19152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840214#comment-17840214 ] ASF GitHub Bot commented on HADOOP-19152: - hadoop-yetus commented on PR #6739: URL: https://github.com/apache/hadoop/pull/6739#issuecomment-2073325889 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 0s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 1s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 49m 42s | | trunk passed | | +1 :green_heart: | compile | 18m 58s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 16m 51s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 1m 18s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 47s | | trunk passed | | +1 :green_heart: | javadoc | 1m 19s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 0m 53s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 2m 40s | | trunk passed | | +1 :green_heart: | shadedclient | 37m 31s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 58s | | the patch passed | | +1 :green_heart: | compile | 18m 55s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 18m 55s | | the patch passed | | +1 :green_heart: | compile | 17m 44s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 17m 44s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 1m 16s | [/results-checkstyle-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6739/5/artifact/out/results-checkstyle-hadoop-common-project_hadoop-common.txt) | hadoop-common-project/hadoop-common: The patch generated 4 new + 131 unchanged - 0 fixed = 135 total (was 131) | | +1 :green_heart: | mvnsite | 1m 47s | | the patch passed | | +1 :green_heart: | javadoc | 1m 10s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 0m 53s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 2m 53s | | the patch passed | | +1 :green_heart: | shadedclient | 36m 19s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 19m 34s | [/patch-unit-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6739/5/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 1m 4s | | The patch does not generate ASF License warnings. 
| | | | 238m 37s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.crypto.TestCryptoStreamsWithJceAesCtrCryptoCodec | | | hadoop.crypto.TestCryptoCodec | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.45 ServerAPI=1.45 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6739/5/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/6739 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle | | uname | Linux 4587c4de41bf 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 07cebad3573d47d09d790c3eed3f1faaed75900d | | Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Multi-JDK versions |
[jira] [Commented] (HADOOP-19154) upgrade bouncy castle to 1.78.1 due to CVEs
[ https://issues.apache.org/jira/browse/HADOOP-19154?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840199#comment-17840199 ] ASF GitHub Bot commented on HADOOP-19154: - hadoop-yetus commented on PR #6755: URL: https://github.com/apache/hadoop/pull/6755#issuecomment-2073215984 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 59s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +0 :ok: | shelldocs | 0m 0s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 50s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 32m 29s | | trunk passed | | +1 :green_heart: | compile | 17m 32s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 16m 17s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | mvnsite | 22m 17s | | trunk passed | | +1 :green_heart: | javadoc | 8m 42s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 7m 59s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | shadedclient | 49m 20s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 41s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 30m 0s | | the patch passed | | +1 :green_heart: | compile | 16m 56s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 16m 56s | | the patch passed | | +1 :green_heart: | compile | 16m 13s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 16m 13s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 16m 23s | | the patch passed | | +1 :green_heart: | shellcheck | 0m 0s | | No new issues. | | +1 :green_heart: | javadoc | 8m 34s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 7m 58s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | shadedclient | 51m 2s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 853m 6s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6755/2/artifact/out/patch-unit-root.txt) | root in the patch passed. | | +1 :green_heart: | asflicense | 1m 27s | | The patch does not generate ASF License warnings. 
| | | | 1145m 53s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.hdfs.server.datanode.TestLargeBlockReport | | | hadoop.hdfs.rbfbalance.TestRouterDistCpProcedure | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.45 ServerAPI=1.45 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6755/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/6755 | | Optional Tests | dupname asflicense mvnsite codespell detsecrets markdownlint compile javac javadoc mvninstall unit shadedclient xmllint shellcheck shelldocs | | uname | Linux 3706a0b0bcc2 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 22548421c161bf9508649c73f50161bcfc48db40 | | Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1
[jira] [Commented] (HADOOP-19152) Do not hard code security providers.
[ https://issues.apache.org/jira/browse/HADOOP-19152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840167#comment-17840167 ] ASF GitHub Bot commented on HADOOP-19152: - steveloughran commented on code in PR #6739: URL: https://github.com/apache/hadoop/pull/6739#discussion_r1576632095 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/crypto/CryptoUtils.java: ## @@ -19,53 +19,63 @@ import org.apache.hadoop.classification.InterfaceAudience; import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.fs.CommonConfigurationKeysPublic; +import org.apache.hadoop.fs.store.LogExactlyOnce; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import java.lang.reflect.Field; import java.security.Provider; import java.security.Security; -import static org.apache.hadoop.fs.CommonConfigurationKeysPublic.HADOOP_SECURITY_CRYPTO_JCE_PROVIDER_ADD_DEFAULT; -import static org.apache.hadoop.fs.CommonConfigurationKeysPublic.HADOOP_SECURITY_CRYPTO_JCE_PROVIDER_ADD_KEY; -import static org.apache.hadoop.fs.CommonConfigurationKeysPublic.HADOOP_SECURITY_CRYPTO_JCE_PROVIDER_KEY; - +/** Utility methods for the crypto related features. */ @InterfaceAudience.Private public class CryptoUtils { Review Comment: best to make final ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/crypto/CryptoUtils.java: ## @@ -0,0 +1,71 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. 
You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.hadoop.crypto; + +import org.apache.hadoop.classification.InterfaceAudience; +import org.apache.hadoop.conf.Configuration; Review Comment: more the mix of apache and others, the general layout is currently ``` java.* javax.* other org.apache* static ``` things are generally messy with "other" being tainted by our move off google guava into our own stuff, but its still good to try and keep things under control ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/crypto/CryptoUtils.java: ## @@ -0,0 +1,81 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ +package org.apache.hadoop.crypto; + +import org.apache.hadoop.classification.InterfaceAudience; +import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.fs.CommonConfigurationKeysPublic; +import org.apache.hadoop.fs.store.LogExactlyOnce; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.lang.reflect.Field; +import java.security.Provider; +import java.security.Security; + +/** Utility methods for the crypto related features. */ +@InterfaceAudience.Private +public class CryptoUtils { + static final Logger LOG = LoggerFactory.getLogger(CryptoUtils.class); + private static final LogExactlyOnce LOG_FAILED_TO_LOAD_CLASS = new LogExactlyOnce(LOG); + private static final LogExactlyOnce LOG_FAILED_TO_GET_FIELD = new LogExactlyOnce(LOG); + private static final LogExactlyOnce LOG_FAILED_TO_ADD_PROVIDER = new LogExactlyOnce(LOG); + + private static final String BOUNCY_CASTLE_PROVIDER_CLASS + = "org.bouncycastle.jce.provider.BouncyCastleProvider"; + private static final String PROVIDER_NAME_FIELD = "PROVIDER_NAME"; + + /** + * Get the security provider value specified in + * {@link CommonConfigurationKeysPublic#HADOOP_SECURITY_CRYPTO_JCE_PROVIDER_KEY} + * from the
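Taken together, the two review comments above (make the utility class final; group imports as `java.*`, `javax.*`, other third-party, `org.apache.*`, then statics) suggest a file layout along these lines. This is a hedged sketch of the skeleton only, not the committed code; it reuses the class and import names visible in the diff and adds a private constructor, the conventional companion of a final utility class:

```java
package org.apache.hadoop.crypto;

// Import layout per the review: java.* first, then javax.* (none here),
// then other third-party, then org.apache.*, with static imports last.
import java.lang.reflect.Field;
import java.security.Provider;
import java.security.Security;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import org.apache.hadoop.classification.InterfaceAudience;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.CommonConfigurationKeysPublic;
import org.apache.hadoop.fs.store.LogExactlyOnce;

/** Utility methods for the crypto related features. */
@InterfaceAudience.Private
public final class CryptoUtils {  // final, per the review comment

  static final Logger LOG = LoggerFactory.getLogger(CryptoUtils.class);

  private CryptoUtils() {
    // utility class: static methods only, never instantiated
  }
}
```

Keeping the grouping mechanical like this is what makes the "things are generally messy" drift the reviewer mentions easy to spot in later diffs.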
[jira] [Commented] (HADOOP-18679) Add API for bulk/paged object deletion
[ https://issues.apache.org/jira/browse/HADOOP-18679?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840157#comment-17840157 ] ASF GitHub Bot commented on HADOOP-18679: - hadoop-yetus commented on PR #6738: URL: https://github.com/apache/hadoop/pull/6738#issuecomment-2072921149 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 05s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 00s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 00s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 00s | | xmllint was not available. | | +0 :ok: | spotbugs | 0m 00s | | spotbugs executables are not available. | | +0 :ok: | markdownlint | 0m 00s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 00s | | The patch appears to include 6 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 3m 11s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 90m 04s | | trunk passed | | +1 :green_heart: | compile | 39m 11s | | trunk passed | | +1 :green_heart: | checkstyle | 5m 51s | | trunk passed | | -1 :x: | mvnsite | 4m 20s | [/branch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6738/1/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in trunk failed. | | +1 :green_heart: | javadoc | 13m 35s | | trunk passed | | +1 :green_heart: | shadedclient | 167m 43s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 2m 18s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 10m 40s | | the patch passed | | +1 :green_heart: | compile | 37m 05s | | the patch passed | | +1 :green_heart: | javac | 37m 05s | | the patch passed | | -1 :x: | blanks | 0m 00s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6738/1/artifact/out/blanks-eol.txt) | The patch has 2 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply | | +1 :green_heart: | checkstyle | 6m 04s | | the patch passed | | -1 :x: | mvnsite | 4m 25s | [/patch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6738/1/artifact/out/patch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | +1 :green_heart: | javadoc | 13m 59s | | the patch passed | | +1 :green_heart: | shadedclient | 177m 47s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | asflicense | 5m 31s | [/results-asflicense.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6738/1/artifact/out/results-asflicense.txt) | The patch generated 1 ASF License warnings. 
| | | | 540m 54s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6738 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle markdownlint | | uname | MINGW64_NT-10.0-17763 b4a02a5f9adc 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / 744a643945e9fbf2fd1246c3e48c752789060370 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6738/1/testReport/ | | modules | C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws hadoop-tools/hadoop-azure U: . | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6738/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > Add API for bulk/paged object deletion > -- > > Key: HADOOP-18679 > URL: https://issues.apache.org/jira/browse/HADOOP-18679 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.3.5 >Reporter: Steve Loughran >
[jira] [Commented] (HADOOP-19150) Test ITestAbfsRestOperationException#testAuthFailException is broken.
[ https://issues.apache.org/jira/browse/HADOOP-19150?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840155#comment-17840155 ] ASF GitHub Bot commented on HADOOP-19150: - mukund-thakur commented on PR #6756: URL: https://github.com/apache/hadoop/pull/6756#issuecomment-2072900446 Not sure what is wrong with Yetus here. Can you please add an empty commit so that Yetus runs again? > Test ITestAbfsRestOperationException#testAuthFailException is broken. > -- > > Key: HADOOP-19150 > URL: https://issues.apache.org/jira/browse/HADOOP-19150 > Project: Hadoop Common > Issue Type: Sub-task >Reporter: Mukund Thakur >Assignee: Anuj Modi >Priority: Major > Labels: pull-request-available > > {code:java} > intercept(Exception.class, > () -> { > fs.getFileStatus(new Path("/")); > }); {code} > Intercept shouldn't be used as there are assertions in catch statements. > > CC [~ste...@apache.org] [~anujmodi2021] [~asrani_anmol] -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
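The {code} snippet quoted in the HADOOP-19150 description is the pattern under discussion. As a hedged illustration of the preferred shape — let `intercept` catch the exception and assert on the returned instance, rather than asserting inside a catch block — here is a minimal, self-contained sketch. The `intercept` helper below is a simplified stand-in for Hadoop's `LambdaTestUtils.intercept`, and `failingOperation` is a hypothetical method standing in for `fs.getFileStatus(new Path("/"))`; neither is the actual Hadoop code.

```java
import java.io.IOException;
import java.util.concurrent.Callable;

public class InterceptSketch {
    // Simplified stand-in for LambdaTestUtils.intercept: run the callable and
    // fail unless it throws (a subclass of) the expected exception type.
    static <E extends Exception> E intercept(Class<E> clazz, Callable<?> call) throws Exception {
        try {
            call.call();
        } catch (Exception e) {
            if (clazz.isInstance(e)) {
                return clazz.cast(e); // expected failure: return it for further assertions
            }
            throw e;                  // unexpected exception type: propagate
        }
        throw new AssertionError("Expected " + clazz.getName() + " but nothing was thrown");
    }

    // Hypothetical operation that fails the way the test expects.
    static Object failingOperation() throws IOException {
        throw new IOException("auth failure");
    }

    public static void main(String[] args) throws Exception {
        // Assert on the returned exception instead of inside a catch block.
        IOException e = intercept(IOException.class, InterceptSketch::failingOperation);
        if (!e.getMessage().contains("auth failure")) {
            throw new AssertionError("unexpected message: " + e.getMessage());
        }
        System.out.println("caught: " + e.getMessage());
    }
}
```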
[jira] [Commented] (HADOOP-19151) Support configurable SASL mechanism
[ https://issues.apache.org/jira/browse/HADOOP-19151?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840152#comment-17840152 ] ASF GitHub Bot commented on HADOOP-19151: - szetszwo commented on PR #6740: URL: https://github.com/apache/hadoop/pull/6740#issuecomment-2072883206 The mvnsite failure is not related to this. > Support configurable SASL mechanism > --- > > Key: HADOOP-19151 > URL: https://issues.apache.org/jira/browse/HADOOP-19151 > Project: Hadoop Common > Issue Type: Improvement > Components: security >Reporter: Tsz-wo Sze >Assignee: Tsz-wo Sze >Priority: Major > Labels: pull-request-available > > Currently, the SASL mechanism is hard coded to DIGEST-MD5. As mentioned in > HADOOP-14811, DIGEST-MD5 is known to be insecure; see > [rfc6331|https://datatracker.ietf.org/doc/html/rfc6331]. > In this JIRA, we will make the SASL mechanism configurable. The default > mechanism will still be DIGEST-MD5 in order to maintain compatibility. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
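As context for making the mechanism configurable, the JDK already exposes which SASL mechanisms its registered security providers support. A hedged sketch (illustrative only, not the HADOOP-19151 patch; the class and method names here are invented for the example) that enumerates them — on a stock JDK the result typically includes DIGEST-MD5, the mechanism currently hard-coded in Hadoop:

```java
import java.util.Collections;
import java.util.Enumeration;
import java.util.Map;
import java.util.Set;
import java.util.TreeSet;
import javax.security.sasl.Sasl;
import javax.security.sasl.SaslClientFactory;

public class SaslMechanisms {
    // Collect the mechanism names advertised by every registered SASL client factory.
    static Set<String> availableMechanisms() {
        Set<String> mechanisms = new TreeSet<>();
        Map<String, ?> props = Collections.emptyMap();
        Enumeration<SaslClientFactory> factories = Sasl.getSaslClientFactories(props);
        while (factories.hasMoreElements()) {
            Collections.addAll(mechanisms, factories.nextElement().getMechanismNames(props));
        }
        return mechanisms;
    }

    public static void main(String[] args) {
        // Typically prints a set containing DIGEST-MD5 (from the SunSASL provider),
        // among others, depending on the JDK and installed providers.
        System.out.println(availableMechanisms());
    }
}
```

A configurable mechanism would then amount to validating a configured name against this set before negotiation, with DIGEST-MD5 as the compatibility default.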
[jira] [Commented] (HADOOP-19152) Do not hard code security providers.
[ https://issues.apache.org/jira/browse/HADOOP-19152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840143#comment-17840143 ] ASF GitHub Bot commented on HADOOP-19152: - szetszwo commented on code in PR #6739: URL: https://github.com/apache/hadoop/pull/6739#discussion_r1576476164 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/crypto/CryptoUtils.java: ## @@ -0,0 +1,71 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.hadoop.crypto; + +import org.apache.hadoop.classification.InterfaceAudience; +import org.apache.hadoop.conf.Configuration; Review Comment: The ordering seems okay > Do not hard code security providers. > > > Key: HADOOP-19152 > URL: https://issues.apache.org/jira/browse/HADOOP-19152 > Project: Hadoop Common > Issue Type: Improvement > Components: security >Reporter: Tsz-wo Sze >Assignee: Tsz-wo Sze >Priority: Major > Labels: pull-request-available > > In order to support different security providers in different clusters, we > should not hard code a provider in our code. 
-- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
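The review thread above concerns loading a provider such as BouncyCastleProvider reflectively instead of hard-coding it. A hedged sketch of that general pattern follows; the names and error handling are illustrative, and the actual CryptoUtils in PR #6739 differs (it also reads the PROVIDER_NAME field and rate-limits logging via LogExactlyOnce).

```java
import java.security.Provider;
import java.security.Security;

public class ProviderLoader {
    /**
     * Try to register a JCE provider given only its class name, so the
     * dependency stays optional at compile time. Returns true if the
     * provider was added, false if the class is absent or unusable.
     */
    static boolean tryAddProvider(String className) {
        try {
            Class<?> clazz = Class.forName(className);
            Provider provider = (Provider) clazz.getDeclaredConstructor().newInstance();
            Security.addProvider(provider);
            return true;
        } catch (ReflectiveOperationException | RuntimeException e) {
            // Provider jar not on the classpath (or not instantiable):
            // degrade gracefully instead of failing the whole process.
            System.err.println("Could not load provider " + className + ": " + e);
            return false;
        }
    }

    public static void main(String[] args) {
        // With no BouncyCastle jar on the classpath, this fails soft.
        boolean added = tryAddProvider("org.bouncycastle.jce.provider.BouncyCastleProvider");
        System.out.println(added ? "added" : "skipped");
    }
}
```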
[jira] [Commented] (HADOOP-19152) Do not hard code security providers.
[ https://issues.apache.org/jira/browse/HADOOP-19152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840144#comment-17840144 ] ASF GitHub Bot commented on HADOOP-19152: - szetszwo commented on code in PR #6739: URL: https://github.com/apache/hadoop/pull/6739#discussion_r1576476989 ## hadoop-common-project/hadoop-common/src/main/resources/core-default.xml: ## @@ -3625,7 +3625,19 @@ The switch to turn S3A auditing on or off. The JCE provider name used in CryptoCodec. If this value is set, the corresponding provider must be added to the provider list. The provider may be added statically in the java.security file, or -added dynamically by calling the java.security.Security.addProvider(..) method. +dynamically by calling the java.security.Security.addProvider(..) method, or +automatically (only for org.bouncycastle.jce.provider.BouncyCastleProvider) +by setting "hadoop.security.crypto.jce.provider.add" to true + + + + + hadoop.security.crypto.jce.provider.add Review Comment: Let's change it to `auto-add`. > Do not hard code security providers. > > > Key: HADOOP-19152 > URL: https://issues.apache.org/jira/browse/HADOOP-19152 > Project: Hadoop Common > Issue Type: Improvement > Components: security >Reporter: Tsz-wo Sze >Assignee: Tsz-wo Sze >Priority: Major > Labels: pull-request-available > > In order to support different security providers in different clusters, we > should not hard code a provider in our code. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
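Following the `auto-add` naming suggestion above, the renamed entry in core-default.xml might take the following shape. This is hypothetical: the final property name and default value in the merged patch may differ, and the description is paraphrased from the quoted diff.

```xml
<property>
  <name>hadoop.security.crypto.jce.provider.auto-add</name>
  <value>true</value>
  <description>
    Whether to automatically add the provider named in
    hadoop.security.crypto.jce.provider to the JCE provider list.
    Only supported for org.bouncycastle.jce.provider.BouncyCastleProvider.
  </description>
</property>
```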
[jira] [Commented] (HADOOP-19151) Support configurable SASL mechanism
[ https://issues.apache.org/jira/browse/HADOOP-19151?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840131#comment-17840131 ] ASF GitHub Bot commented on HADOOP-19151: - hadoop-yetus commented on PR #6740: URL: https://github.com/apache/hadoop/pull/6740#issuecomment-2072641545 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 01s | | No case conflicting files found. | | +0 :ok: | spotbugs | 0m 00s | | spotbugs executables are not available. | | +0 :ok: | codespell | 0m 00s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 00s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 00s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 2m 13s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 87m 55s | | trunk passed | | +1 :green_heart: | compile | 38m 59s | | trunk passed | | +1 :green_heart: | checkstyle | 5m 48s | | trunk passed | | -1 :x: | mvnsite | 4m 17s | [/branch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6740/1/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in trunk failed. | | +1 :green_heart: | javadoc | 20m 15s | | trunk passed | | +1 :green_heart: | shadedclient | 181m 08s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +0 :ok: | mvndep | 2m 13s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 16m 42s | | the patch passed | | +1 :green_heart: | compile | 37m 02s | | the patch passed | | +1 :green_heart: | javac | 37m 02s | | the patch passed | | +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. 
| | +1 :green_heart: | checkstyle | 5m 46s | | the patch passed | | -1 :x: | mvnsite | 4m 17s | [/patch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6740/1/artifact/out/patch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | +1 :green_heart: | javadoc | 21m 09s | | the patch passed | | +1 :green_heart: | shadedclient | 193m 21s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 5m 58s | | The patch does not generate ASF License warnings. | | | | 571m 39s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6740 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | MINGW64_NT-10.0-17763 f3bb3ac3fa73 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / a82ffdcf125435e456c06df109c936caa2b2ce42 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6740/1/testReport/ | | modules | C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs-client hadoop-hdfs-project/hadoop-hdfs hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager U: . | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6740/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. 
> Support configurable SASL mechanism > --- > > Key: HADOOP-19151 > URL: https://issues.apache.org/jira/browse/HADOOP-19151 > Project: Hadoop Common > Issue Type: Improvement > Components: security >Reporter: Tsz-wo Sze >Assignee: Tsz-wo Sze >Priority: Major > Labels: pull-request-available > > Currently, the SASL mechanism is hard coded to DIGEST-MD5. As mentioned in > HADOOP-14811, DIGEST-MD5 is known to be insecure; see > [rfc6331|https://datatracker.ietf.org/doc/html/rfc6331]. > In this JIRA, we will make the SASL mechanism configurable. The default > mechanism will still be DIGEST-MD5 in order to
[jira] [Commented] (HADOOP-19152) Do not hard code security providers.
[ https://issues.apache.org/jira/browse/HADOOP-19152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840126#comment-17840126 ] ASF GitHub Bot commented on HADOOP-19152: - hadoop-yetus commented on PR #6739: URL: https://github.com/apache/hadoop/pull/6739#issuecomment-2072565481 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 01s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 01s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 01s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 01s | | xmllint was not available. | | +0 :ok: | spotbugs | 0m 01s | | spotbugs executables are not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 00s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 86m 50s | | trunk passed | | +1 :green_heart: | compile | 37m 51s | | trunk passed | | +1 :green_heart: | checkstyle | 4m 26s | | trunk passed | | -1 :x: | mvnsite | 4m 21s | [/branch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6739/1/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in trunk failed. | | +1 :green_heart: | javadoc | 5m 14s | | trunk passed | | +1 :green_heart: | shadedclient | 144m 47s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 4m 32s | | the patch passed | | +1 :green_heart: | compile | 35m 24s | | the patch passed | | +1 :green_heart: | javac | 35m 25s | | the patch passed | | +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. 
| | +1 :green_heart: | checkstyle | 4m 35s | | the patch passed | | -1 :x: | mvnsite | 4m 09s | [/patch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6739/1/artifact/out/patch-mvnsite-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch failed. | | +1 :green_heart: | javadoc | 4m 27s | | the patch passed | | +1 :green_heart: | shadedclient | 148m 37s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 5m 09s | | The patch does not generate ASF License warnings. | | | | 470m 42s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6739 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint spotbugs checkstyle | | uname | MINGW64_NT-10.0-17763 ae53cf711cf0 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / 84e10b88587f0a0a8c2502d7610dee903e617735 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6739/1/testReport/ | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6739/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > Do not hard code security providers. 
> > > Key: HADOOP-19152 > URL: https://issues.apache.org/jira/browse/HADOOP-19152 > Project: Hadoop Common > Issue Type: Improvement > Components: security >Reporter: Tsz-wo Sze >Assignee: Tsz-wo Sze >Priority: Major > Labels: pull-request-available > > In order to support different security providers in different clusters, we > should not hard code a provider in our code. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-19152) Do not hard code security providers.
[ https://issues.apache.org/jira/browse/HADOOP-19152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840076#comment-17840076 ] ASF GitHub Bot commented on HADOOP-19152: - steveloughran commented on code in PR #6739: URL: https://github.com/apache/hadoop/pull/6739#discussion_r1576126175 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/crypto/CryptoUtils.java: ## @@ -0,0 +1,71 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ +package org.apache.hadoop.crypto; + +import org.apache.hadoop.classification.InterfaceAudience; +import org.apache.hadoop.conf.Configuration; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.lang.reflect.Field; +import java.security.Provider; +import java.security.Security; + +import static org.apache.hadoop.fs.CommonConfigurationKeysPublic.HADOOP_SECURITY_CRYPTO_JCE_PROVIDER_ADD_DEFAULT; +import static org.apache.hadoop.fs.CommonConfigurationKeysPublic.HADOOP_SECURITY_CRYPTO_JCE_PROVIDER_ADD_KEY; +import static org.apache.hadoop.fs.CommonConfigurationKeysPublic.HADOOP_SECURITY_CRYPTO_JCE_PROVIDER_KEY; + +@InterfaceAudience.Private Review Comment: add a javadoc explaining what it does ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/crypto/CryptoUtils.java: ## @@ -0,0 +1,71 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.hadoop.crypto; + +import org.apache.hadoop.classification.InterfaceAudience; +import org.apache.hadoop.conf.Configuration; Review Comment: nit: import ordering ## hadoop-common-project/hadoop-common/src/main/resources/core-default.xml: ## @@ -3625,7 +3625,19 @@ The switch to turn S3A auditing on or off. 
The JCE provider name used in CryptoCodec. If this value is set, the corresponding provider must be added to the provider list. The provider may be added statically in the java.security file, or -added dynamically by calling the java.security.Security.addProvider(..) method. +dynamically by calling the java.security.Security.addProvider(..) method, or +automatically (only for org.bouncycastle.jce.provider.BouncyCastleProvider) +by setting "hadoop.security.crypto.jce.provider.add" to true + + + + + hadoop.security.crypto.jce.provider.add Review Comment: not sure about the name here. ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/crypto/CryptoUtils.java: ## @@ -0,0 +1,71 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.hadoop.crypto; + +import
[jira] [Commented] (HADOOP-19102) [ABFS]: FooterReadBufferSize should not be greater than readBufferSize
[ https://issues.apache.org/jira/browse/HADOOP-19102?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840073#comment-17840073 ] ASF GitHub Bot commented on HADOOP-19102: - steveloughran merged PR #6763: URL: https://github.com/apache/hadoop/pull/6763 > [ABFS]: FooterReadBufferSize should not be greater than readBufferSize > -- > > Key: HADOOP-19102 > URL: https://issues.apache.org/jira/browse/HADOOP-19102 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/azure >Affects Versions: 3.4.0 >Reporter: Pranav Saxena >Assignee: Pranav Saxena >Priority: Major > Labels: pull-request-available > > The method `optimisedRead` creates a buffer array of size `readBufferSize`. > If footerReadBufferSize is greater than readBufferSize, abfs will attempt to > read more data than the buffer array can hold, which causes an exception. > Change: To avoid this, we will keep footerBufferSize = > min(readBufferSizeConfig, footerBufferSizeConfig) > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
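The fix described in the HADOOP-19102 summary above is a one-line clamp. A hedged standalone sketch (method and parameter names are illustrative, not the actual AbfsInputStream code):

```java
public class FooterBufferClamp {
    /**
     * The footer read must never exceed the buffer that optimisedRead
     * allocates, so the effective footer size is capped at the read buffer size.
     */
    static int effectiveFooterReadBufferSize(int readBufferSize, int footerReadBufferSize) {
        return Math.min(readBufferSize, footerReadBufferSize);
    }

    public static void main(String[] args) {
        int readBufferSize = 4 * 1024 * 1024;        // allocated read buffer, e.g. 4 MB
        int footerReadBufferSize = 16 * 1024 * 1024; // misconfigured larger footer read
        // Clamped to the 4 MB buffer, avoiding the over-read that caused the exception.
        System.out.println(effectiveFooterReadBufferSize(readBufferSize, footerReadBufferSize));
    }
}
```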
[jira] [Commented] (HADOOP-19139) [ABFS]: No GetPathStatus call for opening AbfsInputStream
[ https://issues.apache.org/jira/browse/HADOOP-19139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17840039#comment-17840039 ] ASF GitHub Bot commented on HADOOP-19139: - hadoop-yetus commented on PR #6699: URL: https://github.com/apache/hadoop/pull/6699#issuecomment-2071857509 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 46s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +0 :ok: | markdownlint | 0m 1s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 13 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 14m 11s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 37m 55s | | trunk passed | | +1 :green_heart: | compile | 19m 23s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 17m 40s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 4m 47s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 32s | | trunk passed | | +1 :green_heart: | javadoc | 1m 59s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 1m 34s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 3m 43s | | trunk passed | | +1 :green_heart: | shadedclient | 40m 1s | | branch has no errors when building and testing our client artifacts. | | -0 :warning: | patch | 40m 27s | | Used diff version of patch file. Binary files and potentially other changes not applied. 
Please rebase and squash commits if necessary. | _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 33s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 23s | | the patch passed | | +1 :green_heart: | compile | 18m 32s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 18m 32s | | the patch passed | | +1 :green_heart: | compile | 17m 24s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 17m 24s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 4m 41s | | the patch passed | | +1 :green_heart: | mvnsite | 2m 30s | | the patch passed | | +1 :green_heart: | javadoc | 1m 54s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 1m 34s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 4m 0s | | the patch passed | | +1 :green_heart: | shadedclient | 41m 19s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 19m 35s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 2m 40s | | hadoop-azure in the patch passed. | | +1 :green_heart: | asflicense | 0m 56s | | The patch does not generate ASF License warnings. 
| | | | 269m 3s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.44 ServerAPI=1.44 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6699/57/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/6699 | | Optional Tests | dupname asflicense mvnsite codespell detsecrets markdownlint compile javac javadoc mvninstall unit shadedclient spotbugs checkstyle | | uname | Linux 3e217f1b4d3f 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 01239aa7132fc54fb295b2df8af32cec7c83758e | | Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Test Results |
[jira] [Commented] (HADOOP-19139) [ABFS]: No GetPathStatus call for opening AbfsInputStream
[ https://issues.apache.org/jira/browse/HADOOP-19139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17839987#comment-17839987 ] ASF GitHub Bot commented on HADOOP-19139: - saxenapranav commented on PR #6699: URL: https://github.com/apache/hadoop/pull/6699#issuecomment-2071630901 -- AGGREGATED TEST RESULT HNS-OAuth [WARNING] Tests run: 137, Failures: 0, Errors: 0, Skipped: 2 [WARNING] Tests run: 623, Failures: 0, Errors: 0, Skipped: 73 [WARNING] Tests run: 380, Failures: 0, Errors: 0, Skipped: 57 HNS-SharedKey [WARNING] Tests run: 137, Failures: 0, Errors: 0, Skipped: 3 [WARNING] Tests run: 623, Failures: 0, Errors: 0, Skipped: 28 [WARNING] Tests run: 380, Failures: 0, Errors: 0, Skipped: 41 NonHNS-SharedKey [WARNING] Tests run: 137, Failures: 0, Errors: 0, Skipped: 9 [WARNING] Tests run: 607, Failures: 0, Errors: 0, Skipped: 268 [WARNING] Tests run: 380, Failures: 0, Errors: 0, Skipped: 44 AppendBlob-HNS-OAuth [WARNING] Tests run: 137, Failures: 0, Errors: 0, Skipped: 2 [WARNING] Tests run: 623, Failures: 0, Errors: 0, Skipped: 75 [WARNING] Tests run: 380, Failures: 0, Errors: 0, Skipped: 81 Time taken: 21 mins 16 secs. azureuser@Hadoop-VM-EAST2:~/hadoop/hadoop-tools/hadoop-azure$ git log commit 01239aa7132fc54fb295b2df8af32cec7c83758e (HEAD -> saxenapranav/noGpsForRead, origin/saxenapranav/noGpsForRead) Merge: 44ffeb37602 6404692c097 Author: Pranav Saxena <> Date: Mon Apr 22 22:06:26 2024 -0700 Merge branch 'trunk' into saxenapranav/noGpsForRead > [ABFS]: No GetPathStatus call for opening AbfsInputStream > - > > Key: HADOOP-19139 > URL: https://issues.apache.org/jira/browse/HADOOP-19139 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/azure >Reporter: Pranav Saxena >Assignee: Pranav Saxena >Priority: Major > Labels: pull-request-available > > Read API gives contentLen and etag of the path. This information would be > used in future calls on that inputStream. Prior information of eTag is of not > much importance. 
-- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
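The HADOOP-19139 description above says the read response already carries contentLength and eTag, so the HEAD-style GetPathStatus call at open time can be skipped and the metadata captured lazily from the first read instead. A minimal sketch of that idea follows; all names here (`StreamState`, `onReadResponse`) are illustrative and are not the real AbfsInputStream API:

```java
// Hedged sketch of the lazy-open idea from HADOOP-19139 (names illustrative,
// not the actual hadoop-azure classes): rather than issuing GetPathStatus at
// open(), the stream records contentLength and eTag from the first read
// response and reuses them for later calls on the same stream.
public class LazyOpenSketch {
    static final class StreamState {
        Long contentLength; // null until the first read response arrives
        String eTag;        // likewise unknown before the first read
    }

    // Capture metadata carried on a read response; only the first response
    // populates the state, matching "prior knowledge of eTag is unnecessary".
    static void onReadResponse(StreamState s, long length, String eTag) {
        if (s.contentLength == null) {
            s.contentLength = length;
            s.eTag = eTag;
        }
    }

    public static void main(String[] args) {
        StreamState s = new StreamState();
        // open() performs no metadata call here; the state stays empty...
        onReadResponse(s, 4096L, "\"0x1\""); // ...until the first read returns
        System.out.println(s.contentLength + " " + s.eTag);
    }
}
```

The design point is that the one round trip saved per open matters for read-heavy workloads, while correctness is preserved because every later call on the stream happens after at least one read has supplied the eTag.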
[jira] [Commented] (HADOOP-19102) [ABFS]: FooterReadBufferSize should not be greater than readBufferSize
[ https://issues.apache.org/jira/browse/HADOOP-19102?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17839982#comment-17839982 ] ASF GitHub Bot commented on HADOOP-19102: - hadoop-yetus commented on PR #6763: URL: https://github.com/apache/hadoop/pull/6763#issuecomment-2071588551 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 6m 39s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 5 new or modified test files. | _ branch-3.4 Compile Tests _ | | +0 :ok: | mvndep | 4m 5s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 28m 45s | | branch-3.4 passed | | +1 :green_heart: | compile | 8m 52s | | branch-3.4 passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 8m 6s | | branch-3.4 passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 2m 2s | | branch-3.4 passed | | +1 :green_heart: | mvnsite | 1m 19s | | branch-3.4 passed | | +1 :green_heart: | javadoc | 1m 9s | | branch-3.4 passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 1m 0s | | branch-3.4 passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 2m 11s | | branch-3.4 passed | | +1 :green_heart: | shadedclient | 20m 4s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 21s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 0m 47s | | the patch passed | | +1 :green_heart: | compile | 8m 22s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 8m 22s | | the patch passed | | +1 :green_heart: | compile | 7m 56s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 7m 56s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 55s | | root: The patch generated 0 new + 5 unchanged - 8 fixed = 5 total (was 13) | | +1 :green_heart: | mvnsite | 1m 21s | | the patch passed | | +1 :green_heart: | javadoc | 1m 4s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 1m 2s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 2m 17s | | the patch passed | | +1 :green_heart: | shadedclient | 20m 10s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 16m 31s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 2m 8s | | hadoop-azure in the patch passed. | | +1 :green_heart: | asflicense | 0m 39s | | The patch does not generate ASF License warnings. 
| | | | 153m 32s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.44 ServerAPI=1.44 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6763/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/6763 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux d47908eada51 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-3.4 / f7f022f8ef957ff32d3f13eaa3e7e7c245b75406 | | Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6763/1/testReport/ | | Max. process+thread count | 3153 (vs. ulimit of 5500) | | modules | C:
[jira] [Commented] (HADOOP-19146) noaa-cors-pds bucket access with global endpoint fails
[ https://issues.apache.org/jira/browse/HADOOP-19146?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17839964#comment-17839964 ] ASF GitHub Bot commented on HADOOP-19146: - hadoop-yetus commented on PR #6723: URL: https://github.com/apache/hadoop/pull/6723#issuecomment-2071522327 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 19s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 8 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 33m 40s | | trunk passed | | +1 :green_heart: | compile | 0m 23s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 0m 20s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 0m 20s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 26s | | trunk passed | | +1 :green_heart: | javadoc | 0m 17s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 0m 21s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 0m 43s | | trunk passed | | +1 :green_heart: | shadedclient | 21m 35s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 18s | | the patch passed | | +1 :green_heart: | compile | 0m 21s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 0m 21s | | the patch passed | | +1 :green_heart: | compile | 0m 15s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 0m 15s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 11s | | the patch passed | | +1 :green_heart: | mvnsite | 0m 20s | | the patch passed | | +1 :green_heart: | javadoc | 0m 10s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 0m 16s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 0m 38s | | the patch passed | | +1 :green_heart: | shadedclient | 21m 5s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 2m 11s | | hadoop-aws in the patch passed. | | +1 :green_heart: | asflicense | 0m 24s | | The patch does not generate ASF License warnings. 
| | | | 87m 14s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.45 ServerAPI=1.45 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6723/3/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/6723 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 4f3c0ebb2293 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 60bd00b49f12f28f2033ffe0a0946be73f29ecfa | | Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6723/3/testReport/ | | Max. process+thread count | 727 (vs. ulimit of 5500) | | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6723/3/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > noaa-cors-pds bucket access with global endpoint
[jira] [Commented] (HADOOP-19137) [ABFS]: Extra getAcl call while calling the very first API of FileSystem
[ https://issues.apache.org/jira/browse/HADOOP-19137?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17839962#comment-17839962 ] ASF GitHub Bot commented on HADOOP-19137: - hadoop-yetus commented on PR #6752: URL: https://github.com/apache/hadoop/pull/6752#issuecomment-2071517523 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 19s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 7 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 33m 15s | | trunk passed | | +1 :green_heart: | compile | 0m 20s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | compile | 0m 19s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | checkstyle | 0m 18s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 25s | | trunk passed | | +1 :green_heart: | javadoc | 0m 22s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 0m 20s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 0m 40s | | trunk passed | | +1 :green_heart: | shadedclient | 21m 24s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 19s | | the patch passed | | +1 :green_heart: | compile | 0m 19s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javac | 0m 19s | | the patch passed | | +1 :green_heart: | compile | 0m 17s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | javac | 0m 17s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 12s | | the patch passed | | +1 :green_heart: | mvnsite | 0m 17s | | the patch passed | | +1 :green_heart: | javadoc | 0m 16s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 | | +1 :green_heart: | javadoc | 0m 16s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | +1 :green_heart: | spotbugs | 0m 44s | | the patch passed | | +1 :green_heart: | shadedclient | 21m 36s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 1m 49s | | hadoop-azure in the patch passed. | | +1 :green_heart: | asflicense | 0m 23s | | The patch does not generate ASF License warnings. 
| | | | 87m 9s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.45 ServerAPI=1.45 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6752/10/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/6752 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 73b374a921d5 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 5d095979d33da95599ce9d29c36399addda0690f | | Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6752/10/testReport/ | | Max. process+thread count | 711 (vs. ulimit of 5500) | | modules | C: hadoop-tools/hadoop-azure U: hadoop-tools/hadoop-azure | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6752/10/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. > [ABFS]:Extra getAcl call while calling the
[jira] [Commented] (HADOOP-19102) [ABFS]: FooterReadBufferSize should not be greater than readBufferSize
[ https://issues.apache.org/jira/browse/HADOOP-19102?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17839952#comment-17839952 ] ASF GitHub Bot commented on HADOOP-19102: - saxenapranav commented on PR #6617: URL: https://github.com/apache/hadoop/pull/6617#issuecomment-2071476916 Thank you @steveloughran very much! Have opened a PR against branch-3.4 https://github.com/apache/hadoop/pull/6763. Thank you! > [ABFS]: FooterReadBufferSize should not be greater than readBufferSize > -- > > Key: HADOOP-19102 > URL: https://issues.apache.org/jira/browse/HADOOP-19102 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/azure >Affects Versions: 3.4.0 >Reporter: Pranav Saxena >Assignee: Pranav Saxena >Priority: Major > Labels: pull-request-available > > The method `optimisedRead` creates a buffer array of size `readBufferSize`. > If footerReadBufferSize is greater than readBufferSize, abfs will attempt to > read more data than the buffer array can hold, which causes an exception. > Change: To avoid this, we will keep footerBufferSize = > min(readBufferSizeConfig, footerBufferSizeConfig) > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
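The HADOOP-19102 fix quoted above amounts to clamping the configured footer buffer to the read buffer, since `optimisedRead` allocates only `readBufferSize` bytes. A self-contained sketch of the stated change, `footerBufferSize = min(readBufferSizeConfig, footerBufferSizeConfig)` (the class and method names here are illustrative, not the actual AbfsConfiguration API):

```java
// Hedged sketch of the HADOOP-19102 clamp (illustrative names only): the
// footer read must never request more bytes than the buffer that
// optimisedRead allocates, which is readBufferSize bytes.
public class FooterBufferClamp {
    static int effectiveFooterBufferSize(int readBufferSize, int footerReadBufferSize) {
        // Keep footerBufferSize = min(readBufferSizeConfig, footerBufferSizeConfig)
        return Math.min(readBufferSize, footerReadBufferSize);
    }

    public static void main(String[] args) {
        // 4 MB read buffer with an 8 MB configured footer buffer: without the
        // clamp the read would overrun the 4 MB array and throw.
        System.out.println(effectiveFooterBufferSize(4 * 1024 * 1024, 8 * 1024 * 1024));
    }
}
```

When the configured footer buffer is already smaller than the read buffer, the clamp is a no-op, so only the misconfigured (oversized) case changes behavior.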
[jira] [Commented] (HADOOP-19102) [ABFS]: FooterReadBufferSize should not be greater than readBufferSize
[ https://issues.apache.org/jira/browse/HADOOP-19102?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17839951#comment-17839951 ] ASF GitHub Bot commented on HADOOP-19102: - saxenapranav commented on PR #6763: URL: https://github.com/apache/hadoop/pull/6763#issuecomment-2071475267 -- AGGREGATED TEST RESULT HNS-OAuth [WARNING] Tests run: 137, Failures: 0, Errors: 0, Skipped: 2 [WARNING] Tests run: 617, Failures: 0, Errors: 0, Skipped: 73 [WARNING] Tests run: 380, Failures: 0, Errors: 0, Skipped: 57 HNS-SharedKey [WARNING] Tests run: 137, Failures: 0, Errors: 0, Skipped: 3 [WARNING] Tests run: 617, Failures: 0, Errors: 0, Skipped: 28 [WARNING] Tests run: 380, Failures: 0, Errors: 0, Skipped: 41 NonHNS-SharedKey [WARNING] Tests run: 137, Failures: 0, Errors: 0, Skipped: 9 [WARNING] Tests run: 601, Failures: 0, Errors: 0, Skipped: 268 [WARNING] Tests run: 380, Failures: 0, Errors: 0, Skipped: 44 AppendBlob-HNS-OAuth [WARNING] Tests run: 137, Failures: 0, Errors: 0, Skipped: 2 [WARNING] Tests run: 617, Failures: 0, Errors: 0, Skipped: 75 [WARNING] Tests run: 380, Failures: 0, Errors: 0, Skipped: 81 Time taken: 20 mins 36 secs. azureuser@Hadoop-VM-EAST2:~/hadoop/hadoop-tools/hadoop-azure$ git log commit f7f022f8ef957ff32d3f13eaa3e7e7c245b75406 (HEAD -> saxenapranav/footerBufferSizeFix-3.4, origin/saxenapranav/footerBufferSizeFix-3.4) Author: Pranav Saxena <108325433+saxenapra...@users.noreply.github.com> Date: Mon Apr 22 23:06:12 2024 +0530 HADOOP-19102. 
[ABFS] FooterReadBufferSize should not be greater than readBufferSize (#6617) Contributed by Pranav Saxena > [ABFS]: FooterReadBufferSize should not be greater than readBufferSize > -- > > Key: HADOOP-19102 > URL: https://issues.apache.org/jira/browse/HADOOP-19102 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/azure >Affects Versions: 3.4.0 >Reporter: Pranav Saxena >Assignee: Pranav Saxena >Priority: Major > Labels: pull-request-available > > The method `optimisedRead` creates a buffer array of size `readBufferSize`. > If footerReadBufferSize is greater than readBufferSize, abfs will attempt to > read more data than the buffer array can hold, which causes an exception. > Change: To avoid this, we will keep footerBufferSize = > min(readBufferSizeConfig, footerBufferSizeConfig) > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-19154) upgrade bouncy castle to 1.78.1 due to CVEs
[ https://issues.apache.org/jira/browse/HADOOP-19154?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17839943#comment-17839943 ] ASF GitHub Bot commented on HADOOP-19154: - hadoop-yetus commented on PR #6755: URL: https://github.com/apache/hadoop/pull/6755#issuecomment-2071444504 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 00s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 00s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 00s | | detect-secrets was not available. | | +0 :ok: | shellcheck | 0m 01s | | Shellcheck was not available. | | +0 :ok: | shelldocs | 0m 01s | | Shelldocs was not available. | | +0 :ok: | markdownlint | 0m 01s | | markdownlint was not available. | | +0 :ok: | xmllint | 0m 00s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 00s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 00s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 4m 01s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 88m 53s | | trunk passed | | +1 :green_heart: | compile | 39m 12s | | trunk passed | | -1 :x: | mvnsite | 23m 14s | [/branch-mvnsite-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6755/1/artifact/out/branch-mvnsite-root.txt) | root in trunk failed. | | +1 :green_heart: | javadoc | 15m 06s | | trunk passed | | +1 :green_heart: | shadedclient | 314m 57s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 2m 38s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 92m 00s | | the patch passed | | +1 :green_heart: | compile | 39m 05s | | the patch passed | | +1 :green_heart: | javac | 39m 05s | | the patch passed | | +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. | | -1 :x: | mvnsite | 22m 26s | [/patch-mvnsite-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6755/1/artifact/out/patch-mvnsite-root.txt) | root in the patch failed. | | +1 :green_heart: | javadoc | 15m 39s | | the patch passed | | +1 :green_heart: | shadedclient | 188m 34s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | asflicense | 6m 04s | | The patch does not generate ASF License warnings. | | | | 653m 34s | | | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/6755 | | Optional Tests | dupname asflicense codespell detsecrets shellcheck shelldocs mvnsite markdownlint compile javac javadoc mvninstall unit shadedclient xmllint | | uname | MINGW64_NT-10.0-17763 178c6f9cc74c 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys | | Build tool | maven | | Personality | /c/hadoop/dev-support/bin/hadoop.sh | | git revision | trunk / c9f6b3d37891a0a31607e3a0ff1c035061d4f616 | | Default Java | Azul Systems, Inc.-1.8.0_332-b09 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6755/1/testReport/ | | modules | C: hadoop-project hadoop-cloud-storage-project/hadoop-cos . U: . | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6755/1/console | | versions | git=2.44.0.windows.1 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org | This message was automatically generated. 
> upgrade bouncy castle to 1.78.1 due to CVEs > --- > > Key: HADOOP-19154 > URL: https://issues.apache.org/jira/browse/HADOOP-19154 > Project: Hadoop Common > Issue Type: Improvement > Components: common >Affects Versions: 3.4.0, 3.3.6 >Reporter: PJ Fanning >Priority: Major > Labels: pull-request-available > > [https://www.bouncycastle.org/releasenotes.html#r1rv78] > There is a v1.78.1 release but no notes for it yet. > For v1.78 > h3. 2.1.5 Security Advisories. > Release 1.78 deals with the following CVEs: > * CVE-2024-29857 - Importing an EC certificate with specially crafted F2m > parameters can cause high CPU usage during parameter evaluation. > *
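The HADOOP-19154 upgrade above is, in build terms, a version pin in the project's Maven configuration. A hedged sketch of what such a bump looks like; the property name `bouncycastle.version` is an assumption for illustration and may not match Hadoop's actual pom:

```xml
<!-- Illustrative pom.xml fragment: pin Bouncy Castle to the CVE-fixed release. -->
<properties>
  <!-- property name is assumed for this sketch, not Hadoop's actual one -->
  <bouncycastle.version>1.78.1</bouncycastle.version>
</properties>
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.bouncycastle</groupId>
      <artifactId>bcprov-jdk18on</artifactId>
      <version>${bouncycastle.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Managing the version in one property keeps every module that pulls in Bouncy Castle (directly or transitively) on the patched release, which is why the Yetus report above shows the change touching `hadoop-project` and the root pom.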