[jira] [Commented] (HADOOP-18120) Hadoop auth does not handle HTTP Headers in a case-insensitive way
[ https://issues.apache.org/jira/browse/HADOOP-18120?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17493044#comment-17493044 ] Daniel Fritsi commented on HADOOP-18120: [~weichiu], I won't be able to handle this ticket, sorry. I don't have time right now. That's why I mentioned that it would be great if someone else could check it. Since the coding part is pretty much done, there's not much left. > Hadoop auth does not handle HTTP Headers in a case-insensitive way > -- > > Key: HADOOP-18120 > URL: https://issues.apache.org/jira/browse/HADOOP-18120 > Project: Hadoop Common > Issue Type: Bug > Components: auth > Reporter: Daniel Fritsi > Priority: Critical > Attachments: hadoop-auth-headers.patch > > > According to [RFC-2616|https://www.ietf.org/rfc/rfc2616.txt] HTTP Headers are > case-insensitive. There are proxies / load balancers (e.g.: newer versions of > HAProxy) which deliberately make some of the HTTP headers lower-case, which results > in an authentication / authorization failure inside the Hadoop codebase. > I've created a small patch (I'm from Cloudera): > [^hadoop-auth-headers.patch]. This resolves our authentication issue. Can > someone please have a look at this? -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
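The RFC 2616 rule the report cites — header field names are case-insensitive — can be illustrated with a minimal sketch. This is not the attached hadoop-auth-headers.patch (which is not shown in this thread); the class and method names here are hypothetical:

```java
import java.util.Map;
import java.util.TreeMap;

public class CaseInsensitiveHeaders {
    // Illustrative sketch only: a TreeMap ordered by
    // String.CASE_INSENSITIVE_ORDER treats "WWW-Authenticate",
    // "www-authenticate" and "WWW-AUTHENTICATE" as the same key,
    // which is what RFC 2616 requires of header field names.
    public static String getHeader(Map<String, String> rawHeaders, String name) {
        Map<String, String> headers = new TreeMap<>(String.CASE_INSENSITIVE_ORDER);
        headers.putAll(rawHeaders);
        return headers.get(name);
    }
}
```

A lookup of "WWW-Authenticate" then still succeeds when a proxy such as HAProxy has lower-cased the header on the wire.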
[jira] [Assigned] (HADOOP-18120) Hadoop auth does not handle HTTP Headers in a case-insensitive way
[ https://issues.apache.org/jira/browse/HADOOP-18120?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Daniel Fritsi reassigned HADOOP-18120: -- Assignee: (was: Daniel Fritsi)
[jira] [Work logged] (HADOOP-15980) Enable TLS in RPC client/server
[ https://issues.apache.org/jira/browse/HADOOP-15980?focusedWorklogId=728065&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-728065 ] ASF GitHub Bot logged work on HADOOP-15980: --- Author: ASF GitHub Bot Created on: 16/Feb/22 06:50 Start Date: 16/Feb/22 06:50 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #3966: URL: https://github.com/apache/hadoop/pull/3966#issuecomment-1041170858 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 51s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | shelldocs | 0m 0s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 15 new or modified test files. 
| _ trunk Compile Tests _ | | +0 :ok: | mvndep | 12m 15s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 24m 37s | | trunk passed | | +1 :green_heart: | compile | 24m 17s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 20m 42s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 3m 50s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 6s | | trunk passed | | +1 :green_heart: | javadoc | 2m 33s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 2m 59s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +0 :ok: | spotbugs | 0m 29s | | branch/hadoop-client-modules/hadoop-client-check-invariants no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 29s | | branch/hadoop-client-modules/hadoop-client-minicluster no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 29s | | branch/hadoop-client-modules/hadoop-client-check-test-invariants no spotbugs output file (spotbugsXml.xml) | | +1 :green_heart: | shadedclient | 22m 51s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 23s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 6m 22s | | the patch passed | | +1 :green_heart: | compile | 23m 41s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | -1 :x: | javac | 23m 41s | [/results-compile-javac-root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3966/2/artifact/out/results-compile-javac-root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt) | root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 generated 1 new + 1809 unchanged - 1 fixed = 1810 total (was 1810) | | +1 :green_heart: | compile | 21m 0s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | -1 :x: | javac | 21m 0s | [/results-compile-javac-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3966/2/artifact/out/results-compile-javac-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 generated 1 new + 1686 unchanged - 1 fixed = 1687 total (was 1687) | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 3m 47s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3966/2/artifact/out/results-checkstyle-root.txt) | root: The patch generated 24 new + 505 unchanged - 22 fixed = 529 total (was 527) | | +1 :green_heart: | mvnsite | 3m 4s | | the patch passed | | +1 :green_heart: | shellcheck | 0m 0s | | No new issues. | | +1 :green_heart: | xml | 0m 2s | | The patch has no ill-formed XML file. 
| | -1 :x: | javadoc | 1m 5s | [/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3966/2/artifact/out/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt) | hadoop-common in the patch failed with JDK Ubuntu-11.0.1
[GitHub] [hadoop] hadoop-yetus commented on pull request #3966: [WIP] HADOOP-15980 : Enable TLS in RPC client/server
hadoop-yetus commented on pull request #3966: URL: https://github.com/apache/hadoop/pull/3966#issuecomment-1041170858 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 51s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | shelldocs | 0m 0s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 15 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 12m 15s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 24m 37s | | trunk passed | | +1 :green_heart: | compile | 24m 17s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 20m 42s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 3m 50s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 6s | | trunk passed | | +1 :green_heart: | javadoc | 2m 33s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 2m 59s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +0 :ok: | spotbugs | 0m 29s | | branch/hadoop-client-modules/hadoop-client-check-invariants no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 29s | | branch/hadoop-client-modules/hadoop-client-minicluster no spotbugs output file (spotbugsXml.xml) | | +0 :ok: | spotbugs | 0m 29s | | branch/hadoop-client-modules/hadoop-client-check-test-invariants no spotbugs output file (spotbugsXml.xml) | | +1 :green_heart: | shadedclient | 22m 51s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 23s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 6m 22s | | the patch passed | | +1 :green_heart: | compile | 23m 41s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | -1 :x: | javac | 23m 41s | [/results-compile-javac-root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3966/2/artifact/out/results-compile-javac-root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt) | root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 generated 1 new + 1809 unchanged - 1 fixed = 1810 total (was 1810) | | +1 :green_heart: | compile | 21m 0s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | -1 :x: | javac | 21m 0s | [/results-compile-javac-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3966/2/artifact/out/results-compile-javac-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 generated 1 new + 1686 unchanged - 1 fixed = 1687 total (was 1687) | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 3m 47s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3966/2/artifact/out/results-checkstyle-root.txt) | root: The patch generated 24 new + 505 unchanged - 22 fixed = 529 total (was 527) | | +1 :green_heart: | mvnsite | 3m 4s | | the patch passed | | +1 :green_heart: | shellcheck | 0m 0s | | No new issues. | | +1 :green_heart: | xml | 0m 2s | | The patch has no ill-formed XML file. 
| | -1 :x: | javadoc | 1m 5s | [/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3966/2/artifact/out/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt) | hadoop-common in the patch failed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04. | | +1 :green_heart: | javadoc | 2m 59s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | -1 :x: | spotbugs | 2m 43s | [/new-spotbugs-hadoop-common-project_hadoop-common.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3966/2/artifact/out/new-spotbugs-hadoop-common-project_hadoop-common.html) | hadoop-common-project/hadoop-common
[jira] [Work logged] (HADOOP-18110) ViewFileSystem: Add Support for Localized Trash Root
[ https://issues.apache.org/jira/browse/HADOOP-18110?focusedWorklogId=728052&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-728052 ] ASF GitHub Bot logged work on HADOOP-18110: --- Author: ASF GitHub Bot Created on: 16/Feb/22 06:09 Start Date: 16/Feb/22 06:09 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #3994: URL: https://github.com/apache/hadoop/pull/3994#issuecomment-1041143564 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 56s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ branch-2.10 Compile Tests _ | | +1 :green_heart: | mvninstall | 14m 12s | | branch-2.10 passed | | +1 :green_heart: | compile | 13m 4s | | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | compile | 10m 53s | | branch-2.10 passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | checkstyle | 0m 43s | | branch-2.10 passed | | +1 :green_heart: | mvnsite | 1m 18s | | branch-2.10 passed | | +1 :green_heart: | javadoc | 1m 16s | | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 1m 5s | | branch-2.10 passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | -1 :x: | spotbugs | 2m 1s | [/branch-spotbugs-hadoop-common-project_hadoop-common-warnings.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/2/artifact/out/branch-spotbugs-hadoop-common-project_hadoop-common-warnings.html) | hadoop-common-project/hadoop-common in branch-2.10 has 2 extant spotbugs warnings. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 46s | | the patch passed | | +1 :green_heart: | compile | 12m 27s | | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javac | 12m 27s | | the patch passed | | +1 :green_heart: | compile | 10m 49s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | javac | 10m 49s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 42s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 18s | | the patch passed | | +1 :green_heart: | javadoc | 1m 13s | | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 1m 5s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | spotbugs | 2m 10s | | the patch passed | _ Other Tests _ | | -1 :x: | unit | 9m 25s | [/patch-unit-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/2/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 0m 49s | | The patch does not generate ASF License warnings. 
| | | | 91m 13s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.util.TestReadWriteDiskValidator | | | hadoop.util.TestDiskCheckerWithDiskIo | | | hadoop.fs.sftp.TestSFTPFileSystem | | | hadoop.io.compress.TestCompressorDecompressor | | | hadoop.io.compress.snappy.TestSnappyCompressorDecompressor | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3994 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux 74522267b2e6 4.15.0-65-generic #74-Ubuntu SMP Tue Sep 17 17:06:04 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-2.10 / b0c54fa62d91748706a5c22cb961926d98bf580c | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | Multi-JDK versions | /usr/lib/jvm/zulu-7-amd64:Azul Systems, Inc.-1.7.0_262-b10 /usr/lib/jvm/java-8-openjdk-amd
[GitHub] [hadoop] hadoop-yetus commented on pull request #3994: HADOOP-18110. ViewFileSystem: Add Support for Localized Trash Root
hadoop-yetus commented on pull request #3994: URL: https://github.com/apache/hadoop/pull/3994#issuecomment-1041143564 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 56s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ branch-2.10 Compile Tests _ | | +1 :green_heart: | mvninstall | 14m 12s | | branch-2.10 passed | | +1 :green_heart: | compile | 13m 4s | | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | compile | 10m 53s | | branch-2.10 passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | checkstyle | 0m 43s | | branch-2.10 passed | | +1 :green_heart: | mvnsite | 1m 18s | | branch-2.10 passed | | +1 :green_heart: | javadoc | 1m 16s | | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 1m 5s | | branch-2.10 passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | -1 :x: | spotbugs | 2m 1s | [/branch-spotbugs-hadoop-common-project_hadoop-common-warnings.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/2/artifact/out/branch-spotbugs-hadoop-common-project_hadoop-common-warnings.html) | hadoop-common-project/hadoop-common in branch-2.10 has 2 extant spotbugs warnings. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 46s | | the patch passed | | +1 :green_heart: | compile | 12m 27s | | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javac | 12m 27s | | the patch passed | | +1 :green_heart: | compile | 10m 49s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | javac | 10m 49s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 42s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 18s | | the patch passed | | +1 :green_heart: | javadoc | 1m 13s | | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 1m 5s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | spotbugs | 2m 10s | | the patch passed | _ Other Tests _ | | -1 :x: | unit | 9m 25s | [/patch-unit-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/2/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 0m 49s | | The patch does not generate ASF License warnings. 
| | | | 91m 13s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.util.TestReadWriteDiskValidator | | | hadoop.util.TestDiskCheckerWithDiskIo | | | hadoop.fs.sftp.TestSFTPFileSystem | | | hadoop.io.compress.TestCompressorDecompressor | | | hadoop.io.compress.snappy.TestSnappyCompressorDecompressor | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3994 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux 74522267b2e6 4.15.0-65-generic #74-Ubuntu SMP Tue Sep 17 17:06:04 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-2.10 / b0c54fa62d91748706a5c22cb961926d98bf580c | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | Multi-JDK versions | /usr/lib/jvm/zulu-7-amd64:Azul Systems, Inc.-1.7.0_262-b10 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/2/testReport/ | | Max. process+thread count | 1368 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/2/console | | versions |
[jira] [Commented] (HADOOP-18120) Hadoop auth does not handle HTTP Headers in a case-insensitive way
[ https://issues.apache.org/jira/browse/HADOOP-18120?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17493013#comment-17493013 ] Wei-Chiu Chuang commented on HADOOP-18120: -- It would be nice to add a test to it.
[jira] [Updated] (HADOOP-18120) Hadoop auth does not handle HTTP Headers in a case-insensitive way
[ https://issues.apache.org/jira/browse/HADOOP-18120?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Wei-Chiu Chuang updated HADOOP-18120: - Status: Patch Available (was: Open)
[jira] [Work logged] (HADOOP-18110) ViewFileSystem: Add Support for Localized Trash Root
[ https://issues.apache.org/jira/browse/HADOOP-18110?focusedWorklogId=728032&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-728032 ] ASF GitHub Bot logged work on HADOOP-18110: --- Author: ASF GitHub Bot Created on: 16/Feb/22 04:41 Start Date: 16/Feb/22 04:41 Worklog Time Spent: 10m Work Description: xinglin commented on pull request #3994: URL: https://github.com/apache/hadoop/pull/3994#issuecomment-1041100691 these unit test failures seem to be false alarms. I am able to run and pass these tests on my laptop. | hadoop.util.TestReadWriteDiskValidator | hadoop.fs.sftp.TestSFTPFileSystem | hadoop.io.compress.TestCompressorDecompressor | hadoop.io.compress.snappy.TestSnappyCompressorDecompressor ``` [INFO] --- [INFO] T E S T S [INFO] --- [INFO] Running org.apache.hadoop.util.TestReadWriteDiskValidator [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.278 s - in org.apache.hadoop.util.TestReadWriteDiskValidator [INFO] --- [INFO] T E S T S [INFO] --- [INFO] Running org.apache.hadoop.fs.sftp.TestSFTPFileSystem [INFO] Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.031 s - in org.apache.hadoop.fs.sftp.TestSFTPFileSystem [INFO] --- [INFO] T E S T S [INFO] --- [INFO] Running org.apache.hadoop.io.compress.TestCompressorDecompressor [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.112 s - in org.apache.hadoop.io.compress.TestCompressorDecompressor [INFO] --- [INFO] T E S T S [INFO] --- [INFO] Running org.apache.hadoop.io.compress.snappy.TestSnappyCompressorDecompressor [WARNING] Tests run: 14, Failures: 0, Errors: 0, Skipped: 14, Time elapsed: 0.085 s ``` -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 728032) Time Spent: 3.5h (was: 3h 20m) > ViewFileSystem: Add Support for Localized Trash Root > > > Key: HADOOP-18110 > URL: https://issues.apache.org/jira/browse/HADOOP-18110 > Project: Hadoop Common > Issue Type: Improvement > Components: common >Reporter: Xing Lin >Assignee: Xing Lin >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0 > > Time Spent: 3.5h > Remaining Estimate: 0h > > getTrashRoot() in ViewFileSystem calls getTrashRoot() from underlying > filesystem, to return the trash root. Most of the time, we get a trash root > in user home dir. This can lead to problems when an application wants to > delete a file in a mounted point using moveToTrash() in TrashPolicyDefault, > because we can not rename across multiple filesystems/hdfs namenodes. > > We propose the following extension to getTrashRoot/getTrashRoots in > ViewFileSystem: add a flag to return a localized trash root for > ViewFileSystem. A localized trash root is a trash root which starts from the > root of a mount point (e.g., /mountpointRoot/.Trash/\{user}). > * If CONFIG_VIEWFS_MOUNT_POINT_LOCAL_TRASH is not set to true, or > * when the path p is in a snapshot or an encryption zone, return > * the default trash root in user home dir. > * > * when CONFIG_VIEWFS_MOUNT_POINT_LOCAL_TRASH is set to true, > * 1) if path p is mounted from the same targetFS as user home dir, > * return a trash root in user home dir. > * 2) else, return a trash root in the mounted targetFS > * -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
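The proposal quoted above can be condensed into plain decision logic. This is a hypothetical sketch of the described behavior, not the actual ViewFileSystem API — the real code has to resolve mount points and target filesystems, all of which is elided here into boolean parameters:

```java
public class LocalizedTrashRootSketch {
    // Condensed from the HADOOP-18110 description: return the default
    // trash root under the user's home dir unless the localized-trash
    // flag is on AND the path lives on a different target FS than home
    // (and is not in a snapshot or encryption zone).
    public static String trashRoot(boolean localTrashEnabled,
                                   boolean snapshotOrEncryptionZone,
                                   boolean sameTargetFsAsHome,
                                   String mountPointRoot,
                                   String homeDir,
                                   String user) {
        if (!localTrashEnabled || snapshotOrEncryptionZone || sameTargetFsAsHome) {
            // default trash root in the user's home directory
            return homeDir + "/.Trash/" + user;
        }
        // localized trash root at the root of the mount point,
        // so moveToTrash() never has to rename across filesystems
        return mountPointRoot + "/.Trash/" + user;
    }
}
```

The point of the localized root is that a rename into trash stays within one filesystem/namenode, which TrashPolicyDefault requires.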
[GitHub] [hadoop] luoge457 commented on a change in pull request #3781: YARN-10863: CGroupElasticMemoryController is not work
luoge457 commented on a change in pull request #3781: URL: https://github.com/apache/hadoop/pull/3781#discussion_r807506381 ## File path: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/monitor/ContainersMonitorImpl.java ## @@ -742,7 +742,8 @@ private void checkLimit(ContainerId containerId, String pId, ProcessTreeInfo ptInfo, long currentVmemUsage, long currentPmemUsage) { - if (strictMemoryEnforcement && !elasticMemoryEnforcement) { + if ((strictMemoryEnforcement && !elasticMemoryEnforcement) || + (!strictMemoryEnforcement && elasticMemoryEnforcement)) { Review comment: @Kimahriman please let me know if there is anything to improve, I really want to contribute to the community, thanks very much. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
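The condition in the diff above, `(a && !b) || (!a && b)`, is boolean exclusive-or: the branch fires when exactly one of the two enforcement flags is enabled. For Java booleans this is equivalent to the simpler `a != b` (or `a ^ b`), which a reviewer might suggest. A standalone check of the equivalence:

```java
public class XorCondition {
    // Mirrors the reviewed expression from ContainersMonitorImpl:
    // true when exactly one of the two flags is set.
    public static boolean exactlyOne(boolean strict, boolean elastic) {
        return (strict && !elastic) || (!strict && elastic);
    }
}
```

For all four input combinations, `exactlyOne(a, b)` agrees with `a != b`.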
[jira] [Work logged] (HADOOP-17386) fs.s3a.buffer.dir to be under Yarn container path on yarn applications
[ https://issues.apache.org/jira/browse/HADOOP-17386?focusedWorklogId=728020&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-728020 ] ASF GitHub Bot logged work on HADOOP-17386: --- Author: ASF GitHub Bot Created on: 16/Feb/22 03:11 Start Date: 16/Feb/22 03:11 Worklog Time Spent: 10m Work Description: aajisaka commented on pull request #3908: URL: https://github.com/apache/hadoop/pull/3908#issuecomment-1041050863 Hi @steveloughran, would you review this PR? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 728020) Time Spent: 1h 10m (was: 1h) > fs.s3a.buffer.dir to be under Yarn container path on yarn applications > -- > > Key: HADOOP-17386 > URL: https://issues.apache.org/jira/browse/HADOOP-17386 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.3.0 >Reporter: Steve Loughran >Priority: Major > Labels: pull-request-available > Time Spent: 1h 10m > Remaining Estimate: 0h > > # fs.s3a.buffer.dir defaults to hadoop.tmp.dir which is /tmp or similar > # we use this for storing file blocks during upload > # staging committers use it for all files in a task, which can be a lot more > # a lot of systems don't clean up /tmp until reboot -and if they stay up for > a long time then they accrue files written through s3a staging committer from > spark containers which fail > Fix: use ${env.LOCAL_DIRS:-${hadoop.tmp.dir}}/s3a as the option so that if > env.LOCAL_DIRS is set is used over hadoop.tmp.dir. YARN-deployed apps will > use that for the buffer dir. When the app container is destroyed, so is the > directory. 
-- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
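The proposed fix is a one-line default change; a sketch of the resulting core-site.xml entry (the property name and expansion syntax are from the issue, the surrounding XML is the standard Hadoop configuration format):

```xml
<!-- Buffer S3A upload blocks under the YARN container's local dirs when
     available, falling back to hadoop.tmp.dir elsewhere. The container
     directory is removed with the container, so staged blocks cannot
     accumulate in /tmp. -->
<property>
  <name>fs.s3a.buffer.dir</name>
  <value>${env.LOCAL_DIRS:-${hadoop.tmp.dir}}/s3a</value>
</property>
```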
[jira] [Updated] (HADOOP-17631) Configuration ${env.VAR:-FALLBACK} should eval FALLBACK when restrictSystemProps=true
[ https://issues.apache.org/jira/browse/HADOOP-17631?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Akira Ajisaka updated HADOOP-17631: --- Fix Version/s: 3.4.0 > Configuration ${env.VAR:-FALLBACK} should eval FALLBACK when > restrictSystemProps=true > -- > > Key: HADOOP-17631 > URL: https://issues.apache.org/jira/browse/HADOOP-17631 > Project: Hadoop Common > Issue Type: Bug > Components: common >Affects Versions: 3.3.0 >Reporter: Steve Loughran >Assignee: Steve Loughran >Priority: Minor > Labels: pull-request-available > Fix For: 3.4.0, 3.3.2 > > Time Spent: 1h > Remaining Estimate: 0h > > When configuration reads in resources with a restricted parser, it skips > evaluating system ${env. } vars. But it also skips evaluating fallbacks. > As a result, a property like {{fs.s3a.buffer.dir}} > {code} > ${env.LOCAL_DIRS:-${hadoop.tmp.dir}} ends up evaluating as > ${env.LOCAL_DIRS:-${hadoop.tmp.dir}} > {code} > It should instead fall back to the "env var unset" option of > ${hadoop.tmp.dir}. This allows for configs (like for s3a buffer dirs) which > are usable in restricted mode as well as unrestricted deployments. -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
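The intended fallback behaviour can be shown with a small stand-alone sketch. This is not Hadoop's actual Configuration code, just the substitution rule the issue describes, with a deliberately simplified regex that resolves a single ${env.VAR:-FALLBACK} occurrence:

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class EnvFallbackDemo {
  // Matches ${env.VAR:-FALLBACK}; the greedy group lets the fallback
  // itself contain a nested ${...} reference, as in the s3a buffer-dir
  // default discussed in the issue.
  private static final Pattern ENV_WITH_FALLBACK =
      Pattern.compile("\\$\\{env\\.(\\w+):-(.*)\\}");

  /** Resolve ${env.VAR:-FALLBACK}: env value if set, FALLBACK otherwise. */
  public static String resolve(String value, Map<String, String> env) {
    Matcher m = ENV_WITH_FALLBACK.matcher(value);
    if (!m.find()) {
      return value;
    }
    String envVal = env.get(m.group(1));
    String substituted = envVal != null ? envVal : m.group(2);
    return value.substring(0, m.start()) + substituted + value.substring(m.end());
  }

  public static void main(String[] args) {
    String raw = "${env.LOCAL_DIRS:-${hadoop.tmp.dir}}/s3a";
    // Env var unset: even a restricted parser should still fall back to
    // the inner property reference, per this issue.
    System.out.println(resolve(raw, Map.of()));
    // Env var set (e.g. by YARN): the container-local path wins.
    System.out.println(resolve(raw, Map.of("LOCAL_DIRS", "/data/yarn/local")));
  }
}
```

Run standalone, this prints `${hadoop.tmp.dir}/s3a` for the unset case and `/data/yarn/local/s3a` for the set case, which is exactly the "env var unset" behaviour the bug report says restricted mode was missing.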
[jira] [Updated] (HADOOP-18127) Backport HADOOP-13055 into branch-2.10
[ https://issues.apache.org/jira/browse/HADOOP-18127?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Konstantin Shvachko updated HADOOP-18127: - Target Version/s: 2.10.2 Affects Version/s: 2.10.0 > Backport HADOOP-13055 into branch-2.10 > -- > > Key: HADOOP-18127 > URL: https://issues.apache.org/jira/browse/HADOOP-18127 > Project: Hadoop Common > Issue Type: Sub-task > Components: viewfs >Affects Versions: 2.10.0 >Reporter: Konstantin Shvachko >Priority: Major > > HADOOP-13055 introduced linkMergeSlash and linkFallback for ViewFileSystem. > It would be good to backport it to branch-2.10. -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Created] (HADOOP-18127) Backport HADOOP-13055 into branch-2.10
Konstantin Shvachko created HADOOP-18127: Summary: Backport HADOOP-13055 into branch-2.10 Key: HADOOP-18127 URL: https://issues.apache.org/jira/browse/HADOOP-18127 Project: Hadoop Common Issue Type: Sub-task Components: viewfs Reporter: Konstantin Shvachko HADOOP-13055 introduced linkMergeSlash and linkFallback for ViewFileSystem. It would be good to backport it to branch-2.10. -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
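For reference, the two link types being backported look roughly like this in a ViewFileSystem mount-table configuration (the cluster names and NameNode addresses below are placeholders, not values from the issue):

```xml
<!-- linkFallback: paths that match no explicit mount point fall through
     to this filesystem instead of failing. -->
<property>
  <name>fs.viewfs.mounttable.ClusterX.linkFallback</name>
  <value>hdfs://nn1-clusterx.example.com:8020/home</value>
</property>

<!-- linkMergeSlash: mount an entire filesystem at the ViewFileSystem
     root, instead of enumerating individual mount points. -->
<property>
  <name>fs.viewfs.mounttable.ClusterY.linkMergeSlash</name>
  <value>hdfs://nn1-clustery.example.com:8020/</value>
</property>
```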
[jira] [Work logged] (HADOOP-15980) Enable TLS in RPC client/server
[ https://issues.apache.org/jira/browse/HADOOP-15980?focusedWorklogId=728017&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-728017 ] ASF GitHub Bot logged work on HADOOP-15980: --- Author: ASF GitHub Bot Created on: 16/Feb/22 02:55 Start Date: 16/Feb/22 02:55 Worklog Time Spent: 10m Work Description: vnhive commented on pull request #3966: URL: https://github.com/apache/hadoop/pull/3966#issuecomment-1041043255 > Some quick comments. I haven't gone through the netty code yet. Thanks a ton for the quick review and reply. I have updated the latest patch addressing the comments given by you and some other issues that the automated hadoop build system pointed out. I am currently working on fixing the unit tests. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 728017) Time Spent: 1h 10m (was: 1h) > Enable TLS in RPC client/server > --- > > Key: HADOOP-15980 > URL: https://issues.apache.org/jira/browse/HADOOP-15980 > Project: Hadoop Common > Issue Type: Sub-task > Components: ipc, security >Reporter: Daryn Sharp >Assignee: Daryn Sharp >Priority: Major > Labels: pull-request-available > Time Spent: 1h 10m > Remaining Estimate: 0h > > Once the RPC client and server can be configured to use Netty, the TLS engine > can be added to the channel pipeline. The server should allow QoS-like > functionality to determine if TLS is mandatory or optional for a client. -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-15980) Enable TLS in RPC client/server
[ https://issues.apache.org/jira/browse/HADOOP-15980?focusedWorklogId=728016&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-728016 ] ASF GitHub Bot logged work on HADOOP-15980: --- Author: ASF GitHub Bot Created on: 16/Feb/22 02:53 Start Date: 16/Feb/22 02:53 Worklog Time Spent: 10m Work Description: vnhive commented on a change in pull request #3966: URL: https://github.com/apache/hadoop/pull/3966#discussion_r807491809 ## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/CommonConfigurationKeys.java ## @@ -59,6 +59,16 @@ "ipc.client.rpc-timeout.ms"; /** Default value for IPC_CLIENT_RPC_TIMEOUT_KEY. */ public static final int IPC_CLIENT_RPC_TIMEOUT_DEFAULT = 12; + /** Enable the experimental use of netty instead of nio. */ + public static final String IPC_SERVER_NETTY_ENABLE_KEY = Review comment: I guess what "backward compatible" means here is: the server checks whether the incoming request is SSL-enabled; if yes, it continues with SSL, otherwise it falls back to non-SSL communication. If you meant the above, yes, the goal is to eventually support that behaviour. I guess this is what Daryn meant in his original comment about supporting QoS-like functionality. But in the first version I plan to get SSL working and reject communication from a client that is not SSL-enabled. I will definitely change the patch to be backward compatible in a later revision. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 728016) Time Spent: 1h (was: 50m) > Enable TLS in RPC client/server > --- > > Key: HADOOP-15980 > URL: https://issues.apache.org/jira/browse/HADOOP-15980 > Project: Hadoop Common > Issue Type: Sub-task > Components: ipc, security >Reporter: Daryn Sharp >Assignee: Daryn Sharp >Priority: Major > Labels: pull-request-available > Time Spent: 1h > Remaining Estimate: 0h > > Once the RPC client and server can be configured to use Netty, the TLS engine > can be added to the channel pipeline. The server should allow QoS-like > functionality to determine if TLS is mandatory or optional for a client. -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
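The "optional TLS" behaviour discussed in this review usually comes down to sniffing the first bytes of a new connection before choosing a pipeline. A minimal stand-alone sketch of that check (not code from the patch; just the standard TLS record heuristic such a server could apply):

```java
public class TlsSniffDemo {
  /**
   * Heuristic used by servers that accept both TLS and plaintext on one
   * port: a TLS connection opens with a handshake record, whose content
   * type byte is 0x16, followed by the record-layer major version 0x03.
   */
  public static boolean looksLikeTlsHandshake(byte[] firstBytes) {
    return firstBytes != null
        && firstBytes.length >= 2
        && (firstBytes[0] & 0xFF) == 0x16
        && (firstBytes[1] & 0xFF) == 0x03;
  }

  public static void main(String[] args) {
    // A TLS ClientHello starts 0x16 0x03 ...
    System.out.println(looksLikeTlsHandshake(new byte[] {0x16, 0x03, 0x03}));
    // A plaintext Hadoop RPC connection starts with the "hrpc" magic.
    System.out.println(looksLikeTlsHandshake(new byte[] {'h', 'r', 'p', 'c'}));
  }
}
```

In a Netty pipeline this decision would typically live in a first-pass inbound handler that peeks the buffer, then installs either an SslHandler or the plaintext RPC handlers; rejecting non-TLS clients (the first version described above) is the same check with the plaintext branch closing the connection.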
[jira] [Work logged] (HADOOP-15980) Enable TLS in RPC client/server
[ https://issues.apache.org/jira/browse/HADOOP-15980?focusedWorklogId=728014&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-728014 ] ASF GitHub Bot logged work on HADOOP-15980: --- Author: ASF GitHub Bot Created on: 16/Feb/22 02:50 Start Date: 16/Feb/22 02:50 Worklog Time Spent: 10m Work Description: vnhive commented on a change in pull request #3966: URL: https://github.com/apache/hadoop/pull/3966#discussion_r807490477 ## File path: hadoop-common-project/hadoop-common/pom.xml ## @@ -669,6 +676,36 @@ + +org.apache.maven.plugins +maven-shade-plugin +${maven-shade-plugin.version} + + +package + +shade + + + + +io.netty:netty-all:jar:* + + + +*:netty + + + + +io.netty +hrpc.io.netty Review comment: Changed it to ` io.netty org.apache.hadoop.thirdparty.io.netty ` -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 728014) Time Spent: 50m (was: 40m) > Enable TLS in RPC client/server > --- > > Key: HADOOP-15980 > URL: https://issues.apache.org/jira/browse/HADOOP-15980 > Project: Hadoop Common > Issue Type: Sub-task > Components: ipc, security >Reporter: Daryn Sharp >Assignee: Daryn Sharp >Priority: Major > Labels: pull-request-available > Time Spent: 50m > Remaining Estimate: 0h > > Once the RPC client and server can be configured to use Netty, the TLS engine > can be added to the channel pipeline. The server should allow QoS-like > functionality to determine if TLS is mandatory or optional for a client. -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
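The pom.xml snippet in the comment above lost its XML tags in the mail archive. What the reviewer most likely changed the relocation to is a standard maven-shade-plugin relocation element (reconstructed here, not verbatim from the patch):

```xml
<relocation>
  <pattern>io.netty</pattern>
  <!-- Shade the netty classes under the Hadoop third-party namespace
       instead of the earlier hrpc.io.netty prefix. -->
  <shadedPattern>org.apache.hadoop.thirdparty.io.netty</shadedPattern>
</relocation>
```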
[GitHub] [hadoop] jianghuazhu commented on pull request #3861: HDFS-16316.Improve DirectoryScanner: add regular file check related block.
jianghuazhu commented on pull request #3861: URL: https://github.com/apache/hadoop/pull/3861#issuecomment-1041040826 Thanks for the suggestion, @jojochuang . I re-updated the unit tests and also did some tests. When I remove the fix, the newly added unit test does not succeed, which is expected and does not affect the execution of other unit tests. Here is an example of the test when removing the fix: ![image](https://user-images.githubusercontent.com/6416939/154185727-620eacac-5b4e-4b49-b6f2-1e612017cc35.png) Here is an example during normal testing: ![image](https://user-images.githubusercontent.com/6416939/154186788-f53338e6-2a40-46b1-95c9-59282fa7616b.png) -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-15980) Enable TLS in RPC client/server
[ https://issues.apache.org/jira/browse/HADOOP-15980?focusedWorklogId=728011&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-728011 ] ASF GitHub Bot logged work on HADOOP-15980: --- Author: ASF GitHub Bot Created on: 16/Feb/22 02:49 Start Date: 16/Feb/22 02:49 Worklog Time Spent: 10m Work Description: vnhive commented on a change in pull request #3966: URL: https://github.com/apache/hadoop/pull/3966#discussion_r807490192 ## File path: hadoop-common-project/hadoop-common/pom.xml ## @@ -393,6 +393,13 @@ lz4-java provided + + + io.netty + netty-all + 4.1.27.Final Review comment: Removed the version. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 728011) Time Spent: 40m (was: 0.5h) > Enable TLS in RPC client/server > --- > > Key: HADOOP-15980 > URL: https://issues.apache.org/jira/browse/HADOOP-15980 > Project: Hadoop Common > Issue Type: Sub-task > Components: ipc, security >Reporter: Daryn Sharp >Assignee: Daryn Sharp >Priority: Major > Labels: pull-request-available > Time Spent: 40m > Remaining Estimate: 0h > > Once the RPC client and server can be configured to use Netty, the TLS engine > can be added to the channel pipeline. The server should allow QoS-like > functionality to determine if TLS is mandatory or optional for a client. -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18110) ViewFileSystem: Add Support for Localized Trash Root
[ https://issues.apache.org/jira/browse/HADOOP-18110?focusedWorklogId=728009&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-728009 ] ASF GitHub Bot logged work on HADOOP-18110: --- Author: ASF GitHub Bot Created on: 16/Feb/22 02:38 Start Date: 16/Feb/22 02:38 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #3994: URL: https://github.com/apache/hadoop/pull/3994#issuecomment-1041034080 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 12m 7s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ branch-2.10 Compile Tests _ | | +1 :green_heart: | mvninstall | 14m 16s | | branch-2.10 passed | | +1 :green_heart: | compile | 13m 5s | | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | compile | 10m 44s | | branch-2.10 passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | checkstyle | 0m 42s | | branch-2.10 passed | | +1 :green_heart: | mvnsite | 1m 18s | | branch-2.10 passed | | +1 :green_heart: | javadoc | 1m 14s | | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 1m 3s | | branch-2.10 passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | -1 :x: | spotbugs | 2m 0s | [/branch-spotbugs-hadoop-common-project_hadoop-common-warnings.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/1/artifact/out/branch-spotbugs-hadoop-common-project_hadoop-common-warnings.html) | hadoop-common-project/hadoop-common in branch-2.10 has 2 extant spotbugs warnings. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 44s | | the patch passed | | +1 :green_heart: | compile | 12m 28s | | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javac | 12m 28s | | the patch passed | | +1 :green_heart: | compile | 10m 46s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | javac | 10m 46s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 0m 42s | [/results-checkstyle-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/1/artifact/out/results-checkstyle-hadoop-common-project_hadoop-common.txt) | hadoop-common-project/hadoop-common: The patch generated 1 new + 182 unchanged - 0 fixed = 183 total (was 182) | | +1 :green_heart: | mvnsite | 1m 17s | | the patch passed | | +1 :green_heart: | javadoc | 1m 13s | | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 1m 6s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | spotbugs | 2m 10s | | the patch passed | _ Other Tests _ | | -1 :x: | unit | 9m 27s | [/patch-unit-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/1/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 0m 49s | | The patch does not generate ASF License warnings. 
| | | | 102m 5s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.util.TestReadWriteDiskValidator | | | hadoop.fs.sftp.TestSFTPFileSystem | | | hadoop.io.compress.TestCompressorDecompressor | | | hadoop.io.compress.snappy.TestSnappyCompressorDecompressor | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3994 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux c18a4756a3fd 4.15.0-65-generic #74-Ubuntu SMP Tue Sep 17 17:06:04 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branc
[GitHub] [hadoop] hadoop-yetus commented on pull request #3994: HADOOP-18110. ViewFileSystem: Add Support for Localized Trash Root
hadoop-yetus commented on pull request #3994: URL: https://github.com/apache/hadoop/pull/3994#issuecomment-1041034080 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 12m 7s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ branch-2.10 Compile Tests _ | | +1 :green_heart: | mvninstall | 14m 16s | | branch-2.10 passed | | +1 :green_heart: | compile | 13m 5s | | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | compile | 10m 44s | | branch-2.10 passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | checkstyle | 0m 42s | | branch-2.10 passed | | +1 :green_heart: | mvnsite | 1m 18s | | branch-2.10 passed | | +1 :green_heart: | javadoc | 1m 14s | | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 1m 3s | | branch-2.10 passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | -1 :x: | spotbugs | 2m 0s | [/branch-spotbugs-hadoop-common-project_hadoop-common-warnings.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/1/artifact/out/branch-spotbugs-hadoop-common-project_hadoop-common-warnings.html) | hadoop-common-project/hadoop-common in branch-2.10 has 2 extant spotbugs warnings. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 44s | | the patch passed | | +1 :green_heart: | compile | 12m 28s | | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javac | 12m 28s | | the patch passed | | +1 :green_heart: | compile | 10m 46s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | javac | 10m 46s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 0m 42s | [/results-checkstyle-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/1/artifact/out/results-checkstyle-hadoop-common-project_hadoop-common.txt) | hadoop-common-project/hadoop-common: The patch generated 1 new + 182 unchanged - 0 fixed = 183 total (was 182) | | +1 :green_heart: | mvnsite | 1m 17s | | the patch passed | | +1 :green_heart: | javadoc | 1m 13s | | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 1m 6s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | spotbugs | 2m 10s | | the patch passed | _ Other Tests _ | | -1 :x: | unit | 9m 27s | [/patch-unit-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/1/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 0m 49s | | The patch does not generate ASF License warnings. 
| | | | 102m 5s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.util.TestReadWriteDiskValidator | | | hadoop.fs.sftp.TestSFTPFileSystem | | | hadoop.io.compress.TestCompressorDecompressor | | | hadoop.io.compress.snappy.TestSnappyCompressorDecompressor | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3994 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux c18a4756a3fd 4.15.0-65-generic #74-Ubuntu SMP Tue Sep 17 17:06:04 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-2.10 / 63ccc116c4ff9798f2227fe0d5d2b727fce3079f | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | Multi-JDK versions | /usr/lib/jvm/zulu-7-amd64:Azul Systems, Inc.-1.7.0_262-b10 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3994/1/testReport/ | | Max. process+thread cou
[jira] [Work logged] (HADOOP-18122) ViewFileSystem fails on determining owning group when primary group doesn't exist for user
[ https://issues.apache.org/jira/browse/HADOOP-18122?focusedWorklogId=727999&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727999 ] ASF GitHub Bot logged work on HADOOP-18122: --- Author: ASF GitHub Bot Created on: 16/Feb/22 02:12 Start Date: 16/Feb/22 02:12 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #3987: URL: https://github.com/apache/hadoop/pull/3987#issuecomment-1041020448 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 52s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 36m 23s | | trunk passed | | +1 :green_heart: | compile | 27m 36s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 23m 29s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 4s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 42s | | trunk passed | | +1 :green_heart: | javadoc | 1m 10s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 42s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 51s | | trunk passed | | +1 :green_heart: | shadedclient | 26m 7s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 7s | | the patch passed | | +1 :green_heart: | compile | 23m 59s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 23m 59s | | the patch passed | | +1 :green_heart: | compile | 20m 39s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 20m 39s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 0s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 35s | | the patch passed | | +1 :green_heart: | javadoc | 1m 4s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 33s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 37s | | the patch passed | | +1 :green_heart: | shadedclient | 25m 28s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 17m 27s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 0m 48s | | The patch does not generate ASF License warnings. 
| | | | 219m 22s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3987/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3987 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux ef188856c382 4.15.0-163-generic #171-Ubuntu SMP Fri Nov 5 11:55:11 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / be45af5669015a9ec672a84e3eabfdca460dab33 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3987/2/testReport/ | | Max. process+thread count | 2906 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3987/2/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
[GitHub] [hadoop] hadoop-yetus commented on pull request #3987: HADOOP-18122. ViewFileSystem fails on determining owning group when primary group doesn't exist for user
hadoop-yetus commented on pull request #3987: URL: https://github.com/apache/hadoop/pull/3987#issuecomment-1041020448 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 52s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 36m 23s | | trunk passed | | +1 :green_heart: | compile | 27m 36s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 23m 29s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 4s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 42s | | trunk passed | | +1 :green_heart: | javadoc | 1m 10s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 42s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 51s | | trunk passed | | +1 :green_heart: | shadedclient | 26m 7s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 7s | | the patch passed | | +1 :green_heart: | compile | 23m 59s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 23m 59s | | the patch passed | | +1 :green_heart: | compile | 20m 39s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 20m 39s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. 
| | +1 :green_heart: | checkstyle | 1m 0s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 35s | | the patch passed | | +1 :green_heart: | javadoc | 1m 4s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 33s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 2m 37s | | the patch passed | | +1 :green_heart: | shadedclient | 25m 28s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 17m 27s | | hadoop-common in the patch passed. | | +1 :green_heart: | asflicense | 0m 48s | | The patch does not generate ASF License warnings. | | | | 219m 22s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3987/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3987 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux ef188856c382 4.15.0-163-generic #171-Ubuntu SMP Fri Nov 5 11:55:11 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / be45af5669015a9ec672a84e3eabfdca460dab33 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3987/2/testReport/ | | Max. process+thread count | 2906 (vs. 
ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3987/2/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org --
[jira] [Work logged] (HADOOP-18110) ViewFileSystem: Add Support for Localized Trash Root
[ https://issues.apache.org/jira/browse/HADOOP-18110?focusedWorklogId=727970&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727970 ] ASF GitHub Bot logged work on HADOOP-18110: --- Author: ASF GitHub Bot Created on: 16/Feb/22 00:55 Start Date: 16/Feb/22 00:55 Worklog Time Spent: 10m Work Description: xinglin opened a new pull request #3994: URL: https://github.com/apache/hadoop/pull/3994 Fixes #3956 (cherry picked from commit ca8ba24051b7fca4612c9c182cb49f5183ce33ba) ### Description of PR Cherry-pick the patch from trunk to branch-2.10, with a few other changes, to make it work for branch-2.10. ### How was this patch tested? Manually ran the new test cases from IntelliJ. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 727970) Time Spent: 3h 10m (was: 3h) > ViewFileSystem: Add Support for Localized Trash Root > > > Key: HADOOP-18110 > URL: https://issues.apache.org/jira/browse/HADOOP-18110 > Project: Hadoop Common > Issue Type: Improvement > Components: common >Reporter: Xing Lin >Assignee: Xing Lin >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0 > > Time Spent: 3h 10m > Remaining Estimate: 0h > > getTrashRoot() in ViewFileSystem calls getTrashRoot() from underlying > filesystem, to return the trash root. Most of the time, we get a trash root > in user home dir. This can lead to problems when an application wants to > delete a file in a mounted point using moveToTrash() in TrashPolicyDefault, > because we can not rename across multiple filesystems/hdfs namenodes. 
> > We propose the following extension to getTrashRoot/getTrashRoots in > ViewFileSystem: add a flag to return a localized trash root for > ViewFileSystem. A localized trash root is a trash root which starts from the > root of a mount point (e.g., /mountpointRoot/.Trash/\{user}). > * If CONFIG_VIEWFS_MOUNT_POINT_LOCAL_TRASH is not set to true, or > * when the path p is in a snapshot or an encryption zone, return > * the default trash root in user home dir. > * > * when CONFIG_VIEWFS_MOUNT_POINT_LOCAL_TRASH is set to true, > * 1) if path p is mounted from the same targetFS as user home dir, > * return a trash root in user home dir. > * 2) else, return a trash root in the mounted targetFS > * -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
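The branching in the proposal above can be sketched without the Hadoop APIs. This is a minimal, dependency-free illustration only: the boolean parameters stand in for the real mount-table and snapshot/encryption-zone checks and are not actual ViewFileSystem methods.

```java
// Dependency-free sketch of the localized trash root selection proposed in
// HADOOP-18110. The boolean parameters stand in for the real mount-table
// and snapshot/encryption-zone checks; they are NOT actual ViewFileSystem APIs.
public class LocalTrashRootSketch {
    static String trashRoot(boolean localTrashEnabled,
                            boolean inSnapshotOrEncryptionZone,
                            boolean sameTargetFsAsHome,
                            String mountPointRoot,
                            String homeTrash,
                            String user) {
        // Flag off, or path in a snapshot / encryption zone: default behavior,
        // i.e. the trash root under the user's home directory.
        if (!localTrashEnabled || inSnapshotOrEncryptionZone) {
            return homeTrash;
        }
        // Mount point backed by the same target FS as the home dir: a rename
        // into the home trash still works, so keep using it.
        if (sameTargetFsAsHome) {
            return homeTrash;
        }
        // Otherwise a rename would cross filesystems/namenodes, so localize
        // the trash under the mount point root: /mountpointRoot/.Trash/{user}.
        return mountPointRoot + "/.Trash/" + user;
    }

    public static void main(String[] args) {
        System.out.println(trashRoot(true, false, false,
                "/data", "/user/alice/.Trash", "alice"));
        // prints /data/.Trash/alice
    }
}
```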
[GitHub] [hadoop] xinglin opened a new pull request #3994: HADOOP-18110. ViewFileSystem: Add Support for Localized Trash Root
xinglin opened a new pull request #3994: URL: https://github.com/apache/hadoop/pull/3994 Fixes #3956 (cherry picked from commit ca8ba24051b7fca4612c9c182cb49f5183ce33ba) ### Description of PR Cherry-pick the patch from trunk to branch-2.10, with a few other changes, to make it work for branch-2.10. ### How was this patch tested? Manually ran the new test cases from IntelliJ. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] tasanuma merged pull request #3992: HDFS-15745. Make DataNodePeerMetrics#LOW_THRESHOLD_MS and MIN_OUTLIER_DETECTION_NODES configurable. Contributed by Haibin Huang.
tasanuma merged pull request #3992: URL: https://github.com/apache/hadoop/pull/3992 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] tasanuma commented on pull request #3992: HDFS-15745. Make DataNodePeerMetrics#LOW_THRESHOLD_MS and MIN_OUTLIER_DETECTION_NODES configurable. Contributed by Haibin Huang.
tasanuma commented on pull request #3992: URL: https://github.com/apache/hadoop/pull/3992#issuecomment-1040943937 The failed tests seem not to be related. I'm merging it. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] tomscut commented on pull request #3828: HDFS-16397. Reconfig slow disk parameters for datanode
tomscut commented on pull request #3828: URL: https://github.com/apache/hadoop/pull/3828#issuecomment-1040929250 Hi @tasanuma @ayushtkn @Hexiaoqiao , could you please review this PR? Thanks. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18126) junit-vintage tests seem to be failing
[ https://issues.apache.org/jira/browse/HADOOP-18126?focusedWorklogId=727923&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727923 ] ASF GitHub Bot logged work on HADOOP-18126: --- Author: ASF GitHub Bot Created on: 15/Feb/22 23:42 Start Date: 15/Feb/22 23:42 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #3993: URL: https://github.com/apache/hadoop/pull/3993#issuecomment-1040910653 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 41s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 32m 8s | | trunk passed | | +1 :green_heart: | compile | 0m 23s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 0m 22s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | mvnsite | 0m 27s | | trunk passed | | +1 :green_heart: | javadoc | 0m 25s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 0m 25s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | shadedclient | 53m 46s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 15s | | the patch passed | | +1 :green_heart: | compile | 0m 14s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 0m 14s | | the patch passed | | +1 :green_heart: | compile | 0m 13s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 0m 13s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 0m 15s | | the patch passed | | +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. | | +1 :green_heart: | javadoc | 0m 14s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 0m 15s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | -1 :x: | shadedclient | 7m 5s | | patch has errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 0m 15s | | hadoop-project in the patch passed. | | +1 :green_heart: | asflicense | 0m 29s | | The patch does not generate ASF License warnings. 
| | | | 65m 8s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3993/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3993 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell xml | | uname | Linux 719a196847cd 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / d02e501d95f46c87ca06a599489e62d46fe49c60 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3993/1/testReport/ | | Max. process+thread count | 548 (vs. ulimit of 5500) | | modules | C: hadoop-project U: hadoop-project | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3993/1/console | | versions | git=2.25.1 maven=3.6.3 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please
[GitHub] [hadoop] hadoop-yetus commented on pull request #3993: [HADOOP-18126] update junit due to build issues
hadoop-yetus commented on pull request #3993: URL: https://github.com/apache/hadoop/pull/3993#issuecomment-1040910653 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 41s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 32m 8s | | trunk passed | | +1 :green_heart: | compile | 0m 23s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 0m 22s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | mvnsite | 0m 27s | | trunk passed | | +1 :green_heart: | javadoc | 0m 25s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 0m 25s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | shadedclient | 53m 46s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 15s | | the patch passed | | +1 :green_heart: | compile | 0m 14s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 0m 14s | | the patch passed | | +1 :green_heart: | compile | 0m 13s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 0m 13s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. 
| | +1 :green_heart: | mvnsite | 0m 15s | | the patch passed | | +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. | | +1 :green_heart: | javadoc | 0m 14s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 0m 15s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | -1 :x: | shadedclient | 7m 5s | | patch has errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 0m 15s | | hadoop-project in the patch passed. | | +1 :green_heart: | asflicense | 0m 29s | | The patch does not generate ASF License warnings. | | | | 65m 8s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3993/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3993 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell xml | | uname | Linux 719a196847cd 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / d02e501d95f46c87ca06a599489e62d46fe49c60 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3993/1/testReport/ | | Max. process+thread count | 548 (vs. 
ulimit of 5500) | | modules | C: hadoop-project U: hadoop-project | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3993/1/console | | versions | git=2.25.1 maven=3.6.3 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apa
[jira] [Work logged] (HADOOP-18109) Ensure that default permissions of directories under internal ViewFS directories are the same as directories on target filesystems
[ https://issues.apache.org/jira/browse/HADOOP-18109?focusedWorklogId=727919&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727919 ] ASF GitHub Bot logged work on HADOOP-18109: --- Author: ASF GitHub Bot Created on: 15/Feb/22 23:38 Start Date: 15/Feb/22 23:38 Worklog Time Spent: 10m Work Description: shvachko commented on pull request #3953: URL: https://github.com/apache/hadoop/pull/3953#issuecomment-1040908572 +1 on the latest patch. Will commit this shortly -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 727919) Time Spent: 2h 20m (was: 2h 10m) > Ensure that default permissions of directories under internal ViewFS > directories are the same as directories on target filesystems > -- > > Key: HADOOP-18109 > URL: https://issues.apache.org/jira/browse/HADOOP-18109 > Project: Hadoop Common > Issue Type: Bug > Components: viewfs >Reporter: Chentao Yu >Assignee: Chentao Yu >Priority: Major > Labels: pull-request-available > Time Spent: 2h 20m > Remaining Estimate: 0h > > * Ensure that default permissions of directories under internal ViewFS > directories are the same as directories on target filesystems > * Add new unit test -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] shvachko commented on pull request #3953: HADOOP-18109. Ensure that default permissions of directories under internal ViewFS directories are the same as directories on target filesys
shvachko commented on pull request #3953: URL: https://github.com/apache/hadoop/pull/3953#issuecomment-1040908572 +1 on the latest patch. Will commit this shortly -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-18126) junit-vintage tests seem to be failing
[ https://issues.apache.org/jira/browse/HADOOP-18126?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated HADOOP-18126: Labels: pull-request-available (was: ) > junit-vintage tests seem to be failing > -- > > Key: HADOOP-18126 > URL: https://issues.apache.org/jira/browse/HADOOP-18126 > Project: Hadoop Common > Issue Type: Bug > Components: build >Reporter: PJ Fanning >Priority: Major > Labels: pull-request-available > Time Spent: 10m > Remaining Estimate: 0h > > {code:java} > Feb 11, 2022 11:31:43 AM org.junit.platform.launcher.core.DefaultLauncher > handleThrowable WARNING: TestEngine with ID 'junit-vintage' failed to > discover tests org.junit.platform.commons.JUnitException: Failed to parse > version of junit:junit: 4.13.2 at > org.junit.vintage.engine.JUnit4VersionCheck.parseVersion(JUnit4VersionCheck.java:54) > {code} > [https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3980/1/artifact/out/patch-unit-root.txt] > seems like junit.vintage.version=5.5.1 is incompatible with > junit.version=4.13.2 > see 2nd answer on > [https://stackoverflow.com/questions/59900637/error-testengine-with-id-junit-vintage-failed-to-discover-tests-with-spring] > my plan is to upgrade junit.vintage.version and junit.jupiter.version to 5.8.2 > -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
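The version alignment the reporter describes amounts to a small change to the Maven version properties. A hedged sketch follows; the property names are taken from the issue text, so verify them against the actual hadoop-project/pom.xml before applying anything like this.

```xml
<!-- Sketch of the proposed fix: keep JUnit 4 at 4.13.2 but move the JUnit 5
     jupiter/vintage artifacts to a release whose JUnit4VersionCheck can
     parse "4.13.2". Property names follow the issue description. -->
<properties>
  <junit.version>4.13.2</junit.version>
  <junit.jupiter.version>5.8.2</junit.jupiter.version>
  <junit.vintage.version>5.8.2</junit.vintage.version>
</properties>
```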
[jira] [Work logged] (HADOOP-18126) junit-vintage tests seem to be failing
[ https://issues.apache.org/jira/browse/HADOOP-18126?focusedWorklogId=727878&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727878 ] ASF GitHub Bot logged work on HADOOP-18126: --- Author: ASF GitHub Bot Created on: 15/Feb/22 22:36 Start Date: 15/Feb/22 22:36 Worklog Time Spent: 10m Work Description: pjfanning opened a new pull request #3993: URL: https://github.com/apache/hadoop/pull/3993 ### Description of PR Fix for broken tests ### How was this patch tested? ### For code changes: - [X] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [X] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 727878) Remaining Estimate: 0h Time Spent: 10m > junit-vintage tests seem to be failing > -- > > Key: HADOOP-18126 > URL: https://issues.apache.org/jira/browse/HADOOP-18126 > Project: Hadoop Common > Issue Type: Bug > Components: build >Reporter: PJ Fanning >Priority: Major > Time Spent: 10m > Remaining Estimate: 0h > > {code:java} > Feb 11, 2022 11:31:43 AM org.junit.platform.launcher.core.DefaultLauncher > handleThrowable WARNING: TestEngine with ID 'junit-vintage' failed to > discover tests org.junit.platform.commons.JUnitException: Failed to parse > version of junit:junit: 4.13.2 at > org.junit.vintage.engine.JUnit4VersionCheck.parseVersion(JUnit4VersionCheck.java:54) > {code} > [https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3980/1/artifact/out/patch-unit-root.txt] > seems like junit.vintage.version=5.5.1 is incompatible with > junit.version=4.13.2 > see 2nd answer on > [https://stackoverflow.com/questions/59900637/error-testengine-with-id-junit-vintage-failed-to-discover-tests-with-spring] > my plan is to upgrade junit.vintage.version and junit.jupiter.version to 5.8.2 > -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] pjfanning opened a new pull request #3993: [HADOOP-18126] update junit due to build issues
pjfanning opened a new pull request #3993: URL: https://github.com/apache/hadoop/pull/3993 ### Description of PR Fix for broken tests ### How was this patch tested? ### For code changes: - [X] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [X] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18122) ViewFileSystem fails on determining owning group when primary group doesn't exist for user
[ https://issues.apache.org/jira/browse/HADOOP-18122?focusedWorklogId=727876&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727876 ] ASF GitHub Bot logged work on HADOOP-18122: --- Author: ASF GitHub Bot Created on: 15/Feb/22 22:34 Start Date: 15/Feb/22 22:34 Worklog Time Spent: 10m Work Description: cheyu2022 commented on a change in pull request #3987: URL: https://github.com/apache/hadoop/pull/3987#discussion_r807374580 ## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/viewfs/ViewFileSystem.java ## @@ -1359,6 +1365,7 @@ public InternalDirOfViewFs(final InodeTree.INodeDir dir, showMountLinksAsSymlinks = config .getBoolean(CONFIG_VIEWFS_MOUNT_LINKS_AS_SYMLINKS, CONFIG_VIEWFS_MOUNT_LINKS_AS_SYMLINKS_DEFAULT); + this.config = config; Review comment: > else the ugi logic... Yeah and we can set mountLinkGroupName & mountLinkUserName for the ugi logic so that we can retrieve the info directly next time we call the getters. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 727876) Time Spent: 1h 20m (was: 1h 10m) > ViewFileSystem fails on determining owning group when primary group doesn't > exist for user > -- > > Key: HADOOP-18122 > URL: https://issues.apache.org/jira/browse/HADOOP-18122 > Project: Hadoop Common > Issue Type: Bug >Reporter: Chentao Yu >Assignee: Chentao Yu >Priority: Major > Labels: pull-request-available > Time Spent: 1h 20m > Remaining Estimate: 0h > > ViewFileSystem should not fail on determining owning group when primary group > doesn't exist for user -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] cheyu2022 commented on a change in pull request #3987: HADOOP-18122. ViewFileSystem fails on determining owning group when primary group doesn't exist for user
cheyu2022 commented on a change in pull request #3987: URL: https://github.com/apache/hadoop/pull/3987#discussion_r807374580 ## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/viewfs/ViewFileSystem.java ## @@ -1359,6 +1365,7 @@ public InternalDirOfViewFs(final InodeTree.INodeDir dir, showMountLinksAsSymlinks = config .getBoolean(CONFIG_VIEWFS_MOUNT_LINKS_AS_SYMLINKS, CONFIG_VIEWFS_MOUNT_LINKS_AS_SYMLINKS_DEFAULT); + this.config = config; Review comment: > else the ugi logic... Yeah and we can set mountLinkGroupName & mountLinkUserName for the ugi logic so that we can retrieve the info directly next time we call the getters. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
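The fallback the reviewers discuss above — resolve the owning user/group once, cache it, and never fail when the user has no primary group — can be sketched without the Hadoop classes. The `Supplier` below stands in for `UserGroupInformation.getPrimaryGroupName()` (which throws when no group exists); the field and parameter names are illustrative, not the actual patch.

```java
import java.util.function.Supplier;

// Dependency-free sketch of the fallback discussed in this review thread:
// try the primary-group lookup once, and if it fails because the user has
// no primary group, fall back to a configured value and cache the result.
// The Supplier stands in for UserGroupInformation.getPrimaryGroupName();
// names like cachedGroupName are illustrative (cf. mountLinkGroupName).
public class OwningGroupSketch {
    private String cachedGroupName;  // analogue of the proposed cached field

    String owningGroup(Supplier<String> primaryGroupLookup, String configuredFallback) {
        if (cachedGroupName != null) {
            return cachedGroupName;  // later calls: served from the cache
        }
        try {
            cachedGroupName = primaryGroupLookup.get();
        } catch (RuntimeException e) {
            // No primary group for this user: do not fail, use the fallback.
            cachedGroupName = configuredFallback;
        }
        return cachedGroupName;
    }
}
```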
[jira] [Commented] (HADOOP-18109) Ensure that default permissions of directories under internal ViewFS directories are the same as directories on target filesystems
[ https://issues.apache.org/jira/browse/HADOOP-18109?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17492882#comment-17492882 ] Konstantin Shvachko commented on HADOOP-18109: -- Linking jiras related to this. > Ensure that default permissions of directories under internal ViewFS > directories are the same as directories on target filesystems > -- > > Key: HADOOP-18109 > URL: https://issues.apache.org/jira/browse/HADOOP-18109 > Project: Hadoop Common > Issue Type: Bug > Components: viewfs >Reporter: Chentao Yu >Assignee: Chentao Yu >Priority: Major > Labels: pull-request-available > Time Spent: 2h 10m > Remaining Estimate: 0h > > * Ensure that default permissions of directories under internal ViewFS > directories are the same as directories on target filesystems > * Add new unit test -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-18109) Ensure that default permissions of directories under internal ViewFS directories are the same as directories on target filesystems
[ https://issues.apache.org/jira/browse/HADOOP-18109?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Konstantin Shvachko updated HADOOP-18109: - Component/s: viewfs Target Version/s: 2.10.1 > Ensure that default permissions of directories under internal ViewFS > directories are the same as directories on target filesystems > -- > > Key: HADOOP-18109 > URL: https://issues.apache.org/jira/browse/HADOOP-18109 > Project: Hadoop Common > Issue Type: Bug > Components: viewfs >Reporter: Chentao Yu >Assignee: Chentao Yu >Priority: Major > Labels: pull-request-available > Time Spent: 2h 10m > Remaining Estimate: 0h > > * Ensure that default permissions of directories under internal ViewFS > directories are the same as directories on target filesystems > * Add new unit test -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #3861: HDFS-16316. Improve DirectoryScanner: add regular file check related block.
hadoop-yetus commented on pull request #3861: URL: https://github.com/apache/hadoop/pull/3861#issuecomment-1040835489 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 43s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 12m 48s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 25m 22s | | trunk passed | | +1 :green_heart: | compile | 26m 32s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 23m 0s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 4m 8s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 21s | | trunk passed | | +1 :green_heart: | javadoc | 2m 30s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 3m 36s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 5m 57s | | trunk passed | | +1 :green_heart: | shadedclient | 24m 8s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 29s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 2m 23s | | the patch passed | | +1 :green_heart: | compile | 23m 11s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 23m 11s | | the patch passed | | +1 :green_heart: | compile | 21m 53s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 21m 53s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 3m 33s | | the patch passed | | +1 :green_heart: | mvnsite | 3m 22s | | the patch passed | | +1 :green_heart: | javadoc | 2m 18s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 3m 31s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 6m 34s | | the patch passed | | +1 :green_heart: | shadedclient | 24m 1s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 17m 55s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 230m 38s | | hadoop-hdfs in the patch passed. | | +1 :green_heart: | asflicense | 1m 7s | | The patch does not generate ASF License warnings. 
| | | | 471m 50s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3861/7/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3861 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux dbbf79d9ac98 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / f5e27e408d9aa8f1e563d139d35a001375e19f7f | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3861/7/testReport/ | | Max. process+thread count | 3544 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs U: . | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3861/7/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL
[jira] [Work logged] (HADOOP-13386) Upgrade Avro to 1.8.x or later
[ https://issues.apache.org/jira/browse/HADOOP-13386?focusedWorklogId=727780&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727780 ] ASF GitHub Bot logged work on HADOOP-13386: --- Author: ASF GitHub Bot Created on: 15/Feb/22 20:10 Start Date: 15/Feb/22 20:10 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #3990: URL: https://github.com/apache/hadoop/pull/3990#issuecomment-1040742476 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 57s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | shelldocs | 0m 1s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 12m 23s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 25m 26s | | trunk passed | | +1 :green_heart: | compile | 24m 13s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 20m 40s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | mvnsite | 26m 42s | | trunk passed | | +1 :green_heart: | javadoc | 8m 18s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 8m 23s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | shadedclient | 40m 17s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 29s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 31m 13s | | the patch passed | | +1 :green_heart: | compile | 23m 40s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | -1 :x: | javac | 23m 40s | [/results-compile-javac-root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3990/3/artifact/out/results-compile-javac-root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt) | root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 generated 1 new + 1806 unchanged - 1 fixed = 1807 total (was 1807) | | +1 :green_heart: | compile | 20m 37s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | -1 :x: | javac | 20m 37s | [/results-compile-javac-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3990/3/artifact/out/results-compile-javac-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 generated 1 new + 1684 unchanged - 1 fixed = 1685 total (was 1685) | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 22m 15s | | the patch passed | | +1 :green_heart: | shellcheck | 0m 0s | | No new issues. | | +1 :green_heart: | xml | 0m 9s | | The patch has no ill-formed XML file. | | +1 :green_heart: | javadoc | 8m 13s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 8m 13s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | shadedclient | 40m 4s | | patch has no errors when building and testing our client artifacts. 
| _ Other Tests _ | | -1 :x: | unit | 195m 12s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3990/3/artifact/out/patch-unit-root.txt) | root in the patch failed. | | +1 :green_heart: | asflicense | 1m 30s | | The patch does not generate ASF License warnings. | | | | 488m 10s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.hdfs.server.datanode.TestDataNodeReconfiguration | | | hadoop.hdfs.server.datanode.TestDirectoryScanner | | | hadoop.hdfs.server.datanode.TestDataNodeMetrics | | | hadoop.hdfs.server.datanode.TestBlockScanne
[GitHub] [hadoop] hadoop-yetus commented on pull request #3990: [HADOOP-13386] upgrade to avro 1.9.2
hadoop-yetus commented on pull request #3990: URL: https://github.com/apache/hadoop/pull/3990#issuecomment-1040742476 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 57s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | shelldocs | 0m 1s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 12m 23s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 25m 26s | | trunk passed | | +1 :green_heart: | compile | 24m 13s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 20m 40s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | mvnsite | 26m 42s | | trunk passed | | +1 :green_heart: | javadoc | 8m 18s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 8m 23s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | shadedclient | 40m 17s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 29s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 31m 13s | | the patch passed | | +1 :green_heart: | compile | 23m 40s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | -1 :x: | javac | 23m 40s | [/results-compile-javac-root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3990/3/artifact/out/results-compile-javac-root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt) | root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 generated 1 new + 1806 unchanged - 1 fixed = 1807 total (was 1807) | | +1 :green_heart: | compile | 20m 37s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | -1 :x: | javac | 20m 37s | [/results-compile-javac-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3990/3/artifact/out/results-compile-javac-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 generated 1 new + 1684 unchanged - 1 fixed = 1685 total (was 1685) | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 22m 15s | | the patch passed | | +1 :green_heart: | shellcheck | 0m 0s | | No new issues. | | +1 :green_heart: | xml | 0m 9s | | The patch has no ill-formed XML file. | | +1 :green_heart: | javadoc | 8m 13s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 8m 13s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | shadedclient | 40m 4s | | patch has no errors when building and testing our client artifacts. 
| _ Other Tests _ | | -1 :x: | unit | 195m 12s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3990/3/artifact/out/patch-unit-root.txt) | root in the patch failed. | | +1 :green_heart: | asflicense | 1m 30s | | The patch does not generate ASF License warnings. | | | | 488m 10s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.hdfs.server.datanode.TestDataNodeReconfiguration | | | hadoop.hdfs.server.datanode.TestDirectoryScanner | | | hadoop.hdfs.server.datanode.TestDataNodeMetrics | | | hadoop.hdfs.server.datanode.TestBlockScanner | | | hadoop.hdfs.server.datanode.TestDataNodeFaultInjector | | | hadoop.hdfs.server.datanode.TestBatchIbr | | | hadoop.io.TestEnumSetWritable | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3990/3/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/
[GitHub] [hadoop] hadoop-yetus commented on pull request #3828: HDFS-16397. Reconfig slow disk parameters for datanode
hadoop-yetus commented on pull request #3828: URL: https://github.com/apache/hadoop/pull/3828#issuecomment-1040706117 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 41s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 32m 18s | | trunk passed | | +1 :green_heart: | compile | 1m 28s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 1m 20s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 4s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 30s | | trunk passed | | +1 :green_heart: | javadoc | 1m 4s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 34s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 19s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 32s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 16s | | the patch passed | | +1 :green_heart: | compile | 1m 21s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 1m 21s | | the patch passed | | +1 :green_heart: | compile | 1m 11s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 1m 11s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. 
| | -0 :warning: | checkstyle | 0m 51s | [/results-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3828/5/artifact/out/results-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs-project/hadoop-hdfs: The patch generated 3 new + 136 unchanged - 2 fixed = 139 total (was 138) | | +1 :green_heart: | mvnsite | 1m 17s | | the patch passed | | +1 :green_heart: | javadoc | 0m 50s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 1m 24s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 13s | | the patch passed | | +1 :green_heart: | shadedclient | 22m 6s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 226m 8s | | hadoop-hdfs in the patch passed. | | +1 :green_heart: | asflicense | 0m 47s | | The patch does not generate ASF License warnings. 
| | | | 325m 11s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3828/5/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3828 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux dcca2c281b61 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 10536590e771f077d4ffdbfb9fe92112fc40254e | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3828/5/testReport/ | | Max. process+thread count | 3263 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3828/5/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log
[jira] [Work logged] (HADOOP-18122) ViewFileSystem fails on determining owning group when primary group doesn't exist for user
[ https://issues.apache.org/jira/browse/HADOOP-18122?focusedWorklogId=727708&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727708 ] ASF GitHub Bot logged work on HADOOP-18122: --- Author: ASF GitHub Bot Created on: 15/Feb/22 19:11 Start Date: 15/Feb/22 19:11 Worklog Time Spent: 10m Work Description: ayushtkn commented on a change in pull request #3987: URL: https://github.com/apache/hadoop/pull/3987#discussion_r807180151 ## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/viewfs/ViewFileSystem.java ## @@ -1359,6 +1365,7 @@ public InternalDirOfViewFs(final InodeTree.INodeDir dir, showMountLinksAsSymlinks = config .getBoolean(CONFIG_VIEWFS_MOUNT_LINKS_AS_SYMLINKS, CONFIG_VIEWFS_MOUNT_LINKS_AS_SYMLINKS_DEFAULT); + this.config = config; Review comment: I don't think you need to store the config. You already have it here, so just check for your conf params and initialise mountLinkGroupName & mountLinkUserName. In the getter methods, return these values if they aren't null; otherwise fall back to the ugi logic... -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 727708) Time Spent: 1h 10m (was: 1h) > ViewFileSystem fails on determining owning group when primary group doesn't > exist for user > -- > > Key: HADOOP-18122 > URL: https://issues.apache.org/jira/browse/HADOOP-18122 > Project: Hadoop Common > Issue Type: Bug >Reporter: Chentao Yu >Assignee: Chentao Yu >Priority: Major > Labels: pull-request-available > Time Spent: 1h 10m > Remaining Estimate: 0h > > ViewFileSystem should not fail on determining owning group when primary group > doesn't exist for user -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] ayushtkn commented on a change in pull request #3987: HADOOP-18122. ViewFileSystem fails on determining owning group when primary group doesn't exist for user
ayushtkn commented on a change in pull request #3987: URL: https://github.com/apache/hadoop/pull/3987#discussion_r807180151 ## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/viewfs/ViewFileSystem.java ## @@ -1359,6 +1365,7 @@ public InternalDirOfViewFs(final InodeTree.INodeDir dir, showMountLinksAsSymlinks = config .getBoolean(CONFIG_VIEWFS_MOUNT_LINKS_AS_SYMLINKS, CONFIG_VIEWFS_MOUNT_LINKS_AS_SYMLINKS_DEFAULT); + this.config = config; Review comment: I don't think you need to store the config. You already have it here, so just check for your conf params and initialise mountLinkGroupName & mountLinkUserName. In the getter methods, return these values if they aren't null; otherwise fall back to the ugi logic... -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
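A minimal sketch of the pattern suggested in the review: read the conf params once at construction time, and fall back to the UGI lookup only when no override is configured. This is an illustrative assumption, not the actual ViewFileSystem change; the class name and config keys are invented, and a plain Map and Supplier stand in for Hadoop's Configuration and UserGroupInformation.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Hypothetical helper illustrating the review suggestion: cache the
// configured owner/group names eagerly instead of keeping a reference to
// the whole config, and fall back to the (potentially expensive) UGI
// lookup only when no override is present.
class MountLinkOwnerResolver {
    private final String mountLinkUserName;   // null when not configured
    private final String mountLinkGroupName;  // null when not configured
    private final Supplier<String> ugiUser;   // fallback, e.g. UGI-based lookup
    private final Supplier<String> ugiGroup;

    MountLinkOwnerResolver(Map<String, String> config,
                           Supplier<String> ugiUser,
                           Supplier<String> ugiGroup) {
        // Initialise once in the constructor; no need to store `config`.
        // The keys below are made up for this sketch.
        this.mountLinkUserName = config.get("fs.viewfs.mount.link.user");
        this.mountLinkGroupName = config.get("fs.viewfs.mount.link.group");
        this.ugiUser = ugiUser;
        this.ugiGroup = ugiGroup;
    }

    String getOwner() {
        // Return the cached override if present, else the ugi logic.
        return mountLinkUserName != null ? mountLinkUserName : ugiUser.get();
    }

    String getGroup() {
        return mountLinkGroupName != null ? mountLinkGroupName : ugiGroup.get();
    }
}
```

This keeps the getters cheap after construction, which is what the follow-up comment about retrieving the info directly on later calls is after.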
[GitHub] [hadoop] goiri merged pull request #3971: HDFS-16440. RBF: Support router get HAServiceStatus with Lifeline RPC address
goiri merged pull request #3971: URL: https://github.com/apache/hadoop/pull/3971 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #3992: HDFS-15745. Make DataNodePeerMetrics#LOW_THRESHOLD_MS and MIN_OUTLIER_DETECTION_NODES configurable. Contributed by Haibin Huang.
hadoop-yetus commented on pull request #3992: URL: https://github.com/apache/hadoop/pull/3992#issuecomment-1040638017 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 9m 48s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ branch-3.3 Compile Tests _ | | +1 :green_heart: | mvninstall | 33m 37s | | branch-3.3 passed | | +1 :green_heart: | compile | 1m 13s | | branch-3.3 passed | | +1 :green_heart: | checkstyle | 0m 49s | | branch-3.3 passed | | +1 :green_heart: | mvnsite | 1m 19s | | branch-3.3 passed | | +1 :green_heart: | javadoc | 1m 25s | | branch-3.3 passed | | +1 :green_heart: | spotbugs | 3m 14s | | branch-3.3 passed | | +1 :green_heart: | shadedclient | 27m 9s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 11s | | the patch passed | | +1 :green_heart: | compile | 1m 6s | | the patch passed | | +1 :green_heart: | javac | 1m 6s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 0m 41s | | the patch passed | | +1 :green_heart: | mvnsite | 1m 15s | | the patch passed | | +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. 
| | +1 :green_heart: | javadoc | 1m 16s | | the patch passed | | +1 :green_heart: | spotbugs | 3m 16s | | the patch passed | | +1 :green_heart: | shadedclient | 26m 59s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | -1 :x: | unit | 210m 34s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3992/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch failed. | | +1 :green_heart: | asflicense | 0m 37s | | The patch does not generate ASF License warnings. | | | | 322m 44s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.hdfs.server.blockmanagement.TestBlockTokenWithDFSStriped | | | hadoop.hdfs.server.datanode.TestDirectoryScanner | | | hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewer | | | hadoop.hdfs.TestRollingUpgrade | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3992/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3992 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell xml | | uname | Linux 84d43b1571d0 4.15.0-163-generic #171-Ubuntu SMP Fri Nov 5 11:55:11 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-3.3 / 81aa7a942b0af7d854b94431e34dd731bdb343c7 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3992/1/testReport/ | | Max. process+thread count | 2163 (vs. 
ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3992/1/console | | versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[jira] [Work logged] (HADOOP-18125) Utility to identify git commit / Jira fixVersion discrepancies for RC preparation
[ https://issues.apache.org/jira/browse/HADOOP-18125?focusedWorklogId=727568&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727568 ] ASF GitHub Bot logged work on HADOOP-18125: --- Author: ASF GitHub Bot Created on: 15/Feb/22 19:02 Start Date: 15/Feb/22 19:02 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #3991: URL: https://github.com/apache/hadoop/pull/3991#issuecomment-1039247698 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 727568) Time Spent: 50m (was: 40m) > Utility to identify git commit / Jira fixVersion discrepancies for RC > preparation > - > > Key: HADOOP-18125 > URL: https://issues.apache.org/jira/browse/HADOOP-18125 > Project: Hadoop Common > Issue Type: Task >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Major > Labels: pull-request-available > Time Spent: 50m > Remaining Estimate: 0h > > As part of RC preparation, we need to identify all git commits that landed > on the release branch; however, their corresponding Jiras are either not resolved > yet or do not contain the expected fixVersions. Only when we have git commits > and corresponding Jiras with the expected fixVersion resolved do we get all such > Jiras included in the auto-generated CHANGES.md per the Yetus changelog generator. > This Jira proposes to provide a script that can be useful for all > upcoming RC preparations and list all Jiras where we need manual > intervention. This utility script should use the Jira API to retrieve individual > fields and git log to loop through the commit history. 
> The script should identify these issues: > # the commit is reverted as per its commit message > # the commit message does not contain a Jira number in the expected format (e.g. HADOOP- / HDFS- > etc.) > # the Jira does not have the expected fixVersion > # the Jira has the expected fixVersion, but is not yet resolved > # the Jira has the release's corresponding fixVersion and is resolved, but no > corresponding commit is yet found > It can take the following inputs: > # First commit hash from which to start excluding commits from history > # Fix Version > # JIRA Project Name > # Path of the project's working dir > # Jira server URL -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
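Two of the checks listed in the description, revert detection and extracting the Jira number from a commit message, can be sketched with a small regex helper. This is an illustrative assumption, not the actual utility: the class name and the issue-key pattern are invented for the example, and the real script would additionally walk `git log` and query the Jira REST API.

```java
import java.util.Optional;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical helper covering two of the described checks:
// (1) is the commit a revert, judging by its message?
// (2) which Jira key (if any) does the message reference?
class CommitMessageChecker {
    // Matches issue keys such as HADOOP-18125 or HDFS-16440.
    private static final Pattern JIRA_KEY =
        Pattern.compile("\\b(HADOOP|HDFS|YARN|MAPREDUCE)-\\d+\\b");

    // Reverts produced by `git revert` start with the word "Revert".
    static boolean isRevert(String message) {
        return message.startsWith("Revert ");
    }

    // First Jira key found in the message, or empty if none.
    static Optional<String> jiraKey(String message) {
        Matcher m = JIRA_KEY.matcher(message);
        return m.find() ? Optional.of(m.group()) : Optional.empty();
    }
}
```

Commits where `jiraKey` comes back empty would fall under issue 2 in the list above; keys it does find would then be checked against Jira for fixVersion and resolution status.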
[GitHub] [hadoop] hadoop-yetus commented on pull request #3991: HADOOP-18125. Utility to identify git commit / Jira fixVersion discrepancies for RC preparation
hadoop-yetus commented on pull request #3991: URL: https://github.com/apache/hadoop/pull/3991#issuecomment-1039247698 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [hadoop] tasanuma merged pull request #3827: HDFS-16396. Reconfig slow peer parameters for datanode
tasanuma merged pull request #3827: URL: https://github.com/apache/hadoop/pull/3827 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[jira] [Work logged] (HADOOP-18122) ViewFileSystem fails on determining owning group when primary group doesn't exist for user
[ https://issues.apache.org/jira/browse/HADOOP-18122?focusedWorklogId=727518&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727518 ]

ASF GitHub Bot logged work on HADOOP-18122:
---
Author: ASF GitHub Bot
Created on: 15/Feb/22 18:58
Start Date: 15/Feb/22 18:58
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on pull request #3987:
URL: https://github.com/apache/hadoop/pull/3987#issuecomment-1039807399

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 52s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 35m 8s | | trunk passed |
| +1 :green_heart: | compile | 24m 24s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | compile | 20m 41s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 1m 2s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 35s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 8s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 1m 40s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 2m 27s | | trunk passed |
| +1 :green_heart: | shadedclient | 25m 9s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 0s | | the patch passed |
| +1 :green_heart: | compile | 23m 37s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javac | 23m 37s | | the patch passed |
| +1 :green_heart: | compile | 20m 47s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 20m 47s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 58s | | the patch passed |
| +1 :green_heart: | mvnsite | 1m 33s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 4s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 1m 35s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 2m 37s | | the patch passed |
| +1 :green_heart: | shadedclient | 25m 35s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 17m 27s | | hadoop-common in the patch passed. |
| +1 :green_heart: | asflicense | 0m 48s | | The patch does not generate ASF License warnings. |
| | | 211m 0s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3987/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/3987 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 736c2b72df4b 4.15.0-163-generic #171-Ubuntu SMP Fri Nov 5 11:55:11 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 8aabc9e786349055f1a43a701ee6ec9cc26a9c72 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3987/1/testReport/ |
| Max. process+thread count | 1350 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3987/1/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
[GitHub] [hadoop] aajisaka commented on pull request #3989: YARN-10788. TestCsiClient fails
aajisaka commented on pull request #3989: URL: https://github.com/apache/hadoop/pull/3989#issuecomment-1039273693 Merged. Thank you @ayushtkn !
[GitHub] [hadoop] jianghuazhu commented on a change in pull request #3861: HDFS-16316.Improve DirectoryScanner: add regular file check related block.
jianghuazhu commented on a change in pull request #3861:
URL: https://github.com/apache/hadoop/pull/3861#discussion_r806465949

## File path: hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/server/datanode/TestDirectoryScanner.java

@@ -507,6 +509,71 @@ public void testDeleteBlockOnTransientStorage() throws Exception {
     }
   }
+  @Test(timeout = 60)
+  public void testRegularBlock() throws Exception {
+    // add a logger stream to check what has printed to log
+    ByteArrayOutputStream loggerStream = new ByteArrayOutputStream();

Review comment: Yes, it was my mistake.
[GitHub] [hadoop] hadoop-yetus commented on pull request #3987: HADOOP-18122. ViewFileSystem fails on determining owning group when primary group doesn't exist for user
hadoop-yetus commented on pull request #3987:
URL: https://github.com/apache/hadoop/pull/3987#issuecomment-1039807399

:confetti_ball: **+1 overall**

This message was automatically generated.
[GitHub] [hadoop] jianghuazhu commented on pull request #3861: HDFS-16316.Improve DirectoryScanner: add regular file check related block.
jianghuazhu commented on pull request #3861: URL: https://github.com/apache/hadoop/pull/3861#issuecomment-1039826045
[GitHub] [hadoop] Neilxzn commented on a change in pull request #3983: HDFS-16455. RBF: Add `zk-dt-secret-manager.jute.maxbuffer` property for Router's ZKDelegationTokenManager
Neilxzn commented on a change in pull request #3983:
URL: https://github.com/apache/hadoop/pull/3983#discussion_r806389802

## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/delegation/ZKDelegationTokenSecretManager.java

@@ -199,6 +202,10 @@ public ZKDelegationTokenSecretManager(Configuration conf) {
         ZK_DTSM_ZK_SESSION_TIMEOUT_DEFAULT);
     int numRetries = conf.getInt(ZK_DTSM_ZK_NUM_RETRIES,
         ZK_DTSM_ZK_NUM_RETRIES_DEFAULT);
+    String juteMaxBuffer =
+        conf.get(ZK_DTSM_ZK_JUTE_MAXBUFFER, ZK_DTSM_ZK_JUTE_MAXBUFFER_DEFAULT);

Review comment: Thank you for your review. Fixed it.
[GitHub] [hadoop] goiri commented on pull request #3971: HDFS-16440 RBF: Support router get HAServiceStatus with Lifeline RPC address
goiri commented on pull request #3971: URL: https://github.com/apache/hadoop/pull/3971#issuecomment-1039354689 It would be nice to have a full Yetus run; not sure what happened with the previous one.
[GitHub] [hadoop] aajisaka merged pull request #3989: YARN-10788. TestCsiClient fails
aajisaka merged pull request #3989: URL: https://github.com/apache/hadoop/pull/3989
[GitHub] [hadoop] jianghuazhu removed a comment on pull request #3861: HDFS-16316.Improve DirectoryScanner: add regular file check related block.
jianghuazhu removed a comment on pull request #3861:
URL: https://github.com/apache/hadoop/pull/3861#issuecomment-1039826045

Here are some examples from online clusters. We construct a block device file such as:
![image](https://user-images.githubusercontent.com/6416939/153989107-901a87e4-4b1c-44f4-a654-225ce495ede1.png)
This file is non-standard. This kind of file is found while DirectoryScanner is working. Log:
`2022-02-15 11:24:10,286 WARN org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Block:1073741828 is not a regular file.`
`2022-02-15 11:24:10,286 WARN org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Reporting the block blk_1073741828_0 as corrupt due to length mismatch`
Then the DataNode tells the NameNode that there are some unqualified blocks through NameNodeRpcServer#reportBadBlocks(). After the NameNode gets the data, it processes it further. After a period of time, the DataNode automatically cleans up these unqualified replicas.
![image](https://user-images.githubusercontent.com/6416939/153989296-65e0230c-031c-4fd0-ace1-d247f15791b3.png)
Can you help review this PR again, @jojochuang? Thank you so much.
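The "not a regular file" condition described above can be illustrated with a small sketch. This is Python for brevity and is only an analogue of the check; the real DataNode logic is Java inside DirectoryScanner/FsDatasetImpl.

```python
import os
import stat

# Sketch of the check being discussed: a block file on disk should be a
# plain (regular) file, not a device node, FIFO, or other special file.
def is_regular_file(path):
    """True when `path` exists and is a regular file (symlinks are followed)."""
    try:
        mode = os.stat(path).st_mode
    except OSError:
        # Missing or unreadable paths fail the check too.
        return False
    return stat.S_ISREG(mode)
```

A block device file like the one in the screenshot would fail `stat.S_ISREG`, which is the condition that triggers the "is not a regular file" warning quoted above.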
[jira] [Work logged] (HADOOP-18119) ViewFileSystem#getUri should return a URI with a empty path component
[ https://issues.apache.org/jira/browse/HADOOP-18119?focusedWorklogId=727438&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727438 ]

ASF GitHub Bot logged work on HADOOP-18119:
---
Author: ASF GitHub Bot
Created on: 15/Feb/22 18:51
Start Date: 15/Feb/22 18:51
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on pull request #3979:
URL: https://github.com/apache/hadoop/pull/3979#issuecomment-1039678349

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 39s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 32m 6s | | trunk passed |
| +1 :green_heart: | compile | 22m 18s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | compile | 19m 33s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 1m 7s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 41s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 13s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 1m 45s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 2m 30s | | trunk passed |
| +1 :green_heart: | shadedclient | 22m 8s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 57s | | the patch passed |
| +1 :green_heart: | compile | 21m 40s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javac | 21m 40s | | the patch passed |
| +1 :green_heart: | compile | 19m 45s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 19m 45s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 7s | | the patch passed |
| +1 :green_heart: | mvnsite | 1m 39s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 12s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 1m 43s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 2m 39s | | the patch passed |
| +1 :green_heart: | shadedclient | 22m 23s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 17m 41s | | hadoop-common in the patch passed. |
| +1 :green_heart: | asflicense | 0m 54s | | The patch does not generate ASF License warnings. |
| | | 196m 55s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3979/3/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/3979 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 530bf60df458 4.15.0-156-generic #163-Ubuntu SMP Thu Aug 19 23:31:58 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 2dc341cfe51caeb6ddc262c5b35849ec11a5de0c |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3979/3/testReport/ |
| Max. process+thread count | 1258 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3979/3/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
[GitHub] [hadoop] hadoop-yetus commented on pull request #3979: HADOOP-18119. ViewFileSystem#getUri should return a URI with an empty path component
hadoop-yetus commented on pull request #3979:
URL: https://github.com/apache/hadoop/pull/3979#issuecomment-1039678349

:confetti_ball: **+1 overall**

This message was automatically generated.
[GitHub] [hadoop] tomscut commented on pull request #3827: HDFS-16396. Reconfig slow peer parameters for datanode
tomscut commented on pull request #3827: URL: https://github.com/apache/hadoop/pull/3827#issuecomment-1039826206 Thanks @tasanuma and @ayushtkn for the review and for confirming this.
[jira] [Work logged] (HADOOP-18117) Add an option to preserve root directory permissions
[ https://issues.apache.org/jira/browse/HADOOP-18117?focusedWorklogId=727392&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727392 ]

ASF GitHub Bot logged work on HADOOP-18117:
---
Author: ASF GitHub Bot
Created on: 15/Feb/22 18:47
Start Date: 15/Feb/22 18:47
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on pull request #3970:
URL: https://github.com/apache/hadoop/pull/3970#issuecomment-1039276152

Issue Time Tracking
---
Worklog Id: (was: 727392)
Time Spent: 2.5h (was: 2h 20m)

> Add an option to preserve root directory permissions
>
> Key: HADOOP-18117
> URL: https://issues.apache.org/jira/browse/HADOOP-18117
> Project: Hadoop Common
> Issue Type: Improvement
> Reporter: Mohanad Elsafty
> Assignee: Mohanad Elsafty
> Priority: Minor
> Labels: pull-request-available
> Time Spent: 2.5h
> Remaining Estimate: 0h
>
> As mentioned in https://issues.apache.org/jira/browse/HADOOP-15211
>
> If *-update* or *-overwrite* is passed when *distcp* is used, the root directory will be skipped in two places (CopyListing#doBuildListing & CopyCommitter#preserveFileAttributesForDirectories), which ignores the root directory's attributes.
>
> We face the same issue when using distcp to copy huge data between clusters, and it takes too much effort to update the root directories' attributes manually.
>
> From the earlier ticket it's obvious why this behaviour is there, but sometimes we need to enforce a root directory update, hence I will add a new option for distcp to let someone (who understands the need for this and knows what they are doing) enforce the update of the root directory's attributes (permissions, ownership, ...).
>
> It should be simple, something like this:
> {code:java}
> $ hadoop distcp -p -update -updateRootDirectoryAttributes /a/b/c /a/b/d {code}
> This behaviour is optional and will be *false* by default (it should not affect existing *distcp* users).
[GitHub] [hadoop] hadoop-yetus commented on pull request #3970: HADOOP-18117. Add an option to preserve root directory permissions
hadoop-yetus commented on pull request #3970: URL: https://github.com/apache/hadoop/pull/3970#issuecomment-1039276152
[GitHub] [hadoop] jojochuang commented on a change in pull request #3861: HDFS-16316.Improve DirectoryScanner: add regular file check related block.
jojochuang commented on a change in pull request #3861:
URL: https://github.com/apache/hadoop/pull/3861#discussion_r806448723

## File path: hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/server/datanode/TestDirectoryScanner.java

@@ -507,6 +509,71 @@ public void testDeleteBlockOnTransientStorage() throws Exception {
     }
   }
+  @Test(timeout = 60)
+  public void testRegularBlock() throws Exception {
+    // add a logger stream to check what has printed to log
+    ByteArrayOutputStream loggerStream = new ByteArrayOutputStream();

Review comment: Can you use the Hadoop utility class LogCapturer https://github.com/apache/hadoop/blob/6342d5e523941622a140fd877f06e9b59f48c48b/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/test/GenericTestUtils.java#L533 for this purpose?
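For readers unfamiliar with the pattern being requested: LogCapturer-style testing attaches an in-memory handler to a logger, runs the code under test, then asserts on the captured text. A rough Python analogue follows; this is illustrative only and is not the Hadoop `GenericTestUtils.LogCapturer` API.

```python
import io
import logging

# Minimal capture-then-assert helper, analogous in spirit to Hadoop's
# LogCapturer: redirect a logger's output into a string buffer.
class LogCapturer:
    def __init__(self, logger):
        self.buffer = io.StringIO()
        self.handler = logging.StreamHandler(self.buffer)
        self._logger = logger
        logger.addHandler(self.handler)

    def output(self):
        """Return everything logged since capture started."""
        self.handler.flush()
        return self.buffer.getvalue()

    def stop(self):
        """Detach the capturing handler from the logger."""
        self._logger.removeHandler(self.handler)

# Usage: capture a warning like the one DirectoryScanner emits.
log = logging.getLogger("scanner")
log.setLevel(logging.WARNING)
cap = LogCapturer(log)
log.warning("Block:1073741828 is not a regular file.")
cap.stop()
```

A test then asserts on `cap.output()` instead of wiring a raw byte stream into the logging framework by hand, which is the simplification the review suggests.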
[jira] [Comment Edited] (HADOOP-18073) Upgrade AWS SDK to v2
[ https://issues.apache.org/jira/browse/HADOOP-18073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17471377#comment-17471377 ] Steve Loughran edited comment on HADOOP-18073 at 2/15/22, 6:41 PM: --- That is going to be a pretty traumatic update. Currently we are just moving to 1.12 in HADOOP-18068. I believe the API is radically different. One concern is that it drops the transfer manager, which we used for copy/rename and uploading from the local FS. I see there is now a preview implementation of that... If it does not include any regressions then it should be possible to use.
Otherwise someone is going to have to implement the parallelized block upload/copy in the S3A code. I'm not going to volunteer for this. If you want to contribute it - it is certainly something which ultimately we would like. In the meantime, S3A does take session credentials. If you can use the SSO mechanism and the AWS CLI to generate a set, then you set the relevant properties (ideally in a JCEKS file) and use them for the life of the credentials. You will be able to use the session delegation tokens to propagate those secrets from your machine to the cluster - so you can deploy the cluster in EC2 with lower privileges than the users. You also have the option of providing your own AWS credential provider and delegation token implementation. FWIW, some of the Cloudera products do exactly this to let someone go from kerberos auth to session credentials for their assigned roles. > Upgrade AWS SDK to v2 > - > > Key: HADOOP-18073 > URL: https://issues.apache.org/jira/browse/HADOOP-18073 > Project: Hadoop Common > Issue Type: Sub-task > Components: auth, fs/s3 >Affects Versions: 3.3.1 >Reporter: xiaowei sun >Priority: Major > > We would like to access s3 with AWS SSO, which is supported in > software.amazon.awssdk:sdk-core:2.*. > In particular, from > [https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html], > when setting 'fs.s3a.aws.credentials.provider', it must be > "com.amazonaws.auth.AWSCredentialsProvider". We would like to support > "software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider" which > supports AWS SSO, so users only need to authenticate once. -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
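The session-credential route Steve describes can be sketched as S3A configuration. The property names below are the documented S3A ones; the values are placeholders, and in practice the secrets should be kept in a JCEKS credential file (created with the `hadoop credential` CLI) rather than in plain XML:

```xml
<!-- Sketch: short-lived session credentials for S3A. Placeholder values. -->
<property>
  <name>fs.s3a.aws.credentials.provider</name>
  <value>org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider</value>
</property>
<property>
  <name>fs.s3a.access.key</name>
  <value>placeholder-access-key</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>placeholder-secret-key</value>
</property>
<property>
  <name>fs.s3a.session.token</name>
  <value>placeholder-session-token</value>
</property>
```

Once the session expires, the three credential values have to be regenerated; the provider class stays the same.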
[jira] [Updated] (HADOOP-18073) Upgrade AWS SDK to v2
[ https://issues.apache.org/jira/browse/HADOOP-18073?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Steve Loughran updated HADOOP-18073: Parent: HADOOP-18067 Issue Type: Sub-task (was: Improvement) > Upgrade AWS SDK to v2 > - > > Key: HADOOP-18073 > URL: https://issues.apache.org/jira/browse/HADOOP-18073 > Project: Hadoop Common > Issue Type: Sub-task > Components: auth, fs/s3 >Affects Versions: 3.3.1 >Reporter: xiaowei sun >Priority: Major > > We would like to access s3 with AWS SSO, which is supported in > software.amazon.awssdk:sdk-core:2.*. > In particular, from > [https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html], > when setting 'fs.s3a.aws.credentials.provider', it must be > "com.amazonaws.auth.AWSCredentialsProvider". We would like to support > "software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider" which > supports AWS SSO, so users only need to authenticate once. -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] goiri commented on a change in pull request #3983: HDFS-16455. RBF: Add `zk-dt-secret-manager.jute.maxbuffer` property for Router's ZKDelegationTokenManager
goiri commented on a change in pull request #3983: URL: https://github.com/apache/hadoop/pull/3983#discussion_r806077863 ## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/delegation/ZKDelegationTokenSecretManager.java ## @@ -98,6 +98,8 @@ + "kerberos.keytab"; public static final String ZK_DTSM_ZK_KERBEROS_PRINCIPAL = ZK_CONF_PREFIX + "kerberos.principal"; + public static final String ZK_DTSM_ZK_JUTE_MAXBUFFER = ZK_CONF_PREFIX + + "jute.maxbuffer"; Review comment: The indentation is not correct. Check the checkstyle. ## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/delegation/ZKDelegationTokenSecretManager.java ## @@ -199,6 +202,10 @@ public ZKDelegationTokenSecretManager(Configuration conf) { ZK_DTSM_ZK_SESSION_TIMEOUT_DEFAULT); int numRetries = conf.getInt(ZK_DTSM_ZK_NUM_RETRIES, ZK_DTSM_ZK_NUM_RETRIES_DEFAULT); +String juteMaxBuffer = +conf.get(ZK_DTSM_ZK_JUTE_MAXBUFFER, ZK_DTSM_ZK_JUTE_MAXBUFFER_DEFAULT); Review comment: Indentation fix. ## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/delegation/ZKDelegationTokenSecretManager.java ## @@ -199,6 +202,10 @@ public ZKDelegationTokenSecretManager(Configuration conf) { ZK_DTSM_ZK_SESSION_TIMEOUT_DEFAULT); int numRetries = conf.getInt(ZK_DTSM_ZK_NUM_RETRIES, ZK_DTSM_ZK_NUM_RETRIES_DEFAULT); +String juteMaxBuffer = +conf.get(ZK_DTSM_ZK_JUTE_MAXBUFFER, ZK_DTSM_ZK_JUTE_MAXBUFFER_DEFAULT); +System.setProperty(ZKClientConfig.JUTE_MAXBUFFER, + juteMaxBuffer); Review comment: This could go to the previous line. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
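The change being reviewed follows a common pattern: read a value from the Hadoop `Configuration` and mirror it into the JVM-wide system property that the ZooKeeper client library consults. A minimal stand-alone sketch of that pattern, using `java.util.Properties` in place of Hadoop's `Configuration` (the constant names mirror the patch; ZooKeeper's `ZKClientConfig.JUTE_MAXBUFFER` resolves to the string `"jute.maxbuffer"`):

```java
import java.util.Properties;

// Minimal stand-in for the reviewed ZKDelegationTokenSecretManager change:
// copy a configured jute.maxbuffer value into the JVM-wide system property
// that ZooKeeper's client reads. This is a sketch, not the Hadoop class.
public class JuteMaxBufferDemo {
    static final String ZK_CONF_PREFIX = "zk-dt-secret-manager.";
    static final String ZK_DTSM_ZK_JUTE_MAXBUFFER = ZK_CONF_PREFIX + "jute.maxbuffer";
    // ZooKeeper's usual built-in default is 0xfffff (1048575) bytes.
    static final String ZK_DTSM_ZK_JUTE_MAXBUFFER_DEFAULT = "1048575";
    static final String JUTE_MAXBUFFER = "jute.maxbuffer"; // ZKClientConfig.JUTE_MAXBUFFER

    /** Reads the configured value (or the default) and mirrors it into the system property. */
    static String applyJuteMaxBuffer(Properties conf) {
        String juteMaxBuffer =
            conf.getProperty(ZK_DTSM_ZK_JUTE_MAXBUFFER, ZK_DTSM_ZK_JUTE_MAXBUFFER_DEFAULT);
        System.setProperty(JUTE_MAXBUFFER, juteMaxBuffer);
        return juteMaxBuffer;
    }

    public static void main(String[] args) {
        Properties conf = new Properties();
        conf.setProperty(ZK_DTSM_ZK_JUTE_MAXBUFFER, "4194304"); // 4 MiB
        System.out.println(applyJuteMaxBuffer(conf));           // prints 4194304
    }
}
```

Because `System.setProperty` is process-wide, the last caller wins, which is worth keeping in mind when several ZooKeeper clients share one JVM.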
[jira] [Work logged] (HADOOP-18109) Ensure that default permissions of directories under internal ViewFS directories are the same as directories on target filesystems
[ https://issues.apache.org/jira/browse/HADOOP-18109?focusedWorklogId=727304&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727304 ] ASF GitHub Bot logged work on HADOOP-18109: --- Author: ASF GitHub Bot Created on: 15/Feb/22 18:38 Start Date: 15/Feb/22 18:38 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #3953: URL: https://github.com/apache/hadoop/pull/3953#issuecomment-1039738874 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 727304) Time Spent: 2h 10m (was: 2h) > Ensure that default permissions of directories under internal ViewFS > directories are the same as directories on target filesystems > -- > > Key: HADOOP-18109 > URL: https://issues.apache.org/jira/browse/HADOOP-18109 > Project: Hadoop Common > Issue Type: Bug >Reporter: Chentao Yu >Assignee: Chentao Yu >Priority: Major > Labels: pull-request-available > Time Spent: 2h 10m > Remaining Estimate: 0h > > * Ensure that default permissions of directories under internal ViewFS > directories are the same as directories on target filesystems > * Add new unit test -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] tasanuma commented on pull request #3827: HDFS-16396. Reconfig slow peer parameters for datanode
tasanuma commented on pull request #3827: URL: https://github.com/apache/hadoop/pull/3827#issuecomment-1039851634 Merged it. Thanks for your contribution, @tomscut, and thanks for your review, @ayushtkn! -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #3953: HADOOP-18109. Ensure that default permissions of directories under internal ViewFS directories are the same as directories on target fil
hadoop-yetus commented on pull request #3953: URL: https://github.com/apache/hadoop/pull/3953#issuecomment-1039738874 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-13386) Upgrade Avro to 1.8.x or later
[ https://issues.apache.org/jira/browse/HADOOP-13386?focusedWorklogId=727256&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727256 ] ASF GitHub Bot logged work on HADOOP-13386: --- Author: ASF GitHub Bot Created on: 15/Feb/22 18:35 Start Date: 15/Feb/22 18:35 Worklog Time Spent: 10m Work Description: pjfanning commented on pull request #3990: URL: https://github.com/apache/hadoop/pull/3990#issuecomment-1040192443 @aajisaka a lot of tests started failing due to missing jackson1 jars when I upgraded to a version of avro that uses jackson2 - the pom files deliberately exclude the jersey-json jackson1 dependencies https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3990/2/artifact/out/patch-unit-root.txt I have since added a commit to allow jersey-json to import jackson1 - I guess the exclusions were to stop avro and jersey-json importing different versions of jackson1 jars. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 727256) Time Spent: 1h 40m (was: 1.5h) > Upgrade Avro to 1.8.x or later > -- > > Key: HADOOP-13386 > URL: https://issues.apache.org/jira/browse/HADOOP-13386 > Project: Hadoop Common > Issue Type: Sub-task > Components: build >Reporter: Ben McCann >Priority: Major > Labels: pull-request-available > Time Spent: 1h 40m > Remaining Estimate: 0h > > Avro 1.8.x makes generated classes serializable, which makes them much easier > to use with Spark. It would be great to upgrade Avro to 1.8.x. -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
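The exclusion interplay pjfanning describes can be sketched in pom terms. The coordinates below are the standard ones for jersey-json and the Jackson 1 artifacts; the actual dependency blocks in the Hadoop poms differ, so treat this purely as an illustration of the shape of such an exclusion:

```xml
<!-- Illustration only: excluding the Jackson 1 artifacts that
     jersey-json would otherwise pull in transitively. -->
<dependency>
  <groupId>com.sun.jersey</groupId>
  <artifactId>jersey-json</artifactId>
  <version>1.19</version>
  <exclusions>
    <exclusion>
      <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-core-asl</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-mapper-asl</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Dropping such exclusions (as the follow-up commit does) lets jersey-json resolve its own Jackson 1 jars again, at the cost of having to keep their version aligned with whatever else on the classpath still uses Jackson 1.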
[jira] [Work logged] (HADOOP-18117) Add an option to preserve root directory permissions
[ https://issues.apache.org/jira/browse/HADOOP-18117?focusedWorklogId=727255&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727255 ] ASF GitHub Bot logged work on HADOOP-18117: --- Author: ASF GitHub Bot Created on: 15/Feb/22 18:35 Start Date: 15/Feb/22 18:35 Worklog Time Spent: 10m Work Description: mohan3d commented on a change in pull request #3970: URL: https://github.com/apache/hadoop/pull/3970#discussion_r805781663 ## File path: hadoop-tools/hadoop-distcp/src/site/markdown/DistCp.md.vm ## @@ -363,6 +363,7 @@ Command Line Options | `-xtrack ` | Save information about missing source files to the specified path. | This option is only valid with `-update` option. This is an experimental property and it cannot be used with `-atomic` option. | | `-direct` | Write directly to destination paths | Useful for avoiding potentially very expensive temporary file rename operations when the destination is an object store | | `-useiterator` | Uses single threaded listStatusIterator to build listing | Useful for saving memory at the client side. Using this option will ignore the numListstatusThreads option | +| `-updateRootDirectoryAttributes` | Update root directory attributes (eg permissions, ownership ...) | Useful if you need to enforce root directory attributes update when using distcp | Review comment: @ferhui should ~~I re-work the internal code so everything is `updateRoot` instead of `updateRootDirectoryAttributes` or only the enduser `distcp` tool?~~ Better to be consistent and use updateRoot everywhere. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 727255) Time Spent: 2h 20m (was: 2h 10m) > Add an option to preserve root directory permissions > > > Key: HADOOP-18117 > URL: https://issues.apache.org/jira/browse/HADOOP-18117 > Project: Hadoop Common > Issue Type: Improvement >Reporter: Mohanad Elsafty >Assignee: Mohanad Elsafty >Priority: Minor > Labels: pull-request-available > Time Spent: 2h 20m > Remaining Estimate: 0h > > As mentioned in https://issues.apache.org/jira/browse/HADOOP-15211 > > If *-update* or *-overwrite* is being passed when *distcp* is used, the root > directory will be skipped on two occasions (CopyListing#doBuildListing & > CopyCommitter#preserveFileAttributesForDirectories), which will ignore the root > directory's attributes. > > We face the same issue when using distcp to copy huge data between clusters, and it takes > too much effort to update root directories' attributes manually. > > From the earlier ticket it's obvious why this behaviour is there, but > sometimes we need to enforce a root directory update, hence I will add a new > option for distcp to enable someone (who understands the need for this and > knows what they are doing) to enforce the update of the root directory's > attributes (permissions, ownership, ...) > > It should be a simple one, something like this > {code:java} > $ hadoop distcp -p -update -updateRootDirectoryAttributes /a/b/c /a/b/d {code} > This behaviour is optional and will be *false* by default. (it should not > affect existing *distcp* users). -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #3983: HDFS-16455. RBF: Add `zk-dt-secret-manager.jute.maxbuffer` property for Router's ZKDelegationTokenManager
hadoop-yetus commented on pull request #3983: URL: https://github.com/apache/hadoop/pull/3983#issuecomment-1039875071 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] yulongz commented on pull request #3971: HDFS-16440 RBF: Support router get HAServiceStatus with Lifeline RPC address
yulongz commented on pull request #3971: URL: https://github.com/apache/hadoop/pull/3971#issuecomment-1039887623 @goiri This failed unit test is unrelated to my change. All tests work fine locally. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] pjfanning commented on pull request #3990: [HADOOP-13386] upgrade to avro 1.9.2
pjfanning commented on pull request #3990: URL: https://github.com/apache/hadoop/pull/3990#issuecomment-1040192443 @aajisaka a lot of tests started failing due to missing jackson1 jars when I upgraded to a version of avro that uses jackson2 - the pom files deliberately exclude the jersey-json jackson1 dependencies https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3990/2/artifact/out/patch-unit-root.txt I have since added a commit to allow jersey-json to import jackson1 - I guess the exclusions were to stop avro and jersey-json importing different versions of jackson1 jars. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] mohan3d commented on a change in pull request #3970: HADOOP-18117. Add an option to preserve root directory permissions
mohan3d commented on a change in pull request #3970: URL: https://github.com/apache/hadoop/pull/3970#discussion_r805781663 ## File path: hadoop-tools/hadoop-distcp/src/site/markdown/DistCp.md.vm ## @@ -363,6 +363,7 @@ Command Line Options | `-xtrack ` | Save information about missing source files to the specified path. | This option is only valid with `-update` option. This is an experimental property and it cannot be used with `-atomic` option. | | `-direct` | Write directly to destination paths | Useful for avoiding potentially very expensive temporary file rename operations when the destination is an object store | | `-useiterator` | Uses single threaded listStatusIterator to build listing | Useful for saving memory at the client side. Using this option will ignore the numListstatusThreads option | +| `-updateRootDirectoryAttributes` | Update root directory attributes (eg permissions, ownership ...) | Useful if you need to enforce root directory attributes update when using distcp | Review comment: @ferhui should ~~I re-work the internal code so everything is `updateRoot` instead of `updateRootDirectoryAttributes` or only the enduser `distcp` tool?~~ Better to be consistent and use updateRoot everywhere. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #3861: HDFS-16316.Improve DirectoryScanner: add regular file check related block.
hadoop-yetus commented on pull request #3861: URL: https://github.com/apache/hadoop/pull/3861#issuecomment-1039728159 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-13386) Upgrade Avro to 1.8.x or later
[ https://issues.apache.org/jira/browse/HADOOP-13386?focusedWorklogId=727222&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727222 ] ASF GitHub Bot logged work on HADOOP-13386: --- Author: ASF GitHub Bot Created on: 15/Feb/22 18:32 Start Date: 15/Feb/22 18:32 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #3990: URL: https://github.com/apache/hadoop/pull/3990#issuecomment-1040165446 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 55s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | shelldocs | 0m 1s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 12m 24s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 25m 35s | | trunk passed | | +1 :green_heart: | compile | 27m 13s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 20m 53s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | mvnsite | 26m 37s | | trunk passed | | +1 :green_heart: | javadoc | 8m 18s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 8m 9s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | shadedclient | 38m 52s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 25s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 24m 20s | | the patch passed | | +1 :green_heart: | compile | 23m 50s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | -1 :x: | javac | 23m 50s | [/results-compile-javac-root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3990/2/artifact/out/results-compile-javac-root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt) | root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 generated 1 new + 1806 unchanged - 1 fixed = 1807 total (was 1807) | | +1 :green_heart: | compile | 21m 10s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | -1 :x: | javac | 21m 10s | [/results-compile-javac-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3990/2/artifact/out/results-compile-javac-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 generated 1 new + 1684 unchanged - 1 fixed = 1685 total (was 1685) | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 22m 15s | | the patch passed | | +1 :green_heart: | shellcheck | 0m 0s | | No new issues. | | +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. | | +1 :green_heart: | javadoc | 8m 19s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 8m 14s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | shadedclient | 39m 55s | | patch has no errors when building and testing our client artifacts. 
| _ Other Tests _ | | -1 :x: | unit | 1013m 7s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3990/2/artifact/out/patch-unit-root.txt) | root in the patch passed. | | +1 :green_heart: | asflicense | 1m 31s | | The patch does not generate ASF License warnings. | | | | 1301m 32s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.hdfs.rbfbalance.TestRouterDistCpProcedure | | | hadoop.yarn.server.nodemanager.webapp.TestNMWebServices | | | hadoop.yarn.server.nodemanager.webapp.TestNMWebServicesAuxServices | | | hadoop.yarn.server.node
[jira] [Work logged] (HADOOP-18122) ViewFileSystem fails on determining owning group when primary group doesn't exist for user
[ https://issues.apache.org/jira/browse/HADOOP-18122?focusedWorklogId=727212&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-727212 ] ASF GitHub Bot logged work on HADOOP-18122: --- Author: ASF GitHub Bot Created on: 15/Feb/22 18:32 Start Date: 15/Feb/22 18:32 Worklog Time Spent: 10m Work Description: cheyu2022 commented on pull request #3987: URL: https://github.com/apache/hadoop/pull/3987#issuecomment-1039548096 > just had a cursory look. This I don't think will fix the bug, but will just give a workaround like, if you don't have a primaryGroup get this config set and you can dodge it? Why didn't we try something like: https://github.com/apache/hadoop/blob/trunk/hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/store/records/MountTable.java#L152-L153 You are right about this. This fix is more like a workaround. But even this solution above sounds like a workaround as well - it just assumes if group isn't found, use username. I'm ok with either way. > Secondly, Why is user name also made configurable? Mount points can essentially have any user names, thus make it configurable as well. > Thirdly, Just saw the commit: [virajith](https://github.com/cheyu2022/hadoop/commits?author=virajith) authored and [cheyu2022](https://github.com/cheyu2022/hadoop/commits?author=cheyu2022) committed > > seems you are using some wrong author? Yeah nice catch, I will fix that. @ayushtkn -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 727212) Time Spent: 50m (was: 40m) > ViewFileSystem fails on determining owning group when primary group doesn't > exist for user > -- > > Key: HADOOP-18122 > URL: https://issues.apache.org/jira/browse/HADOOP-18122 > Project: Hadoop Common > Issue Type: Bug >Reporter: Chentao Yu >Assignee: Chentao Yu >Priority: Major > Labels: pull-request-available > Time Spent: 50m > Remaining Estimate: 0h > > ViewFileSystem should not fail on determining owning group when primary group > doesn't exist for user -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
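The MountTable approach referenced in the comment, falling back to the user name when no group can be resolved, can be sketched as follows. `owningGroup` and its inputs are hypothetical stand-ins for the actual `UserGroupInformation` group lookup, not Hadoop code:

```java
import java.util.List;

// Sketch of the fallback discussed above (modeled on the MountTable.java
// lines linked in the comment): when a user has no resolvable primary
// group, use the user name instead of failing.
public class OwningGroupFallback {
    static String owningGroup(String userName, List<String> groups) {
        if (groups == null || groups.isEmpty()) {
            return userName; // no primary group: fall back to the user name
        }
        return groups.get(0); // by convention, the first entry is the primary group
    }

    public static void main(String[] args) {
        System.out.println(owningGroup("alice", List.of("staff", "wheel"))); // prints staff
        System.out.println(owningGroup("svc-etl", List.of()));               // prints svc-etl
    }
}
```

Compared with a configurable default group, this keeps ViewFileSystem from throwing, but reports a group that may not exist either; both are workarounds, as the discussion notes.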
[GitHub] [hadoop] hadoop-yetus commented on pull request #3990: [HADOOP-13386] upgrade to avro 1.9.2
hadoop-yetus commented on pull request #3990: URL: https://github.com/apache/hadoop/pull/3990#issuecomment-1040165446 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 55s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | shelldocs | 0m 1s | | Shelldocs was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 12m 24s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 25m 35s | | trunk passed | | +1 :green_heart: | compile | 27m 13s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 20m 53s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | mvnsite | 26m 37s | | trunk passed | | +1 :green_heart: | javadoc | 8m 18s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 8m 9s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | shadedclient | 38m 52s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 25s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 24m 20s | | the patch passed | | +1 :green_heart: | compile | 23m 50s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | -1 :x: | javac | 23m 50s | [/results-compile-javac-root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3990/2/artifact/out/results-compile-javac-root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04.txt) | root-jdkUbuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 generated 1 new + 1806 unchanged - 1 fixed = 1807 total (was 1807) | | +1 :green_heart: | compile | 21m 10s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | -1 :x: | javac | 21m 10s | [/results-compile-javac-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3990/2/artifact/out/results-compile-javac-root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | root-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 generated 1 new + 1684 unchanged - 1 fixed = 1685 total (was 1685) | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | mvnsite | 22m 15s | | the patch passed | | +1 :green_heart: | shellcheck | 0m 0s | | No new issues. | | +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. | | +1 :green_heart: | javadoc | 8m 19s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 8m 14s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | shadedclient | 39m 55s | | patch has no errors when building and testing our client artifacts. 
| _ Other Tests _ | | -1 :x: | unit | 1013m 7s | [/patch-unit-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3990/2/artifact/out/patch-unit-root.txt) | root in the patch passed. | | +1 :green_heart: | asflicense | 1m 31s | | The patch does not generate ASF License warnings. | | | | 1301m 32s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.hdfs.rbfbalance.TestRouterDistCpProcedure | | | hadoop.yarn.server.nodemanager.webapp.TestNMWebServices | | | hadoop.yarn.server.nodemanager.webapp.TestNMWebServicesAuxServices | | | hadoop.yarn.server.nodemanager.webapp.TestNMWebServicesContainers | | | hadoop.yarn.server.nodemanager.webapp.TestNMWebServicesApps | | | hadoop.yarn.server.router.webapp.TestRouterWebServicesREST | | | hadoop.yarn.server.federation.policies.router.TestLocalityRouterPolicy | | | hadoop.yarn.server.federation.policies.manager.TestPriorityBroadcastPolicyManager | | | hadoop.yarn.server.federation.policies.amrmproxy.TestLocali
[GitHub] [hadoop] cheyu2022 commented on pull request #3987: HADOOP-18122. ViewFileSystem fails on determining owning group when primary group doesn't exist for user
cheyu2022 commented on pull request #3987: URL: https://github.com/apache/hadoop/pull/3987#issuecomment-1039548096 > just had a cursory look. This I don't think will fix the bug, but will just give a workaround like, if you don't have a primaryGroup get this config set and you can dodge it? Why didn't we try something like: https://github.com/apache/hadoop/blob/trunk/hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/store/records/MountTable.java#L152-L153 You are right about this. This fix is more like a workaround. But even this solution above sounds like a workaround as well - it just assumes if group isn't found, use username. I'm ok with either way. > Secondly, Why is user name also made configurable? Mount points can essentially have any user names, thus make it configurable as well. > Thirdly, Just saw the commit: [virajith](https://github.com/cheyu2022/hadoop/commits?author=virajith) authored and [cheyu2022](https://github.com/cheyu2022/hadoop/commits?author=cheyu2022) committed > > seems you are using some wrong author? Yeah nice catch, I will fix that. @ayushtkn -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-18066) AbstractJavaKeyStoreProvider: need a way to read credential store password from Configuration
[ https://issues.apache.org/jira/browse/HADOOP-18066?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Chao Sun updated HADOOP-18066: -- Fix Version/s: (was: 3.3.2) > AbstractJavaKeyStoreProvider: need a way to read credential store password > from Configuration > - > > Key: HADOOP-18066 > URL: https://issues.apache.org/jira/browse/HADOOP-18066 > Project: Hadoop Common > Issue Type: Wish > Components: security >Reporter: László Bodor >Priority: Major > Labels: pull-request-available > Time Spent: 2h 50m > Remaining Estimate: 0h > > The codepath in focus is > [this|https://github.com/apache/hadoop/blob/c3006be516ce7d4f970e24e7407b401318ceec3c/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/alias/AbstractJavaKeyStoreProvider.java#L316] > {code} > password = ProviderUtils.locatePassword(CREDENTIAL_PASSWORD_ENV_VAR, > conf.get(CREDENTIAL_PASSWORD_FILE_KEY)); > {code} > Since HIVE-14822, we can use a custom keystore that HiveServer2 propagates to > jobs/tasks of different execution engines (mr, tez, spark). > We're able to pass any "jceks:" url, but not a password, e.g. on this > codepath: > {code} > Caused by: java.security.UnrecoverableKeyException: Password verification > failed > at com.sun.crypto.provider.JceKeyStore.engineLoad(JceKeyStore.java:879) > ~[sunjce_provider.jar:1.8.0_232] > at java.security.KeyStore.load(KeyStore.java:1445) ~[?:1.8.0_232] > at > org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider.locateKeystore(AbstractJavaKeyStoreProvider.java:326) > ~[hadoop-common-3.1.1.7.1.7.0-551.jar:?] > at > org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider.<init>(AbstractJavaKeyStoreProvider.java:86) > ~[hadoop-common-3.1.1.7.1.7.0-551.jar:?] > at > org.apache.hadoop.security.alias.KeyStoreProvider.<init>(KeyStoreProvider.java:49) > ~[hadoop-common-3.1.1.7.1.7.0-551.jar:?] > at > org.apache.hadoop.security.alias.JavaKeyStoreProvider.<init>(JavaKeyStoreProvider.java:42) > ~[hadoop-common-3.1.1.7.1.7.0-551.jar:?] 
> at > org.apache.hadoop.security.alias.JavaKeyStoreProvider.<init>(JavaKeyStoreProvider.java:35) > ~[hadoop-common-3.1.1.7.1.7.0-551.jar:?] > at > org.apache.hadoop.security.alias.JavaKeyStoreProvider$Factory.createProvider(JavaKeyStoreProvider.java:68) > ~[hadoop-common-3.1.1.7.1.7.0-551.jar:?] > at > org.apache.hadoop.security.alias.CredentialProviderFactory.getProviders(CredentialProviderFactory.java:73) > ~[hadoop-common-3.1.1.7.1.7.0-551.jar:?] > at > org.apache.hadoop.conf.Configuration.getPasswordFromCredentialProviders(Configuration.java:2409) > ~[hadoop-common-3.1.1.7.1.7.0-551.jar:?] > at > org.apache.hadoop.conf.Configuration.getPassword(Configuration.java:2347) > ~[hadoop-common-3.1.1.7.1.7.0-551.jar:?] > at > org.apache.hadoop.fs.azurebfs.AbfsConfiguration.getPasswordString(AbfsConfiguration.java:295) > ~[hadoop-azure-3.1.1.7.1.7.0-551.jar:?] > at > org.apache.hadoop.fs.azurebfs.AbfsConfiguration.getTokenProvider(AbfsConfiguration.java:525) > ~[hadoop-azure-3.1.1.7.1.7.0-551.jar:?] > {code} > Even though there is a way to read the password from a text file, it's not secure; we need to > try reading a Configuration property first and, if it's null, we can fall back to the > environment variable. > Overriding System.getenv() is only possible with reflection, which doesn't look so > good. -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
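The lookup order the reporter asks for — a Configuration property first, then the environment variable, with the existing password-file path as a last resort — can be sketched like this. The config key name is hypothetical (the real one would be chosen in the patch); `HADOOP_CREDSTORE_PASSWORD` is shown as the conventional credential-store environment variable, and the `Function`/`Map` parameters stand in for Hadoop's `Configuration` and `System.getenv()`.

```java
import java.util.Map;
import java.util.function.Function;

public class KeystorePasswordLookup {
    // Hypothetical property name -- illustrative only.
    static final String PASSWORD_CONF_KEY = "credential.store.java-keystore-provider.password";
    // Conventional credential-store environment variable.
    static final String PASSWORD_ENV_VAR = "HADOOP_CREDSTORE_PASSWORD";

    /**
     * Locate the keystore password: configuration property first, then the
     * environment variable. Returns null when neither is set, in which case
     * the caller would continue to the password-file / default path.
     */
    public static String locatePassword(Function<String, String> conf,
                                        Map<String, String> env) {
        String password = conf.apply(PASSWORD_CONF_KEY); // conf wins if present
        if (password == null) {
            password = env.get(PASSWORD_ENV_VAR);        // else try the env var
        }
        return password;
    }

    public static void main(String[] args) {
        Map<String, String> env = Map.of(PASSWORD_ENV_VAR, "env-secret");
        System.out.println(locatePassword(key -> "conf-secret", env)); // prints "conf-secret"
        System.out.println(locatePassword(key -> null, env));          // prints "env-secret"
    }
}
```

Putting the Configuration check first matters here because HiveServer2 can propagate a Configuration to tasks, whereas it cannot set an environment variable in every executor JVM.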
[jira] [Updated] (HADOOP-17873) ABFS: Fix transient failures in ITestAbfsStreamStatistics and ITestAbfsRestOperationException
[ https://issues.apache.org/jira/browse/HADOOP-17873?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Chao Sun updated HADOOP-17873: -- Fix Version/s: (was: 3.3.2) > ABFS: Fix transient failures in ITestAbfsStreamStatistics and > ITestAbfsRestOperationException > - > > Key: HADOOP-17873 > URL: https://issues.apache.org/jira/browse/HADOOP-17873 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/azure >Affects Versions: 3.3.1 >Reporter: Sumangala Patki >Assignee: Sumangala Patki >Priority: Major > Labels: pull-request-available > Time Spent: 7h > Remaining Estimate: 0h > > To address transient failures in the following test classes: > * ITestAbfsStreamStatistics: Uses a filesystem level instance to record > read/write statistics, which also tracks these operations in other tests. > running parallelly. To be marked for sequential run only to avoid transient > failure > * ITestAbfsRestOperationException: The use of a static member to track retry > count causes transient failures when two tests of this class happen to run > together. Switch to non-static variable for assertions on retry count -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-17728) Fix issue of the StatisticsDataReferenceCleaner cleanUp
[ https://issues.apache.org/jira/browse/HADOOP-17728?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Chao Sun updated HADOOP-17728: -- Fix Version/s: (was: 3.3.2) > Fix issue of the StatisticsDataReferenceCleaner cleanUp > --- > > Key: HADOOP-17728 > URL: https://issues.apache.org/jira/browse/HADOOP-17728 > Project: Hadoop Common > Issue Type: Bug > Components: fs >Affects Versions: 3.2.1 >Reporter: yikf >Assignee: yikf >Priority: Minor > Labels: pull-request-available, reverted > Time Spent: 5h 10m > Remaining Estimate: 0h > > The Cleaner thread will be blocked when we remove a reference from the ReferenceQueue > unless `queue.enqueue` is called. > > As shown below, we currently call ReferenceQueue.remove() during cleanUp; the call > chain is as follows: > *StatisticsDataReferenceCleaner#queue.remove() -> > ReferenceQueue.remove(0) -> lock.wait(0)* > But lock.notifyAll is only called from queue.enqueue, so the Cleaner thread > will be blocked. > > ThreadDump: > {code:java} > "Reference Handler" #2 daemon prio=10 os_prio=0 tid=0x7f7afc088800 > nid=0x2119 in Object.wait() [0x7f7b0023] >java.lang.Thread.State: WAITING (on object monitor) > at java.lang.Object.wait(Native Method) > - waiting on <0xc00c2f58> (a java.lang.ref.Reference$Lock) > at java.lang.Object.wait(Object.java:502) > at java.lang.ref.Reference.tryHandlePending(Reference.java:191) > - locked <0xc00c2f58> (a java.lang.ref.Reference$Lock) > at > java.lang.ref.Reference$ReferenceHandler.run(Reference.java:153){code} -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-17728) Fix issue of the StatisticsDataReferenceCleaner cleanUp
[ https://issues.apache.org/jira/browse/HADOOP-17728?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Chao Sun updated HADOOP-17728: -- Fix Version/s: 3.3.2 > Fix issue of the StatisticsDataReferenceCleaner cleanUp > --- > > Key: HADOOP-17728 > URL: https://issues.apache.org/jira/browse/HADOOP-17728 > Project: Hadoop Common > Issue Type: Bug > Components: fs >Affects Versions: 3.2.1 >Reporter: yikf >Assignee: yikf >Priority: Minor > Labels: pull-request-available, reverted > Fix For: 3.3.2 > > Time Spent: 5h 10m > Remaining Estimate: 0h > > The Cleaner thread will be blocked when we remove a reference from the ReferenceQueue > unless `queue.enqueue` is called. > > As shown below, we currently call ReferenceQueue.remove() during cleanUp; the call > chain is as follows: > *StatisticsDataReferenceCleaner#queue.remove() -> > ReferenceQueue.remove(0) -> lock.wait(0)* > But lock.notifyAll is only called from queue.enqueue, so the Cleaner thread > will be blocked. > > ThreadDump: > {code:java} > "Reference Handler" #2 daemon prio=10 os_prio=0 tid=0x7f7afc088800 > nid=0x2119 in Object.wait() [0x7f7b0023] >java.lang.Thread.State: WAITING (on object monitor) > at java.lang.Object.wait(Native Method) > - waiting on <0xc00c2f58> (a java.lang.ref.Reference$Lock) > at java.lang.Object.wait(Object.java:502) > at java.lang.ref.Reference.tryHandlePending(Reference.java:191) > - locked <0xc00c2f58> (a java.lang.ref.Reference$Lock) > at > java.lang.ref.Reference$ReferenceHandler.run(Reference.java:153){code} -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
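The blocking behavior described in the issue — `ReferenceQueue.remove(0)` parks in `lock.wait(0)` and only `ReferenceQueue.enqueue()` (driven by the garbage collector) calls `lock.notifyAll()` — can be observed with the timed overload of `remove()`, which gives up after the timeout instead of waiting indefinitely on an empty queue. A minimal, self-contained demonstration:

```java
import java.lang.ref.PhantomReference;
import java.lang.ref.Reference;
import java.lang.ref.ReferenceQueue;

public class ReferenceQueueDemo {
    /** Returns true if a timed remove() saw nothing within the timeout. */
    public static boolean timedRemoveIsEmpty(long millis) throws InterruptedException {
        ReferenceQueue<Object> queue = new ReferenceQueue<>();
        Object referent = new Object();
        PhantomReference<Object> ref = new PhantomReference<>(referent, queue);
        // remove(millis) waits at most `millis` ms; remove() / remove(0)
        // would wait in lock.wait(0) until the GC enqueues something.
        Reference<?> polled = queue.remove(millis);
        // The trailing null checks keep `referent` and `ref` strongly
        // reachable across the wait, so nothing can be enqueued here.
        return polled == null && referent != null && ref != null;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(timedRemoveIsEmpty(100)); // prints "true"
    }
}
```

This is why an untimed `remove()` in a cleaner loop is only acceptable on a dedicated daemon thread: nothing but GC-driven enqueueing will ever wake it.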
[jira] [Updated] (HADOOP-17988) Disable JIRA plugin for YETUS on Hadoop
[ https://issues.apache.org/jira/browse/HADOOP-17988?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Chao Sun updated HADOOP-17988: -- Fix Version/s: 3.3.2 (was: 3.3.3) > Disable JIRA plugin for YETUS on Hadoop > --- > > Key: HADOOP-17988 > URL: https://issues.apache.org/jira/browse/HADOOP-17988 > Project: Hadoop Common > Issue Type: Bug > Components: build >Affects Versions: 3.3.3 >Reporter: Gautham Banasandra >Assignee: Gautham Banasandra >Priority: Critical > Labels: pull-request-available > Fix For: 3.3.2 > > Time Spent: 2h 20m > Remaining Estimate: 0h > > I’ve been noticing an issue with Jenkins CI where a file jira-json goes > missing all of a sudden – jenkins / hadoop-multibranch / PR-3588 / #2 > (apache.org) > {code} > [2021-10-27T17:52:58.787Z] Processing: > https://github.com/apache/hadoop/pull/3588 > [2021-10-27T17:52:58.787Z] GITHUB PR #3588 is being downloaded from > [2021-10-27T17:52:58.787Z] > https://api.github.com/repos/apache/hadoop/pulls/3588 > [2021-10-27T17:52:58.787Z] JSON data at Wed Oct 27 17:52:55 UTC 2021 > [2021-10-27T17:52:58.787Z] Patch data at Wed Oct 27 17:52:56 UTC 2021 > [2021-10-27T17:52:58.787Z] Diff data at Wed Oct 27 17:52:56 UTC 2021 > [2021-10-27T17:52:59.814Z] awk: cannot open > /home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-3588/centos-7/out/jira-json > (No such file or directory) > [2021-10-27T17:52:59.814Z] ERROR: https://github.com/apache/hadoop/pull/3588 > issue status is not matched with "Patch Available". > [2021-10-27T17:52:59.814Z] > {code} > This causes the pipeline run to fail. 
I’ve seen this in my multiple attempts > to re-run the CI on my PR – > # After 45 minutes – [jenkins / hadoop-multibranch / PR-3588 / #1 > (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/1/pipeline/] > # After 1 minute – [jenkins / hadoop-multibranch / PR-3588 / #2 > (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/2/pipeline/] > # After 17 minutes – [jenkins / hadoop-multibranch / PR-3588 / #3 > (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/3/pipeline/] > The hadoop-multibranch pipeline doesn't use ASF JIRA, thus, we're disabling > the *jira* plugin to fix this issue. -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] goiri merged pull request #3971: HDFS-16440. RBF: Support router get HAServiceStatus with Lifeline RPC address
goiri merged pull request #3971: URL: https://github.com/apache/hadoop/pull/3971 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #3861: HDFS-16316.Improve DirectoryScanner: add regular file check related block.
hadoop-yetus commented on pull request #3861: URL: https://github.com/apache/hadoop/pull/3861#issuecomment-1040305852 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 45s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 12m 36s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 22m 44s | | trunk passed | | +1 :green_heart: | compile | 24m 35s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 21m 11s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 3m 45s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 13s | | trunk passed | | +1 :green_heart: | javadoc | 2m 24s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 3m 12s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 6m 0s | | trunk passed | | +1 :green_heart: | shadedclient | 23m 31s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 24s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 2m 25s | | the patch passed | | +1 :green_heart: | compile | 24m 48s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 24m 48s | | the patch passed | | +1 :green_heart: | compile | 22m 16s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 22m 16s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 3m 51s | | the patch passed | | +1 :green_heart: | mvnsite | 3m 17s | | the patch passed | | +1 :green_heart: | javadoc | 2m 21s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 3m 28s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 6m 36s | | the patch passed | | +1 :green_heart: | shadedclient | 24m 22s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 17m 45s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 238m 14s | | hadoop-hdfs in the patch passed. | | +1 :green_heart: | asflicense | 1m 2s | | The patch does not generate ASF License warnings. 
| | | | 473m 27s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3861/6/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3861 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux 94871cc29013 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / ca61b8bee32722ede0c39562b39edeee90521ce0 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3861/6/testReport/ | | Max. process+thread count | 3234 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs U: . | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3861/6/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL
[GitHub] [hadoop] tasanuma opened a new pull request #3992: HDFS-15745. Make DataNodePeerMetrics#LOW_THRESHOLD_MS and MIN_OUTLIER_DETECTION_NODES configurable. Contributed by Haibin Huang.
tasanuma opened a new pull request #3992: URL: https://github.com/apache/hadoop/pull/3992 ### Description of PR HDFS-15745. Make DataNodePeerMetrics#LOW_THRESHOLD_MS and MIN_OUTLIER_DETECTION_NODES configurable. ### How was this patch tested? ### For code changes: - [x] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Updated] (HADOOP-18126) junit-vintage tests seem to be failing
[ https://issues.apache.org/jira/browse/HADOOP-18126?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] PJ Fanning updated HADOOP-18126: Description: {code:java} Feb 11, 2022 11:31:43 AM org.junit.platform.launcher.core.DefaultLauncher handleThrowable WARNING: TestEngine with ID 'junit-vintage' failed to discover tests org.junit.platform.commons.JUnitException: Failed to parse version of junit:junit: 4.13.2 at org.junit.vintage.engine.JUnit4VersionCheck.parseVersion(JUnit4VersionCheck.java:54) {code} [https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3980/1/artifact/out/patch-unit-root.txt] seems like junit.vintage.version=5.5.1 is incompatible with junit.version=4.13.2 see 2nd answer on [https://stackoverflow.com/questions/59900637/error-testengine-with-id-junit-vintage-failed-to-discover-tests-with-spring] my plan is to upgrade junit.vintage.version and junit.jupiter.version to 5.8.2 was: ``` Feb 11, 2022 11:31:43 AM org.junit.platform.launcher.core.DefaultLauncher handleThrowable WARNING: TestEngine with ID 'junit-vintage' failed to discover tests org.junit.platform.commons.JUnitException: Failed to parse version of junit:junit: 4.13.2 at org.junit.vintage.engine.JUnit4VersionCheck.parseVersion(JUnit4VersionCheck.java:54) ``` [https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3980/1/artifact/out/patch-unit-root.txt] seems like junit.vintage.version=5.5.1 is incompatible with junit.version=4.13.2 see 2nd answer on [https://stackoverflow.com/questions/59900637/error-testengine-with-id-junit-vintage-failed-to-discover-tests-with-spring] my plan is to upgrade junit.vintage.version and junit.jupiter.version to 5.8.2 > junit-vintage tests seem to be failing > -- > > Key: HADOOP-18126 > URL: https://issues.apache.org/jira/browse/HADOOP-18126 > Project: Hadoop Common > Issue Type: Bug > Components: build >Reporter: PJ Fanning >Priority: Major > > {code:java} > Feb 11, 2022 11:31:43 AM org.junit.platform.launcher.core.DefaultLauncher > 
handleThrowable WARNING: TestEngine with ID 'junit-vintage' failed to > discover tests org.junit.platform.commons.JUnitException: Failed to parse > version of junit:junit: 4.13.2 at > org.junit.vintage.engine.JUnit4VersionCheck.parseVersion(JUnit4VersionCheck.java:54) > {code} > [https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3980/1/artifact/out/patch-unit-root.txt] > seems like junit.vintage.version=5.5.1 is incompatible with > junit.version=4.13.2 > see 2nd answer on > [https://stackoverflow.com/questions/59900637/error-testengine-with-id-junit-vintage-failed-to-discover-tests-with-spring] > my plan is to upgrade junit.vintage.version and junit.jupiter.version to 5.8.2 > -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Created] (HADOOP-18126) junit-vintage tests seem to be failing
PJ Fanning created HADOOP-18126: --- Summary: junit-vintage tests seem to be failing Key: HADOOP-18126 URL: https://issues.apache.org/jira/browse/HADOOP-18126 Project: Hadoop Common Issue Type: Bug Components: build Reporter: PJ Fanning ``` Feb 11, 2022 11:31:43 AM org.junit.platform.launcher.core.DefaultLauncher handleThrowable WARNING: TestEngine with ID 'junit-vintage' failed to discover tests org.junit.platform.commons.JUnitException: Failed to parse version of junit:junit: 4.13.2 at org.junit.vintage.engine.JUnit4VersionCheck.parseVersion(JUnit4VersionCheck.java:54) ``` [https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3980/1/artifact/out/patch-unit-root.txt] seems like junit.vintage.version=5.5.1 is incompatible with junit.version=4.13.2 see 2nd answer on [https://stackoverflow.com/questions/59900637/error-testengine-with-id-junit-vintage-failed-to-discover-tests-with-spring] my plan is to upgrade junit.vintage.version and junit.jupiter.version to 5.8.2 -- This message was sent by Atlassian Jira (v8.20.1#820001) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
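The upgrade plan above amounts to keeping the vintage and jupiter artifacts on the same JUnit 5 release line in the parent POM. A sketch of the property alignment, assuming the `junit.vintage.version` / `junit.jupiter.version` property names mentioned in the ticket (the exact POM location is illustrative):

```xml
<!-- Illustrative hadoop-project/pom.xml fragment: keep both JUnit 5
     engines on the same release so junit-vintage's version check can
     handle the junit:junit 4.13.2 version string (5.5.1 cannot). -->
<properties>
  <junit.version>4.13.2</junit.version>
  <junit.jupiter.version>5.8.2</junit.jupiter.version>
  <junit.vintage.version>5.8.2</junit.vintage.version>
</properties>
```

Mixing an old vintage engine with a newer JUnit 4 jar is exactly the combination that trips `JUnit4VersionCheck.parseVersion` at discovery time.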
[jira] [Work logged] (HADOOP-18109) Ensure that default permissions of directories under internal ViewFS directories are the same as directories on target filesystems
[ https://issues.apache.org/jira/browse/HADOOP-18109?focusedWorklogId=726993&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-726993 ] ASF GitHub Bot logged work on HADOOP-18109: --- Author: ASF GitHub Bot Created on: 15/Feb/22 12:21 Start Date: 15/Feb/22 12:21 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #3953: URL: https://github.com/apache/hadoop/pull/3953#issuecomment-1040207940 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 20s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 12m 40s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 22m 41s | | trunk passed | | +1 :green_heart: | compile | 25m 26s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 21m 55s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 3m 46s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 20s | | trunk passed | | +1 :green_heart: | javadoc | 2m 24s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 3m 29s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 6m 1s | | trunk passed | | +1 :green_heart: | shadedclient | 23m 8s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 27s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 2m 24s | | the patch passed | | +1 :green_heart: | compile | 24m 37s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 24m 37s | | the patch passed | | +1 :green_heart: | compile | 22m 9s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 22m 9s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 3m 35s | | the patch passed | | +1 :green_heart: | mvnsite | 3m 28s | | the patch passed | | +1 :green_heart: | javadoc | 2m 27s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 3m 33s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 6m 36s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 31s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 18m 27s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 245m 27s | | hadoop-hdfs in the patch passed. | | +1 :green_heart: | asflicense | 1m 10s | | The patch does not generate ASF License warnings. 
| | | | 483m 0s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3953/7/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3953 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux 4d55d2d9dd66 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / dcc9896bdfc1cdbb5664efc2de0395dedccf | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3953/7/testReport/ | | Max. process+thread count | 2897 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hado
[jira] [Work logged] (HADOOP-18109) Ensure that default permissions of directories under internal ViewFS directories are the same as directories on target filesystems
[ https://issues.apache.org/jira/browse/HADOOP-18109?focusedWorklogId=726994&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-726994 ] ASF GitHub Bot logged work on HADOOP-18109: --- Author: ASF GitHub Bot Created on: 15/Feb/22 12:21 Start Date: 15/Feb/22 12:21 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #3953: URL: https://github.com/apache/hadoop/pull/3953#issuecomment-1040208650 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 40s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 12m 39s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 22m 56s | | trunk passed | | +1 :green_heart: | compile | 25m 32s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | compile | 22m 7s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 3m 44s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 18s | | trunk passed | | +1 :green_heart: | javadoc | 2m 16s | | trunk passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 3m 27s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 5m 56s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 56s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 29s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 2m 22s | | the patch passed | | +1 :green_heart: | compile | 24m 43s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javac | 24m 43s | | the patch passed | | +1 :green_heart: | compile | 22m 9s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 22m 9s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 3m 29s | | the patch passed | | +1 :green_heart: | mvnsite | 3m 22s | | the patch passed | | +1 :green_heart: | javadoc | 2m 22s | | the patch passed with JDK Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 | | +1 :green_heart: | javadoc | 3m 37s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 6m 30s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 18s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 18m 24s | | hadoop-common in the patch passed. | | -1 :x: | unit | 246m 10s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3953/8/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. | | +1 :green_heart: | asflicense | 1m 10s | | The patch does not generate ASF License warnings. 
| | | | 482m 49s | | | | Reason | Tests | |---:|:--| | Failed junit tests | hadoop.hdfs.TestReconstructStripedFileWithRandomECPolicy | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3953/8/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3953 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux 3461d112a331 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / dcc9896bdfc1cdbb5664efc2de0395dedccf | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.13+8-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-