[ https://issues.apache.org/jira/browse/HADOOP-19815?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18068934#comment-18068934 ]
ASF GitHub Bot commented on HADOOP-19815:
-----------------------------------------
hadoop-yetus commented on PR #8307:
URL: https://github.com/apache/hadoop/pull/8307#issuecomment-4141153169
:broken_heart: **-1 overall**
| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 0m 52s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 8 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 1m 53s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 51m 1s | | trunk passed |
| +1 :green_heart: | compile | 20m 43s | | trunk passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04 |
| +1 :green_heart: | compile | 18m 55s | | trunk passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1 |
| +1 :green_heart: | checkstyle | 6m 16s | | trunk passed |
| +1 :green_heart: | mvnsite | 3m 30s | | trunk passed |
| +1 :green_heart: | javadoc | 2m 25s | | trunk passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04 |
| +1 :green_heart: | javadoc | 2m 22s | | trunk passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1 |
| +1 :green_heart: | spotbugs | 6m 56s | | trunk passed |
| +1 :green_heart: | shadedclient | 37m 34s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 30s | | Maven dependency ordering for patch |
| -1 :x: | mvninstall | 0m 44s | [/patch-mvninstall-hadoop-hdfs-project_hadoop-hdfs-client.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8307/5/artifact/out/patch-mvninstall-hadoop-hdfs-project_hadoop-hdfs-client.txt) | hadoop-hdfs-client in the patch failed. |
| -1 :x: | compile | 2m 0s | [/patch-compile-root-jdkUbuntu-21.0.10+7-Ubuntu-124.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8307/5/artifact/out/patch-compile-root-jdkUbuntu-21.0.10+7-Ubuntu-124.04.txt) | root in the patch failed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04. |
| -1 :x: | javac | 2m 0s | [/patch-compile-root-jdkUbuntu-21.0.10+7-Ubuntu-124.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8307/5/artifact/out/patch-compile-root-jdkUbuntu-21.0.10+7-Ubuntu-124.04.txt) | root in the patch failed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04. |
| -1 :x: | compile | 2m 5s | [/patch-compile-root-jdkUbuntu-17.0.18+8-Ubuntu-124.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8307/5/artifact/out/patch-compile-root-jdkUbuntu-17.0.18+8-Ubuntu-124.04.1.txt) | root in the patch failed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1. |
| -1 :x: | javac | 2m 5s | [/patch-compile-root-jdkUbuntu-17.0.18+8-Ubuntu-124.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8307/5/artifact/out/patch-compile-root-jdkUbuntu-17.0.18+8-Ubuntu-124.04.1.txt) | root in the patch failed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1. |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 5m 50s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8307/5/artifact/out/results-checkstyle-root.txt) | root: The patch generated 9 new + 326 unchanged - 5 fixed = 335 total (was 331) |
| -1 :x: | mvnsite | 0m 50s | [/patch-mvnsite-hadoop-hdfs-project_hadoop-hdfs-client.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8307/5/artifact/out/patch-mvnsite-hadoop-hdfs-project_hadoop-hdfs-client.txt) | hadoop-hdfs-client in the patch failed. |
| -1 :x: | javadoc | 0m 33s | [/patch-javadoc-hadoop-hdfs-project_hadoop-hdfs-client-jdkUbuntu-21.0.10+7-Ubuntu-124.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8307/5/artifact/out/patch-javadoc-hadoop-hdfs-project_hadoop-hdfs-client-jdkUbuntu-21.0.10+7-Ubuntu-124.04.txt) | hadoop-hdfs-client in the patch failed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04. |
| -1 :x: | javadoc | 0m 29s | [/patch-javadoc-hadoop-hdfs-project_hadoop-hdfs-client-jdkUbuntu-17.0.18+8-Ubuntu-124.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8307/5/artifact/out/patch-javadoc-hadoop-hdfs-project_hadoop-hdfs-client-jdkUbuntu-17.0.18+8-Ubuntu-124.04.1.txt) | hadoop-hdfs-client in the patch failed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1. |
| -1 :x: | spotbugs | 0m 49s | [/patch-spotbugs-hadoop-hdfs-project_hadoop-hdfs-client.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8307/5/artifact/out/patch-spotbugs-hadoop-hdfs-project_hadoop-hdfs-client.txt) | hadoop-hdfs-client in the patch failed. |
| -1 :x: | shadedclient | 10m 9s | | patch has errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| -1 :x: | unit | 24m 1s | [/patch-unit-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8307/5/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt) | hadoop-common in the patch passed. |
| -1 :x: | unit | 0m 44s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs-client.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8307/5/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-client.txt) | hadoop-hdfs-client in the patch failed. |
| +1 :green_heart: | asflicense | 0m 32s | | The patch does not generate ASF License warnings. |
| | | 211m 14s | | |
| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.fs.shell.TestPathData |
| | hadoop.fs.TestFsShellTouch |
| | hadoop.fs.viewfs.TestFSMainOperationsLocalFileSystem |
| | hadoop.fs.shell.TestCpCommand |
| | hadoop.fs.TestFsShellCopy |
| | hadoop.fs.shell.TestCopyPreserveFlag |
| | hadoop.fs.TestSymlinkLocalFSFileSystem |
| | hadoop.fs.shell.TestCopyToLocal |
| | hadoop.fs.TestRawLocalFileSystemContract |
| | hadoop.fs.TestFSMainOperationsLocalFileSystem |
| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.54 ServerAPI=1.54 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8307/5/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/8307 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux 74083f56707e 5.15.0-173-generic #183-Ubuntu SMP Fri Mar 6 13:29:34 UTC 2026 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 2beb43cb303c884d20105a050b5ada5389345be6 |
| Default Java | Ubuntu-17.0.18+8-Ubuntu-124.04.1 |
| Multi-JDK versions | /usr/lib/jvm/java-21-openjdk-amd64:Ubuntu-21.0.10+7-Ubuntu-124.04 /usr/lib/jvm/java-17-openjdk-amd64:Ubuntu-17.0.18+8-Ubuntu-124.04.1 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8307/5/testReport/ |
| Max. process+thread count | 3105 (vs. ulimit of 10000) |
| modules | C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs-client U: . |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8307/5/console |
| versions | git=2.43.0 maven=3.9.11 spotbugs=4.9.7 |
| Powered by | Apache Yetus 0.14.1 https://yetus.apache.org |
This message was automatically generated.
> Path normalizes away important trailing slash used for URI.resolve(other)
> -------------------------------------------------------------------------
>
> Key: HADOOP-19815
> URL: https://issues.apache.org/jira/browse/HADOOP-19815
> Project: Hadoop Common
> Issue Type: Bug
> Components: common
> Affects Versions: 3.4.2
> Reporter: Christopher Tubbs
> Priority: Major
> Labels: pull-request-available
>
> This issue appears to be a relatively long-standing bug with Hadoop's
> FileSystem and Path classes, but is nevertheless important.
> The core of the issue is that {{URI.resolve(...)}} relies on a trailing slash
> to determine how to resolve path components, but the trailing slash is often
> stripped out in common code paths for FileSystem and Path. This causes
> problems when trying to resolve new URIs/Paths from existing ones.
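The trailing-slash behavior described above is plain {{java.net.URI}} semantics (RFC 3986 reference resolution), independent of Hadoop. A minimal, self-contained sketch with no Hadoop classes involved, showing that `resolve()` treats a path without a trailing slash as a "file" (replacing its last segment) and a path with one as a "directory" (appending under it):

```java
import java.net.URI;

public class ResolveDemo {
    public static void main(String[] args) {
        // No trailing slash: the last path segment is replaced.
        URI noSlash = URI.create("hdfs://localhost:8020/path/to/somewhere");
        System.out.println(noSlash.resolve("other"));
        // hdfs://localhost:8020/path/to/other

        // Trailing slash: the relative reference is appended under the path.
        URI withSlash = URI.create("hdfs://localhost:8020/path/to/somewhere/");
        System.out.println(withSlash.resolve("other"));
        // hdfs://localhost:8020/path/to/somewhere/other
    }
}
```

This is why stripping the trailing slash during Path normalization silently changes the meaning of every subsequent `resolve()` call.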
> Constructing a Path from a URI, rather than a String or another Path, does
> preserve the original URI, so things do resolve correctly, but that yields
> highly inconsistent behavior, and depends on the specifics of how it was
> constructed and how the original URI was preserved internally.
> However, even if one argues that the String constructor for Path is supposed
> to normalize, and the URI constructor is supposed to preserve, the problem
> also exists with many of the {{FileSystem}} methods, such as {{fs.getUri()}},
> {{fs.getHomeDirectory()}}, {{fs.getWorkingDirectory()}}, etc. So, one must do
> convoluted string manipulation to resolve one Path from another.
> For example:
> {code:java}
> new Path("hdfs://localhost:8020/path/to/somewhere").toUri().resolve("other");
> // expected ==> URI(hdfs://localhost:8020/path/to/other)
> // actual   ==> URI(hdfs://localhost:8020/path/to/other)
> new Path("hdfs://localhost:8020/path/to/somewhere/").toUri().resolve("other");
> // expected ==> URI(hdfs://localhost:8020/path/to/somewhere/other)
> // actual   ==> URI(hdfs://localhost:8020/path/to/other)
> new Path(new URI("hdfs://localhost:8020/path/to/somewhere")).toUri().resolve("other");
> // expected ==> URI(hdfs://localhost:8020/path/to/other)
> // actual   ==> URI(hdfs://localhost:8020/path/to/other)
> new Path(new URI("hdfs://localhost:8020/path/to/somewhere/")).toUri().resolve("other");
> // expected ==> URI(hdfs://localhost:8020/path/to/somewhere/other)
> // actual   ==> URI(hdfs://localhost:8020/path/to/somewhere/other)
> var fs = FileSystem.get(new Configuration());
> fs.getUri();
> // expected ==> URI(hdfs://localhost:8020/)
> // actual   ==> URI(hdfs://localhost:8020) // probably matters more for LocalFileSystem or viewfs, etc.
> fs.getWorkingDirectory().toUri();
> fs.getHomeDirectory().toUri();
> // expected ==> URI(hdfs://localhost:8020/user/me/)
> // actual   ==> URI(hdfs://localhost:8020/user/me)
> // broken code
> URI relativeURI = new URI("mytempdir");
> fs.getWorkingDirectory().toUri().resolve(relativeURI);
> // expected ==> hdfs://localhost:8020/user/me/mytempdir
> // actual   ==> hdfs://localhost:8020/user/mytempdir
> // convoluted workaround (assuming the suffix is a relative path without any other URI elements)
> URI relativeURI = new URI("mytempdir");
> fs.getWorkingDirectory().suffix("/" + relativeURI.toString()).toUri();
> // expected ==> hdfs://localhost:8020/user/me/mytempdir
> // actual   ==> hdfs://localhost:8020/user/me/mytempdir
> {code}
> Some of this is workable, so long as you stay within Path, but the moment you
> try to work with URIs/URLs, things get convoluted quickly: {{toString()}}
> calls, concatenation with slash ({{/}}) characters, and edge cases when the
> other path isn't relative or contains a different authority or scheme. These
> are all things {{URI.resolve()}} would already handle, so code becomes
> unnecessarily complex just to work around these API problems.
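Until the normalization behavior is addressed, the string manipulation can at least be confined to one place. Below is a hypothetical helper, not part of any Hadoop API: the name `resolveUnder` and its edge-case handling are assumptions. It restores the "directory" trailing slash before delegating to `URI.resolve()`, so relative references append instead of replacing the last path segment:

```java
import java.net.URI;

public class ResolveUnderDemo {
    // Hypothetical helper (not Hadoop API): treat the base URI as a
    // directory by restoring the trailing slash, then let URI.resolve()
    // do the rest (relative refs, absolute paths, other authorities).
    static URI resolveUnder(URI base, String relative) {
        String path = base.getPath();
        if (path == null || !path.endsWith("/")) {
            // An empty path (e.g. from fs.getUri()) becomes "/".
            base = base.resolve((path == null || path.isEmpty()) ? "/" : path + "/");
        }
        return base.resolve(relative);
    }

    public static void main(String[] args) {
        // Simulate a working directory whose trailing slash was stripped.
        URI workingDir = URI.create("hdfs://localhost:8020/user/me");
        System.out.println(resolveUnder(workingDir, "mytempdir"));
        // hdfs://localhost:8020/user/me/mytempdir
    }
}
```

Because the heavy lifting stays in `URI.resolve()`, the non-relative cases the report mentions (a different authority or scheme in the other path) fall through to standard RFC 3986 behavior rather than string concatenation.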
--
This message was sent by Atlassian Jira
(v8.20.10#820010)