[
https://issues.apache.org/jira/browse/HADOOP-19130?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17831299#comment-17831299
]
ASF GitHub Bot commented on HADOOP-19130:
-----------------------------------------
hadoop-yetus commented on PR #6678:
URL: https://github.com/apache/hadoop/pull/6678#issuecomment-2022507824
:broken_heart: **-1 overall**
| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 0m 59s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 44m 23s | | trunk passed |
| +1 :green_heart: | compile | 17m 23s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | compile | 16m 2s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | checkstyle | 1m 16s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 42s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 17s | | trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | javadoc | 0m 53s | | trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | spotbugs | 2m 38s | | trunk passed |
| +1 :green_heart: | shadedclient | 35m 7s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 55s | | the patch passed |
| +1 :green_heart: | compile | 16m 43s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | javac | 16m 43s | | the patch passed |
| +1 :green_heart: | compile | 16m 3s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | javac | 16m 3s | | the patch passed |
| -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6678/2/artifact/out/blanks-eol.txt) | The patch has 3 line(s) that end in blanks. Use git apply --whitespace=fix <<patch_file>>. Refer https://git-scm.com/docs/git-apply |
| +1 :green_heart: | checkstyle | 1m 13s | | the patch passed |
| +1 :green_heart: | mvnsite | 1m 45s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 9s | | the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 |
| +1 :green_heart: | javadoc | 0m 55s | | the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| +1 :green_heart: | spotbugs | 2m 47s | | the patch passed |
| +1 :green_heart: | shadedclient | 37m 7s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 20m 7s | | hadoop-common in the patch passed. |
| +1 :green_heart: | asflicense | 1m 0s | | The patch does not generate ASF License warnings. |
| | | 225m 11s | | |
| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.45 ServerAPI=1.45 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6678/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/6678 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux 6deb55020a24 5.15.0-94-generic #104-Ubuntu SMP Tue Jan 9 15:25:40 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 767cc3317445900a2b8399ab8552ae85edb47a4d |
| Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6678/2/testReport/ |
| Max. process+thread count | 1263 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6678/2/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
This message was automatically generated.
> FTPFileSystem rename with full qualified path broken
> ----------------------------------------------------
>
> Key: HADOOP-19130
> URL: https://issues.apache.org/jira/browse/HADOOP-19130
> Project: Hadoop Common
> Issue Type: Bug
> Components: fs
> Affects Versions: 0.20.2, 3.3.3, 3.3.4, 3.3.6
> Reporter: shawn
> Priority: Major
> Labels: pull-request-available
> Attachments: image-2024-03-27-09-59-12-381.png
>
> Original Estimate: 2h
> Remaining Estimate: 2h
>
> When the fs shell is used to rename a file on an FTP server, it always fails with
> "Input/output error" when a fully qualified path is passed to it
> (e.g. ftp://user:password@localhost/pathxxx). The reason is that the underlying
> changeWorkingDirectory command is handed a string that still carries the
> file:// URI prefix, which the FTP server does not understand.
> !image-2024-03-27-09-59-12-381.png!
> The fix should be to pass absoluteSrc.getParent().toUri().getPath() so the
> file:// URI prefix is avoided, like this (see also the sketch after this quoted
> description):
> {code:java}
> --- a/FTPFileSystem.java
> +++ b/FTPFileSystem.java
> @@ -549,15 +549,15 @@ public class FTPFileSystem extends FileSystem {
> throw new IOException("Destination path " + dst
> + " already exist, cannot rename!");
> }
> - String parentSrc = absoluteSrc.getParent().toUri().toString();
> - String parentDst = absoluteDst.getParent().toUri().toString();
> + URI parentSrc = absoluteSrc.getParent().toUri();
> + URI parentDst = absoluteDst.getParent().toUri();
> String from = src.getName();
> String to = dst.getName();
> - if (!parentSrc.equals(parentDst)) {
> + if (!parentSrc.toString().equals(parentDst.toString())) {
> throw new IOException("Cannot rename parent(source): " + parentSrc
> + ", parent(destination): " + parentDst);
> }
> - client.changeWorkingDirectory(parentSrc);
> + client.changeWorkingDirectory(parentSrc.getPath().toString());
> boolean renamed = client.rename(from, to);
> return renamed;
> }{code}
> A related issue was already reported as
> https://issues.apache.org/jira/browse/HADOOP-8653;
> I wonder why this bug hasn't been fixed yet.
>
>
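As a quick illustration of the behaviour described in the quoted report, here is a minimal, self-contained sketch. It is not part of the proposed patch; the ftp:// URL and the class name FtpRenamePathDemo are made-up examples, and it only needs hadoop-common on the classpath for org.apache.hadoop.fs.Path. It shows that Path.getParent().toUri().toString() keeps the scheme and authority of a fully qualified path, which an FTP server cannot interpret as a CWD target, while toUri().getPath() yields the plain directory path that the proposed patch passes to changeWorkingDirectory() instead:
{code:java}
import java.net.URI;
import org.apache.hadoop.fs.Path;

public class FtpRenamePathDemo {
  public static void main(String[] args) {
    // Hypothetical fully qualified source path, as a user might pass it to "hadoop fs -mv".
    Path src = new Path("ftp://user:password@localhost/dir/file.txt");
    URI parentUri = src.getParent().toUri();

    // What FTPFileSystem.rename() currently sends to changeWorkingDirectory():
    // the full URI string, scheme and authority included, roughly
    // "ftp://user:password@localhost/dir". The server rejects a CWD to that
    // string, and the shell surfaces it as "Input/output error".
    System.out.println("toUri().toString(): " + parentUri.toString());

    // What the proposed patch sends instead: only the path component, "/dir",
    // which is a directory the FTP server can actually change into.
    System.out.println("toUri().getPath():  " + parentUri.getPath());
  }
}
{code}
In protocol terms, the difference is between issuing "CWD ftp://user:password@localhost/dir" (rejected) and "CWD /dir" (accepted), which is why stripping the URI down to its path component resolves the rename failure.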