[ https://issues.apache.org/jira/browse/HADOOP-19221?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17867131#comment-17867131 ]
ASF GitHub Bot commented on HADOOP-19221:
-----------------------------------------
hadoop-yetus commented on PR #6938:
URL: https://github.com/apache/hadoop/pull/6938#issuecomment-2237741662
:broken_heart: **-1 overall**
| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 0m 55s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 7 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 14m 39s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 32m 34s | | trunk passed |
| +1 :green_heart: | compile | 17m 40s | | trunk passed with JDK Ubuntu-11.0.23+9-post-Ubuntu-1ubuntu120.04.2 |
| +1 :green_heart: | compile | 16m 21s | | trunk passed with JDK Private Build-1.8.0_412-8u412-ga-1~20.04.1-b08 |
| +1 :green_heart: | checkstyle | 4m 22s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 45s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 55s | | trunk passed with JDK Ubuntu-11.0.23+9-post-Ubuntu-1ubuntu120.04.2 |
| +1 :green_heart: | javadoc | 1m 46s | | trunk passed with JDK Private Build-1.8.0_412-8u412-ga-1~20.04.1-b08 |
| +1 :green_heart: | spotbugs | 3m 58s | | trunk passed |
| +1 :green_heart: | shadedclient | 34m 38s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 33s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 29s | | the patch passed |
| +1 :green_heart: | compile | 17m 0s | | the patch passed with JDK Ubuntu-11.0.23+9-post-Ubuntu-1ubuntu120.04.2 |
| +1 :green_heart: | javac | 17m 0s | | the patch passed |
| +1 :green_heart: | compile | 16m 22s | | the patch passed with JDK Private Build-1.8.0_412-8u412-ga-1~20.04.1-b08 |
| +1 :green_heart: | javac | 16m 22s | | the patch passed |
| -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6938/7/artifact/out/blanks-eol.txt) | The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <<patch_file>>. Refer https://git-scm.com/docs/git-apply |
| -0 :warning: | checkstyle | 4m 27s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6938/7/artifact/out/results-checkstyle-root.txt) | root: The patch generated 27 new + 9 unchanged - 0 fixed = 36 total (was 9) |
| +1 :green_heart: | mvnsite | 2m 36s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 52s | | the patch passed with JDK Ubuntu-11.0.23+9-post-Ubuntu-1ubuntu120.04.2 |
| +1 :green_heart: | javadoc | 1m 44s | | the patch passed with JDK Private Build-1.8.0_412-8u412-ga-1~20.04.1-b08 |
| +1 :green_heart: | spotbugs | 4m 15s | | the patch passed |
| +1 :green_heart: | shadedclient | 35m 4s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 19m 44s | | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 2m 57s | | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 1m 5s | | The patch does not generate ASF License warnings. |
| | | 245m 4s | | |
| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.46 ServerAPI=1.46 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6938/7/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/6938 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux 4160869c4af9 5.15.0-106-generic #116-Ubuntu SMP Wed Apr 17 09:17:56 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 65fd7972109c90d25e9f5d6f531e8243def4a552 |
| Default Java | Private Build-1.8.0_412-8u412-ga-1~20.04.1-b08 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.23+9-post-Ubuntu-1ubuntu120.04.2 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_412-8u412-ga-1~20.04.1-b08 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6938/7/testReport/ |
| Max. process+thread count | 1263 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws U: . |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6938/7/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
This message was automatically generated.
> S3A: Unable to recover from failure of multipart block upload attempt
> ---------------------------------------------------------------------
>
> Key: HADOOP-19221
> URL: https://issues.apache.org/jira/browse/HADOOP-19221
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: fs/s3
> Affects Versions: 3.4.0
> Reporter: Steve Loughran
> Assignee: Steve Loughran
> Priority: Major
> Labels: pull-request-available
>
> If a multipart PUT request fails for some reason (e.g. a network error), then
> all subsequent retry attempts fail with a 400 response and ErrorCode
> RequestTimeout.
> {code}
> Your socket connection to the server was not read from or written to within
> the timeout period. Idle connections will be closed. (Service: Amazon S3;
> Status Code: 400; Error Code: RequestTimeout; Request ID:; S3 Extended
> Request ID:
> {code}
> The list of suppressed exceptions contains the root cause (the initial
> failure was a 500); all retries failed to upload properly from the source
> input stream {{RequestBody.fromInputStream(fileStream, size)}}.
> Hypothesis: mark/reset doesn't work for these input streams. On the v1 SDK
> we would build a multipart block upload request passing in (file, offset,
> length); the way we do this now doesn't recover.
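For illustration of the failure mode, here is a minimal sketch (not the S3A production code path; the class, method, and variable names are placeholders) of a part upload built from a plain input stream, which the SDK can only consume once:
{code}
// Sketch only: names here are placeholders, not taken from the S3A code base.
import java.io.InputStream;
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.UploadPartRequest;

final class StreamBackedPartUpload {
  static void uploadPart(S3Client s3, String bucket, String key,
      String uploadId, int partNumber, InputStream fileStream, long size) {
    UploadPartRequest request = UploadPartRequest.builder()
        .bucket(bucket)
        .key(key)
        .uploadId(uploadId)
        .partNumber(partNumber)
        .build();
    // If fileStream cannot be rewound over the whole part (no usable
    // mark/reset), a retry after a transient 500 re-reads an already-drained
    // stream: the PUT sends no data, and S3 eventually fails the request
    // with 400 RequestTimeout.
    s3.uploadPart(request, RequestBody.fromInputStream(fileStream, size));
  }
}
{code}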
> Probably fixable by providing our own {{ContentStreamProvider}}
> implementations for
> # file + offset + length
> # bytebuffer
> # byte array
> The SDK does have explicit support for the in-memory cases, but they copy the
> data blocks first; we don't want that, as it would double the memory
> requirements of active blocks.
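As a hedged sketch of the file + offset + length case (the class name and the use of commons-io's {{BoundedInputStream}} are assumptions, not the actual HADOOP-19221 patch): a {{ContentStreamProvider}} that re-opens the source file at the part's offset on every {{newStream()}} call lets each retry re-read exactly the same byte range, without buffering the block in memory.
{code}
// Hypothetical sketch, not the actual patch: open a fresh stream over the
// same (offset, length) range of the file for every upload attempt.
import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;
import org.apache.commons.io.input.BoundedInputStream;
import software.amazon.awssdk.http.ContentStreamProvider;

final class FilePartContentStreamProvider implements ContentStreamProvider {
  private final File file;
  private final long offset;
  private final long length;

  FilePartContentStreamProvider(File file, long offset, long length) {
    this.file = file;
    this.offset = offset;
    this.length = length;
  }

  @Override
  public InputStream newStream() {
    try {
      FileInputStream in = new FileInputStream(file);
      if (in.skip(offset) != offset) {          // position at the part's start
        in.close();
        throw new IOException("Unable to skip to offset " + offset);
      }
      // Bound the stream to the part length; every retry gets a fresh stream
      // over the same bytes, so no mark/reset support is required.
      return new BufferedInputStream(new BoundedInputStream(in, length));
    } catch (IOException e) {
      throw new UncheckedIOException(e);
    }
  }
}
{code}
The part body would then be built with {{RequestBody.fromContentProvider(provider, length, "application/octet-stream")}}, so the SDK asks the provider for a new stream on each attempt instead of trying to rewind one it has already consumed. Analogous providers for the byte array and bytebuffer cases would avoid the extra copy the SDK's built-in in-memory bodies perform.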