[
https://issues.apache.org/jira/browse/HADOOP-16906?focusedWorklogId=548879&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-548879
]
ASF GitHub Bot logged work on HADOOP-16906:
-------------------------------------------
Author: ASF GitHub Bot
Created on: 05/Feb/21 22:18
Start Date: 05/Feb/21 22:18
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on pull request #2684:
URL: https://github.com/apache/hadoop/pull/2684#issuecomment-774318541
:broken_heart: **-1 overall**
| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 0m 31s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | markdownlint | 0m 1s | | markdownlint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | | 0m 0s | [test4tests](test4tests) | The patch appears to include 4 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 13m 59s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 21m 30s | | trunk passed |
| +1 :green_heart: | compile | 22m 19s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | compile | 19m 52s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~20.04-b01 |
| +1 :green_heart: | checkstyle | 4m 0s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 17s | | trunk passed |
| +1 :green_heart: | shadedclient | 19m 54s | | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 1m 34s | | trunk passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 2m 8s | | trunk passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~20.04-b01 |
| +0 :ok: | spotbugs | 1m 19s | | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 3m 31s | | trunk passed |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 26s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 32s | | the patch passed |
| +1 :green_heart: | compile | 22m 13s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javac | 22m 13s | | the patch passed |
| +1 :green_heart: | compile | 19m 54s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~20.04-b01 |
| +1 :green_heart: | javac | 19m 54s | | the patch passed |
| -0 :warning: | checkstyle | 3m 52s | [/diff-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2684/1/artifact/out/diff-checkstyle-root.txt) | root: The patch generated 3 new + 12 unchanged - 0 fixed = 15 total (was 12) |
| +1 :green_heart: | mvnsite | 2m 27s | | the patch passed |
| -1 :x: | whitespace | 0m 0s | [/whitespace-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2684/1/artifact/out/whitespace-eol.txt) | The patch has 6 line(s) that end in whitespace. Use git apply --whitespace=fix <<patch_file>>. Refer https://git-scm.com/docs/git-apply |
| +1 :green_heart: | shadedclient | 13m 21s | | patch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 1m 34s | | the patch passed with JDK Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 2m 13s | | the patch passed with JDK Private Build-1.8.0_275-8u275-b01-0ubuntu1~20.04-b01 |
| -1 :x: | findbugs | 1m 32s | [/new-findbugs-hadoop-tools_hadoop-aws.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2684/1/artifact/out/new-findbugs-hadoop-tools_hadoop-aws.html) | hadoop-tools/hadoop-aws generated 3 new + 0 unchanged - 0 fixed = 3 total (was 0) |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 17m 50s | | hadoop-common in the patch passed. |
| -1 :x: | unit | 2m 4s | [/patch-unit-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2684/1/artifact/out/patch-unit-hadoop-tools_hadoop-aws.txt) | hadoop-aws in the patch failed. |
| +1 :green_heart: | asflicense | 0m 55s | | The patch does not generate ASF License warnings. |
| | | 203m 2s | | |
| Reason | Tests |
|-------:|:------|
| FindBugs | module:hadoop-tools/hadoop-aws |
| | Dead store to block in org.apache.hadoop.fs.s3a.S3ABlockOutputStream.abort() At S3ABlockOutputStream.java:org.apache.hadoop.fs.s3a.S3ABlockOutputStream.abort() At S3ABlockOutputStream.java:[line 454] |
| | Inconsistent synchronization of org.apache.hadoop.fs.s3a.S3ABlockOutputStream.activeBlock; locked 75% of time Unsynchronized access at S3ABlockOutputStream.java:75% of time Unsynchronized access at S3ABlockOutputStream.java:[line 233] |
| | Inconsistent synchronization of org.apache.hadoop.fs.s3a.S3ABlockOutputStream.multiPartUpload; locked 50% of time Unsynchronized access at S3ABlockOutputStream.java:50% of time Unsynchronized access at S3ABlockOutputStream.java:[line 457] |
| Failed junit tests | hadoop.fs.s3a.TestS3ABlockOutputStream |
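The two FindBugs patterns above can be hard to read from the report alone. The following is an illustrative sketch only, not the actual S3ABlockOutputStream code: a field that is written under a lock but read without one triggers the "inconsistent synchronization" warning, and a local variable assigned but never used afterwards triggers the "dead store" warning.

```java
// Illustrative sketch of the two FindBugs patterns reported above.
// SyncExample is a hypothetical class, not S3ABlockOutputStream itself.
public class SyncExample {
    private Object activeBlock;                 // guarded on some paths only

    public synchronized void setBlock(Object b) {
        activeBlock = b;                        // synchronized write
    }

    public Object peekBlock() {
        // Unsynchronized read of a field that is locked elsewhere:
        // this mix is what FindBugs flags as IS2_INCONSISTENT_SYNC
        // ("locked 75% of time").
        return activeBlock;
    }

    public synchronized void abort() {
        // The local 'block' is assigned but never used afterwards, so
        // FindBugs flags it as a dead store (DLS_DEAD_LOCAL_STORE),
        // matching "Dead store to block in ... abort()".
        Object block = activeBlock;
        activeBlock = null;
    }
}
```

The usual fixes are to synchronize every access to the field (or make it volatile/atomic) and to delete, or actually use, the dead local.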
| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2684/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2684 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle markdownlint |
| uname | Linux c3364b0a3463 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / c22c77af436 |
| Default Java | Private Build-1.8.0_275-8u275-b01-0ubuntu1~20.04-b01 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.9.1+1-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_275-8u275-b01-0ubuntu1~20.04-b01 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2684/1/testReport/ |
| Max. process+thread count | 1377 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws U: . |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2684/1/console |
| versions | git=2.25.1 maven=3.6.3 findbugs=4.0.6 |
| Powered by | Apache Yetus 0.13.0-SNAPSHOT https://yetus.apache.org |
This message was automatically generated.
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 548879)
Time Spent: 3h 10m (was: 3h)
> Add some Abortable.abort() interface for streams etc which can be terminated
> ----------------------------------------------------------------------------
>
> Key: HADOOP-16906
> URL: https://issues.apache.org/jira/browse/HADOOP-16906
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: fs, fs/azure, fs/s3
> Affects Versions: 3.3.0
> Reporter: Steve Loughran
> Assignee: Jungtaek Lim
> Priority: Blocker
> Labels: pull-request-available
> Time Spent: 3h 10m
> Remaining Estimate: 0h
>
> Some IO we want to be able to abort rather than close cleanly, especially if
> the inner stream is an HTTP connection which itself supports some abort()
> method. For example: uploads to an object where we want to cancel the upload
> without close() making an incomplete write visible.
> Proposed: Add a generic interface which things like streams can implement
> {code}
> public interface AbortableIO {
>   void abortIO() throws IOException;
> }
> {code}
> +do for the s3a output stream. I wouldn't make this a passthrough on
> FSDataOutputStream because we need to consider what expectations callers have
> of an operation being "aborted".
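A minimal Java sketch of the interface proposed in the quoted description. The names `AbortableIO`/`abortIO` are the proposal's placeholders, and `UploadStream` is a hypothetical implementor added here to show the intended contract: `close()` makes a write visible, while abort cancels it so an incomplete write is never observed. The final API in Hadoop may differ.

```java
import java.io.IOException;
import java.io.OutputStream;

// Interface as sketched in the issue description (placeholder names).
interface AbortableIO {
    /** Abort the ongoing IO, discarding any uncommitted data. */
    void abortIO() throws IOException;
}

// Hypothetical stream illustrating the close()-vs-abort() contract for
// object-store uploads: after abortIO(), nothing becomes visible and
// further writes fail.
class UploadStream extends OutputStream implements AbortableIO {
    private boolean aborted;

    @Override
    public void write(int b) throws IOException {
        if (aborted) {
            throw new IOException("stream aborted");
        }
        // buffer bytes for the multipart upload (elided)
    }

    @Override
    public void abortIO() {
        aborted = true; // cancel the upload; the incomplete object is discarded
    }
}
```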
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]