[
https://issues.apache.org/jira/browse/HADOOP-17657?focusedWorklogId=588212&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-588212
]
ASF GitHub Bot logged work on HADOOP-17657:
-------------------------------------------
Author: ASF GitHub Bot
Created on: 24/Apr/21 00:15
Start Date: 24/Apr/21 00:15
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on pull request #2949:
URL: https://github.com/apache/hadoop/pull/2949#issuecomment-826001364
:broken_heart: **-1 overall**
| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 0m 39s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 34m 13s | | trunk passed |
| +1 :green_heart: | compile | 20m 49s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | compile | 18m 0s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | checkstyle | 1m 9s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 33s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 4s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 1m 34s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | spotbugs | 2m 18s | | trunk passed |
| +1 :green_heart: | shadedclient | 15m 30s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 52s | | the patch passed |
| +1 :green_heart: | compile | 19m 58s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| -1 :x: | javac | 19m 58s | [/results-compile-javac-root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2949/1/artifact/out/results-compile-javac-root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt) | root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 generated 1 new + 1936 unchanged - 0 fixed = 1937 total (was 1936) |
| +1 :green_heart: | compile | 17m 55s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| -1 :x: | javac | 17m 55s | [/results-compile-javac-root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2949/1/artifact/out/results-compile-javac-root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt) | root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 generated 1 new + 1831 unchanged - 0 fixed = 1832 total (was 1831) |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 1m 6s | [/results-checkstyle-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2949/1/artifact/out/results-checkstyle-hadoop-common-project_hadoop-common.txt) | hadoop-common-project/hadoop-common: The patch generated 1 new + 292 unchanged - 0 fixed = 293 total (was 292) |
| +1 :green_heart: | mvnsite | 1m 30s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 2s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 1m 38s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | spotbugs | 2m 29s | | the patch passed |
| +1 :green_heart: | shadedclient | 15m 50s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 18m 40s | | hadoop-common in the patch passed. |
| +1 :green_heart: | asflicense | 0m 55s | | The patch does not generate ASF License warnings. |
| | | 179m 17s | | |
| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2949/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2949 |
| JIRA Issue | HADOOP-17657 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux c69b6b6a0bfb 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 8ed1b9cbbb7165eaf8a711c134c03a3a5b08c239 |
| Default Java | Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2949/1/testReport/ |
| Max. process+thread count | 2799 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2949/1/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
This message was automatically generated.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 588212)
Time Spent: 20m (was: 10m)
> SequenceFile.Writer should implement StreamCapabilities
> -------------------------------------------------------
>
> Key: HADOOP-17657
> URL: https://issues.apache.org/jira/browse/HADOOP-17657
> Project: Hadoop Common
> Issue Type: Bug
> Reporter: Kishen Das
> Assignee: Kishen Das
> Priority: Major
> Labels: pull-request-available
> Time Spent: 20m
> Remaining Estimate: 0h
>
> The following exception is thrown whenever ProtoMessageWriter.hflush is invoked on
> S3 from Tez. That call internally goes through
> org.apache.hadoop.io.SequenceFile$Writer.hflush -> org.apache.hadoop.fs.FSDataOutputStream.hflush ->
> org.apache.hadoop.fs.s3a.S3ABlockOutputStream.hflush, which is not implemented and
> throws java.lang.UnsupportedOperationException:
> bdffe22d96ae [mdc@18060 class="yarn.YarnUncaughtExceptionHandler"
> level="ERROR" thread="HistoryEventHandlingThread"] Thread
> Thread[HistoryEventHandlingThread,5,main] threw an Exception.
> java.lang.UnsupportedOperationException: S3A streams are not Syncable
>     at org.apache.hadoop.fs.s3a.S3ABlockOutputStream.hflush(S3ABlockOutputStream.java:657)
>     at org.apache.hadoop.fs.FSDataOutputStream.hflush(FSDataOutputStream.java:136)
>     at org.apache.hadoop.io.SequenceFile$Writer.hflush(SequenceFile.java:1367)
>     at org.apache.tez.dag.history.logging.proto.ProtoMessageWriter.hflush(ProtoMessageWriter.java:64)
>     at org.apache.tez.dag.history.logging.proto.ProtoHistoryLoggingService.finishCurrentDag(ProtoHistoryLoggingService.java:239)
>     at org.apache.tez.dag.history.logging.proto.ProtoHistoryLoggingService.handleEvent(ProtoHistoryLoggingService.java:198)
>     at org.apache.tez.dag.history.logging.proto.ProtoHistoryLoggingService.loop(ProtoHistoryLoggingService.java:153)
>     at java.lang.Thread.run(Thread.java:748)
> To fix this, SequenceFile.Writer should implement StreamCapabilities so that callers
> can probe for hflush support and fall back to flush() when it is not available.
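One way to realize the first half of that suggestion is to have the writer implement org.apache.hadoop.fs.StreamCapabilities and delegate the probe to the FSDataOutputStream it wraps. The fragment below is only a minimal sketch of that idea, not the actual patch; the field name `out` is assumed for illustration and the real SequenceFile.Writer declares many more members.

```java
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.StreamCapabilities;

// Illustrative fragment only. "out" stands for the FSDataOutputStream the
// writer already holds; the actual field name in SequenceFile.java may differ.
public class Writer implements StreamCapabilities {

  private FSDataOutputStream out;

  @Override
  public boolean hasCapability(String capability) {
    // FSDataOutputStream itself implements StreamCapabilities and forwards the
    // probe to the stream it wraps (DFSOutputStream, S3ABlockOutputStream, ...),
    // so a straight delegation is enough to answer "will hflush() work here?".
    return out != null && out.hasCapability(capability);
  }
}
```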
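On the caller side, the "fall back to flush()" behaviour the description asks for could then look roughly like the helper below. It uses only the existing FSDataOutputStream and StreamCapabilities APIs; the class and method names are made up for the example.

```java
import java.io.IOException;

import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.StreamCapabilities;

// Hypothetical helper: probe the stream before asking for a durable flush, so
// stores such as S3A (whose streams are not Syncable) no longer fail with
// UnsupportedOperationException.
public final class FlushUtil {

  private FlushUtil() {
  }

  public static void hflushOrFlush(FSDataOutputStream out) throws IOException {
    if (out.hasCapability(StreamCapabilities.HFLUSH)) {
      out.hflush();   // durable flush where the store supports it, e.g. HDFS
    } else {
      out.flush();    // best-effort buffer flush on S3A and similar stores
    }
  }
}
```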
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]