[ https://issues.apache.org/jira/browse/HADOOP-17657?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17331202#comment-17331202 ]
Hadoop QA commented on HADOOP-17657:
------------------------------------
| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Logfile || Comment ||
| {color:blue}0{color} | {color:blue} reexec {color} | {color:blue} 16m 40s{color} | | {color:blue} Docker mode activated. {color} |
|| || || || {color:brown} Prechecks {color} || ||
| {color:green}+1{color} | {color:green} dupname {color} | {color:green} 0m 0s{color} | | {color:green} No case conflicting files found. {color} |
| {color:blue}0{color} | {color:blue} codespell {color} | {color:blue} 0m 0s{color} | | {color:blue} codespell was not available. {color} |
| {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | | {color:green} The patch does not contain any @author tags. {color} |
| {color:red}-1{color} | {color:red} test4tests {color} | {color:red} 0m 0s{color} | | {color:red} The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. {color} |
|| || || || {color:brown} trunk Compile Tests {color} || ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 33m 58s{color} | | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 20m 35s{color} | | {color:green} trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 17m 58s{color} | | {color:green} trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 1m 8s{color} | | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green} 1m 32s{color} | | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 5s{color} | | {color:green} trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 35s{color} | | {color:green} trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 {color} |
| {color:green}+1{color} | {color:green} spotbugs {color} | {color:green} 2m 21s{color} | | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} shadedclient {color} | {color:green} 15m 41s{color} | | {color:green} branch has no errors when building and testing our client artifacts. {color} |
|| || || || {color:brown} Patch Compile Tests {color} || ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 0m 54s{color} | | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 19m 59s{color} | | {color:green} the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 19m 59s{color} | | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 17m 59s{color} | | {color:green} the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 17m 59s{color} | | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} blanks {color} | {color:green} 0m 0s{color} | | {color:green} The patch has no blanks issues. {color} |
| {color:orange}-0{color} | {color:orange} checkstyle {color} | {color:orange} 1m 7s{color} | [/results-checkstyle-hadoop-common-project_hadoop-common.txt|https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2949/2/artifact/out/results-checkstyle-hadoop-common-project_hadoop-common.txt] | {color:orange} hadoop-common-project/hadoop-common: The patch generated 1 new + 292 unchanged - 0 fixed = 293 total (was 292) {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green} 1m 30s{color} | | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 1s{color} | | {color:green} the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 40s{color} | | {color:green} the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 {color} |
| {color:green}+1{color} | {color:green} spotbugs {color} | {color:green} 2m 30s{color} | | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} shadedclient {color} | {color:green} 15m 44s{color} | | {color:green} patch has no errors when building and testing our client artifacts. {color} |
|| || || || {color:brown} Other Tests {color} || ||
| {color:green}+1{color} | {color:green} unit {color} | {color:green} 17m 25s{color} | | {color:green} hadoop-common in the patch passed. {color} |
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green} 0m 56s{color} | | {color:green} The patch does not generate ASF License warnings. {color} |
| {color:black}{color} | {color:black} {color} | {color:black}193m 58s{color} | | {color:black}{color} |
\\
\\
|| Subsystem || Report/Notes ||
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2949/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2949 |
| JIRA Issue | HADOOP-17657 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux a478e58eea4a 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 354f446222ce51a3584d2f74876796673e798526 |
| Default Java | Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2949/2/testReport/ |
| Max. process+thread count | 1295 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2949/2/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
This message was automatically generated.
> SequenceFile.Writer should implement StreamCapabilities
> ------------------------------------------------------
>
> Key: HADOOP-17657
> URL: https://issues.apache.org/jira/browse/HADOOP-17657
> Project: Hadoop Common
> Issue Type: Bug
> Reporter: Kishen Das
> Assignee: Kishen Das
> Priority: Major
> Labels: pull-request-available
> Time Spent: 20m
> Remaining Estimate: 0h
>
> The following exception is thrown whenever Tez invokes ProtoMessageWriter.hflush on S3. The call
> goes through org.apache.hadoop.io.SequenceFile$Writer.hflush ->
> org.apache.hadoop.fs.FSDataOutputStream.hflush -> org.apache.hadoop.fs.s3a.S3ABlockOutputStream.hflush,
> which is not supported on S3A and throws java.lang.UnsupportedOperationException:
> bdffe22d96ae [mdc@18060 class="yarn.YarnUncaughtExceptionHandler" level="ERROR" thread="HistoryEventHandlingThread"] Thread Thread[HistoryEventHandlingThread,5,main] threw an Exception.
> java.lang.UnsupportedOperationException: S3A streams are not Syncable
>   at org.apache.hadoop.fs.s3a.S3ABlockOutputStream.hflush(S3ABlockOutputStream.java:657)
>   at org.apache.hadoop.fs.FSDataOutputStream.hflush(FSDataOutputStream.java:136)
>   at org.apache.hadoop.io.SequenceFile$Writer.hflush(SequenceFile.java:1367)
>   at org.apache.tez.dag.history.logging.proto.ProtoMessageWriter.hflush(ProtoMessageWriter.java:64)
>   at org.apache.tez.dag.history.logging.proto.ProtoHistoryLoggingService.finishCurrentDag(ProtoHistoryLoggingService.java:239)
>   at org.apache.tez.dag.history.logging.proto.ProtoHistoryLoggingService.handleEvent(ProtoHistoryLoggingService.java:198)
>   at org.apache.tez.dag.history.logging.proto.ProtoHistoryLoggingService.loop(ProtoHistoryLoggingService.java:153)
>   at java.lang.Thread.run(Thread.java:748)
> To fix this issue, SequenceFile.Writer should implement StreamCapabilities. Callers can then
> fall back to flush() when hflush() is not supported; a minimal sketch of that probe follows.
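> A minimal sketch of the fallback described above, written against FSDataOutputStream, which
> already implements StreamCapabilities; with this change the same hasCapability() probe would
> also be available on SequenceFile.Writer itself. The class name, the flushBestEffort helper,
> and the path/FileSystem setup are illustrative, not part of the actual patch.
> {code:java}
> import java.io.IOException;
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FSDataOutputStream;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.fs.StreamCapabilities;
>
> public class HflushFallback {
>
>   /** Flush durably where the stream supports it, best-effort otherwise. */
>   static void flushBestEffort(FSDataOutputStream out) throws IOException {
>     // Probe for hflush support instead of catching the
>     // UnsupportedOperationException thrown by S3ABlockOutputStream.
>     if (out.hasCapability(StreamCapabilities.HFLUSH)) {
>       out.hflush();   // durable flush on HDFS and other Syncable stores
>     } else {
>       out.flush();    // best-effort fallback on stores such as S3A
>     }
>   }
>
>   public static void main(String[] args) throws IOException {
>     Configuration conf = new Configuration();
>     Path target = new Path(args[0]);  // e.g. an s3a:// or hdfs:// URI
>     FileSystem fs = target.getFileSystem(conf);
>     try (FSDataOutputStream out = fs.create(target)) {
>       out.writeBytes("event\n");
>       flushBestEffort(out);
>     }
>   }
> }
> {code}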
--
This message was sent by Atlassian Jira
(v8.3.4#803005)