[
https://issues.apache.org/jira/browse/HADOOP-17657?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17331110#comment-17331110
]
Hadoop QA commented on HADOOP-17657:
------------------------------------
| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Logfile || Comment ||
| {color:blue}0{color} | {color:blue} reexec {color} | {color:blue} 0m 39s{color} | | {color:blue} Docker mode activated. {color} |
|| || || || {color:brown} Prechecks {color} || ||
| {color:green}+1{color} | {color:green} dupname {color} | {color:green} 0m 0s{color} | | {color:green} No case conflicting files found. {color} |
| {color:blue}0{color} | {color:blue} codespell {color} | {color:blue} 0m 1s{color} | | {color:blue} codespell was not available. {color} |
| {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | | {color:green} The patch does not contain any @author tags. {color} |
| {color:red}-1{color} | {color:red} test4tests {color} | {color:red} 0m 0s{color} | | {color:red} The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. {color} |
|| || || || {color:brown} trunk Compile Tests {color} || ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 34m 13s{color} | | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 20m 49s{color} | | {color:green} trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 18m 0s{color} | | {color:green} trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 1m 9s{color} | | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green} 1m 33s{color} | | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 4s{color} | | {color:green} trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 34s{color} | | {color:green} trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 {color} |
| {color:green}+1{color} | {color:green} spotbugs {color} | {color:green} 2m 18s{color} | | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} shadedclient {color} | {color:green} 15m 30s{color} | | {color:green} branch has no errors when building and testing our client artifacts. {color} |
|| || || || {color:brown} Patch Compile Tests {color} || ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 0m 52s{color} | | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 19m 58s{color} | | {color:green} the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 {color} |
| {color:red}-1{color} | {color:red} javac {color} | {color:red} 19m 58s{color} | [/results-compile-javac-root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt|https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2949/1/artifact/out/results-compile-javac-root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04.txt] | {color:red} root-jdkUbuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 generated 1 new + 1936 unchanged - 0 fixed = 1937 total (was 1936) {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 17m 55s{color} | | {color:green} the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 {color} |
| {color:red}-1{color} | {color:red} javac {color} | {color:red} 17m 55s{color} | [/results-compile-javac-root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt|https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2949/1/artifact/out/results-compile-javac-root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08.txt] | {color:red} root-jdkPrivateBuild-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 generated 1 new + 1831 unchanged - 0 fixed = 1832 total (was 1831) {color} |
| {color:green}+1{color} | {color:green} blanks {color} | {color:green} 0m 0s{color} | | {color:green} The patch has no blanks issues. {color} |
| {color:orange}-0{color} | {color:orange} checkstyle {color} | {color:orange} 1m 6s{color} | [/results-checkstyle-hadoop-common-project_hadoop-common.txt|https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2949/1/artifact/out/results-checkstyle-hadoop-common-project_hadoop-common.txt] | {color:orange} hadoop-common-project/hadoop-common: The patch generated 1 new + 292 unchanged - 0 fixed = 293 total (was 292) {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green} 1m 30s{color} | | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 2s{color} | | {color:green} the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 38s{color} | | {color:green} the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 {color} |
| {color:green}+1{color} | {color:green} spotbugs {color} | {color:green} 2m 29s{color} | | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} shadedclient {color} | {color:green} 15m 50s{color} | | {color:green} patch has no errors when building and testing our client artifacts. {color} |
|| || || || {color:brown} Other Tests {color} || ||
| {color:green}+1{color} | {color:green} unit {color} | {color:green} 18m 40s{color} | | {color:green} hadoop-common in the patch passed. {color} |
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green} 0m 55s{color} | | {color:green} The patch does not generate ASF License warnings. {color} |
| {color:black}{color} | {color:black} {color} | {color:black}179m 17s{color} | | {color:black}{color} |
\\
\\
|| Subsystem || Report/Notes ||
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2949/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2949 |
| JIRA Issue | HADOOP-17657 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux c69b6b6a0bfb 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 8ed1b9cbbb7165eaf8a711c134c03a3a5b08c239 |
| Default Java | Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2949/1/testReport/ |
| Max. process+thread count | 2799 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2949/1/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
This message was automatically generated.
> SequenceFile.Writer should implement StreamCapabilities
> -------------------------------------------------------
>
> Key: HADOOP-17657
> URL: https://issues.apache.org/jira/browse/HADOOP-17657
> Project: Hadoop Common
> Issue Type: Bug
> Reporter: Kishen Das
> Assignee: Kishen Das
> Priority: Major
> Labels: pull-request-available
> Time Spent: 10m
> Remaining Estimate: 0h
>
> The following exception is thrown whenever ProtoMessageWriter.hflush is invoked against S3 from Tez. The call goes through
> org.apache.hadoop.io.SequenceFile$Writer.hflush -> org.apache.hadoop.fs.FSDataOutputStream.hflush -> S3ABlockOutputStream.hflush,
> which is not implemented and throws java.lang.UnsupportedOperationException.
> bdffe22d96ae [mdc@18060 class="yarn.YarnUncaughtExceptionHandler" level="ERROR" thread="HistoryEventHandlingThread"] Thread Thread[HistoryEventHandlingThread,5,main] threw an Exception.
> java.lang.UnsupportedOperationException: S3A streams are not Syncable
>         at org.apache.hadoop.fs.s3a.S3ABlockOutputStream.hflush(S3ABlockOutputStream.java:657)
>         at org.apache.hadoop.fs.FSDataOutputStream.hflush(FSDataOutputStream.java:136)
>         at org.apache.hadoop.io.SequenceFile$Writer.hflush(SequenceFile.java:1367)
>         at org.apache.tez.dag.history.logging.proto.ProtoMessageWriter.hflush(ProtoMessageWriter.java:64)
>         at org.apache.tez.dag.history.logging.proto.ProtoHistoryLoggingService.finishCurrentDag(ProtoHistoryLoggingService.java:239)
>         at org.apache.tez.dag.history.logging.proto.ProtoHistoryLoggingService.handleEvent(ProtoHistoryLoggingService.java:198)
>         at org.apache.tez.dag.history.logging.proto.ProtoHistoryLoggingService.loop(ProtoHistoryLoggingService.java:153)
>         at java.lang.Thread.run(Thread.java:748)
> To fix this, SequenceFile.Writer should implement StreamCapabilities. It should also fall back to flush() when hflush() is not supported by the underlying stream (see the sketch below).
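A minimal sketch of that approach, for illustration only: it is not the code from PR 2949, and the wrapper class name CapabilityAwareWriter is hypothetical. The writer advertises the capabilities of the stream it wraps, and hflush()/hsync() degrade to flush() when the underlying stream is not Syncable, as with S3ABlockOutputStream.

{code:java}
import java.io.Closeable;
import java.io.IOException;

import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.StreamCapabilities;
import org.apache.hadoop.fs.Syncable;

/**
 * Illustrative wrapper (hypothetical class, not the real SequenceFile.Writer):
 * expose the wrapped stream's capabilities and fall back to flush() when
 * hflush()/hsync() are unsupported, e.g. on S3ABlockOutputStream.
 */
class CapabilityAwareWriter implements Closeable, Syncable, StreamCapabilities {

  private final FSDataOutputStream out;

  CapabilityAwareWriter(FSDataOutputStream out) {
    this.out = out;
  }

  @Override
  public boolean hasCapability(String capability) {
    // FSDataOutputStream already implements StreamCapabilities; delegate so
    // callers can probe for HFLUSH/HSYNC before invoking them.
    return out.hasCapability(capability);
  }

  @Override
  public void hflush() throws IOException {
    if (out.hasCapability(StreamCapabilities.HFLUSH)) {
      out.hflush();
    } else {
      // Degrade gracefully instead of throwing UnsupportedOperationException.
      out.flush();
    }
  }

  @Override
  public void hsync() throws IOException {
    if (out.hasCapability(StreamCapabilities.HSYNC)) {
      out.hsync();
    } else {
      out.flush();
    }
  }

  @Override
  public void close() throws IOException {
    out.close();
  }
}
{code}

A caller such as ProtoMessageWriter could likewise probe hasCapability(StreamCapabilities.HFLUSH) before calling hflush(), rather than relying on the exception.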