[ https://issues.apache.org/jira/browse/HADOOP-14418?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16009241#comment-16009241 ]
Hadoop QA commented on HADOOP-14418:
------------------------------------
| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
| {color:blue}0{color} | {color:blue} reexec {color} | {color:blue} 0m 18s{color} | {color:blue} Docker mode activated. {color} |
| {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} |
| {color:red}-1{color} | {color:red} test4tests {color} | {color:red} 0m 0s{color} | {color:red} The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 14m 41s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 17m 57s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 0m 41s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green} 1m 12s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} mvneclipse {color} | {color:green} 0m 21s{color} | {color:green} trunk passed {color} |
| {color:red}-1{color} | {color:red} findbugs {color} | {color:red} 1m 35s{color} | {color:red} hadoop-common-project/hadoop-common in trunk has 19 extant Findbugs warnings. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 0m 56s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 0m 45s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 15m 19s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 15m 19s{color} | {color:green} the patch passed {color} |
| {color:orange}-0{color} | {color:orange} checkstyle {color} | {color:orange} 0m 36s{color} | {color:orange} hadoop-common-project/hadoop-common: The patch generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green} 1m 13s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} mvneclipse {color} | {color:green} 0m 22s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green} 1m 57s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 0m 57s{color} | {color:green} the patch passed {color} |
| {color:red}-1{color} | {color:red} unit {color} | {color:red} 9m 54s{color} | {color:red} hadoop-common in the patch failed. {color} |
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green} 0m 43s{color} | {color:green} The patch does not generate ASF License warnings. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 71m 33s{color} | {color:black} {color} |
\\
\\
|| Reason || Tests ||
| Failed junit tests | hadoop.security.TestRaceWhenRelogin |
| | hadoop.io.erasurecode.coder.TestHHXORErasureCoder |
| | hadoop.ha.TestZKFailoverController |
| | hadoop.io.erasurecode.coder.TestXORCoder |
| | hadoop.io.erasurecode.coder.TestRSErasureCoder |
| | hadoop.io.erasurecode.TestCodecRawCoderMapping |
\\
\\
|| Subsystem || Report/Notes ||
| Docker | Image:yetus/hadoop:14b5c93 |
| JIRA Issue | HADOOP-14418 |
| JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/12867934/HADOOP-14418.01.patch |
| Optional Tests | asflicense compile javac javadoc mvninstall mvnsite unit findbugs checkstyle |
| uname | Linux dd8e17b33dad 3.13.0-116-generic #163-Ubuntu SMP Fri Mar 31 14:13:22 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | /testptch/hadoop/patchprocess/precommit/personality/provided.sh |
| git revision | trunk / 6600abb |
| Default Java | 1.8.0_131 |
| findbugs | v3.1.0-RC1 |
| findbugs | https://builds.apache.org/job/PreCommit-HADOOP-Build/12303/artifact/patchprocess/branch-findbugs-hadoop-common-project_hadoop-common-warnings.html |
| checkstyle | https://builds.apache.org/job/PreCommit-HADOOP-Build/12303/artifact/patchprocess/diff-checkstyle-hadoop-common-project_hadoop-common.txt |
| unit | https://builds.apache.org/job/PreCommit-HADOOP-Build/12303/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common.txt |
| Test Results | https://builds.apache.org/job/PreCommit-HADOOP-Build/12303/testReport/ |
| modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common |
| Console output | https://builds.apache.org/job/PreCommit-HADOOP-Build/12303/console |
| Powered by | Apache Yetus 0.5.0-SNAPSHOT http://yetus.apache.org |
This message was automatically generated.
> Confusing failure stack trace when codec fallback happens
> ---------------------------------------------------------
>
> Key: HADOOP-14418
> URL: https://issues.apache.org/jira/browse/HADOOP-14418
> Project: Hadoop Common
> Issue Type: Sub-task
> Reporter: Kai Sasaki
> Assignee: Kai Sasaki
> Priority: Minor
> Attachments: HADOOP-14418.01.patch
>
>
> When the erasure codec falls back, the full stack trace is shown to the client.
> {code}
> root@990705591ccc:/usr/local/hadoop# bin/hadoop fs -put README.txt /ec
> 17/05/13 08:23:46 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 17/05/13 08:23:47 WARN erasurecode.CodecUtil: Failed to create raw erasure encoder rs_native, fallback to next codec if possible
> java.lang.ExceptionInInitializerError
> at org.apache.hadoop.io.erasurecode.rawcoder.NativeRSRawErasureCoderFactory.createEncoder(NativeRSRawErasureCoderFactory.java:35)
> at org.apache.hadoop.io.erasurecode.CodecUtil.createRawEncoderWithFallback(CodecUtil.java:173)
> at org.apache.hadoop.io.erasurecode.CodecUtil.createRawEncoder(CodecUtil.java:129)
> at org.apache.hadoop.hdfs.DFSStripedOutputStream.<init>(DFSStripedOutputStream.java:302)
> at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:309)
> at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1214)
> at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1193)
> at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1131)
> at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:449)
> at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:446)
> at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:460)
> at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
> at org.apache.hadoop.fs.FilterFileSystem.create(FilterFileSystem.java:181)
> at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1074)
> at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1054)
> at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:943)
> at org.apache.hadoop.fs.shell.CommandWithDestination$TargetFileSystem.create(CommandWithDestination.java:509)
> at org.apache.hadoop.fs.shell.CommandWithDestination$TargetFileSystem.writeStreamToFile(CommandWithDestination.java:484)
> at org.apache.hadoop.fs.shell.CommandWithDestination.copyStreamToTarget(CommandWithDestination.java:407)
> at org.apache.hadoop.fs.shell.CommandWithDestination.copyFileToTarget(CommandWithDestination.java:342)
> at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:277)
> at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:262)
> at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:331)
> at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:303)
> at org.apache.hadoop.fs.shell.CommandWithDestination.processPathArgument(CommandWithDestination.java:257)
> at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:285)
> at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:269)
> at org.apache.hadoop.fs.shell.CommandWithDestination.processArguments(CommandWithDestination.java:228)
> at org.apache.hadoop.fs.shell.CopyCommands$Put.processArguments(CopyCommands.java:286)
> at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:119)
> at org.apache.hadoop.fs.shell.Command.run(Command.java:176)
> at org.apache.hadoop.fs.FsShell.run(FsShell.java:326)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
> at org.apache.hadoop.fs.FsShell.main(FsShell.java:389)
> Caused by: java.lang.RuntimeException: hadoop native library cannot be loaded.
> at org.apache.hadoop.io.erasurecode.ErasureCodeNative.checkNativeCodeLoaded(ErasureCodeNative.java:69)
> at org.apache.hadoop.io.erasurecode.rawcoder.NativeRSRawEncoder.<clinit>(NativeRSRawEncoder.java:33)
> ... 36 more
> root@990705591ccc:/usr/local/hadoop#
> {code}
> This message is confusing to users because it looks as if writing the EC file failed. It would be more useful to print only the error message; the full stack trace comes from the current warning, which passes the exception to the logger:
> {code}
> LOG.warn("Failed to create raw erasure encoder " + rawCoderName +
> ", fallback to next codec if possible", e);
> {code}
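> One possible shape for such a change (an illustrative sketch only, not the attached patch; it reuses {{rawCoderName}} and {{e}} from the surrounding method and assumes the full trace should still be available at DEBUG level):
> {code}
> // Sketch: emit a one-line summary at WARN (no stack trace) and keep the
> // full exception at DEBUG for troubleshooting. Both calls exist in the
> // commons-logging and SLF4J logger APIs.
> LOG.warn("Failed to create raw erasure encoder " + rawCoderName +
>     ", fallback to next codec if possible: " + e);
> LOG.debug("Failed to create raw erasure encoder " + rawCoderName, e);
> {code}
> This would keep the fallback quiet for {{hadoop fs}} users while still exposing the root cause when debug logging is enabled.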