[
https://issues.apache.org/jira/browse/HBASE-18377?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16086317#comment-16086317
]
Hadoop QA commented on HBASE-18377:
-----------------------------------
| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
| {color:blue}0{color} | {color:blue} reexec {color} | {color:blue} 10m 49s{color} | {color:blue} Docker mode activated. {color} |
| {color:blue}0{color} | {color:blue} patch {color} | {color:blue} 0m 1s{color} | {color:blue} The patch file was not named according to hbase's naming conventions. Please see https://yetus.apache.org/documentation/0.4.0/precommit-patchnames for instructions. {color} |
| {color:green}+1{color} | {color:green} hbaseanti {color} | {color:green} 0m 0s{color} | {color:green} Patch does not have any anti-patterns. {color} |
| {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} |
| {color:red}-1{color} | {color:red} test4tests {color} | {color:red} 0m 0s{color} | {color:red} The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 4m 51s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 0m 49s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 0m 55s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} mvneclipse {color} | {color:green} 0m 20s{color} | {color:green} master passed {color} |
| {color:red}-1{color} | {color:red} findbugs {color} | {color:red} 2m 55s{color} | {color:red} hbase-server in master has 9 extant Findbugs warnings. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 0m 30s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 0m 46s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 0m 42s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 0m 42s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 0m 47s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} mvneclipse {color} | {color:green} 0m 15s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:green}+1{color} | {color:green} hadoopcheck {color} | {color:green} 31m 0s{color} | {color:green} Patch does not cause any errors with Hadoop 2.6.1 2.6.2 2.6.3 2.6.4 2.6.5 2.7.1 2.7.2 2.7.3 or 3.0.0-alpha4. {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green} 3m 4s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 0m 28s{color} | {color:green} the patch passed {color} |
| {color:red}-1{color} | {color:red} unit {color} | {color:red} 134m 4s{color} | {color:red} hbase-server in the patch failed. {color} |
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green} 0m 29s{color} | {color:green} The patch does not generate ASF License warnings. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 193m 7s{color} | {color:black} {color} |
\\
\\
|| Reason || Tests ||
| Failed junit tests | hadoop.hbase.master.procedure.TestServerCrashProcedure |
\\
\\
|| Subsystem || Report/Notes ||
| Docker | Client=1.11.2 Server=1.11.2 Image:yetus/hbase:757bf37 |
| JIRA Issue | HBASE-18377 |
| JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/12877116/18377.v1.txt |
| Optional Tests | asflicense javac javadoc unit findbugs hadoopcheck hbaseanti checkstyle compile |
| uname | Linux 2f86b7a385ae 3.13.0-123-generic #172-Ubuntu SMP Mon Jun 26 18:04:35 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | /home/jenkins/jenkins-slave/workspace/PreCommit-HBASE-Build/component/dev-support/hbase-personality.sh |
| git revision | master / 500592d |
| Default Java | 1.8.0_131 |
| findbugs | v3.1.0-RC3 |
| findbugs | https://builds.apache.org/job/PreCommit-HBASE-Build/7649/artifact/patchprocess/branch-findbugs-hbase-server-warnings.html |
| unit | https://builds.apache.org/job/PreCommit-HBASE-Build/7649/artifact/patchprocess/patch-unit-hbase-server.txt |
| Test Results | https://builds.apache.org/job/PreCommit-HBASE-Build/7649/testReport/ |
| modules | C: hbase-server U: hbase-server |
| Console output | https://builds.apache.org/job/PreCommit-HBASE-Build/7649/console |
| Powered by | Apache Yetus 0.4.0 http://yetus.apache.org |
This message was automatically generated.
> Error handling for FileNotFoundException should consider RemoteException in ReplicationSource#openReader()
> ----------------------------------------------------------------------------------------------------------
>
> Key: HBASE-18377
> URL: https://issues.apache.org/jira/browse/HBASE-18377
> Project: HBase
> Issue Type: Bug
> Reporter: Ted Yu
> Attachments: 18377.branch-1.3.txt, 18377.v1.txt
>
>
> In region server log, I observed the following:
> {code}
> org.apache.hadoop.ipc.RemoteException(java.io.FileNotFoundException): File does not exist: /apps/hbase/data/WALs/lx.p.com,16020,1497300923131/497300923131.default.1497302873178
>   at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
>   at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
>   at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1860)
>   at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1831)
>   at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1744)
>   ...
>   at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>   at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:326)
>   at org.apache.hadoop.fs.FilterFileSystem.open(FilterFileSystem.java:162)
>   at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:782)
>   at org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:293)
>   at org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:267)
>   at org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:255)
>   at org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:414)
>   at org.apache.hadoop.hbase.replication.regionserver.ReplicationWALReaderManager.openReader(ReplicationWALReaderManager.java:69)
>   at org.apache.hadoop.hbase.replication.regionserver.ReplicationSource.openReader(ReplicationSource.java:605)
>   at org.apache.hadoop.hbase.replication.regionserver.ReplicationSource.run(ReplicationSource.java:364)
> {code}
> The code in ReplicationSource#openReader() is supposed to handle FileNotFoundException, but the case where the FileNotFoundException arrives wrapped in a RemoteException was missed.
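Below is a minimal sketch, not the actual patch, of how the missed case could be detected in openReader()'s error handling, assuming the standard Hadoop RemoteException#unwrapRemoteException API; the class and helper names are illustrative only.
{code}
import java.io.FileNotFoundException;
import java.io.IOException;

import org.apache.hadoop.ipc.RemoteException;

public final class OpenReaderErrorHandlingSketch {
  /**
   * Returns true when the IOException is a FileNotFoundException, either
   * directly or wrapped in a RemoteException returned by the NameNode.
   * Hypothetical helper for illustration; not part of the HBase patch.
   */
  static boolean isFileNotFound(IOException ioe) {
    if (ioe instanceof FileNotFoundException) {
      return true;
    }
    if (ioe instanceof RemoteException) {
      // unwrapRemoteException returns the wrapped exception when its class
      // matches one of the lookup types, otherwise the RemoteException itself.
      IOException unwrapped =
          ((RemoteException) ioe).unwrapRemoteException(FileNotFoundException.class);
      return unwrapped instanceof FileNotFoundException;
    }
    return false;
  }
}
{code}
With a check along these lines, the existing FileNotFoundException recovery path in openReader() could also fire when the NameNode reports the missing WAL through the RPC layer, which is how it surfaces in the stack trace above.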