[
https://issues.apache.org/jira/browse/HBASE-18377?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16090664#comment-16090664
]
Hadoop QA commented on HBASE-18377:
-----------------------------------
| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
| {color:blue}0{color} | {color:blue} reexec {color} | {color:blue} 0m 20s{color} | {color:blue} Docker mode activated. {color} |
| {color:blue}0{color} | {color:blue} patch {color} | {color:blue} 0m 2s{color} | {color:blue} The patch file was not named according to hbase's naming conventions. Please see https://yetus.apache.org/documentation/0.4.0/precommit-patchnames for instructions. {color} |
| {color:green}+1{color} | {color:green} hbaseanti {color} | {color:green} 0m 0s{color} | {color:green} Patch does not have any anti-patterns. {color} |
| {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} |
| {color:red}-1{color} | {color:red} test4tests {color} | {color:red} 0m 0s{color} | {color:red} The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 3m 26s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 0m 41s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 0m 50s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} mvneclipse {color} | {color:green} 0m 17s{color} | {color:green} master passed {color} |
| {color:red}-1{color} | {color:red} findbugs {color} | {color:red} 3m 0s{color} | {color:red} hbase-server in master has 9 extant Findbugs warnings. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 0m 27s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 0m 45s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 0m 39s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 0m 39s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 0m 53s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} mvneclipse {color} | {color:green} 0m 16s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:green}+1{color} | {color:green} hadoopcheck {color} | {color:green} 30m 45s{color} | {color:green} Patch does not cause any errors with Hadoop 2.6.1 2.6.2 2.6.3 2.6.4 2.6.5 2.7.1 2.7.2 2.7.3 or 3.0.0-alpha4. {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green} 3m 1s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 0m 30s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} unit {color} | {color:green} 118m 5s{color} | {color:green} hbase-server in the patch passed. {color} |
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green} 0m 17s{color} | {color:green} The patch does not generate ASF License warnings. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 164m 30s{color} | {color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| Docker | Client=1.12.3 Server=1.12.3 Image:yetus/hbase:757bf37 |
| JIRA Issue | HBASE-18377 |
| JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/12877116/18377.v1.txt |
| Optional Tests | asflicense javac javadoc unit findbugs hadoopcheck hbaseanti checkstyle compile |
| uname | Linux 1da7ae36da3b 3.13.0-119-generic #166-Ubuntu SMP Wed May 3 12:18:55 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | /home/jenkins/jenkins-slave/workspace/PreCommit-HBASE-Build/component/dev-support/hbase-personality.sh |
| git revision | master / 2d5a0fb |
| Default Java | 1.8.0_131 |
| findbugs | v3.1.0-RC3 |
| findbugs | https://builds.apache.org/job/PreCommit-HBASE-Build/7682/artifact/patchprocess/branch-findbugs-hbase-server-warnings.html |
| Test Results | https://builds.apache.org/job/PreCommit-HBASE-Build/7682/testReport/ |
| modules | C: hbase-server U: hbase-server |
| Console output | https://builds.apache.org/job/PreCommit-HBASE-Build/7682/console |
| Powered by | Apache Yetus 0.4.0 http://yetus.apache.org |
This message was automatically generated.
> Error handling for FileNotFoundException should consider RemoteException in ReplicationSource#openReader()
> ----------------------------------------------------------------------------------------------------------
>
> Key: HBASE-18377
> URL: https://issues.apache.org/jira/browse/HBASE-18377
> Project: HBase
> Issue Type: Bug
> Reporter: Ted Yu
> Attachments: 18377.branch-1.3.txt, 18377.v1.txt
>
>
> In region server log, I observed the following:
> {code}
> org.apache.hadoop.ipc.RemoteException(java.io.FileNotFoundException): File does not exist: /apps/hbase/data/WALs/lx.p.com,16020,1497300923131/497300923131.default.1497302873178
> at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
> at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
> at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1860)
> at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1831)
> at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1744)
> ...
> at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:326)
> at org.apache.hadoop.fs.FilterFileSystem.open(FilterFileSystem.java:162)
> at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:782)
> at org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:293)
> at org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:267)
> at org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:255)
> at org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:414)
> at org.apache.hadoop.hbase.replication.regionserver.ReplicationWALReaderManager.openReader(ReplicationWALReaderManager.java:69)
> at org.apache.hadoop.hbase.replication.regionserver.ReplicationSource.openReader(ReplicationSource.java:605)
> at org.apache.hadoop.hbase.replication.regionserver.ReplicationSource.run(ReplicationSource.java:364)
> {code}
> We have code in ReplicationSource#openReader() that is supposed to handle FileNotFoundException, but the case where the FileNotFoundException arrives wrapped in a RemoteException was missed, so the existing recovery logic is skipped. One way to cover it is to unwrap the RemoteException before checking the exception type, as sketched below.
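> A minimal sketch of that check, relying on Hadoop's RemoteException#unwrapRemoteException(); the helper name isFileNotFound and its placement are illustrative, not the actual HBASE-18377 patch:
> {code}
> import java.io.FileNotFoundException;
> import java.io.IOException;
>
> import org.apache.hadoop.ipc.RemoteException;
>
> // Illustrative helper: true when the IOException is, or wraps (via
> // RemoteException), a FileNotFoundException.
> private static boolean isFileNotFound(IOException ioe) {
>   if (ioe instanceof FileNotFoundException) {
>     return true;
>   }
>   if (ioe instanceof RemoteException) {
>     // unwrapRemoteException() returns the wrapped exception when its class
>     // matches one of the lookup types, otherwise the RemoteException itself.
>     IOException unwrapped =
>         ((RemoteException) ioe).unwrapRemoteException(FileNotFoundException.class);
>     return unwrapped instanceof FileNotFoundException;
>   }
>   return false;
> }
> {code}
> The catch block in openReader() would then call isFileNotFound(ioe) instead of a plain instanceof check, so a NameNode-side FileNotFoundException (which reaches the client as a RemoteException) takes the same missing-WAL recovery path as a local one.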
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)