[jira] [Commented] (HBASE-16644) Errors when reading legit HFile' Trailer on branch 1.3

2016-10-01 Thread Anoop Sam John (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-16644?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15538167#comment-15538167
 ] 

Anoop Sam John commented on HBASE-16644:


Patch looks good.

> Errors when reading legit HFile' Trailer on branch 1.3
> --
>
> Key: HBASE-16644
> URL: https://issues.apache.org/jira/browse/HBASE-16644
> Project: HBase
>  Issue Type: Bug
>  Components: HFile
>Affects Versions: 1.3.0, 1.4.0
>Reporter: Mikhail Antonov
>Assignee: Mikhail Antonov
>Priority: Critical
> Fix For: 1.3.0
>
> Attachments: HBASE-16644.branch-1.3.patch, 
> HBASE-16644.branch-1.3.patch
>
>
> There seems to be a regression in branch 1.3 where we can't read the HFile 
> trailer (getting "CorruptHFileException: Problem reading HFile Trailer") on 
> some HFiles that could be successfully read on 1.2.
> I've seen this error manifest in two ways so far.
> {code}
> Caused by: org.apache.hadoop.hbase.io.hfile.CorruptHFileException: Problem reading HFile Trailer from file  
>   at org.apache.hadoop.hbase.io.hfile.HFile.pickReaderVersion(HFile.java:497)
>   at org.apache.hadoop.hbase.io.hfile.HFile.createReader(HFile.java:525)
>   at org.apache.hadoop.hbase.regionserver.StoreFile$Reader.<init>(StoreFile.java:1164)
>   at org.apache.hadoop.hbase.regionserver.StoreFileInfo.open(StoreFileInfo.java:259)
>   at org.apache.hadoop.hbase.regionserver.StoreFile.open(StoreFile.java:427)
>   at org.apache.hadoop.hbase.regionserver.StoreFile.createReader(StoreFile.java:528)
>   at org.apache.hadoop.hbase.regionserver.StoreFile.createReader(StoreFile.java:518)
>   at org.apache.hadoop.hbase.regionserver.HStore.createStoreFileAndReader(HStore.java:652)
>   at org.apache.hadoop.hbase.regionserver.HStore.access$000(HStore.java:117)
>   at org.apache.hadoop.hbase.regionserver.HStore$1.call(HStore.java:519)
>   at org.apache.hadoop.hbase.regionserver.HStore$1.call(HStore.java:516)
>   ... 6 more
> Caused by: java.io.IOException: Invalid HFile block magic: \x00\x00\x04\x00\x00\x00\x00\x00
>   at org.apache.hadoop.hbase.io.hfile.BlockType.parse(BlockType.java:155)
>   at org.apache.hadoop.hbase.io.hfile.BlockType.read(BlockType.java:167)
>   at org.apache.hadoop.hbase.io.hfile.HFileBlock.<init>(HFileBlock.java:344)
>   at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderImpl.readBlockDataInternal(HFileBlock.java:1735)
>   at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderImpl.readBlockData(HFileBlock.java:1558)
>   at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader$1.nextBlock(HFileBlock.java:1397)
>   at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader$1.nextBlockWithBlockType(HFileBlock.java:1405)
>   at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.<init>(HFileReaderV2.java:156)
>   at org.apache.hadoop.hbase.io.hfile.HFile.pickReaderVersion(HFile.java:485)
> {code}
> and second
> {code}
> Caused by: org.apache.hadoop.hbase.io.hfile.CorruptHFileException: Problem reading HFile Trailer from file 
>   at org.apache.hadoop.hbase.io.hfile.HFile.pickReaderVersion(HFile.java:497)
>   at org.apache.hadoop.hbase.io.hfile.HFile.createReader(HFile.java:525)
>   at org.apache.hadoop.hbase.regionserver.StoreFile$Reader.<init>(StoreFile.java:1164)
>   at org.apache.hadoop.hbase.io.HalfStoreFileReader.<init>(HalfStoreFileReader.java:104)
>   at org.apache.hadoop.hbase.regionserver.StoreFileInfo.open(StoreFileInfo.java:256)
>   at org.apache.hadoop.hbase.regionserver.StoreFile.open(StoreFile.java:427)
>   at org.apache.hadoop.hbase.regionserver.StoreFile.createReader(StoreFile.java:528)
>   at org.apache.hadoop.hbase.regionserver.StoreFile.createReader(StoreFile.java:518)
>   at org.apache.hadoop.hbase.regionserver.HStore.createStoreFileAndReader(HStore.java:652)
>   at org.apache.hadoop.hbase.regionserver.HStore.access$000(HStore.java:117)
>   at org.apache.hadoop.hbase.regionserver.HStore$1.call(HStore.java:519)
>   at org.apache.hadoop.hbase.regionserver.HStore$1.call(HStore.java:516)
>   ... 6 more
> Caused by: java.io.IOException: Premature EOF from inputStream (read returned -1, was trying to read 10083 necessary bytes and 24 extra bytes, successfully read 1072)
>   at org.apache.hadoop.hbase.io.hfile.HFileBlock.readWithExtra(HFileBlock.java:737)
>   at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader.readAtOffset(HFileBlock.java:1459)
>   at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderImpl.readBlockDataInternal(HFileBlock.java:1712)
>   at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderImpl.readBlockData(HFileBlock.java:1558)

[jira] [Commented] (HBASE-16644) Errors when reading legit HFile' Trailer on branch 1.3

2016-09-30 Thread Gary Helmling (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-16644?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15537429#comment-15537429
 ] 

Gary Helmling commented on HBASE-16644:
---

The patch looks good to me, though I'm not that familiar with this code. It seems 
right that whether the checksum is included in the header size should be determined 
by the file format itself, rather than by whether we want to consult the checksum 
at that moment.

+1 from me, assuming the 3 test failures are unrelated.
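
Put differently, the header size is a property of the on-disk format (the minor version), not of what the reader wants to do at a given moment. Below is a minimal sketch of that distinction, assuming the HFile v2 header sizes (24 bytes without per-block checksum fields, 33 bytes with them, checksums arriving with minor version 1 in HBASE-5074); the class and method names are made up for illustration and this is not the actual patch:

{code}
// Sketch only: derive the header size from the file format, not from reader intent.
// Assumed v2 layout: magic(8) + onDiskSizeWithoutHeader(4) +
// uncompressedSizeWithoutHeader(4) + prevBlockOffset(8) = 24 bytes, plus
// checksumType(1) + bytesPerChecksum(4) + onDiskDataSizeWithHeader(4) = 33 bytes.
public final class HeaderSizeSketch {
  static final int HEADER_SIZE_NO_CHECKSUM = 24;
  static final int HEADER_SIZE_WITH_CHECKSUMS = 33;
  static final int MINOR_VERSION_WITH_CHECKSUM = 1; // introduced by HBASE-5074

  // Problematic: ties the header size to whether we happen to verify checksums now.
  static int headerSizeByIntent(boolean verifyChecksumNow) {
    return verifyChecksumNow ? HEADER_SIZE_WITH_CHECKSUMS : HEADER_SIZE_NO_CHECKSUM;
  }

  // Intended behaviour: tie the header size to what the file actually contains.
  static int headerSizeByFormat(int minorVersion) {
    return minorVersion >= MINOR_VERSION_WITH_CHECKSUM
        ? HEADER_SIZE_WITH_CHECKSUMS
        : HEADER_SIZE_NO_CHECKSUM;
  }
}
{code}

With the second form, a minor-version-0 file written long before HBASE-5074 keeps its 24-byte headers even when the reader itself supports checksums.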


[jira] [Commented] (HBASE-16644) Errors when reading legit HFile' Trailer on branch 1.3

2016-09-30 Thread Hadoop QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-16644?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15537332#comment-15537332
 ] 

Hadoop QA commented on HBASE-16644:
---

| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
| {color:blue}0{color} | {color:blue} reexec {color} | {color:blue} 17m 21s 
{color} | {color:blue} Docker mode activated. {color} |
| {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s 
{color} | {color:green} The patch does not contain any @author tags. {color} |
| {color:red}-1{color} | {color:red} test4tests {color} | {color:red} 0m 0s 
{color} | {color:red} The patch doesn't appear to include any new or modified 
tests. Please justify why no new tests are needed for this patch. Also please 
list what manual steps were performed to verify this patch. {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 6m 
56s {color} | {color:green} branch-1.3 passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 0m 31s 
{color} | {color:green} branch-1.3 passed with JDK v1.8.0_101 {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 0m 36s 
{color} | {color:green} branch-1.3 passed with JDK v1.7.0_111 {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 1m 
4s {color} | {color:green} branch-1.3 passed {color} |
| {color:green}+1{color} | {color:green} mvneclipse {color} | {color:green} 0m 
21s {color} | {color:green} branch-1.3 passed {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green} 1m 
49s {color} | {color:green} branch-1.3 passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 0m 33s 
{color} | {color:green} branch-1.3 passed with JDK v1.8.0_101 {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 0m 33s 
{color} | {color:green} branch-1.3 passed with JDK v1.7.0_111 {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 0m 
45s {color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 0m 33s 
{color} | {color:green} the patch passed with JDK v1.8.0_101 {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 0m 33s 
{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 0m 35s 
{color} | {color:green} the patch passed with JDK v1.7.0_111 {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 0m 35s 
{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 0m 
54s {color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} mvneclipse {color} | {color:green} 0m 
16s {color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 
0s {color} | {color:green} The patch has no whitespace issues. {color} |
| {color:green}+1{color} | {color:green} hadoopcheck {color} | {color:green} 
19m 38s {color} | {color:green} The patch does not cause any errors with Hadoop 
2.4.0 2.4.1 2.5.0 2.5.1 2.5.2 2.6.1 2.6.2 2.6.3 2.7.1. {color} |
| {color:green}+1{color} | {color:green} hbaseprotoc {color} | {color:green} 0m 
18s {color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green} 2m 
21s {color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 0m 27s 
{color} | {color:green} the patch passed with JDK v1.8.0_101 {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 0m 37s 
{color} | {color:green} the patch passed with JDK v1.7.0_111 {color} |
| {color:red}-1{color} | {color:red} unit {color} | {color:red} 104m 26s 
{color} | {color:red} hbase-server in the patch failed. {color} |
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green} 0m 
22s {color} | {color:green} The patch does not generate ASF License warnings. 
{color} |
| {color:black}{color} | {color:black} {color} | {color:black} 161m 26s {color} 
| {color:black} {color} |
\\
\\
|| Reason || Tests ||
| Failed junit tests | hadoop.hbase.mapred.TestMultiTableSnapshotInputFormat |
|   | hadoop.hbase.regionserver.TestFailedAppendAndSync |
|   | hadoop.hbase.mapreduce.TestMultiTableSnapshotInputFormat |
\\
\\
|| Subsystem || Report/Notes ||
| Docker | Client=1.11.2 Server=1.11.2 Image:yetus/hbase:date2016-09-30 |
| JIRA Patch URL | 
https://issues.apache.org/jira/secure/attachment/12831158/HBASE-16644.branch-1.3.patch
 |
| JIRA Issue | HBASE-16644 |
| Optional Tests |  asflicense  javac  javadoc  unit  findbugs  hadoopcheck  
hbaseanti  checkstyle  compile  

[jira] [Commented] (HBASE-16644) Errors when reading legit HFile' Trailer on branch 1.3

2016-09-26 Thread stack (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-16644?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15524001#comment-15524001
 ] 

stack commented on HBASE-16644:
---

Let me build a 0.92 and write a few example v2.0 hfiles... and then try to 
reproduce this issue.


[jira] [Commented] (HBASE-16644) Errors when reading legit HFile' Trailer on branch 1.3

2016-09-20 Thread Mikhail Antonov (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-16644?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15505801#comment-15505801
 ] 

Mikhail Antonov commented on HBASE-16644:
-

[~stack] So we looked closer at the offending files and found that they have HFile 
version major 2, minor 0 (in HBASE-5074 we added the notion of a minor version and 
checksums in blocks). They seem to be really old files that never got compacted away.

Looks like it's a matter of format compatibility with the old version.
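
For anyone wanting to check whether other store files carry the old format, here is a hedged sketch of reading the version straight from the trailer. It assumes the HFile v2 trailer layout in which the serialized version int occupies the last four bytes of the file, with the major version in the low 24 bits and the minor version in the high byte; the real code path goes through FixedFileTrailer, so treat this only as an illustration:

{code}
import java.io.IOException;
import java.io.RandomAccessFile;

// Sketch: print the HFile major/minor version of the store file given as args[0].
public class PrintHFileVersion {
  public static void main(String[] args) throws IOException {
    try (RandomAccessFile raf = new RandomAccessFile(args[0], "r")) {
      raf.seek(raf.length() - 4);           // version int sits at the very end of the trailer
      int serialized = raf.readInt();       // big-endian, as the trailer writes it
      int major = serialized & 0x00ffffff;  // e.g. 2 for the files discussed here
      int minor = serialized >>> 24;        // 0 means pre-HBASE-5074, no per-block checksums
      System.out.println("major=" + major + " minor=" + minor);
    }
  }
}
{code}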


[jira] [Commented] (HBASE-16644) Errors when reading legit HFile' Trailer on branch 1.3

2016-09-16 Thread stack (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-16644?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15497039#comment-15497039
 ] 

stack commented on HBASE-16644:
---

bq. Regardless of how such a file was written, if we can read a file on 1.2 but can't on 1.3, that is a critical regression on the read path, IMO.

Agree.


[jira] [Commented] (HBASE-16644) Errors when reading legit HFile' Trailer on branch 1.3

2016-09-16 Thread Mikhail Antonov (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-16644?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15497029#comment-15497029
 ] 

Mikhail Antonov commented on HBASE-16644:
-

[~stack] thanks for the quick reply! My thoughts on it so far:

 - It seems to happen only under specific conditions, since I've only seen a handful 
of such cases; I'm still looking at what's special about those files. So far it 
*seems* to happen only on specific blocks, on the code path in the HFileReaderV2 
constructor where we read the meta index (the data index was read fine); see the 
sketch after this list.
 - Debugging and reproducing is kind of painful, as I've only seen this on real 
files and don't yet have a cooked-up test file that demonstrates the effect.
 - Regardless of how such a file was written, if we can read a file on 1.2 but 
can't on 1.3, that is a critical regression on the read path, IMO.
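
To make the first symptom concrete, here is a hedged illustration (not HBase code) of the check that fails: the reader validates the first 8 bytes at what it believes is the start of a block, and if its header-size bookkeeping differs from what a minor-version-0 file actually contains (the 9 bytes of checksum fields), those bytes land inside neighbouring length/offset fields and print as something like \x00\x00\x04\x00\x00\x00\x00\x00. The class, method, and magic constant here are hypothetical:

{code}
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

// Sketch of the validation that produces "Invalid HFile block magic".
public class BlockMagicSketch {
  // Hypothetical magic for this sketch; the real magics live in BlockType.
  static final byte[] EXPECTED_MAGIC = "METABLKc".getBytes(StandardCharsets.US_ASCII);

  // Reads 8 bytes at the offset the reader believes a block starts at and
  // compares them against the expected magic, in the spirit of BlockType.parse().
  static void checkMagic(ByteBuffer file, int assumedBlockStart) {
    byte[] actual = new byte[8];
    file.position(assumedBlockStart);
    file.get(actual);
    if (!Arrays.equals(actual, EXPECTED_MAGIC)) {
      throw new IllegalStateException("Invalid HFile block magic: " + hex(actual));
    }
  }

  static String hex(byte[] bytes) {
    StringBuilder sb = new StringBuilder();
    for (byte b : bytes) {
      sb.append(String.format("\\x%02X", b & 0xff));
    }
    return sb.toString();
  }
}
{code}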


[jira] [Commented] (HBASE-16644) Errors when reading legit HFile' Trailer on branch 1.3

2016-09-16 Thread stack (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-16644?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15496255#comment-15496255
 ] 

stack commented on HBASE-16644:
---

Ouch. That looks like an ugly one to debug [~mantonov]. Sorry. Yeah, I made a 
mistake presuming we always checksum (how did you write files w/o a checksum? A 
mistake?). The patch is great but needs a unit test, don't you think?
