Re: hbase.bucketcache.bucket.sizes had set multiple of 1024 but still got "Invalid HFile block magic"

2017-11-28 Thread Anoop John
FYI

It has to be multiples of 256.
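For example, 198211 from the reporter's list fails this; the next valid size up is 198400. A minimal sketch (my own helper, not an HBase API):

    // Round a desired bucket size up to the next multiple of 256, the
    // alignment the bucket cache's offset encoding requires.
    // Hypothetical helper, not part of HBase.
    public class BucketSizeAlign {
        static long roundUpTo256(long desiredSize) {
            return (desiredSize + 255) & ~255L;
        }
        public static void main(String[] args) {
            System.out.println(roundUpTo256(198211)); // 198400 (775 * 256)
            System.out.println(roundUpTo256(6144));   // 6144, already aligned
        }
    }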


<property>
  <name>hbase.bucketcache.bucket.sizes</name>
  <description>A comma-separated list of sizes for buckets for the bucketcache.
    Can be multiple sizes. List block sizes in order from smallest to largest.
    The sizes you use will depend on your data access patterns.
    Must be a multiple of 256 else you will run into
    'java.io.IOException: Invalid HFile block magic' when you go to
    read from cache.
    If you specify no values here, then you pick up the default bucketsizes set
    in code (See BucketAllocator#DEFAULT_BUCKET_SIZES).
  </description>
</property>

-Anoop-

On Sat, Nov 25, 2017 at 7:53 PM, 苏国东  wrote:
> Hi:
>
> The BucketEntry class calculates the offset like this:
>
>   long offset() { // Java has no unsigned numbers
>     long o = ((long) offsetBase) & 0xFFFFFFFFL;
>     o += (((long) (offset1)) & 0xFF) << 32;
>     return o << 8;
>   }
>
>   private void setOffset(long value) {
>     assert (value & 0xFF) == 0;
>     value >>= 8;
>     offsetBase = (int) value;
>     offset1 = (byte) (value >> 32);
>   }
> If the offset is not a multiple of 256, the method offset() will return a
> wrong value (setOffset() drops the low 8 bits). And if bucket sizes must be
> multiples of 1024, that probably wastes memory.
>
> 198211 is not a multiple of 1024 (nor of 256).
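To make the truncation concrete, here is the pack/unpack quoted above as a standalone, runnable sketch (the class name and main are illustrative, not HBase's; the assert is dropped so the bad round trip is visible):

    // Sketch of BucketEntry's offset packing: offsets are stored shifted
    // right by 8 bits in 32 + 8 bits, so the low 8 bits never survive.
    public class OffsetPackingDemo {
        private int offsetBase; // low 32 bits of (offset >> 8)
        private byte offset1;   // bits 32..39 of (offset >> 8)

        void setOffset(long value) {
            value >>= 8;        // a non-multiple-of-256 loses its low 8 bits here
            offsetBase = (int) value;
            offset1 = (byte) (value >> 32);
        }

        long offset() {
            long o = ((long) offsetBase) & 0xFFFFFFFFL;
            o += (((long) offset1) & 0xFF) << 32;
            return o << 8;
        }

        public static void main(String[] args) {
            OffsetPackingDemo e = new OffsetPackingDemo();
            e.setOffset(198211);            // not a multiple of 256
            System.out.println(e.offset()); // 198144: wrong offset
            e.setOffset(198400);            // 775 * 256
            System.out.println(e.offset()); // 198400: survives the round trip
        }
    }

The printed 198144 is 198211 with its low 8 bits zeroed, so the cache reads from the wrong position and sees the all-zero HFile block magic reported in the trace below.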
>
>
> At 2017-11-25 20:52:10, "Weizhan Zeng"  wrote:
>> Hi, guys
>>    In https://issues.apache.org/jira/browse/HBASE-16993 , I found that
>> hbase.bucketcache.bucket.sizes must be set to multiples of 1024. But when I set
>>
>> <property>
>>   <name>hbase.bucketcache.bucket.sizes</name>
>>   <value>6144,9216,41984,50176,58368,66560,99328,132096,198211,263168,394240,525312,1049600,2099200</value>
>> </property>
>>
>> And I still got this error:
>>
>> 2017-11-25 20:37:37,222 ERROR [B.defaultRpcServer.handler=20,queue=1,port=60020] bucket.BucketCache: Failed reading block d444ab4b244140c199f23a3870f59136_250591965 from bucket cache
>> java.io.IOException: Invalid HFile block magic: \x00\x00\x00\x00\x00\x00\x00\x00
>> [stack trace snipped; the full trace is in the original message at the bottom of this thread]
>>
>>
>> Is there anything I missed?


Re: hbase.bucketcache.bucket.sizes had set multiple of 1024 but still got "Invalid HFile block magic"

2017-11-25 Thread Weizhan Zeng
Sorry, I made a calculation mistake. Just ignore it ...

2017-11-25 20:52 GMT+08:00 Weizhan Zeng :

> [quoted original message snipped; see the full message below]


hbase.bucketcache.bucket.sizes had set multiple of 1024 but still got "Invalid HFile block magic"

2017-11-25 Thread Weizhan Zeng
Hi, guys
   In https://issues.apache.org/jira/browse/HBASE-16993 , I found that
hbase.bucketcache.bucket.sizes must be set to multiples of 1024. But when I set

<property>
  <name>hbase.bucketcache.bucket.sizes</name>
  <value>6144,9216,41984,50176,58368,66560,99328,132096,198211,263168,394240,525312,1049600,2099200</value>
</property>

And I still got this error:


2017-11-25 20:37:37,222 ERROR [B.defaultRpcServer.handler=20,queue=1,port=60020] bucket.BucketCache: Failed reading block d444ab4b244140c199f23a3870f59136_250591965 from bucket cache
java.io.IOException: Invalid HFile block magic: \x00\x00\x00\x00\x00\x00\x00\x00
    at org.apache.hadoop.hbase.io.hfile.BlockType.parse(BlockType.java:155)
    at org.apache.hadoop.hbase.io.hfile.BlockType.read(BlockType.java:167)
    at org.apache.hadoop.hbase.io.hfile.HFileBlock.<init>(HFileBlock.java:275)
    at org.apache.hadoop.hbase.io.hfile.HFileBlock$1.deserialize(HFileBlock.java:136)
    at org.apache.hadoop.hbase.io.hfile.HFileBlock$1.deserialize(HFileBlock.java:123)
    at org.apache.hadoop.hbase.io.hfile.bucket.BucketCache.getBlock(BucketCache.java:428)
    at org.apache.hadoop.hbase.io.hfile.CombinedBlockCache.getBlock(CombinedBlockCache.java:85)
    at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.getCachedBlock(HFileReaderV2.java:278)
    at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:418)
    at org.apache.hadoop.hbase.io.hfile.HFileBlockIndex$BlockIndexReader.loadDataBlockWithScanInfo(HFileBlockIndex.java:271)
    at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekTo(HFileReaderV2.java:649)
    at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekTo(HFileReaderV2.java:599)
    at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seekAtOrAfter(StoreFileScanner.java:268)
    at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seek(StoreFileScanner.java:173)
    at org.apache.hadoop.hbase.regionserver.StoreScanner.seekScanners(StoreScanner.java:350)
    at org.apache.hadoop.hbase.regionserver.StoreScanner.<init>(StoreScanner.java:199)
    at org.apache.hadoop.hbase.regionserver.HStore.getScanner(HStore.java:2077)
    at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.<init>(HRegion.java:5556)
    at org.apache.hadoop.hbase.regionserver.HRegion.instantiateRegionScanner(HRegion.java:2574)
    at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2560)
    at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2541)
    at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:6830)
    at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:6809)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2049)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33644)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2196)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
    at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
    at java.lang.Thread.run(Thread.java:748)


Is there anything I missed?
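Per Anoop's reply at the top of the thread, the culprit is 198211: bucket sizes must be multiples of 256. A quick standalone sketch (mine, not HBase code) that flags such values in the list above:

    // Sketch: flag configured bucket sizes that break the 256-byte
    // alignment required by the bucket cache's offset encoding.
    public class BucketSizeCheck {
        public static void main(String[] args) {
            long[] sizes = {6144, 9216, 41984, 50176, 58368, 66560, 99328, 132096,
                            198211, 263168, 394240, 525312, 1049600, 2099200};
            for (long s : sizes) {
                if (s % 256 != 0) {
                    System.out.println(s + " is not a multiple of 256"); // only 198211
                }
            }
        }
    }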