Which HBase release are you using?

I seem to recall that hbase.bucketcache.bucket.sizes was the key.
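
If that's the right key for your release, a minimal hbase-site.xml sketch would
be the following (just your existing size list moved under that name; please
double-check the property name against the docs for the version you run):

   <property>
     <name>hbase.bucketcache.bucket.sizes</name>
     <value>65536,131072,196608,262144,327680,393216,655360,1310720</value>
   </property>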

Cheers

On Mon, Jun 1, 2015 at 7:04 AM, Dejan Menges <dejan.men...@gmail.com> wrote:

> Hi,
>
> I'm getting messages like:
>
> 2015-06-01 14:02:29,529 WARN
> org.apache.hadoop.hbase.io.hfile.bucket.BucketCache: Failed allocating for
> block ce18012f4dfa424db88e92de29e76a9b_25809098330
>
> org.apache.hadoop.hbase.io.hfile.bucket.BucketAllocatorException:
> Allocation too big size=750465
>   at org.apache.hadoop.hbase.io.hfile.bucket.BucketAllocator.allocateBlock(BucketAllocator.java:400)
>   at org.apache.hadoop.hbase.io.hfile.bucket.BucketCache$RAMQueueEntry.writeToCache(BucketCache.java:1153)
>   at org.apache.hadoop.hbase.io.hfile.bucket.BucketCache$WriterThread.doDrain(BucketCache.java:703)
>   at org.apache.hadoop.hbase.io.hfile.bucket.BucketCache$WriterThread.run(BucketCache.java:675)
>   at java.lang.Thread.run(Thread.java:745)
>
>
>
> However, I'm not sure why this is happening. If I understood it correctly (and
> probably I didn't :/), this should fit into one of these:
>
>    <property>
>      <name>hbase.bucketcache.sizes</name>
>      <value>65536,131072,196608,262144,327680,393216,655360,1310720</value>
>    </property>
>
> At the same time, hbase.bucketcache.size is 24G. Not sure what I did
> wrong (again)?
>
