[ https://issues.apache.org/jira/browse/HBASE-9644?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Victor Xu updated HBASE-9644:
-----------------------------

    Description: 
The regionserver threw a "java.lang.NoClassDefFoundError: Ljava/lang/InternalError" 
exception while decompressing an HFile block.

The exception detail is:
{noformat} 
2013-09-15 05:44:03,612 ERROR org.apache.hadoop.hbase.regionserver.HRegionServer:
java.lang.NoClassDefFoundError: Ljava/lang/InternalError
        at org.apache.hadoop.io.compress.snappy.SnappyDecompressor.decompressBytesDirect(Native Method)
        at org.apache.hadoop.io.compress.snappy.SnappyDecompressor.decompress(SnappyDecompressor.java:238)
        at org.apache.hadoop.io.compress.BlockDecompressorStream.decompress(BlockDecompressorStream.java:87)
        at org.apache.hadoop.io.compress.DecompressorStream.read(DecompressorStream.java:83)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
        at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:192)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader.decompress(HFileBlock.java:1461)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockDataInternal(HFileBlock.java:1890)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockData(HFileBlock.java:1703)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:342)
        at org.apache.hadoop.hbase.io.hfile.HFileBlockIndex$BlockIndexReader.loadDataBlockWithScanInfo(HFileBlockIndex.java:254)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekTo(HFileReaderV2.java:484)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekTo(HFileReaderV2.java:505)
        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seekAtOrAfter(StoreFileScanner.java:220)
        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seek(StoreFileScanner.java:140)
        at org.apache.hadoop.hbase.regionserver.StoreScanner.<init>(StoreScanner.java:131)
        at org.apache.hadoop.hbase.regionserver.Store.getScanner(Store.java:2208)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.<init>(HRegion.java:3807)
        at org.apache.hadoop.hbase.regionserver.HRegion.instantiateRegionScanner(HRegion.java:1825)
        at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:1817)
        at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:1794)
        at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:4828)
        at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:4802)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.get(HRegionServer.java:2196)
        at sun.reflect.GeneratedMethodAccessor48.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:320)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1426)
Caused by: java.lang.ClassNotFoundException: Ljava.lang.InternalError
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        ... 30 more
{noformat} 
There are two problems here:
1. Why does the Snappy library use the class name 'Ljava/lang/InternalError' 
instead of 'java/lang/InternalError'?
This involves the Snappy native code; there may be a bug in it.
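A plausible explanation (an assumption, not confirmed here): the native snappy code passes a JNI type descriptor such as "Ljava/lang/InternalError" where JNI's FindClass expects a plain internal class name ("java/lang/InternalError"). The class loader then receives the descriptor form as a binary name and, as the "Caused by" line shows, cannot resolve it. A minimal pure-Java sketch of that loader behavior:

```java
// Sketch: the binary name of java.lang.InternalError resolves, but the
// descriptor-style name seen in the stack trace ("L..." prefix) does not,
// because no class is registered under that name.
public class DescriptorVsName {
    static boolean loadable(String name) {
        try {
            Class.forName(name);          // ask the class loader by name
            return true;
        } catch (ClassNotFoundException e) {
            return false;                 // what the regionserver log shows
        }
    }

    public static void main(String[] args) {
        System.out.println(loadable("java.lang.InternalError"));   // correct binary name
        System.out.println(loadable("Ljava.lang.InternalError"));  // descriptor form from the trace
    }
}
```

This reproduces the exact `ClassNotFoundException: Ljava.lang.InternalError` nested in the trace above, without any native code involved.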

2. When I tried to read the HFile block with HDFS tools, the local block 
replica failed the file checksum and another replica was read from a remote 
datanode. So the question is: why can HDFS detect this corruption while the 
HBase checksum cannot?

This is my hbase-site.xml configuration for checksums:
{noformat}
  <property>
    <name>dfs.client.read.shortcircuit</name>
    <value>true</value>
  </property>

  <property>
    <name>dfs.client.read.shortcircuit.skip.checksum</name>
    <value>true</value>
  </property>

  <property>
    <name>hbase.regionserver.checksum.verify</name>
    <value>true</value>
  </property>
{noformat}
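For context on what hbase.regionserver.checksum.verify is expected to catch: HBase-side verification works by storing a CRC over each HFile block at write time and recomputing it at read time (this matters here because dfs.client.read.shortcircuit.skip.checksum=true bypasses the HDFS datanode checksum on the local read path). A minimal sketch of that idea, with illustrative names rather than the actual HFileBlock API:

```java
import java.util.zip.CRC32;

// Sketch of per-block checksum verification in the style HBase uses:
// the writer records a CRC over the block payload, the reader recomputes
// it and compares. Any on-disk bit flip in the payload changes the CRC.
public class BlockChecksum {
    static long checksum(byte[] data) {
        CRC32 crc = new CRC32();
        crc.update(data, 0, data.length);
        return crc.getValue();
    }

    static boolean verify(byte[] data, long storedCrc) {
        return checksum(data) == storedCrc;
    }

    public static void main(String[] args) {
        byte[] block = "hfile block payload".getBytes();
        long stored = checksum(block);           // written alongside the block
        System.out.println(verify(block, stored));
        block[3] ^= 0x01;                        // simulate on-disk corruption
        System.out.println(verify(block, stored));
    }
}
```

One assumption worth checking for this bug: such verification can only fire for blocks that were written with HBase checksums in the first place; if the corrupt block predates them, the corruption would sail through to the decompressor, as seen here.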


> Regionserver throws java.lang.NoClassDefFoundError: Ljava/lang/InternalError 
> exception while decompressing hfileblock
> ---------------------------------------------------------------------------------------------------------------------
>
>                 Key: HBASE-9644
>                 URL: https://issues.apache.org/jira/browse/HBASE-9644
>             Project: HBase
>          Issue Type: Bug
>          Components: HFile, regionserver
>    Affects Versions: 0.94.10
>         Environment: Linux 2.6.32-el5.x86_64
>            Reporter: Victor Xu
>

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators.
For more information on JIRA, see: http://www.atlassian.com/software/jira
