It's the one from the Cloudera repo: 0.92.1.
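
For what it's worth, a quick way to confirm exactly which build is on the
classpath is the version subcommand of the hbase script; it just prints the
bundled version and revision info:

  # prints the HBase version, revision and compile details for the install in use
  hbase version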

On Wed, Sep 19, 2012 at 10:48 AM, Ted Yu <[email protected]> wrote:

> Can you tell us which HBase version you are using?
>
> On Wed, Sep 19, 2012 at 7:27 AM, Bai Shen <[email protected]> wrote:
>
> > I'm running Nutch 2 using HBase as my backend in local mode.  Everything
> > seems to be working correctly except when I run the readdb method.  When I
> > run readdb, I get the following stack trace.
> >
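> > (For context, the readdb step is just the stock Nutch 2 invocation; roughly,
> > and assuming the usual WebTableReader options here, something like:
> >
> >   # print summary statistics for the webpage table
> >   bin/nutch readdb -stats
> >   # or dump the table contents to a local output directory
> >   bin/nutch readdb -dump readdb_out
> >
> > where "readdb_out" is just a placeholder and the exact flags can differ
> > between Nutch 2 releases.)
> >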
> > 2012-09-19 10:15:46,485 WARN  mapred.LocalJobRunner - job_local_0001
> > org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=10, exceptions:
> > Wed Sep 19 10:15:07 EDT 2012, org.apache.hadoop.hbase.client.ScannerCallable@345ac4dc,
> > java.io.IOException: java.io.IOException: Could not iterate StoreFileScanner[HFileScanner for reader
> > reader=file:/data1/hbase/root/webpage/583a33aae4c4003021da635aba2f70c4/ol/d09b3cd6eb0b478cbd6f64d420e42034,
> > compression=none, cacheConf=CacheConfig:enabled [cacheDataOnRead=true] [cacheDataOnWrite=false]
> > [cacheIndexesOnWrite=false] [cacheBloomsOnWrite=false] [cacheEvictOnClose=false] [cacheCompressed=false],
> > firstKey=edu.ndu.www:http/aa/catalogs.cfm/ol:http://www.ndu.edu/aa/catalogs.cfm/1348036803793/Put,
> > lastKey=edu.ucla.anderson.www:http/mba-admissions.xml/ol:http://www.anderson.ucla.edu/x40700.xml/1348036827378/Put,
> > avgKeyLen=161, avgValueLen=14, entries=17405, length=3238861,
> > cur=edu.nps.www:http/About/News/NPS-Crushes-CubeSats-for-DARPA-Challenge.html/ol:http://www.nps.edu/Technology/HPC/ContactHPC.html/1348036805774/Put/vlen=11]
> >         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:104)
> >         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:106)
> >         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:289)
> >         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:138)
> >         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:2978)
> >         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2925)
> >         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2942)
> >         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:2159)
> >         at sun.reflect.GeneratedMethodAccessor12.invoke(Unknown Source)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:601)
> >         at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:364)
> >         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1336)
> > Caused by: org.apache.hadoop.fs.ChecksumException: Checksum error: file:/data1/hbase/root/webpage/583a33aae4c4003021da635aba2f70c4/ol/d09b3cd6eb0b478cbd6f64d420e42034 at 1378304 exp: -89200966 got: -2503767
> >         at org.apache.hadoop.fs.FSInputChecker.verifySums(FSInputChecker.java:320)
> >         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:276)
> >         at org.apache.hadoop.fs.FSInputChecker.fill(FSInputChecker.java:211)
> >         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:229)
> >         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:193)
> >         at org.apache.hadoop.fs.FSInputChecker.readFully(FSInputChecker.java:431)
> >         at org.apache.hadoop.fs.FSInputChecker.seek(FSInputChecker.java:412)
> >         at org.apache.hadoop.fs.FSDataInputStream.seek(FSDataInputStream.java:47)
> >         at org.apache.hadoop.fs.ChecksumFileSystem$FSDataBoundedInputStream.seek(ChecksumFileSystem.java:318)
> >         at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader.readAtOffset(HFileBlock.java:1047)
> >         at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockData(HFileBlock.java:1318)
> >         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:266)
> >         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.readNextDataBlock(HFileReaderV2.java:452)
> >         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.next(HFileReaderV2.java:416)
> >         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:99)
> >         ... 12 more
> >
> > Wed Sep 19 10:15:08 EDT 2012, org.apache.hadoop.hbase.client.ScannerCallable@345ac4dc,
> > java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException
> >         at org.apache.hadoop.hbase.regionserver.HRegionServer.convertThrowableToIOE(HRegionServer.java:1084)
> >         at org.apache.hadoop.hbase.regionserver.HRegionServer.convertThrowableToIOE(HRegionServer.java:1073)
> >         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:2186)
> >         at sun.reflect.GeneratedMethodAccessor12.invoke(Unknown Source)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:601)
> >         at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:364)
> >         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1336)
> > Caused by: java.lang.IllegalArgumentException
> >         at java.nio.Buffer.position(Buffer.java:236)
> >         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.next(HFileReaderV2.java:395)
> >         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:99)
> >         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:106)
> >         at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:326)
> >         at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:138)
> >         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:2978)
> >         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2925)
> >         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2942)
> >         at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:2159)
> >         ... 5 more
> >
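> > The root cause above is a ChecksumException on a store file sitting on the
> > local filesystem, so one thing worth checking is whether that file can be
> > re-read at all outside the MapReduce job. A sketch, assuming the stock 0.92
> > HFile tool and simply reusing the path from the trace:
> >
> >   # re-scan the suspect HFile; -k checks key ordering, -m dumps its metadata
> >   hbase org.apache.hadoop.hbase.io.hfile.HFile -v -k -m \
> >     -f file:/data1/hbase/root/webpage/583a33aae4c4003021da635aba2f70c4/ol/d09b3cd6eb0b478cbd6f64d420e42034
> >
> > If the same ChecksumException shows up there, the file (or its local .crc
> > sidecar) is damaged on disk rather than anything readdb-specific.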
> >
> > I've run "hbase hbck" and returns the following.
> >
> > Summary:
> >   -ROOT- is okay.
> >     Number of regions: 1
> >     Deployed on:  node9-0,53595,1348062612459
> >   .META. is okay.
> >     Number of regions: 1
> >     Deployed on:  node9-0,53595,1348062612459
> >   webpage is okay.
> >     Number of regions: 18
> >     Deployed on:  node9-0,53595,1348062612459
> > 0 inconsistencies detected.
> >
> >
> > Any suggestions on what's wrong and how to fix it?
> >
> > Thanks.
> >
>
