Here is the stack trace (with hbase-0.94.0.jar):
row count is 1
12/07/30 15:08:53 WARN client.HConnectionManager$HConnectionImplementation: Error executing for row
java.util.concurrent.ExecutionException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=10, exceptions:
Mon Jul 30 15:08:14 CEST 2012, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@341049d3, java.io.IOException: java.io.IOException: java.lang.NullPointerException
    at org.apache.hadoop.hbase.coprocessor.AggregateImplementation.getAvg(AggregateImplementation.java:189)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.hbase.regionserver.HRegion.exec(HRegion.java:4770)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.execCoprocessor(HRegionServer.java:3457)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:364)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1376)
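
For what it's worth, the frame at AggregateImplementation.getAvg(AggregateImplementation.java:189) is consistent with the server dereferencing a null ColumnInterpreter: avg() and sum() need an interpreter to decode the cell bytes, while rowCount() only counts keys and never touches the values, which would explain why it alone succeeds. A minimal sketch of the corrected calls for the code quoted below (assuming the 0.92/0.94 AggregationClient API and 8-byte long cells; not verified against your cluster):

```java
// Sketch: pass a LongColumnInterpreter instead of null so the
// AggregateImplementation coprocessor can decode Bytes.toBytes(long) cells.
import org.apache.hadoop.hbase.client.coprocessor.LongColumnInterpreter;

LongColumnInterpreter ci = new LongColumnInterpreter();
System.out.println("avg is " + aggregationClient.avg(TABLE_NAME, ci, scan));
System.out.println("sum is " + aggregationClient.sum(TABLE_NAME, ci, scan));
```

This also assumes the AggregateImplementation coprocessor is loaded on the region servers, which rowCount() returning 1 already suggests.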
Regards
Cyril SCETBON
On Jul 29, 2012, at 11:54 PM, [email protected] wrote:
> Can you use 0.94 for your client jar?
>
> Please show us the NullPointerException stack.
>
> Thanks
>
>
>
> On Jul 29, 2012, at 2:49 PM, Cyril Scetbon <[email protected]> wrote:
>
>> Hi,
>>
>> I'm testing AggregationClient functions to check if we could use
>> coprocessors for mathematical functions.
>>
>> The code I use is the following:
>>
>> package coreprocessor;
>>
>> import org.apache.hadoop.conf.Configuration;
>> import org.apache.hadoop.hbase.HBaseConfiguration;
>> import org.apache.hadoop.hbase.client.Scan;
>> import org.apache.hadoop.hbase.client.coprocessor.AggregationClient;
>> import org.apache.hadoop.hbase.util.Bytes;
>>
>> public class AggregationClientTest {
>>
>>     private static final byte[] TABLE_NAME = Bytes.toBytes("ise");
>>     private static final byte[] CF = Bytes.toBytes("core");
>>
>>     public static void main(String[] args) throws Throwable {
>>         Configuration configuration = HBaseConfiguration.create();
>>         configuration.setLong("hbase.client.scanner.caching", 1000);
>>         AggregationClient aggregationClient = new AggregationClient(configuration);
>>         Scan scan = new Scan();
>>         scan.addColumn(CF, Bytes.toBytes("value"));
>>         System.out.println("row count is " + aggregationClient.rowCount(TABLE_NAME, null, scan));
>>         System.out.println("avg is " + aggregationClient.avg(TABLE_NAME, null, scan));
>>         System.out.println("sum is " + aggregationClient.sum(TABLE_NAME, null, scan));
>>     }
>> }
>>
>>
>> The only function that works is rowCount; the others fail with an NPE.
>> I've checked that my table uses only long values for the column I'm
>> working on, and the table contains only one row:
>>
>> ROW        COLUMN+CELL
>>  id-cyr1   column=core:value, timestamp=1343596419845, value=\x00\x00\x00\x00\x00\x00\x00\x0A
>>
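
As a quick sanity check on the cell dump above (a standalone sketch using only the JDK; HBase's Bytes.toBytes(long) writes big-endian, which is also ByteBuffer's default byte order):

```java
import java.nio.ByteBuffer;

public class DecodeCell {
    public static void main(String[] args) {
        // The shell dump shows value=\x00\x00\x00\x00\x00\x00\x00\x0A.
        byte[] value = {0, 0, 0, 0, 0, 0, 0, 0x0A};
        // Big-endian decode, mirroring what HBase's Bytes.toLong() does.
        long decoded = ByteBuffer.wrap(value).getLong();
        System.out.println(decoded); // prints 10
    }
}
```

So the column really does hold a proper 8-byte long (10); the stored values themselves look fine.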
>> The only thing I can add is that my HBase server's version is 0.94.0 and
>> that I use version 0.92.0 of the HBase client jar.
>>
>> Any idea why it doesn't work?
>>
>> Thanks
>> Cyril SCETBON
>>