Unfortunately I can't remember/find it :( and I see in AggregationClient's javadoc that:
"Column family can't be null", so I suppose I should have read it first! Thanks again

Cyril SCETBON

On Jul 30, 2012, at 7:30 PM, Himanshu Vashishtha <[email protected]> wrote:

> We should fix the reference then. Where did you read it?
>
> On Mon, Jul 30, 2012 at 10:43 AM, Cyril Scetbon <[email protected]> wrote:
>> Thanks, it's really better!
>>
>> I've read that by default it supports only Long values, that's why I was
>> using a null ColumnInterpreter.
>>
>> Regards.
>> Cyril SCETBON
>>
>> On Jul 30, 2012, at 5:56 PM, Himanshu Vashishtha <[email protected]>
>> wrote:
>>
>>> On Mon, Jul 30, 2012 at 6:55 AM, Cyril Scetbon <[email protected]>
>>> wrote:
>>>
>>>> I've given the values returned by the scan 'table' command in the hbase
>>>> shell in my first email.
>>>
>>> Somehow I missed the scan result in your first email. So, can you pass
>>> a LongColumnInterpreter instance instead of null?
>>> See TestAggregateProtocol methods for usage.
>>>
>>> Thanks
>>> Himanshu
>>>
>>>> Regards
>>>> Cyril SCETBON
>>>>
>>>> On Jul 30, 2012, at 12:50 AM, Himanshu Vashishtha
>>>> <[email protected]> wrote:
>>>>
>>>>> And also, what do your cell values look like?
>>>>>
>>>>> Himanshu
>>>>>
>>>>> On Sun, Jul 29, 2012 at 3:54 PM, <[email protected]> wrote:
>>>>>> Can you use 0.94 for your client jar?
>>>>>>
>>>>>> Please show us the NullPointerException stack.
>>>>>>
>>>>>> Thanks
>>>>>>
>>>>>> On Jul 29, 2012, at 2:49 PM, Cyril Scetbon <[email protected]> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> I'm testing AggregationClient functions to check if we could use
>>>>>>> coprocessors for mathematical functions.
>>>>>>>
>>>>>>> The code I use is the following:
>>>>>>>
>>>>>>> package coreprocessor;
>>>>>>>
>>>>>>> import org.apache.hadoop.conf.Configuration;
>>>>>>> import org.apache.hadoop.hbase.HBaseConfiguration;
>>>>>>> import org.apache.hadoop.hbase.client.Scan;
>>>>>>> import org.apache.hadoop.hbase.client.coprocessor.AggregationClient;
>>>>>>> import org.apache.hadoop.hbase.util.Bytes;
>>>>>>>
>>>>>>> public class AggregationClientTest {
>>>>>>>
>>>>>>>     private static final byte[] TABLE_NAME = Bytes.toBytes("ise");
>>>>>>>     private static final byte[] CF = Bytes.toBytes("core");
>>>>>>>
>>>>>>>     public static void main(String[] args) throws Throwable {
>>>>>>>         Configuration configuration = HBaseConfiguration.create();
>>>>>>>         configuration.setLong("hbase.client.scanner.caching", 1000);
>>>>>>>         AggregationClient aggregationClient = new AggregationClient(
>>>>>>>                 configuration);
>>>>>>>         Scan scan = new Scan();
>>>>>>>         scan.addColumn(CF, Bytes.toBytes("value"));
>>>>>>>         System.out.println("row count is "
>>>>>>>                 + aggregationClient.rowCount(TABLE_NAME, null, scan));
>>>>>>>         System.out.println("avg is "
>>>>>>>                 + aggregationClient.avg(TABLE_NAME, null, scan));
>>>>>>>         System.out.println("sum is "
>>>>>>>                 + aggregationClient.sum(TABLE_NAME, null, scan));
>>>>>>>     }
>>>>>>> }
>>>>>>>
>>>>>>> The only one working is the rowCount function. For the others I get an
>>>>>>> NPE!
>>>>>>> I've checked that my table uses only Long values for the column I work
>>>>>>> on, and I have only one row in my table:
>>>>>>>
>>>>>>> ROW        COLUMN+CELL
>>>>>>>  id-cyr1   column=core:value, timestamp=1343596419845,
>>>>>>>            value=\x00\x00\x00\x00\x00\x00\x00\x0A
>>>>>>>
>>>>>>> The only thing I can add is that my hbase server's version is 0.94.0
>>>>>>> and that I use version 0.92.0 of the hbase jar.
>>>>>>>
>>>>>>> Any idea why it doesn't work?
>>>>>>>
>>>>>>> Thanks
>>>>>>> Cyril SCETBON
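[Editor's note] Following Himanshu's suggestion in the thread, a minimal sketch of the corrected client: pass a LongColumnInterpreter instead of null so avg and sum know how to decode the cells. This is a sketch only, assuming the 0.94 client jar matching the 0.94 server (it cannot run without a live cluster with the AggregateImplementation coprocessor loaded, so it is untested here):

```java
package coreprocessor;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.coprocessor.AggregationClient;
import org.apache.hadoop.hbase.client.coprocessor.LongColumnInterpreter;
import org.apache.hadoop.hbase.util.Bytes;

public class AggregationClientTest {

    private static final byte[] TABLE_NAME = Bytes.toBytes("ise");
    private static final byte[] CF = Bytes.toBytes("core");

    public static void main(String[] args) throws Throwable {
        Configuration conf = HBaseConfiguration.create();
        conf.setLong("hbase.client.scanner.caching", 1000);
        AggregationClient aggregationClient = new AggregationClient(conf);

        Scan scan = new Scan();
        scan.addColumn(CF, Bytes.toBytes("value"));

        // A LongColumnInterpreter (not null) tells avg/sum how to
        // interpret the 8-byte cell values as longs.
        LongColumnInterpreter ci = new LongColumnInterpreter();
        System.out.println("row count is "
                + aggregationClient.rowCount(TABLE_NAME, ci, scan));
        System.out.println("avg is "
                + aggregationClient.avg(TABLE_NAME, ci, scan));
        System.out.println("sum is "
                + aggregationClient.sum(TABLE_NAME, ci, scan));
    }
}
```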
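[Editor's note] The scan output in the thread shows the cell value \x00\x00\x00\x00\x00\x00\x00\x0A, i.e. the 8-byte big-endian encoding of the long 10, which is the format Bytes.toBytes(long) writes and a LongColumnInterpreter expects to read back. A self-contained pure-Java sketch (no HBase dependency) illustrating that encoding:

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

public class LongEncodingDemo {

    // Mirrors the 8-byte big-endian layout HBase's Bytes.toBytes(long)
    // produces (ByteBuffer defaults to big-endian byte order).
    static byte[] toBytes(long v) {
        return ByteBuffer.allocate(8).putLong(v).array();
    }

    // Mirrors Bytes.toLong(byte[]): read the 8 bytes back as a long.
    static long toLong(byte[] b) {
        return ByteBuffer.wrap(b).getLong();
    }

    public static void main(String[] args) {
        byte[] encoded = toBytes(10L);
        // The last byte is 0x0A, matching the scan output
        // value=\x00\x00\x00\x00\x00\x00\x00\x0A
        System.out.println(Arrays.toString(encoded));
        System.out.println(toLong(encoded));
    }
}
```

If a cell holds anything other than exactly this 8-byte layout, decoding it as a long fails, which is why the interpreter and the stored value format have to agree.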
