Hi all,
I'm getting the following error every time I run my MapReduce job
across multiple Hadoop clusters:

java.lang.NullPointerException
    at org.apache.hadoop.hbase.util.Bytes.toBytes(Bytes.java:414)
    at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:170)
    at com.company$AnalyzeMapper.contentidxjoin(MRjobt.java:153)


Here's the code:

public void map(ImmutableBytesWritable row, Result columns, Context context)
        throws IOException {
    ...
}
...

public static String contentidxjoin(String contentId) {
    Configuration conf = HBaseConfiguration.create();
    HTable table;
    try {
        table = new HTable(conf, ContentidxTable);
        if (table != null) {
            Get get1 = new Get(Bytes.toBytes(contentId));
            get1.addColumn(Bytes.toBytes(ContentidxTable_ColumnFamily),
                    Bytes.toBytes(ContentidxTable_ColumnQualifier));
            Result result1 = table.get(get1);
            byte[] val1 = result1.getValue(
                    Bytes.toBytes(ContentidxTable_ColumnFamily),
                    Bytes.toBytes(ContentidxTable_ColumnQualifier));
            if (val1 != null) {
                LOGGER.info("Fetched data from BARB-Content table");
            } else {
                LOGGER.error("Error fetching data from BARB-Content table");
            }
            return_value = contentjoin(Bytes.toString(val1), contentId);
        }
    } catch (Exception e) {
        LOGGER.error("Error inside contentidxjoin method");
        e.printStackTrace();
    }
    return return_value;
}
}

Assume all variables are defined.

Can anyone please tell me why the table never gets instantiated, i.e. why the
body of the try block is never entered? I set breakpoints, and this function
gets called many times while the mapper executes; every time it logs *Error
inside contentidxjoin method*. I'm certain there are rows in ContentidxTable,
so I don't understand why it can't fetch the value.
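In case it helps narrow things down: the top frame of the trace is
Bytes.toBytes(String) called from inside the HTable constructor, and that
method throws a NullPointerException when its String argument is null. Here
is a tiny plain-Java stand-in I used to convince myself of that failure mode
(no HBase on the classpath; the field name just mirrors my constant, and the
toBytes helper only imitates what Bytes.toBytes(String) does internally):

```java
import java.nio.charset.StandardCharsets;

public class NullTableNameDemo {
    // Stand-in for my table-name constant. In a MapReduce job, a static
    // field assigned only in the driver JVM stays null in the mapper JVMs.
    static String ContentidxTable; // never initialized here -> null

    // Imitates HBase's Bytes.toBytes(String), which encodes the string to
    // UTF-8 bytes and therefore throws NullPointerException on null input.
    static byte[] toBytes(String s) {
        return s.getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        try {
            toBytes(ContentidxTable); // same call shape as the failing frame
            System.out.println("table name was non-null");
        } catch (NullPointerException e) {
            // This is the branch that fires, matching my stack trace.
            System.out.println("NPE: table name is null");
        }
    }
}
```

So the NPE would be consistent with ContentidxTable itself being null inside
the mapper JVM, even though it is set on the client side; that's the scenario
I'd like someone to confirm or rule out.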

Please help!


-- 
Regards-
Pavan