Thanks. It's odd that there are any null CellTypes in the table.

On Sunday, September 18, 2016, Ted Yu <yuzhih...@gmail.com> wrote:

> The following KeyValue ctor is called by your code:
>
>   public KeyValue(final byte[] row, final byte[] family,
>       final byte[] qualifier, final long timestamp, final byte[] value) {
>     this(row, family, qualifier, timestamp, Type.Put, value);
>
> which should set the Type to Type.Put.
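>
> If you want to set the type explicitly, you can call the six-argument
> ctor that the one above delegates to. A minimal sketch, using the same
> arguments as in the signature above:
>
>   import org.apache.hadoop.hbase.KeyValue;
>
>   // Same cell as the five-argument form, with the type spelled out:
>   KeyValue kv = new KeyValue(row, family, qualifier, timestamp,
>       KeyValue.Type.Put, value);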
>
>
> FYI
>
> On Sun, Sep 18, 2016 at 11:34 AM, Krishna <research...@gmail.com> wrote:
>
> > I will try that. And when inserting KeyValues, how would I set CellType?
> >
> >
> > On Sunday, September 18, 2016, Ted Yu <yuzhih...@gmail.com> wrote:
> >
> > > If you have bandwidth, you can try the following change, which would
> > > show the KeyValue that doesn't have a CellType:
> > >
> > > http://pastebin.com/TRRF1gtd
> > >
> > > You need to apply the change, build hbase-client jar and use this jar.
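> > >
> > > Alternatively, a raw client-side scan can surface the offending type
> > > bytes without rebuilding anything. A hypothetical diagnostic sketch
> > > (untested; the class name and argument handling are made up for
> > > illustration, table name passed on the command line):
> > >
> > > import org.apache.hadoop.conf.Configuration;
> > > import org.apache.hadoop.hbase.Cell;
> > > import org.apache.hadoop.hbase.CellUtil;
> > > import org.apache.hadoop.hbase.HBaseConfiguration;
> > > import org.apache.hadoop.hbase.KeyValue;
> > > import org.apache.hadoop.hbase.TableName;
> > > import org.apache.hadoop.hbase.client.*;
> > > import org.apache.hadoop.hbase.util.Bytes;
> > >
> > > public class TypeByteScan {
> > >   public static void main(String[] args) throws Exception {
> > >     Configuration conf = HBaseConfiguration.create();
> > >     try (Connection conn = ConnectionFactory.createConnection(conf);
> > >          Table table = conn.getTable(TableName.valueOf(args[0]))) {
> > >       Scan scan = new Scan();
> > >       scan.setRaw(true); // also include delete markers (non-Put types)
> > >       try (ResultScanner scanner = table.getScanner(scan)) {
> > >         for (Result r : scanner) {
> > >           for (Cell c : r.rawCells()) {
> > >             // Print the row of any cell whose type byte isn't Put (4)
> > >             if (c.getTypeByte() != KeyValue.Type.Put.getCode()) {
> > >               System.out.println(Bytes.toStringBinary(CellUtil.cloneRow(c))
> > >                   + " type byte = " + c.getTypeByte());
> > >             }
> > >           }
> > >         }
> > >       }
> > >     }
> > >   }
> > > }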
> > >
> > > Cheers
> > >
> > > On Sat, Sep 17, 2016 at 6:34 PM, Krishna <research...@gmail.com> wrote:
> > >
> > > > I'm using HBase 1.1 and didn't encounter any issues accessing this
> > > > table in other scenarios. Data is written to this table as HBase
> > > > KeyValue objects using MapReduce jobs:
> > > >
> > > > ctx.write(rowkey, new KeyValue(rowkey.get(), "cf".getBytes(),
> > > >     "cq".getBytes(), timestamp, field.getBytes()));
> > > >
> > > > On Sat, Sep 17, 2016 at 1:04 PM, Ted Yu <yuzhih...@gmail.com> wrote:
> > > >
> > > > > Here is related code from CellProtos.java:
> > > > >
> > > > >       public Builder setCellType(
> > > > >           org.apache.hadoop.hbase.protobuf.generated.CellProtos.CellType value) {
> > > > >         if (value == null) {
> > > > >           throw new NullPointerException();
> > > > >
> > > > > This means CellType.valueOf() returned null for the Cell.
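> > > > >
> > > > > For context, the proto2-generated enum looks up numeric codes with
> > > > > a method that returns null for an unknown code instead of throwing.
> > > > > A stand-in sketch of that pattern (DemoCellType and forCode are made
> > > > > up; the codes mirror CellType's declared values, and note that
> > > > > KeyValue.Type.DeleteFamilyVersion's code 10 has no counterpart here,
> > > > > if I'm reading Cell.proto right):
> > > > >
> > > > > enum DemoCellType {
> > > > >   MINIMUM(0), PUT(4), DELETE(8), DELETE_COLUMN(12),
> > > > >   DELETE_FAMILY(14), MAXIMUM(255);
> > > > >
> > > > >   private final int code;
> > > > >   DemoCellType(int code) { this.code = code; }
> > > > >
> > > > >   // Mirrors the proto2 lookup: null, not an exception, for unknown codes
> > > > >   static DemoCellType forCode(int code) {
> > > > >     for (DemoCellType t : values()) {
> > > > >       if (t.code == code) return t;
> > > > >     }
> > > > >     return null; // an unmapped type byte ends up as setCellType(null)
> > > > >   }
> > > > >
> > > > >   public static void main(String[] args) {
> > > > >     System.out.println(forCode(4));  // PUT
> > > > >     System.out.println(forCode(10)); // null -> NPE in the Builder
> > > > >   }
> > > > > }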
> > > > >
> > > > > Which release of HBase are you using?
> > > > >
> > > > > You didn't encounter any problems accessing <table_name> in other
> > > > > scenarios?
> > > > >
> > > > > Cheers
> > > > >
> > > > > On Sat, Sep 17, 2016 at 11:13 AM, Krishna <research...@gmail.com> wrote:
> > > > >
> > > > > > I'm getting an NPE when attempting to export an HBase table with:
> > > > > >
> > > > > >   hbase org.apache.hadoop.hbase.mapreduce.Export <table_name> <hdfs_dir>
> > > > > >
> > > > > > Does anyone know what could be causing the exception?
> > > > > >
> > > > > > Here is the error stack.
> > > > > >
> > > > > > Error: java.lang.NullPointerException
> > > > > >         at org.apache.hadoop.hbase.protobuf.generated.CellProtos$Cell$Builder.setCellType(CellProtos.java:1050)
> > > > > >         at org.apache.hadoop.hbase.protobuf.ProtobufUtil.toCell(ProtobufUtil.java:2531)
> > > > > >         at org.apache.hadoop.hbase.protobuf.ProtobufUtil.toResult(ProtobufUtil.java:1303)
> > > > > >         at org.apache.hadoop.hbase.mapreduce.ResultSerialization$ResultSerializer.serialize(ResultSerialization.java:155)
> > > > > >         at org.apache.hadoop.hbase.mapreduce.ResultSerialization$ResultSerializer.serialize(ResultSerialization.java:140)
> > > > > >         at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.append(SequenceFile.java:1591)
> > > > > >         at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat$1.write(SequenceFileOutputFormat.java:83)
> > > > > >         at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:658)
> > > > > >         at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
> > > > > >         at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
> > > > > >         at org.apache.hadoop.hbase.mapreduce.IdentityTableMapper.map(IdentityTableMapper.java:66)
> > > > > >         at org.apache.hadoop.hbase.mapreduce.IdentityTableMapper.map(IdentityTableMapper.java:33)
> > > > > >         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
> > > > > >         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
> > > > > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > > > > >         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
> > > > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > > > >         at javax.security.auth.Subject.doAs(Subject.java:415)
> > > > > >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
> > > > > >         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> > > > > >
> > > > >
> > > >
> > >
> >
>
