Hello,

I tried the shell command which Swarnim kindly provided, and it lets me
map an existing HBase table into Hive. However, since my qualifiers are
longs but the Hive map type only accepts strings as keys, the result is
garbled. Even with the suggested patch that allows binary keys, the
resulting datatype in Hive would be binary rather than long, making it
hard to query from the shell. It seems there is no API for this at the
moment, right?

Currently, is there any way to map HBase byte[] to Hive datatypes?

The assumption is that all byte[] were generated using HBase's
Bytes.toBytes(<type>) method and that all row keys, qualifiers and
values share the same data type respectively (for example: row keys are
ints, qualifiers are longs and values are strings).
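
To make the question concrete, here is a rough sketch of the table
definition I am aiming for (table, column family and column names are
made up; the "#b" suffix is the binary storage type from the Hive/HBase
integration):

  -- hypothetical names; "#b" marks binary storage for byte[] written
  -- with Bytes.toBytes()
  CREATE EXTERNAL TABLE hbase_mapped (
    rowkey     INT,                 -- row keys written as Bytes.toBytes(int)
    qualifiers MAP<BIGINT, STRING>  -- qualifiers are longs, values are strings
  )
  STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
  WITH SERDEPROPERTIES (
    "hbase.columns.mapping" = ":key#b,cf:#b"
  )
  TBLPROPERTIES ("hbase.table.name" = "my_hbase_table");

Ideally I could then run something like
SELECT qualifiers[1354732800000] FROM hbase_mapped WHERE rowkey = 42;
but as far as I can tell, the long-typed map keys are exactly the part
that the "#b" suffix does not cover without the patch mentioned above.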

Thank you,

/David


On Thu, Dec 6, 2012 at 9:23 PM, David Koch <ogd...@googlemail.com> wrote:

> Hello Swarnim,
>
> Thank you for your answer. I will try the options you pointed out.
>
> /David
>
>
> On Thu, Dec 6, 2012 at 9:10 PM, kulkarni.swar...@gmail.com <
> kulkarni.swar...@gmail.com> wrote:
>
