Hello!

A rather rudimentary thing. I have noticed that sometimes the HBase shell shows double values in a readable format and sometimes as an array of 8 escaped (and thus unreadable) bytes.
This happens when I write them into the table from a Java client, e.g.:
    org.apache.hadoop.hbase.util.Bytes.toBytes(1.23);
In this case I have problems reading/mapping the value in a Pig script and in CDH Beeswax/Hue.
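For context, Bytes.toBytes(double) serializes the raw IEEE-754 bit pattern, not text. A minimal sketch with plain JDK calls (which mirror what that HBase utility does internally, as I understand it) shows why the shell cannot display the cell as readable text:

```java
import java.nio.ByteBuffer;

public class DoubleBytesDemo {
    public static void main(String[] args) {
        // Equivalent of Bytes.toBytes(1.23): the IEEE-754 bit
        // pattern of the double, big-endian, as 8 raw bytes.
        byte[] binary = ByteBuffer.allocate(8).putDouble(1.23).array();
        System.out.println(binary.length); // 8

        // Most of these bytes are non-printable, so the HBase shell
        // escapes them (e.g. \x3F\xF3...), which looks unreadable.
        StringBuilder sb = new StringBuilder();
        for (byte b : binary) {
            sb.append(String.format("\\x%02X", b));
        }
        System.out.println(sb);
    }
}
```

Tools that expect text (the shell, Hue, Pig's default UTF-8 caster) see these 8 bytes, not the number 1.23.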

I made a temporary workaround by writing the doubles using Pig's converter class, e.g.:
    org.apache.pig.backend.hadoop.hbase.HBaseBinaryConverter.toBytes(1.23);

In contrast, when I aggregate doubles in a Pig script and store them into some other table, they are readable in the HBase shell, as if they were strings, and are also readable by Pig and other programs. What is the proper way to write doubles into an HBase table from a Java client?
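If I understand the readable case correctly, those cells hold the decimal string representation of the double rather than its bit pattern. A sketch of that approach with plain JDK calls (the actual write would wrap these bytes in an HBase Put, omitted here):

```java
import java.nio.charset.StandardCharsets;

public class ReadableDoubleDemo {
    public static void main(String[] args) {
        double value = 1.23;

        // Store the decimal string instead of the raw bit pattern;
        // this is what makes the cell human-readable in the shell.
        byte[] text = Double.toString(value).getBytes(StandardCharsets.UTF_8);
        System.out.println(new String(text, StandardCharsets.UTF_8)); // 1.23

        // Reading it back is a parse, not a bit reinterpretation.
        double roundTrip = Double.parseDouble(new String(text, StandardCharsets.UTF_8));
        System.out.println(roundTrip == value); // true
    }
}
```

The trade-off is that string-encoded doubles cost more bytes and do not sort numerically as raw byte arrays, so which encoding is "proper" depends on whether the shell/Pig readability or compact binary storage matters more.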

Thanks.
