Thanks Ravi. Yes, I am using a composite row key with salt. Pig might not be the best option for me, since I had planned to do something more custom in the reduce step. I had tried a Phoenix call with a GROUP BY and COUNT; it works fine on smaller sets, but on larger tables I was getting scan timeouts. It also seemed that what I was trying to do lent itself better to M/R. Please keep me posted, and let me know if you know of a workaround.

On Aug 21, 2014 8:51 PM, "Ravi Kiran" <maghamraviki...@gmail.com> wrote:
> Hi Jody,
>
> Can you please let us know if the HBase table that you would like to read
> from has a composite row key? If not, I believe using the standard
> TableMapReduceUtil API should do fine. However, it becomes a bit tricky
> when the row key is a composite one. In this case, I am afraid you cannot
> write a M/R job for now, as we haven't exposed any *InputFormat /
> *RecordReader that seamlessly uses Phoenix libraries to demystify the row
> key. We are working on this and will keep you posted.
>
> In case you are looking for any other option, we do have a phoenix-pig
> artifact that can come in handy.
>
> HTH.
>
> Regards
> Ravi
>
>
> On Thu, Aug 21, 2014 at 6:06 PM, Jody Landreneau <jodylandren...@gmail.com>
> wrote:
>
>> Hello,
>>
>> I'm attempting to write a M/R job to read across my HBase tables that
>> were written using Phoenix. I'm not seeing much info on how this is done.
>> I feel that if I were to use the normal HBase TableMapReduceUtil I will
>> most likely not get what I expect. It looks like I would need some
>> utility to convert a composite primary key back into its fields, as they
>> don't seem to be stored separately. Additionally, the key is salted.
>>
>> Any advice or pointers to documentation would be appreciated. I see there
>> are a few examples for writing to a Phoenix table, but I'm looking to
>> read from one.
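As a possible stopgap for the scan timeouts mentioned above, the client-side scanner and query timeouts can be raised in hbase-site.xml. This is a sketch only: the property names below are standard HBase/Phoenix settings, but the 10-minute values are illustrative, not recommendations, and a long-running aggregation may still be better served by M/R.

```xml
<!-- hbase-site.xml (client side): illustrative values only -->
<property>
  <name>hbase.rpc.timeout</name>
  <value>600000</value> <!-- 10 minutes, in milliseconds -->
</property>
<property>
  <name>hbase.client.scanner.timeout.period</name>
  <value>600000</value>
</property>
<property>
  <name>phoenix.query.timeoutMs</name>
  <value>600000</value>
</property>
```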
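Until the Phoenix *InputFormat / *RecordReader described above is available, a mapper can at least split a salted composite key by hand. The sketch below is illustrative only, and the class and method names are hypothetical: it assumes a single leading salt byte and VARCHAR-only primary-key columns, which Phoenix separates with zero bytes; fixed-width column types are encoded differently, so check your actual schema before relying on this.

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SaltedKeyDecoder {

    // Hypothetical helper: drop the single leading salt byte, then split the
    // remaining bytes on \x00 separators (how Phoenix delimits variable-length
    // VARCHAR columns in a composite primary key). The last field has no
    // trailing separator.
    public static List<String> decode(byte[] rowKey) {
        byte[] unsalted = Arrays.copyOfRange(rowKey, 1, rowKey.length);
        List<String> fields = new ArrayList<>();
        int start = 0;
        for (int i = 0; i <= unsalted.length; i++) {
            if (i == unsalted.length || unsalted[i] == 0) {
                fields.add(new String(unsalted, start, i - start, StandardCharsets.UTF_8));
                start = i + 1;
            }
        }
        return fields;
    }

    public static void main(String[] args) {
        // Fake salted key: salt byte 0x05, then "us" \x00 "web" \x00 "id42"
        byte[] key = "\u0005us\u0000web\u0000id42".getBytes(StandardCharsets.UTF_8);
        System.out.println(decode(key)); // [us, web, id42]
    }
}
```

In a real TableMapReduceUtil mapper, this kind of decode would run on the bytes from the row's `ImmutableBytesWritable` key before the reduce-side aggregation.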