Flavio:
Have you considered using TableSnapshotInputFormat?

See TableMapReduceUtil#initTableSnapshotMapperJob()
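A rough sketch of what that looks like (snapshot name, restore directory, and the row-counting mapper below are all hypothetical — it assumes you've already taken a snapshot, e.g. via the HBase shell: snapshot 'mytable', 'mytable_snapshot'). The job then reads the snapshot's HFiles directly from HDFS, bypassing the region servers, so a region split can't take the scan offline mid-job:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.mapreduce.Job;

public class SnapshotScanJob {

  // Hypothetical mapper: emits one count per row; reduce side omitted.
  static class RowCountMapper
      extends TableMapper<LongWritable, NullWritable> {
    @Override
    protected void map(ImmutableBytesWritable key, Result value, Context ctx)
        throws java.io.IOException, InterruptedException {
      ctx.write(new LongWritable(1L), NullWritable.get());
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    Job job = Job.getInstance(conf, "snapshot-scan");
    job.setJarByClass(SnapshotScanJob.class);

    Scan scan = new Scan();
    scan.setCaching(500);        // larger batches for scan-heavy MR jobs
    scan.setCacheBlocks(false);  // don't pollute the block cache

    // Reads the snapshot's files directly; region servers (and thus
    // in-flight region splits) are not involved in the scan.
    TableMapReduceUtil.initTableSnapshotMapperJob(
        "mytable_snapshot",                  // snapshot to read (example name)
        scan,
        RowCountMapper.class,
        LongWritable.class,
        NullWritable.class,
        job,
        true,                                // ship HBase jars with the job
        new Path("/tmp/snapshot_restore"));  // scratch dir for restored HFiles

    job.setNumReduceTasks(0);
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Note the snapshot is a point-in-time view, so writes that land after the snapshot was taken won't appear in the scan.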

Cheers

On Fri, Oct 31, 2014 at 10:01 AM, Flavio Pompermaier <[email protected]>
wrote:

> Is there anybody here..?
>
> On Thu, Oct 30, 2014 at 2:28 PM, Flavio Pompermaier <[email protected]>
> wrote:
>
> > Any help about this..?
> >
> > On Wed, Oct 29, 2014 at 9:08 AM, Flavio Pompermaier <
> [email protected]>
> > wrote:
> >
> >> Hi to all,
> >> I was reading
> >>
> http://www.abcn.net/2014/07/spark-hbase-result-keyvalue-bytearray.html?m=1
> >> and they say " still using
> >> org.apache.hadoop.hbase.mapreduce.TableInputFormat is a big problem,
> your
> >> job will fail when one of HBase Region for target HBase table is
> splitting
> >> ! because the original region will be offline by splitting".
> >>
> >> Is that true?
> >> Is there a solution to that?
> >>
> >> Best,
> >> Flavio
> >>
> >
>