Hi,

It looks like your classpath is not set up correctly: /etc/hadoop/conf and
/etc/hbase/conf need to be on the MapReduce classpath. Are you able to run
HBase's RowCounter job on the distributed cluster? What version of Hadoop
are you using? Did you use Ambari or Cloudera Manager to install the cluster?
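A minimal sketch of how I usually check this, assuming a standard layout
where the config dirs live under /etc (the paths and the table name below
are assumptions -- adjust them to your cluster):

```shell
# Put the Hadoop and HBase config dirs, plus HBase's MapReduce
# dependencies (printed by `hbase mapredcp`), on the job classpath.
# Paths are assumptions -- adjust to your cluster's layout.
export HADOOP_CLASSPATH="/etc/hadoop/conf:/etc/hbase/conf:$(hbase mapredcp)"

# Sanity check: run HBase's bundled RowCounter against a table
# (MY_TABLE is a placeholder). If this also falls back to
# mapred.LocalJobRunner, then mapred-site.xml is likely not on the
# classpath, or mapreduce.framework.name is not set to "yarn".
hbase org.apache.hadoop.hbase.mapreduce.RowCounter MY_TABLE
```

If RowCounter runs on the cluster but your Phoenix job does not, the
problem is in how your job picks up its configuration rather than in the
cluster itself.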

Thanks,
Anil Gupta

On Thu, May 26, 2016 at 7:19 AM, Lucie Michaud <
[email protected]> wrote:

> Hello everybody,
>
>
>
> For a few days I have been developing MapReduce code to insert values into
> HBase with Phoenix. But the job only runs locally and overloads the machine.
>
> Whatever changes I make, I observe that the mapred.LocalJobRunner class is
> systematically used.
>
>
>
> Do you have an idea of the problem?
>
>
>
> I have attached the execution logs of my program to this post.
>
>
>
> Thank you in advance for your help. :)
>
> Feel free to ask me for more details if that helps.
>



-- 
Thanks & Regards,
Anil Gupta
