Hmm. I'm calling HTables and doing puts and gets, so that won't work
for my scenario.
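
To be concrete, my mappers do something like this today (a rough
sketch only; the table name "my_table" and the column family "cf" are
placeholders, not my real schema):

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Mapper that writes each input line into an HBase table.
// Because it talks to HBase directly, the cluster must be up.
public class PutMapper extends Mapper<LongWritable, Text, Text, Text> {
  private HTable table;

  @Override
  protected void setup(Context context) throws IOException {
    Configuration conf = HBaseConfiguration.create(context.getConfiguration());
    table = new HTable(conf, "my_table"); // placeholder table name
  }

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    Put put = new Put(Bytes.toBytes(key.get()));
    put.add(Bytes.toBytes("cf"), Bytes.toBytes("line"),
        Bytes.toBytes(value.toString()));
    table.put(put); // this call fails if HBase is down
  }

  @Override
  protected void cleanup(Context context) throws IOException {
    table.close();
  }
}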

But if I "simply" need to map the data and output on a file, that
should be working fine with HBase down, right?
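
Just to make sure we mean the same thing, here is roughly the kind of
job I have in mind (a minimal sketch; the input and output paths come
from the command line and there is nothing HBase-specific on the
classpath):

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MapToFile {

  // Map-only pass-through: reads text lines and writes them back out.
  // No HBase classes are used, so HBase can be stopped.
  public static class PassThroughMapper
      extends Mapper<LongWritable, Text, Text, Text> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      context.write(new Text(Long.toString(key.get())), value);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = new Job(conf, "map-to-file"); // Hadoop 1.x-style constructor
    job.setJarByClass(MapToFile.class);
    job.setMapperClass(PassThroughMapper.class);
    job.setNumReduceTasks(0); // map-only: mapper output goes straight to files
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(Text.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // input dir
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output dir
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}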

2012/11/14, Ted Yu <[email protected]>:
> If your MapReduce job doesn't access HBase, HBase can be stopped.
>
> On Wed, Nov 14, 2012 at 10:51 AM, Jean-Marc Spaggiari <
> [email protected]> wrote:
>
>> Hi,
>>
>> I'm wondering: when I run a MapReduce job over the cluster, do I need
>> to have HBase running, or can I stop it? That would allow me to give
>> an additional 2GB to Hadoop instead of giving it to HBase when it's
>> not used.
>>
>> Thanks,
>>
>> JM
>>
>
