Thanks Suraj. In the end I had to keep HBase running because I'm outputting to an HTable.
But I understand that if I only read from the parameters I'm getting, I don't need HBase to be running. That's good to know.

JM

2012/11/21, Suraj Varma <[email protected]>:
> Right - if your map is not accessing HBase at all ... it can be down.
> --Suraj
>
> On Wed, Nov 14, 2012 at 11:03 AM, Jean-Marc Spaggiari
> <[email protected]> wrote:
>> Hmm. I'm calling HTables, and doing puts and gets, so that will not
>> work for my scenario.
>>
>> But if I "simply" need to map the data and output to a file, that
>> should work fine with HBase down, right?
>>
>> 2012/11/14, Ted Yu <[email protected]>:
>>> If your MapReduce job doesn't access HBase, HBase can be stopped.
>>>
>>> On Wed, Nov 14, 2012 at 10:51 AM, Jean-Marc Spaggiari <
>>> [email protected]> wrote:
>>>
>>>> Hi,
>>>>
>>>> I'm wondering: when I run a MapReduce job over the cluster, do I need
>>>> to have HBase running, or can I shut it down? That would let me give
>>>> an additional 2GB to Hadoop instead of to HBase when it's not being
>>>> used.
>>>>
>>>> Thanks,
>>>>
>>>> JM
