Most of the operations I do with MR are exporting tables and importing tables. Does that still require a lot of memory, and does it help to allocate more memory to jobs like that?
Yes, I have 12 cores also. Are there any HDFS/MR/HBase tuning tips for this many processors? btw, 64GB is a lot for us :-)

On Fri, May 11, 2012 at 7:29 AM, Michael Segel <[email protected]> wrote:

> Funny, but this is part of a talk that I submitted to Strata....
>
> 64GB and HBase isn't necessarily a 'large machine'.
>
> If you're running with 12 cores, you're talking about a minimum of 48GB just
> for M/R.
> (4GB per core is a good rule of thumb.)
>
> Depending on what you want to do, you could set aside 8GB of heap and tune
> that, but even that might not be enough...
>
>
> On May 11, 2012, at 5:42 AM, Rita wrote:
>
> > Hello,
> >
> > While looking at
> > http://hbase.apache.org/book.html#important_configurations,
> > I noticed the large machine configuration section still isn't completed.
> > "Unfortunately", I am running on a large machine which has 64GB of memory,
> > so I would need some help tuning my HBase/Hadoop instance for
> > maximum performance. Can someone please shed light on what I should look
> > into?
> >
> > --
> > --- Get your facts first, then you can distort them as you please.--

--
--- Get your facts first, then you can distort them as you please.--
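
For what it's worth, here is roughly how the budget Michael describes might translate into config on a 64GB, 12-core box. This is only a sketch, assuming a Hadoop 1.x-style setup with the RegionServer, DataNode and TaskTracker co-located on the node; hbase-env.sh, mapred-site.xml and the property names are the stock ones, but the slot counts and heap sizes are illustrative and would need tuning against your own workload:

    # hbase-env.sh -- give the RegionServer an 8GB heap of its own
    export HBASE_HEAPSIZE=8192

    <!-- mapred-site.xml -- cap task slots and per-task heap so that
         slots * child heap + HBase heap + Hadoop daemons + OS page cache
         fits comfortably inside 64GB -->
    <property>
      <name>mapred.tasktracker.map.tasks.maximum</name>
      <value>8</value>
    </property>
    <property>
      <name>mapred.tasktracker.reduce.tasks.maximum</name>
      <value>4</value>
    </property>
    <property>
      <name>mapred.child.java.opts</name>
      <value>-Xmx2048m</value>
    </property>

With those numbers, 12 task slots at 2GB each come to 24GB, plus 8GB for HBase and a few GB for the Hadoop daemons, leaving the rest for the OS page cache. As for the Export/Import jobs mentioned above: those are map-only and mostly I/O-bound, so they generally don't need an oversized child heap; raising mapred.child.java.opts tends to matter more for jobs that sort or aggregate a lot in memory.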
