Harsh,
As you may recall from other posts we have exchanged, I am at a very early
stage of evaluating Hadoop/HBase. We will start development in the next month
or so, so production is not in question at this time.

But I am wondering: if the system runs fine in testing, why does HBase book
section 2.2.4 ask to set the limit at this initial stage? Any idea?

http://hbase.apache.org/book/os.html
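For what it's worth, here is how I have been checking the current limits from a shell (the launchctl command in the comments is macOS-specific, and the value 10240 is only illustrative, not a recommendation from the HBase book):

```shell
# Show the per-process open-file limit for the current shell session
ulimit -n

# Show the maximum number of user processes for the current session
ulimit -u

# Raising the limit for the current session only would look like this
# (may fail if it exceeds the hard limit; run as root to go higher):
#   ulimit -n 10240

# On macOS, a machine-wide change can be made via launchctl (as root),
# per the launchctl man page linked below:
#   sudo launchctl limit maxfiles 10240 200000
```

Changes made with plain ulimit only last for the current shell session, which is why the thread below points at launchctl for a permanent setting.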

-Jignesh

On Mon, Oct 17, 2011 at 1:57 PM, Harsh J <[email protected]> wrote:

> Jignesh,
>
> Curious whether you are going to run HBase in production on OS X. Otherwise,
> the default limits should be good enough for testing, and HBase also
> seems to set its own -n option before it begins.
>
> In any case, if you are looking to set things permanently, and using
> ulimit is not cutting it for you, check out launchctl:
>
> http://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/man1/launchctl.1.html
> [See the "limit" expansions]
>
> On Mon, Oct 17, 2011 at 11:24 PM, Jignesh Patel <[email protected]>
> wrote:
> > My machine shows as follows:
> >
> > Jignesh-MacBookPro:~ hadoop-user$ sysctl -a | grep maxproc
> > kern.maxproc = 1064
> > kern.maxprocperuid = 709
> > kern.maxproc: 1064
> > kern.maxprocperuid: 709
> >
> >
> > On Mon, Oct 17, 2011 at 1:51 PM, Jignesh Patel <[email protected]
> >wrote:
> >
> >> While setting up HBase there is an article about setting the file limit.
> However
> >> I didn't find an appropriate command to set up the same on Mac.
> >>
> >> Please let me know how to set the maximum file limit permanently.
> >>
> >> -Jignesh
> >>
> >
>
>
>
> --
> Harsh J
>
