Jignesh, for an example of the error that JD is citing (and other common errors), see the Troubleshooting chapter:
http://hbase.apache.org/book.html#trouble

On 10/17/11 2:43 PM, "Jean-Daniel Cryans" <[email protected]> wrote:

> It always depends on the scope of your test :)
>
> For me the defaults are not a problem, but if all of a sudden you decide
> to create a couple of hundred regions, then you will run into "too many
> open files".
>
> J-D
>
> On Mon, Oct 17, 2011 at 11:37 AM, Jignesh Patel <[email protected]> wrote:
>
>> J-D, as Harsh said, in testing it will work without changing anything,
>> even with the open-files limit of 256. Is that correct?
>>
>> -Jignesh
>>
>> On Mon, Oct 17, 2011 at 2:27 PM, Jean-Daniel Cryans <[email protected]> wrote:
>>
>>> That's usually the first problem a user will hit, so we're very
>>> up-front about it.
>>>
>>> J-D
>>>
>>> On Mon, Oct 17, 2011 at 11:22 AM, Jignesh Patel <[email protected]> wrote:
>>>
>>>> Harsh,
>>>> As you know from other posts we have exchanged, I am at a very early
>>>> stage of evaluating Hadoop/HBase. We will start development in the
>>>> next month or so, so production is not in question at this time.
>>>>
>>>> But I am wondering: if the system runs fine in testing, why does HBase
>>>> book section 2.2.4 ask to raise the limit at the initial stage? Any idea?
>>>>
>>>> http://hbase.apache.org/book/os.html
>>>>
>>>> -Jignesh
>>>>
>>>> On Mon, Oct 17, 2011 at 1:57 PM, Harsh J <[email protected]> wrote:
>>>>
>>>>> Jignesh,
>>>>>
>>>>> Curious if you are going to run HBase in production on OS X.
>>>>> Otherwise, the default limits should be good enough for testing, and
>>>>> HBase also seems to set its own -n option before it begins.
>>>>>
>>>>> But anyway, if you are looking to set things permanently and using
>>>>> ulimit is not cutting it for you, check out launchctl:
>>>>>
>>>>> http://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/man1/launchctl.1.html
>>>>> [See the "limit" expansions]
>>>>>
>>>>> On Mon, Oct 17, 2011 at 11:24 PM, Jignesh Patel <[email protected]> wrote:
>>>>>> My machine shows the following:
>>>>>>
>>>>>> Jignesh-MacBookPro:~ hadoop-user$ sysctl -a | grep maxproc
>>>>>> kern.maxproc = 1064
>>>>>> kern.maxprocperuid = 709
>>>>>> kern.maxproc: 1064
>>>>>> kern.maxprocperuid: 709
>>>>>>
>>>>>> On Mon, Oct 17, 2011 at 1:51 PM, Jignesh Patel <[email protected]> wrote:
>>>>>>
>>>>>>> While setting up HBase there is an article on raising the file
>>>>>>> limit. However, I didn't find an appropriate command to do the same
>>>>>>> on Mac.
>>>>>>>
>>>>>>> Please let me know how to raise the max open-files limit permanently.
>>>>>>>
>>>>>>> -Jignesh
>>>>>
>>>>> --
>>>>> Harsh J
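
For anyone following this thread on OS X, here is a minimal sketch of checking and raising the open-files limit along the lines Harsh suggests; the 16384 value is purely illustrative, and the /etc/launchd.conf approach applies to the OS X releases current at the time of this thread:

    # Check the current limits: the shell's soft limit and launchd's maxfiles limits
    ulimit -n                      # per-shell soft limit, often 256 by default on OS X
    launchctl limit maxfiles       # prints launchd's soft and hard maxfiles limits

    # Raise the limit for the running launchd session (16384 is an illustrative value)
    sudo launchctl limit maxfiles 16384 16384

    # To make the change persist across reboots on OS X releases of this era,
    # add the same setting to /etc/launchd.conf (create the file if needed):
    #   limit maxfiles 16384 16384

    # Then raise the shell's own soft limit before starting HBase
    ulimit -n 16384

The kern.maxproc and kern.maxprocperuid values Jignesh lists are the analogous process-count limits; they can be inspected with sysctl but are separate from the open-files limit that the HBase book's ulimit section is concerned with.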
