And it also shows:

Jignesh-MacBookPro:~ hadoop-user$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
file size               (blocks, -f) unlimited
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
*open files                     (-n) 256*
pipe size            (512 bytes, -p) 1
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 709
virtual memory          (kbytes, -v) unlimited
Jignesh-MacBookPro:~ hadoop-user$
On Mon, Oct 17, 2011 at 1:54 PM, Jignesh Patel <[email protected]> wrote:

> My machine shows the following:
>
> Jignesh-MacBookPro:~ hadoop-user$ sysctl -a | grep maxproc
> kern.maxproc = 1064
> kern.maxprocperuid = 709
> kern.maxproc: 1064
> kern.maxprocperuid: 709
>
>
> On Mon, Oct 17, 2011 at 1:51 PM, Jignesh Patel <[email protected]> wrote:
>
>> While setting up HBase there is an article about raising the open-file
>> limit. However, I didn't find an appropriate command to do the same on
>> a Mac.
>>
>> Please let me know how to set the max open-files limit permanently.
>>
>> -Jignesh
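The *open files (-n) 256* line in the ulimit output above is the per-process soft limit that HBase complains about. A minimal sketch of how it is typically raised, assuming a 10.6/10.7-era macOS (the launchctl and /etc/launchd.conf steps are macOS-specific, need root, and are shown as comments; the value 10240 is only an illustrative choice):

```shell
# Show the current per-process open-file soft limit (256 is the macOS default).
ulimit -n

# Raise the soft limit for the current shell session only.
# This is lost when the terminal is closed.
ulimit -n 2048
ulimit -n

# macOS-specific steps, shown as comments (require sudo, Darwin-only):
#   launchctl limit maxfiles                          # inspect launchd's soft/hard caps
#   sudo launchctl limit maxfiles 10240 unlimited     # raise them until the next reboot
#   # To persist across reboots on 10.6/10.7, append to /etc/launchd.conf:
#   echo 'limit maxfiles 10240 unlimited' | sudo tee -a /etc/launchd.conf
```

Note that `ulimit -n` only affects the shell it is run in (and its children), so HBase must be started from a shell where the limit was already raised; the launchd route is what makes the change system-wide and permanent.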
