Re: Too many open files in 0.18.3

2009-02-13 Thread Raghu Angadi
Sean, a few things in your messages are not clear to me. Currently this is what I make of it: 1) with a 1k limit, you do see the problem; 2) with a 16k limit, it is not clear (?) whether you see the problem; 3) with an 8k limit, you don't see the problem; 3a) with or without the patch, I don't know. But if …

Re: Too many open files in 0.18.3

2009-02-13 Thread Sean Knapp
Raghu, Apologies for the confusion. I was seeing the problem with any setting for dfs.datanode.max.xcievers... 1k, 2k and 8k. Likewise, I was also seeing the problem with different open-file settings, all the way up to 32k. Since I installed the patch, HDFS has been performing much better. The …

Re: Too many open files in 0.18.3

2009-02-13 Thread Raghu Angadi
Sean Knapp wrote: Raghu, Apologies for the confusion. I was seeing the problem with any setting for dfs.datanode.max.xcievers... 1k, 2k and 8k. Likewise, I was also seeing the problem with different open-file settings, all the way up to 32k. Since I installed the patch, HDFS has been performing …

Re: Too many open files in 0.18.3

2009-02-13 Thread Sean Knapp
Raghu, Great, thanks for the help. Regards, Sean. 2009/2/13 Raghu Angadi rang...@yahoo-inc.com: Sean Knapp wrote: Raghu, Apologies for the confusion. I was seeing the problem with any setting for dfs.datanode.max.xcievers... 1k, 2k and 8k. Likewise, I was also seeing the problem with …

Too many open files in 0.18.3

2009-02-12 Thread Sean Knapp
Hi all, I'm continually running into the "Too many open files" error on 18.3:

    DataXceiveServer: java.io.IOException: Too many open files
        at sun.nio.ch.ServerSocketChannelImpl.accept0(Native Method)
        at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:145)
    …
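Context for the trace above: every active DataXceiver thread, open block file, and client connection consumes a file descriptor in the datanode process, so accept() starts failing once the per-process limit is exhausted. A minimal diagnostic sketch, assuming a Sun JVM where com.sun.management.UnixOperatingSystemMXBean is available (the wiring here is illustrative, not part of the original thread):

    import java.lang.management.ManagementFactory;
    import java.lang.management.OperatingSystemMXBean;
    import com.sun.management.UnixOperatingSystemMXBean;

    public class FdUsage {
        public static void main(String[] args) {
            OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
            if (os instanceof UnixOperatingSystemMXBean) {
                UnixOperatingSystemMXBean unix = (UnixOperatingSystemMXBean) os;
                // "Too many open files" fires once the open count hits the max.
                System.out.println("open fds: " + unix.getOpenFileDescriptorCount()
                        + " / max: " + unix.getMaxFileDescriptorCount());
            }
        }
    }

Running this inside (or alongside) the datanode JVM shows how close the process is to its limit before the IOException appears.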

Re: Too many open files in 0.18.3

2009-02-12 Thread Mark Kerzner
I once had "too many open files" when I was opening too many sockets and not closing them... On Thu, Feb 12, 2009 at 1:56 PM, Sean Knapp s...@ooyala.com wrote: Hi all, I'm continually running into the "Too many open files" error on 18.3: DataXceiveServer: java.io.IOException: Too many open files …
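Mark's scenario is the classic non-HDFS cause: any socket or stream that is opened but never closed pins a descriptor until the process exits. A minimal sketch of the defensive pattern, in the pre-Java-7 style of this thread's era (host and port are hypothetical):

    import java.io.IOException;
    import java.net.Socket;

    public class SocketHygiene {
        static void ping(String host, int port) throws IOException {
            Socket s = new Socket(host, port);  // consumes one file descriptor
            try {
                s.getOutputStream().write(0);
            } finally {
                s.close();  // releases the descriptor even if the write fails
            }
        }
    }

Forgetting the finally block leaks one descriptor per failed call, which is exactly how a long-running process creeps up to its limit.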

Re: Too many open files in 0.18.3

2009-02-12 Thread Raghu Angadi
You are most likely being hit by https://issues.apache.org/jira/browse/HADOOP-4346. I hope it gets backported. There is a 0.18 patch posted there. btw, does 16k help in your case? Ideally 1k should be enough (with a small number of clients). Please try the above patch with a 1k limit. Raghu.
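For reference, the limit Raghu is referring to is the datanode's transceiver cap, dfs.datanode.max.xcievers (the misspelling is the actual key in 0.18; the default was 256). It is normally set in hadoop-site.xml on each datanode, while the OS descriptor limit is raised separately with ulimit -n. A programmatic illustration only, using the stock org.apache.hadoop.conf.Configuration API:

    import org.apache.hadoop.conf.Configuration;

    public class XceiverLimit {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            // Raghu's "1k limit": cap concurrent DataXceiver threads at 1024.
            conf.setInt("dfs.datanode.max.xcievers", 1024);
            System.out.println(conf.getInt("dfs.datanode.max.xcievers", 256));
        }
    }

Note that setting this on a client-side Configuration has no effect on a running cluster; the datanode reads the value from its own config files at startup.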

Re: Too many open files in 0.18.3

2009-02-12 Thread Sean Knapp
Raghu, Thanks for the quick response. I've been beating up on the cluster for a while now, and so far so good. I'm still at 8k... what should I expect to see with 16k versus 1k? The 8k setting didn't appear to be affecting things to begin with. Regards, Sean. On Thu, Feb 12, 2009 at 2:07 PM, Raghu Angadi …