Check the limit on the number of open files in your environment (the ulimit command on the shell). You should increase the limit to a larger value, such as 8k.
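For example (the value here is only illustrative; the hard limit and any persistent change, e.g. via /etc/security/limits.conf on RHEL, are distribution-specific):

  ulimit -n          # show the current per-process limit on open file descriptors
  ulimit -n 8192     # raise it for the current shell session (cannot exceed the hard limit)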
Raghu.

Keith Fisher wrote:
I'm running Hadoop version 0.17.0 on a Red Hat Enterprise Linux 4.4 box, using an IBM-provided JDK 1.5. I've configured Hadoop for localhost. I've written a simple test that opens and writes to files in HDFS; I close the output stream after writing 10 bytes to each file. After 471 files, I see an exception from DFSClient in the log4j logs for my test:

  Exception in createBlockOutputStream java.io.IOException: Too many open files
  Abandoning block blk_ .....
  DataStreamer Exception: java.net.SocketException: Too many open files
  Error recovery for block_ .... bad datanode[0]

I'd appreciate any suggestions on how to resolve this problem. Thanks.
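(For reference, the actual test code isn't shown; a minimal sketch of the kind of loop described might look like the following. The class name, path, and file count are hypothetical, and the FileSystem API shown is the standard one available in Hadoop 0.17.)

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.FSDataOutputStream;
  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.fs.Path;

  public class ManyFilesTest {
      public static void main(String[] args) throws Exception {
          Configuration conf = new Configuration();
          FileSystem fs = FileSystem.get(conf);   // connects to the HDFS configured for localhost
          byte[] payload = new byte[10];          // 10 bytes per file, as in the report

          for (int i = 0; i < 1000; i++) {
              Path p = new Path("/tmp/manyfiles/file-" + i);
              FSDataOutputStream out = fs.create(p);
              try {
                  out.write(payload);
              } finally {
                  out.close();                    // stream is closed promptly; the "Too many open
              }                                   // files" error can still occur if the process's
          }                                       // file-descriptor limit (ulimit -n) is too low
          fs.close();
      }
  }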
