Hi all,

I'm having some trouble with an MR job that writes to several thousand output files using MultipleOutputs. I've already raised the open file handle limit to 65000.
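For context, the reduce side looks roughly like the sketch below. The class and output names are placeholders rather than the real job, and it uses the old mapred API that ships with 0.20.x:

import java.io.IOException;
import java.util.Iterator;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;
import org.apache.hadoop.mapred.lib.MultipleOutputs;

public class FanOutReducer extends MapReduceBase
    implements Reducer<Text, Text, Text, Text> {

  private MultipleOutputs mos;

  @Override
  public void configure(JobConf conf) {
    mos = new MultipleOutputs(conf);
  }

  public void reduce(Text key, Iterator<Text> values,
      OutputCollector<Text, Text> output, Reporter reporter)
      throws IOException {
    while (values.hasNext()) {
      // One output file per key (keys assumed alphanumeric, since
      // MultipleOutputs only allows letters and digits in the name).
      // Each open file is a separate HDFS write stream, which is what
      // pushes up the datanode xceiver count.
      mos.getCollector("bykey", key.toString(), reporter)
         .collect(key, values.next());
    }
  }

  @Override
  public void close() throws IOException {
    mos.close();  // flush and close all the per-key output streams
  }
}

// In the driver, before submitting:
//   MultipleOutputs.addMultiNamedOutput(conf, "bykey",
//       TextOutputFormat.class, Text.class, Text.class);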
To raise the xceiver limit, I've set the dfs.datanode.max.xcievers property in hdfs-site.xml to 2047:

<property>
  <name>dfs.datanode.max.xcievers</name>
  <value>2047</value>
</property>

The property seems to be set properly, because it appears in the job's job.xml. But I'm still getting errors in the datanode logs that read:

java.io.IOException: xceiverCount 258 exceeds the limit of concurrent xcievers 256

So it appears the datanodes are still running with the default limit of 256 rather than the value I configured. I'm using Cloudera's Hadoop distribution, Hadoop version 0.20.2.

I'd be grateful for any insight.

Many Thanks,
Charaka
