There isn't any limit like that. Can you reproduce this consistently? If so, please file a ticket.
It will definitely help if you can provide a test case that reproduces this issue.

Thanks,
+Vinod

On Thu, Jan 10, 2013 at 12:41 AM, Utkarsh Gupta <utkarsh_gu...@infosys.com> wrote:

> Hi,
>
> I am using Apache Hadoop 1.0.4 on a 10-node cluster of commodity machines
> running Ubuntu 12.04 Server edition. I am having an issue with my map reduce
> code. While debugging, I found that the reducer can take 262145 values for a
> particular key; any values beyond that appear to be corrupted. I checked the
> values while emitting from the map and checked them again in the reducer.
>
> I am wondering whether there is any such limitation in Hadoop or whether it
> is a configuration problem.
>
> Thanks and regards,
> Utkarsh Gupta

--
+Vinod
Hortonworks Inc.
http://hortonworks.com/
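One way to build the requested test case is to make every emitted value self-checking, so the reducer can detect corruption mechanically rather than by eyeballing. The sketch below is hypothetical (the class and helper names are mine, and it uses only the JDK rather than the Hadoop API): it encodes each value's index with a CRC32 checksum and verifies it on the other side. In a real reproducer, `encode` would run in the Mapper, emitting all values under a single key, and `verify` would run in the Reducer over the values iterator, counting any value whose checksum no longer matches.

```java
import java.util.zip.CRC32;

// Sketch of a self-checking value scheme for a corruption reproducer:
// each value carries its own checksum so the reduce side can flag damage.
public class ValueIntegrityCheck {

    // Encode an index together with the CRC32 of its payload.
    static String encode(int i) {
        CRC32 crc = new CRC32();
        crc.update(Integer.toString(i).getBytes());
        return i + ":" + crc.getValue();
    }

    // Return the decoded index if the value is intact, or -1 if corrupted.
    static int verify(String value) {
        String[] parts = value.split(":");
        if (parts.length != 2) return -1;
        CRC32 crc = new CRC32();
        crc.update(parts[0].getBytes());
        return Long.parseLong(parts[1]) == crc.getValue()
                ? Integer.parseInt(parts[0]) : -1;
    }

    // Round-trip n values and count the corrupted ones. In a real Hadoop job
    // the encode side lives in the Mapper and the verify side in the Reducer,
    // with the shuffle in between -- which is where the reported corruption
    // past 262145 values per key would show up.
    static int corruptedCount(int n) {
        int bad = 0;
        for (int i = 0; i < n; i++) {
            if (verify(encode(i)) != i) bad++;
        }
        return bad;
    }

    public static void main(String[] args) {
        // 262146 = one more value than the count at which corruption was reported.
        System.out.println("corrupted: " + corruptedCount(262146));
    }
}
```

Run as a Hadoop job with well over 262145 values per key, a nonzero corrupted count on the reduce side would give the developers a concrete, deterministic failure to attach to the ticket.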