Reading the docs at http://wiki.apache.org/hadoop/HadoopMapReduce, map operation outputs are written to files on HDFS scattered across multiple nodes, and judging by that there should not be any such limit imposed by the framework.
As for the question about degrading performance, isn't handling that scale what Hadoop was built for in the first place? The reducer only needs to stream through the values it is handed, as in the sketch after the quoted message below.

- Ashwanth

On Thu, Jan 19, 2012 at 12:30 PM, Ajit Ratnaparkhi <ajit.ratnapar...@gmail.com> wrote:
> Hi,
>
> I have a question regarding reduce functionality.
>
> A reduce function receives a key and a list of values as arguments. Is there
> any limit on the count of elements in the value list which is received as an
> argument?
> Can there be millions of elements in the value list? Will it degrade
> performance somehow? Does it keep all value elements of the list in memory?
>
> thanks,
> Ajit.
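A minimal sketch, assuming Hadoop's new org.apache.hadoop.mapreduce API (the class name and value type here are just illustrative, not from the thread): the values arrive as an Iterable that is streamed from the sorted map output, so a reducer can walk through millions of values while keeping only a running aggregate in memory.

    import java.io.IOException;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Reducer;

    // Hypothetical reducer that sums counts per key.
    public class CountingReducer
            extends Reducer<Text, LongWritable, Text, LongWritable> {

        @Override
        protected void reduce(Text key, Iterable<LongWritable> values, Context context)
                throws IOException, InterruptedException {
            long count = 0;
            // Iterate once over the value list; even if it holds millions of
            // elements, only the running total is kept in memory here, not
            // the whole list.
            for (LongWritable value : values) {
                count += value.get();
            }
            context.write(key, new LongWritable(count));
        }
    }

The point of the sketch: as long as your reduce logic works in a single pass like this, a very long value list costs time and I/O but not memory. It is only if you buffer the values yourself (e.g. copy them into a java.util.List) that a huge group can blow up the reducer's heap.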