> Hello,
>
> I'm currently developing a map/reduce program that emits a fair number
> of maps per input record (around 50-100), and I'm getting OutOfMemory
> errors:
Sorry for the noise; I found out I had to set the mapred.child.java.opts JobConf parameter to "-Xmx512m" to make 512 MB of heap space available in the map task JVMs.

However, I was wondering: are these hard architectural limits? Say I wanted to emit 25,000 maps for a single input record; would that mean I would need a huge amount of (virtual) memory? In other words, what exactly is the reason that increasing the number of emitted maps per input record causes an OutOfMemoryError?
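For anyone hitting the same error, this is roughly what the fix looks like with the old org.apache.hadoop.mapred API; the job name, class name, and input/output paths below are just placeholders, and the mapper/reducer setup is omitted:

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;

    public class EmitHeavyJob {
        public static void main(String[] args) throws Exception {
            JobConf conf = new JobConf(EmitHeavyJob.class);
            conf.setJobName("emit-heavy-job");  // placeholder name

            // Give each child task JVM 512 MB of heap; the default
            // (200 MB at the time) overflows once a mapper emits many
            // records per input record.
            conf.set("mapred.child.java.opts", "-Xmx512m");

            FileInputFormat.setInputPaths(conf, new Path(args[0]));
            FileOutputFormat.setOutputPath(conf, new Path(args[1]));

            JobClient.runJob(conf);
        }
    }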
Regards,

Leon Mergen