I have two input format classes set up, each producing different key/value pairs. Is it possible to configure multiple map and reduce classes in one job based on those different key/value pairs? If I overload the map() method, does the framework call the overloads polymorphically based on the varying parameters (the key and the value), or do we need separate classes?

For adding the multiple mappers I am thinking of using:

MultipleInputs.addInputPath(JobConf conf, Path path, Class<? extends InputFormat> inputFormatClass, Class<? extends Mapper> mapperClass)

to register each mapper with its input format, and then the MultipleOutputs class to configure the output from the mappers. If this is right, where do I add the multiple reducer implementations in the JobConf?
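The overload part of the question can be checked in plain Java, independent of Hadoop: overload resolution happens at compile time against the declared parameter types, so a framework that invokes map() through the Mapper interface will only ever call the one signature that interface declares; extra overloads are never dispatched to at runtime. A minimal sketch (the class and method names here are hypothetical, chosen just for illustration):

```java
// Demonstrates that Java selects an overload at compile time based on the
// declared (static) type of the argument, not the argument's runtime type.
public class OverloadDemo {

    static String handle(Object o) { return "Object overload"; }

    static String handle(String s) { return "String overload"; }

    public static void main(String[] args) {
        Object x = "hello";                  // runtime type String, declared type Object
        System.out.println(handle(x));       // Object overload -- picked at compile time
        System.out.println(handle("hello")); // String overload
    }
}
```

By the same reasoning, two input formats emitting different key/value types would presumably need two separate Mapper implementations, each registered with its own MultipleInputs.addInputPath call, rather than one mapper class with overloaded map() methods.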
--
View this message in context: http://www.nabble.com/Polymorphic-behavior-of-Maps-in-One-Job--tp22907228p22907228.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.