Hi, wow, thank you Liang!

Sent from my iPhone
On 2013-4-2, at 17:25, Yanbo Liang <[email protected]> wrote:

> You set the wrong parameter: NodeReducer.class should be a subclass of
> Mapper rather than Reducer.
>
>
> 2013/4/2 YouPeng Yang <[email protected]>
>> Hi guys,
>>
>> I want to use org.apache.hadoop.mapreduce.lib.input.MultipleInputs.
>>
>> However, it produces a compile error in my Eclipse (Indigo):
>>
>> public static void main(String[] args) throws IOException,
>>     InterruptedException, ClassNotFoundException {
>>   Configuration conf = new Configuration();
>>   String[] otheArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
>>   if (otheArgs.length != 2) {
>>     System.err.println("Usage:aaaa");
>>     System.exit(2);
>>   }
>>   Job job = new Job(conf, "Data test2");
>>
>>   job.setReducerClass(CPUReducer.class);
>>   job.setOutputKeyClass(Text.class);
>>   job.setOutputValueClass(Text.class);
>>   MultipleInputs.addInputPath(job, new Path(otheArgs[0]),
>>       TextInputFormat.class, CPUMapper.class);
>>   MultipleInputs.addInputPath(job, new Path(otheArgs[1]),
>>       TextInputFormat.class, NodeReducer.class);
>>   ----> The method addInputPath(Job, Path, Class<? extends InputFormat>,
>>         Class<? extends Mapper>) in the type MultipleInputs is not applicable
>>         for the arguments (Job, Path, Class<TextInputFormat>, Class<NodeReducer>)
>>
>>   FileOutputFormat.setOutputPath(job, new Path(otheArgs[2]));
>>   System.exit(job.waitForCompletion(true) ? 0 : 1);
>> }
>>
>> My questions:
>> 1. Does org.apache.hadoop.mapreduce.lib.input.MultipleInputs belong to
>>    the new Hadoop API? Why does this error come up; did I miss something?
>> 2. I also found that the constructor Job(Configuration, String) is
>>    deprecated when I declare Job job = new Job(conf, "Data test2").
>>    According to the Hadoop Definitive Guide, the Job API is the new
>>    Hadoop API, so why is it deprecated?
>>
>> I use Hadoop 2.0.0 with CDH 4.1.2.
>>
>> Please help me.
>>
>> Thank you.
>>
>>
>>
>> Regards
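For reference, here is a minimal sketch of what the corrected driver could look like on Hadoop 2.x. The DataTest2Driver and NodeMapper names are hypothetical stand-ins (the second input path needs some Mapper subclass in place of NodeReducer); CPUMapper and CPUReducer are kept from the question but stubbed out so the sketch compiles on its own. It also uses Job.getInstance(conf, ...) instead of the deprecated Job(Configuration, String) constructor and checks for three arguments, since the code writes its output to the third one.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.MultipleInputs;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class DataTest2Driver {

    // Hypothetical stubs so the sketch compiles on its own; the real parsing
    // logic belongs in the original CPUMapper, and NodeMapper stands in for
    // whatever Mapper should handle the second input path.
    public static class CPUMapper extends Mapper<LongWritable, Text, Text, Text> {
        @Override
        protected void map(LongWritable key, Text value, Context ctx) { /* omitted */ }
    }
    public static class NodeMapper extends Mapper<LongWritable, Text, Text, Text> {
        @Override
        protected void map(LongWritable key, Text value, Context ctx) { /* omitted */ }
    }
    public static class CPUReducer extends Reducer<Text, Text, Text, Text> { }

    public static void main(String[] args) throws IOException,
            InterruptedException, ClassNotFoundException {
        Configuration conf = new Configuration();
        String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
        // Three arguments: two input paths plus one output path.
        if (otherArgs.length != 3) {
            System.err.println("Usage: DataTest2Driver <cpu input> <node input> <output>");
            System.exit(2);
        }

        // Job.getInstance(...) replaces the deprecated Job(Configuration, String) constructor.
        Job job = Job.getInstance(conf, "Data test2");
        job.setJarByClass(DataTest2Driver.class);

        job.setReducerClass(CPUReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);

        // The fourth argument must be a Mapper subclass; passing
        // NodeReducer.class (a Reducer) is what triggered the compile error.
        MultipleInputs.addInputPath(job, new Path(otherArgs[0]),
                TextInputFormat.class, CPUMapper.class);
        MultipleInputs.addInputPath(job, new Path(otherArgs[1]),
                TextInputFormat.class, NodeMapper.class);

        FileOutputFormat.setOutputPath(job, new Path(otherArgs[2]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

On question 1: org.apache.hadoop.mapreduce.lib.input.MultipleInputs is indeed the new-API version (the old-API one lives under org.apache.hadoop.mapred.lib), so the class itself is fine; only the fourth argument needs to be a Mapper subclass.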
