Whatever "csatAnalysis.MapClass" the compiler picked up, it appears to not be extending the org.apache.hadoop.mapreduce.Mapper class. From your snippets it appears that you have it all defined properly though. A common issue here has also been that people accidentally import the wrong API (mapred.*) but that doesn't seem to be the case either.
Can you post your full compilable source somewhere? Remove any logic you don't want to share - we'd mostly be interested in the framework definition parts alone.

On Sat, Feb 9, 2013 at 11:27 PM, Ronan Lehane <ronan.leh...@gmail.com> wrote:
> Hi All,
>
> I hope this is the right forum for this type of question, so my apologies if
> not.
>
> I'm looking to write a MapReduce program which is giving me the following
> compilation error:
> The method setMapperClass(Class<? extends Mapper>) in the type Job is not
> applicable for the arguments (Class<csatAnalysis.MapClass>)
>
> The components involved are:
>
> 1. Setting the Mapper
>     // Set the Mapper for the job. Calls MapClass.class
>     job.setMapperClass(MapClass.class);
>
> 2. Setting the InputFormat to TextInputFormat
>     // An InputFormat for plain text files. Files are broken into lines.
>     // Either linefeed or carriage-return are used to signal end of line.
>     // Keys are the position in the file, and values are the line of text.
>     job.setInputFormatClass(TextInputFormat.class);
>
> 3. Taking the data into the mapper and processing it
>     public static class MapClass extends Mapper<LongWritable, Text, Text, VectorWritable> {
>         public void map(LongWritable key, Text value, Context context)
>                 throws IOException, InterruptedException {
>
> Would anyone have any clues as to what would be wrong with the arguments
> being passed to the Mapper?
>
> Any help would be appreciated,
>
> Thanks.

--
Harsh J