Exactly, I had the same thought as Ashwanth too; that is why I asked whether the @Override annotation is being used or not.
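Because the map() parameters (LongWritable key) do not match the input key type declared on the class (NullWritable), the method is an overload rather than an override, so the framework calls the inherited identity map() and passes the BytesWritable straight through. With @Override the compiler would have caught this. As a minimal sketch (assuming the SequenceFile keys really are LongWritable; the actual Put construction is elided, and the row key shown here is a placeholder):

import java.io.IOException;
import java.util.Arrays;

import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.mapreduce.Mapper;

public class MYHBaseLoader extends
        Mapper<LongWritable, BytesWritable, NullWritable, Put> {

    @Override
    protected void map(LongWritable key, BytesWritable value, Context context)
            throws IOException, InterruptedException {
        // Placeholder row key: the raw record bytes, trimmed to the valid
        // length (getBytes() can return a padded buffer). Substitute however
        // you actually derive the row key.
        byte[] row = Arrays.copyOf(value.getBytes(), value.getLength());
        Put put = new Put(row);
        // ... add columns to the Put as in your existing processing ...
        context.write(NullWritable.get(), put);
    }
}

Note also NullWritable.get() rather than a literal null for the output key; the map output collector cannot serialize a null key.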
Regards,
Shahab

On Thu, Aug 29, 2013 at 1:09 PM, Ashwanth Kumar <[email protected]> wrote:

> Hey Praveenesh, I am not sure if this would help.
>
> But can you try moving your mapper to an inner class / separate class and
> try the code? I somehow get a feeling that the default Mapper
> (IdentityMapper) is being used (maybe you can check the
> mapreduce.map.class value?), that would be the only reason why your value
> (BytesWritable) gets emitted out in context.write().
>
>
> On Thu, Aug 29, 2013 at 3:16 PM, praveenesh kumar <[email protected]> wrote:
>
> > Hi all,
> >
> > I am trying to write a MR code to load an HBase table.
> >
> > I have a mapper that emits (null, put object) and I am using
> > TableMapReduceUtil.initTableReducerJob() to write it into an HBase table.
> >
> > Following is my code snippet:
> >
> > public class MYHBaseLoader extends
> >         Mapper<NullWritable, BytesWritable, NullWritable, Put> {
> >
> >     protected void map(LongWritable key, BytesWritable value,
> >             Context context) throws IOException, InterruptedException {
> >
> >         // Some processing here: create Put object and push data into it.
> >         context.write(null, put); // Pushing the put object.
> >     }
> >
> >     public static void main(String args[]) throws IOException,
> >             ClassNotFoundException, InterruptedException {
> >         Configuration conf = new Configuration();
> >         Job job = new Job(conf);
> >         job.setJarByClass(MYHBaseLoader.class);
> >         job.setMapperClass(MYHBaseLoader.class);
> >
> >         TableMapReduceUtil.initTableReducerJob(MY_IMPORT_TABLE_NAME,
> >                 IdentityTableReducer.class, job);
> >         job.setMapOutputKeyClass(NullWritable.class);
> >         job.setMapOutputValueClass(Put.class);
> >         job.setInputFormatClass(SequenceFileInputFormat.class);
> >         // job.setNumReduceTasks(0);
> >
> >         FileInputFormat.setInputPaths(job, new Path("test"));
> >         Path outputPath = new Path("test_output");
> >         FileOutputFormat.setOutputPath(job, outputPath);
> >
> >         // outputPath.getFileSystem(conf).delete(outputPath, true);
> >
> >         job.waitForCompletion(true);
> >         System.out.println("Done");
> >     }
> > }
> >
> > I am getting the following error while running. Any help/guidance:
> >
> > java.io.IOException: Type mismatch in value from map: expected
> > org.apache.hadoop.hbase.client.Put, recieved
> > org.apache.hadoop.io.BytesWritable
> >     at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1023)
> >     at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:689)
> >     at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
> >     at org.apache.hadoop.mapreduce.Mapper.map(Mapper.java:124)
> >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
> >     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
> >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:363)
> >     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> >     at java.security.AccessController.doPrivileged(Native Method)
> >     at javax.security.auth.Subject.doAs(Subject.java:396)
> >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
> >     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> >
> > Regards
> > Praveenesh
>
> --
> Ashwanth Kumar / ashwanthkumar.in
