"....
public class MYHBaseLoader extends
Mapper<*NullWritable*,BytesWritable,NullWritable,Put> {
protected void map (*LongWritable* key, BytesWritable value, Context
context) throws IOException, InterruptedException {
..."
Why the difference in the types of the keys?
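To make the pitfall concrete, here is a minimal, self-contained sketch (plain Java with hypothetical stand-in classes, not the real Hadoop API) of why a parameter type that differs from the class's generic declaration means the method never runs: it overloads rather than overrides, so the superclass's default (identity) implementation is what actually executes.

```java
// Hypothetical stand-in for a generic base class like Hadoop's Mapper.
class BaseMapper<K> {
    // Stands in for Mapper.map()'s default identity behavior.
    String map(K key) { return "identity map ran"; }
}

class MyLoader extends BaseMapper<String> {
    // Parameter type Integer != declared key type String:
    // this is a new OVERLOAD, not an override, so it is never called
    // through a BaseMapper<String> reference.
    String map(Integer key) { return "custom map ran"; }
}

public class OverrideDemo {
    public static void main(String[] args) {
        BaseMapper<String> m = new MyLoader();
        System.out.println(m.map("key"));  // prints "identity map ran"
    }
}
```

Annotating the intended method with `@Override` would turn this into a compile-time error instead of a silent fallback. In the posted code, the map() parameter types need to match the class's declared generics; otherwise Hadoop's default identity map passes the raw BytesWritable through, which matches the "expected Put, received BytesWritable" error below.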
Regards,
Shahab
On Thu, Aug 29, 2013 at 5:46 AM, praveenesh kumar <[email protected]> wrote:
> Hi all,
>
> I am trying to write an MR job to load an HBase table.
>
> I have a mapper that emits (null,put object) and I am using
> TableMapReduceUtil.initTableReducerJob() to write it into a HBase table.
>
> Following is my code snippet
>
> public class MYHBaseLoader extends
> Mapper<NullWritable,BytesWritable,NullWritable,Put> {
>
> protected void map (LongWritable key, BytesWritable value, Context
> context) throws IOException, InterruptedException {
>
> // Some processing here: create a Put object and add the data to it.
> context.write(null, put); // Emit the Put object.
>
> }
>
> public static void main (String args[]) throws IOException,
> ClassNotFoundException, InterruptedException{
> Configuration conf = new Configuration();
> Job job = new Job(conf);
> job.setJarByClass(MYHBaseLoader.class);
> job.setMapperClass(MYHBaseLoader.class);
>
>
> TableMapReduceUtil.initTableReducerJob(MY_IMPORT_TABLE_NAME,IdentityTableReducer.class,job);
> job.setMapOutputKeyClass(NullWritable.class);
> job.setMapOutputValueClass(Put.class);
> job.setInputFormatClass(SequenceFileInputFormat.class);
> //job.setNumReduceTasks(0);
>
> FileInputFormat.setInputPaths(job, new Path("test"));
> Path outputPath = new Path("test_output");
> FileOutputFormat.setOutputPath(job,outputPath);
>
> //outputPath.getFileSystem(conf).delete(outputPath, true);
>
> job.waitForCompletion(true);
> System.out.println("Done");
> }
> }
>
>
> I am getting the following error while running. Any help/guidance would be appreciated:
>
>
> java.io.IOException: Type mismatch in value from map: expected
> org.apache.hadoop.hbase.client.Put, recieved org.apache.hadoop.io.BytesWritable
>     at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1023)
>     at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:689)
>     at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>     at org.apache.hadoop.mapreduce.Mapper.map(Mapper.java:124)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:363)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
>     at org.apache.hadoop.mapred.Child.main(Child.java:249)
>
>
> Regards
> Praveenesh
>