Thanks Erik. I haven't made up my mind yet whether to work on the trunk code or to wait for the beta release, but it certainly helps to know that this has already been addressed.
-----Original Message-----
From: Erik Holstad [mailto:[email protected]]
Sent: Tuesday, July 14, 2009 8:08 PM
To: [email protected]
Subject: Re: hbase / hadoop 020 compatability error

Hey Yair!
Yeah, I think there have been some updates since the alpha. I'm looking at trunk and it looks like:

import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.JobContext;
import org.apache.hadoop.mapreduce.OutputCommitter;
import org.apache.hadoop.mapreduce.OutputFormat;
import org.apache.hadoop.mapreduce.RecordWriter;
import org.apache.hadoop.mapreduce.TaskAttemptContext;

/**
 * Convert Map/Reduce output and write it to an HBase table. The KEY is ignored
 * while the output value <u>must</u> be either a {@link Put} or a
 * {@link Delete} instance.
 *
 * @param <KEY> The type of the key. Ignored in this class.
 */
public class TableOutputFormat<KEY> extends OutputFormat<KEY, Writable> {

Regards Erik
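For anyone following along: with the trunk class above, the reducer's output value must be a Put or Delete, and the output key is ignored. A rough sketch of wiring this into a 0.20-style job follows; it is not from the thread, needs the HBase and Hadoop jars on the classpath, and the table name and column names ("example_table", "cf", "count") are made-up placeholders:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableOutputFormat;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Reducer;

public class ExampleLoad {

  // The output value is a Put; TableOutputFormat ignores the key.
  static class ExampleReducer
      extends Reducer<Text, LongWritable, ImmutableBytesWritable, Writable> {
    @Override
    protected void reduce(Text key, Iterable<LongWritable> values, Context ctx)
        throws IOException, InterruptedException {
      long sum = 0;
      for (LongWritable v : values) sum += v.get();
      Put put = new Put(Bytes.toBytes(key.toString()));
      // "cf" / "count" are hypothetical family/qualifier names.
      put.add(Bytes.toBytes("cf"), Bytes.toBytes("count"), Bytes.toBytes(sum));
      ctx.write(new ImmutableBytesWritable(put.getRow()), put);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new HBaseConfiguration();
    // Tell TableOutputFormat which table to write to.
    conf.set(TableOutputFormat.OUTPUT_TABLE, "example_table");
    Job job = new Job(conf, "example-load");
    job.setJarByClass(ExampleLoad.class);
    job.setReducerClass(ExampleReducer.class);
    job.setOutputFormatClass(TableOutputFormat.class);
    // ... input format / mapper setup elided ...
    job.waitForCompletion(true);
  }
}
```

The TableMapReduceUtil helper in the same org.apache.hadoop.hbase.mapreduce package can wrap most of this boilerplate, if it is available in the build you are on.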
