Hi Ted,
I modified the Job class as below, and now it's working.
Job class:
public int run(String[] args) throws Exception {
    System.out.println(" Running with on tables " + args[1] + " and " + args[2]
        + " with zk " + args[3]);
    Configuration hbaseConf = HBaseConfiguration.create(getConf());
    // hbaseConf.set(Constants.HBASE_ZOOKEEPER_QUORUM_PROP, Constants.HBASE_OS_CL1_QUORUM);
    hbaseConf.set(Constants.HBASE_ZOOKEEPER_QUORUM_PROP, args[3]);
    // The change: set the table-name properties on hbaseConf, the conf the Job is built from
    hbaseConf.set("HBASE_DEST_CI_TABLE", args[1]);
    hbaseConf.set("HBASE_CI_LOOKUP_TABLE", args[2]);
    Job job = new Job(hbaseConf);
    job.setJarByClass(MultiTableTestJob.class);
    job.setInputFormatClass(TextInputFormat.class);
    job.setMapperClass(MultiTableTestMapper.class);
    job.setMapOutputKeyClass(Text.class);
    job.setMapOutputValueClass(Text.class);
    job.setReducerClass(MultiTableTestReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(Text.class);
    FileInputFormat.setInputPaths(job, new Path(args[0]));
    job.setOutputFormatClass(MultiTableOutputFormat.class);
    TableMapReduceUtil.addDependencyJars(job);
    TableMapReduceUtil.addDependencyJars(job.getConfiguration());
    return job.waitForCompletion(true) ? 0 : -1;
}

public static void main(String[] args) throws Exception {
    Configuration configuration = new Configuration();
    // No longer set here; these moved into run() on hbaseConf:
    // configuration.set("HBASE_DEST_TABLE", args[1]);
    // configuration.set("HBASE_LOOKUP_TABLE", args[2]);
    ToolRunner.run(configuration, new CISuperSessionJob(), args);
}
*When I set the properties on the Configuration object in main(), the
reducer reads wrong values from the context. But when I set them on
hbaseConf, it works fine. Why is that?*
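
My guess, judging from the earlier "TableNotFoundException:
mapred.reduce.tasks=100" and assuming the job was launched with generic
options such as "-D mapred.reduce.tasks=100": in main() the args array
still contains those generic options, so args[1] is not yet the table
name. ToolRunner strips them before calling run(). A minimal sketch of
what ToolRunner does internally (the class name ArgsDemo is just for
illustration):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.util.GenericOptionsParser;

public class ArgsDemo {
    public static void main(String[] args) throws Exception {
        // Assumed launch command, e.g.:
        //   hadoop jar test.jar MultiTableTestJob -D mapred.reduce.tasks=100 \
        //       /input/path ci_history ci_lookup 10.9.208.71
        // Here args[0] is "-D" and args[1] is "mapred.reduce.tasks=100",
        // so configuration.set("HBASE_DEST_TABLE", args[1]) in main()
        // would store the generic option instead of the table name.
        Configuration conf = new Configuration();
        GenericOptionsParser parser = new GenericOptionsParser(conf, args);
        // ToolRunner.run() hands the Tool only the remaining args,
        // with the generic options already consumed:
        String[] toolArgs = parser.getRemainingArgs();
        System.out.println(toolArgs[1]); // "ci_history" under the launch above
    }
}

So inside run() args[1] is the real table name again, which would explain
why setting the properties on hbaseConf in run() works while setting them
in main() did not. Is that the right explanation?
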
Thanks & Regards,
B Anil Kumar.
On Tue, Jan 7, 2014 at 9:52 AM, AnilKumar B <[email protected]> wrote:
> Hi Ted,
>
> It seems to be some issue in my code itself; I will retry and update you.
>
> Thanks & Regards,
> B Anil Kumar.
>
>
> On Tue, Jan 7, 2014 at 9:35 AM, AnilKumar B <[email protected]> wrote:
>
>> Hi Ted,
>>
>> System.out.println( " Running with on tables "+args[1]+ " and "+args[2]
>>     + " with zk "+args[3]);
>>
>> What was the output from the above?
>> >> Running with on tables ci_history and ci_lookup with zk 10.9.208.71
>>
>> The tables exist in HBase:
>> hbase(main):001:0> list
>> TABLE
>> ci_history
>> ci_lookup
>>
>> Thanks & Regards,
>> B Anil Kumar.
>>
>>
>> On Tue, Jan 7, 2014 at 9:07 AM, Ted Yu <[email protected]> wrote:
>>
>>> System.out.println( " Running with on tables "+args[1]+ " and "+args[2]
>>>     + " with zk "+args[3]);
>>>
>>> What was the output from the above?
>>>
>>> I would expect a call similar to the following in your run() method -
>>> this comes from TestTableMapReduce.java:
>>>
>>> TableMapReduceUtil.initTableReducerJob(
>>> Bytes.toString(table.getTableName()),
>>> IdentityTableReducer.class, job);
>>>
>>>
>>> On Mon, Jan 6, 2014 at 7:12 PM, AnilKumar B <[email protected]>
>>> wrote:
>>>
>>> > Hi,
>>> >
>>> > In my MR job, I need to write output into multiple tables, so I am
>>> > using MultiTableOutputFormat as below. But I am getting
>>> > TableNotFoundException.
>>> >
>>> > I am attaching a code snippet below. Is this the correct way to use
>>> > MultiTableOutputFormat?
>>> >
>>> >
>>> > Job class:
>>> > public int run(String[] args) throws Exception {
>>> > System.out.println( " Running with on tables "+args[1]+ " and "+args[2]
>>> >     + " with zk "+args[3]);
>>> > Configuration hbaseConf = HBaseConfiguration.create(getConf());
>>> > // hbaseConf.set(Constants.HBASE_ZOOKEEPER_QUORUM_PROP,
>>> > //     Constants.HBASE_OS_CL1_QUORUM);
>>> > hbaseConf.set(Constants.HBASE_ZOOKEEPER_QUORUM_PROP, args[3]);
>>> > Job job = new Job(hbaseConf);
>>> > job.setJarByClass(MultiTableTestJob.class);
>>> > job.setInputFormatClass(TextInputFormat.class);
>>> > job.setMapperClass(MultiTableTestMapper.class);
>>> > job.setMapOutputKeyClass(Text.class);
>>> > job.setMapOutputValueClass(Text.class);
>>> > job.setReducerClass(MultiTableTestReducer.class);
>>> > job.setOutputKeyClass(Text.class);
>>> > job.setOutputValueClass(Text.class);
>>> > FileInputFormat.setInputPaths(job, new Path(args[0]));
>>> > job.setOutputFormatClass(MultiTableOutputFormat.class);
>>> > TableMapReduceUtil.addDependencyJars(job);
>>> > TableMapReduceUtil.addDependencyJars(job.getConfiguration());
>>> > return job.waitForCompletion(true) == true ? 0 : -1;
>>> > }
>>> > public static void main(String[] args) throws Exception{
>>> > Configuration configuration = new Configuration();
>>> > configuration.set("HBASE_DEST_TABLE", args[1]);
>>> > configuration.set("HBASE_LOOKUP_TABLE", args[2]);
>>> > ToolRunner.run(configuration, new CISuperSessionJob(), args);
>>> > }
>>> >
>>> > Reducer Class:
>>> >
>>> > private ImmutableBytesWritable tbl1;
>>> > private ImmutableBytesWritable tbl2;
>>> >
>>> > protected void setup(Context context) throws IOException,
>>> > InterruptedException {
>>> >     Configuration c = context.getConfiguration();
>>> >     tbl1 = new ImmutableBytesWritable(Bytes.toBytes(c.get("HBASE_DEST_TABLE")));
>>> >     tbl2 = new ImmutableBytesWritable(Bytes.toBytes(c.get("HBASE_LOOKUP_TABLE")));
>>> > }
>>> >
>>> > protected void reduce(Text key, java.lang.Iterable<Text> values,
>>> >     Context context) throws IOException, InterruptedException {
>>> > //
>>> > if (some condition) {
>>> > Put put = getSessionPut(key, vc);
>>> > if (put != null) {
>>> > context.write(tbl1, put);
>>> > }
>>> > } else {
>>> > //
>>> > Put put = getEventPut(key, vc);
>>> > context.write(tbl2, put);
>>> > }
>>> > }
>>> > }
>>> >
>>> >
>>> > Exception:
>>> > org.apache.hadoop.hbase.TableNotFoundException: mapred.reduce.tasks=100
>>> >     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:999)
>>> >     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:864)
>>> >     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:821)
>>> >     at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:234)
>>> >     at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:174)
>>> >     at org.apache.hadoop.hbase.mapreduce.MultiTableOutputFormat$MultiTableRecordWriter.getTable(MultiTableOutputFormat.java:101)
>>> >     at org.apache.hadoop.hbase.mapreduce.MultiTableOutputFormat$MultiTableRecordWriter.write(MultiTableOutputFormat.java:127)
>>> >     at org.apache.hadoop.hbase.mapreduce.MultiTableOutputFormat$MultiTableRecordWriter.write(MultiTableOutputFormat.java:68)
>>> >     at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:586)
>>> >     at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>>> >     at ...
>>> >
>>> > Thanks & Regards,
>>> > B Anil Kumar.
>>> >