Hi Durga Prasad,
    I don't see anything suspicious in the code above. Is it possible for
you to give a new path entry when configuring the PhoenixInputFormat?

   MultipleInputs.addInputPath(job, new Path(args[1]),
       PhoenixInputFormat.class, PhoenixMapper.class);
   MultipleInputs.addInputPath(job, new Path(args[0]),
       TextInputFormat.class, simpleMapper.class);

Can you also share the Phoenix version you are using, so I can try to
reproduce this locally?
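For reference, here is a minimal sketch of the setup I have in mind. The mapper and writable class names mirror your snippet (they are placeholders, not a tested build), and I'm assuming the Phoenix MapReduce API as of 4.x, so treat this as an outline rather than a definitive implementation:

```java
// Sketch only: assumes the Phoenix MR classes (PhoenixMapReduceUtil,
// PhoenixInputFormat) and placeholder mapper/writable names from your code.
Configuration conf = HBaseConfiguration.create();
Job job = Job.getInstance(conf, "phoenix-multi-input");

// Phoenix side: configure the query, then register the Phoenix mapper
// against its own distinct path entry, since MultipleInputs keys the
// mapper lookup by path and a duplicate path would clash.
PhoenixMapReduceUtil.setInput(job, Table1Writable.class,
    "schema1.Table1", "SELECT * FROM Table1");
MultipleInputs.addInputPath(job, new Path(args[1]),
    PhoenixInputFormat.class, PhoenixMapper.class);

// HDFS side: the plain-text mapper reads the real file path.
MultipleInputs.addInputPath(job, new Path(args[0]),
    TextInputFormat.class, simpleMapper.class);
```

The key point is that each addInputPath call needs a distinct Path, even though PhoenixInputFormat ignores the path and pulls its splits from the configured query.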

Regards
Ravi


On Wed, Jun 3, 2015 at 8:32 PM, James Taylor <[email protected]> wrote:

> Wrong group. The Google group is no longer active. Please use the Apache
> dev mailing list: http://phoenix.apache.org/mailing_list.html
>
>
> On Wednesday, June 3, 2015, Ns G <[email protected]> wrote:
>
>> ++ group
>> ---------- Forwarded message ----------
>> From: Ns G <[email protected]>
>> Date: Thu, Jun 4, 2015 at 8:56 AM
>> Subject: Re: REG: Phoenix MR issue
>> To: Ravi Kiran <[email protected]>
>>
>>
>> Hi Ravi,
>>
>> Thanks for taking the time. Below is my job setup code. I now use the
>> reducer's setup method to read the file.
>>
>> I am giving only a part of the code due to access restrictions.
>>
>>   final String selectQuery = "SELECT * FROM  Table1 ";
>>         MultipleInputs.addInputPath(job, new Path(args[0]),
>> TextInputFormat.class, Mapper1.class);
>>         System.out.println("inmain" + args[0]);
>>
>>         PhoenixMapReduceUtil.setInput(job, Table1Writable.class,
>> "schema1.Table1",  selectQuery);
>>
>>         MultipleInputs.addInputPath(job, null, PhoenixInputFormat.class,
>> PhoenixMapper.class);  // if I give null, it throws a NullPointerException,
>> so I gave args[0]
>>
>>         MultipleInputs.addInputPath(job, new Path(args[0]),
>> TextInputFormat.class, simpleMapper.class);
>>
>>
>>
>>         job.setReducerClass(NDMReducer.class);
>>         job.setOutputFormatClass(PhoenixOutputFormat.class);
>>         job.setJarByClass(com.nielsen.ndm.NDMMain.class);
>>         job.setMapOutputKeyClass(Text.class);
>>         job.setMapOutputValueClass(Text.class);
>>         job.setOutputKeyClass(NullWritable.class);
>>         job.setOutputValueClass(ElementRecogWriteable.class);
>>         TableMapReduceUtil.addDependencyJars(job);
>>         boolean success = job.waitForCompletion(true);
>>         return (success ? 0 : 1);
>>
>> Thanks,
>> Durga Prasad
>>
>> On Wed, Jun 3, 2015 at 11:07 PM, Ravi Kiran <[email protected]>
>> wrote:
>>
>>> Hi Durga Prasad,
>>>
>>> Can you please share the code you have written for creating and
>>> configuring the Job instance? I am assuming you used MultipleInputs;
>>> however, I would like to see what is written so I can help you.
>>>
>>> Regards
>>> Ravi
>>>
>>> On Wed, Jun 3, 2015 at 9:17 AM, Satya <[email protected]> wrote:
>>>
>>>> Hi Friends,
>>>>
>>>> I am trying to use Hadoop MR with Phoenix. I created two mapper
>>>> classes: one to read a Phoenix table and one to read an HDFS file.
>>>> Reading the Phoenix table works, but the mapper which reads the file
>>>> doesn't get invoked. When I comment out the Phoenix mapper class, the
>>>> file is read, so that code works. The connection to the table works
>>>> properly. Can anyone shed light on this?
>>>>
>>>> Thanks,
>>>> Durga Prasad
>>>>
>>>> --
>>>> You received this message because you are subscribed to the Google
>>>> Groups "Phoenix HBase User" group.
>>>> To unsubscribe from this group and stop receiving emails from it, send
>>>> an email to [email protected].
>>>> For more options, visit https://groups.google.com/d/optout.
>>>>
>>>
>>>
>>
>>
>
