Hi Ravi,

Yes, I registered the driver before executing the Pig commands; that was
the first step I did. I didn't understand the second part of your email. As
per my understanding, yes, I first loaded data into a Phoenix table. Then I
am reading a different Phoenix table created by the MR process. Yes, I am
using the ZooKeeper quorum as the 'servername'.
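For what it's worth, here is a minimal sketch of how I understand the LOAD
should look. The jar path and ZooKeeper quorum below are placeholders, not
my actual values:

```
-- Register the Phoenix client jar first (path is a placeholder)
REGISTER /path/to/phoenix-[version]-client.jar;

-- The whole quorum goes in one quoted string; an unquoted value is what
-- produces the "mismatched input ... expecting SEMI_COLON" parse error
FileData = LOAD 'hbase://table/cftable'
    USING org.apache.phoenix.pig.PhoenixHBaseLoader('zk1,zk2,zk3:2181');
```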

Thanks,
Satya
On 05-Jun-2015 9:55 pm, "Ravi Kiran" <[email protected]> wrote:

> Hi Durga Prasad,
>     Assuming you have registered phoenix-[version].jar, used the same
> command above to LOAD data from the Phoenix table 'cftable', and
> 'server_name' is your ZooKeeper quorum, things should be working.
>
> Can you please confirm?
>
> Regards
> Ravi
>
> On Fri, Jun 5, 2015 at 4:53 AM, Ns G <[email protected]> wrote:
>
>> Hi Team,
>>
>> I am trying to connect to a Phoenix database through Pig. I am able to
>> save the data, but when I read the data it fails.
>>
>> I am using version 4.3.1 of the jar supplied by Cloudera.
>>
>> FileData = load 'hbase://table/cftable' USING
>> org.apache.phoenix.pig.PhoenixHBaseLoader('server name');
>>
>>
>> Failed to parse: <line 7, column 118>  mismatched input '*servername*'
>> expecting SEMI_COLON
>>     at
>> org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:241)
>>     at
>> org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:179)
>>     at org.apache.pig.PigServer$Graph.validateQuery(PigServer.java:1660)
>>     at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1633)
>>     at org.apache.pig.PigServer.registerQuery(PigServer.java:587)
>>     at
>> org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1093)
>>     at
>> org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:501)
>>     at
>> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:198)
>>     at
>> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:173)
>>     at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
>>     at org.apache.pig.Main.run(Main.java:541)
>>     at org.apache.pig.Main.main(Main.java:156)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>     at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
>>
>>
>> Can any of you advise how I can read data from Phoenix tables, please?
>>
>>
>> Thanks,
>>
>> Durga Prasad
>> On 05-Jun-2015 4:38 am, "Ravi Kiran" <[email protected]> wrote:
>>
>>> Hi Durga Prasad,
>>>     I don't see anything suspicious in the code above. Is it possible
>>> for you to give a new path entry when configuring the PhoenixInputFormat?
>>>
>>>    MultipleInputs.addInputPath(job, new Path(args[1]),
>>> PhoenixInputFormat.class, PhoenixMapper.class);
>>>    MultipleInputs.addInputPath(job, new Path(args[0]),
>>> TextInputFormat.class, simpleMapper.class);
>>>
>>> Can you also share the Phoenix version you are using, so I can try to
>>> reproduce this locally?
>>>
>>> Regards
>>> Ravi
>>>
>>>
>>> On Wed, Jun 3, 2015 at 8:32 PM, James Taylor <[email protected]>
>>> wrote:
>>>
>>>> Wrong group. The Google group is no longer active. Please use the
>>>> Apache dev mailing list: http://phoenix.apache.org/mailing_list.html
>>>>
>>>>
>>>> On Wednesday, June 3, 2015, Ns G <[email protected]> wrote:
>>>>
>>>>> ++ group
>>>>> ---------- Forwarded message ----------
>>>>> From: Ns G <[email protected]>
>>>>> Date: Thu, Jun 4, 2015 at 8:56 AM
>>>>> Subject: Re: REG: Phoenix MR issue
>>>>> To: Ravi Kiran <[email protected]>
>>>>>
>>>>>
>>>>> Hi Ravi,
>>>>>
>>>>> Thanks for taking the time. Below is my job setup code; I now use the
>>>>> reducer's setup() method to read the file.
>>>>>
>>>>> I am sharing only part of the code due to access restrictions:
>>>>>
>>>>>   final String selectQuery = "SELECT * FROM  Table1 ";
>>>>>         MultipleInputs.addInputPath(job, new Path(args[0]),
>>>>> TextInputFormat.class, Mapper1.class);
>>>>>         System.out.println("inmain" + args[0]);
>>>>>
>>>>>         PhoenixMapReduceUtil.setInput(job, Table1Writable.class,
>>>>> "schema1.Table1",  selectQuery);
>>>>>
>>>>>         MultipleInputs.addInputPath(job, null,
>>>>> PhoenixInputFormat.class, PhoenixMapper.class);  // if I give null, it
>>>>> throws a NullPointerException, so I gave args[0]
>>>>>
>>>>>         MultipleInputs.addInputPath(job, new Path(args[0]),
>>>>> TextInputFormat.class, simpleMapper.class);
>>>>>
>>>>>
>>>>>
>>>>>         job.setReducerClass(NDMReducer.class);
>>>>>         job.setOutputFormatClass(PhoenixOutputFormat.class);
>>>>>         job.setJarByClass(com.nielsen.ndm.NDMMain.class);
>>>>>         job.setMapOutputKeyClass(Text.class);
>>>>>         job.setMapOutputValueClass(Text.class);
>>>>>         job.setOutputKeyClass(NullWritable.class);
>>>>>         job.setOutputValueClass(ElementRecogWriteable.class);
>>>>>         TableMapReduceUtil.addDependencyJars(job);
>>>>>         boolean success = job.waitForCompletion(true);
>>>>>         return (success ? 0 : 1);
>>>>>
>>>>> Thanks,
>>>>> Durga Prasad
>>>>>
>>>>> On Wed, Jun 3, 2015 at 11:07 PM, Ravi Kiran <[email protected]
>>>>> > wrote:
>>>>>
>>>>>> Hi Durga Prasad,
>>>>>>
>>>>>> Can you please share the code you have written for creating and
>>>>>> configuring the Job instance? I assume you used MultipleInputs;
>>>>>> however, I would like to see what is written so I can help you.
>>>>>>
>>>>>> Regards
>>>>>> Ravi
>>>>>>
>>>>>> On Wed, Jun 3, 2015 at 9:17 AM, Satya <[email protected]> wrote:
>>>>>>
>>>>>>> Hi Friends,
>>>>>>>
>>>>>>> I am trying to use Hadoop MR with Phoenix. I created two mapper
>>>>>>> classes, one to read a Phoenix table and one to read an HDFS file.
>>>>>>> Reading the Phoenix table works, but the mapper which reads the file
>>>>>>> doesn't get invoked. When I comment out the Phoenix mapper class, it
>>>>>>> is able to read the files, so that code works. The connection to the
>>>>>>> table works properly. Can anyone shed light on this?
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Durga Prasad
>>>>>>>
>>>>>>> --
>>>>>>> You received this message because you are subscribed to the Google
>>>>>>> Groups "Phoenix HBase User" group.
>>>>>>> To unsubscribe from this group and stop receiving emails from it,
>>>>>>> send an email to [email protected].
>>>>>>> For more options, visit https://groups.google.com/d/optout.
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>
