I am using CDH 4.4, with HBase hbase-0.94.6+132 and pig-0.11.0+33. My
Hadoop client lib is hadoop-2.0.0+1475.

So it looks like my Pig is MR2, but Phoenix is expecting MR1?

I'm not really sure how to go about resolving this. CDH is a bit of a
black box: I don't know whether their Pig is built against MR1 or MR2,
and I don't have the source to recompile it.
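
One quick check I can think of (a sketch; the class name is mine, and it
assumes I can run it with the same classpath my Pig tasks see, e.g.
java -cp "$(hadoop classpath):." TacCheck):

public class TacCheck {
    public static void main(String[] args) {
        // In Hadoop 1.x TaskAttemptContext is a class; in 2.x it is an
        // interface, which is exactly the mismatch the error below
        // complains about.
        Class<?> c = org.apache.hadoop.mapreduce.TaskAttemptContext.class;
        System.out.println(c.getName()
                + (c.isInterface() ? ": interface (Hadoop 2.x / MR2)"
                                   : ": class (Hadoop 1.x / MR1)"));
    }
}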

On Tue, Feb 11, 2014 at 11:12 PM, Prashant Kommireddi <[email protected]> wrote:

> Yup, that seems like a classpath issue. Also, make sure to compile Pig
> against the correct Hadoop version if you are using the fat jar.
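>
> For Pig 0.11 that would be something along the lines of
> "ant clean jar -Dhadoopversion=23" to build against the Hadoop 2 APIs
> instead of the default Hadoop 1 profile. Treat the exact flag as an
> assumption and check Pig's build.xml.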
>
>
> On Tue, Feb 11, 2014 at 9:05 PM, Skanda <[email protected]> wrote:
>
>> Hi Russell,
>>
>> Which version of HBase and Hadoop are you using? The reason for this
>> issue is that TaskAttemptContext is an interface in Hadoop 2.x but is a
>> class in Hadoop 1.x.
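>>
>> The same source compiles against both, but javac emits invokevirtual
>> for a call on a class and invokeinterface for a call on an interface,
>> so a jar built against 1.x blows up at runtime on 2.x rather than at
>> compile time. A minimal illustration (hypothetical class, just to show
>> the kind of call that breaks):
>>
>> import org.apache.hadoop.conf.Configuration;
>> import org.apache.hadoop.mapreduce.TaskAttemptContext;
>>
>> public class Example {
>>     // Compiles fine against Hadoop 1.x and 2.x alike, but the bytecode
>>     // for this call differs by which Hadoop is on the compile classpath;
>>     // running the 1.x build against 2.x jars then throws
>>     // IncompatibleClassChangeError, as in the stack trace below.
>>     static Configuration conf(TaskAttemptContext ctx) {
>>         return ctx.getConfiguration();
>>     }
>> }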
>>
>> Regards,
>> Skanda
>>
>>
>> On Wed, Feb 12, 2014 at 10:06 AM, James Taylor <[email protected]> wrote:
>>
>>> This is beyond my knowledge of Pig, but Prashant may know as he
>>> contributed our Pig integration.
>>>
>>> Thanks,
>>> James
>>>
>>>
>>> On Tue, Feb 11, 2014 at 4:34 PM, Russell Jurney <[email protected]> wrote:
>>>
>>>> I am trying to store data into this table:
>>>>
>>>> CREATE TABLE IF NOT EXISTS BEACONING_ACTIVITY (
>>>>     EVENT_TIME VARCHAR NOT NULL,
>>>>     C_IP VARCHAR NOT NULL,
>>>>     CS_HOST VARCHAR NOT NULL,
>>>>     SLD VARCHAR NOT NULL,
>>>>     CONFIDENCE DOUBLE NOT NULL,
>>>>     RISK DOUBLE NOT NULL,
>>>>     ANOMALY DOUBLE NOT NULL,
>>>>     INTERVAL DOUBLE NOT NULL,
>>>>     CONSTRAINT PK PRIMARY KEY (EVENT_TIME, C_IP, CS_HOST)
>>>> );
>>>>
>>>>
>>>> Using this Pig:
>>>>
>>>> hosts_and_risks = FOREACH hosts_and_anomaly GENERATE
>>>>     hour, c_ip, cs_host, sld, confidence,
>>>>     (confidence * anomaly) AS risk:double, anomaly, interval;
>>>> -- hosts_and_risks = ORDER hosts_and_risks BY risk DESC;
>>>> -- STORE hosts_and_risks INTO '/tmp/beacons.txt';
>>>> STORE hosts_and_risks INTO 'hbase://BEACONING_ACTIVITY'
>>>>     USING com.salesforce.phoenix.pig.PhoenixHBaseStorage('hiveapp1', '-batchSize 5000');
>>>>
>>>> And the most helpful error message I get is this:
>>>>
>>>> 2014-02-11 16:24:13,831 FATAL org.apache.hadoop.mapred.Child: Error running child : java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
>>>>     at com.salesforce.phoenix.pig.hadoop.PhoenixOutputFormat.getRecordWriter(PhoenixOutputFormat.java:75)
>>>>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.getRecordWriter(PigOutputFormat.java:84)
>>>>     at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:597)
>>>>     at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:444)
>>>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
>>>>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
>>>>
>>>>
>>>> What am I to do?
>>>>
>>>>
>>>> --
>>>> Russell Jurney twitter.com/rjurney [email protected]
>>>> datasyndrome.com
>>>>
>>>
>>>
>>
>


-- 
Russell Jurney twitter.com/rjurney [email protected] datasyndrome.com
