OK, so it is clear what I have to do: edit my pom.xml to point at CDH 5.1,
which translates into the following.

Add the cloudera repo:

    <repository>
      <id>cloudera</id>
      <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
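(For anyone following along: the repository element goes inside the
<repositories> section of the pom.xml, per the standard Maven POM layout.)

    <repositories>
      <repository>
        <id>cloudera</id>
        <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
      </repository>
    </repositories>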


Then change the Hadoop and HBase versions:

    <!-- Hadoop Versions -->
    <hbase-hadoop1.version>0.98.1-cdh5.1.0</hbase-hadoop1.version>
    <hbase-hadoop2.version>0.98.1-cdh5.1.0</hbase-hadoop2.version>
    <hadoop-one.version>2.3.0-cdh5.1.0</hadoop-one.version>
    <hadoop-two.version>2.3.0-cdh5.1.0</hadoop-two.version>


However, I get the error below when I build, which tells me more complex
POM surgery is required.

[ERROR] Failed to execute goal on project phoenix-core: Could not resolve
dependencies for project
org.apache.phoenix:phoenix-core:jar:4.0.0-incubating: The following
artifacts could not be resolved:
org.apache.hadoop:hadoop-core:jar:2.3.0-cdh5.1.0,
org.apache.hadoop:hadoop-test:jar:2.3.0-cdh5.1.0: Could not find artifact
org.apache.hadoop:hadoop-core:jar:2.3.0-cdh5.1.0 in apache release (
https://repository.apache.org/content/repositories/releases/) -> [Help 1]

Beyond changing the versions, I do not know how to fix this. Can anyone
help?
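For what it's worth, my best guess (unverified): hadoop-core and hadoop-test
only exist as Hadoop 1 / MR1 artifacts, and Cloudera seems to publish its CDH
5 MR1 builds under a separate version string, so the Hadoop 1 property would
need something like the following (the -mr1- version string is my assumption
about the Cloudera repo layout, not something I have confirmed):

    <!-- Hadoop 1 / MR1 coordinates, if that profile must build at all -->
    <hadoop-one.version>2.3.0-mr1-cdh5.1.0</hadoop-one.version>

Or perhaps the Hadoop 1 profile can be skipped entirely, e.g. with
mvn package -Dhadoop.profile=2, if the build supports that flag — can anyone
confirm?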

In general, is it possible to handle different CDH versions in this project?
One shouldn't have to do POM surgery to build Phoenix for such a common
platform.
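The standard Maven pattern for this would be one profile per platform that
just swaps the version properties — something like this sketch (the profile
id is my invention; the properties mirror the ones in the current POM):

    <profile>
      <id>cdh5.1</id>
      <properties>
        <hadoop-two.version>2.3.0-cdh5.1.0</hadoop-two.version>
        <hbase-hadoop2.version>0.98.1-cdh5.1.0</hbase-hadoop2.version>
      </properties>
    </profile>

Then mvn package -Pcdh5.1 would build against CDH without hand-editing the
POM each time.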


On Mon, Aug 18, 2014 at 5:15 PM, Russell Jurney <[email protected]>
wrote:

> When I try to store data into Phoenix from Pig, I get this error. I am on
> CDH 5.1, and Phoenix 4.0.
>
> Anyone know how to resolve this issue?
>
> 2014-08-18 17:11:25,165 INFO 
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader: 
> Current split being processed 
> hdfs://cluster1-srv1.e8.com:8020/e8/prod/web_behavior/anomaly_profile.txt/2014/07/15/00/part-r-00000:0+845
> 2014-08-18 17:11:25,173 INFO org.apache.hadoop.mapred.TaskLogsTruncater: 
> Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1
> 2014-08-18 17:11:25,175 FATAL org.apache.hadoop.mapred.Child: Error running 
> child : java.lang.IncompatibleClassChangeError: Found interface 
> org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
>       at 
> org.apache.phoenix.pig.hadoop.PhoenixOutputFormat.getRecordWriter(PhoenixOutputFormat.java:65)
>       at 
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.getRecordWriter(PigOutputFormat.java:84)
>       at 
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:548)
>       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:653)
>       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
>       at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:415)
>       at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
>       at org.apache.hadoop.mapred.Child.main(Child.java:262)
>
>
>
> --
> Russell Jurney twitter.com/rjurney [email protected] datasyndrome.com
>



-- 
Russell Jurney twitter.com/rjurney [email protected] datasyndrome.com
