Yeah, that would mean http://phoenix.apache.org/download.html is wrong. It
claims 0.98 support.


On Tue, Aug 19, 2014 at 11:48 AM, Jesse Yates <[email protected]>
wrote:

> @James Taylor, correct me if I'm wrong, but Phoenix should be backwards
> compatible with older versions of HBase 0.98 - the build issues are
> separate from Phoenix actually being able to run on that cluster
> (compatibility should be handled via reflection).
>
>
> If there are cases where that's not true, that means you should probably
> file a JIRA.
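
A minimal sketch of the reflection technique Jesse is describing - look the newer method up at runtime and fall back when it's missing. The class and method names below are illustrative, not Phoenix's actual compatibility code; the lookup is demonstrated against a JDK class so the sketch is self-contained:

```java
import java.lang.reflect.Method;

public class ReflectionCompat {
    // Hypothetical helper, not Phoenix's actual shim: look up a method that
    // only exists in newer library versions, falling back to an older one.
    static Method findMethod(Class<?> cls, String preferred, String fallback,
                             Class<?>... argTypes) {
        try {
            return cls.getMethod(preferred, argTypes);
        } catch (NoSuchMethodException e) {
            try {
                return cls.getMethod(fallback, argTypes);
            } catch (NoSuchMethodException e2) {
                throw new IllegalStateException("neither method is present", e2);
            }
        }
    }

    public static void main(String[] args) throws Exception {
        // "valueOfInt" does not exist on String, so this falls back to
        // String.valueOf(int), which does.
        Method m = findMethod(String.class, "valueOfInt", "valueOf", int.class);
        System.out.println(m.invoke(null, 42)); // prints 42
    }
}
```

In Phoenix's case the lookup would be against HBase classes such as ServerName rather than the JDK class used here.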
>
> -------------------
> Jesse Yates
> @jesse_yates
> jyates.github.com
>
>
> On Tue, Aug 19, 2014 at 11:36 AM, Russell Jurney <[email protected]
> > wrote:
>
>> That's really bad. That means... CDH 5.x can't run Phoenix? How can this
>> be fixed? I'm not sure what to do. We're in limbo on our new cluster now.
>>
>>
>> On Mon, Aug 18, 2014 at 11:57 PM, Ravi Kiran <[email protected]>
>> wrote:
>>
>>> Hi Russell,
>>>     Apparently, Phoenix 4.0.0 uses a few HBase API methods introduced in
>>> 0.98.4 which aren't present in the 0.98.1 that ships with CDH 5.1. That's
>>> the primary cause of the build issues.
>>>
>>> Regards
>>> Ravi
>>>
>>>
>>>
>>> On Mon, Aug 18, 2014 at 5:56 PM, Russell Jurney <
>>> [email protected]> wrote:
>>>
>>>> Talking to myself, but hopefully creating good docs. Replacing the
>>>> previous hadoop version with one I found here:
>>>> https://repository.cloudera.com/artifactory/cloudera-repos/org/apache/hadoop/hadoop-core/,
>>>> namely 2.3.0-mr1-cdh5.1.0, gets the build a little further.
>>>>
>>>> I can't get past some build errors, however. Has anyone done this
>>>> before me who can help?
>>>>
>>>> [ERROR]
>>>> /Users/rjurney/Software/phoenix4/phoenix-4.0.0-incubating-src/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionlessQueryServicesImpl.java:[143,27]
>>>> cannot find symbol
>>>>
>>>>   symbol:   method valueOf(java.lang.String,int,int)
>>>>
>>>>   location: class org.apache.hadoop.hbase.ServerName
>>>>
>>>>
>>>> [ERROR] Failed to execute goal
>>>> org.apache.maven.plugins:maven-compiler-plugin:3.0:compile
>>>> (default-compile) on project phoenix-core: Compilation failure
>>>>
>>>> [ERROR]
>>>> /Users/rjurney/Software/phoenix4/phoenix-4.0.0-incubating-src/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionlessQueryServicesImpl.java:[143,27]
>>>> cannot find symbol
>>>>
>>>> [ERROR] symbol:   method valueOf(java.lang.String,int,int)
>>>>
>>>> [ERROR] location: class org.apache.hadoop.hbase.ServerName
>>>>
>>>> [ERROR] -> [Help 1]
>>>>
>>>>
>>>>
>>>> On Mon, Aug 18, 2014 at 5:41 PM, Russell Jurney <
>>>> [email protected]> wrote:
>>>>
>>>>> Ok, so it is clear to me what I have to do. I have to edit my pom.xml
>>>>> to point at CDH 5.1, which translates into:
>>>>>
>>>>> Add the cloudera repo:
>>>>>
>>>>>     <repository>
>>>>>       <id>cloudera</id>
>>>>>       <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
>>>>>     </repository>
>>>>>
>>>>>
>>>>> Then change the hadoop and hbase versions:
>>>>>
>>>>>     <!-- Hadoop Versions -->
>>>>>     <hbase-hadoop1.version>0.98.1-cdh5.1.0</hbase-hadoop1.version>
>>>>>     <hbase-hadoop2.version>0.98.1-cdh5.1.0</hbase-hadoop2.version>
>>>>>     <hadoop-one.version>2.3.0-cdh5.1.0</hadoop-one.version>
>>>>>     <hadoop-two.version>2.3.0-cdh5.1.0</hadoop-two.version>
>>>>>
>>>>>
>>>>> However, I get this error when I build, which tells me there is more
>>>>> complex POM surgery required.
>>>>>
>>>>> [ERROR] Failed to execute goal on project phoenix-core: Could not
>>>>> resolve dependencies for project
>>>>> org.apache.phoenix:phoenix-core:jar:4.0.0-incubating: The following
>>>>> artifacts could not be resolved:
>>>>> org.apache.hadoop:hadoop-core:jar:2.3.0-cdh5.1.0,
>>>>> org.apache.hadoop:hadoop-test:jar:2.3.0-cdh5.1.0: Could not find artifact
>>>>> org.apache.hadoop:hadoop-core:jar:2.3.0-cdh5.1.0 in apache release (
>>>>> https://repository.apache.org/content/repositories/releases/) ->
>>>>> [Help 1]
>>>>>
>>>>> Beyond changing the versions, I do not know how to fix this. Can
>>>>> anyone help?
>>>>>
>>>>> In general, is it possible to actually handle different CDH versions
>>>>> in this project? One shouldn't have to do pom surgery to build Phoenix for
>>>>> the most common platform.
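
One possible fix, sketched from the artifact listing mentioned later in this thread: Cloudera publishes hadoop-core only under the MR1-flavored version string, so the hadoop-one property has to name that artifact. The exact property names assume Phoenix's top-level pom.xml as shown above:

```xml
<!-- Sketch only: point the Hadoop 1 build at Cloudera's MR1 artifact,
     which is what actually exists in the cloudera-repos repository -->
<hadoop-one.version>2.3.0-mr1-cdh5.1.0</hadoop-one.version>
<hadoop-two.version>2.3.0-cdh5.1.0</hadoop-two.version>
```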
>>>>>
>>>>>
>>>>> On Mon, Aug 18, 2014 at 5:15 PM, Russell Jurney <
>>>>> [email protected]> wrote:
>>>>>
>>>>>> When I try to store data into Phoenix from Pig, I get this error. I
>>>>>> am on CDH 5.1, and Phoenix 4.0.
>>>>>>
>>>>>> Anyone know how to resolve this issue?
>>>>>>
>>>>>> 2014-08-18 17:11:25,165 INFO
>>>>>> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader:
>>>>>> Current split being processed
>>>>>> hdfs://cluster1-srv1.e8.com:8020/e8/prod/web_behavior/anomaly_profile.txt/2014/07/15/00/part-r-00000:0+845
>>>>>> 2014-08-18 17:11:25,173 INFO org.apache.hadoop.mapred.TaskLogsTruncater:
>>>>>> Initializing logs' truncater with mapRetainSize=-1 and
>>>>>> reduceRetainSize=-1
>>>>>> 2014-08-18 17:11:25,175 FATAL org.apache.hadoop.mapred.Child: Error
>>>>>> running child : java.lang.IncompatibleClassChangeError: Found interface
>>>>>> org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
>>>>>>  at 
>>>>>> org.apache.phoenix.pig.hadoop.PhoenixOutputFormat.getRecordWriter(PhoenixOutputFormat.java:65)
>>>>>>  at 
>>>>>> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.getRecordWriter(PigOutputFormat.java:84)
>>>>>>  at 
>>>>>> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:548)
>>>>>>  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:653)
>>>>>>  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
>>>>>>  at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>>>  at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>>  at 
>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
>>>>>>  at org.apache.hadoop.mapred.Child.main(Child.java:262)
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Russell Jurney twitter.com/rjurney [email protected]
>>>>>> datasyndrome.com
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>
>>
>>
>
>


