Hi James,
I get a compilation error along with multiple warnings when
packaging the 4.6-HBase-1.0-cdh5.5 branch. The error is attached.

Also, I noticed that the pom.xml indicates the branch is for Cloudera CDH
version 5.5.1. Do you know if it would work for the latest CDH version,
5.5.2?
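(As an aside: CDH-specific branches like this typically pin the Cloudera artifact versions through properties in the top-level pom.xml, so moving from 5.5.1 to 5.5.2 would mean bumping those values and rebuilding. A sketch of the kind of change involved; the property names below are assumptions and should be checked against the branch's actual pom.xml:)

```xml
<!-- hypothetical property names: verify against the branch's top-level pom.xml -->
<properties>
  <hbase.version>1.0.0-cdh5.5.2</hbase.version>
  <hadoop-two.version>2.6.0-cdh5.5.2</hadoop-two.version>
</properties>
```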

Thanks,
Amit.

On Tue, Mar 1, 2016 at 12:05 PM, Dor Ben Dov <dor.ben-...@amdocs.com> wrote:

> James,
>
> Do you have any problems working with the latest Phoenix with CDH 5.5.X?
>
>
>
> Dor
>
>
>
> *From:* James Taylor [mailto:jamestay...@apache.org]
> *Sent:* Tuesday, March 01, 2016 08:24
> *To:* user
> *Cc:* Murugesan, Rani
> *Subject:* Re: HBase Phoenix Integration
>
>
>
> Hi Amit,
>
>
>
> For Phoenix 4.6 on CDH, try using this git repo instead, courtesy of
> Andrew Purtell:
> https://github.com/chiastic-security/phoenix-for-cloudera/tree/4.6-HBase-1.0-cdh5.5
>
>
>
> Thanks,
>
> James
>
>
>
>
>
>
>
> On Mon, Feb 29, 2016 at 10:19 PM, Amit Shah <amits...@gmail.com> wrote:
>
> Hi Sergey,
>
>
>
> I get a lot of compilation errors when I compile the source code
> for the 4.6-HBase-1.0 branch or the v4.7.0-HBase-1.0-rc3 tag. Note that the
> source compiles successfully when the changes to include the
> Cloudera-dependent versions are not included. The only difference between my
> code changes and the ones suggested in the Stack Overflow post is the
> Cloudera CDH version; I am using CDH 5.5.2. I didn't quite follow the reason
> behind the code changes needed in Phoenix when deployed on CDH.
>
>
>
> Thanks,
>
> Amit.
>
>
>
> On Tue, Mar 1, 2016 at 1:15 AM, Sergey Soldatov <sergeysolda...@gmail.com>
> wrote:
>
> Hi Amit,
>
> Switching to 4.3 means you need HBase 0.98. What kind of problems did you
> experience after building 4.6 from source with the changes suggested on
> Stack Overflow?
>
> Thanks,
> Sergey
>
>
> On Sun, Feb 28, 2016 at 10:49 PM, Amit Shah <amits...@gmail.com> wrote:
> > An update -
> >
> > I was able to execute the "./sqlline.py <zookeeper-server-name>" command,
> > but I get the same exception I mentioned earlier.
> >
> > Later I tried following the steps mentioned on this link with Phoenix
> > 4.3.0, but I still get an error, this time with a different stack trace
> > (attached to this mail).
> >
> > Any help would be appreciated.
> >
> > On Sat, Feb 27, 2016 at 8:03 AM, Amit Shah <amits...@gmail.com> wrote:
> >>
> >> Hi Murugesan,
> >>
> >> What preconditions would I need on the server to execute the Python
> >> script? I have Python 2.7.5 installed on the zookeeper server. If I just
> >> copy the sqlline script to the /etc/hbase/conf directory and execute it,
> >> I get the import errors below. Note that this time I had the
> >> 4.5.2-HBase-1.0 version server and core Phoenix jars in the HBase/lib
> >> directory on the master and region servers.
> >>
> >> Traceback (most recent call last):
> >>   File "./sqlline.py", line 25, in <module>
> >>     import phoenix_utils
> >> ImportError: No module named phoenix_utils
> >>
> >> Pardon my limited knowledge of Python.
> >>
> >> Thanks,
> >> Amit
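(For context, the ImportError above typically means sqlline.py was copied on its own: the script does `import phoenix_utils`, and phoenix_utils.py ships next to it in the Phoenix bin/ directory. A minimal sketch of the workaround, putting that directory on Python's module search path first; the install path below is a hypothetical example:)

```python
import os
import sys

# sqlline.py does "import phoenix_utils", which only resolves if the Phoenix
# bin/ directory (which contains phoenix_utils.py) is on the module search
# path. Hypothetical location where the full bin/ directory was copied:
phoenix_bin = "/opt/phoenix/bin"

if phoenix_bin not in sys.path:
    sys.path.insert(0, phoenix_bin)

# With the path in place, "import phoenix_utils" would resolve; it is left
# commented here because it needs a real Phoenix install:
# import phoenix_utils
```

The simpler fix is to copy the entire bin/ directory from the release tarball rather than the lone script, so the sibling modules stay together.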
> >>
> >> On Fri, Feb 26, 2016 at 11:26 PM, Murugesan, Rani <ranmu...@visa.com>
> >> wrote:
> >>>
> >>> Did you test and confirm your phoenix shell from the zookeeper server?
> >>>
> >>> cd /etc/hbase/conf
> >>>
> >>> > phoenix-sqlline.py <zookeeperserver>:2181
> >>>
> >>>
> >>>
> >>>
> >>>
> >>> From: Amit Shah [mailto:amits...@gmail.com]
> >>> Sent: Friday, February 26, 2016 4:45 AM
> >>> To: user@phoenix.apache.org
> >>> Subject: HBase Phoenix Integration
> >>>
> >>>
> >>>
> >>> Hello,
> >>>
> >>>
> >>>
> >>> I have been trying to install Phoenix on my Cloudera HBase cluster. The
> >>> Cloudera version is CDH 5.5.2, and the HBase version is 1.0.
> >>>
> >>>
> >>>
> >>> I copied the server & core jars (version 4.6-HBase-1.0) to the master and
> >>> region servers and restarted the HBase cluster. I copied the corresponding
> >>> client jar to my SQuirreL client, but I get an exception on connect,
> >>> pasted below. The connection URL is
> >>> "jdbc:phoenix:<zookeeper-server-name>:2181".
> >>>
> >>> I even tried compiling the source by adding the Cloudera dependencies as
> >>> suggested in this post, but didn't succeed.
> >>>
> >>>
> >>>
> >>> Any suggestions to make this work?
> >>>
> >>>
> >>>
> >>> Thanks,
> >>>
> >>> Amit.
> >>>
> >>>
> >>>
> >>> ________________________________________________________________
> >>>
> >>>
> >>>
> >>> Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException): org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
> >>>     at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:87)
> >>>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1319)
> >>>     at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:11715)
> >>>     at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7388)
> >>>     at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1776)
> >>>     at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1758)
> >>>     at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
> >>>     at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2034)
> >>>     at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
> >>>     at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
> >>>     at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
> >>>     at java.lang.Thread.run(Thread.java:745)
> >>> Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
> >>>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildDeletedTable(MetaDataEndpointImpl.java:1016)
> >>>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1092)
> >>>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1266)
> >>>     ... 10 more
> >>>
> >>>
> >>>
> >>> P.S - The full stacktrace is attached in the mail.
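(For what it's worth, the NoSuchMethodError above is the classic symptom of a binary-incompatible hbase-client on the region server's classpath: the Phoenix coprocessor was compiled against a `Scan.setRaw(boolean)` that returns `Scan`, while the deployed HBase ships a different signature. One way to narrow this down is to check which jars in the HBase lib directory actually bundle the `Scan` class, to spot a stray or mismatched jar. A small helper sketch; the CDH directory path in the comment is an assumed example:)

```python
import glob
import os
import zipfile

def jars_containing(class_name, lib_dir):
    """Return the jars under lib_dir that bundle the given class,
    so a stray or binary-incompatible jar can be spotted."""
    entry = class_name.replace(".", "/") + ".class"
    hits = []
    for jar in sorted(glob.glob(os.path.join(lib_dir, "*.jar"))):
        with zipfile.ZipFile(jar) as zf:
            if entry in zf.namelist():
                hits.append(jar)
    return hits

# Example usage (the path is a hypothetical CDH parcel layout):
# jars_containing("org.apache.hadoop.hbase.client.Scan",
#                 "/opt/cloudera/parcels/CDH/lib/hbase/lib")
```

If more than one jar turns up, or the jar's version doesn't match the CDH HBase version, that mismatch would explain the error.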
> >>
> >>
> >
>
>
>
>
>
[WARNING] COMPILATION WARNING :
[INFO] -------------------------------------------------------------
[WARNING] /E:/Sundry/ResearchProjects/HBaseWithPhoenix/PhoenixSource/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/execute/DescVarLengthFastByteComparisons.java:[25,16] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] /E:/Sundry/ResearchProjects/HBaseWithPhoenix/PhoenixSource/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/execute/DescVarLengthFastByteComparisons.java:[116,26] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] /E:/Sundry/ResearchProjects/HBaseWithPhoenix/PhoenixSource/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/execute/DescVarLengthFastByteComparisons.java:[122,30] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] /E:/Sundry/ResearchProjects/HBaseWithPhoenix/PhoenixSource/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/execute/DescVarLengthFastByteComparisons.java:[126,39] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] /E:/Sundry/ResearchProjects/HBaseWithPhoenix/PhoenixSource/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServices.java: Some input files use or override a deprecated API.
[WARNING] /E:/Sundry/ResearchProjects/HBaseWithPhoenix/PhoenixSource/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServices.java: Recompile with -Xlint:deprecation for details.
[WARNING] /E:/Sundry/ResearchProjects/HBaseWithPhoenix/PhoenixSource/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixStatement.java: Some input files use unchecked or unsafe operations.
[WARNING] /E:/Sundry/ResearchProjects/HBaseWithPhoenix/PhoenixSource/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixStatement.java: Recompile with -Xlint:unchecked for details.
[INFO] 8 warnings
[INFO] -------------------------------------------------------------
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] /E:/Sundry/ResearchProjects/HBaseWithPhoenix/PhoenixSource/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/hadoop/hbase/regionserver/wal/IndexedWALEditCodec.java:[215,21] cannot find symbol
  symbol:   method getDataInput(java.io.InputStream)
  location: class org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec.BinaryCompatiblePhoenixBaseDecoder
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Phoenix ..................................... SUCCESS [ 53.334 s]
[INFO] Phoenix Core ....................................... FAILURE [05:02 min]
[INFO] Phoenix - Flume .................................... SKIPPED
[INFO] Phoenix - Pig ...................................... SKIPPED
[INFO] Phoenix Query Server Client ........................ SKIPPED
[INFO] Phoenix Query Server ............................... SKIPPED
[INFO] Phoenix - Pherf .................................... SKIPPED
[INFO] Phoenix - Spark .................................... SKIPPED
[INFO] Phoenix Assembly ................................... SKIPPED
[INFO] Phoenix - Tracing Web Application .................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 06:01 min
[INFO] Finished at: 2016-03-01T12:32:27+05:30
[INFO] Final Memory: 63M/683M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.0:compile (default-compile) on project phoenix-core: Compilation failure
[ERROR] /E:/Sundry/ResearchProjects/HBaseWithPhoenix/PhoenixSource/phoenix-for-cloudera/phoenix-core/src/main/java/org/apache/hadoop/hbase/regionserver/wal/IndexedWALEditCodec.java:[215,21] cannot find symbol
[ERROR] symbol:   method getDataInput(java.io.InputStream)
[ERROR] location: class org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec.BinaryCompatiblePhoenixBaseDecoder
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-core
