@Jeffrey: I will commit the code changes required in the JDBC connection string (to accept the keytab and principal) as well as the classpath that I used to connect to the cluster, in PHOENIX-19. Is that fine with you?
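For reference, with the PHOENIX-19 change discussed further down in this thread, the JDBC URL carries the keytab and principal after the ZooKeeper quorum. Below is a minimal sketch of a client using that form; the quorum, keytab path, and principal are placeholders, and the exact URL layout is whatever the committed patch ends up accepting:

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class SecurePhoenixConnect {
        public static void main(String[] args) throws Exception {
            // PHOENIX-19 form discussed in this thread: <zk quorum>:<keytab>:<principal>.
            // Quorum, keytab path, and principal are placeholders, not values from the thread.
            String url = "jdbc:phoenix:zkhost1,zkhost2,zkhost3"
                    + ":/path/to/user.keytab"
                    + ":user@EXAMPLE.COM";
            Class.forName("org.apache.phoenix.jdbc.PhoenixDriver"); // register the driver
            try (Connection conn = DriverManager.getConnection(url)) {
                System.out.println("Connected: " + !conn.isClosed());
            }
        }
    }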
@James: That is nice. I guess my timing was bad. I will submit the patch for the Java code; basically, it's the same code that is already attached to PHOENIX-19. I will also provide the sqlline.sh scripts that made this work.
Thanks,
Anil Gupta

On Tue, Apr 22, 2014 at 3:52 PM, James Taylor <jamestay...@apache.org> wrote:

Correction - it's not HBase 0.94.18 but 0.94.19 that is being voted on now. Quoting the vote thread: "Notable is HBASE-10847, which drops non-secure builds and makes security the default. From here on there is only one release build of HBase 0.94."

On Tue, Apr 22, 2014 at 3:16 PM, James Taylor <jamestay...@apache.org> wrote:

FWIW, I believe as of HBase 0.94.18 the secure build is the only build for HBase, and it's the one pushed to Maven. You can likely change your pom locally to this version and hopefully things will become easier.
Thanks,
James

On Tue, Apr 22, 2014 at 2:33 PM, Jeffrey Zhong <jzh...@hortonworks.com> wrote:

Hey Anil,
Mind creating a JIRA on this? Basically, we need the HBase configuration and the related dependent HBase/Hadoop jars to connect to a secure HBase cluster. This issue applies to psql as well. There is no issue connecting to an unsecured cluster.
Thanks,
-Jeffrey

On 4/22/14 12:18 PM, "anil gupta" <anilgupt...@gmail.com> wrote:

Fixed that classpath problem by changing the classpath. This is very hacky, but I am left with no option because of the lack of a Maven artifact for the hbase-security jar in 0.94. Now the command is:

    java -cp "/etc/hbase/conf:.:../sqlline-1.1.2.jar:../jline-2.11.jar:/opt/cloudera/parcels/CDH/lib/hbase/hbase-0.94.15-cdh4.6.0-security.jar:/opt/cloudera/parcels/CDH/lib/hbase/lib/*:/opt/cloudera/parcels/CDH/lib/hadoop/*:/opt/cloudera/parcels/CDH/lib/hadoop/lib/*:../phoenix-core-3.0.0-SNAPSHOT.jar:$phoenix_client_jar" -Dlog4j.configuration=file:$current_dir/log4j.properties sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u jdbc:phoenix:$1 -n none -p none --color=true --fastConnect=false --verbose=true --isolation=TRANSACTION_READ_COMMITTED $sqlfile

Now I am able to connect! Phew!!

On Mon, Apr 21, 2014 at 7:10 PM, anil gupta <anilgupt...@gmail.com> wrote:

Wow! Moving /etc/hbase/conf to the very beginning of the classpath (before the "." current folder) did the trick. Do you know why it behaves this way? Initially I was using /etc/hbase/conf, but it was mind-boggling to me that the conf was not getting picked up, so I tried a fluke with "/etc/hbase/conf/*". Now it is able to pick up the distributed cluster conf. However, now I get a NoSuchMethodError:

2014-04-21 18:58:01 DEBUG SecureClient:263 - Connecting to pprfihbdb406.corp.intuit.net/10.164.74.157:60000
java.lang.NoSuchMethodError: org.apache.hadoop.net.NetUtils.getInputStream(Ljava/net/Socket;)Lorg/apache/hadoop/net/SocketInputWrapper;
    at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.setupIOstreams(SecureClient.java:270)
    at org.apache.hadoop.hbase.ipc.HBaseClient.getConnection(HBaseClient.java:1141)
    at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:988)
    at org.apache.hadoop.hbase.ipc.SecureRpcEngine$Invoker.invoke(SecureRpcEngine.java:107)
    at com.sun.proxy.$Proxy5.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.hbase.ipc.SecureRpcEngine.getProxy(SecureRpcEngine.java:149)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:813)
    at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:127)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:617)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:844)
    at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:988)
    at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:384)
    at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:168)
    at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableCreateTableStatement.executeUpdate(PhoenixStatement.java:350)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1047)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1039)
    at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:79)
    at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:107)
    at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
    at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
    at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
    at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
    at sqlline.SqlLine.dispatch(SqlLine.java:817)
    at sqlline.SqlLine.initArgs(SqlLine.java:633)
    at sqlline.SqlLine.begin(SqlLine.java:680)
    at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
    at sqlline.SqlLine.main(SqlLine.java:424)
sqlline version 1.1.2

I feel like I am close to making it work. My gut feeling is that this is related to some incompatible jar file or the hadoop1/hadoop2 profile. I am using Hadoop 2.0.0-cdh4.6.0. Please let me know if I need to pass some params while building Phoenix on my machine; right now I am just doing "mvn clean install" to build.
Thanks for the help!
Anil
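That NoSuchMethodError (NetUtils.getInputStream returning SocketInputWrapper) is consistent with the hadoop1/hadoop2 suspicion above: the secure HBase jar was compiled against a different Hadoop than the one the classpath resolved at runtime. A small self-contained check, assuming only that the same Hadoop client jars are on the classpath, prints which Hadoop build and which jar actually supplied NetUtils:

    import org.apache.hadoop.net.NetUtils;
    import org.apache.hadoop.util.VersionInfo;

    public class HadoopClasspathCheck {
        public static void main(String[] args) {
            // Hadoop version the classpath actually resolved to.
            System.out.println("Hadoop version: " + VersionInfo.getVersion());
            // Jar that supplied NetUtils, the class named in the NoSuchMethodError.
            System.out.println("NetUtils loaded from: "
                    + NetUtils.class.getProtectionDomain().getCodeSource().getLocation());
        }
    }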
On Mon, Apr 21, 2014 at 2:16 PM, Jeffrey Zhong <jzh...@hortonworks.com> wrote:

Hey Anil,
In the classpath, when you point to configuration files, you can only specify the folder name (in your case it should be /etc/hbase/conf). The wildcard "*" only works for files with the extension .jar. In addition, you can move /etc/hbase/conf to the very beginning of your classpath (before the "." current folder).
-Jeffrey

On 4/21/14 10:32 AM, "anil gupta" <anilgupt...@gmail.com> wrote:

$1 = <zk>:<keytab>:<principal>

This work is part of https://issues.apache.org/jira/browse/PHOENIX-19, so I modified the connection string to take extra params. The patch is attached to the JIRA; I'll also upload the most recent patch.

I also tried your recommendation: $1 = <zk>:<port>:<root_dir>:<keytab>:<principal>

Still, I get the same error. It seems like the conf files for my distributed cluster are not getting picked up, or some other conf file is also present in the classpath. Is there any way I can specify the exact path of the conf file in Java code so that I can debug this? Or is there a way to know which files Configuration is using to instantiate the Configuration object?

On Mon, Apr 21, 2014 at 9:57 AM, Jeffrey Zhong <jzh...@hortonworks.com> wrote:

What's the value of your "$1"? You need to specify the value as <hbase zookeeper quorum host string (without port)>:<zookeeper port>:<hbase root node>.
A sample value is quorumhost1,quorumhost2,quorumhost3:2181:/hbase
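On the question of how to see which files the Configuration object is actually reading: below is a minimal sketch (assuming only the HBase client jars plus /etc/hbase/conf on the classpath) that prints the resources a fresh HBaseConfiguration was built from, and the settings that distinguish a distributed cluster from the local default:

    import java.net.URL;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;

    public class ConfSourceCheck {
        public static void main(String[] args) {
            Configuration conf = HBaseConfiguration.create();
            // toString() lists the resources the Configuration was built from
            // (core-default.xml, hbase-default.xml, hbase-site.xml, ...).
            System.out.println(conf);
            // Where hbase-site.xml was actually found on the classpath; this should
            // point under /etc/hbase/conf once that directory is first on the classpath.
            URL siteXml = conf.getClassLoader().getResource("hbase-site.xml");
            System.out.println("hbase-site.xml: " + siteXml);
            // Values that distinguish the distributed cluster from a local default.
            System.out.println("hbase.zookeeper.quorum = " + conf.get("hbase.zookeeper.quorum"));
            System.out.println("hbase.rootdir = " + conf.get("hbase.rootdir"));
        }
    }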
On 4/21/14 12:48 AM, "anil gupta" <anilgupt...@gmail.com> wrote:

Hi All,
Phoenix is trying to connect to a standalone HBase rather than my fully distributed HBase cluster; hence, it is getting MasterNotRunningException. This is my current command to invoke Phoenix:

    java -cp ".:/etc/hbase/conf/*:../sqlline-1.1.2.jar:../jline-2.11.jar:../phoenix-core-3.0.0-SNAPSHOT.jar:$phoenix_client_jar:/opt/cloudera/parcels/CDH/lib/hbase/hbase-0.94.15-cdh4.6.0-security.jar:/opt/cloudera/parcels/CDH/lib/hbase/lib/*:/opt/cloudera/parcels/CDH/lib/hadoop/*:/opt/cloudera/parcels/CDH/lib/hadoop/lib/*" -Dlog4j.configuration=file:$current_dir/log4j.properties sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u jdbc:phoenix:$1 -n none -p none --color=true --fastConnect=false --verbose=true --isolation=TRANSACTION_READ_COMMITTED $sqlfile

As you can see, I have added /etc/hbase/conf/* to the classpath, and I verified that the correct files are present in the /etc/hbase/conf/ folder. Even then, Phoenix is not picking up the conf and is trying to connect to a local cluster where hbase.rootdir="file:///tmp/hbase-intuit/hbase". I am unable to figure out why Phoenix is not picking up the conf. IMO, adding /etc/hbase/conf/* to the classpath should be enough, but it seems this is not sufficient. Any ideas/suggestions on how to make Phoenix pick up the correct configuration?
Thanks,
Anil Gupta

On Sun, Apr 20, 2014 at 11:47 PM, anil gupta <anilgupt...@gmail.com> wrote:

It seems like Phoenix is unable to connect to the master. I am able to use the hbase shell from that node, so everything should be fine. I have also included the HBase conf directories in the classpath. Is there anything else I am missing?

This is the error I got:

Found quorum: pprf1:2181,pprf2:2181,pprf3:2181,pprf4:2181,pprf5:2181
Error: Retried 14 times (state=08000,code=101)
org.apache.phoenix.exception.PhoenixIOException: Retried 14 times
    at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:99)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:680)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:821)
    at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:988)
    at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:384)
    at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:168)
    at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableCreateTableStatement.executeUpdate(PhoenixStatement.java:350)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1047)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1016)
    at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:79)
    at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:107)
    at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
    at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
    at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
    at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
    at sqlline.SqlLine.dispatch(SqlLine.java:817)
    at sqlline.SqlLine.initArgs(SqlLine.java:633)
    at sqlline.SqlLine.begin(SqlLine.java:680)
    at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
    at sqlline.SqlLine.main(SqlLine.java:424)
Caused by: org.apache.hadoop.hbase.MasterNotRunningException: Retried 14 times
    at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:139)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:606)
    ... 23 more

On Sun, Apr 20, 2014 at 9:18 PM, anil gupta <anilgupt...@gmail.com> wrote:

I just fixed a couple of initialization errors by putting the Phoenix, sqlline, and jline jars before the HBase jars in the classpath and adding the Hadoop jars. Now the command is:

    java -cp ".:/etc/hadoop/conf:/etc/hbase/conf:../sqlline-1.1.2.jar:../jline-2.11.jar:../phoenix-core-3.0.0-SNAPSHOT.jar:$phoenix_client_jar:/opt/cloudera/parcels/CDH/lib/hbase/hbase-0.94.15-cdh4.6.0-security.jar:/opt/cloudera/parcels/CDH/lib/hbase/lib/*:/opt/cloudera/parcels/CDH/lib/hadoop/*:/opt/cloudera/parcels/CDH/lib/hadoop/lib/*" -Dlog4j.configuration=file:$current_dir/log4j.properties sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u jdbc:phoenix:$1 -n none -p none --color=true --fastConnect=false --verbose=true --isolation=TRANSACTION_READ_COMMITTED $sqlfile

At present, it seems the classpath errors are fixed. I see the following on the console, but it's stuck at this line:

Found quorum: pprf1:2181,pprf2:2181,pprf3:2181,pprf4:2181,pprf5:2181

Can anyone tell me where it is probably stuck?
Thanks,
Anil Gupta
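When sqlline hangs after the "Found quorum" line or fails with MasterNotRunningException, it can help to take Phoenix out of the equation and exercise only the plain HBase client path with the same classpath. A minimal sketch under that assumption:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HBaseAdmin;

    public class MasterCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            System.out.println("Quorum: " + conf.get("hbase.zookeeper.quorum"));
            // Throws MasterNotRunningException / ZooKeeperConnectionException if the
            // client cannot reach the cluster described by the configuration it loaded.
            HBaseAdmin.checkHBaseAvailable(conf);
            System.out.println("HBase master is reachable.");
        }
    }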
On Sun, Apr 20, 2014 at 9:02 PM, anil gupta <anilgupt...@gmail.com> wrote:

Hi All,
Due to the issues faced in PHOENIX-19 we cannot use phoenix-3.0.0-SNAPSHOT-client.jar, so I am trying to run Phoenix on the command line with phoenix-3.0.0-SNAPSHOT-client-without-hbase.jar. The modified command to invoke Phoenix in sqlline.sh looks like this:

    java -cp ".:/etc/hadoop/conf:/etc/hbase/conf:/opt/cloudera/parcels/CDH/lib/hbase/hbase-0.94.15-cdh4.6.0-security.jar:/opt/cloudera/parcels/CDH/lib/hbase/lib/*:../sqlline-1.1.2.jar:../jline-2.11.jar:../phoenix-core-3.0.0-SNAPSHOT.jar:$phoenix_client_jar" -Dlog4j.configuration=file:$current_dir/log4j.properties sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u jdbc:phoenix:$1 -n none -p none --color=true --fastConnect=false --verbose=true --isolation=TRANSACTION_READ_COMMITTED $sqlfile

At present, I get the following error:

[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
    at jline.TerminalFactory.create(TerminalFactory.java:101)
    at jline.TerminalFactory.get(TerminalFactory.java:159)
    at sqlline.SqlLine$Opts.<init>(SqlLine.java:4846)
    at sqlline.SqlLine.<init>(SqlLine.java:175)
    at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:440)
    at sqlline.SqlLine.main(SqlLine.java:424)

Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
    at sqlline.SqlLine$Opts.<init>(SqlLine.java:4846)
    at sqlline.SqlLine.<init>(SqlLine.java:175)
    at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:440)
    at sqlline.SqlLine.main(SqlLine.java:424)

Please let me know what I am missing, or what's wrong with the invocation command. I am using CDH 4.6 with HBase 0.94.15.
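The "Found class jline.Terminal, but interface was expected" error indicates that an older jline (where Terminal is a class) is shadowing the jline-2.11 jar (where Terminal is an interface) on the classpath; with the HBase lib directory listed ahead of the sqlline/jline jars, that is the likely source, which matches the fix of reordering the jars described above. A small sketch to locate the winning jar, assuming nothing beyond the same classpath:

    public class JlineCheck {
        public static void main(String[] args) throws Exception {
            Class<?> terminal = Class.forName("jline.Terminal");
            // jline 2.x defines Terminal as an interface; older jline as a class.
            System.out.println("jline.Terminal is interface: " + terminal.isInterface());
            // The jar this definition was loaded from is the one winning on the classpath.
            System.out.println("Loaded from: "
                    + terminal.getProtectionDomain().getCodeSource().getLocation());
        }
    }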
--
Thanks & Regards,
Anil Gupta