Re: spark 2.0.2 connect phoenix query server error

2016-11-23 Thread Divya Gehlot
Can you try with the below driver: "driver" -> "org.apache.phoenix.jdbc.PhoenixDriver". Thanks, Divya On 22 November 2016 at 11:14, Dequn Zhang wrote: > Hello, since spark 2.x cannot use the Phoenix Spark Interpreter to load data, > I want to use JDBC, but when I want to
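The driver setting suggested above amounts to pointing Spark's generic JDBC source at the Phoenix driver class. A minimal sketch of the options involved, shown as a plain dict for illustration (the host, port, and table names are placeholders, not values from the thread):

```python
# Hypothetical connection settings for Spark's generic JDBC source
# pointed at the Phoenix driver. "zk-host:2181" and "MY_TABLE" are placeholders.
jdbc_options = {
    "driver": "org.apache.phoenix.jdbc.PhoenixDriver",
    "url": "jdbc:phoenix:zk-host:2181:/hbase",
    "dbtable": "MY_TABLE",
}

# With a live SparkSession and the Phoenix client jar on the classpath,
# this would be used roughly as:
# df = spark.read.format("jdbc").options(**jdbc_options).load()
```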

Re: cannot drop a table in Phoenix

2016-10-18 Thread Divya Gehlot
Hi Mich, Which version of Phoenix are you using ? Thanks, Divya On 17 October 2016 at 23:41, Mich Talebzadeh wrote: > Hi, > > I have a table marketDataHbase created on Hbase as seen below: > > [image: Inline images 1] > > > Trying to drop it but it cannot find it > >

Re: Does Phoenix support select by version?

2016-10-18 Thread Divya Gehlot
Hi Yang, Can you share the details in the forum? It would be useful for everybody. Thanks, Divya On 19 October 2016 at 10:56, Yang Zhang wrote: > William and James Taylor > helped me solve this problem > > Thanks > >

Re: Fwd: org.apache.hadoop.hbase.DoNotRetryIOException

2016-09-06 Thread Divya Gehlot
Hi, Phoenix maintains its metadata in the SYSTEM.CATALOG table. Check your table information there. Is it the same as your table's information? This exception comes when the table and the SYSTEM.CATALOG metadata don't match. Thanks, Divya On Jul 23, 2016 10:34 AM, "Yang Zhang"
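The metadata check described above can be done with a query against SYSTEM.CATALOG. A hypothetical example, shown here as a query string ('TEST' is a placeholder table name; run the SQL from sqlline or any JDBC client):

```python
# Hypothetical: Phoenix SQL to inspect the catalog entries for one table,
# so they can be compared against what the client expects.
check_sql = (
    "SELECT TABLE_SCHEM, TABLE_NAME, COLUMN_NAME, COLUMN_FAMILY, DATA_TYPE "
    "FROM SYSTEM.CATALOG "
    "WHERE TABLE_NAME = 'TEST'"
)
```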

Re: How to use a variable in phoenix sql script?

2016-05-18 Thread Divya Gehlot
Can you please elaborate your query with an example. On May 9, 2016 2:12 PM, "景涛" <844300...@qq.com> wrote: > I want to add a variable in the Phoenix SQL script — how do I change > the sample? > Can anybody help me? > Thank you very much. >

Re: [Spark 1.5.2]Check Foreign Key constraint

2016-05-11 Thread Divya Gehlot
Can you please help me with an example. Thanks, Divya On 11 May 2016 at 16:55, Ankit Singhal <ankitsingha...@gmail.com> wrote: > You can use joins as a substitute for subqueries. > > On Wed, May 11, 2016 at 1:27 PM, Divya Gehlot <divya.htco...@gmail.com> > wrote: >
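The join-for-subquery substitution suggested above can be illustrated without a cluster: a foreign-key check ("which child rows reference a parent key that does not exist?") is an anti join. A sketch with plain Python collections and hypothetical data; in Spark this corresponds to a left outer join followed by a filter on null parent columns (the `left_anti` join type itself only appeared in later Spark versions, so on 1.5.2 the outer-join-plus-filter form is the workaround):

```python
# Hypothetical data: a parent table of valid keys, and a child table
# holding a foreign key into it.
parent_keys = {1, 2, 3}
child_rows = [
    {"id": 10, "parent_id": 1},
    {"id": 11, "parent_id": 2},
    {"id": 12, "parent_id": 99},  # violates the foreign-key constraint
]

# Anti-join: keep the child rows whose parent_id has no match in parent_keys.
violations = [row for row in child_rows if row["parent_id"] not in parent_keys]
```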

[Spark 1.5.2]Check Foreign Key constraint

2016-05-11 Thread Divya Gehlot
Hi, I am using Spark 1.5.2 with Apache Phoenix 4.4. Spark 1.5.2 doesn't support subqueries in WHERE conditions: https://issues.apache.org/jira/browse/SPARK-4226 Is there any alternative way to check foreign key constraints? Would really appreciate the help. Thanks, Divya

[Phoenix 4.4]Rename table Supported ?

2016-05-08 Thread Divya Gehlot
Hi, Does Phoenix support renaming a table? If yes, please help me with the syntax. Thanks, Divya

Re: spark 1.6.1 build failure of : scala-maven-plugin

2016-05-04 Thread Divya Gehlot
Divya On 4 May 2016 at 21:31, sunday2000 <2314476...@qq.com> wrote: > Check your javac version, and update it. > > > -- Original Message ------ > *From:* "Divya Gehlot";<divya.htco...@gmail.com>; > *Sent:* Wednesday, 4 May 2016, 11:25 AM &g

Re: spark 1.6.1 build failure of : scala-maven-plugin

2016-05-03 Thread Divya Gehlot
Hi, Even I am getting a similar error: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile when I tried to build the Phoenix project using Maven. Maven version: 3.3 Java version: 1.7_67 Phoenix: downloaded latest master from GitHub. If anybody finds the resolution

Re: [ERROR:]Phoenix 4.4 Plugin for Flume 1.5

2016-05-02 Thread Divya Gehlot
s that's pre-Apache and > more than two years old. Use this URL instead: > https://phoenix.apache.org/download.html > > Thanks, > James > > > On Thursday, April 28, 2016, Divya Gehlot <divya.htco...@gmail.com> wrote: > >> Hi, >> I am trying to

Re: [HELP:]Save Spark Dataframe in Phoenix Table

2016-04-18 Thread Divya Gehlot
binary release of > Phoenix, or compile the latest version yourself, you will be able to see > and use it. It does not come with the HDP 2.3.4 platform, at least last I > checked. > > Regards, > > Josh > > On Sat, Apr 9, 2016 at 2:24 PM, Divya Gehlot <divya.htco...@gmail.com>

SQL editor for Phoenix 4.4

2016-04-12 Thread Divya Gehlot
Hi, I would like to know: is there a SQL editor for Phoenix apart from SQuirreL? Thanks, Divya

Re: [Error] Spark - Save to Phoenix

2016-04-11 Thread Divya Gehlot
.executor.extraClassPath=/usr/hdp/current/phoenix-client/phoenix-client.jar" > > using it without quotes: > > --conf > spark.executor.extraClassPath=/usr/hdp/current/phoenix-client/phoenix-client.jar > > > > > 2016-04-11 12:43 GMT+02:00 Divya Gehlot <divya.

Re: [Error] Spark - Save to Phoenix

2016-04-11 Thread Divya Gehlot
h. > > > Best Regards, > Ricardo > > > 2016-04-11 12:15 GMT+02:00 Divya Gehlot <divya.htco...@gmail.com>: > >> Hi Ricardo, >> If you had observed my previous post carefully, >> I am already passing the below jars >> --jars /usr/hdp/2.3.4.0

Re: [Error] Spark - Save to Phoenix

2016-04-11 Thread Divya Gehlot
options here: > http://spark.apache.org/docs/latest/running-on-yarn.html > > > Another option is to included phoenix on your jar with mvn assembly plugin: > > http://maven.apache.org/plugins/maven-assembly-plugin/ > > Best regards, > > Ricardo > > > > 2016-04-11 1

Re: [HELP:]Save Spark Dataframe in Phoenix Table

2016-04-10 Thread Divya Gehlot
this particular case, HDP 2.3.4 doesn't actually provide the > necessary phoenix client-spark JAR by default, so your options are limited > here. Again, I recommend filing a support ticket with Hortonworks. > > Regards, > > Josh > > On Sat, Apr 9, 2016 at 9:11 AM, Divya Geh

Re: [HELP:]Save Spark Dataframe in Phoenix Table

2016-04-09 Thread Divya Gehlot
client-spark JAR by default, so your options are limited > here. Again, I recommend filing a support ticket with Hortonworks. > > Regards, > > Josh > > On Sat, Apr 9, 2016 at 9:11 AM, Divya Gehlot <divya.htco...@gmail.com> > wrote: > >> Hi, >> The code which I

Fwd: [HELP:]Save Spark Dataframe in Phoenix Table

2016-04-09 Thread Divya Gehlot
eption: org.apache.phoenix.spark.DefaultSource does not allow user-specified schemas.; Am I on the right track or missing any properties ? Because of this I am unable to proceed with Phoenix and have to find alternate options. Would really appreciate the help -- Forwarded mes

Fwd: [HELP:]Save Spark Dataframe in Phoenix Table

2016-04-09 Thread Divya Gehlot
Reposting for other users' benefit -- Forwarded message -- From: Divya Gehlot <divya.htco...@gmail.com> Date: 8 April 2016 at 19:54 Subject: Re: [HELP:]Save Spark Dataframe in Phoenix Table To: Josh Mahonin <jmaho...@gmail.com> Hi Josh, I am doing it in the same manner

[HELP:]Save Spark Dataframe in Phoenix Table

2016-04-07 Thread Divya Gehlot
Hi, I have a Hortonworks Hadoop cluster with the configurations below: Spark 1.5.2 HBase 1.1.x Phoenix 4.4 I am able to connect to Phoenix through a JDBC connection and able to read the Phoenix tables. But while writing the data back to a Phoenix table I am getting the below error:
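For reference, writing a DataFrame back through the phoenix-spark connector (rather than plain JDBC) takes a target table and a ZooKeeper URL. A hypothetical sketch of the options involved, shown as a plain dict (table and host names are placeholders, and this assumes the phoenix-client-spark jar is actually on the driver and executor classpaths, which the thread suggests was the missing piece on HDP 2.3.4):

```python
# Hypothetical save options for the phoenix-spark DataFrame writer.
# "OUTPUT_TABLE" and "zk-host:2181" are placeholders.
save_options = {
    "table": "OUTPUT_TABLE",
    "zkUrl": "zk-host:2181",
}

# With a live SparkSession this would be used roughly as
# (the connector historically requires overwrite mode):
# df.write.format("org.apache.phoenix.spark") \
#   .mode("overwrite") \
#   .options(**save_options) \
#   .save()
```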

[Query:]Table creation with column family in Phoenix

2016-03-10 Thread Divya Gehlot
Hi, I created a table in Phoenix with three column families and inserted values as shown below. Syntax: > CREATE TABLE TESTCF (MYKEY VARCHAR NOT NULL PRIMARY KEY, CF1.COL1 VARCHAR, > CF2.COL2 VARCHAR, CF3.COL3 VARCHAR) > UPSERT INTO TESTCF (MYKEY,CF1.COL1,CF2.COL2,CF3.COL3) VALUES >

[Issue:]Getting null values for Numeric types while accessing hive tables (Registered on Hbase,created through Phoenix)

2016-03-03 Thread Divya Gehlot
Hi, I am registering a Hive table on HBase: CREATE EXTERNAL TABLE IF NOT EXISTS TEST(NAME STRING,AGE INT) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,0:AGE") TBLPROPERTIES ("hbase.table.name" = "TEST",

Re: ***UNCHECKED*** Re: HBase Phoenix Integration

2016-03-02 Thread Divya Gehlot
t; > No, it doesn't work for phoenix 4.6. Attached is the error I get when I > execute 'sqlline.py :2181' > > > > Can you please give more details about the patch? > > > > Thanks, > > Amit. > > > > On Tue, Mar 1, 2016 at 10:39 AM, Divya Gehlot <divya

Re: ***UNCHECKED*** Re: HBase Phoenix Integration

2016-02-29 Thread Divya Gehlot
anks, > Amit. > > On Tue, Mar 1, 2016 at 10:08 AM, Divya Gehlot <divya.htco...@gmail.com> > wrote: > >> Hi Amit, >> Extract attached jar and try placing it in your hbase classpath >> >> P.S. Please remove the 'x' from the jar extension >> Hope this hel

[BEST PRACTICES]: Registering Hbase table as hive external table

2016-02-28 Thread Divya Gehlot
Hi, Has anyone worked on registering HBase tables as Hive tables? I would like to know the best practices as well as the pros and cons. Would really appreciate it if you could refer me to good blogs, study materials, etc. If anybody has hands-on/production experience, could you please share the tips?

[Error] : while registering Hbase table with hive

2016-02-28 Thread Divya Gehlot
Hi, I am trying to register an HBase table with Hive and getting the following error: Error while processing statement: FAILED: Execution Error, return code 1 > from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: > MetaException(message:org.apache.hadoop.hive.serde2.SerDeException

Phoenix and Spark - transactional queries

2016-02-18 Thread Divya Gehlot
Hi, For a Spark job that connects to Phoenix through a JDBC connection, how can we commit if the job succeeds and roll back in case of failure? Thanks, Divya
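The usual JDBC shape for this is to disable auto-commit, commit on success, and roll back in the exception handler. A runnable sketch of that pattern, using Python's sqlite3 as a stand-in for the Phoenix JDBC connection (Phoenix 4.x needs its transaction feature enabled for multi-statement rollback to behave this way; the control flow itself is the same either way):

```python
import sqlite3

# Stand-in for a Phoenix JDBC connection; the commit/rollback shape is identical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (id INTEGER PRIMARY KEY, val TEXT)")
conn.commit()

def run_job(conn, rows, fail=False):
    """Write rows atomically: commit on success, roll back on any error."""
    try:
        conn.executemany("INSERT INTO results VALUES (?, ?)", rows)
        if fail:
            raise RuntimeError("simulated job failure")
        conn.commit()
    except Exception:
        conn.rollback()  # a failed job leaves no partial writes behind
        raise
```

A failed call leaves the table untouched; a successful call makes all of its rows visible at once.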

Re: Error : starting spark-shell with phoenix client jar

2016-02-18 Thread Divya Gehlot
, please find a jira for the same. > https://issues.apache.org/jira/browse/PHOENIX-2608 > > Regards, > Ankit Singhal > > On Thu, Feb 18, 2016 at 2:03 PM, Divya Gehlot <divya.htco...@gmail.com> > wrote: > >> Hi, >> I am getting following error while star

Fwd: Error : starting spark-shell with phoenix client jar

2016-02-18 Thread Divya Gehlot
Hi, I am getting the following error while starting spark-shell with the Phoenix client jar: spark-shell --jars /usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar --driver-class-path /usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar --master yarn-client StackTrace : >