Re: spark 2.0.2 connect phoenix query server error

2016-11-23 Thread Divya Gehlot
Can you try with the below driver: "driver" -> "org.apache.phoenix.jdbc.PhoenixDriver". Thanks, Divya On 22 November 2016 at 11:14, Dequn Zhang wrote: > Hello, since Spark 2.x cannot use the Phoenix Spark Interpreter to load data, > I want to use JDBC, but when I want to get a *thin connection*, I
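For anyone hitting this later: the fat and thin clients use different driver classes, so the driver suggested above only applies to a direct connection. A sketch of both (host names are assumptions; 8765 is the Query Server's default port):

```
# Fat client (connects to ZooKeeper/HBase directly)
driver = org.apache.phoenix.jdbc.PhoenixDriver
url    = jdbc:phoenix:zk-host:2181

# Thin client (connects through the Phoenix Query Server)
driver = org.apache.phoenix.queryserver.client.Driver
url    = jdbc:phoenix:thin:url=http://pqs-host:8765;serialization=PROTOBUF
```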

Re: cannot drop a table in Phoenix

2016-10-18 Thread Divya Gehlot
Hi Mich, Which version of Phoenix are you using ? Thanks, Divya On 17 October 2016 at 23:41, Mich Talebzadeh wrote: > Hi, > > I have a table marketDataHbase create on Hbase as seen below: > > [image: Inline images 1] > > > Trying to drop it but it cannot find it > > 0: jdbc:phoenix:rhes564:218
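One possible cause worth checking (an assumption based on the mixed-case table name): Phoenix upper-cases unquoted identifiers, so a table created with a double-quoted mixed-case name can only be dropped with the same quoting:

```sql
-- Unquoted identifiers are normalized to upper case, so this looks for MARKETDATAHBASE
DROP TABLE marketDataHbase;

-- A table created as "marketDataHbase" (double-quoted) must be dropped the same way
DROP TABLE "marketDataHbase";
```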

Re: Does Phoenix support select by version?

2016-10-18 Thread Divya Gehlot
Hi Yang, Can you share the details in the forum? It would be useful for everybody. Thanks, Divya On 19 October 2016 at 10:56, Yang Zhang wrote: > William and James Taylor > helped me solve this problem > > Thanks > > 2016-10-19 10:20 GMT+08:00 Yang Zhang : > >> Hi >> >> I saw that Phoenix

Re: Joins don't work

2016-09-15 Thread Divya Gehlot
I used Phoenix 4.4 and joins did work for me. Refer to this link: https://phoenix.apache.org/joins.html HTH. Thanks, Divya On 16 September 2016 at 04:37, Cheyenne Forbes < cheyenne.osanu.for...@gmail.com> wrote: > I was using phoenix 4.4 then I switched to

Re: Fwd: org.apache.hadoop.hbase.DoNotRetryIOException

2016-09-06 Thread Divya Gehlot
Hi, Phoenix maintains its metadata in the SYSTEM.CATALOG table. Check your table information there. Is it the same as your table's information? This exception comes when the table and SYSTEM.CATALOG metadata don't match. Thanks, Divya On Jul 23, 2016 10:34 AM, "Yang Zhang" wrote: Hel

Re: List of keywords/Tokens not allowed in Phoenix

2016-07-28 Thread Divya Gehlot
https://phoenix.apache.org/language/ On 28 July 2016 at 14:19, Dharmesh Guna wrote: > Dear All, > > > > While running below query to create a table I encountered an error. Is > there any list of tokens/keywords that we cannot use as column name in > Phoenix? > > CREATE TABLE TTEST (ACTIVE BIGINT
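The grammar page above also lists Phoenix's reserved words. As a hedged illustration (the table name is made up), a reserved word can still be used as a column name by double-quoting it, at the cost of the identifier becoming case-sensitive:

```sql
-- Fails: ORDER is a reserved word in the Phoenix grammar
-- CREATE TABLE TTEST (ID BIGINT NOT NULL PRIMARY KEY, ORDER BIGINT);

-- Works: double-quoted identifiers may be reserved words
CREATE TABLE TTEST (ID BIGINT NOT NULL PRIMARY KEY, "ORDER" BIGINT);
UPSERT INTO TTEST (ID, "ORDER") VALUES (1, 42);
```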

Re: How to use a variable in phoenix sql script?

2016-05-18 Thread Divya Gehlot
Can you please elaborate on your query with an example? On May 9, 2016 2:12 PM, "景涛" <844300...@qq.com> wrote: > I want to add a variable in the Phoenix SQL script; how do I change > the sample? > Can anybody help me? > Thank you very much. >

Re: [Spark 1.5.2]Check Foreign Key constraint

2016-05-11 Thread Divya Gehlot
Can you please help me with an example? Thanks, Divya On 11 May 2016 at 16:55, Ankit Singhal wrote: > You can use joins as a substitute for subqueries. > > On Wed, May 11, 2016 at 1:27 PM, Divya Gehlot > wrote: > >> Hi, >> I am using Spark 1.5.2 with Apache Ph

[Spark 1.5.2]Check Foreign Key constraint

2016-05-11 Thread Divya Gehlot
Hi, I am using Spark 1.5.2 with Apache Phoenix 4.4. Spark 1.5.2 doesn't support subqueries in WHERE conditions: https://issues.apache.org/jira/browse/SPARK-4226 Is there any alternative way to check foreign key constraints? Would really appreciate the help. Thanks, Divya
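Ankit's join suggestion in the thread above can be sketched as a LEFT OUTER JOIN anti-join, which avoids the WHERE-clause subquery entirely; the table and column names here are hypothetical:

```sql
-- Rows in CHILD whose PARENT_ID has no matching row in PARENT,
-- i.e. foreign-key violations, found without a subquery
SELECT c.ID, c.PARENT_ID
FROM CHILD c
LEFT OUTER JOIN PARENT p ON c.PARENT_ID = p.ID
WHERE p.ID IS NULL;
```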

[Phoenix 4.4]Rename table Supported ?

2016-05-08 Thread Divya Gehlot
Hi, Does Phoenix support renaming a table? If yes, please help me with the syntax. Thanks, Divya
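For what it's worth, I am not aware of a rename statement in the Phoenix grammar; a commonly cited workaround (sketched here with hypothetical table and column names, so treat it as an assumption) is to create the new table and copy the rows with UPSERT SELECT:

```sql
CREATE TABLE NEW_NAME (ID BIGINT NOT NULL PRIMARY KEY, VAL VARCHAR);
UPSERT INTO NEW_NAME SELECT ID, VAL FROM OLD_NAME;
DROP TABLE OLD_NAME;
```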

Re: spark 1.6.1 build failure of : scala-maven-plugin

2016-05-04 Thread Divya Gehlot
Divya On 4 May 2016 at 21:31, sunday2000 <2314476...@qq.com> wrote: > Check your javac version, and update it. > > > -- Original Message ------ > *From:* "Divya Gehlot";; > *Sent:* Wednesday, 4 May 2016, 11:25 AM > *To:* "sunday2000"<2314476

Re: spark 1.6.1 build failure of : scala-maven-plugin

2016-05-03 Thread Divya Gehlot
Hi, Even I am getting a similar error: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile when I tried to build the Phoenix project using Maven. Maven version: 3.3 Java version: 1.7_67 Phoenix: downloaded the latest master from GitHub If anybody finds the resolution, please

Re: [ERROR:]Phoenix 4.4 Plugin for Flume 1.5

2016-05-02 Thread Divya Gehlot
and > more than two years old. Use this URL instead: > https://phoenix.apache.org/download.html > > Thanks, > James > > > On Thursday, April 28, 2016, Divya Gehlot wrote: > >> Hi, >> I am trying to move data from hdfs to Phoenix >> I downloaded the htt

[ERROR:]Phoenix 4.4 Plugin for Flume 1.5

2016-04-28 Thread Divya Gehlot
Hi, I am trying to move data from HDFS to Phoenix. I downloaded https://github.com/forcedotcom/phoenix/ and built the project as per the instructions on the Apache Phoenix site, and placed the phoenix-3.0.0-SNAPSHOT-client.jar in /flume/lib/phoenix-3.0.0-SNAPSHOT-client.jar. When I am running Flume

Re: [HELP:]Save Spark Dataframe in Phoenix Table

2016-04-18 Thread Divya Gehlot
gt; Phoenix, or compile the latest version yourself, you will be able to see > and use it. It does not come with the HDP 2.3.4 platform, at least last I > checked. > > Regards, > > Josh > > On Sat, Apr 9, 2016 at 2:24 PM, Divya Gehlot > wrote: > >> Hi Josh, >&g

SQL editor for Phoenix 4.4

2016-04-12 Thread Divya Gehlot
Hi, I would like to know: is there a SQL editor for Phoenix apart from SQuirreL? Thanks, Divya

Re: [Error] Spark - Save to Phoenix

2016-04-11 Thread Divya Gehlot
t/phoenix-client/phoenix-client.jar" > > using it without quotes: > > --conf > spark.executor.extraClassPath=/usr/hdp/current/phoenix-client/phoenix-client.jar > > > > > 2016-04-11 12:43 GMT+02:00 Divya Gehlot : > >> Hi Ricardo, >> Are you talking

Re: [Error] Spark - Save to Phoenix

2016-04-11 Thread Divya Gehlot
> > I check that the error that you are getting is on the executor, could you > add this parameter to spark submit: > > > --conf spark.executor.extraClassPath=path_to_phoenix_jar > > > That way the executor should get phoenix in the classpath. > > > Best Regards, >

Re: [Error] Spark - Save to Phoenix

2016-04-11 Thread Divya Gehlot
g/docs/latest/running-on-yarn.html > > > Another option is to included phoenix on your jar with mvn assembly plugin: > > http://maven.apache.org/plugins/maven-assembly-plugin/ > > Best regards, > > Ricardo > > > > 2016-04-11 11:17 GMT+02:00 Divya Gehlot : > >

[Error] Spark - Save to Phoenix

2016-04-11 Thread Divya Gehlot
Hi, I am getting the below error when I try to save data to Phoenix. Below are the cluster configuration and the steps I followed: *Cluster Configuration:* Hortonworks distribution 2.3.4 Spark 1.5.2 Phoenix 4.4 *Table created in Phoenix:* CREATE TABLE TEST ( RPT_DATE varchar(100) PRIMARY K

Re: [HELP:]Save Spark Dataframe in Phoenix Table

2016-04-10 Thread Divya Gehlot
rticular case, HDP 2.3.4 doesn't actually provide the > necessary phoenix client-spark JAR by default, so your options are limited > here. Again, I recommend filing a support ticket with Hortonworks. > > Regards, > > Josh > > On Sat, Apr 9, 2016 at 9:11 AM, Divya Gehlot &g

Re: [HELP:]Save Spark Dataframe in Phoenix Table

2016-04-09 Thread Divya Gehlot
here. Again, I recommend filing a support ticket with Hortonworks. > > Regards, > > Josh > > On Sat, Apr 9, 2016 at 9:11 AM, Divya Gehlot > wrote: > >> Hi, >> The code which I using to connect to Phoenix for writing >> def writeToTable(df: DataFrame,dbtable: String)

Fwd: [HELP:]Save Spark Dataframe in Phoenix Table

2016-04-09 Thread Divya Gehlot
eption: org.apache.phoenix.spark.DefaultSource does not allow user-specified schemas.; Am I on the right track or missing any properties? Because of this I am unable to proceed with Phoenix and have to find alternate options. Would really appreciate the help -- Forwarded message --

Fwd: [HELP:]Save Spark Dataframe in Phoenix Table

2016-04-09 Thread Divya Gehlot
Reposting for other users' benefit -- Forwarded message -- From: Divya Gehlot Date: 8 April 2016 at 19:54 Subject: Re: [HELP:]Save Spark Dataframe in Phoenix Table To: Josh Mahonin Hi Josh, I am doing it in the same manner as mentioned in the Phoenix Spark documentation. Using the latest

[HELP:]Save Spark Dataframe in Phoenix Table

2016-04-07 Thread Divya Gehlot
Hi, I have a Hortonworks Hadoop cluster with the below configuration: Spark 1.5.2 HBase 1.1.x Phoenix 4.4 I am able to connect to Phoenix through a JDBC connection and able to read the Phoenix tables. But while writing the data back to a Phoenix table I am getting the below error: org.apache.spark.sql.

[Query:]Table creation with column family in Phoenix

2016-03-10 Thread Divya Gehlot
Hi, I created a table in Phoenix with three column families and inserted the values as shown below. Syntax: > CREATE TABLE TESTCF (MYKEY VARCHAR NOT NULL PRIMARY KEY, CF1.COL1 VARCHAR, > CF2.COL2 VARCHAR, CF3.COL3 VARCHAR) > UPSERT INTO TESTCF (MYKEY,CF1.COL1,CF2.COL2,CF3.COL3)values > ('Key2','
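A complete version of the truncated statements above might look like the following; the row values are illustrative, not the original ones:

```sql
CREATE TABLE TESTCF (
  MYKEY VARCHAR NOT NULL PRIMARY KEY,
  CF1.COL1 VARCHAR,
  CF2.COL2 VARCHAR,
  CF3.COL3 VARCHAR
);

UPSERT INTO TESTCF (MYKEY, CF1.COL1, CF2.COL2, CF3.COL3)
VALUES ('Key1', 'a', 'b', 'c');

-- Columns can be selected with or without their family prefix
SELECT CF1.COL1, COL2, COL3 FROM TESTCF;
```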

[Issue:]Getting null values for Numeric types while accessing hive tables (Registered on Hbase,created through Phoenix)

2016-03-03 Thread Divya Gehlot
Hi, I am registering a Hive table on HBase: CREATE EXTERNAL TABLE IF NOT EXISTS TEST(NAME STRING,AGE INT) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,0:AGE") TBLPROPERTIES ("hbase.table.name" = "TEST", "hbase.mapre
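For context on why the numeric columns come back NULL: Phoenix serializes signed numbers in its own sortable binary format, which Hive's HBaseStorageHandler does not decode. One commonly suggested workaround (an assumption, not verified here) is to use UNSIGNED_* types on the Phoenix side, whose byte layout matches HBase's native Bytes.toBytes() for non-negative values, and map the column as binary in Hive with the #b suffix:

```sql
-- Phoenix side: UNSIGNED_INT matches Bytes.toBytes(int) for values >= 0
CREATE TABLE TEST (NAME VARCHAR NOT NULL PRIMARY KEY, "0".AGE UNSIGNED_INT);

-- Hive side: map AGE as binary instead of the default string encoding
CREATE EXTERNAL TABLE IF NOT EXISTS TEST (NAME STRING, AGE INT)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,0:AGE#b")
TBLPROPERTIES ("hbase.table.name" = "TEST");
```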

Re: ***UNCHECKED*** Re: HBase Phoenix Integration

2016-03-02 Thread Divya Gehlot
Hi Dor, I didn't understand what you are trying to say. Could you please elaborate? Thanks, Divya On 2 March 2016 at 17:33, Dor Ben Dov wrote: > Divya, > > How much are you working or what kind of ‘use’ are you using it on top of HDP > ? > > > > Dor >

Re: ***UNCHECKED*** Re: HBase Phoenix Integration

2016-02-29 Thread Divya Gehlot
t when I > execute 'sqlline.py :2181' > > Can you please give more details about the patch? > > Thanks, > Amit. > > On Tue, Mar 1, 2016 at 10:39 AM, Divya Gehlot > wrote: > >> Hi Amit, >> Is it working ? >> No , Mine is phoenix 4.4 . >&

Re: ***UNCHECKED*** Re: HBase Phoenix Integration

2016-02-29 Thread Divya Gehlot
n Tue, Mar 1, 2016 at 10:08 AM, Divya Gehlot > wrote: > >> Hi Amit, >> Extract attached jar and try placing it in your hbase classpath >> >> P.S. Please remove the 'x' from the jar extension >> Hope this helps. >> >> >> Thanks, >&

[BEST PRACTICES]: Registering Hbase table as hive external table

2016-02-28 Thread Divya Gehlot
Hi, Has anyone worked on registering HBase tables as Hive external tables? I would like to know the best practices as well as the pros and cons. I would really appreciate it if you could refer me to a good blog, study materials, etc. If anybody has hands-on/production experience, could you please share some tips? Than

[Error] : while registering Hbase table with hive

2016-02-28 Thread Divya Gehlot
Hi, I am trying to register an HBase table with Hive and getting the following error: Error while processing statement: FAILED: Execution Error, return code 1 > from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: > MetaException(message:org.apache.hadoop.hive.serde2.SerDeException E

Phoenix and Spark - transactional queries

2016-02-18 Thread Divya Gehlot
Hi, For a Spark job that connects to Phoenix through a JDBC connection, how can we commit if the job succeeds and roll back in case of failure? Thanks, Divya

Re: Error : starting spark-shell with phoenix client jar

2016-02-18 Thread Divya Gehlot
or the same. > https://issues.apache.org/jira/browse/PHOENIX-2608 > > Regards, > Ankit Singhal > > On Thu, Feb 18, 2016 at 2:03 PM, Divya Gehlot > wrote: > >> Hi, >> I am getting following error while starting spark shell with phoenix >> clients >

Fwd: Error : starting spark-shell with phoenix client jar

2016-02-18 Thread Divya Gehlot
Hi, I am getting the following error while starting spark-shell with the Phoenix client jar: spark-shell --jars /usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar --driver-class-path /usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar --master yarn-client StackTrace : >