Can you try with the driver below?
"driver" -> "org.apache.phoenix.jdbc.PhoenixDriver",
Thanks,
Divya
On 22 November 2016 at 11:14, Dequn Zhang wrote:
> Hello, since Spark 2.x cannot use the Phoenix Spark Interpreter to load
> data, I want to use JDBC, but when I want to
Hi Mich,
Which version of Phoenix are you using?
Thanks,
Divya
On 17 October 2016 at 23:41, Mich Talebzadeh
wrote:
> Hi,
>
> I have a table marketDataHbase create on Hbase as seen below:
>
> [image: Inline images 1]
>
>
> Trying to drop it but it cannot find it
>
>
Hi Yang ,
Can you share the details in the forum? It would be useful for everybody.
Thanks,
Divya
On 19 October 2016 at 10:56, Yang Zhang wrote:
> William and James Taylor
> helped me solve this problem
>
> Thanks
>
>
Hi,
Phoenix maintains its metadata in the SYSTEM.CATALOG table.
Check your table information there.
Is it the same as your table definition?
This exception occurs when the table and the SYSTEM.CATALOG metadata
don't match.
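To check, you can query the catalog directly; a minimal sketch, assuming the table is named MYTABLE (replace with your actual table name):

```sql
-- Compare this output against your expected table definition
SELECT TABLE_SCHEM, TABLE_NAME, COLUMN_NAME, COLUMN_FAMILY
FROM SYSTEM.CATALOG
WHERE TABLE_NAME = 'MYTABLE';
```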
Thanks ,
Divya
On Jul 23, 2016 10:34 AM, "Yang Zhang"
Can you please elaborate on your query with an example?
On May 9, 2016 2:12 PM, "景涛" <844300...@qq.com> wrote:
> I want to add a variable in the Phoenix SQL script; how do I change
> the sample?
> Can anybody help me?
> Thank you very much.
>
Can you please help me with an example?
Thanks,
Divya
On 11 May 2016 at 16:55, Ankit Singhal <ankitsingha...@gmail.com> wrote:
> You can use Joins as a substitute to subqueries.
>
> On Wed, May 11, 2016 at 1:27 PM, Divya Gehlot <divya.htco...@gmail.com>
> wrote:
>
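As a sketch of that rewrite (the table and column names here are hypothetical, not from the thread):

```sql
-- Subquery form (unsupported in Spark 1.5.2 WHERE clauses):
--   SELECT * FROM orders WHERE customer_id IN
--     (SELECT id FROM customers WHERE country = 'SG');
-- Equivalent join form:
SELECT o.*
FROM orders o
JOIN customers c ON o.customer_id = c.id
WHERE c.country = 'SG';
```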
Hi,
I am using Spark 1.5.2 with Apache Phoenix 4.4
As Spark 1.5.2 doesn't support subqueries in WHERE conditions:
https://issues.apache.org/jira/browse/SPARK-4226
Is there any alternative way to find foreign key constraints?
Would really appreciate the help.
Thanks,
Divya
Hi,
Does Phoenix support renaming a table?
If yes, please help me with the syntax.
Thanks,
Divya
On 4 May 2016 at 21:31, sunday2000 <2314476...@qq.com> wrote:
> Check your javac version, and update it.
>
>
> -- Original Message ------
> *From:* "Divya Gehlot";<divya.htco...@gmail.com>;
> *Sent:* 4 May 2016 (Wednesday), 11:25 AM
Hi,
I am also getting a similar error,
Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile
when I try to build the Phoenix project using Maven.
Maven version: 3.3
Java version: 1.7_67
Phoenix: latest master downloaded from GitHub
If anybody finds the resolution
s that's pre-Apache and
> more than two years old. Use this URL instead:
> https://phoenix.apache.org/download.html
>
> Thanks,
> James
>
>
> On Thursday, April 28, 2016, Divya Gehlot <divya.htco...@gmail.com> wrote:
>
>> Hi,
>> I am trying to
binary release of
> Phoenix, or compile the latest version yourself, you will be able to see
> and use it. It does not come with the HDP 2.3.4 platform, at least last I
> checked.
>
> Regards,
>
> Josh
>
> On Sat, Apr 9, 2016 at 2:24 PM, Divya Gehlot <divya.htco...@gmail.com>
Hi,
I would like to know: is there a SQL editor apart from Squirrel?
Thanks,
Divya
.executor.extraClassPath=/usr/hdp/current/phoenix-client/phoenix-client.jar"
>
> using it without quotes:
>
> --conf
> spark.executor.extraClassPath=/usr/hdp/current/phoenix-client/phoenix-client.jar
>
>
>
>
> 2016-04-11 12:43 GMT+02:00 Divya Gehlot <divya.
h.
>
>
> Best Regards,
> Ricardo
>
>
> 2016-04-11 12:15 GMT+02:00 Divya Gehlot <divya.htco...@gmail.com>:
>
>> Hi Ricardo ,
>> If you had observed my previous post carefully,
>> you would have seen that I am already passing the jars below:
>> --jars /usr/hdp/2.3.4.0
options here:
> http://spark.apache.org/docs/latest/running-on-yarn.html
>
>
> Another option is to included phoenix on your jar with mvn assembly plugin:
>
> http://maven.apache.org/plugins/maven-assembly-plugin/
>
> Best regards,
>
> Ricardo
>
>
>
> 2016-04-11 1
this particular case, HDP 2.3.4 doesn't actually provide the
> necessary phoenix client-spark JAR by default, so your options are limited
> here. Again, I recommend filing a support ticket with Hortonworks.
>
> Regards,
>
> Josh
>
> On Sat, Apr 9, 2016 at 9:11 AM, Divya Geh
client-spark JAR by default, so your options are limited
> here. Again, I recommend filing a support ticket with Hortonworks.
>
> Regards,
>
> Josh
>
> On Sat, Apr 9, 2016 at 9:11 AM, Divya Gehlot <divya.htco...@gmail.com>
> wrote:
>
>> Hi,
>> The code which I
eption: org.apache.phoenix.spark.DefaultSource
does not allow user-specified schemas.;
Am I on the right track, or am I missing any properties?
Because of this, I am unable to proceed with Phoenix and have to find
alternate options.
Would really appreciate the help.
-- Forwarded mes
Reposting for other users' benefit.
-- Forwarded message --
From: Divya Gehlot <divya.htco...@gmail.com>
Date: 8 April 2016 at 19:54
Subject: Re: [HELP:]Save Spark Dataframe in Phoenix Table
To: Josh Mahonin <jmaho...@gmail.com>
Hi Josh,
I am doing it in the same manner.
Hi,
I have a Hortonworks Hadoop cluster with the configuration below:
Spark 1.5.2
HBase 1.1.x
Phoenix 4.4
I am able to connect to Phoenix through a JDBC connection and read the
Phoenix tables.
But while writing the data back to a Phoenix table,
I am getting the error below:
Hi,
I created a table in Phoenix with three column families and inserted the
values as shown below.
Syntax:
> CREATE TABLE TESTCF (MYKEY VARCHAR NOT NULL PRIMARY KEY, CF1.COL1 VARCHAR,
> CF2.COL2 VARCHAR, CF3.COL3 VARCHAR)
> UPSERT INTO TESTCF (MYKEY,CF1.COL1,CF2.COL2,CF3.COL3)values
>
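A complete UPSERT of that shape might look like the following; the key and values are hypothetical placeholders, not the ones from the original message:

```sql
UPSERT INTO TESTCF (MYKEY, CF1.COL1, CF2.COL2, CF3.COL3)
VALUES ('key1', 'value1', 'value2', 'value3');
```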
Hi,
I am registering a Hive table on HBase:
CREATE EXTERNAL TABLE IF NOT EXISTS TEST(NAME STRING,AGE INT)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,0:AGE")
TBLPROPERTIES ("hbase.table.name" = "TEST",
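For reference, a complete statement of this shape, assuming no table properties beyond hbase.table.name (the original message is truncated there), would be:

```sql
CREATE EXTERNAL TABLE IF NOT EXISTS TEST (NAME STRING, AGE INT)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,0:AGE")
TBLPROPERTIES ("hbase.table.name" = "TEST");
```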
t;
> No, it doesn't work for phoenix 4.6. Attached is the error I get when I
> execute 'sqlline.py :2181'
>
>
>
> Can you please give more details about the patch?
>
>
>
> Thanks,
>
> Amit.
>
>
>
> On Tue, Mar 1, 2016 at 10:39 AM, Divya Gehlot <divya
anks,
> Amit.
>
> On Tue, Mar 1, 2016 at 10:08 AM, Divya Gehlot <divya.htco...@gmail.com>
> wrote:
>
>> Hi Amit,
>> Extract attached jar and try placing it in your hbase classpath
>>
>> P.S. Please remove the 'x' from the jar extension
>> Hope this hel
Hi,
Has anyone worked on registering HBase tables in Hive?
I would like to know the best practices as well as the pros and cons of it.
Would really appreciate it if you could refer me to good blogs, study
materials, etc.
If anybody has hands-on/production experience, could you please share some
tips?
Hi,
I am trying to register an HBase table with Hive and am getting the following error:
Error while processing statement: FAILED: Execution Error, return code 1
> from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException:
> MetaException(message:org.apache.hadoop.hive.serde2.SerDeException
Hi,
For a Spark job that connects to Phoenix through a JDBC connection,
how can we commit if the job succeeds and roll back in case of
failure?
Thanks,
Divya
, please find a jira for the same.
> https://issues.apache.org/jira/browse/PHOENIX-2608
>
> Regards,
> Ankit Singhal
>
> On Thu, Feb 18, 2016 at 2:03 PM, Divya Gehlot <divya.htco...@gmail.com>
> wrote:
>
>> Hi,
>> I am getting following error while star
Hi,
I am getting following error while starting spark shell with phoenix
clients
spark-shell --jars
/usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar
--driver-class-path
/usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar
--master yarn-client
Stack trace:
>