Re: write dataframe to phoenix

2017-03-27 Thread Dhaval Modi
Hi Sateesh,

If you are running from the Spark shell, please include the Phoenix Spark
jar in the classpath (a minimal sketch follows below).

Kindly refer to the URL that Sandeep provided.
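
For example, a minimal sketch of putting the client jar on the Spark
classpath (the jar path and version are assumptions for illustration, not
taken from this thread):

  spark-shell \
    --conf "spark.driver.extraClassPath=/opt/phoenix/phoenix-4.9.0-HBase-1.1-client.jar" \
    --conf "spark.executor.extraClassPath=/opt/phoenix/phoenix-4.9.0-HBase-1.1-client.jar"

A second hedged observation on the Java snippet quoted below: its zkUrl is
"jdbc:phoenix:localhost:2181", while the Phoenix Spark documentation passes
only the bare ZooKeeper quorum ("localhost:2181"); if the connector builds
the JDBC URL itself, the extra prefix could also contribute to the "No
suitable driver found" failure. A sketch of the adjusted options, with
everything else unchanged from the quoted code:

  pos_offer_new_join.write().format("org.apache.phoenix.spark")
      .mode(SaveMode.Overwrite)
      .options(ImmutableMap.of("table", "RESULT", "zkUrl", "localhost:2181"))
      .save();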


Regards,
Dhaval

On Mar 27, 2017 21:20, "Sateesh Karuturi" 
wrote:

Thanks Sandeep for your response.

This is the exception that I am getting:

org.apache.spark.SparkException: Job aborted due to stage failure:
Task 0 in stage 3.0 failed 4 times, most recent failure: Lost task 0.3
in stage 3.0 (TID 411, ip-x-xx-xxx.ap-southeast-1.compute.internal):
java.lang.RuntimeException: java.sql.SQLException: No suitable driver
found for jdbc:phoenix:localhost:2181:/hbase-unsecure;
at org.apache.phoenix.mapreduce.PhoenixOutputFormat.getRecordWriter(PhoenixOutputFormat.java:58)
at org.apache.spark.rdd.PairRDDFunctions$anonfun$saveAsNewAPIHadoopDataset$1$anonfun$12.apply(PairRDDFunctions.scala:1030)
at org.apache.spark.rdd.PairRDDFunctions$anonfun$saveAsNewAPIHadoopDataset$1$anonfun$12.apply(PairRDDFunctions.scala:1014)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:88)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)


On Mon, Mar 27, 2017 at 8:17 PM, Sandeep Nemuri 
wrote:

> What is the error you are seeing?
>
> Ref: https://phoenix.apache.org/phoenix_spark.html
>
> df.write \
>   .format("org.apache.phoenix.spark") \
>   .mode("overwrite") \
>   .option("table", "TABLE1") \
>   .option("zkUrl", "localhost:2181") \
>   .save()
>
>
>
> On Mon, Mar 27, 2017 at 10:19 AM, Sateesh Karuturi <
> sateesh.karutu...@gmail.com> wrote:
>
>> Can anyone please help me out with how to write a dataframe to Phoenix in Java?
>>
>> here is my code:
>>
>> pos_offer_new_join.write().format("org.apache.phoenix.spark")
>>     .mode(SaveMode.Overwrite)
>>     .options(ImmutableMap.of(
>>         "driver", "org.apache.phoenix.jdbc.PhoenixDriver",
>>         "zkUrl", "jdbc:phoenix:localhost:2181",
>>         "table", "RESULT"))
>>     .save();
>>
>> but I am not able to write data to Phoenix.
>>
>>
>> Thanks.
>>
>>
>>
>
>
> --
> Regards
> Sandeep Nemuri
>


Re: Cannot upsert row_timestamp value

2017-02-25 Thread Dhaval Modi
Hi NaHeon Kim,

Please refer to mailing list:
https://lists.apache.org/thread.html/fb747661f535b0a407bf38e6b961a2c68634815189c80a7d612366b1@%3Cuser.phoenix.apache.org%3E


I also faced a similar issue.
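
One hedged observation on case 2 below: the upsert value '2017-02-26
13:48:00' is later than the 10:34 send time of the quoted mail, so if the
server clock matched, the row may have been written but masked as a future
ROW_TIMESTAMP rather than not inserted at all. A control check, sketched
here with an assumed safely-past timestamp:

  upsert into my_table (obj_id, create_dt, keyword, count)
  values ('objid', '2017-02-25 09:00:00', 'k', 100);
  select count(*) from my_table;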


Regards,
Dhaval Modi
dhavalmod...@gmail.com

On 26 February 2017 at 10:34, NaHeon Kim <honey.and...@gmail.com> wrote:

> Hi all,
>
> UPSERT into a table with a ROW_TIMESTAMP column is not possible.
> I'm using phoenix-4.8.0-hbase-1.1.
>
> Table Schema:
>
> create table my_table (
>obj_id varchar(20) not null,
>create_dt timestamp not null,
>keyword varchar(100) not null,
>count integer
>constraint pk primary key (obj_id, create_dt row_timestamp, keyword)
> );
>
>
> 1) Spark Integration
> No rows are inserted. No errors.
>
> 2) sqlline.py - timestamp column included in the upsert
> No rows are inserted.
>
> upsert into my_table (obj_id, create_dt, keyword, count)
> values ('objid', '2017-02-26 13:48:00', 'k', 100);
> 3) sqlline.py - timestamp column omitted from the upsert
> This throws an exception:
>
> upsert into my_table (obj_id, keyword, count)
> values ('objid', 'k', 100);
>
>> java.lang.ArrayIndexOutOfBoundsException: 8
>> at org.apache.phoenix.execute.MutationState.getNewRowKeyWithRowTimestamp(MutationState.java:548)
>> at org.apache.phoenix.execute.MutationState.generateMutations(MutationState.java:627)
>> at org.apache.phoenix.execute.MutationState.addRowMutations(MutationState.java:566)
>> at org.apache.phoenix.execute.MutationState.send(MutationState.java:908)
>> at org.apache.phoenix.execute.MutationState.send(MutationState.java:1329)
>> at org.apache.phoenix.execute.MutationState.commit(MutationState.java:1161)
>> at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:529)
>
> Everything works well without row_timestamp.
> Thanks in advance! : )
>


Re: ROW_TIMESTAMP weird behaviour

2017-02-07 Thread Dhaval Modi
Thanks Ankit.

My issue matches PHOENIX-3176.

An additional observation: any timestamp value after 13:00 of the same day
is not visible, even though the upsert reports "1 row affected" (see the
transcript and the note after it below).

0: jdbc:phoenix:> select * from DUMMY;
+--------------------------+
|          XXX_TS          |
+--------------------------+
| 2017-01-01 15:02:21.050  |
| 2017-01-02 15:02:21.050  |
| 2017-01-13 15:02:21.050  |
| 2017-02-06 15:02:21.050  |
| 2017-02-07 11:02:21.050  |
| 2017-02-07 11:03:21.050  |
| 2017-02-07 12:02:21.050  |
+--------------------------+
7 rows selected (0.044 seconds)

0: jdbc:phoenix:> upsert into DUMMY values('2017-02-07T12:03:21.050');
1 row affected (0.01 seconds)
0: jdbc:phoenix:> select * from DUMMY;
+--------------------------+
|          XXX_TS          |
+--------------------------+
| 2017-01-01 15:02:21.050  |
| 2017-01-02 15:02:21.050  |
| 2017-01-13 15:02:21.050  |
| 2017-02-06 15:02:21.050  |
| 2017-02-07 11:02:21.050  |
| 2017-02-07 11:03:21.050  |
| 2017-02-07 12:02:21.050  |
| 2017-02-07 12:03:21.050  |   <-- the 12:03 row appears
+--------------------------+
8 rows selected (0.047 seconds)

0: jdbc:phoenix:> upsert into DUMMY values('2017-02-07T13:03:21.050');
1 row affected (0.009 seconds)
0: jdbc:phoenix:> select * from DUMMY;
+--------------------------+
|          XXX_TS          |
+--------------------------+
| 2017-01-01 15:02:21.050  |
| 2017-01-02 15:02:21.050  |
| 2017-01-13 15:02:21.050  |
| 2017-02-06 15:02:21.050  |
| 2017-02-07 11:02:21.050  |
| 2017-02-07 11:03:21.050  |
| 2017-02-07 12:02:21.050  |
| 2017-02-07 12:03:21.050  |
+--------------------------+
8 rows selected (0.04 seconds)   <-- the 13:03 row does not appear
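
A plausible mechanism, consistent with PHOENIX-3176 and offered here as a
hedged reading rather than something confirmed in the thread: with
ROW_TIMESTAMP, Phoenix stores the column value as the HBase cell timestamp,
and scans only cover cells up to the current server time, so a row stamped
in the future is written but stays invisible until the clock catches up.
Under that reading, the 13:03 row above is simply ahead of the server clock
at query time. A sketch of the effect, with assumed times:

  -- assume the server clock reads 2017-02-07 12:30 at this moment
  upsert into DUMMY values('2017-02-07T12:03:21.050'); -- past: visible
  upsert into DUMMY values('2017-02-07T13:03:21.050'); -- future: masked
  select * from DUMMY;  -- returns the 12:03 row but not the 13:03 row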

Regards,
Dhaval Modi
dhavalmod...@gmail.com

On 7 February 2017 at 15:28, Ankit Singhal <ankitsingha...@gmail.com> wrote:

> I think you are also hitting
> https://issues.apache.org/jira/browse/PHOENIX-3176.
>
> On Tue, Feb 7, 2017 at 2:18 PM, Dhaval Modi <dhavalmod...@gmail.com>
> wrote:
>
>> Hi Pedro,
>>
>> The upserted keys are different: one is in July, the other in January.
>> 1. '2017-07-02T15:02:21.050'
>> 2. '2017-01-02T15:02:21.050'
>>
>>
>> Regards,
>> Dhaval Modi
>> dhavalmod...@gmail.com
>>
>> On 7 February 2017 at 13:18, Pedro Boado <pedro.bo...@gmail.com> wrote:
>>
>>> Hi.
>>>
>>> I don't think it's weird. That column is the PK and you've upserted the
>>> same key value twice, so the first one is an insert and the second an
>>> update.
>>>
>>> Regards.
>>>
>>>
>>>
>>> On 7 Feb 2017 04:59, "Dhaval Modi" <dhavalmod...@gmail.com> wrote:
>>>
>>>> Hi All,
>>>>
>>>> I am seeing abnormal behaviour with ROW_TIMESTAMP.
>>>>
>>>> I created a table in Phoenix as below:
>>>> CREATE TABLE DUMMY(XXX_TS TIMESTAMP NOT NULL CONSTRAINT pk PRIMARY KEY
>>>> (XXX_TS ROW_TIMESTAMP))
>>>> where "XXX_TS" is used as the ROW_TIMESTAMP.
>>>>
>>>> Now, I am trying to add data:
>>>> upsert into DUMMY values('2017-07-02T15:02:21.050');
>>>> upsert into DUMMY values('2017-01-02T15:02:21.050');
>>>>
>>>> I am only seeing one entry:
>>>> ==========================
>>>> 0: jdbc:phoenix:> select * from DUMMY;
>>>> +--------------------------+
>>>> |          XXX_TS          |
>>>> +--------------------------+
>>>> | 2017-01-02 15:02:21.050  |
>>>> +--------------------------+
>>>> 1 row selected (0.039 seconds)
>>>> ==========================
>>>>
>>>>
>>>> Additional info:
>>>> System date of HBase & Phoenix: Tue Feb  7 05:57:37 CET 2017
>>>>
>>>>
>>>> Regards,
>>>> Dhaval Modi
>>>> dhavalmod...@gmail.com
>>>>
>>>
>>
>


Re: ROW_TIMESTAMP weird behaviour

2017-02-07 Thread Dhaval Modi
Hi Pedro,

The upserted keys are different: one is in July, the other in January.
1. '2017-07-02T15:02:21.050'
2. '2017-01-02T15:02:21.050'


Regards,
Dhaval Modi
dhavalmod...@gmail.com

On 7 February 2017 at 13:18, Pedro Boado <pedro.bo...@gmail.com> wrote:

> Hi.
>
> I don't think it's weird. That column is the PK and you've upserted the
> same key value twice, so the first one is an insert and the second an
> update.
>
> Regards.
>
>
>
> On 7 Feb 2017 04:59, "Dhaval Modi" <dhavalmod...@gmail.com> wrote:
>
>> Hi All,
>>
>> I am seeing abnormal behaviour with ROW_TIMESTAMP.
>>
>> I created a table in Phoenix as below:
>> CREATE TABLE DUMMY(XXX_TS TIMESTAMP NOT NULL CONSTRAINT pk PRIMARY KEY
>> (XXX_TS ROW_TIMESTAMP))
>> where "XXX_TS" is used as the ROW_TIMESTAMP.
>>
>> Now, I am trying to add data:
>> upsert into DUMMY values('2017-07-02T15:02:21.050');
>> upsert into DUMMY values('2017-01-02T15:02:21.050');
>>
>> I am only seeing one entry:
>> ==========================
>> 0: jdbc:phoenix:> select * from DUMMY;
>> +--------------------------+
>> |          XXX_TS          |
>> +--------------------------+
>> | 2017-01-02 15:02:21.050  |
>> +--------------------------+
>> 1 row selected (0.039 seconds)
>> ==========================
>>
>>
>> Additional info:
>> System date of HBase & Phoenix: Tue Feb  7 05:57:37 CET 2017
>>
>>
>> Regards,
>> Dhaval Modi
>> dhavalmod...@gmail.com
>>
>


ROW_TIMESTAMP weird behaviour

2017-02-06 Thread Dhaval Modi
Hi All,

I am seeing abnormal behaviour with ROW_TIMESTAMP.

I created a table in Phoenix as below:
CREATE TABLE DUMMY(XXX_TS TIMESTAMP NOT NULL CONSTRAINT pk PRIMARY KEY
(XXX_TS ROW_TIMESTAMP))
where "XXX_TS" is used as the ROW_TIMESTAMP.

Now, I am trying to add data:
upsert into DUMMY values('2017-07-02T15:02:21.050');
upsert into DUMMY values('2017-01-02T15:02:21.050');

I am only seeing one entry:
==========================
0: jdbc:phoenix:> select * from DUMMY;
+--------------------------+
|          XXX_TS          |
+--------------------------+
| 2017-01-02 15:02:21.050  |
+--------------------------+
1 row selected (0.039 seconds)
==========================


Additional info:
System date of HBase & Phoenix: Tue Feb  7 05:57:37 CET 2017
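
(One hedged note: the July value is ahead of the quoted system date of
February 7, which is consistent with the future-ROW_TIMESTAMP masking
discussed in the replies above; the January row, being in the past, is the
one that remains visible.)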


Regards,
Dhaval Modi
dhavalmod...@gmail.com


Re: Unable to connect to HBase using Phoenix JDBC Driver

2017-02-02 Thread Dhaval Modi
Hi Anshuman & James,

Thanks for your input.

This issue is resolved. The root cause was forcing hbase-shaded-client as a
dependency; after removing it, the error went away.
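
For anyone hitting the same thing, a sketch of the dependency cleanup in
Maven terms. The artifact coordinates are assumptions based on the versions
quoted later in this thread, not taken from the poster's actual pom:

  <!-- keep the Phoenix client dependency -->
  <dependency>
    <groupId>org.apache.phoenix</groupId>
    <artifactId>phoenix-core</artifactId>
    <version>4.9.0-HBase-1.1</version>
  </dependency>

  <!-- remove the forced shaded client; its relocated protobuf classes
       (org.apache.hadoop.hbase.shaded.com.google.protobuf.*) clash with
       the unshaded protobuf the Phoenix client is compiled against -->
  <!--
  <dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-shaded-client</artifactId>
    <version>1.1.2</version>
  </dependency>
  -->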


Regards,
Dhaval Modi
dhavalmod...@gmail.com

On 1 February 2017 at 23:38, James Taylor <jamestay...@apache.org> wrote:

> Sounds like you're using the wrong jar on the server side. Or perhaps
> you're using a vendor distribution (in which case you'll need to use their
> supported Phoenix version). Please review the installation instructions as
> everything you need is in the one single jar.
> Thanks,
> James
>
> On Wed, Feb 1, 2017 at 9:56 AM Kumar Anshuman <anshuman.mail...@gmail.com>
> wrote:
>
>> Hi Dhaval,
>>
>> This error seems to be due to a version mismatch between the HBase and
>> Phoenix jars, or to a missing dependency on hbase-protocol-{version}.jar
>> or hbase-protobuf-{version}.jar. Check the versions of the HBase and
>> Phoenix jars in use, validate the suggested alternatives, try again, and
>> let me know whether it works.
>>
>> Regards,
>> Kumar Anshuman
>>
>>
>> On Wed, Feb 1, 2017 at 9:19 PM, Dhaval Modi <dhavalmod...@gmail.com>
>> wrote:
>>
>> Thanks Anshuman. It was really helpful.
>>
>> I added the hbase-protocol jar as a dependency and that error got
>> resolved. But now I am getting a different error:
>>
>> ++
>> Caused by: java.lang.IllegalArgumentException: Can't find method newStub
>> in org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService!
>> at org.apache.hadoop.hbase.util.Methods.call(Methods.java:45)
>> at org.apache.hadoop.hbase.protobuf.ProtobufUtil.newServiceStub(ProtobufUtil.java:1675)
>> at org.apache.hadoop.hbase.client.HTable$16.call(HTable.java:1750)
>> ... 4 more
>> Caused by: java.lang.NoSuchMethodException: org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.newStub(org.apache.hadoop.hbase.shaded.com.google.protobuf.RpcChannel)
>> at java.lang.Class.getMethod(Class.java:1786)
>> at org.apache.hadoop.hbase.util.Methods.call(Methods.java:38)
>> ... 6 more
>> ++
>>
>>
>>
>> Regards,
>> Dhaval Modi
>> dhavalmod...@gmail.com
>>
>> On 1 February 2017 at 21:07, Kumar Anshuman <anshuman.mail...@gmail.com>
>> wrote:
>>
>> Hi Dhaval,
>>
>> Have you added the hbase-protocol jar to your dependencies?
>> This error shows that you either have an incompatible version of the
>> hbase-protocol jar or are missing it from the classpath.
>> Try including the proper version (for this case, 1.1.2) of the jar, or
>> add it to HADOOP_CLASSPATH, and see if it works.
>>
>> Regards,
>> Kumar Anshuman
>>
>>
>>
>>


Re: Unable to connect to HBase using Phoenix JDBC Driver

2017-02-01 Thread Dhaval Modi
Thanks Anshuman. It was really helpful.

I added the hbase-protocol jar as a dependency and that error got resolved.

But now I am getting a different error:

++
Caused by: java.lang.IllegalArgumentException: Can't find method newStub in
org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService!
at org.apache.hadoop.hbase.util.Methods.call(Methods.java:45)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.newServiceStub(ProtobufUtil.java:1675)
at org.apache.hadoop.hbase.client.HTable$16.call(HTable.java:1750)
... 4 more
Caused by: java.lang.NoSuchMethodException: org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.newStub(org.apache.hadoop.hbase.shaded.com.google.protobuf.RpcChannel)
at java.lang.Class.getMethod(Class.java:1786)
at org.apache.hadoop.hbase.util.Methods.call(Methods.java:38)
... 6 more
++
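
(A hedged reading of this trace: the stub is being requested with the shaded
protobuf type, org.apache.hadoop.hbase.shaded.com.google.protobuf.RpcChannel,
while Phoenix's generated MetaDataService is compiled against unshaded
protobuf, so the reflective method lookup fails. That points at
hbase-shaded-client sitting on the classpath, which matches the resolution
in the 2017-02-02 reply above.)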



Regards,
Dhaval Modi
dhavalmod...@gmail.com

On 1 February 2017 at 21:07, Kumar Anshuman <anshuman.mail...@gmail.com>
wrote:

> Hi Dhaval,
>
> Have you added the hbase-protocol jar to your dependencies?
> This error shows that you either have an incompatible version of the
> hbase-protocol jar or are missing it from the classpath.
> Try including the proper version (for this case, 1.1.2) of the jar, or
> add it to HADOOP_CLASSPATH, and see if it works (a sketch follows below).
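>
> For example (a sketch; the jar path is an assumption):
>
>   export HADOOP_CLASSPATH=/path/to/hbase-protocol-1.1.2.jar:$HADOOP_CLASSPATH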
>
> Regards,
> Kumar Anshuman
>
>


Re: Unable to connect to HBase using Phoenix JDBC Driver

2017-02-01 Thread Dhaval Modi
Apologies, I missed important details.

Phoenix version: 4.9.0-HBase-1.1
HBase Version: 1.1.2

Regards,
Dhaval Modi
dhavalmod...@gmail.com

On 1 February 2017 at 14:51, Dhaval Modi <dhavalmod...@gmail.com> wrote:

> Hi All,
>
> I am trying to connect to HBase using the Phoenix JDBC Driver (from a
> simple JDBC connection manager implementation) and am getting the below
> error:
>
> =
> Caused by: java.lang.IncompatibleClassChangeError: Class
> org.apache.hadoop.hbase.protobuf.generated.ClusterIdProtos$ClusterId$Builder
> does not implement the requested interface
> org.apache.hadoop.hbase.shaded.com.google.protobuf.Message$Builder
> at org.apache.hadoop.hbase.protobuf.ProtobufUtil.mergeFrom(ProtobufUtil.java:3154)
> at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:69)
> at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:75)
> at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:105)
> at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:879)
> at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:635)
> =
>
> Based on my research, it seems I need to use some shaded variant of the
> phoenix-client jar, but I am not able to figure it out.
>
> Regards,
> Dhaval Modi
> dhavalmod...@gmail.com
>


Unable to connect to HBase using Phoenix JDBC Driver

2017-02-01 Thread Dhaval Modi
Hi All,

I am trying to connect to HBase using the Phoenix JDBC Driver (from a
simple JDBC connection manager implementation) and am getting the below
error:

=
Caused by: java.lang.IncompatibleClassChangeError: Class
org.apache.hadoop.hbase.protobuf.generated.ClusterIdProtos$ClusterId$Builder
does not implement the requested interface
org.apache.hadoop.hbase.shaded.com.google.protobuf.Message$Builder
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.mergeFrom(ProtobufUtil.java:3154)
at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:69)
at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:75)
at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:105)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:879)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:635)
=

Based on my research, it seems I need to use some shaded variant of the
phoenix-client jar, but I am not able to figure it out.
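
For context, a minimal sketch of the kind of connection being attempted.
This is not the poster's actual code; the quorum address and table name are
assumptions for illustration:

  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.ResultSet;
  import java.sql.Statement;

  public class PhoenixJdbcExample {
      public static void main(String[] args) throws Exception {
          // The Phoenix JDBC URL carries the ZooKeeper quorum.
          try (Connection conn =
                   DriverManager.getConnection("jdbc:phoenix:localhost:2181");
               Statement stmt = conn.createStatement();
               ResultSet rs = stmt.executeQuery("SELECT * FROM DUMMY")) {
              while (rs.next()) {
                  System.out.println(rs.getTimestamp(1)); // XXX_TS column
              }
          }
      }
  }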

Regards,
Dhaval Modi
dhavalmod...@gmail.com