e DDL. See http://phoenix.apache.org/language
>
> On Tue, Feb 2, 2016 at 11:54 AM, Serega Sheypak <serega.shey...@gmail.com>
> wrote:
>
>> Hm... and what is the right way to pre-split the table then?
>>
>> 2016-02-02 18:30 GMT+01:00 Mujtaba Chohan <mujt...@apache.org&
for non-salted case vs need for multiple blocks reads for
> salted one.
>
>
> On Tuesday, February 2, 2016, Serega Sheypak <serega.shey...@gmail.com>
> wrote:
>
>> > then you would be better off not using salt buckets altogether
>> rather than having 100 p
Hi, is it possible to select all dynamic columns if you don't know their
names in advance?
Example:
I have a table with a single defined column named PK, which is the primary key.
Someone runs query:
UPSERT INTO MY_TBL(PK, C1, C2, C3) VALUES('x', '1', '2', '3')
where C1, C2, C3 are dynamic columns
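For what it's worth, the dynamic columns page (linked later in this thread) shows that dynamic columns must be declared with their types in each statement, so the client has to know them up front; as far as I can tell there was no way in this Phoenix version to select dynamic columns without naming them. A minimal sketch of building such statements — the class, method names, and table name are hypothetical, only the SQL shape follows the docs:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

// Sketch: Phoenix requires dynamic columns to be declared with their types
// in each statement, so the client has to carry that metadata itself.
public class DynamicColumnSql {

    // Builds e.g. UPSERT INTO MY_TBL (PK, C1 VARCHAR, ...) VALUES (?, ?, ...)
    public static String upsert(String table, Map<String, String> dynamicCols) {
        String colList = "PK, " + dynamicCols.entrySet().stream()
                .map(e -> e.getKey() + " " + e.getValue())   // e.g. "C1 VARCHAR"
                .collect(Collectors.joining(", "));
        String params = dynamicCols.keySet().stream()
                .map(c -> "?")
                .collect(Collectors.joining(", "));
        return "UPSERT INTO " + table + " (" + colList + ") VALUES (?, " + params + ")";
    }

    // Dynamic columns are declared in parentheses after the table name on read.
    public static String select(String table, Map<String, String> dynamicCols) {
        String decls = dynamicCols.entrySet().stream()
                .map(e -> e.getKey() + " " + e.getValue())
                .collect(Collectors.joining(", "));
        return "SELECT * FROM " + table + " (" + decls + ")";
    }
}
```

So the question about discovering upserted dynamic columns stands: the client has to track which names and types were written, since the statement itself is the only place they are declared.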
and data types that were upserted so that the person writing
> queries for the first table can know what fields are available.
>
> Also would like to know if there is a way to do bulk upserting with
> dynamic fields.
>
> On Tue, Feb 2, 2016 at 3:27 PM, Serega Sheypak <serega
t; get all the other standard features.
> Thanks,
> James
>
> [1] https://phoenix.apache.org/views.html
>
> On Tue, Feb 2, 2016 at 1:42 PM, Serega Sheypak <serega.shey...@gmail.com>
> wrote:
>
>> It's a lot of overhead, you have to query twice...
>>
>&g
on+ row table, a non-salted table offers much better performance
> since it ends up reading fewer blocks from a single region.
>
> //mujtaba
>
> On Mon, Feb 1, 2016 at 1:16 PM, Serega Sheypak <serega.shey...@gmail.com>
> wrote:
>
>> Hi, here is my table DDL:
>> C
Does phoenix have something similar:
hbase org.apache.hadoop.hbase.util.RegionSplitter MY_TABLE HexStringSplit
-c 10 -f c
The command creates a pre-split table with 10 splits, where each split takes a
part of the range from 000 to f?
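Phoenix doesn't go through RegionSplitter; per the grammar page linked elsewhere in this thread, pre-splitting is done in the DDL with a SPLIT ON clause listing the split points. A hypothetical helper that computes evenly spaced one-byte hex split points (HexStringSplit-style) and renders the clause — the class and method names are my own, not a Phoenix API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

// Sketch: compute N-1 evenly spaced split points over a one-byte hex
// keyspace (00..ff), then render them as a Phoenix SPLIT ON clause.
public class SplitOnClause {

    public static List<String> hexSplitPoints(int regions) {
        List<String> points = new ArrayList<>();
        for (int i = 1; i < regions; i++) {
            // evenly divide the 0..255 range into `regions` pieces
            points.add(String.format("%02x", 256 * i / regions));
        }
        return points;
    }

    public static String render(List<String> points) {
        return "SPLIT ON (" + points.stream()
                .map(p -> "'" + p + "'")
                .collect(Collectors.joining(", ")) + ")";
    }
}
```

The rendered clause is appended to the CREATE TABLE statement, e.g. `CREATE TABLE ... ( ... ) SPLIT ON ('40', '80', 'c0')`, assuming the row key actually starts with such hex prefixes.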
2016-02-02 10:34 GMT+01:00 Serega Sheypak <serega.s
Hi, here is my table DDL:
CREATE TABLE IF NOT EXISTS id_ref
(
id1 VARCHAR NOT NULL,
value1 VARCHAR,
id2 VARCHAR NOT NULL,
value2 VARCHAR,
CONSTRAINT id_ref_pk PRIMARY KEY (id1, id2)
) IMMUTABLE_ROWS=true, SALT_BUCKETS=100, VERSIONS=1, TTL=691200
I'm trying to analyze
Hi, I'm using phoenix in web-application. My phoenix version is 4.3.0
I'm getting exceptions immediately when restarting the application.
What could it be? I'm doing a select by primary key.
Caused by: org.apache.phoenix.exception.PhoenixIOException:
java.lang.RuntimeException:
such problems.
2015-10-06 18:52 GMT+02:00 Samarth Jain <sama...@apache.org>:
> Serega, any chance you have other queries concurrently executing on the
> client? What version of Phoenix are you on?
>
>
> On Tuesday, October 6, 2015, Serega Sheypak <serega.shey...@gmail.com>
&
I'm using 4.3.0-clabs-phoenix-1.0.0 (phoenix for CDH)
2015-10-06 20:41 GMT+02:00 Serega Sheypak <serega.shey...@gmail.com>:
> Hi, It's web-app.
> There are many concurrent web-threads (100 per app). Each thread:
> 1. create connection
> 2. execute statement
> 3. close
Hi, found something similar here:
http://mail-archives.apache.org/mod_mbox/phoenix-user/201501.mbox/%3CCAAF1Jdg-E4=54e5dC3WazL=mvue8c93e4zohobiywaovs86...@mail.gmail.com%3E
My queries are:
1. insert into TABLE(KEY_COL, A, B,C) values(?, ?,?,?)
2. select A, B, C, KEY_COL from TABLE where KEY_COL=?
https://phoenix.apache.org/dynamic_columns.html
It works 100%, feel free to ask if it doesn't work for you.
2015-09-10 11:08 GMT+02:00 Hafiz Mujadid :
> Hi!
>
> How can I add a new column into an existing table ?
>
> Thanks
>
mmitted rows in memory till they are sent over to HBase.
>
>
>
> On Thu, Sep 3, 2015 at 12:19 PM, Serega Sheypak <serega.shey...@gmail.com>
> wrote:
>
>> Hi, I'm using phoenix in java web-application. App does upsert or select
>> by primary key.
>> What is
Hi, I'm here again. I wrote local unit tests and everything works perfectly. I started to
run smoke tests on production and can't get them to pass. What does this exception
mean?
My ddl is:
CREATE TABLE IF NOT EXISTS cross_id_attributes
(
crossId VARCHAR NOT NULL
CONSTRAINT cross_id_reference_pk
Hi, I wrote a ninja-application (ninjaframework.org) with Phoenix. I used my
custom testing utility to test my app. When I deployed my app to the server, I
got an exception:
java.sql.SQLException: No suitable driver found for
jdbc:phoenix:node01,node04,node05:2181
at
Thanks, I'll try.
it's a template query, it works 100% through JDBC
2015-09-01 23:26 GMT+02:00 Michael McAllister <mmcallis...@homeaway.com>:
> I think you need a comma between your column definition and your
> constraint definition.
>
>
> On Sep 1, 2015, at 2:54 PM, Serega
): Malformed connection url.
jdbc:phoenix:node01:2181,node04:2181,node05:2181
at
org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:361)
~[phoenix-core-4.3.0-clabs-phoenix-1.0.0.jar:na]
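The malformed URL above repeats the ZooKeeper port after every host; the earlier message in this thread shows the form that works, with the hosts comma-separated and the port given once at the end. A tiny hypothetical helper (the class and method names are my own) that assembles the URL in that shape:

```java
import java.util.List;

// Sketch: build a Phoenix JDBC URL as comma-separated ZK hosts followed by
// a single port, e.g. jdbc:phoenix:node01,node04,node05:2181 —
// NOT jdbc:phoenix:node01:2181,node04:2181,node05:2181.
public class PhoenixUrl {
    public static String of(List<String> zkHosts, int zkPort) {
        return "jdbc:phoenix:" + String.join(",", zkHosts) + ":" + zkPort;
    }
}
```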
2015-09-01 23:37 GMT+02:00 Serega Sheypak <serega.shey...@gmail.com>:
&
Try to use this:
https://repository.cloudera.com/cloudera/cloudera-repos/org/apache/phoenix/phoenix-core/4.3.0-clabs-phoenix-1.0.0/
2015-07-15 0:05 GMT+02:00 Veerraju Tadimeti tvvr...@gmail.com:
Hi,
I am trying to connect from phoenix 4.0.0-HBase1.0 to Cloudera 5.4.3, HBase
1.0. I am getting
Hi, here is my table
CREATE TABLE IF NOT EXISTS cross_id_reference
(
id1 VARCHAR NOT NULL,
id2 VARCHAR NOT NULL,
CONSTRAINT my_pk PRIMARY KEY (id1)
) IMMUTABLE_ROWS=true, TTL=691200;
Is it ok to set TTL and IMMUTABLE_ROWS at the same time? TTL should delete
expired
Hi, I have an immutable table with 4 columns:
id1, id2, meta_id1, meta_id2.
Primary key is id1; I select all fields from the table row by id1, so it's
the fastest way to get the data.
Second access path is to select by id2.
I have serious mixed workload. What is better:
1. use secondary index for id2
2.
it is and how to use it.
Kevin
*From:* Serega Sheypak [mailto:serega.shey...@gmail.com
serega.shey...@gmail.com]
*Sent:* Tuesday, June 23, 2015 1:27 PM
*To:* user@phoenix.apache.org
*Subject:* Re: CDH 5.4 and Phoenix
I read that labs article. I would like to get phoenix and cdh compatible
jars
Hi, I'm testing dummy code:
int result = getJdbcFacade().createConnection().prepareStatement("upsert into
unique_site_visitor (visitorId, siteId, visitTs) values ('xxxyyyzzz', 1, 2)")
.executeUpdate();
LOG.debug("executeUpdate result: {}", result); // logs: executeUpdate result: 1
Hi!, did anyone try to integrate Phoenix 1.0 with CDH 5.4.x?
I see weird installation path here:
http://www.cloudera.com/content/cloudera/en/developers/home/cloudera-labs/apache-phoenix/install-apache-phoenix-cloudera-labs.pdf
I would like to avoid it and run app using plain maven dependencies.