Hi Thomas,
Please ignore my last email; I wasn't running the sqlline.py command from
within the bin directory.
Now it works just fine!
Thanks!
On 18 June 2015 at 10:39, Yiannis Gkoufas johngou...@gmail.com wrote:
Hi All,
I am working on a Java Spring project where we are connecting to Phoenix
and loading a file or trying to upsert a record through Spring (MVC).
The process executes correctly for some X number of records, and then
suddenly all the threads get closed and the main thread keeps running
without an
Hello,
I'm trying to access Phoenix using Microsoft Excel. I have written a very
thin ODBC wrapper around Phoenix that I use to pass SQL from Microsoft
Query to my ODBC (which just passes the SQL along) to Phoenix.
My problem is with case-insensitivity. All my column families/qualifiers are
The ARRAY_APPEND function will appear in the 4.5.0 release.
Thanks,
James
On Thu, Jun 18, 2015 at 1:53 AM, guxiaobo1982 guxiaobo1...@qq.com wrote:
Hi Alex,
We don't have a way of globally enabling/disabling the normalization
we do for column names by uppercasing them. However, there are a
couple of features that might help you:
1) You don't need to reference column family names unless column names
are ambiguous without it.
2) You can set a
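As background on the normalization mentioned above (my addition, not part of the original reply): unquoted identifiers are upper-cased by Phoenix, while identifiers created with double quotes keep their case and must be referenced with the same quoting. A sketch with made-up table and column names:

```sql
-- Unquoted names are normalized to upper case; quoted names keep their case.
CREATE TABLE mytable (id INTEGER PRIMARY KEY, "myCol" VARCHAR);

-- The quoted column must be referenced exactly as it was created.
SELECT id, "myCol" FROM mytable;
```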
You can use Apache Kafka and write your own publisher/subscriber.
On Thu, Jun 18, 2015 at 5:46 AM, Isart Montane isart.mont...@gmail.com
wrote:
Hi,
I tried the following examples regarding the array data type:
create table artest(a integer , b integer[], constraint pk primary key(a));
upsert into artest values(1, array[1,2]);
The following statement failed :
select a, ARRAY_APPEND(b, 4) as b from artest;
Error: ERROR 605
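For reference, ARRAY_APPEND returns its array argument with the given element added at the end, so on a release that ships the function (4.5.0 or later, per James's reply elsewhere in this digest) the same sequence should succeed. A sketch, not run against a live cluster:

```sql
CREATE TABLE artest (a INTEGER, b INTEGER[], CONSTRAINT pk PRIMARY KEY (a));
UPSERT INTO artest VALUES (1, ARRAY[1,2]);

-- ARRAY_APPEND(b, 4) should yield [1, 2, 4] for the row upserted above.
SELECT a, ARRAY_APPEND(b, 4) AS b FROM artest;
```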
Hi,
I was wondering if someone has some experience on a way to transfer data
from MySQL to Phoenix/HBase in near real time. I've checked Sqoop but that
will only transfer batches
Thanks,
Hi Thomas,
unfortunately just modifying the hbase-site in the current directory of
Phoenix didn't work.
What I have now is:
<configuration>
  <property>
    <name>hbase.regionserver.wal.codec</name>
    <value>org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec</value>
  </property>
  <property>