Re: phoenix upsert select query fails with: java.util.concurrent.ExecutionException: java.lang.ArrayIndexOutOfBoundsException

2016-12-12 Thread venkata subbarayudu
Below is the complete stack trace: INFO [main] org.apache.phoenix.iterate.BaseResultIterators: Failed to execute task during cancel java.util.concurrent.ExecutionException: java.lang.ArrayIndexOutOfBoundsException: 45 at java.util.concurrent.FutureTask.report(FutureTask.java:122) at

Possible Optimization Tips for HBase via Phoenix

2016-12-12 Thread amey hegde
I am new to the HBase and Phoenix world. I have designed and executed a MapReduce job which writes around 2.4 billion cells (rows*columns) into HBase via Phoenix in about 80 min. I have reduced "mapreduce.input.fileinputformat.split.maxsize" to 8MB to increase the number of mappers, which helped
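
A minimal sketch of that split-size tweak, assuming a plain Hadoop MapReduce driver; the job name and class below are illustrative, not taken from the original mail:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

public class SmallSplitsConfig {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "phoenix-bulk-write");
        // Cap each input split at 8 MB so the input is divided across more mappers.
        // This sets mapreduce.input.fileinputformat.split.maxsize under the hood.
        FileInputFormat.setMaxInputSplitSize(job, 8L * 1024 * 1024);
        System.out.println(job.getConfiguration().get(FileInputFormat.SPLIT_MAXSIZE));
        // The mapper class, input paths, and Phoenix output configuration from the
        // original job would still need to be set before submitting.
    }
}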

Re: How to map sparse hbase table with dynamic columns into Phoenix

2016-12-12 Thread Ciureanu Constantin
Not sure if this works for the view use-case you have, but it works for a Phoenix table. The table create statement should have just the stable columns. CREATE TABLE IF NOT EXISTS TESTC ( TIMESTAMP BIGINT NOT NULL, NAME VARCHAR NOT NULL CONSTRAINT PK PRIMARY KEY (TIMESTAMP, NAME) ); --
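
A hedged sketch of how the sparse, per-row columns could then be written and read back as Phoenix dynamic columns on that TESTC table; the connection URL and the VAL column name are assumptions for illustration, not part of the original reply:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class DynamicColumnSketch {
    public static void main(String[] args) throws Exception {
        // Assumed local Phoenix JDBC URL; adjust to your ZooKeeper quorum.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost")) {
            // The dynamic column VAL is declared inline with its type in the UPSERT,
            // even though it is not part of the CREATE TABLE above.
            try (PreparedStatement ps = conn.prepareStatement(
                    "UPSERT INTO TESTC (TIMESTAMP, NAME, VAL VARCHAR) VALUES (?, ?, ?)")) {
                ps.setLong(1, System.currentTimeMillis());
                ps.setString(2, "sensor-1");
                ps.setString(3, "42.0");
                ps.executeUpdate();
            }
            conn.commit();
            // The same dynamic column has to be re-declared after the table name
            // when reading it back.
            try (PreparedStatement ps = conn.prepareStatement(
                    "SELECT TIMESTAMP, NAME, VAL FROM TESTC (VAL VARCHAR)");
                 ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getLong(1) + " " + rs.getString(2) + " " + rs.getString(3));
                }
            }
        }
    }
}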