Re: Deleting phoenix tables and views using hbase shell

2015-06-22 Thread Arun Kumaran Sabtharishi
James, Do you see any issues in using the delete statement below as a workaround for dropping views until the JIRAs are fixed and released? delete from SYSTEM.CATALOG where table_name = 'MY_VIEW' Thanks, Arun

StackOverflowError

2015-06-22 Thread Bahubali Jain
Hi, I am running into the error below when I execute a query that has a join, group by, and order by. But when I run the same query with the hint /*+ USE_SORT_MERGE_JOIN */, it runs well. Can anybody please shed some light on this? Error: Encountered exception in sub plan [0] execution. (state=,code=0)
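A minimal sketch of where the hint goes, assuming a hypothetical schema (the orders/customers tables and column names are illustrative, not from the original report); the hint is placed immediately after the SELECT keyword:

    -- Hypothetical query shape: join + GROUP BY + ORDER BY,
    -- forced onto the sort-merge join path via the hint.
    SELECT /*+ USE_SORT_MERGE_JOIN */ o.customer_id, SUM(o.amount)
    FROM orders o
    JOIN customers c ON o.customer_id = c.id
    GROUP BY o.customer_id
    ORDER BY o.customer_id;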

Re: TO_DATE function playing up?

2015-06-22 Thread Michael McAllister
Thanks for the help Gabriel, I really appreciate it. That did the trick! Regards Mike On Jun 22, 2015, at 10:38 AM, Gabriel Reid gabriel.r...@gmail.com wrote: Hi Michael, Thanks for the very detailed explanation of your scenario. I believe the issue is in your
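Since the explanation above is cut off, here is only a general, hedged example of TO_DATE with an explicit format string (my_table and created_date are hypothetical names); by default TO_DATE expects the pattern configured by phoenix.query.dateFormat, so a second argument is needed for other input formats:

    -- Parse a timestamp string with an explicit format pattern
    -- instead of relying on the default date format.
    SELECT * FROM my_table
    WHERE created_date >= TO_DATE('2015-06-22 10:38:00', 'yyyy-MM-dd HH:mm:ss');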

Re: Deleting phoenix tables and views using hbase shell

2015-06-22 Thread James Taylor
Arun, Manually running DDL against the SYSTEM.CATALOG table can be problematic for a few reasons: - if a write failure occurs in the middle of running that statement, your SYSTEM.CATALOG table can be left in an inconsistent state. We prevent this internally by using a mutateRowsWithLocks call
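For contrast, where the DDL path still works, a minimal sketch of dropping a view so that Phoenix maintains SYSTEM.CATALOG itself rather than having catalog rows deleted by hand (the view name is illustrative):

    -- Drop the view through DDL so Phoenix updates its own metadata.
    DROP VIEW IF EXISTS MY_VIEW;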

Re: How to upsert data into dynamic columns in Phoenix.

2015-06-22 Thread Thomas D'Silva
You can upsert rows by specifying the column name and data type along with the table in the select. For the example in http://phoenix.apache.org/dynamic_columns.html UPSERT INTO TABLE (eventId, eventTime, lastGCTime INTEGER) VALUES(1, CURRENT_TIME(), 1234); On Sun, Jun 21, 2015 at 6:51 PM,
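A hedged sketch of the full round trip, following the EventLog example on the linked page (the INTEGER type for lastGCTime is illustrative): the dynamic column is declared inline with its type on write, and must be redeclared with its type when reading it back.

    -- Write into the dynamic column lastGCTime alongside the fixed columns.
    UPSERT INTO EventLog (eventId, eventTime, lastGCTime INTEGER)
    VALUES (1, CURRENT_TIME(), 1234);

    -- Read it back: dynamic columns are listed with their types after the table name.
    SELECT eventId, eventTime, lastGCTime
    FROM EventLog (lastGCTime INTEGER);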

Re: Indexing array type

2015-06-22 Thread Leon Prouger
That's interesting, I had a similar idea. But first we would like to model the array with two tables, which sounds simpler. I'll contact you if I have time to work on the issue. On Mon, Jun 22, 2015 at 12:10 AM James Taylor jamestay...@apache.org wrote: Hey Leon, I filed PHOENIX-1544 a
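A hedged sketch of the two-table idea under hypothetical names (parent, parent_items, and item are all illustrative, not from the thread): each array element becomes its own row keyed by (parent id, position), so the element value can carry an ordinary secondary index.

    -- Hypothetical parent table.
    CREATE TABLE parent (
        id BIGINT NOT NULL PRIMARY KEY,
        name VARCHAR
    );

    -- One row per array element, keyed by (parent_id, idx).
    CREATE TABLE parent_items (
        parent_id BIGINT NOT NULL,
        idx INTEGER NOT NULL,
        item VARCHAR,
        CONSTRAINT pk PRIMARY KEY (parent_id, idx)
    );

    -- The element value is now indexable like any ordinary column.
    CREATE INDEX parent_items_by_item ON parent_items (item);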

How To Count Rows In Large Phoenix Table?

2015-06-22 Thread Riesland, Zack
I had a very large Hive table that I needed in HBase. After asking around, I came to the conclusion that my best bet was to: 1 - export the Hive table to a CSV 'file'/folder on HDFS 2 - use the org.apache.phoenix.mapreduce.CsvBulkLoadTool to import the data. I found that if I tried to pass
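For step 2, a hedged sketch of a CsvBulkLoadTool invocation (the jar name, table, and HDFS path are placeholders; exact flags may differ by Phoenix version):

    # Bulk-load the exported CSV into the Phoenix table via MapReduce.
    hadoop jar phoenix-<version>-client.jar \
        org.apache.phoenix.mapreduce.CsvBulkLoadTool \
        --table MY_TABLE \
        --input /user/me/exported_hive_table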

Re: How To Count Rows In Large Phoenix Table?

2015-06-22 Thread anil gupta
For #2: hbase org.apache.hadoop.hbase.mapreduce.RowCounter TABLE_NAME On Mon, Jun 22, 2015 at 11:34 AM, Riesland, Zack zack.riesl...@sensus.com wrote: I had a very large Hive table that I needed in HBase. After asking around, I came to the conclusion that my best bet was to: 1 –

webcast on Phoenix this Thu @ 10am

2015-06-22 Thread James Taylor
If you're interested in learning more about Phoenix, tune in this Thursday @ 10am where I'll be talking about Phoenix in a free Webcast hosted by O'Reilly: http://www.oreilly.com/pub/e/3443 Thanks, James

Re: How To Count Rows In Large Phoenix Table?

2015-06-22 Thread anil gupta
For #2: You can use the RowCounter MapReduce job of HBase to count the rows of a large table. You don't need to write any code. Here is a sample command to invoke it: hbase org.apache.hadoop.hbase.mapreduce.RowCounter TABLE_NAME ~Anil On Mon, Jun 22, 2015 at 12:08 PM, Ciureanu Constantin

Does Phoenix support features similar to the checkAndXXX methods of the HTable interface?

2015-06-22 Thread guxiaobo1982

Re: StackOverflowError

2015-06-22 Thread Maryann Xue
Hi Bahubali, Could you please share your query? Thanks, Maryann On Mon, Jun 22, 2015 at 12:51 PM, Bahubali Jain bahub...@gmail.com wrote: Hi, I am running into the error below when I execute a query which has a join, group by and order by. But when I run the same query with hint /*+

Advice for UDF used in GROUP BY

2015-06-22 Thread Yiannis Gkoufas
Hi there, I was just looking for some tips on implementing a UDF to be used in a GROUP BY statement. For instance, let's say I have the table ( (A, B), C, D ), with (A, B) being the composite key. My UDF targets the field C, and I want to optimize the query: SELECT A, MYFUNCTION(C), SUM(D) FROM
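The query above is cut off; a hedged sketch of the shape it describes, where the table name T and the GROUP BY completion are my assumptions rather than the original poster's text:

    -- Hypothetical completion: group on the leading key column A plus the
    -- UDF over C, and aggregate D within each group.
    SELECT A, MYFUNCTION(C), SUM(D)
    FROM T
    GROUP BY A, MYFUNCTION(C);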