@Jain Thanks. I have checked the !autocommit output. You're right. In the console the default value is true.
@Liang I just changed the hbase-site.xml file in the phoenix bin directory. The change works for the timeout settings but not for this. :(

@all Finally I successfully solved the problem by resetting the configurations according to the official document <http://phoenix.apache.org/secondary_indexing.html>. After restarting the HBase cluster, everything works fine now. Thank you all.

On Tue, Mar 31, 2015 at 12:18 PM, Samarth Jain <[email protected]> wrote:

> If you are using sqlline.py, then by default autocommit should be on. To
> confirm, can you run !autocommit and see what the output is?
>
> On Mon, Mar 30, 2015 at 9:05 PM, 梁鹏程 <[email protected]> wrote:
>
>> Hi,
>> Is phoenix.connection.autoCommit in hbase-site.xml a server-side or a
>> client-side setting?
>> Thanks.
>>
>> Regards,
>> Ben Liang
>>
>> On Mar 31, 2015, at 10:55, 丁桂涛(桂花) <[email protected]> wrote:
>>
>> Got it. I could add this conf (phoenix.connection.autoCommit) to
>> hbase-site.xml.
>>
>> On Tue, Mar 31, 2015 at 10:49 AM, 丁桂涛(桂花) <[email protected]> wrote:
>>
>>> Thanks, Taylor, for your kind reply.
>>>
>>> Actually I'm using the phoenix console (i.e. ./sqlline.py xxxx). Is auto
>>> commit on by default? And how should I set it in the console?
>>>
>>> On Tue, Mar 31, 2015 at 10:05 AM, James Taylor <[email protected]>
>>> wrote:
>>>
>>>> I suspect you may not have auto commit on, in which case Phoenix would
>>>> attempt to buffer the results of the select in memory so that you
>>>> could commit it when you choose. Try setting auto commit on
>>>> (connection.setAutoCommit(true)) for your connection before issuing
>>>> the UPSERT SELECT statement.
>>>>
>>>> On Mon, Mar 30, 2015 at 6:48 PM, 丁桂涛(桂花) <[email protected]>
>>>> wrote:
>>>> > Hi all,
>>>> >
>>>> > I'm new to Phoenix and found that an UPSERT SELECT query ran very
>>>> > slowly and failed after about 1 hour (I have set
>>>> > phoenix.query.timeoutMs to 6000000).
>>>> >
>>>> > Versions:
>>>> >
>>>> > Phoenix: 4.3.0
>>>> > HBase: 0.98.5-hadoop2
>>>> > Hive: 0.14.0
>>>> >
>>>> > Workflow:
>>>> >
>>>> > 1. Create the HBase table in the hbase shell:
>>>> >    create 'H_TABLE', 'CF', {NUMREGIONS => 21, SPLITALGO => 'HexStringSplit'}
>>>> >
>>>> > 2. Load data into the HBase table from Hive via
>>>> >    org.apache.hadoop.hive.hbase.HBaseStorageHandler.
>>>> >
>>>> > 3. Create a Phoenix view mapped to the HBase table H_TABLE:
>>>> >    create view P_VIEW (... ...);
>>>> >
>>>> > 4. Create a locally indexed Phoenix table:
>>>> >    create table P_TABLE (... ...) SALT_BUCKETS = 42;
>>>> >    create local index idx on P_TABLE (...);
>>>> >
>>>> > 5. Insert the data into P_TABLE:
>>>> >    upsert into P_TABLE select * from P_VIEW;
>>>> >
>>>> > There are about 40M records in total, and this query failed after
>>>> > about 1 hour. Below is the error message. Any suggestions?
>>>> >
>>>> > Error: (state=08000,code=101)
>>>> > org.apache.phoenix.exception.PhoenixIOException
>>>> >         at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:108)
>>>> >         at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:589)
>>>> >         at org.apache.phoenix.iterate.ConcatResultIterator.getIterators(ConcatResultIterator.java:50)
>>>> >         at org.apache.phoenix.iterate.ConcatResultIterator.currentIterator(ConcatResultIterator.java:97)
>>>> >         at org.apache.phoenix.iterate.ConcatResultIterator.next(ConcatResultIterator.java:117)
>>>> >         at org.apache.phoenix.compile.UpsertCompiler$2.execute(UpsertCompiler.java:679)
>>>> >         at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:280)
>>>> >         at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:272)
>>>> >         at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>>>> >         at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:270)
>>>> >         at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1063)
>>>> >         at sqlline.Commands.execute(Commands.java:822)
>>>> >         at sqlline.Commands.sql(Commands.java:732)
>>>> >         at sqlline.SqlLine.dispatch(SqlLine.java:808)
>>>> >         at sqlline.SqlLine.begin(SqlLine.java:681)
>>>> >         at sqlline.SqlLine.start(SqlLine.java:398)
>>>> >         at sqlline.SqlLine.main(SqlLine.java:292)
>>>> > Caused by: java.lang.UnsupportedOperationException
>>>> >         at java.util.AbstractList.add(AbstractList.java:148)
>>>> >         at java.util.AbstractList.add(AbstractList.java:108)
>>>> >         at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:535)
>>>> >         ... 15 more
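For anyone driving Phoenix through JDBC rather than sqlline.py, James Taylor's advice above (turn auto commit on before issuing the UPSERT SELECT) can be sketched roughly like this. The ZooKeeper quorum argument is a placeholder, and the table and view names are the ones from the thread; treat it as an illustration, not a tested recipe:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class UpsertWithAutoCommit {

    /** Turn auto commit on if it is off, so Phoenix flushes the
     *  UPSERT SELECT results in batches instead of buffering the
     *  whole select in client memory. */
    static void enableAutoCommit(Connection conn) throws Exception {
        if (!conn.getAutoCommit()) {
            conn.setAutoCommit(true);
        }
    }

    static void runUpsert(Connection conn) throws Exception {
        enableAutoCommit(conn);
        try (Statement stmt = conn.createStatement()) {
            // Table and view names taken from the thread.
            stmt.executeUpdate("UPSERT INTO P_TABLE SELECT * FROM P_VIEW");
        }
    }

    public static void main(String[] args) throws Exception {
        // Pass your ZooKeeper quorum, e.g. "localhost:2181"; no-op otherwise.
        if (args.length == 0) {
            return;
        }
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:" + args[0])) {
            runUpsert(conn);
        }
    }
}
```

Note this is the client-side analogue of what sqlline's !autocommit toggles; as the thread works out, phoenix.connection.autoCommit in a client-side hbase-site.xml is another way to set the same default.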

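For anyone hitting the same wall: the server-side settings the poster reset are the ones on the linked secondary indexing page. For the Phoenix 4.x generation they look roughly like the hbase-site.xml fragment below. The property names are recalled from that document's mutable/local indexing sections, so verify them against the page for your exact Phoenix version, and restart the HBase cluster after changing them, as the poster did:

```xml
<!-- On every region server: WAL codec required for mutable
     secondary indexing. -->
<property>
  <name>hbase.regionserver.wal.codec</name>
  <value>org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec</value>
</property>

<!-- Settings the 4.x docs call for when using local indexes;
     confirm against the secondary indexing page for your version. -->
<property>
  <name>hbase.master.loadbalancer.class</name>
  <value>org.apache.phoenix.hbase.index.balancer.IndexLoadBalancer</value>
</property>
<property>
  <name>hbase.coprocessor.master.classes</name>
  <value>org.apache.phoenix.hbase.index.master.IndexMasterObserver</value>
</property>
<property>
  <name>hbase.coprocessor.regionserver.classes</name>
  <value>org.apache.hadoop.hbase.regionserver.LocalIndexMerger</value>
</property>
```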