Hello,
It looks like we hit a deadlock with a repeating "ERROR 1120 (XCL20)" exception. At the same time, all indexes are ACTIVE.
Can you help us diagnose this more deeply?
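For context, we confirmed the indexes show as ACTIVE with a metadata query along these lines (a sketch only — the SYSTEM.CATALOG column names below are assumed from the standard Phoenix metadata schema and may differ by version):

```sql
-- Index state as recorded in Phoenix metadata (e.g. 'a' = ACTIVE).
-- Assumes standard SYSTEM.CATALOG columns (DATA_TABLE_NAME, INDEX_STATE);
-- adjust if your Phoenix version lays this out differently.
SELECT TABLE_NAME, DATA_TABLE_NAME, INDEX_STATE
FROM SYSTEM.CATALOG
WHERE DATA_TABLE_NAME = 'TBL_MARK'
  AND INDEX_STATE IS NOT NULL;
```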

java.sql.SQLException: ERROR 1120 (XCL20): Writes to table blocked until index can be updated. tableName=TBL_MARK
        at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:494)
        at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:150)
        at org.apache.phoenix.execute.MutationState.validateAndGetServerTimestamp(MutationState.java:815)
        at org.apache.phoenix.execute.MutationState.validateAll(MutationState.java:789)
        at org.apache.phoenix.execute.MutationState.send(MutationState.java:981)
        at org.apache.phoenix.execute.MutationState.send(MutationState.java:1514)
        at org.apache.phoenix.execute.MutationState.commit(MutationState.java:1337)
        at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:670)
        at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:666)
        at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
        at org.apache.phoenix.jdbc.PhoenixConnection.commit(PhoenixConnection.java:666)
        at x.persistence.phoenix.PhoenixDao.$anonfun$doUpsert$1(PhoenixDao.scala:103)
        at scala.util.Try$.apply(Try.scala:209)
        at x.persistence.phoenix.PhoenixDao.doUpsert(PhoenixDao.scala:101)
        at x.persistence.phoenix.PhoenixDao.$anonfun$batchInsert$2(PhoenixDao.scala:45)
        at x.persistence.phoenix.PhoenixDao.$anonfun$batchInsert$2$adapted(PhoenixDao.scala:45)
        at scala.collection.immutable.Stream.flatMap(Stream.scala:486)
        at scala.collection.immutable.Stream.$anonfun$flatMap$1(Stream.scala:494)
        at scala.collection.immutable.Stream.$anonfun$append$1(Stream.scala:252)
        at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1169)
        at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1159)
        at scala.collection.immutable.Stream.length(Stream.scala:309)
        at scala.collection.SeqLike.size(SeqLike.scala:105)
        at scala.collection.SeqLike.size$(SeqLike.scala:105)
        at scala.collection.AbstractSeq.size(Seq.scala:41)
        at scala.collection.TraversableOnce.toArray(TraversableOnce.scala:285)
        at scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:283)
        at scala.collection.AbstractTraversable.toArray(Traversable.scala:104)
        at x.persistence.phoenix.PhoenixDao.$anonfun$batchInsert$1(PhoenixDao.scala:45)
        at scala.util.Try$.apply(Try.scala:209)
        at x.persistence.phoenix.PhoenixDao.batchInsert(PhoenixDao.scala:45)
        at x.persistence.phoenix.PhoenixDao.$anonfun$insert$2(PhoenixDao.scala:35)
        at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:655)
        at scala.util.Success.$anonfun$map$1(Try.scala:251)
        at scala.util.Success.map(Try.scala:209)
        at scala.concurrent.Future.$anonfun$map$1(Future.scala:289)
        at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:29)
        at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:29)
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
