Apache Phoenix - Timeout crawler - Build https://builds.apache.org/job/Phoenix-master-matrix/13/

2020-02-23 Thread Apache Jenkins Server
[...truncated 21 lines...]
From the log, the following test(s) timed out:

Build:
https://builds.apache.org/job/Phoenix-master-matrix/13/


Affected test class(es):
Set(['as SYSTEM'])


Build step 'Execute shell' marked build as failure
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any

Apache-Phoenix | 4.x-HBase-1.3 | Build Successful

2020-02-23 Thread Apache Jenkins Server
4.x-HBase-1.3 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.3

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.3/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.3/lastCompletedBuild/testReport/

Changes
[kadir] PHOENIX-5743 addendum for multi-column family indexes



Build times for the last couple of runs (latest build is the right-most) | Legend: blue = normal, red = test failure, gray = timeout


Build failed in Jenkins: Phoenix-4.x-HBase-1.3 #692

2020-02-23 Thread Apache Jenkins Server
See 


Changes:

[kadir] PHOENIX-5743 addendum for multi-column family indexes


--
[...truncated 143.50 KB...]
[INFO] Excluding org.apache.omid:omid-hbase-common-hbase1.x:jar:1.0.1 from the shaded jar.
[INFO] Excluding org.apache.omid:omid-timestamp-storage-hbase1.x:jar:1.0.1 from the shaded jar.
[INFO] Excluding org.apache.omid:omid-metrics:jar:1.0.1 from the shaded jar.
[INFO] Excluding org.apache.omid:omid-codahale-metrics:jar:1.0.1 from the shaded jar.
[INFO] Excluding org.jboss.netty:netty:jar:3.2.6.Final from the shaded jar.
[INFO] Excluding org.apache.commons:commons-pool2:jar:2.4.2 from the shaded jar.
[INFO] Excluding commons-daemon:commons-daemon:jar:1.0.10 from the shaded jar.
[INFO] Excluding org.apache.omid:omid-tso-server-hbase1.x:test-jar:tests:1.0.1 from the shaded jar.
[INFO] Excluding org.apache.tephra:tephra-api:jar:0.15.0-incubating from the shaded jar.
[INFO] Excluding org.apache.tephra:tephra-core:jar:0.15.0-incubating from the shaded jar.
[INFO] Excluding org.apache.tephra:tephra-hbase-compat-1.3:jar:0.15.0-incubating from the shaded jar.
[INFO] Excluding org.antlr:antlr-runtime:jar:3.5.2 from the shaded jar.
[INFO] Excluding jline:jline:jar:2.11 from the shaded jar.
[INFO] Excluding sqlline:sqlline:jar:1.5.0 from the shaded jar.
[INFO] Excluding net.sourceforge.argparse4j:argparse4j:jar:0.8.1 from the shaded jar.
[INFO] Excluding com.google.guava:guava:jar:13.0.1 from the shaded jar.
[INFO] Including joda-time:joda-time:jar:1.6 in the shaded jar.
[INFO] Excluding com.github.stephenc.findbugs:findbugs-annotations:jar:1.3.9-1 from the shaded jar.
[INFO] Excluding com.github.stephenc.jcip:jcip-annotations:jar:1.0-1 from the shaded jar.
[INFO] Excluding org.codehaus.jackson:jackson-core-asl:jar:1.9.2 from the shaded jar.
[INFO] Excluding org.codehaus.jackson:jackson-mapper-asl:jar:1.9.2 from the shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java:jar:2.5.0 from the shaded jar.
[INFO] Excluding log4j:log4j:jar:1.2.17 from the shaded jar.
[INFO] Excluding org.slf4j:slf4j-api:jar:1.6.4 from the shaded jar.
[INFO] Excluding org.iq80.snappy:snappy:jar:0.3 from the shaded jar.
[INFO] Excluding org.apache.htrace:htrace-core:jar:3.1.0-incubating from the shaded jar.
[INFO] Excluding commons-codec:commons-codec:jar:1.7 from the shaded jar.
[INFO] Excluding commons-collections:commons-collections:jar:3.2.2 from the shaded jar.
[INFO] Including org.apache.commons:commons-csv:jar:1.0 in the shaded jar.
[INFO] Excluding com.google.code.findbugs:jsr305:jar:2.0.1 from the shaded jar.
[INFO] Excluding org.apache.hbase:hbase-annotations:jar:1.3.5 from the shaded jar.
[INFO] Excluding org.apache.hbase:hbase-common:jar:1.3.5 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jetty-util:jar:6.1.26 from the shaded jar.
[INFO] Excluding org.apache.hbase:hbase-protocol:jar:1.3.5 from the shaded jar.
[INFO] Excluding org.apache.hbase:hbase-client:jar:1.3.5 from the shaded jar.
[INFO] Excluding io.netty:netty-all:jar:4.0.50.Final from the shaded jar.
[INFO] Excluding org.apache.zookeeper:zookeeper:jar:3.4.6 from the shaded jar.
[INFO] Excluding org.jruby.jcodings:jcodings:jar:1.0.8 from the shaded jar.
[INFO] Excluding org.apache.hbase:hbase-server:jar:1.3.5 from the shaded jar.
[INFO] Excluding org.apache.hbase:hbase-procedure:jar:1.3.5 from the shaded jar.
[INFO] Excluding org.apache.hbase:hbase-prefix-tree:jar:1.3.5 from the shaded jar.
[INFO] Excluding commons-httpclient:commons-httpclient:jar:3.1 from the shaded jar.
[INFO] Excluding com.sun.jersey:jersey-core:jar:1.9 from the shaded jar.
[INFO] Excluding com.sun.jersey:jersey-server:jar:1.9 from the shaded jar.
[INFO] Excluding asm:asm:jar:3.1 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jetty:jar:6.1.26 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jetty-sslengine:jar:6.1.26 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jsp-2.1:jar:6.1.14 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jsp-api-2.1:jar:6.1.14 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:servlet-api-2.5:jar:6.1.14 from the shaded jar.
[INFO] Excluding tomcat:jasper-compiler:jar:5.5.23 from the shaded jar.
[INFO] Excluding tomcat:jasper-runtime:jar:5.5.23 from the shaded jar.
[INFO] Excluding commons-el:commons-el:jar:1.0 from the shaded jar.
[INFO] Excluding org.jamon:jamon-runtime:jar:2.4.1 from the shaded jar.
[INFO] Excluding org.apache.hbase:hbase-hadoop-compat:jar:1.3.5 from the shaded jar.
[INFO] Excluding org.apache.hbase:hbase-hadoop2-compat:jar:1.3.5 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-common:jar:2.7.1 from the shaded jar.
[INFO] Excluding xmlenc:xmlenc:jar:0.52 from the shaded jar.
[INFO] Excluding commons-net:commons-net:jar:3.1 from the shaded jar.
[INFO] Excluding javax.servlet:servlet-api:jar:2.5 from the shaded jar.
[INFO] Excluding

Build failed in Jenkins: Phoenix | Master | HBase Profile » 2.0 #13

2020-02-23 Thread Apache Jenkins Server
See 


Changes:

[kadir] PHOENIX-5743 addendum for multi-column family indexes


--
[...truncated 405.86 KB...]
	at org.apache.hadoop.hbase.regionserver.HRegion$BatchOperation.visitBatchOperations(HRegion.java:3068)
	at org.apache.hadoop.hbase.regionserver.HRegion$MutationBatchOperation.checkAndPrepare(HRegion.java:3450)
	at org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:3887)
	at org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:3821)
	at org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:3812)
	at org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:3826)
	at org.apache.hadoop.hbase.regionserver.HRegion.doBatchMutate(HRegion.java:4153)
	at org.apache.hadoop.hbase.regionserver.HRegion.delete(HRegion.java:2907)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.mutate(RSRpcServices.java:2840)
	at org.apache.hadoop.hbase.client.ClientServiceCallable.doMutate(ClientServiceCallable.java:55)
	at org.apache.hadoop.hbase.client.HTable$2.rpcCall(HTable.java:498)
	at org.apache.hadoop.hbase.client.HTable$2.rpcCall(HTable.java:493)
	at org.apache.hadoop.hbase.client.RegionServerCallable.call(RegionServerCallable.java:127)
	at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:107)
	at org.apache.hadoop.hbase.client.HTable.delete(HTable.java:503)
	at org.apache.hadoop.hbase.security.access.AccessControlLists.removePermissionRecord(AccessControlLists.java:262)
	at org.apache.hadoop.hbase.security.access.AccessControlLists.removeUserPermission(AccessControlLists.java:246)
	at org.apache.hadoop.hbase.security.access.AccessController$8.run(AccessController.java:2123)
	at org.apache.hadoop.hbase.security.access.AccessController$8.run(AccessController.java:2117)
	at java.base/java.security.AccessController.doPrivileged(Native Method)
	at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1962)
	at org.apache.hadoop.security.SecurityUtil.doAsUser(SecurityUtil.java:514)
	at org.apache.hadoop.security.SecurityUtil.doAsLoginUser(SecurityUtil.java:495)
	at jdk.internal.reflect.GeneratedMethodAccessor127.invoke(Unknown Source)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.apache.hadoop.hbase.util.Methods.call(Methods.java:40)
	at org.apache.hadoop.hbase.security.User.runAsLoginUser(User.java:183)
	... 11 more

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: java.io.IOException: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=regularUser1_N78, scope=hbase:acl, family=l:regularUser2_N79, params=[table=hbase:acl,family=l:regularUser2_N79],action=WRITE)
	at org.apache.hadoop.hbase.security.User.runAsLoginUser(User.java:185)
	at org.apache.hadoop.hbase.security.access.AccessController.revoke(AccessController.java:2117)
	at org.apache.hadoop.hbase.protobuf.generated.AccessControlProtos$AccessControlService$1.revoke(AccessControlProtos.java:10031)
	at org.apache.hadoop.hbase.protobuf.generated.AccessControlProtos$AccessControlService.callMethod(AccessControlProtos.java:10192)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8106)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2409)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2391)
	at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42010)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
Caused by: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=regularUser1_N78, scope=hbase:acl, family=l:regularUser2_N79, params=[table=hbase:acl,family=l:regularUser2_N79],action=WRITE)
	at org.apache.hadoop.hbase.security.access.AccessController.preDelete(AccessController.java:1551)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$26.call(RegionCoprocessorHost.java:980)
	at

Build failed in Jenkins: Phoenix | Master | HBase Profile » 2.1 #13

2020-02-23 Thread Apache Jenkins Server
See 


Changes:

[kadir] PHOENIX-5743 addendum for multi-column family indexes


--
[...truncated 572.03 KB...]
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.008 s <<< FAILURE! - in org.apache.phoenix.end2end.ViewMetadataIT
[ERROR] org.apache.phoenix.end2end.ViewMetadataIT  Time elapsed: 0.007 s  <<< ERROR!
java.lang.RuntimeException: java.lang.OutOfMemoryError: unable to create native thread: possibly out of memory or process/resource limits reached
	at org.apache.phoenix.end2end.ViewMetadataIT.doSetup(ViewMetadataIT.java:98)
Caused by: java.lang.OutOfMemoryError: unable to create native thread: possibly out of memory or process/resource limits reached
	at org.apache.phoenix.end2end.ViewMetadataIT.doSetup(ViewMetadataIT.java:98)
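The recurring `OutOfMemoryError: unable to create native thread` in these runs usually means the build host hit an OS process/thread limit (or ran out of native memory for thread stacks), not Java heap. A minimal shell sketch for checking the relevant limits on a Linux build agent (generic diagnostics, not commands taken from this log):

```shell
# Per-user cap on processes/threads; every Java thread counts against it
ulimit -u

# System-wide ceiling on threads
cat /proc/sys/kernel/threads-max

# The 4th field of loadavg is running/total scheduling entities
# (processes plus threads) currently on the host
cat /proc/loadavg
```

If the totals are near the cap, raising `ulimit -u` for the Jenkins user or reducing test fork parallelism are the usual remedies.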

[INFO] Running org.apache.phoenix.end2end.AlterTableWithViewsIT
[INFO] Running org.apache.phoenix.end2end.DropIndexedColsIT
[ERROR] Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.006 s <<< FAILURE! - in org.apache.phoenix.end2end.DropIndexedColsIT
[ERROR] org.apache.phoenix.end2end.DropIndexedColsIT  Time elapsed: 0.005 s  <<< FAILURE!
java.lang.AssertionError: Multiple regions on asf927.gq1.ygridcore.net,37235,1582446465756

[INFO] Running org.apache.phoenix.end2end.DropTableWithViewsIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.065 s - in org.apache.phoenix.end2end.DropTableWithViewsIT
[ERROR] Tests run: 9, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 107.331 s <<< FAILURE! - in org.apache.phoenix.end2end.AlterMultiTenantTableWithViewsIT
[ERROR] testAddPKColumnToBaseTableWhoseViewsHaveIndices(org.apache.phoenix.end2end.AlterMultiTenantTableWithViewsIT)  Time elapsed: 2.562 s  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: SCHEMA1.N01: java.lang.OutOfMemoryError: unable to create native thread: possibly out of memory or process/resource limits reached
	at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:113)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:2126)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:17218)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8265)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2444)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2426)
	at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42286)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
Caused by: java.lang.RuntimeException: java.lang.OutOfMemoryError: unable to create native thread: possibly out of memory or process/resource limits reached
	at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithoutRetries(RpcRetryingCallerImpl.java:200)
	at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:267)
	at org.apache.hadoop.hbase.client.ClientScanner.loadCache(ClientScanner.java:435)
	at org.apache.hadoop.hbase.client.ClientScanner.nextWithSyncCache(ClientScanner.java:310)
	at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:595)
	at org.apache.phoenix.util.ViewUtil.findRelatedViews(ViewUtil.java:127)
	at org.apache.phoenix.util.ViewUtil.dropChildViews(ViewUtil.java:200)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1767)
	... 9 more
Caused by: java.lang.OutOfMemoryError: unable to create native thread: possibly out of memory or process/resource limits reached
	at java.base/java.lang.Thread.start0(Native Method)
	at java.base/java.lang.Thread.start(Thread.java:803)
	at java.base/java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:937)
	at java.base/java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1343)
	at org.apache.hadoop.hbase.client.ResultBoundedCompletionService.submit(ResultBoundedCompletionService.java:171)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.addCallsForCurrentReplica(ScannerCallableWithReplicas.java:329)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:191)
	at

Build failed in Jenkins: Phoenix-4.x-HBase-1.5 #281

2020-02-23 Thread Apache Jenkins Server
See 


Changes:

[kadir] PHOENIX-5743 addendum for multi-column family indexes


--
[...truncated 331.19 KB...]

[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 89.472 s - in org.apache.phoenix.end2end.index.MutableIndexExtendedIT
[INFO] Running org.apache.phoenix.end2end.index.MutableIndexFailureIT
[INFO] Running org.apache.phoenix.end2end.index.MutableIndexFailureWithNamespaceIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 255.987 s - in org.apache.phoenix.end2end.index.GlobalIndexCheckerIT
[INFO] Running org.apache.phoenix.end2end.index.MutableIndexRebuilderIT
[INFO] Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 582.643 s - in org.apache.phoenix.end2end.SystemCatalogCreationOnConnectionIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.673 s - in org.apache.phoenix.end2end.index.MutableIndexRebuilderIT
[INFO] Running org.apache.phoenix.end2end.index.MutableIndexReplicationIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 120.183 s - in org.apache.phoenix.end2end.index.MutableIndexFailureWithNamespaceIT
[INFO] Running org.apache.phoenix.end2end.index.PartialIndexRebuilderIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.92 s - in org.apache.phoenix.end2end.index.MutableIndexReplicationIT
[INFO] Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 181.828 s - in org.apache.phoenix.end2end.index.MutableIndexFailureIT
[INFO] Running org.apache.phoenix.end2end.index.PhoenixMRJobSubmitterIT
[INFO] Running org.apache.phoenix.end2end.index.ShortViewIndexIdIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.349 s - in org.apache.phoenix.end2end.index.PhoenixMRJobSubmitterIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.059 s - in org.apache.phoenix.end2end.index.ShortViewIndexIdIT
[INFO] Running org.apache.phoenix.end2end.index.txn.TxWriteFailureIT
[WARNING] Tests run: 63, Failures: 0, Errors: 0, Skipped: 18, Time elapsed: 468.613 s - in org.apache.phoenix.end2end.index.ImmutableIndexIT
[INFO] Running org.apache.phoenix.end2end.join.HashJoinCacheIT
[INFO] Running org.apache.phoenix.end2end.join.SortMergeJoinNoSpoolingIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.87 s - in org.apache.phoenix.end2end.join.HashJoinCacheIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 58.621 s - in org.apache.phoenix.end2end.index.txn.TxWriteFailureIT
[INFO] Tests run: 35, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.053 s - in org.apache.phoenix.end2end.join.SortMergeJoinNoSpoolingIT
[INFO] Running org.apache.phoenix.hbase.index.FailForUnsupportedHBaseVersionsIT
[INFO] Running org.apache.phoenix.execute.PartialCommitIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.084 s - in org.apache.phoenix.hbase.index.FailForUnsupportedHBaseVersionsIT
[INFO] Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 188.483 s - in org.apache.phoenix.end2end.index.PartialIndexRebuilderIT
[INFO] Running org.apache.phoenix.execute.UpsertSelectOverlappingBatchesIT
[INFO] Running org.apache.phoenix.iterate.RoundRobinResultIteratorWithStatsIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.198 s - in org.apache.phoenix.iterate.RoundRobinResultIteratorWithStatsIT
[INFO] Running org.apache.phoenix.jdbc.SecureUserConnectionsIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.606 s - in org.apache.phoenix.jdbc.SecureUserConnectionsIT
[ERROR] Tests run: 9, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 1,251.559 s <<< FAILURE! - in org.apache.phoenix.end2end.ParameterizedIndexUpgradeToolIT
[ERROR] testDryRunAndFailures[IndexUpgradeToolIT_mutable=false,upgrade=false,isNamespaceEnabled=true](org.apache.phoenix.end2end.ParameterizedIndexUpgradeToolIT)  Time elapsed: 110.543 s  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: TEST1.INDEX3: java.lang.OutOfMemoryError: unable to create new native thread
	at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:121)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:2106)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:17218)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8523)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2282)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2264)
	at