See <https://builds.apache.org/job/Phoenix-4.0-hadoop2/155/changes>

Changes:

[jtaylor] PHOENIX-1285 Override default for histogram depth in QueryServicesTestImpl

[jtaylor] PHOENIX-1284 Override config properties for unit tests not making it to server

[jtaylor] Adding missing files from fix for PHOENIX-1284
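
For context, PHOENIX-1284's title says that config properties overridden for unit tests were not making it to the server side. The sketch below shows the general pattern such a fix concerns: an override only reaches the mini cluster's region servers if it is set on the Configuration the cluster is started with. This is a minimal illustration, not the actual fix, and the property key is hypothetical:

    // Minimal sketch, not the PHOENIX-1284 patch itself. The key
    // "example.query.maxGlobalMemoryBytes" is hypothetical, for illustration.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.HBaseTestingUtility;

    public class ServerConfigOverrideSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            // Set the override BEFORE starting the mini cluster; a property set
            // only on the client side afterwards never reaches the servers.
            conf.setLong("example.query.maxGlobalMemoryBytes", 40000L);
            HBaseTestingUtility util = new HBaseTestingUtility(conf);
            util.startMiniCluster();
            try {
                // ... run tests against the cluster here ...
            } finally {
                util.shutdownMiniCluster();
            }
        }
    }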

------------------------------------------
[...truncated 650 lines...]
        at org.apache.phoenix.iterate.ParallelIterators$3.call(ParallelIterators.java:357)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
        at java.util.concurrent.FutureTask.run(FutureTask.java:166)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:724)
Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: _LOCAL_IDX_T,e\x00\x00\x00\x00\x00\x00\x00\x00\x00,1411473040911.b941ea2e515f1eb34355d5da0040ca94.: Requested memory of 21196 bytes could not be allocated from remaining memory of 21196 bytes from global pool of 40000 bytes after waiting for 0ms.
        at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:77)
        at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:45)
        at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.postScannerOpen(BaseScannerRegionObserver.java:152)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.postScannerOpen(RegionCoprocessorHost.java:1845)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.scan(HRegionServer.java:3092)
        at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:29497)
        at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
        at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
        at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:114)
        at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:94)
        at java.lang.Thread.run(Thread.java:724)
Caused by: org.apache.phoenix.memory.InsufficientMemoryException: Requested memory of 21196 bytes could not be allocated from remaining memory of 21196 bytes from global pool of 40000 bytes after waiting for 0ms.
        at org.apache.phoenix.memory.GlobalMemoryManager.allocateBytes(GlobalMemoryManager.java:81)
        at org.apache.phoenix.memory.GlobalMemoryManager.allocate(GlobalMemoryManager.java:100)
        at org.apache.phoenix.memory.GlobalMemoryManager.allocate(GlobalMemoryManager.java:106)
        at org.apache.phoenix.cache.aggcache.SpillableGroupByCache.<init>(SpillableGroupByCache.java:150)
        at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver$GroupByCacheFactory.newCache(GroupedAggregateRegionObserver.java:362)
        at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver.scanUnordered(GroupedAggregateRegionObserver.java:397)
        at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver.doPostScannerOpen(GroupedAggregateRegionObserver.java:160)
        at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.postScannerOpen(BaseScannerRegionObserver.java:134)
        ... 8 more

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
        at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:285)
        at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:316)
        at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:164)
        at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:59)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:114)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:90)
        at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:282)
        at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:187)
        at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:182)
        at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:109)
        at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:738)
        at org.apache.phoenix.iterate.TableResultIterator.<init>(TableResultIterator.java:54)
        at org.apache.phoenix.iterate.ParallelIterators$3.call(ParallelIterators.java:362)
        at org.apache.phoenix.iterate.ParallelIterators$3.call(ParallelIterators.java:357)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
        at java.util.concurrent.FutureTask.run(FutureTask.java:166)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:724)
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: org.apache.hadoop.hbase.DoNotRetryIOException: _LOCAL_IDX_T,e\x00\x00\x00\x00\x00\x00\x00\x00\x00,1411473040911.b941ea2e515f1eb34355d5da0040ca94.: Requested memory of 21196 bytes could not be allocated from remaining memory of 21196 bytes from global pool of 40000 bytes after waiting for 0ms.
        at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:77)
        at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:45)
        at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.postScannerOpen(BaseScannerRegionObserver.java:152)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.postScannerOpen(RegionCoprocessorHost.java:1845)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.scan(HRegionServer.java:3092)
        at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:29497)
        at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
        at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
        at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:114)
        at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:94)
        at java.lang.Thread.run(Thread.java:724)
Caused by: org.apache.phoenix.memory.InsufficientMemoryException: Requested memory of 21196 bytes could not be allocated from remaining memory of 21196 bytes from global pool of 40000 bytes after waiting for 0ms.
        at org.apache.phoenix.memory.GlobalMemoryManager.allocateBytes(GlobalMemoryManager.java:81)
        at org.apache.phoenix.memory.GlobalMemoryManager.allocate(GlobalMemoryManager.java:100)
        at org.apache.phoenix.memory.GlobalMemoryManager.allocate(GlobalMemoryManager.java:106)
        at org.apache.phoenix.cache.aggcache.SpillableGroupByCache.<init>(SpillableGroupByCache.java:150)
        at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver$GroupByCacheFactory.newCache(GroupedAggregateRegionObserver.java:362)
        at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver.scanUnordered(GroupedAggregateRegionObserver.java:397)
        at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver.doPostScannerOpen(GroupedAggregateRegionObserver.java:160)
        at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.postScannerOpen(BaseScannerRegionObserver.java:134)
        ... 8 more

        at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1452)
        at org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1656)
        at org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1714)
        at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:29900)
        at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:308)
        at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:164)
        at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:59)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:114)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:90)
        at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:282)
        at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:187)
        at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:182)
        at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:109)
        at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:738)
        at org.apache.phoenix.iterate.TableResultIterator.<init>(TableResultIterator.java:54)
        at org.apache.phoenix.iterate.ParallelIterators$3.call(ParallelIterators.java:362)
        at org.apache.phoenix.iterate.ParallelIterators$3.call(ParallelIterators.java:357)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
        at java.util.concurrent.FutureTask.run(FutureTask.java:166)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:724)
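
The failure above is Phoenix's memory manager rejecting the allocation requested by the GROUP BY spill cache (SpillableGroupByCache) against a global pool of only 40000 bytes, which looks like a deliberately small test setting, with a 0ms wait leaving no window for memory to be freed. As a rough illustration of the general pattern (a simplified stand-in, not Phoenix's actual GlobalMemoryManager), a bounded pool typically waits up to a timeout for other consumers to release memory and then fails the request:

    // Illustrative sketch of a bounded memory pool with a bounded wait, the
    // pattern behind the InsufficientMemoryException above. Not Phoenix's
    // GlobalMemoryManager.
    public class BoundedMemoryPool {
        private final long maxBytes;
        private long usedBytes;

        public BoundedMemoryPool(long maxBytes) {
            this.maxBytes = maxBytes;
        }

        /** Reserve nBytes, waiting up to maxWaitMs for memory to be freed. */
        public synchronized void allocate(long nBytes, long maxWaitMs)
                throws InterruptedException {
            long deadline = System.currentTimeMillis() + maxWaitMs;
            while (maxBytes - usedBytes < nBytes) {
                long remaining = deadline - System.currentTimeMillis();
                if (remaining <= 0) {
                    // Mirrors the shape of the error message in the log above.
                    throw new IllegalStateException("Requested memory of " + nBytes
                            + " bytes could not be allocated from remaining memory of "
                            + (maxBytes - usedBytes) + " bytes from global pool of "
                            + maxBytes + " bytes after waiting for " + maxWaitMs + "ms");
                }
                wait(remaining);
            }
            usedBytes += nBytes;
        }

        /** Return nBytes to the pool and wake any waiting allocators. */
        public synchronized void free(long nBytes) {
            usedBytes -= nBytes;
            notifyAll();
        }
    }

With a pool this small and a 0ms wait there is effectively no retry window, so concurrent scans competing for the pool fail fast rather than blocking.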

Running org.apache.phoenix.end2end.GuidePostsLifeCycleIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.565 sec - in org.apache.phoenix.end2end.PhoenixEncodeDecodeIT
Running org.apache.phoenix.end2end.DecodeFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.321 sec - in org.apache.phoenix.end2end.GuidePostsLifeCycleIT
Running org.apache.phoenix.end2end.MD5FunctionIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.486 sec - in org.apache.phoenix.end2end.DecodeFunctionIT
Running org.apache.phoenix.end2end.UpsertSelectAutoCommitIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.388 sec - in org.apache.phoenix.end2end.MD5FunctionIT
Running org.apache.phoenix.end2end.ExecuteStatementsIT
Tests run: 19, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.722 sec - in org.apache.phoenix.end2end.DeleteIT
Running org.apache.phoenix.end2end.HashJoinIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.212 sec - in org.apache.phoenix.end2end.UpsertSelectAutoCommitIT
Running org.apache.phoenix.end2end.TenantSpecificViewIndexIT
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 98.129 sec - in org.apache.phoenix.end2end.index.MutableIndexIT
Running org.apache.phoenix.end2end.CoalesceFunctionIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.753 sec - in org.apache.phoenix.end2end.ExecuteStatementsIT
Running org.apache.phoenix.end2end.TimezoneOffsetFunctionIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.633 sec - in org.apache.phoenix.end2end.TimezoneOffsetFunctionIT
Running org.apache.phoenix.end2end.ArithmeticQueryIT
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.27 sec - in org.apache.phoenix.end2end.CoalesceFunctionIT
Running org.apache.phoenix.end2end.ReverseFunctionIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.352 sec - in org.apache.phoenix.end2end.TenantSpecificViewIndexIT
Running org.apache.phoenix.end2end.RegexpSplitFunctionIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.722 sec - in org.apache.phoenix.end2end.ReverseFunctionIT
Running org.apache.phoenix.end2end.SaltedViewIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.136 sec - in org.apache.phoenix.end2end.RegexpSplitFunctionIT
Running org.apache.phoenix.end2end.FirstValueFunctionIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.138 sec - in org.apache.phoenix.end2end.FirstValueFunctionIT
Running org.apache.phoenix.end2end.QueryPlanIT
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.718 sec - in org.apache.phoenix.end2end.ArithmeticQueryIT
Running org.apache.phoenix.end2end.QueryExecWithoutSCNIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.158 sec - in org.apache.phoenix.end2end.SaltedViewIT
Running org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.133 sec - in org.apache.phoenix.end2end.QueryExecWithoutSCNIT
Running org.apache.phoenix.end2end.StatementHintsIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.343 sec - in org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
Running org.apache.phoenix.end2end.UpsertBigValuesIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.099 sec - in org.apache.phoenix.end2end.StatementHintsIT
Running org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 75.6 sec - in org.apache.phoenix.end2end.InListIT
Running org.apache.phoenix.end2end.SortOrderFIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.664 sec - in org.apache.phoenix.end2end.QueryPlanIT
Running org.apache.phoenix.end2end.QueryMoreIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.541 sec - in org.apache.phoenix.end2end.UpsertBigValuesIT
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.883 sec - in org.apache.phoenix.end2end.SortOrderFIT
Running org.apache.phoenix.end2end.RegexpSubstrFunctionIT
Running org.apache.phoenix.end2end.ReverseScanIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.985 sec - in org.apache.phoenix.end2end.RegexpSubstrFunctionIT
Running org.apache.phoenix.end2end.ServerExceptionIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.301 sec - in org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
Running org.apache.phoenix.end2end.AutoCommitIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.114 sec - in org.apache.phoenix.end2end.ReverseScanIT
Running org.apache.phoenix.end2end.LastValueFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.836 sec - in org.apache.phoenix.end2end.ServerExceptionIT
Running org.apache.phoenix.end2end.RoundFloorCeilFunctionsEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.799 sec - in org.apache.phoenix.end2end.AutoCommitIT
Running org.apache.phoenix.end2end.LpadFunctionIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.156 sec - in org.apache.phoenix.end2end.LastValueFunctionIT
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.809 sec - in org.apache.phoenix.end2end.LpadFunctionIT
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.946 sec - in org.apache.phoenix.end2end.RoundFloorCeilFunctionsEnd2EndIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.721 sec - in org.apache.phoenix.end2end.QueryMoreIT
Tests run: 96, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 85.866 sec - in org.apache.phoenix.end2end.HashJoinIT

Results :

Tests in error: 
  LocalIndexIT.testLocalIndexScanJoinColumnsFromDataTable:439 ? PhoenixIO org.ap...

Tests run: 491, Failures: 0, Errors: 1, Skipped: 0

[INFO] 
[INFO] --- maven-failsafe-plugin:2.17:integration-test (NeedTheirOwnClusterTests) @ phoenix-core ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/Phoenix-4.0-hadoop2/ws/phoenix-core/target/failsafe-reports>
[INFO] parallel='none', perCoreThreadCount=true, threadCount=0, useUnlimitedThreads=false, threadCountSuites=0, threadCountClasses=0, threadCountMethods=0, parallelOptimized=true

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.phoenix.hbase.index.covered.EndToEndCoveredColumnsIndexBuilderIT
Running org.apache.phoenix.hbase.index.covered.example.EndtoEndIndexingWithCompressionIT
Running org.apache.phoenix.hbase.index.covered.example.EndToEndCoveredIndexingIT
Running org.apache.phoenix.hbase.index.covered.example.FailWithoutRetriesIT
Running org.apache.phoenix.hbase.index.balancer.IndexLoadBalancerIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.26 sec - in org.apache.phoenix.hbase.index.covered.EndToEndCoveredColumnsIndexBuilderIT
Running org.apache.phoenix.hbase.index.FailForUnsupportedHBaseVersionsIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.128 sec - in org.apache.phoenix.hbase.index.covered.example.FailWithoutRetriesIT
Running org.apache.phoenix.end2end.index.MutableIndexFailureIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.767 sec - in org.apache.phoenix.hbase.index.FailForUnsupportedHBaseVersionsIT
Running org.apache.phoenix.end2end.ContextClassloaderIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.655 sec - in org.apache.phoenix.hbase.index.covered.example.EndtoEndIndexingWithCompressionIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.921 sec - in org.apache.phoenix.hbase.index.covered.example.EndToEndCoveredIndexingIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.21 sec - in org.apache.phoenix.end2end.ContextClassloaderIT
Running org.apache.phoenix.mapreduce.CsvBulkLoadToolIT
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 102.095 sec - in org.apache.phoenix.hbase.index.balancer.IndexLoadBalancerIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 141.857 sec - in org.apache.phoenix.end2end.index.MutableIndexFailureIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 172.978 sec - in org.apache.phoenix.mapreduce.CsvBulkLoadToolIT

Results :

Tests run: 46, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-failsafe-plugin:2.17:verify (ClientManagedTimeTests) @ phoenix-core ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/Phoenix-4.0-hadoop2/ws/phoenix-core/target/failsafe-reports>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Phoenix .................................... SUCCESS [2.632s]
[INFO] Phoenix Hadoop Compatibility ...................... SUCCESS [2.016s]
[INFO] Phoenix Hadoop2 Compatibility ..................... SUCCESS [4.315s]
[INFO] Phoenix Core ...................................... FAILURE [1:24:40.317s]
[INFO] Phoenix - Flume ................................... SKIPPED
[INFO] Phoenix - Pig ..................................... SKIPPED
[INFO] Phoenix Assembly .................................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:24:49.813s
[INFO] Finished at: Tue Sep 23 11:57:29 UTC 2014
[INFO] Final Memory: 37M/368M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.17:verify (ClientManagedTimeTests) on project phoenix-core: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Phoenix-4.0-hadoop2/ws/phoenix-core/target/failsafe-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-core
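
For local iteration it is usually quicker to target only the failing test rather than resuming the whole suite, e.g. (assuming the standard failsafe it.test property applies to this module's profiles):

    mvn verify -rf :phoenix-core -Dit.test=LocalIndexIT#testLocalIndexScanJoinColumnsFromDataTable
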
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Sending artifact delta relative to Phoenix | 4.0 | Hadoop2 #153
Archived 688 artifacts
Archive block size is 32768
Received 4277 blocks and 208880424 bytes
Compression is 40.2%
Took 1 min 47 sec
Recording test results
Updating PHOENIX-1284
Updating PHOENIX-1285
