Apache-Phoenix | 4.x-HBase-1.3 | Build Successful

2019-07-16 Thread Apache Jenkins Server
4.x-HBase-1.3 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.3

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.3/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.3/lastCompletedBuild/testReport/

Changes
[s.kadam] PHOENIX-5385: GlobalIndexChecker coproc doesn't load on view indexes



Build times for the last couple of runs. Latest build time is the rightmost. | Legend — blue: normal, red: test failure, gray: timeout


Jenkins build is back to normal : Phoenix | Master #2458

2019-07-16 Thread Apache Jenkins Server
See 





Build failed in Jenkins: Phoenix-4.x-HBase-1.3 #473

2019-07-16 Thread Apache Jenkins Server
See 


Changes:

[s.kadam] PHOENIX-5385: GlobalIndexChecker coproc doesn't load on view indexes

--
[...truncated 102.59 KB...]
[WARNING] Tests run: 3676, Failures: 0, Errors: 0, Skipped: 2
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test (HBaseManagedTimeTests) 
@ phoenix-core ---
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] 
[INFO] Results:
[INFO] 
[INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test 
(NeedTheirOwnClusterTests) @ phoenix-core ---
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] Running 
org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
[WARNING] Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.003 
s - in 
org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
[INFO] Running org.apache.phoenix.end2end.ConnectionUtilIT
[INFO] Running 
org.apache.hadoop.hbase.regionserver.wal.WALRecoveryRegionPostOpenIT
[INFO] Running org.apache.phoenix.end2end.ColumnEncodedMutableTxStatsCollectorIT
[INFO] Running org.apache.phoenix.end2end.ConcurrentMutationsExtendedIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.822 s 
- in org.apache.hadoop.hbase.regionserver.wal.WALRecoveryRegionPostOpenIT
[INFO] Running 
org.apache.phoenix.end2end.ColumnEncodedImmutableNonTxStatsCollectorIT
[INFO] Running 
org.apache.phoenix.end2end.ColumnEncodedMutableNonTxStatsCollectorIT
[INFO] Running 
org.apache.phoenix.end2end.ColumnEncodedImmutableTxStatsCollectorIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 65.269 s 
- in org.apache.phoenix.end2end.ConnectionUtilIT
[INFO] Running org.apache.phoenix.end2end.ContextClassloaderIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.787 s 
- in org.apache.phoenix.end2end.ContextClassloaderIT
[INFO] Running org.apache.phoenix.end2end.CostBasedDecisionIT
[INFO] Running org.apache.phoenix.end2end.CountDistinctCompressionIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.712 s 
- in org.apache.phoenix.end2end.CountDistinctCompressionIT
[WARNING] Tests run: 28, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 
209.606 s - in 
org.apache.phoenix.end2end.ColumnEncodedMutableNonTxStatsCollectorIT
[WARNING] Tests run: 28, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 
214.362 s - in 
org.apache.phoenix.end2end.ColumnEncodedImmutableNonTxStatsCollectorIT
[WARNING] Tests run: 28, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 
218.169 s - in 
org.apache.phoenix.end2end.ColumnEncodedImmutableTxStatsCollectorIT
[WARNING] Tests run: 28, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 
224.214 s - in org.apache.phoenix.end2end.ColumnEncodedMutableTxStatsCollectorIT
[INFO] Running org.apache.phoenix.end2end.CsvBulkLoadToolIT
[INFO] Running org.apache.phoenix.end2end.DropSchemaIT
[INFO] Running org.apache.phoenix.end2end.IndexExtendedIT
[INFO] Running org.apache.phoenix.end2end.FlappingLocalIndexIT
[INFO] Running org.apache.phoenix.end2end.IndexBuildTimestampIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 40.617 s 
- in org.apache.phoenix.end2end.DropSchemaIT
[INFO] Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 107.067 
s - in org.apache.phoenix.end2end.IndexExtendedIT
[INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 435.401 
s - in org.apache.phoenix.end2end.ConcurrentMutationsExtendedIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 159.219 
s - in org.apache.phoenix.end2end.CsvBulkLoadToolIT
[INFO] Running org.apache.phoenix.end2end.IndexToolForPartialBuildIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.249 s 
- in org.apache.phoenix.end2end.IndexToolForPartialBuildIT
[INFO] Running org.apache.phoenix.end2end.MigrateSystemTablesToSystemNamespaceIT
[INFO] Running org.apache.phoenix.end2end.IndexToolIT
[INFO] Running 
org.apache.phoenix.end2end.IndexToolForPartialBuildWithNamespaceEnabledIT
[INFO] Running org.apache.phoenix.end2end.LocalIndexSplitMergeIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 233.186 
s - in org.apache.phoenix.end2end.FlappingLocalIndexIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 43.352 s 
- in org.apache.phoenix.end2end.IndexToolForPartialBuildWithNamespaceEnabledIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 287.789 
s - in org.apache.phoenix.end2end.IndexBuildTimestampIT
[INFO] Running 
org.apache.phoenix.end2end.NonColumnEncodedImmutableNonT

Apache-Phoenix | 4.x-HBase-1.3 | Build Successful

2019-07-16 Thread Apache Jenkins Server
4.x-HBase-1.3 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.3

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.3/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.3/lastCompletedBuild/testReport/

Changes
[larsh] PHOENIX-5290 HashJoinMoreIT is flapping.



Build times for the last couple of runs. Latest build time is the rightmost. | Legend — blue: normal, red: test failure, gray: timeout


Jenkins build is back to normal : Phoenix-4.x-HBase-1.4 #217

2019-07-16 Thread Apache Jenkins Server
See 




Apache Phoenix - Timeout crawler - Build https://builds.apache.org/job/Phoenix-master/2457/

2019-07-16 Thread Apache Jenkins Server
[...truncated 56 lines...]

Build failed in Jenkins: Phoenix | Master #2457

2019-07-16 Thread Apache Jenkins Server
See 


Changes:

[larsh] PHOENIX-5290 HashJoinMoreIT is flapping.

--
[...truncated 112.16 KB...]
[INFO] Running org.apache.phoenix.tx.TransactionIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 36.802 s 
- in org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Running org.apache.phoenix.tx.TxCheckpointIT
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 74.474 s 
- in org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Running org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 66.931 s 
- in org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.488 s 
- in org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 392.367 
s - in org.apache.phoenix.end2end.join.SubqueryIT
[INFO] Tests run: 28, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 268.989 
s - in org.apache.phoenix.tx.TransactionIT
[INFO] Tests run: 90, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 923.066 
s - in org.apache.phoenix.end2end.index.MutableIndexIT
[INFO] Tests run: 50, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 465.156 
s - in org.apache.phoenix.tx.TxCheckpointIT
[INFO] Tests run: 66, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 557.458 
s - in org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Errors: 
[ERROR]   SubqueryUsingSortMergeJoinIT>ParallelStatsDisabledIT.doSetup:60->BaseTest.setUpTestDriver:515->BaseTest.setUpTestDriver:520->BaseTest.checkClusterInitialized:434->BaseTest.setUpTestCluster:448->BaseTest.initMiniCluster:549 » Runtime
[INFO] 
[ERROR] Tests run: 3666, Failures: 0, Errors: 1, Skipped: 2
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test (HBaseManagedTimeTests) 
@ phoenix-core ---
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] 
[INFO] Results:
[INFO] 
[INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test 
(NeedTheirOwnClusterTests) @ phoenix-core ---
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] Running 
org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
[WARNING] Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.003 
s - in 
org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
[INFO] Running org.apache.phoenix.end2end.ConnectionUtilIT
[INFO] Running 
org.apache.hadoop.hbase.regionserver.wal.WALRecoveryRegionPostOpenIT
[INFO] Running org.apache.phoenix.end2end.ConcurrentMutationsExtendedIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.326 s 
- in org.apache.hadoop.hbase.regionserver.wal.WALRecoveryRegionPostOpenIT
[INFO] Running org.apache.phoenix.end2end.CountDistinctCompressionIT
[INFO] Running org.apache.phoenix.end2end.CsvBulkLoadToolIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.134 s 
- in org.apache.phoenix.end2end.CountDistinctCompressionIT
[INFO] Running org.apache.phoenix.end2end.ContextClassloaderIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 54.981 s 
- in org.apache.phoenix.end2end.ConnectionUtilIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.98 s - 
in org.apache.phoenix.end2end.ContextClassloaderIT
[INFO] Running org.apache.phoenix.end2end.CostBasedDecisionIT
[INFO] Running org.apache.phoenix.end2end.DropSchemaIT
[INFO] Running org.apache.phoenix.end2end.FlappingLocalIndexIT
[INFO] Running org.apache.phoenix.end2end.IndexBuildTimestampIT
[INFO] Running org.apache.phoenix.end2end.IndexExtendedIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.681 s 
- in org.apache.phoenix.end2end.DropSchemaIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 144.546 
s - in org.apache.phoenix.end2end.CsvBulkLoadToolIT
[INFO] Running org.apache.phoenix.end2end.IndexToolForPartialBuildIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.976 s 
- in org.apache.phoenix.end2end.IndexToolForPartialBuildIT
[INFO] Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 108.612 
s - in org.apache.phoenix.end2end.IndexExtendedIT
[INFO] Running 
org.apache.phoenix.end2end.IndexToolForPartialBuildWithNamespaceEnabledIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 45.449 s 
- in org.apache.phoenix.end2end.IndexToolForPartialBuildWithNamespaceEnabledIT
[INFO] Running org.apache.phoenix.end2end.IndexToolIT
[INFO] Running org.ap

Build failed in Jenkins: Phoenix-4.x-HBase-1.3 #472

2019-07-16 Thread Apache Jenkins Server
See 


Changes:

[larsh] PHOENIX-5290 HashJoinMoreIT is flapping.

--
[...truncated 526.52 KB...]
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 37.742 s 
- in org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Running org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 29.223 s 
- in org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 287.014 
s - in org.apache.phoenix.end2end.join.SubqueryUsingSortMergeJoinIT
[INFO] Tests run: 28, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 273.289 
s - in org.apache.phoenix.tx.TransactionIT
[INFO] Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 389.195 
s - in org.apache.phoenix.end2end.join.SubqueryIT
[INFO] Tests run: 90, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 924.031 
s - in org.apache.phoenix.end2end.index.MutableIndexIT
[INFO] Tests run: 50, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 485.316 
s - in org.apache.phoenix.tx.TxCheckpointIT
[INFO] Tests run: 66, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 554.683 
s - in org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] 
[INFO] Results:
[INFO] 
[WARNING] Tests run: 3676, Failures: 0, Errors: 0, Skipped: 2
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test (HBaseManagedTimeTests) 
@ phoenix-core ---
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] 
[INFO] Results:
[INFO] 
[INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test 
(NeedTheirOwnClusterTests) @ phoenix-core ---
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] Running 
org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
[WARNING] Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.002 
s - in 
org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
[INFO] Running org.apache.phoenix.end2end.ConnectionUtilIT
[INFO] Running 
org.apache.hadoop.hbase.regionserver.wal.WALRecoveryRegionPostOpenIT
[INFO] Running 
org.apache.phoenix.end2end.ColumnEncodedMutableNonTxStatsCollectorIT
[INFO] Running 
org.apache.phoenix.end2end.ColumnEncodedImmutableNonTxStatsCollectorIT
[INFO] Running org.apache.phoenix.end2end.ConcurrentMutationsExtendedIT
[INFO] Running 
org.apache.phoenix.end2end.ColumnEncodedImmutableTxStatsCollectorIT
[INFO] Running org.apache.phoenix.end2end.ColumnEncodedMutableTxStatsCollectorIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.802 s 
- in org.apache.hadoop.hbase.regionserver.wal.WALRecoveryRegionPostOpenIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 53.991 s 
- in org.apache.phoenix.end2end.ConnectionUtilIT
[INFO] Running org.apache.phoenix.end2end.ContextClassloaderIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.617 s 
- in org.apache.phoenix.end2end.ContextClassloaderIT
[INFO] Running org.apache.phoenix.end2end.CostBasedDecisionIT
[INFO] Running org.apache.phoenix.end2end.CountDistinctCompressionIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.243 s 
- in org.apache.phoenix.end2end.CountDistinctCompressionIT
[WARNING] Tests run: 28, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 
186.651 s - in 
org.apache.phoenix.end2end.ColumnEncodedMutableNonTxStatsCollectorIT
[WARNING] Tests run: 28, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 
186.215 s - in org.apache.phoenix.end2end.ColumnEncodedMutableTxStatsCollectorIT
[WARNING] Tests run: 28, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 
187.541 s - in 
org.apache.phoenix.end2end.ColumnEncodedImmutableNonTxStatsCollectorIT
[WARNING] Tests run: 28, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 
190.774 s - in 
org.apache.phoenix.end2end.ColumnEncodedImmutableTxStatsCollectorIT
[INFO] Running org.apache.phoenix.end2end.CsvBulkLoadToolIT
[INFO] Running org.apache.phoenix.end2end.DropSchemaIT
[INFO] Running org.apache.phoenix.end2end.FlappingLocalIndexIT
[INFO] Running org.apache.phoenix.end2end.IndexExtendedIT
[INFO] Running org.apache.phoenix.end2end.IndexBuildTimestampIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.701 s 
- in org.apache.phoenix.end2end.DropSchemaIT
[INFO] Running org.apache.phoenix.end2end.IndexToolForPartialBuildIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 152.252 
s - in org.apache.phoenix.end2end.CsvBulkLoadToolIT
[INFO] Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 104.995 
s - in org.apach

[phoenix] branch master updated: PHOENIX-5385: GlobalIndexChecker coproc doesn't load on view indexes

2019-07-16 Thread skadam
This is an automated email from the ASF dual-hosted git repository.

skadam pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/phoenix.git


The following commit(s) were added to refs/heads/master by this push:
 new fab96c3  PHOENIX-5385: GlobalIndexChecker coproc doesn't load on view 
indexes
fab96c3 is described below

commit fab96c3a9d031045e876cbea11033507e5c425ca
Author: s.kadam 
AuthorDate: Tue Jul 16 12:14:41 2019 -0700

PHOENIX-5385: GlobalIndexChecker coproc doesn't load on view indexes
---
 .../end2end/ParameterizedIndexUpgradeToolIT.java   | 31 +--
 .../apache/phoenix/end2end/index/DropColumnIT.java |  9 ++--
 .../org/apache/phoenix/execute/MutationState.java  | 60 ++
 .../phoenix/mapreduce/index/IndexUpgradeTool.java  | 23 +
 .../phoenix/query/ConnectionQueryServicesImpl.java | 13 +++--
 5 files changed, 81 insertions(+), 55 deletions(-)

diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ParameterizedIndexUpgradeToolIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ParameterizedIndexUpgradeToolIT.java
index d2d68e2..2cde910 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ParameterizedIndexUpgradeToolIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ParameterizedIndexUpgradeToolIT.java
@@ -25,6 +25,7 @@ import org.apache.phoenix.hbase.index.IndexRegionObserver;
 import org.apache.phoenix.hbase.index.Indexer;
 import org.apache.phoenix.index.GlobalIndexChecker;
 import org.apache.phoenix.jdbc.PhoenixConnection;
+
 import org.apache.phoenix.mapreduce.index.IndexTool;
 import org.apache.phoenix.mapreduce.index.IndexUpgradeTool;
 import org.apache.phoenix.query.BaseTest;
@@ -60,11 +61,11 @@ import static 
org.apache.phoenix.mapreduce.index.IndexUpgradeTool.UPGRADE_OP;
 @RunWith(Parameterized.class)
 @Category(NeedsOwnMiniClusterTest.class)
 public class ParameterizedIndexUpgradeToolIT extends BaseTest {
-//Please do not remove/uncomment commented items in the list until 
PHOENIX-5385 is fixed
 private static final String [] INDEXES_LIST = {"TEST.INDEX1", 
"TEST.INDEX2", "TEST1.INDEX3",
-"TEST1.INDEX2","TEST1.INDEX1","TEST.INDEX3"/*, "_IDX_TEST.MOCK1", 
"_IDX_TEST1.MOCK2"*/};
-private static final String [] INDEXES_LIST_NAMESPACE = {"TEST:INDEX1", 
"TEST:INDEX2", "TEST1:INDEX3",
-"TEST1:INDEX2","TEST1:INDEX1","TEST:INDEX3"/*, "TEST:_IDX_MOCK1", 
"TEST1:_IDX_MOCK2"*/};
+"TEST1.INDEX2","TEST1.INDEX1","TEST.INDEX3", "_IDX_TEST.MOCK1", 
"_IDX_TEST1.MOCK2"};
+private static final String [] INDEXES_LIST_NAMESPACE = {"TEST:INDEX1", 
"TEST:INDEX2"
+, "TEST1:INDEX3", "TEST1:INDEX2","TEST1:INDEX1"
+, "TEST:INDEX3", "TEST:_IDX_MOCK1", "TEST1:_IDX_MOCK2"};
 private static final String [] TABLE_LIST = 
{"TEST.MOCK1","TEST1.MOCK2","TEST.MOCK3"};
 private static final String [] TABLE_LIST_NAMESPACE = 
{"TEST:MOCK1","TEST1:MOCK2","TEST:MOCK3"};
 
@@ -99,8 +100,8 @@ public class ParameterizedIndexUpgradeToolIT extends 
BaseTest {
 .getQueryServices();
 admin = queryServices.getAdmin();
 iut = new IndexUpgradeTool(upgrade ? UPGRADE_OP : ROLLBACK_OP, 
INPUT_LIST,
-null, "/tmp/index_upgrade_" + 
UUID.randomUUID().toString(),true,
-Mockito.mock(IndexTool.class));
+null, "/tmp/index_upgrade_" + 
UUID.randomUUID().toString(),true, Mockito.mock(
+IndexTool.class));
 iut.setConf(getUtility().getConfiguration());
 iut.setTest(true);
 if (!mutable) {
@@ -141,8 +142,7 @@ public class ParameterizedIndexUpgradeToolIT extends 
BaseTest {
 + "PRIMARY KEY, name varchar, city varchar, phone 
bigint)"+tableDDLOptions);
 conn.createStatement().execute("CREATE TABLE TEST.MOCK3 (id bigint NOT 
NULL "
 + "PRIMARY KEY, name varchar, age bigint)"+tableDDLOptions);
-/*
-Please do not remove/uncomment commented code until PHOENIX-5385 is 
fixed
+
 //views
 conn.createStatement().execute("CREATE VIEW TEST.MOCK1_VIEW 
(view_column varchar) "
 + "AS SELECT * FROM TEST.MOCK1 WHERE a.name = 'a'");
@@ -158,7 +158,7 @@ public class ParameterizedIndexUpgradeToolIT extends 
BaseTest {
 conn.createStatement().execute("CREATE INDEX MOCK2_INDEX1 ON 
TEST1.MOCK2_VIEW "
 + "(state, city)");
 conn.createStatement().execute("CREATE INDEX MOCK1_INDEX3 ON 
TEST.MOCK1_VIEW "
-+ "(view_column)");*/
++ "(view_column)");
 //indexes
 conn.createStatement().execute("CREATE INDEX INDEX1 ON TEST.MOCK1 
(sal, a.name)");
 conn.createStatement().execute("CREATE INDEX INDEX2 ON TEST.MOCK1 
(a.name)");
@@ -190,7 +190,8 @@ public class ParameterizedIndexUpgradeToolIT extends 
BaseTest {
 }
 }
 
-private void checkNewIndexingCoprocessors(String [] 

[phoenix] branch 4.x-HBase-1.5 updated: PHOENIX-5385: GlobalIndexChecker coproc doesn't load on view indexes

2019-07-16 Thread skadam
This is an automated email from the ASF dual-hosted git repository.

skadam pushed a commit to branch 4.x-HBase-1.5
in repository https://gitbox.apache.org/repos/asf/phoenix.git


The following commit(s) were added to refs/heads/4.x-HBase-1.5 by this push:
 new ac2faec  PHOENIX-5385: GlobalIndexChecker coproc doesn't load on view 
indexes
ac2faec is described below

commit ac2faec980f84f2cc395c1fffd6b2678e56900c8
Author: s.kadam 
AuthorDate: Tue Jul 16 12:34:01 2019 -0700

PHOENIX-5385: GlobalIndexChecker coproc doesn't load on view indexes
---
 .../end2end/ParameterizedIndexUpgradeToolIT.java   | 31 +++--
 .../apache/phoenix/end2end/index/DropColumnIT.java |  8 ++--
 .../org/apache/phoenix/execute/MutationState.java  | 53 ++
 .../phoenix/mapreduce/index/IndexUpgradeTool.java  | 38 ++--
 .../phoenix/query/ConnectionQueryServicesImpl.java | 13 ++
 5 files changed, 84 insertions(+), 59 deletions(-)

diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ParameterizedIndexUpgradeToolIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ParameterizedIndexUpgradeToolIT.java
index 400df93..24c0f39 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ParameterizedIndexUpgradeToolIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ParameterizedIndexUpgradeToolIT.java
@@ -60,11 +60,11 @@ import static 
org.apache.phoenix.mapreduce.index.IndexUpgradeTool.UPGRADE_OP;
 @RunWith(Parameterized.class)
 @Category(NeedsOwnMiniClusterTest.class)
 public class ParameterizedIndexUpgradeToolIT extends BaseTest {
-//Please do not remove/uncomment commented items in the list until 
PHOENIX-5385 is fixed
 private static final String [] INDEXES_LIST = {"TEST.INDEX1", 
"TEST.INDEX2", "TEST1.INDEX3",
-"TEST1.INDEX2","TEST1.INDEX1","TEST.INDEX3"/*, "_IDX_TEST.MOCK1", 
"_IDX_TEST1.MOCK2"*/};
-private static final String [] INDEXES_LIST_NAMESPACE = {"TEST:INDEX1", 
"TEST:INDEX2", "TEST1:INDEX3",
-"TEST1:INDEX2","TEST1:INDEX1","TEST:INDEX3"/*, "TEST:_IDX_MOCK1", 
"TEST1:_IDX_MOCK2"*/};
+"TEST1.INDEX2","TEST1.INDEX1","TEST.INDEX3", "_IDX_TEST.MOCK1", 
"_IDX_TEST1.MOCK2"};
+private static final String [] INDEXES_LIST_NAMESPACE = {"TEST:INDEX1", 
"TEST:INDEX2",
+"TEST1:INDEX3", "TEST1:INDEX2","TEST1:INDEX1","TEST:INDEX3", 
"TEST:_IDX_MOCK1",
+"TEST1:_IDX_MOCK2"};
 private static final String [] TABLE_LIST = 
{"TEST.MOCK1","TEST1.MOCK2","TEST.MOCK3"};
 private static final String [] TABLE_LIST_NAMESPACE = 
{"TEST:MOCK1","TEST1:MOCK2","TEST:MOCK3"};
 
@@ -99,7 +99,8 @@ public class ParameterizedIndexUpgradeToolIT extends BaseTest 
{
 .getQueryServices();
 admin = queryServices.getAdmin();
 iut = new IndexUpgradeTool(upgrade ? UPGRADE_OP : ROLLBACK_OP, 
INPUT_LIST,
-null, "/tmp/index_upgrade_" + 
UUID.randomUUID().toString(),true, Mockito.mock(IndexTool.class));
+null, "/tmp/index_upgrade_" + UUID.randomUUID().toString(),
+true, Mockito.mock(IndexTool.class));
 iut.setConf(getUtility().getConfiguration());
 iut.setTest(true);
 if (!mutable) {
@@ -141,9 +142,8 @@ public class ParameterizedIndexUpgradeToolIT extends 
BaseTest {
 conn.createStatement().execute("CREATE TABLE TEST.MOCK3 (id bigint NOT 
NULL "
 + "PRIMARY KEY, name varchar, age bigint)"+tableDDLOptions);
 
-//Please do not remove/uncomment commented code until PHOENIX-5385 is 
fixed
 //views
-/*conn.createStatement().execute("CREATE VIEW TEST.MOCK1_VIEW 
(view_column varchar) "
+conn.createStatement().execute("CREATE VIEW TEST.MOCK1_VIEW 
(view_column varchar) "
 + "AS SELECT * FROM TEST.MOCK1 WHERE a.name = 'a'");
 conn.createStatement().execute("CREATE VIEW TEST.MOCK1_VIEW1 
(view_column varchar,"
 + " zip varchar) AS SELECT * FROM TEST.MOCK1 WHERE a.name = 
'a'");
@@ -157,7 +157,7 @@ public class ParameterizedIndexUpgradeToolIT extends 
BaseTest {
 conn.createStatement().execute("CREATE INDEX MOCK2_INDEX1 ON 
TEST1.MOCK2_VIEW "
 + "(state, city)");
 conn.createStatement().execute("CREATE INDEX MOCK1_INDEX3 ON 
TEST.MOCK1_VIEW "
-+ "(view_column)");*/
++ "(view_column)");
 //indexes
 conn.createStatement().execute("CREATE INDEX INDEX1 ON TEST.MOCK1 
(sal, a.name)");
 conn.createStatement().execute("CREATE INDEX INDEX2 ON TEST.MOCK1 
(a.name)");
@@ -189,7 +189,8 @@ public class ParameterizedIndexUpgradeToolIT extends 
BaseTest {
 }
 }
 
-private void checkNewIndexingCoprocessors(String [] indexList, String [] 
tableList) throws IOException {
+private void checkNewIndexingCoprocessors(String [] indexList, String [] 
tableList)
+throws IOException {
 if (mutable) {
 f

[phoenix] branch 4.x-HBase-1.4 updated: PHOENIX-5385: GlobalIndexChecker coproc doesn't load on view indexes

2019-07-16 Thread skadam
This is an automated email from the ASF dual-hosted git repository.

skadam pushed a commit to branch 4.x-HBase-1.4
in repository https://gitbox.apache.org/repos/asf/phoenix.git


The following commit(s) were added to refs/heads/4.x-HBase-1.4 by this push:
 new ca13453  PHOENIX-5385: GlobalIndexChecker coproc doesn't load on view 
indexes
ca13453 is described below

commit ca134532b49a5a7ba77e08b3fcd84486216cd8ee
Author: s.kadam 
AuthorDate: Tue Jul 16 12:34:01 2019 -0700

PHOENIX-5385: GlobalIndexChecker coproc doesn't load on view indexes
---
 .../end2end/ParameterizedIndexUpgradeToolIT.java   | 31 +++--
 .../apache/phoenix/end2end/index/DropColumnIT.java |  8 ++--
 .../org/apache/phoenix/execute/MutationState.java  | 53 ++
 .../phoenix/mapreduce/index/IndexUpgradeTool.java  | 38 ++--
 .../phoenix/query/ConnectionQueryServicesImpl.java | 13 ++
 5 files changed, 84 insertions(+), 59 deletions(-)

diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ParameterizedIndexUpgradeToolIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ParameterizedIndexUpgradeToolIT.java
index 400df93..24c0f39 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ParameterizedIndexUpgradeToolIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ParameterizedIndexUpgradeToolIT.java
@@ -60,11 +60,11 @@ import static 
org.apache.phoenix.mapreduce.index.IndexUpgradeTool.UPGRADE_OP;
 @RunWith(Parameterized.class)
 @Category(NeedsOwnMiniClusterTest.class)
 public class ParameterizedIndexUpgradeToolIT extends BaseTest {
-//Please do not remove/uncomment commented items in the list until 
PHOENIX-5385 is fixed
 private static final String [] INDEXES_LIST = {"TEST.INDEX1", 
"TEST.INDEX2", "TEST1.INDEX3",
-"TEST1.INDEX2","TEST1.INDEX1","TEST.INDEX3"/*, "_IDX_TEST.MOCK1", 
"_IDX_TEST1.MOCK2"*/};
-private static final String [] INDEXES_LIST_NAMESPACE = {"TEST:INDEX1", 
"TEST:INDEX2", "TEST1:INDEX3",
-"TEST1:INDEX2","TEST1:INDEX1","TEST:INDEX3"/*, "TEST:_IDX_MOCK1", 
"TEST1:_IDX_MOCK2"*/};
+"TEST1.INDEX2","TEST1.INDEX1","TEST.INDEX3", "_IDX_TEST.MOCK1", 
"_IDX_TEST1.MOCK2"};
+private static final String [] INDEXES_LIST_NAMESPACE = {"TEST:INDEX1", 
"TEST:INDEX2",
+"TEST1:INDEX3", "TEST1:INDEX2","TEST1:INDEX1","TEST:INDEX3", 
"TEST:_IDX_MOCK1",
+"TEST1:_IDX_MOCK2"};
 private static final String [] TABLE_LIST = 
{"TEST.MOCK1","TEST1.MOCK2","TEST.MOCK3"};
 private static final String [] TABLE_LIST_NAMESPACE = 
{"TEST:MOCK1","TEST1:MOCK2","TEST:MOCK3"};
 
@@ -99,7 +99,8 @@ public class ParameterizedIndexUpgradeToolIT extends BaseTest 
{
 .getQueryServices();
 admin = queryServices.getAdmin();
 iut = new IndexUpgradeTool(upgrade ? UPGRADE_OP : ROLLBACK_OP, 
INPUT_LIST,
-null, "/tmp/index_upgrade_" + 
UUID.randomUUID().toString(),true, Mockito.mock(IndexTool.class));
+null, "/tmp/index_upgrade_" + UUID.randomUUID().toString(),
+true, Mockito.mock(IndexTool.class));
 iut.setConf(getUtility().getConfiguration());
 iut.setTest(true);
 if (!mutable) {
@@ -141,9 +142,8 @@ public class ParameterizedIndexUpgradeToolIT extends 
BaseTest {
 conn.createStatement().execute("CREATE TABLE TEST.MOCK3 (id bigint NOT 
NULL "
 + "PRIMARY KEY, name varchar, age bigint)"+tableDDLOptions);
 
-//Please do not remove/uncomment commented code until PHOENIX-5385 is 
fixed
 //views
-/*conn.createStatement().execute("CREATE VIEW TEST.MOCK1_VIEW 
(view_column varchar) "
+conn.createStatement().execute("CREATE VIEW TEST.MOCK1_VIEW 
(view_column varchar) "
 + "AS SELECT * FROM TEST.MOCK1 WHERE a.name = 'a'");
 conn.createStatement().execute("CREATE VIEW TEST.MOCK1_VIEW1 
(view_column varchar,"
 + " zip varchar) AS SELECT * FROM TEST.MOCK1 WHERE a.name = 
'a'");
@@ -157,7 +157,7 @@ public class ParameterizedIndexUpgradeToolIT extends 
BaseTest {
 conn.createStatement().execute("CREATE INDEX MOCK2_INDEX1 ON 
TEST1.MOCK2_VIEW "
 + "(state, city)");
 conn.createStatement().execute("CREATE INDEX MOCK1_INDEX3 ON 
TEST.MOCK1_VIEW "
-+ "(view_column)");*/
++ "(view_column)");
 //indexes
 conn.createStatement().execute("CREATE INDEX INDEX1 ON TEST.MOCK1 
(sal, a.name)");
 conn.createStatement().execute("CREATE INDEX INDEX2 ON TEST.MOCK1 
(a.name)");
@@ -189,7 +189,8 @@ public class ParameterizedIndexUpgradeToolIT extends 
BaseTest {
 }
 }
 
-private void checkNewIndexingCoprocessors(String [] indexList, String [] 
tableList) throws IOException {
+private void checkNewIndexingCoprocessors(String [] indexList, String [] 
tableList)
+throws IOException {
 if (mutable) {
 f

[phoenix] branch 4.x-HBase-1.3 updated: PHOENIX-5385: GlobalIndexChecker coproc doesn't load on view indexes

2019-07-16 Thread skadam
This is an automated email from the ASF dual-hosted git repository.

skadam pushed a commit to branch 4.x-HBase-1.3
in repository https://gitbox.apache.org/repos/asf/phoenix.git


The following commit(s) were added to refs/heads/4.x-HBase-1.3 by this push:
 new e261fbe  PHOENIX-5385: GlobalIndexChecker coproc doesn't load on view 
indexes
e261fbe is described below

commit e261fbe776e4917484674929cda1d660187353dd
Author: s.kadam 
AuthorDate: Tue Jul 16 12:34:01 2019 -0700

PHOENIX-5385: GlobalIndexChecker coproc doesn't load on view indexes
---
 .../end2end/ParameterizedIndexUpgradeToolIT.java   | 31 +++--
 .../apache/phoenix/end2end/index/DropColumnIT.java |  8 ++--
 .../org/apache/phoenix/execute/MutationState.java  | 53 ++
 .../phoenix/mapreduce/index/IndexUpgradeTool.java  | 38 ++--
 .../phoenix/query/ConnectionQueryServicesImpl.java | 13 ++
 5 files changed, 84 insertions(+), 59 deletions(-)

diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ParameterizedIndexUpgradeToolIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ParameterizedIndexUpgradeToolIT.java
index 400df93..24c0f39 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ParameterizedIndexUpgradeToolIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ParameterizedIndexUpgradeToolIT.java
@@ -60,11 +60,11 @@ import static 
org.apache.phoenix.mapreduce.index.IndexUpgradeTool.UPGRADE_OP;
 @RunWith(Parameterized.class)
 @Category(NeedsOwnMiniClusterTest.class)
 public class ParameterizedIndexUpgradeToolIT extends BaseTest {
-//Please do not remove/uncomment commented items in the list until 
PHOENIX-5385 is fixed
 private static final String [] INDEXES_LIST = {"TEST.INDEX1", 
"TEST.INDEX2", "TEST1.INDEX3",
-"TEST1.INDEX2","TEST1.INDEX1","TEST.INDEX3"/*, "_IDX_TEST.MOCK1", 
"_IDX_TEST1.MOCK2"*/};
-private static final String [] INDEXES_LIST_NAMESPACE = {"TEST:INDEX1", 
"TEST:INDEX2", "TEST1:INDEX3",
-"TEST1:INDEX2","TEST1:INDEX1","TEST:INDEX3"/*, "TEST:_IDX_MOCK1", 
"TEST1:_IDX_MOCK2"*/};
+"TEST1.INDEX2","TEST1.INDEX1","TEST.INDEX3", "_IDX_TEST.MOCK1", 
"_IDX_TEST1.MOCK2"};
+private static final String [] INDEXES_LIST_NAMESPACE = {"TEST:INDEX1", 
"TEST:INDEX2",
+"TEST1:INDEX3", "TEST1:INDEX2","TEST1:INDEX1","TEST:INDEX3", 
"TEST:_IDX_MOCK1",
+"TEST1:_IDX_MOCK2"};
 private static final String [] TABLE_LIST = 
{"TEST.MOCK1","TEST1.MOCK2","TEST.MOCK3"};
 private static final String [] TABLE_LIST_NAMESPACE = 
{"TEST:MOCK1","TEST1:MOCK2","TEST:MOCK3"};
 
@@ -99,7 +99,8 @@ public class ParameterizedIndexUpgradeToolIT extends BaseTest 
{
 .getQueryServices();
 admin = queryServices.getAdmin();
 iut = new IndexUpgradeTool(upgrade ? UPGRADE_OP : ROLLBACK_OP, 
INPUT_LIST,
-null, "/tmp/index_upgrade_" + 
UUID.randomUUID().toString(),true, Mockito.mock(IndexTool.class));
+null, "/tmp/index_upgrade_" + UUID.randomUUID().toString(),
+true, Mockito.mock(IndexTool.class));
 iut.setConf(getUtility().getConfiguration());
 iut.setTest(true);
 if (!mutable) {
@@ -141,9 +142,8 @@ public class ParameterizedIndexUpgradeToolIT extends 
BaseTest {
 conn.createStatement().execute("CREATE TABLE TEST.MOCK3 (id bigint NOT 
NULL "
 + "PRIMARY KEY, name varchar, age bigint)"+tableDDLOptions);
 
-//Please do not remove/uncomment commented code until PHOENIX-5385 is 
fixed
 //views
-/*conn.createStatement().execute("CREATE VIEW TEST.MOCK1_VIEW 
(view_column varchar) "
+conn.createStatement().execute("CREATE VIEW TEST.MOCK1_VIEW 
(view_column varchar) "
 + "AS SELECT * FROM TEST.MOCK1 WHERE a.name = 'a'");
 conn.createStatement().execute("CREATE VIEW TEST.MOCK1_VIEW1 
(view_column varchar,"
 + " zip varchar) AS SELECT * FROM TEST.MOCK1 WHERE a.name = 
'a'");
@@ -157,7 +157,7 @@ public class ParameterizedIndexUpgradeToolIT extends 
BaseTest {
 conn.createStatement().execute("CREATE INDEX MOCK2_INDEX1 ON 
TEST1.MOCK2_VIEW "
 + "(state, city)");
 conn.createStatement().execute("CREATE INDEX MOCK1_INDEX3 ON 
TEST.MOCK1_VIEW "
-+ "(view_column)");*/
++ "(view_column)");
 //indexes
 conn.createStatement().execute("CREATE INDEX INDEX1 ON TEST.MOCK1 
(sal, a.name)");
 conn.createStatement().execute("CREATE INDEX INDEX2 ON TEST.MOCK1 
(a.name)");
@@ -189,7 +189,8 @@ public class ParameterizedIndexUpgradeToolIT extends 
BaseTest {
 }
 }
 
-private void checkNewIndexingCoprocessors(String [] indexList, String [] 
tableList) throws IOException {
+private void checkNewIndexingCoprocessors(String [] indexList, String [] 
tableList)
+throws IOException {
 if (mutable) {
 f

[phoenix] branch master updated: PHOENIX-5290 HashJoinMoreIT is flapping.

2019-07-16 Thread larsh
This is an automated email from the ASF dual-hosted git repository.

larsh pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/phoenix.git


The following commit(s) were added to refs/heads/master by this push:
 new b5242ff  PHOENIX-5290 HashJoinMoreIT is flapping.
b5242ff is described below

commit b5242ff75f18696ea29fe4c95fde27a5d557966d
Author: Lars Hofhansl 
AuthorDate: Tue Jul 16 15:45:48 2019 -0700

PHOENIX-5290 HashJoinMoreIT is flapping.
---
 .../phoenix/end2end/RowValueConstructorIT.java | 32 ++
 .../org/apache/phoenix/compile/WhereOptimizer.java |  8 --
 .../org/apache/phoenix/schema/types/PVarchar.java  |  2 +-
 3 files changed, 39 insertions(+), 3 deletions(-)

diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowValueConstructorIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowValueConstructorIT.java
index fb04261..390d831 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowValueConstructorIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowValueConstructorIT.java
@@ -51,6 +51,7 @@ import java.sql.Timestamp;
 import java.util.List;
 import java.util.Properties;
 
+import org.apache.phoenix.jdbc.PhoenixPreparedStatement;
 import org.apache.phoenix.util.DateUtil;
 import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
@@ -1691,6 +1692,37 @@ public class RowValueConstructorIT extends 
ParallelStatsDisabledIT {
 }
 }
 
+@Test
+public void testTrailingSeparator() throws Exception {
+Connection conn = null;
+try {
+conn = DriverManager.getConnection(getUrl());
+conn.createStatement().execute("CREATE TABLE test2961 (\n"
++ "ACCOUNT_ID VARCHAR NOT NULL,\n" + "BUCKET_ID VARCHAR 
NOT NULL,\n"
++ "OBJECT_ID VARCHAR NOT NULL,\n" + "OBJECT_VERSION 
VARCHAR NOT NULL,\n"
++ "LOC VARCHAR,\n"
++ "CONSTRAINT PK PRIMARY KEY (ACCOUNT_ID, BUCKET_ID, 
OBJECT_ID, OBJECT_VERSION DESC))");
+
+String sql = "SELECT  OBJ.ACCOUNT_ID from  test2961 as OBJ where "
++ "(OBJ.ACCOUNT_ID, OBJ.BUCKET_ID, OBJ.OBJECT_ID, 
OBJ.OBJECT_VERSION) IN "
++ "((?,?,?,?),(?,?,?,?))";
+
+PhoenixPreparedStatement statement = conn.prepareStatement(sql)
+.unwrap(PhoenixPreparedStatement.class);
+statement.setString(1, new String(new char[] { (char) 3 }));
+statement.setString(2, new String(new char[] { (char) 55 }));
+statement.setString(3, new String(new char[] { (char) 39 }));
+statement.setString(4, new String(new char[] { (char) 0 }));
+statement.setString(5, new String(new char[] { (char) 83 }));
+statement.setString(6, new String(new char[] { (char) 15 }));
+statement.setString(7, new String(new char[] { (char) 55 }));
+statement.setString(8, new String(new char[] { (char) 147 }));
+statement.optimizeQuery(sql);
+} finally {
+conn.close();
+}
+}
+
 private StringBuilder generateQueryToTest(int numItemsInClause, String 
fullViewName) {
 StringBuilder querySb =
 new StringBuilder("SELECT OBJECT_ID,OBJECT_DATA2,OBJECT_DATA 
FROM " + fullViewName);
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/compile/WhereOptimizer.java 
b/phoenix-core/src/main/java/org/apache/phoenix/compile/WhereOptimizer.java
index b845a09..0964d9d 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/compile/WhereOptimizer.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/compile/WhereOptimizer.java
@@ -347,7 +347,9 @@ public class WhereOptimizer {
 byte[] lowerRange = KeyRange.UNBOUND;
 boolean lowerInclusive = false;
 // Lower range of trailing part of RVC must be true, so we can form a 
new range to intersect going forward
-if (!range.lowerUnbound() && Bytes.startsWith(range.getLowerRange(), 
clippedResult.getLowerRange())) {
+if (!range.lowerUnbound()
+&& range.getLowerRange().length > 
clippedResult.getLowerRange().length
+&& Bytes.startsWith(range.getLowerRange(), 
clippedResult.getLowerRange())) {
 lowerRange = range.getLowerRange();
 int offset = clippedResult.getLowerRange().length + 
separatorLength;
 ptr.set(lowerRange, offset, lowerRange.length - offset);
@@ -356,7 +358,9 @@ public class WhereOptimizer {
 }
 byte[] upperRange = KeyRange.UNBOUND;
 boolean upperInclusive = false;
-if (!range.upperUnbound() && Bytes.startsWith(range.getUpperRange(), 
clippedResult.getUpperRange())) {
+if (!range.upperUnbound()
+&& range.getUpperRange().length > 
clippedResult.getUpperRange().length
+&& Bytes.startsWith(range.getUpperRange(), clippedResult.getUpperRange())) {
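The guard added in this hunk matters because `Bytes.startsWith` also returns true when the two arrays are equal, in which case the subsequent offset arithmetic (clipped length plus separator) points past the end of the range bytes. Below is a minimal sketch of the pattern, not Phoenix code: `startsWith` is a stand-in with the same contract as `org.apache.hadoop.hbase.util.Bytes.startsWith`, and the class/method names (`TrailingSeparatorGuard`, `remainder`) are illustrative only.

```java
import java.util.Arrays;

public class TrailingSeparatorGuard {
    // Stand-in for org.apache.hadoop.hbase.util.Bytes.startsWith:
    // true when 'bytes' begins with 'prefix' (equal arrays included).
    static boolean startsWith(byte[] bytes, byte[] prefix) {
        return bytes.length >= prefix.length
                && Arrays.equals(Arrays.copyOf(bytes, prefix.length), prefix);
    }

    // Returns the trailing part of 'range' after 'clipped' plus a separator,
    // guarded the way the patched WhereOptimizer condition is guarded.
    // Without the strict length check, equal arrays would make
    // offset = range.length + separatorLength and the slice length negative.
    static byte[] remainder(byte[] range, byte[] clipped, int separatorLength) {
        if (range.length > clipped.length && startsWith(range, clipped)) {
            int offset = clipped.length + separatorLength;
            return Arrays.copyOfRange(range, offset, range.length);
        }
        return null; // nothing left to intersect
    }

    public static void main(String[] args) {
        byte[] range = {1, 2, 0, 3};
        byte[] clipped = {1, 2};
        // Equal arrays now fall through safely instead of slicing past the end.
        System.out.println(remainder(range, range, 1) == null);            // true
        // A strictly longer range still yields its trailing part.
        System.out.println(Arrays.toString(remainder(range, clipped, 1))); // [3]
    }
}
```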

[phoenix] branch 4.x-HBase-1.3 updated: PHOENIX-5290 HashJoinMoreIT is flapping.

2019-07-16 Thread larsh
This is an automated email from the ASF dual-hosted git repository.

larsh pushed a commit to branch 4.x-HBase-1.3
in repository https://gitbox.apache.org/repos/asf/phoenix.git


The following commit(s) were added to refs/heads/4.x-HBase-1.3 by this push:
 new 9364a94  PHOENIX-5290 HashJoinMoreIT is flapping.
9364a94 is described below

commit 9364a9431d604d072ca78932c39edbc85d5aaf3d
Author: Lars Hofhansl 
AuthorDate: Tue Jul 16 15:42:58 2019 -0700

PHOENIX-5290 HashJoinMoreIT is flapping.
---
 .../phoenix/end2end/RowValueConstructorIT.java | 32 ++
 .../org/apache/phoenix/compile/WhereOptimizer.java |  8 --
 .../org/apache/phoenix/schema/types/PVarchar.java  |  2 +-
 3 files changed, 39 insertions(+), 3 deletions(-)

diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowValueConstructorIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowValueConstructorIT.java
index fb04261..390d831 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowValueConstructorIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowValueConstructorIT.java
@@ -51,6 +51,7 @@ import java.sql.Timestamp;
 import java.util.List;
 import java.util.Properties;
 
+import org.apache.phoenix.jdbc.PhoenixPreparedStatement;
 import org.apache.phoenix.util.DateUtil;
 import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
@@ -1691,6 +1692,37 @@ public class RowValueConstructorIT extends 
ParallelStatsDisabledIT {
 }
 }
 
+@Test
+public void testTrailingSeparator() throws Exception {
+Connection conn = null;
+try {
+conn = DriverManager.getConnection(getUrl());
+conn.createStatement().execute("CREATE TABLE test2961 (\n"
++ "ACCOUNT_ID VARCHAR NOT NULL,\n" + "BUCKET_ID VARCHAR 
NOT NULL,\n"
++ "OBJECT_ID VARCHAR NOT NULL,\n" + "OBJECT_VERSION 
VARCHAR NOT NULL,\n"
++ "LOC VARCHAR,\n"
++ "CONSTRAINT PK PRIMARY KEY (ACCOUNT_ID, BUCKET_ID, 
OBJECT_ID, OBJECT_VERSION DESC))");
+
+String sql = "SELECT  OBJ.ACCOUNT_ID from  test2961 as OBJ where "
++ "(OBJ.ACCOUNT_ID, OBJ.BUCKET_ID, OBJ.OBJECT_ID, 
OBJ.OBJECT_VERSION) IN "
++ "((?,?,?,?),(?,?,?,?))";
+
+PhoenixPreparedStatement statement = conn.prepareStatement(sql)
+.unwrap(PhoenixPreparedStatement.class);
+statement.setString(1, new String(new char[] { (char) 3 }));
+statement.setString(2, new String(new char[] { (char) 55 }));
+statement.setString(3, new String(new char[] { (char) 39 }));
+statement.setString(4, new String(new char[] { (char) 0 }));
+statement.setString(5, new String(new char[] { (char) 83 }));
+statement.setString(6, new String(new char[] { (char) 15 }));
+statement.setString(7, new String(new char[] { (char) 55 }));
+statement.setString(8, new String(new char[] { (char) 147 }));
+statement.optimizeQuery(sql);
+} finally {
+conn.close();
+}
+}
+
 private StringBuilder generateQueryToTest(int numItemsInClause, String 
fullViewName) {
 StringBuilder querySb =
 new StringBuilder("SELECT OBJECT_ID,OBJECT_DATA2,OBJECT_DATA 
FROM " + fullViewName);
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/compile/WhereOptimizer.java 
b/phoenix-core/src/main/java/org/apache/phoenix/compile/WhereOptimizer.java
index b845a09..0964d9d 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/compile/WhereOptimizer.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/compile/WhereOptimizer.java
@@ -347,7 +347,9 @@ public class WhereOptimizer {
 byte[] lowerRange = KeyRange.UNBOUND;
 boolean lowerInclusive = false;
 // Lower range of trailing part of RVC must be true, so we can form a 
new range to intersect going forward
-if (!range.lowerUnbound() && Bytes.startsWith(range.getLowerRange(), 
clippedResult.getLowerRange())) {
+if (!range.lowerUnbound()
+&& range.getLowerRange().length > 
clippedResult.getLowerRange().length
+&& Bytes.startsWith(range.getLowerRange(), 
clippedResult.getLowerRange())) {
 lowerRange = range.getLowerRange();
 int offset = clippedResult.getLowerRange().length + 
separatorLength;
 ptr.set(lowerRange, offset, lowerRange.length - offset);
@@ -356,7 +358,9 @@ public class WhereOptimizer {
 }
 byte[] upperRange = KeyRange.UNBOUND;
 boolean upperInclusive = false;
-if (!range.upperUnbound() && Bytes.startsWith(range.getUpperRange(), 
clippedResult.getUpperRange())) {
+if (!range.upperUnbound()
+&& range.getUpperRange().length > 
clippedResult.getUpperRange().length
+&& Bytes.startsWith(range.getUpperRange(), clippedResult.getUpperRange())) {

[phoenix] branch 4.x-HBase-1.4 updated: PHOENIX-5290 HashJoinMoreIT is flapping.

2019-07-16 Thread larsh
This is an automated email from the ASF dual-hosted git repository.

larsh pushed a commit to branch 4.x-HBase-1.4
in repository https://gitbox.apache.org/repos/asf/phoenix.git


The following commit(s) were added to refs/heads/4.x-HBase-1.4 by this push:
 new 8599a4d  PHOENIX-5290 HashJoinMoreIT is flapping.
8599a4d is described below

commit 8599a4dc67f9f13a867e23e7c5c5a2cd54a89154
Author: Lars Hofhansl 
AuthorDate: Tue Jul 16 15:42:33 2019 -0700

PHOENIX-5290 HashJoinMoreIT is flapping.
---
 .../phoenix/end2end/RowValueConstructorIT.java | 32 ++
 .../org/apache/phoenix/compile/WhereOptimizer.java |  8 --
 .../org/apache/phoenix/schema/types/PVarchar.java  |  2 +-
 3 files changed, 39 insertions(+), 3 deletions(-)

diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowValueConstructorIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowValueConstructorIT.java
index fb04261..390d831 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowValueConstructorIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowValueConstructorIT.java
@@ -51,6 +51,7 @@ import java.sql.Timestamp;
 import java.util.List;
 import java.util.Properties;
 
+import org.apache.phoenix.jdbc.PhoenixPreparedStatement;
 import org.apache.phoenix.util.DateUtil;
 import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
@@ -1691,6 +1692,37 @@ public class RowValueConstructorIT extends 
ParallelStatsDisabledIT {
 }
 }
 
+@Test
+public void testTrailingSeparator() throws Exception {
+Connection conn = null;
+try {
+conn = DriverManager.getConnection(getUrl());
+conn.createStatement().execute("CREATE TABLE test2961 (\n"
++ "ACCOUNT_ID VARCHAR NOT NULL,\n" + "BUCKET_ID VARCHAR 
NOT NULL,\n"
++ "OBJECT_ID VARCHAR NOT NULL,\n" + "OBJECT_VERSION 
VARCHAR NOT NULL,\n"
++ "LOC VARCHAR,\n"
++ "CONSTRAINT PK PRIMARY KEY (ACCOUNT_ID, BUCKET_ID, 
OBJECT_ID, OBJECT_VERSION DESC))");
+
+String sql = "SELECT  OBJ.ACCOUNT_ID from  test2961 as OBJ where "
++ "(OBJ.ACCOUNT_ID, OBJ.BUCKET_ID, OBJ.OBJECT_ID, 
OBJ.OBJECT_VERSION) IN "
++ "((?,?,?,?),(?,?,?,?))";
+
+PhoenixPreparedStatement statement = conn.prepareStatement(sql)
+.unwrap(PhoenixPreparedStatement.class);
+statement.setString(1, new String(new char[] { (char) 3 }));
+statement.setString(2, new String(new char[] { (char) 55 }));
+statement.setString(3, new String(new char[] { (char) 39 }));
+statement.setString(4, new String(new char[] { (char) 0 }));
+statement.setString(5, new String(new char[] { (char) 83 }));
+statement.setString(6, new String(new char[] { (char) 15 }));
+statement.setString(7, new String(new char[] { (char) 55 }));
+statement.setString(8, new String(new char[] { (char) 147 }));
+statement.optimizeQuery(sql);
+} finally {
+conn.close();
+}
+}
+
 private StringBuilder generateQueryToTest(int numItemsInClause, String 
fullViewName) {
 StringBuilder querySb =
 new StringBuilder("SELECT OBJECT_ID,OBJECT_DATA2,OBJECT_DATA 
FROM " + fullViewName);
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/compile/WhereOptimizer.java 
b/phoenix-core/src/main/java/org/apache/phoenix/compile/WhereOptimizer.java
index b845a09..0964d9d 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/compile/WhereOptimizer.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/compile/WhereOptimizer.java
@@ -347,7 +347,9 @@ public class WhereOptimizer {
 byte[] lowerRange = KeyRange.UNBOUND;
 boolean lowerInclusive = false;
 // Lower range of trailing part of RVC must be true, so we can form a 
new range to intersect going forward
-if (!range.lowerUnbound() && Bytes.startsWith(range.getLowerRange(), 
clippedResult.getLowerRange())) {
+if (!range.lowerUnbound()
+&& range.getLowerRange().length > 
clippedResult.getLowerRange().length
+&& Bytes.startsWith(range.getLowerRange(), 
clippedResult.getLowerRange())) {
 lowerRange = range.getLowerRange();
 int offset = clippedResult.getLowerRange().length + 
separatorLength;
 ptr.set(lowerRange, offset, lowerRange.length - offset);
@@ -356,7 +358,9 @@ public class WhereOptimizer {
 }
 byte[] upperRange = KeyRange.UNBOUND;
 boolean upperInclusive = false;
-if (!range.upperUnbound() && Bytes.startsWith(range.getUpperRange(), 
clippedResult.getUpperRange())) {
+if (!range.upperUnbound()
+&& range.getUpperRange().length > 
clippedResult.getUpperRange().length
+&& Bytes.startsWith(range.getUpperRange(), clippedResult.getUpperRange())) {

[phoenix] branch 4.x-HBase-1.5 updated: PHOENIX-5290 HashJoinMoreIT is flapping.

2019-07-16 Thread larsh
This is an automated email from the ASF dual-hosted git repository.

larsh pushed a commit to branch 4.x-HBase-1.5
in repository https://gitbox.apache.org/repos/asf/phoenix.git


The following commit(s) were added to refs/heads/4.x-HBase-1.5 by this push:
 new f3318a9  PHOENIX-5290 HashJoinMoreIT is flapping.
f3318a9 is described below

commit f3318a97af8c71fc50b0332f46bd297b465983eb
Author: Lars Hofhansl 
AuthorDate: Tue Jul 16 15:39:57 2019 -0700

PHOENIX-5290 HashJoinMoreIT is flapping.
---
 .../phoenix/end2end/RowValueConstructorIT.java | 32 ++
 .../org/apache/phoenix/compile/WhereOptimizer.java |  8 --
 .../org/apache/phoenix/schema/types/PVarchar.java  |  2 +-
 3 files changed, 39 insertions(+), 3 deletions(-)

diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowValueConstructorIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowValueConstructorIT.java
index fb04261..390d831 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowValueConstructorIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowValueConstructorIT.java
@@ -51,6 +51,7 @@ import java.sql.Timestamp;
 import java.util.List;
 import java.util.Properties;
 
+import org.apache.phoenix.jdbc.PhoenixPreparedStatement;
 import org.apache.phoenix.util.DateUtil;
 import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
@@ -1691,6 +1692,37 @@ public class RowValueConstructorIT extends 
ParallelStatsDisabledIT {
 }
 }
 
+@Test
+public void testTrailingSeparator() throws Exception {
+Connection conn = null;
+try {
+conn = DriverManager.getConnection(getUrl());
+conn.createStatement().execute("CREATE TABLE test2961 (\n"
++ "ACCOUNT_ID VARCHAR NOT NULL,\n" + "BUCKET_ID VARCHAR 
NOT NULL,\n"
++ "OBJECT_ID VARCHAR NOT NULL,\n" + "OBJECT_VERSION 
VARCHAR NOT NULL,\n"
++ "LOC VARCHAR,\n"
++ "CONSTRAINT PK PRIMARY KEY (ACCOUNT_ID, BUCKET_ID, 
OBJECT_ID, OBJECT_VERSION DESC))");
+
+String sql = "SELECT  OBJ.ACCOUNT_ID from  test2961 as OBJ where "
++ "(OBJ.ACCOUNT_ID, OBJ.BUCKET_ID, OBJ.OBJECT_ID, 
OBJ.OBJECT_VERSION) IN "
++ "((?,?,?,?),(?,?,?,?))";
+
+PhoenixPreparedStatement statement = conn.prepareStatement(sql)
+.unwrap(PhoenixPreparedStatement.class);
+statement.setString(1, new String(new char[] { (char) 3 }));
+statement.setString(2, new String(new char[] { (char) 55 }));
+statement.setString(3, new String(new char[] { (char) 39 }));
+statement.setString(4, new String(new char[] { (char) 0 }));
+statement.setString(5, new String(new char[] { (char) 83 }));
+statement.setString(6, new String(new char[] { (char) 15 }));
+statement.setString(7, new String(new char[] { (char) 55 }));
+statement.setString(8, new String(new char[] { (char) 147 }));
+statement.optimizeQuery(sql);
+} finally {
+conn.close();
+}
+}
+
 private StringBuilder generateQueryToTest(int numItemsInClause, String 
fullViewName) {
 StringBuilder querySb =
 new StringBuilder("SELECT OBJECT_ID,OBJECT_DATA2,OBJECT_DATA 
FROM " + fullViewName);
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/compile/WhereOptimizer.java 
b/phoenix-core/src/main/java/org/apache/phoenix/compile/WhereOptimizer.java
index b845a09..0964d9d 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/compile/WhereOptimizer.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/compile/WhereOptimizer.java
@@ -347,7 +347,9 @@ public class WhereOptimizer {
 byte[] lowerRange = KeyRange.UNBOUND;
 boolean lowerInclusive = false;
 // Lower range of trailing part of RVC must be true, so we can form a 
new range to intersect going forward
-if (!range.lowerUnbound() && Bytes.startsWith(range.getLowerRange(), 
clippedResult.getLowerRange())) {
+if (!range.lowerUnbound()
+&& range.getLowerRange().length > 
clippedResult.getLowerRange().length
+&& Bytes.startsWith(range.getLowerRange(), 
clippedResult.getLowerRange())) {
 lowerRange = range.getLowerRange();
 int offset = clippedResult.getLowerRange().length + 
separatorLength;
 ptr.set(lowerRange, offset, lowerRange.length - offset);
@@ -356,7 +358,9 @@ public class WhereOptimizer {
 }
 byte[] upperRange = KeyRange.UNBOUND;
 boolean upperInclusive = false;
-if (!range.upperUnbound() && Bytes.startsWith(range.getUpperRange(), 
clippedResult.getUpperRange())) {
+if (!range.upperUnbound()
+&& range.getUpperRange().length > 
clippedResult.getUpperRange().length
+&& Bytes.startsWith(range.getUpperRange(), clippedResult.getUpperRange())) {

Jenkins build is back to normal : Phoenix | Master #2456

2019-07-16 Thread Apache Jenkins Server
See 




Apache-Phoenix | 4.x-HBase-1.3 | Build Successful

2019-07-16 Thread Apache Jenkins Server
4.x-HBase-1.3 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.3

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.3/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.3/lastCompletedBuild/testReport/

Changes
[monani.mihir] PHOENIX-5327 PherfMainIT fails with duplicate TABLE/INDEX.



Build times for last couple of runs. Latest build time is the right most | Legend blue: normal, red: test failure, gray: timeout


Build failed in Jenkins: Phoenix-4.x-HBase-1.4 #216

2019-07-16 Thread Apache Jenkins Server
See 


Changes:

[monani.mihir] PHOENIX-5327 PherfMainIT fails with duplicate TABLE/INDEX.

--
[...truncated 573.46 KB...]
[ERROR]   
OrphanViewToolIT.testDeleteGrandchildViewRows:344->verifyOrphanFileLineCounts:256->verifyLineCount:209
[ERROR]   
OrphanViewToolIT.testDeleteGrandchildViewRows:344->verifyOrphanFileLineCounts:255->verifyLineCount:209
[ERROR]   
OrphanViewToolIT.testDeleteParentChildLinkRows:374->verifyOrphanFileLineCounts:255->verifyLineCount:209
[ERROR]   
OrphanViewToolIT.testDeleteParentChildLinkRows:370->verifyCountQuery:218
[ERROR]   
OrphanViewToolIT.testDeletePhysicalTableLinks:424->verifyCountQuery:218
[ERROR]   
OrphanViewToolIT.testDeletePhysicalTableLinks:424->verifyCountQuery:218
[ERROR] Errors: 
[ERROR]   
OrphanViewToolIT.testCreateTableAndViews:231->verifyOrphanFileLineCounts:255->verifyLineCount:203
 » FileNotFound
[ERROR]   
GlobalMutableNonTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStarted:263->BaseIndexIT.testCreateIndexAfterUpsertStarted:274
 » PhoenixIO
[ERROR]   
GlobalMutableNonTxIndexIT>BaseIndexIT.testInClauseWithIndexOnColumnOfUsignedIntType:468
 » PhoenixIO
[ERROR]   IndexMetadataIT.testAsyncCreatedDate:554 » PhoenixIO 
org.apache.hadoop.hbase.D...
[ERROR]   
IndexUsageIT.testImmutableLocalCaseSensitiveFunctionIndex:629->helpTestCaseSensitiveFunctionIndex:649
 » PhoenixIO
[ERROR]   
IndexUsageIT.testSelectColOnlyInDataTableImmutableLocalIndex:416->helpTestSelectColOnlyInDataTable:438->createDataTable:80
 » PhoenixIO
[ERROR]   
IndexWithTableSchemaChangeIT.testImmutableIndexDropCoveredColumn:148->helpTestDropCoveredColumn:178
 » PhoenixIO
[ERROR]   
InvalidIndexStateClientSideIT>ParallelStatsDisabledIT.doSetup:60->BaseTest.setUpTestDriver:516->BaseTest.setUpTestDriver:521->BaseTest.checkClusterInitialized:435->BaseTest.setUpTestCluster:449->BaseTest.initMiniCluster:550
 » Runtime
[INFO] 
[ERROR] Tests run: 3674, Failures: 13, Errors: 8, Skipped: 2
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test (HBaseManagedTimeTests) 
@ phoenix-core ---
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] 
[INFO] Results:
[INFO] 
[INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test 
(NeedTheirOwnClusterTests) @ phoenix-core ---
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] Running 
org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
[WARNING] Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.003 
s - in 
org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
[INFO] Running org.apache.phoenix.end2end.ConnectionUtilIT
[INFO] Running 
org.apache.hadoop.hbase.regionserver.wal.WALRecoveryRegionPostOpenIT
[INFO] Running org.apache.phoenix.end2end.ContextClassloaderIT
[INFO] Running org.apache.phoenix.end2end.CsvBulkLoadToolIT
[INFO] Running org.apache.phoenix.end2end.ConcurrentMutationsExtendedIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.457 s 
- in org.apache.phoenix.end2end.ContextClassloaderIT
[INFO] Running org.apache.phoenix.end2end.CountDistinctCompressionIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.266 s 
- in org.apache.hadoop.hbase.regionserver.wal.WALRecoveryRegionPostOpenIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.456 s 
- in org.apache.phoenix.end2end.CountDistinctCompressionIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.162 s 
- in org.apache.phoenix.end2end.ConnectionUtilIT
[INFO] Running org.apache.phoenix.end2end.CostBasedDecisionIT
[INFO] Running org.apache.phoenix.end2end.DropSchemaIT
[INFO] Running org.apache.phoenix.end2end.IndexBuildTimestampIT
[INFO] Running org.apache.phoenix.end2end.IndexExtendedIT
[INFO] Running org.apache.phoenix.end2end.FlappingLocalIndexIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.414 s 
- in org.apache.phoenix.end2end.DropSchemaIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 108.396 
s - in org.apache.phoenix.end2end.CsvBulkLoadToolIT
[INFO] Running org.apache.phoenix.end2end.IndexToolForPartialBuildIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.712 s 
- in org.apache.phoenix.end2end.IndexToolForPartialBuildIT
[INFO] Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 74.808 
s - in org.apache.phoenix.end2end.IndexExtendedIT
[INFO] Running 
org.apache.phoenix.end2end.IndexToolForPartialBuildWithNamespaceEnabledIT
[INFO] Running org.apache.phoenix.end2end.IndexToolIT
[INFO] Tests run: 2, Failures: 0, Error

[phoenix] branch 4.x-HBase-1.5 updated: PHOENIX-5327 PherfMainIT fails with duplicate TABLE/INDEX.

2019-07-16 Thread mihir6692
This is an automated email from the ASF dual-hosted git repository.

mihir6692 pushed a commit to branch 4.x-HBase-1.5
in repository https://gitbox.apache.org/repos/asf/phoenix.git


The following commit(s) were added to refs/heads/4.x-HBase-1.5 by this push:
 new d023054  PHOENIX-5327 PherfMainIT fails with duplicate TABLE/INDEX.
d023054 is described below

commit d023054eef16ca8485ea1d0eefeb7ae3b8840ec2
Author: Viraj Jasani 
AuthorDate: Tue Jul 16 17:02:31 2019 +0530

PHOENIX-5327 PherfMainIT fails with duplicate TABLE/INDEX.

Signed-off-by: Monani Mihir 
---
 .../org/apache/phoenix/pherf/DataIngestIT.java |  6 +--
 .../java/org/apache/phoenix/pherf/PherfMainIT.java |  3 +-
 .../main/java/org/apache/phoenix/pherf/Pherf.java  |  7 ++-
 .../phoenix/pherf/workload/QueryExecutor.java  | 16 +--
 .../apache/phoenix/pherf/workload/Workload.java| 13 +++---
 .../phoenix/pherf/workload/WriteWorkload.java  |  7 +--
 .../scenario/prod_test_unsalted_scenario.xml   | 50 ++
 7 files changed, 62 insertions(+), 40 deletions(-)

diff --git 
a/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/DataIngestIT.java 
b/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/DataIngestIT.java
index 973ce2c..bc768e2 100644
--- a/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/DataIngestIT.java
+++ b/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/DataIngestIT.java
@@ -44,7 +44,6 @@ import org.apache.phoenix.pherf.workload.Workload;
 import org.apache.phoenix.pherf.workload.WorkloadExecutor;
 import org.apache.phoenix.pherf.workload.WriteWorkload;
 import org.junit.Before;
-import org.junit.Ignore;
 import org.junit.Test;
 
 import com.jcabi.jdbc.JdbcSession;
@@ -217,8 +216,7 @@ public class DataIngestIT extends ResultBaseTestIT {
 assertExpectedNumberOfRecordsWritten(scenario);
 }
 
-private void assertExpectedNumberOfRecordsWritten(Scenario scenario) 
throws Exception,
-SQLException {
+private void assertExpectedNumberOfRecordsWritten(Scenario scenario) 
throws Exception {
 Connection connection = util.getConnection(scenario.getTenantId());
 String sql = "select count(*) from " + scenario.getTableName();
 Integer count = new JdbcSession(connection).sql(sql).select(new 
Outcome() {
@@ -230,7 +228,7 @@ public class DataIngestIT extends ResultBaseTestIT {
 return null;
 }
 });
-assertNotNull("Could not retrieve count. " + count);
+assertNotNull("Could not retrieve count. ", count);
 assertEquals("Expected 100 rows to have been inserted",
 scenario.getRowCount(), count.intValue());
 }
diff --git 
a/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/PherfMainIT.java 
b/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/PherfMainIT.java
index 7a080c8..3ee9327 100644
--- a/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/PherfMainIT.java
+++ b/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/PherfMainIT.java
@@ -25,10 +25,11 @@ import 
org.junit.contrib.java.lang.system.ExpectedSystemExit;
 import java.util.concurrent.Future;
 
 public class PherfMainIT extends ResultBaseTestIT {
+
 @Rule
 public final ExpectedSystemExit exit = ExpectedSystemExit.none();
 
-//@Test disabled until PHOENIX-5327 is fixed
+@Test
 public void testPherfMain() throws Exception {
 String[] args = { "-q", "-l",
 "--schemaFile", ".*create_prod_test_unsalted.sql",
diff --git a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/Pherf.java 
b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/Pherf.java
index 2b55e29..05e747a 100644
--- a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/Pherf.java
+++ b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/Pherf.java
@@ -25,7 +25,6 @@ import java.util.List;
 import java.util.Properties;
 
 import com.google.common.annotations.VisibleForTesting;
-import jline.internal.TestAccessible;
 import org.apache.commons.cli.CommandLine;
 import org.apache.commons.cli.CommandLineParser;
 import org.apache.commons.cli.HelpFormatter;
@@ -229,10 +228,10 @@ public class Pherf {
 }
 
 // Compare results and exit  
-   if (null != compareResults) {
+if (null != compareResults) {
 LOGGER.info("\nStarting to compare results and exiting for " + 
compareResults);
-   new GoogleChartGenerator(compareResults, 
compareType).readAndRender();
-   return;
+new GoogleChartGenerator(compareResults, 
compareType).readAndRender();
+return;
 }
 
 XMLConfigParser parser = new XMLConfigParser(scenarioFile);
diff --git 
a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/workload/QueryExecutor.java
 
b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/workload/QueryExecutor.java
index d894
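The `DataIngestIT` change in this commit fixes a classic JUnit pitfall: `assertNotNull("Could not retrieve count. " + count)` concatenates into a single string and so selects the one-argument overload, which checks that the (always non-null) message string is non-null and can never fail. The two-argument form actually inspects `count`. A minimal sketch of the difference, using stand-in methods that mirror JUnit 4's `Assert.assertNotNull` overloads rather than depending on JUnit itself:

```java
public class AssertNotNullPitfall {
    // Stand-in mirroring JUnit 4's one-arg Assert.assertNotNull: checks 'obj'.
    static void assertNotNull(Object obj) {
        if (obj == null) throw new AssertionError("expected non-null");
    }

    // Stand-in mirroring the two-arg overload: checks 'obj', not 'message'.
    static void assertNotNull(String message, Object obj) {
        if (obj == null) throw new AssertionError(message);
    }

    public static void main(String[] args) {
        Integer count = null;
        // Bug: concatenation selects the one-arg overload, and the string
        // "Could not retrieve count. null" is itself non-null -> always passes.
        assertNotNull("Could not retrieve count. " + count);
        // Fix: the two-arg overload inspects 'count' and fails as intended.
        try {
            assertNotNull("Could not retrieve count. ", count);
        } catch (AssertionError expected) {
            System.out.println("caught: " + expected.getMessage());
        }
    }
}
```

The same shape applies to `assertNotNull`/`assertNull`/`assertTrue` message overloads generally, which is why JUnit 5 moved the message to the last parameter.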

[phoenix] branch 4.x-HBase-1.4 updated: PHOENIX-5327 PherfMainIT fails with duplicate TABLE/INDEX.

2019-07-16 Thread mihir6692
This is an automated email from the ASF dual-hosted git repository.

mihir6692 pushed a commit to branch 4.x-HBase-1.4
in repository https://gitbox.apache.org/repos/asf/phoenix.git


The following commit(s) were added to refs/heads/4.x-HBase-1.4 by this push:
 new ffa45c7  PHOENIX-5327 PherfMainIT fails with duplicate TABLE/INDEX.
ffa45c7 is described below

commit ffa45c7bc37d261c26985e2f6d1967ad5ba35cb6
Author: Viraj Jasani 
AuthorDate: Tue Jul 16 17:02:05 2019 +0530

PHOENIX-5327 PherfMainIT fails with duplicate TABLE/INDEX.

Signed-off-by: Monani Mihir 
---
 .../org/apache/phoenix/pherf/DataIngestIT.java |  6 +--
 .../java/org/apache/phoenix/pherf/PherfMainIT.java |  3 +-
 .../main/java/org/apache/phoenix/pherf/Pherf.java  |  7 ++-
 .../phoenix/pherf/workload/QueryExecutor.java  | 16 +--
 .../apache/phoenix/pherf/workload/Workload.java| 13 +++---
 .../phoenix/pherf/workload/WriteWorkload.java  |  7 +--
 .../scenario/prod_test_unsalted_scenario.xml   | 50 ++
 7 files changed, 62 insertions(+), 40 deletions(-)

diff --git a/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/DataIngestIT.java b/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/DataIngestIT.java
index 973ce2c..bc768e2 100644
--- a/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/DataIngestIT.java
+++ b/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/DataIngestIT.java
@@ -44,7 +44,6 @@ import org.apache.phoenix.pherf.workload.Workload;
 import org.apache.phoenix.pherf.workload.WorkloadExecutor;
 import org.apache.phoenix.pherf.workload.WriteWorkload;
 import org.junit.Before;
-import org.junit.Ignore;
 import org.junit.Test;
 
 import com.jcabi.jdbc.JdbcSession;
@@ -217,8 +216,7 @@ public class DataIngestIT extends ResultBaseTestIT {
 assertExpectedNumberOfRecordsWritten(scenario);
 }
 
-private void assertExpectedNumberOfRecordsWritten(Scenario scenario) throws Exception,
-SQLException {
+private void assertExpectedNumberOfRecordsWritten(Scenario scenario) throws Exception {
 Connection connection = util.getConnection(scenario.getTenantId());
 String sql = "select count(*) from " + scenario.getTableName();
 Integer count = new JdbcSession(connection).sql(sql).select(new Outcome() {
@@ -230,7 +228,7 @@ public class DataIngestIT extends ResultBaseTestIT {
 return null;
 }
 });
-assertNotNull("Could not retrieve count. " + count);
+assertNotNull("Could not retrieve count. ", count);
 assertEquals("Expected 100 rows to have been inserted",
 scenario.getRowCount(), count.intValue());
 }
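
The `assertNotNull` change in the hunk above fixes a classic JUnit pitfall: with string concatenation, the single-argument overload receives a non-null message string, so the value itself is never checked. A minimal sketch of the overload behavior (hypothetical stand-in methods, not the real `org.junit.Assert` source):

```java
public class AssertOverloadDemo {
    // Stand-in for Assert.assertNotNull(Object): the argument IS the value under test.
    static void assertNotNull(Object value) {
        if (value == null) throw new AssertionError("expected non-null");
    }

    // Stand-in for Assert.assertNotNull(String, Object): the first argument is the message.
    static void assertNotNull(String message, Object value) {
        if (value == null) throw new AssertionError(message);
    }

    public static void main(String[] args) {
        Integer count = null;
        // Old form: "..." + count concatenates to the non-null string
        // "Could not retrieve count. null", so the one-arg overload passes silently.
        assertNotNull("Could not retrieve count. " + count);
        System.out.println("old form passed even though count is null");
        try {
            // Fixed form: count itself is checked, so the assertion fires.
            assertNotNull("Could not retrieve count. ", count);
        } catch (AssertionError e) {
            System.out.println("fixed form failed as expected: " + e.getMessage());
        }
    }
}
```

This is why the patch moves the concatenated value out of the message position and passes it as the second argument.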
diff --git a/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/PherfMainIT.java b/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/PherfMainIT.java
index 7a080c8..3ee9327 100644
--- a/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/PherfMainIT.java
+++ b/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/PherfMainIT.java
@@ -25,10 +25,11 @@ import org.junit.contrib.java.lang.system.ExpectedSystemExit;
 import java.util.concurrent.Future;
 
 public class PherfMainIT extends ResultBaseTestIT {
+
 @Rule
 public final ExpectedSystemExit exit = ExpectedSystemExit.none();
 
-//@Test disabled until PHOENIX-5327 is fixed
+@Test
 public void testPherfMain() throws Exception {
 String[] args = { "-q", "-l",
 "--schemaFile", ".*create_prod_test_unsalted.sql",
diff --git a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/Pherf.java b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/Pherf.java
index 2b55e29..05e747a 100644
--- a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/Pherf.java
+++ b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/Pherf.java
@@ -25,7 +25,6 @@ import java.util.List;
 import java.util.Properties;
 
 import com.google.common.annotations.VisibleForTesting;
-import jline.internal.TestAccessible;
 import org.apache.commons.cli.CommandLine;
 import org.apache.commons.cli.CommandLineParser;
 import org.apache.commons.cli.HelpFormatter;
@@ -229,10 +228,10 @@ public class Pherf {
 }
 
 // Compare results and exit  
-   if (null != compareResults) {
+if (null != compareResults) {
 LOGGER.info("\nStarting to compare results and exiting for " + compareResults);
-   new GoogleChartGenerator(compareResults, compareType).readAndRender();
-   return;
+new GoogleChartGenerator(compareResults, compareType).readAndRender();
+return;
 }
 
 XMLConfigParser parser = new XMLConfigParser(scenarioFile);
diff --git a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/workload/QueryExecutor.java b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/workload/QueryExecutor.java
index d894

[phoenix] branch 4.x-HBase-1.3 updated: PHOENIX-5327 PherfMainIT fails with duplicate TABLE/INDEX.

2019-07-16 Thread mihir6692
This is an automated email from the ASF dual-hosted git repository.

mihir6692 pushed a commit to branch 4.x-HBase-1.3
in repository https://gitbox.apache.org/repos/asf/phoenix.git


The following commit(s) were added to refs/heads/4.x-HBase-1.3 by this push:
 new 2d23283  PHOENIX-5327 PherfMainIT fails with duplicate TABLE/INDEX.
2d23283 is described below

commit 2d23283cac5b765a980ab92eb88209a13481f783
Author: Viraj Jasani 
AuthorDate: Tue Jul 16 16:54:11 2019 +0530

PHOENIX-5327 PherfMainIT fails with duplicate TABLE/INDEX.

Signed-off-by: Monani Mihir 
---
 .../org/apache/phoenix/pherf/DataIngestIT.java |  6 +--
 .../java/org/apache/phoenix/pherf/PherfMainIT.java |  3 +-
 .../main/java/org/apache/phoenix/pherf/Pherf.java  |  7 ++-
 .../phoenix/pherf/workload/QueryExecutor.java  | 16 +--
 .../apache/phoenix/pherf/workload/Workload.java| 13 +++---
 .../phoenix/pherf/workload/WriteWorkload.java  |  7 +--
 .../scenario/prod_test_unsalted_scenario.xml   | 50 ++
 7 files changed, 62 insertions(+), 40 deletions(-)

diff --git a/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/DataIngestIT.java b/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/DataIngestIT.java
index 973ce2c..bc768e2 100644
--- a/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/DataIngestIT.java
+++ b/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/DataIngestIT.java
@@ -44,7 +44,6 @@ import org.apache.phoenix.pherf.workload.Workload;
 import org.apache.phoenix.pherf.workload.WorkloadExecutor;
 import org.apache.phoenix.pherf.workload.WriteWorkload;
 import org.junit.Before;
-import org.junit.Ignore;
 import org.junit.Test;
 
 import com.jcabi.jdbc.JdbcSession;
@@ -217,8 +216,7 @@ public class DataIngestIT extends ResultBaseTestIT {
 assertExpectedNumberOfRecordsWritten(scenario);
 }
 
-private void assertExpectedNumberOfRecordsWritten(Scenario scenario) throws Exception,
-SQLException {
+private void assertExpectedNumberOfRecordsWritten(Scenario scenario) throws Exception {
 Connection connection = util.getConnection(scenario.getTenantId());
 String sql = "select count(*) from " + scenario.getTableName();
 Integer count = new JdbcSession(connection).sql(sql).select(new Outcome() {
@@ -230,7 +228,7 @@ public class DataIngestIT extends ResultBaseTestIT {
 return null;
 }
 });
-assertNotNull("Could not retrieve count. " + count);
+assertNotNull("Could not retrieve count. ", count);
 assertEquals("Expected 100 rows to have been inserted",
 scenario.getRowCount(), count.intValue());
 }
diff --git a/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/PherfMainIT.java b/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/PherfMainIT.java
index 7a080c8..3ee9327 100644
--- a/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/PherfMainIT.java
+++ b/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/PherfMainIT.java
@@ -25,10 +25,11 @@ import org.junit.contrib.java.lang.system.ExpectedSystemExit;
 import java.util.concurrent.Future;
 
 public class PherfMainIT extends ResultBaseTestIT {
+
 @Rule
 public final ExpectedSystemExit exit = ExpectedSystemExit.none();
 
-//@Test disabled until PHOENIX-5327 is fixed
+@Test
 public void testPherfMain() throws Exception {
 String[] args = { "-q", "-l",
 "--schemaFile", ".*create_prod_test_unsalted.sql",
diff --git a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/Pherf.java b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/Pherf.java
index 2b55e29..05e747a 100644
--- a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/Pherf.java
+++ b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/Pherf.java
@@ -25,7 +25,6 @@ import java.util.List;
 import java.util.Properties;
 
 import com.google.common.annotations.VisibleForTesting;
-import jline.internal.TestAccessible;
 import org.apache.commons.cli.CommandLine;
 import org.apache.commons.cli.CommandLineParser;
 import org.apache.commons.cli.HelpFormatter;
@@ -229,10 +228,10 @@ public class Pherf {
 }
 
 // Compare results and exit  
-   if (null != compareResults) {
+if (null != compareResults) {
 LOGGER.info("\nStarting to compare results and exiting for " + compareResults);
-   new GoogleChartGenerator(compareResults, compareType).readAndRender();
-   return;
+new GoogleChartGenerator(compareResults, compareType).readAndRender();
+return;
 }
 
 XMLConfigParser parser = new XMLConfigParser(scenarioFile);
diff --git a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/workload/QueryExecutor.java b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/workload/QueryExecutor.java
index d894

[phoenix] branch master updated: PHOENIX-5327 PherfMainIT fails with duplicate TABLE/INDEX.

2019-07-16 Thread mihir6692
This is an automated email from the ASF dual-hosted git repository.

mihir6692 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/phoenix.git


The following commit(s) were added to refs/heads/master by this push:
 new d611422  PHOENIX-5327 PherfMainIT fails with duplicate TABLE/INDEX.
d611422 is described below

commit d61142206389b845376384ed1b2ef5e01e806a9a
Author: Viraj Jasani 
AuthorDate: Tue Jul 16 16:37:01 2019 +0530

PHOENIX-5327 PherfMainIT fails with duplicate TABLE/INDEX.

Signed-off-by: Monani Mihir 
---
 .../org/apache/phoenix/pherf/DataIngestIT.java |  6 +--
 .../java/org/apache/phoenix/pherf/PherfMainIT.java |  3 +-
 .../main/java/org/apache/phoenix/pherf/Pherf.java  |  7 ++-
 .../phoenix/pherf/workload/QueryExecutor.java  | 16 +--
 .../apache/phoenix/pherf/workload/Workload.java| 13 +++---
 .../phoenix/pherf/workload/WriteWorkload.java  | 10 ++---
 .../scenario/prod_test_unsalted_scenario.xml   | 50 ++
 7 files changed, 64 insertions(+), 41 deletions(-)

diff --git a/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/DataIngestIT.java b/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/DataIngestIT.java
index 973ce2c..bc768e2 100644
--- a/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/DataIngestIT.java
+++ b/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/DataIngestIT.java
@@ -44,7 +44,6 @@ import org.apache.phoenix.pherf.workload.Workload;
 import org.apache.phoenix.pherf.workload.WorkloadExecutor;
 import org.apache.phoenix.pherf.workload.WriteWorkload;
 import org.junit.Before;
-import org.junit.Ignore;
 import org.junit.Test;
 
 import com.jcabi.jdbc.JdbcSession;
@@ -217,8 +216,7 @@ public class DataIngestIT extends ResultBaseTestIT {
 assertExpectedNumberOfRecordsWritten(scenario);
 }
 
-private void assertExpectedNumberOfRecordsWritten(Scenario scenario) throws Exception,
-SQLException {
+private void assertExpectedNumberOfRecordsWritten(Scenario scenario) throws Exception {
 Connection connection = util.getConnection(scenario.getTenantId());
 String sql = "select count(*) from " + scenario.getTableName();
 Integer count = new JdbcSession(connection).sql(sql).select(new Outcome() {
@@ -230,7 +228,7 @@ public class DataIngestIT extends ResultBaseTestIT {
 return null;
 }
 });
-assertNotNull("Could not retrieve count. " + count);
+assertNotNull("Could not retrieve count. ", count);
 assertEquals("Expected 100 rows to have been inserted",
 scenario.getRowCount(), count.intValue());
 }
diff --git a/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/PherfMainIT.java b/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/PherfMainIT.java
index 7a080c8..3ee9327 100644
--- a/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/PherfMainIT.java
+++ b/phoenix-pherf/src/it/java/org/apache/phoenix/pherf/PherfMainIT.java
@@ -25,10 +25,11 @@ import org.junit.contrib.java.lang.system.ExpectedSystemExit;
 import java.util.concurrent.Future;
 
 public class PherfMainIT extends ResultBaseTestIT {
+
 @Rule
 public final ExpectedSystemExit exit = ExpectedSystemExit.none();
 
-//@Test disabled until PHOENIX-5327 is fixed
+@Test
 public void testPherfMain() throws Exception {
 String[] args = { "-q", "-l",
 "--schemaFile", ".*create_prod_test_unsalted.sql",
diff --git a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/Pherf.java b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/Pherf.java
index 2b55e29..05e747a 100644
--- a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/Pherf.java
+++ b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/Pherf.java
@@ -25,7 +25,6 @@ import java.util.List;
 import java.util.Properties;
 
 import com.google.common.annotations.VisibleForTesting;
-import jline.internal.TestAccessible;
 import org.apache.commons.cli.CommandLine;
 import org.apache.commons.cli.CommandLineParser;
 import org.apache.commons.cli.HelpFormatter;
@@ -229,10 +228,10 @@ public class Pherf {
 }
 
 // Compare results and exit  
-   if (null != compareResults) {
+if (null != compareResults) {
 LOGGER.info("\nStarting to compare results and exiting for " + compareResults);
-   new GoogleChartGenerator(compareResults, compareType).readAndRender();
-   return;
+new GoogleChartGenerator(compareResults, compareType).readAndRender();
+return;
 }
 
 XMLConfigParser parser = new XMLConfigParser(scenarioFile);
diff --git a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/workload/QueryExecutor.java b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/workload/QueryExecutor.java
index d894a96..c15cf1a

Build failed in Jenkins: Phoenix Compile Compatibility with HBase #1060

2019-07-16 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on H25 (ubuntu xenial) in workspace 

[Phoenix_Compile_Compat_wHBase] $ /bin/bash /tmp/jenkins6996390183754167556.sh
core file size  (blocks, -c) 0
data seg size   (kbytes, -d) unlimited
scheduling priority (-e) 0
file size   (blocks, -f) unlimited
pending signals (-i) 386407
max locked memory   (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files  (-n) 6
pipe size(512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority  (-r) 0
stack size  (kbytes, -s) 8192
cpu time   (seconds, -t) unlimited
max user processes  (-u) 10240
virtual memory  (kbytes, -v) unlimited
file locks  (-x) unlimited
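
The resource-limit dump above is the standard per-process limits report that the Jenkins wrapper script prints before compiling. It can be reproduced on any node with the shell builtin (a sketch; exact formatting varies by shell):

```shell
# Print the same per-process resource limits (core size, open files,
# stack size, etc.) that the Jenkins shell step dumps before the build.
ulimit -a
```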
core id : 0
core id : 1
core id : 2
core id : 3
core id : 4
core id : 5
physical id : 0
physical id : 1
MemTotal:   98957636 kB
MemFree:38415660 kB
Filesystem  Size  Used Avail Use% Mounted on
udev 48G 0   48G   0% /dev
tmpfs   9.5G  970M  8.5G  11% /run
/dev/sda3   3.6T  486G  3.0T  14% /
tmpfs48G 0   48G   0% /dev/shm
tmpfs   5.0M 0  5.0M   0% /run/lock
tmpfs48G 0   48G   0% /sys/fs/cgroup
/dev/sda2   473M  236M  213M  53% /boot
tmpfs   9.5G  4.0K  9.5G   1% /run/user/910
tmpfs   9.5G 0  9.5G   0% /run/user/1000
/dev/loop11  57M   57M 0 100% /snap/snapcraft/3022
/dev/loop4   57M   57M 0 100% /snap/snapcraft/3059
/dev/loop10  55M   55M 0 100% /snap/lxd/10972
/dev/loop7   89M   89M 0 100% /snap/core/7169
/dev/loop8   89M   89M 0 100% /snap/core/7270
/dev/loop2   55M   55M 0 100% /snap/lxd/11098
apache-maven-2.2.1
apache-maven-3.0.4
apache-maven-3.0.5
apache-maven-3.1.1
apache-maven-3.2.1
apache-maven-3.2.5
apache-maven-3.3.3
apache-maven-3.3.9
apache-maven-3.5.0
apache-maven-3.5.2
apache-maven-3.5.4
apache-maven-3.6.0
latest
latest2
latest3


===
Verifying compile level compatibility with HBase 0.98 with Phoenix 4.x-HBase-0.98
===

Cloning into 'hbase'...
Switched to a new branch '0.98'
Branch 0.98 set up to track remote branch 0.98 from origin.
[ERROR] Plugin org.codehaus.mojo:findbugs-maven-plugin:2.5.2 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.codehaus.mojo:findbugs-maven-plugin:jar:2.5.2: Could not transfer artifact org.codehaus.mojo:findbugs-maven-plugin:pom:2.5.2 from/to central (https://repo.maven.apache.org/maven2): Received fatal alert: protocol_version -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
Build step 'Execute shell' marked build as failure
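
A "Received fatal alert: protocol_version" from repo.maven.apache.org usually means the build JVM offered only TLS 1.0/1.1 to a repository that now requires TLS 1.2. A workaround sketch (assumes a Java 7 build JVM, where TLS 1.2 is supported but not enabled by default for clients):

```shell
# Tell the Maven JVM to offer TLS 1.2 when connecting to Maven Central;
# any mvn invocation in this shell inherits the option via MAVEN_OPTS.
export MAVEN_OPTS="-Dhttps.protocols=TLSv1.2"
echo "$MAVEN_OPTS"
```

Upgrading the node to Java 8, where TLS 1.2 is the client default, removes the need for this flag.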