Apache-Phoenix | Master | Build Successful

2017-09-12 Thread Apache Jenkins Server
Master branch build status Successful
Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/master

Last Successful Compiled Artifacts https://builds.apache.org/job/Phoenix-master/lastSuccessfulBuild/artifact/

Last Complete Test Report https://builds.apache.org/job/Phoenix-master/lastCompletedBuild/testReport/

Changes
[jamestaylor] PHOENIX-4205 Modify OutOfOrderMutationsIT to not use CURRENT_SCN

[jamestaylor] PHOENIX-4169 Explicitly cap timeout for index disable RPC on compaction

[jamestaylor] PHOENIX-4185 Rewrite tests to disable DDL and DML for PercentileIT.java



Build times for last couple of runs. Latest build time is the rightmost. | Legend: blue = normal, red = test failure, gray = timeout


Apache-Phoenix | 4.x-HBase-1.1 | Build Successful

2017-09-12 Thread Apache Jenkins Server
4.x-HBase-1.1 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.1

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/lastCompletedBuild/testReport/

Changes
[jamestaylor] PHOENIX-4185 Rewrite tests to disable DDL and DML for PercentileIT.java

[jamestaylor] PHOENIX-4205 Modify OutOfOrderMutationsIT to not use CURRENT_SCN

[jamestaylor] PHOENIX-4169 Explicitly cap timeout for index disable RPC on compaction





Apache-Phoenix | 4.x-HBase-1.2 | Build Successful

2017-09-12 Thread Apache Jenkins Server
4.x-HBase-1.2 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.2

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.2/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.2/lastCompletedBuild/testReport/

Changes
[samarth] PHOENIX-4201 Addendum to clear cache in deleteMetadata

[elserj] PHOENIX-4191 Categorize uncategorized integration tests

[jamestaylor] PHOENIX-4185 Rewrite tests to disable DDL and DML for PercentileIT.java

[jamestaylor] PHOENIX-4205 Modify OutOfOrderMutationsIT to not use CURRENT_SCN

[jamestaylor] PHOENIX-4169 Explicitly cap timeout for index disable RPC on compaction





Apache-Phoenix | Master | Build Successful

2017-09-12 Thread Apache Jenkins Server
Master branch build status Successful
Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/master

Last Successful Compiled Artifacts https://builds.apache.org/job/Phoenix-master/lastSuccessfulBuild/artifact/

Last Complete Test Report https://builds.apache.org/job/Phoenix-master/lastCompletedBuild/testReport/

Changes
[samarth] PHOENIX-4201 Addendum to clear cache in deleteMetadata

[elserj] PHOENIX-4191 Categorize uncategorized integration tests





Apache-Phoenix | 4.x-HBase-1.1 | Build Successful

2017-09-12 Thread Apache Jenkins Server
4.x-HBase-1.1 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.1

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/lastCompletedBuild/testReport/

Changes
[samarth] PHOENIX-4201 Addendum to clear cache in deleteMetadata

[elserj] PHOENIX-4191 Categorize uncategorized integration tests





Apache-Phoenix | 4.x-HBase-1.2 | Build Successful

2017-09-12 Thread Apache Jenkins Server
4.x-HBase-1.2 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.2

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.2/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.2/lastCompletedBuild/testReport/

Changes
[samarth] PHOENIX-4201 Remove usage of SCN from QueryDatabaseMetadataIT





[4/4] phoenix git commit: PHOENIX-4169 Explicitly cap timeout for index disable RPC on compaction (Vincent Poon)

2017-09-12 Thread jamestaylor
PHOENIX-4169 Explicitly cap timeout for index disable RPC on compaction (Vincent Poon)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/28941bc6
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/28941bc6
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/28941bc6

Branch: refs/heads/4.x-HBase-1.2
Commit: 28941bc6f0421dbf0c48582768356b3ea90063b7
Parents: c27d7de
Author: James Taylor 
Authored: Tue Sep 12 17:00:47 2017 -0700
Committer: James Taylor 
Committed: Tue Sep 12 17:11:10 2017 -0700

--
 .../UngroupedAggregateRegionObserver.java   | 29 
 .../org/apache/phoenix/hbase/index/Indexer.java | 14 +-
 .../org/apache/phoenix/query/QueryServices.java |  4 +++
 .../phoenix/query/QueryServicesOptions.java |  5 
 4 files changed, 46 insertions(+), 6 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/28941bc6/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java
index a61f502..0773ebc 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java
@@ -53,6 +53,7 @@ import org.apache.hadoop.hbase.HTableDescriptor;
 import org.apache.hadoop.hbase.KeyValue;
 import org.apache.hadoop.hbase.NamespaceDescriptor;
 import org.apache.hadoop.hbase.TableName;
+import org.apache.hadoop.hbase.client.CoprocessorHConnection;
 import org.apache.hadoop.hbase.client.Delete;
 import org.apache.hadoop.hbase.client.Durability;
 import org.apache.hadoop.hbase.client.Get;
@@ -67,6 +68,7 @@ import org.apache.hadoop.hbase.coprocessor.RegionCoprocessorEnvironment;
 import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
 import org.apache.hadoop.hbase.ipc.RpcControllerFactory;
 import org.apache.hadoop.hbase.ipc.controller.InterRegionServerIndexRpcControllerFactory;
+import org.apache.hadoop.hbase.regionserver.HRegionServer;
 import org.apache.hadoop.hbase.regionserver.InternalScanner;
 import org.apache.hadoop.hbase.regionserver.Region;
 import org.apache.hadoop.hbase.regionserver.RegionScanner;
@@ -98,6 +100,7 @@ import org.apache.phoenix.index.PhoenixIndexCodec;
 import org.apache.phoenix.jdbc.PhoenixDatabaseMetaData;
 import org.apache.phoenix.join.HashJoinInfo;
 import org.apache.phoenix.query.QueryConstants;
+import org.apache.phoenix.query.QueryServices;
 import org.apache.phoenix.query.QueryServicesOptions;
 import org.apache.phoenix.schema.ColumnFamilyNotFoundException;
 import org.apache.phoenix.schema.PColumn;
@@ -192,6 +195,7 @@ public class UngroupedAggregateRegionObserver extends BaseScannerRegionObserver
     private static final Logger logger = LoggerFactory.getLogger(UngroupedAggregateRegionObserver.class);
     private KeyValueBuilder kvBuilder;
     private Configuration upsertSelectConfig;
+    private Configuration compactionConfig;
 
 @Override
 public void start(CoprocessorEnvironment e) throws IOException {
@@ -212,6 +216,15 @@ public class UngroupedAggregateRegionObserver extends BaseScannerRegionObserver
      */
        upsertSelectConfig.setClass(RpcControllerFactory.CUSTOM_CONTROLLER_CONF_KEY,
                InterRegionServerIndexRpcControllerFactory.class, RpcControllerFactory.class);
+
+        compactionConfig = PropertiesUtil.cloneConfig(e.getConfiguration());
+        // lower the number of rpc retries, so we don't hang the compaction
+        compactionConfig.setInt(HConstants.HBASE_CLIENT_RETRIES_NUMBER,
+                e.getConfiguration().getInt(QueryServices.METADATA_WRITE_RETRIES_NUMBER,
+                        QueryServicesOptions.DEFAULT_METADATA_WRITE_RETRIES_NUMBER));
+        compactionConfig.setInt(HConstants.HBASE_CLIENT_PAUSE,
+                e.getConfiguration().getInt(QueryServices.METADATA_WRITE_RETRY_PAUSE,
+                        QueryServicesOptions.DEFAULT_METADATA_WRITE_RETRY_PAUSE));
     }
 
     private void commitBatch(Region region, List<Mutation> mutations, long blockingMemstoreSize) throws IOException {
@@ -924,11 +937,16 @@ public class UngroupedAggregateRegionObserver extends BaseScannerRegionObserver
             public Void run() throws Exception {
                 MutationCode mutationCode = null;
                 long disableIndexTimestamp = 0;
-
-                try (HTableInterface htable = e.getEnvironment().getTable(
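The hunk above clones the coprocessor environment's configuration and caps the HBase client retry settings so the index-disable RPC issued during compaction gives up quickly instead of stalling the compaction thread. A minimal sketch of that clone-and-cap pattern, using a plain Map as a stand-in for HBase's Configuration; the key names mirror the real ones, but the defaults (5 retries, 100 ms pause) are illustrative assumptions, not Phoenix's actual values:

```java
import java.util.HashMap;
import java.util.Map;

public class CompactionConfigSketch {
    public static Map<String, Integer> cloneAndCapRetries(Map<String, Integer> env) {
        // Clone first so the capped values never leak into the shared server config.
        Map<String, Integer> compactionConfig = new HashMap<>(env);
        // Fall back to assumed metadata-write defaults when unset.
        int retries = env.getOrDefault("phoenix.metadata.write.retries", 5);
        int pauseMs = env.getOrDefault("phoenix.metadata.write.retry.pause", 100);
        compactionConfig.put("hbase.client.retries.number", retries);
        compactionConfig.put("hbase.client.pause", pauseMs);
        return compactionConfig;
    }

    public static void main(String[] args) {
        Map<String, Integer> env = new HashMap<>();
        env.put("hbase.client.retries.number", 35); // far too many retries for a compaction-time RPC
        Map<String, Integer> capped = cloneAndCapRetries(env);
        System.out.println(capped.get("hbase.client.retries.number")); // capped to the metadata-write limit
        System.out.println(env.get("hbase.client.retries.number"));    // original config untouched
    }
}
```

Cloning rather than mutating matters: the same Configuration object is shared across the region server, so capping retries in place would throttle every client, not just the compaction path.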

[1/4] phoenix git commit: PHOENIX-4185 Rewrite tests to disable DDL and DML for PercentileIT.java and ProductMetricsIT.java (Ethan Wang)

2017-09-12 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.2 b1751c4de -> 28941bc6f


http://git-wip-us.apache.org/repos/asf/phoenix/blob/d1787fc9/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java
index 87b7af6..969b585 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java
@@ -48,13 +48,15 @@ import org.apache.phoenix.util.DateUtil;
 import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
 import org.apache.phoenix.util.SchemaUtil;
+import org.apache.phoenix.util.TestUtil;
+import org.junit.After;
 import org.junit.Test;
 
 import com.google.common.collect.Lists;
 import com.google.common.collect.Ordering;
 
 
-public class ProductMetricsIT extends BaseClientManagedTimeIT {
+public class ProductMetricsIT extends ParallelStatsDisabledIT {
 private static final String PRODUCT_METRICS_NAME = "PRODUCT_METRICS";
 private static final String PRODUCT_METRICS_SCHEMA_NAME = "";
 private static final String DS1 = "1970-01-01 00:58:00";
@@ -76,57 +78,55 @@ public class ProductMetricsIT extends BaseClientManagedTimeIT {
 private static final String F3 = "C";
 private static final String R1 = "R1";
 private static final String R2 = "R2";
-
+
     private static byte[][] getSplits(String tenantId) {
-        return new byte[][] { 
-                ByteUtil.concat(Bytes.toBytes(tenantId), PDate.INSTANCE.toBytes(D3)),
-                ByteUtil.concat(Bytes.toBytes(tenantId), PDate.INSTANCE.toBytes(D5)),
-                };
+        return new byte[][] {
+                ByteUtil.concat(Bytes.toBytes(tenantId), PDate.INSTANCE.toBytes(D3)),
+                ByteUtil.concat(Bytes.toBytes(tenantId), PDate.INSTANCE.toBytes(D5)),
+        };
     }
-
+
     private static Date toDate(String dateString) {
         return DateUtil.parseDate(dateString);
    }
-
-    private static void initTable(byte[][] splits, long ts) throws Exception {
-        ensureTableCreated(getUrl(), PRODUCT_METRICS_NAME, PRODUCT_METRICS_NAME, splits, ts-2, null);
+
+    private static void initTable(String tablename, byte[][] splits) throws Exception {
+        ensureTableCreated(getUrl(), tablename, PRODUCT_METRICS_NAME, splits, null, null);
     }
 
-    private static void assertNoRows(Connection conn) throws SQLException {
+    private static void assertNoRows(String tablename, Connection conn) throws SQLException {
         Statement stmt = conn.createStatement();
-        ResultSet rs = stmt.executeQuery("select 1 from PRODUCT_METRICS");
+        ResultSet rs = stmt.executeQuery("select 1 from " + tablename);
         assertFalse(rs.next());
     }
-
-    private static void initTableValues(String tenantId, byte[][] splits, long ts) throws Exception {
-        initTable(splits, ts);
 
-        String url = getUrl() + ";" + PhoenixRuntime.CURRENT_SCN_ATTRIB + "=" + ts; // Run query at timestamp 5
+    private static void initTableValues(String tablename, String tenantId, byte[][] splits) throws Exception {
+        initTable(tablename, splits);
         Properties props = PropertiesUtil.deepCopy(TEST_PROPERTIES);
-        Connection conn = DriverManager.getConnection(url, props);
+        Connection conn = DriverManager.getConnection(getUrl(), props);
         try {
-            assertNoRows(conn);
-            initTableValues(conn, tenantId);
+            assertNoRows(tablename, conn);
+            initTableValues(tablename, conn, tenantId);
             conn.commit();
         } finally {
             conn.close();
         }
     }
-
-    protected static void initTableValues(Connection conn, String tenantId) throws Exception {
+
+    protected static void initTableValues(String tablename, Connection conn, String tenantId) throws Exception {
         PreparedStatement stmt = conn.prepareStatement(
-            "upsert into " +
-            "PRODUCT_METRICS(" +
-            "ORGANIZATION_ID, " +
-            "\"DATE\", " +
-            "FEATURE, " +
-            "UNIQUE_USERS, " +
-            "TRANSACTIONS, " +
-            "CPU_UTILIZATION, " +
-            "DB_UTILIZATION, " +
-            "REGION, " +
-            "IO_TIME)" +
-            "VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)");
+            "upsert into " + tablename +
+            " (" +
+            "ORGANIZATION_ID, " +
+            "\"DATE\", " +
+            "FEATURE, " +
+            "UNIQUE_USERS, " +
+            "TRANSACTIONS, " +
+            "CPU_UTILIZATION, " +
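The rewrite above threads a generated table name through the test helpers instead of hard-coding PRODUCT_METRICS, keeping every value a bind parameter. A minimal sketch of the statement text the rewritten helper prepares; the table name used below is a hypothetical generateUniqueName()-style value, and the column list comes from the diff:

```java
public class UpsertSqlSketch {
    // Builds the UPSERT text: only the table name varies, nothing else is inlined.
    public static String upsertSql(String tablename) {
        return "upsert into " + tablename
                + " (ORGANIZATION_ID, \"DATE\", FEATURE, UNIQUE_USERS, TRANSACTIONS, "
                + "CPU_UTILIZATION, DB_UTILIZATION, REGION, IO_TIME)"
                + " VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)";
    }

    public static void main(String[] args) {
        String sql = upsertSql("T000001"); // illustrative unique table name
        System.out.println(sql.chars().filter(c -> c == '?').count()); // 9 bind parameters
    }
}
```

Because each test now owns a unique table, the suite can run in parallel (ParallelStatsDisabledIT) instead of serializing on the shared PRODUCT_METRICS table.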

[2/4] phoenix git commit: PHOENIX-4185 Rewrite tests to disable DDL and DML for PercentileIT.java and ProductMetricsIT.java (Ethan Wang)

2017-09-12 Thread jamestaylor
PHOENIX-4185 Rewrite tests to disable DDL and DML for PercentileIT.java and ProductMetricsIT.java (Ethan Wang)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/d1787fc9
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/d1787fc9
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/d1787fc9

Branch: refs/heads/4.x-HBase-1.2
Commit: d1787fc91f66602003284a7d53718b80007b2932
Parents: b1751c4
Author: James Taylor 
Authored: Tue Sep 12 17:04:18 2017 -0700
Committer: James Taylor 
Committed: Tue Sep 12 17:11:01 2017 -0700

--
 .gitignore  |   2 +
 .../apache/phoenix/end2end/PercentileIT.java| 145 ++-
 .../phoenix/end2end/ProductMetricsIT.java   | 947 +--
 3 files changed, 512 insertions(+), 582 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/d1787fc9/.gitignore
--
diff --git a/.gitignore b/.gitignore
index 803e8ea..33d40ea 100644
--- a/.gitignore
+++ b/.gitignore
@@ -18,6 +18,8 @@
 # intellij stuff
 .idea/
 *.iml
+*.ipr
+*.iws
 
 #maven stuffs
 target/

http://git-wip-us.apache.org/repos/asf/phoenix/blob/d1787fc9/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java
index 2d0ead9..965fc2c 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java
@@ -46,7 +46,6 @@ import java.sql.Date;
 import java.sql.DriverManager;
 import java.sql.PreparedStatement;
 import java.sql.ResultSet;
-import java.sql.ResultSetMetaData;
 import java.sql.Statement;
 import java.sql.SQLException;
 import java.sql.Types;
@@ -54,7 +53,6 @@ import java.util.Properties;
 
 import org.apache.phoenix.query.QueryConstants;
 import org.apache.phoenix.util.DateUtil;
-import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
 import org.apache.phoenix.util.TestUtil;
 import org.junit.Test;
@@ -65,10 +63,9 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 @Test
 public void testPercentile() throws Exception {
 String tenantId = getOrganizationId();
-        String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null, null);
+        String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null);
 
         String query = "SELECT PERCENTILE_CONT(0.9) WITHIN GROUP (ORDER BY A_INTEGER ASC) FROM " + tableName;
-
 Properties props = PropertiesUtil.deepCopy(TEST_PROPERTIES);
 Connection conn = DriverManager.getConnection(getUrl(), props);
 try {
@@ -87,7 +84,7 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 @Test
 public void testPercentileDesc() throws Exception {
 String tenantId = getOrganizationId();
-        String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null, null);
+        String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null);
 
         String query = "SELECT PERCENTILE_CONT(0.9) WITHIN GROUP (ORDER BY A_INTEGER DESC) FROM " + tableName;
 
@@ -105,11 +102,11 @@ public class PercentileIT extends ParallelStatsDisabledIT {
             conn.close();
         }
     }
-
+
 @Test
 public void testPercentileWithGroupby() throws Exception {
 String tenantId = getOrganizationId();
-        String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null, null);
+        String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null);
 
         String query = "SELECT A_STRING, PERCENTILE_CONT(0.9) WITHIN GROUP (ORDER BY A_INTEGER ASC) FROM " + tableName + " GROUP BY A_STRING";
 
@@ -142,7 +139,7 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 @Test
 public void testPercentileWithGroupbyAndOrderBy() throws Exception {
 String tenantId = getOrganizationId();
-        String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null, null);
+        String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null);
 
         String query = "SELECT A_STRING, PERCENTILE_CONT(0.9) WITHIN GROUP (ORDER BY A_INTEGER ASC) AS PC FROM " + tableName + " GROUP BY A_STRING ORDER BY PC";
 
@@ -173,51 +170,51 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 }
 
 @Test
-   public void testPercentileDiscAsc() 

[2/4] phoenix git commit: PHOENIX-4185 Rewrite tests to disable DDL and DML for PercentileIT.java and ProductMetricsIT.java (Ethan Wang)

2017-09-12 Thread jamestaylor
PHOENIX-4185 Rewrite tests to disable DDL and DML for PercentileIT.java and ProductMetricsIT.java (Ethan Wang)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/cd203d1d
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/cd203d1d
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/cd203d1d

Branch: refs/heads/4.x-HBase-1.1
Commit: cd203d1d55dd4b76781127d3b37e7c6a5035ec06
Parents: ca8bd4a
Author: James Taylor 
Authored: Tue Sep 12 17:04:18 2017 -0700
Committer: James Taylor 
Committed: Tue Sep 12 17:09:00 2017 -0700

--
 .gitignore  |   2 +
 .../apache/phoenix/end2end/PercentileIT.java| 145 ++-
 .../phoenix/end2end/ProductMetricsIT.java   | 947 +--
 3 files changed, 512 insertions(+), 582 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/cd203d1d/.gitignore
--
diff --git a/.gitignore b/.gitignore
index 803e8ea..33d40ea 100644
--- a/.gitignore
+++ b/.gitignore
@@ -18,6 +18,8 @@
 # intellij stuff
 .idea/
 *.iml
+*.ipr
+*.iws
 
 #maven stuffs
 target/

http://git-wip-us.apache.org/repos/asf/phoenix/blob/cd203d1d/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java
index 2d0ead9..965fc2c 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java
@@ -46,7 +46,6 @@ import java.sql.Date;
 import java.sql.DriverManager;
 import java.sql.PreparedStatement;
 import java.sql.ResultSet;
-import java.sql.ResultSetMetaData;
 import java.sql.Statement;
 import java.sql.SQLException;
 import java.sql.Types;
@@ -54,7 +53,6 @@ import java.util.Properties;
 
 import org.apache.phoenix.query.QueryConstants;
 import org.apache.phoenix.util.DateUtil;
-import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
 import org.apache.phoenix.util.TestUtil;
 import org.junit.Test;
@@ -65,10 +63,9 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 @Test
 public void testPercentile() throws Exception {
 String tenantId = getOrganizationId();
-        String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null, null);
+        String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null);
 
         String query = "SELECT PERCENTILE_CONT(0.9) WITHIN GROUP (ORDER BY A_INTEGER ASC) FROM " + tableName;
-
 Properties props = PropertiesUtil.deepCopy(TEST_PROPERTIES);
 Connection conn = DriverManager.getConnection(getUrl(), props);
 try {
@@ -87,7 +84,7 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 @Test
 public void testPercentileDesc() throws Exception {
 String tenantId = getOrganizationId();
-        String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null, null);
+        String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null);
 
         String query = "SELECT PERCENTILE_CONT(0.9) WITHIN GROUP (ORDER BY A_INTEGER DESC) FROM " + tableName;
 
@@ -105,11 +102,11 @@ public class PercentileIT extends ParallelStatsDisabledIT {
             conn.close();
         }
     }
-
+
 @Test
 public void testPercentileWithGroupby() throws Exception {
 String tenantId = getOrganizationId();
-        String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null, null);
+        String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null);
 
         String query = "SELECT A_STRING, PERCENTILE_CONT(0.9) WITHIN GROUP (ORDER BY A_INTEGER ASC) FROM " + tableName + " GROUP BY A_STRING";
 
@@ -142,7 +139,7 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 @Test
 public void testPercentileWithGroupbyAndOrderBy() throws Exception {
 String tenantId = getOrganizationId();
-        String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null, null);
+        String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null);
 
         String query = "SELECT A_STRING, PERCENTILE_CONT(0.9) WITHIN GROUP (ORDER BY A_INTEGER ASC) AS PC FROM " + tableName + " GROUP BY A_STRING ORDER BY PC";
 
@@ -173,51 +170,51 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 }
 
 @Test
-   public void testPercentileDiscAsc() 

[3/4] phoenix git commit: PHOENIX-4205 Modify OutOfOrderMutationsIT to not use CURRENT_SCN

2017-09-12 Thread jamestaylor
PHOENIX-4205 Modify OutOfOrderMutationsIT to not use CURRENT_SCN


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/f5ffabba
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/f5ffabba
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/f5ffabba

Branch: refs/heads/4.x-HBase-1.1
Commit: f5ffabba4afb29032419ac61c039a103213183a0
Parents: cd203d1
Author: James Taylor 
Authored: Tue Sep 12 16:50:27 2017 -0700
Committer: James Taylor 
Committed: Tue Sep 12 17:09:07 2017 -0700

--
 .../phoenix/end2end/OutOfOrderMutationsIT.java  | 955 +++
 1 file changed, 346 insertions(+), 609 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/f5ffabba/phoenix-core/src/it/java/org/apache/phoenix/end2end/OutOfOrderMutationsIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/OutOfOrderMutationsIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/OutOfOrderMutationsIT.java
index e8adf6b..40b58f2 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/OutOfOrderMutationsIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/OutOfOrderMutationsIT.java
@@ -19,631 +19,368 @@ package org.apache.phoenix.end2end;
 
 import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertFalse;
-import static org.junit.Assert.assertNotEquals;
 import static org.junit.Assert.assertTrue;
 
-import java.io.IOException;
 import java.sql.Connection;
 import java.sql.DriverManager;
 import java.sql.PreparedStatement;
 import java.sql.ResultSet;
-import java.sql.SQLException;
 import java.sql.Timestamp;
 import java.util.Properties;
 
-import org.apache.hadoop.hbase.CellScanner;
-import org.apache.hadoop.hbase.client.HTableInterface;
-import org.apache.hadoop.hbase.client.Result;
-import org.apache.hadoop.hbase.client.ResultScanner;
-import org.apache.hadoop.hbase.client.Scan;
 import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.phoenix.jdbc.PhoenixConnection;
+import org.apache.phoenix.util.EnvironmentEdge;
+import org.apache.phoenix.util.EnvironmentEdgeManager;
 import org.apache.phoenix.util.IndexScrutiny;
-import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
 import org.apache.phoenix.util.TestUtil;
 import org.junit.Ignore;
 import org.junit.Test;
 
 public class OutOfOrderMutationsIT extends ParallelStatsDisabledIT {
-    @Test
-    public void testOutOfOrderDelete() throws Exception {
-        String tableName = generateUniqueName();
-        String indexName = generateUniqueName();
-        Properties props = PropertiesUtil.deepCopy(TestUtil.TEST_PROPERTIES);
-        long ts = 1000;
-        props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-        Connection conn = DriverManager.getConnection(getUrl(), props);
-        conn.createStatement().execute("CREATE TABLE " + tableName + "(k1 CHAR(2) NOT NULL, k2 CHAR(2) NOT NULL, ts TIMESTAMP, CONSTRAINT pk PRIMARY KEY (k1,k2)) COLUMN_ENCODED_BYTES = 0");
-        conn.close();
-
-        ts = 1010;
-        props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-        conn = DriverManager.getConnection(getUrl(), props);
-        conn.createStatement().execute("CREATE INDEX " + indexName + " ON " + tableName + "(k2,k1,ts)");
-        conn.close();
-
-        ts = 1020;
-        props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-        conn = DriverManager.getConnection(getUrl(), props);
-        PreparedStatement stmt = conn.prepareStatement("UPSERT INTO " + tableName + " VALUES('aa','aa',?)");
-        stmt.setTimestamp(1, new Timestamp(1000L));
-        stmt.executeUpdate();
-        conn.commit();
-        conn.close();
-
-        ts = 1040;
-        props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-        conn = DriverManager.getConnection(getUrl(), props);
-        conn.createStatement().execute("DELETE FROM " + tableName + " WHERE k1='aa'");
-        conn.commit();
-        conn.close();
-
-        ts = 1030;
-        props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-        conn = DriverManager.getConnection(getUrl(), props);
-        stmt = conn.prepareStatement("UPSERT INTO " + tableName + " VALUES('aa','aa',?)");
-        stmt.setTimestamp(1, new Timestamp(2000L));
-        stmt.executeUpdate();
-        conn.commit();
-        conn.close();
-
-        ts = 1050;
-        props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-        conn = DriverManager.getConnection(getUrl(), props);
-
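The deleted test above drove out-of-order mutations by setting CURRENT_SCN on each connection; the rewrite's added imports (EnvironmentEdge, EnvironmentEdgeManager) show the replacement: inject a controllable clock into the server-side time source instead of faking timestamps per connection. A minimal sketch of that injectable-clock pattern, with names that are illustrative stand-ins rather than Phoenix's actual EnvironmentEdgeManager API:

```java
public class ClockSketch {
    /** Minimal stand-in for an EnvironmentEdge: a pluggable source of "now". */
    public interface Clock { long currentTime(); }

    /** Test clock: time only moves when the test says so. */
    public static final class ManualClock implements Clock {
        private long now;
        public ManualClock(long start) { this.now = start; }
        public long currentTime() { return now; }
        public void incrementBy(long delta) { now += delta; }
    }

    private static Clock clock = System::currentTimeMillis; // production default: wall clock

    public static void inject(Clock c) { clock = c; }       // tests swap in a ManualClock
    public static long currentTime() { return clock.currentTime(); }

    public static void main(String[] args) {
        ManualClock manual = new ManualClock(1000L);
        inject(manual); // the rewritten tests control time directly instead of CURRENT_SCN
        System.out.println(currentTime()); // 1000
        manual.incrementBy(20L);
        System.out.println(currentTime()); // 1020
    }
}
```

The advantage over CURRENT_SCN is that all code paths, client and coprocessor alike, read the same injected clock, so out-of-order scenarios can be staged without opening a fresh connection per timestamp.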

[1/4] phoenix git commit: PHOENIX-4185 Rewrite tests to disable DDL and DML for PercentileIT.java and ProductMetricsIT.java (Ethan Wang)

2017-09-12 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.1 ca8bd4a90 -> 2352f819e


http://git-wip-us.apache.org/repos/asf/phoenix/blob/cd203d1d/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java
index 87b7af6..969b585 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java
@@ -48,13 +48,15 @@ import org.apache.phoenix.util.DateUtil;
 import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
 import org.apache.phoenix.util.SchemaUtil;
+import org.apache.phoenix.util.TestUtil;
+import org.junit.After;
 import org.junit.Test;
 
 import com.google.common.collect.Lists;
 import com.google.common.collect.Ordering;
 
 
-public class ProductMetricsIT extends BaseClientManagedTimeIT {
+public class ProductMetricsIT extends ParallelStatsDisabledIT {
 private static final String PRODUCT_METRICS_NAME = "PRODUCT_METRICS";
 private static final String PRODUCT_METRICS_SCHEMA_NAME = "";
 private static final String DS1 = "1970-01-01 00:58:00";
@@ -76,57 +78,55 @@ public class ProductMetricsIT extends BaseClientManagedTimeIT {
 private static final String F3 = "C";
 private static final String R1 = "R1";
 private static final String R2 = "R2";
-
+
     private static byte[][] getSplits(String tenantId) {
-        return new byte[][] { 
-                ByteUtil.concat(Bytes.toBytes(tenantId), PDate.INSTANCE.toBytes(D3)),
-                ByteUtil.concat(Bytes.toBytes(tenantId), PDate.INSTANCE.toBytes(D5)),
-                };
+        return new byte[][] {
+                ByteUtil.concat(Bytes.toBytes(tenantId), PDate.INSTANCE.toBytes(D3)),
+                ByteUtil.concat(Bytes.toBytes(tenantId), PDate.INSTANCE.toBytes(D5)),
+        };
     }
-
+
     private static Date toDate(String dateString) {
         return DateUtil.parseDate(dateString);
     }
-
-    private static void initTable(byte[][] splits, long ts) throws Exception {
-        ensureTableCreated(getUrl(), PRODUCT_METRICS_NAME, PRODUCT_METRICS_NAME, splits, ts-2, null);
+
+    private static void initTable(String tablename, byte[][] splits) throws Exception {
+        ensureTableCreated(getUrl(), tablename, PRODUCT_METRICS_NAME, splits, null, null);
     }
 
-    private static void assertNoRows(Connection conn) throws SQLException {
+    private static void assertNoRows(String tablename, Connection conn) throws SQLException {
         Statement stmt = conn.createStatement();
-        ResultSet rs = stmt.executeQuery("select 1 from PRODUCT_METRICS");
+        ResultSet rs = stmt.executeQuery("select 1 from " + tablename);
         assertFalse(rs.next());
     }
-
-    private static void initTableValues(String tenantId, byte[][] splits, long ts) throws Exception {
-        initTable(splits, ts);
 
-        String url = getUrl() + ";" + PhoenixRuntime.CURRENT_SCN_ATTRIB + "=" + ts; // Run query at timestamp 5
+    private static void initTableValues(String tablename, String tenantId, byte[][] splits) throws Exception {
+        initTable(tablename, splits);
         Properties props = PropertiesUtil.deepCopy(TEST_PROPERTIES);
-        Connection conn = DriverManager.getConnection(url, props);
+        Connection conn = DriverManager.getConnection(getUrl(), props);
         try {
-            assertNoRows(conn);
-            initTableValues(conn, tenantId);
+            assertNoRows(tablename, conn);
+            initTableValues(tablename, conn, tenantId);
             conn.commit();
         } finally {
             conn.close();
         }
     }
-
-    protected static void initTableValues(Connection conn, String tenantId) throws Exception {
+
+    protected static void initTableValues(String tablename, Connection conn, String tenantId) throws Exception {
         PreparedStatement stmt = conn.prepareStatement(
-            "upsert into " +
-            "PRODUCT_METRICS(" +
-            "ORGANIZATION_ID, " +
-            "\"DATE\", " +
-            "FEATURE, " +
-            "UNIQUE_USERS, " +
-            "TRANSACTIONS, " +
-            "CPU_UTILIZATION, " +
-            "DB_UTILIZATION, " +
-            "REGION, " +
-            "IO_TIME)" +
-            "VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)");
+            "upsert into " + tablename +
+            " (" +
+            "ORGANIZATION_ID, " +
+            "\"DATE\", " +
+            "FEATURE, " +
+            "UNIQUE_USERS, " +
+            "TRANSACTIONS, " +
+            "CPU_UTILIZATION, " +

[4/4] phoenix git commit: PHOENIX-4169 Explicitly cap timeout for index disable RPC on compaction (Vincent Poon)

2017-09-12 Thread jamestaylor
PHOENIX-4169 Explicitly cap timeout for index disable RPC on compaction (Vincent Poon)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/2352f819
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/2352f819
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/2352f819

Branch: refs/heads/4.x-HBase-1.1
Commit: 2352f819e7d91cbd45ea6f55023c4e1ed0579ed6
Parents: f5ffabb
Author: James Taylor 
Authored: Tue Sep 12 17:00:47 2017 -0700
Committer: James Taylor 
Committed: Tue Sep 12 17:09:34 2017 -0700

--
 .../UngroupedAggregateRegionObserver.java   | 29 
 .../org/apache/phoenix/hbase/index/Indexer.java | 14 +-
 .../org/apache/phoenix/query/QueryServices.java |  4 +++
 .../phoenix/query/QueryServicesOptions.java |  5 
 4 files changed, 46 insertions(+), 6 deletions(-)
--
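Per the commit message above, the fix clones the server configuration and caps client retries before issuing the index-disable RPC, so a failing RPC cannot stall the compaction. Below is a minimal stand-alone sketch of that clone-and-override pattern, using plain maps and hypothetical key names in place of the real HBase `HConstants` and Phoenix `QueryServices` constants:

```java
import java.util.HashMap;
import java.util.Map;

public class RetryCapSketch {
    // Hypothetical stand-ins for the HConstants / QueryServices keys.
    static final String CLIENT_RETRIES = "hbase.client.retries.number";
    static final String COMPACTION_RETRIES = "phoenix.metadata.write.retries";
    static final int DEFAULT_COMPACTION_RETRIES = 5;

    // Clone the server config and cap retries so an index-disable RPC issued
    // during compaction fails fast instead of hanging the compaction thread.
    static Map<String, String> compactionConfig(Map<String, String> serverConf) {
        Map<String, String> copy = new HashMap<>(serverConf);
        String capped = serverConf.getOrDefault(
                COMPACTION_RETRIES, Integer.toString(DEFAULT_COMPACTION_RETRIES));
        copy.put(CLIENT_RETRIES, capped);
        return copy;
    }

    public static void main(String[] args) {
        Map<String, String> serverConf = new HashMap<>();
        serverConf.put(CLIENT_RETRIES, "35"); // normal client retry count
        Map<String, String> capped = compactionConfig(serverConf);
        System.out.println(capped.get(CLIENT_RETRIES)); // prints 5
    }
}
```

The actual patch applies the same idea with `PropertiesUtil.cloneConfig` and `Configuration.setInt`, as the hunks below show.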


http://git-wip-us.apache.org/repos/asf/phoenix/blob/2352f819/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java
index a61f502..0773ebc 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java
@@ -53,6 +53,7 @@ import org.apache.hadoop.hbase.HTableDescriptor;
 import org.apache.hadoop.hbase.KeyValue;
 import org.apache.hadoop.hbase.NamespaceDescriptor;
 import org.apache.hadoop.hbase.TableName;
+import org.apache.hadoop.hbase.client.CoprocessorHConnection;
 import org.apache.hadoop.hbase.client.Delete;
 import org.apache.hadoop.hbase.client.Durability;
 import org.apache.hadoop.hbase.client.Get;
@@ -67,6 +68,7 @@ import org.apache.hadoop.hbase.coprocessor.RegionCoprocessorEnvironment;
 import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
 import org.apache.hadoop.hbase.ipc.RpcControllerFactory;
 import org.apache.hadoop.hbase.ipc.controller.InterRegionServerIndexRpcControllerFactory;
+import org.apache.hadoop.hbase.regionserver.HRegionServer;
 import org.apache.hadoop.hbase.regionserver.InternalScanner;
 import org.apache.hadoop.hbase.regionserver.Region;
 import org.apache.hadoop.hbase.regionserver.RegionScanner;
@@ -98,6 +100,7 @@ import org.apache.phoenix.index.PhoenixIndexCodec;
 import org.apache.phoenix.jdbc.PhoenixDatabaseMetaData;
 import org.apache.phoenix.join.HashJoinInfo;
 import org.apache.phoenix.query.QueryConstants;
+import org.apache.phoenix.query.QueryServices;
 import org.apache.phoenix.query.QueryServicesOptions;
 import org.apache.phoenix.schema.ColumnFamilyNotFoundException;
 import org.apache.phoenix.schema.PColumn;
@@ -192,6 +195,7 @@ public class UngroupedAggregateRegionObserver extends BaseScannerRegionObserver
 private static final Logger logger = LoggerFactory.getLogger(UngroupedAggregateRegionObserver.class);
 private KeyValueBuilder kvBuilder;
 private Configuration upsertSelectConfig;
+private Configuration compactionConfig;
 
 @Override
 public void start(CoprocessorEnvironment e) throws IOException {
@@ -212,6 +216,15 @@ public class UngroupedAggregateRegionObserver extends BaseScannerRegionObserver
  */
 upsertSelectConfig.setClass(RpcControllerFactory.CUSTOM_CONTROLLER_CONF_KEY,
 InterRegionServerIndexRpcControllerFactory.class, RpcControllerFactory.class);
+
+compactionConfig = PropertiesUtil.cloneConfig(e.getConfiguration());
+// lower the number of rpc retries, so we don't hang the compaction
+compactionConfig.setInt(HConstants.HBASE_CLIENT_RETRIES_NUMBER,
+e.getConfiguration().getInt(QueryServices.METADATA_WRITE_RETRIES_NUMBER,
+QueryServicesOptions.DEFAULT_METADATA_WRITE_RETRIES_NUMBER));
+compactionConfig.setInt(HConstants.HBASE_CLIENT_PAUSE,
+e.getConfiguration().getInt(QueryServices.METADATA_WRITE_RETRY_PAUSE,
+QueryServicesOptions.DEFAULT_METADATA_WRITE_RETRY_PAUSE));
 }
 
 private void commitBatch(Region region, List<Mutation> mutations, long blockingMemstoreSize) throws IOException {
@@ -924,11 +937,16 @@ public class UngroupedAggregateRegionObserver extends BaseScannerRegionObserver
 public Void run() throws Exception {
 MutationCode mutationCode = null;
 long disableIndexTimestamp = 0;
-
-try (HTableInterface htable = e.getEnvironment().getTable(
- 

[1/4] phoenix git commit: PHOENIX-4169 Explicitly cap timeout for index disable RPC on compaction (Vincent Poon)

2017-09-12 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-0.98 7f38f7e70 -> 458973fd3


PHOENIX-4169 Explicitly cap timeout for index disable RPC on compaction (Vincent Poon)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/63a409a4
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/63a409a4
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/63a409a4

Branch: refs/heads/4.x-HBase-0.98
Commit: 63a409a4028ac27e06c8a152ea75cf6b8cd32d1a
Parents: 7f38f7e
Author: James Taylor 
Authored: Tue Sep 12 17:06:21 2017 -0700
Committer: James Taylor 
Committed: Tue Sep 12 17:06:21 2017 -0700

--
 .../UngroupedAggregateRegionObserver.java   | 29 
 .../org/apache/phoenix/hbase/index/Indexer.java | 14 +-
 .../org/apache/phoenix/query/QueryServices.java |  4 +++
 .../phoenix/query/QueryServicesOptions.java |  5 
 4 files changed, 46 insertions(+), 6 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/63a409a4/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java
index b4d7e7f..4ae5087 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java
@@ -53,6 +53,7 @@ import org.apache.hadoop.hbase.HTableDescriptor;
 import org.apache.hadoop.hbase.KeyValue;
 import org.apache.hadoop.hbase.NamespaceDescriptor;
 import org.apache.hadoop.hbase.TableName;
+import org.apache.hadoop.hbase.client.CoprocessorHConnection;
 import org.apache.hadoop.hbase.client.Delete;
 import org.apache.hadoop.hbase.client.Durability;
 import org.apache.hadoop.hbase.client.Get;
@@ -68,6 +69,7 @@ import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
 import org.apache.hadoop.hbase.ipc.RpcControllerFactory;
 import org.apache.hadoop.hbase.ipc.controller.InterRegionServerIndexRpcControllerFactory;
 import org.apache.hadoop.hbase.regionserver.HRegion;
+import org.apache.hadoop.hbase.regionserver.HRegionServer;
 import org.apache.hadoop.hbase.regionserver.InternalScanner;
 import org.apache.hadoop.hbase.regionserver.RegionScanner;
 import org.apache.hadoop.hbase.regionserver.ScanType;
@@ -98,6 +100,7 @@ import org.apache.phoenix.index.PhoenixIndexCodec;
 import org.apache.phoenix.jdbc.PhoenixDatabaseMetaData;
 import org.apache.phoenix.join.HashJoinInfo;
 import org.apache.phoenix.query.QueryConstants;
+import org.apache.phoenix.query.QueryServices;
 import org.apache.phoenix.query.QueryServicesOptions;
 import org.apache.phoenix.schema.ColumnFamilyNotFoundException;
 import org.apache.phoenix.schema.PColumn;
@@ -192,6 +195,7 @@ public class UngroupedAggregateRegionObserver extends BaseScannerRegionObserver
 private static final Logger logger = LoggerFactory.getLogger(UngroupedAggregateRegionObserver.class);
 private KeyValueBuilder kvBuilder;
 private Configuration upsertSelectConfig;
+private Configuration compactionConfig;
 
 @Override
 public void start(CoprocessorEnvironment e) throws IOException {
@@ -212,6 +216,15 @@ public class UngroupedAggregateRegionObserver extends BaseScannerRegionObserver
  */
 upsertSelectConfig.setClass(RpcControllerFactory.CUSTOM_CONTROLLER_CONF_KEY,
 InterRegionServerIndexRpcControllerFactory.class, RpcControllerFactory.class);
+
+compactionConfig = PropertiesUtil.cloneConfig(e.getConfiguration());
+// lower the number of rpc retries, so we don't hang the compaction
+compactionConfig.setInt(HConstants.HBASE_CLIENT_RETRIES_NUMBER,
+e.getConfiguration().getInt(QueryServices.METADATA_WRITE_RETRIES_NUMBER,
+QueryServicesOptions.DEFAULT_METADATA_WRITE_RETRIES_NUMBER));
+compactionConfig.setInt(HConstants.HBASE_CLIENT_PAUSE,
+e.getConfiguration().getInt(QueryServices.METADATA_WRITE_RETRY_PAUSE,
+QueryServicesOptions.DEFAULT_METADATA_WRITE_RETRY_PAUSE));
 }
 
 private void commitBatch(HRegion region, List<Mutation> mutations, long blockingMemstoreSize) throws IOException {
@@ -929,11 +942,16 @@ public class UngroupedAggregateRegionObserver extends BaseScannerRegionObserver
 public Void run() throws Exception {
 MutationCode mutationCode = null;
 long disableIndexTimestamp = 0;
-
-   

[3/4] phoenix git commit: PHOENIX-4185 Rewrite tests to disable DDL and DML for PercentileIT.java and ProductMetricsIT.java (Ethan Wang)

2017-09-12 Thread jamestaylor
http://git-wip-us.apache.org/repos/asf/phoenix/blob/458973fd/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java
index 87b7af6..969b585 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java
@@ -48,13 +48,15 @@ import org.apache.phoenix.util.DateUtil;
 import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
 import org.apache.phoenix.util.SchemaUtil;
+import org.apache.phoenix.util.TestUtil;
+import org.junit.After;
 import org.junit.Test;
 
 import com.google.common.collect.Lists;
 import com.google.common.collect.Ordering;
 
 
-public class ProductMetricsIT extends BaseClientManagedTimeIT {
+public class ProductMetricsIT extends ParallelStatsDisabledIT {
 private static final String PRODUCT_METRICS_NAME = "PRODUCT_METRICS";
 private static final String PRODUCT_METRICS_SCHEMA_NAME = "";
 private static final String DS1 = "1970-01-01 00:58:00";
@@ -76,57 +78,55 @@ public class ProductMetricsIT extends BaseClientManagedTimeIT {
 private static final String F3 = "C";
 private static final String R1 = "R1";
 private static final String R2 = "R2";
-
+
 private static byte[][] getSplits(String tenantId) {
-return new byte[][] { 
-ByteUtil.concat(Bytes.toBytes(tenantId), PDate.INSTANCE.toBytes(D3)),
-ByteUtil.concat(Bytes.toBytes(tenantId), PDate.INSTANCE.toBytes(D5)),
-};
+return new byte[][] {
+ByteUtil.concat(Bytes.toBytes(tenantId), PDate.INSTANCE.toBytes(D3)),
+ByteUtil.concat(Bytes.toBytes(tenantId), PDate.INSTANCE.toBytes(D5)),
+};
 }
-
+
 private static Date toDate(String dateString) {
 return DateUtil.parseDate(dateString);
 }
-
-private static void initTable(byte[][] splits, long ts) throws Exception {
-ensureTableCreated(getUrl(), PRODUCT_METRICS_NAME, PRODUCT_METRICS_NAME,splits, ts-2, null);
+
+private static void initTable(String tablename, byte[][] splits) throws Exception {
+ensureTableCreated(getUrl(), tablename, PRODUCT_METRICS_NAME, splits, null, null);
 }
 
-private static void assertNoRows(Connection conn) throws SQLException {
+private static void assertNoRows(String tablename,Connection conn) throws SQLException {
 Statement stmt = conn.createStatement();
-ResultSet rs = stmt.executeQuery("select 1 from PRODUCT_METRICS");
+ResultSet rs = stmt.executeQuery("select 1 from "+tablename);
 assertFalse(rs.next());
 }
-
-private static void initTableValues(String tenantId, byte[][] splits, long ts) throws Exception {
-initTable(splits, ts);
 
-String url = getUrl() + ";" + PhoenixRuntime.CURRENT_SCN_ATTRIB + "=" + ts; // Run query at timestamp 5
+private static void initTableValues(String tablename, String tenantId, byte[][] splits) throws Exception {
+initTable(tablename, splits);
 Properties props = PropertiesUtil.deepCopy(TEST_PROPERTIES);
-Connection conn = DriverManager.getConnection(url, props);
+Connection conn = DriverManager.getConnection(getUrl(), props);
 try {
-assertNoRows(conn);
-initTableValues(conn, tenantId);
+assertNoRows(tablename, conn);
+initTableValues(tablename, conn, tenantId);
 conn.commit();
 } finally {
 conn.close();
 }
 }
-
-protected static void initTableValues(Connection conn, String tenantId) throws Exception {
+
+protected static void initTableValues(String tablename, Connection conn, String tenantId) throws Exception {
 PreparedStatement stmt = conn.prepareStatement(
-"upsert into " +
-"PRODUCT_METRICS(" +
-"ORGANIZATION_ID, " +
-"\"DATE\", " +
-"FEATURE, " +
-"UNIQUE_USERS, " +
-"TRANSACTIONS, " +
-"CPU_UTILIZATION, " +
-"DB_UTILIZATION, " +
-"REGION, " +
-"IO_TIME)" +
-"VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)");
+"upsert into " + tablename +
+" (" +
+"ORGANIZATION_ID, " +
+"\"DATE\", " +
+"FEATURE, " +
+"UNIQUE_USERS, " +
+"TRANSACTIONS, " +
+"CPU_UTILIZATION, " +
+"DB_UTILIZATION, " +
+"REGION, " +
+   

[4/4] phoenix git commit: PHOENIX-4185 Rewrite tests to disable DDL and DML for PercentileIT.java and ProductMetricsIT.java (Ethan Wang)

2017-09-12 Thread jamestaylor
PHOENIX-4185 Rewrite tests to disable DDL and DML for PercentileIT.java and ProductMetricsIT.java (Ethan Wang)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/458973fd
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/458973fd
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/458973fd

Branch: refs/heads/4.x-HBase-0.98
Commit: 458973fd3139e10349623a432c39ac3152536d47
Parents: 8b7aa52
Author: James Taylor 
Authored: Tue Sep 12 17:04:18 2017 -0700
Committer: James Taylor 
Committed: Tue Sep 12 17:07:13 2017 -0700

--
 .gitignore  |   2 +
 .../apache/phoenix/end2end/PercentileIT.java| 145 ++-
 .../phoenix/end2end/ProductMetricsIT.java   | 947 +--
 3 files changed, 512 insertions(+), 582 deletions(-)
--
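The diffstat above reflects moving PercentileIT and ProductMetricsIT off a shared, fixed PRODUCT_METRICS table (and CURRENT_SCN timestamps) onto per-test table names so they can run under ParallelStatsDisabledIT. A stand-alone sketch of the unique-name idea; the counter-based generator here is illustrative, not Phoenix's actual `generateUniqueName` implementation:

```java
import java.util.concurrent.atomic.AtomicInteger;

public class UniqueNameSketch {
    private static final AtomicInteger COUNTER = new AtomicInteger();

    // Each test gets its own table, so tests can run in parallel without
    // sharing state through a fixed name like PRODUCT_METRICS.
    static String generateUniqueName() {
        return "T_" + COUNTER.incrementAndGet();
    }

    public static void main(String[] args) {
        String a = generateUniqueName();
        String b = generateUniqueName();
        System.out.println(a.equals(b)); // prints false: names never collide
        // The table name is then threaded through helpers instead of hardcoded:
        System.out.println("upsert into " + a + " (ORGANIZATION_ID, FEATURE) values (?, ?)");
    }
}
```

This is why the rewritten helpers in the hunks take a `tablename` parameter rather than referencing PRODUCT_METRICS directly.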


http://git-wip-us.apache.org/repos/asf/phoenix/blob/458973fd/.gitignore
--
diff --git a/.gitignore b/.gitignore
index 0756d49..3fec6f9 100644
--- a/.gitignore
+++ b/.gitignore
@@ -18,6 +18,8 @@
 # intellij stuff
 .idea/
 *.iml
+*.ipr
+*.iws
 
 #maven stuffs
 target/

http://git-wip-us.apache.org/repos/asf/phoenix/blob/458973fd/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java
index 2d0ead9..965fc2c 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java
@@ -46,7 +46,6 @@ import java.sql.Date;
 import java.sql.DriverManager;
 import java.sql.PreparedStatement;
 import java.sql.ResultSet;
-import java.sql.ResultSetMetaData;
 import java.sql.Statement;
 import java.sql.SQLException;
 import java.sql.Types;
@@ -54,7 +53,6 @@ import java.util.Properties;
 
 import org.apache.phoenix.query.QueryConstants;
 import org.apache.phoenix.util.DateUtil;
-import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
 import org.apache.phoenix.util.TestUtil;
 import org.junit.Test;
@@ -65,10 +63,9 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 @Test
 public void testPercentile() throws Exception {
 String tenantId = getOrganizationId();
-String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null, null);
+String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null);
 
 String query = "SELECT PERCENTILE_CONT(0.9) WITHIN GROUP (ORDER BY A_INTEGER ASC) FROM " + tableName;
-
 Properties props = PropertiesUtil.deepCopy(TEST_PROPERTIES);
 Connection conn = DriverManager.getConnection(getUrl(), props);
 try {
@@ -87,7 +84,7 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 @Test
 public void testPercentileDesc() throws Exception {
 String tenantId = getOrganizationId();
-String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null, null);
+String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null);
 
 String query = "SELECT PERCENTILE_CONT(0.9) WITHIN GROUP (ORDER BY A_INTEGER DESC) FROM " + tableName;
 
@@ -105,11 +102,11 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 conn.close();
 }
 }
-
+
 @Test
 public void testPercentileWithGroupby() throws Exception {
 String tenantId = getOrganizationId();
-String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null, null);
+String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null);
 
 String query = "SELECT A_STRING, PERCENTILE_CONT(0.9) WITHIN GROUP (ORDER BY A_INTEGER ASC) FROM " + tableName + " GROUP BY A_STRING";
 
@@ -142,7 +139,7 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 @Test
 public void testPercentileWithGroupbyAndOrderBy() throws Exception {
 String tenantId = getOrganizationId();
-String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null, null);
+String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null);
 
 String query = "SELECT A_STRING, PERCENTILE_CONT(0.9) WITHIN GROUP (ORDER BY A_INTEGER ASC) AS PC FROM " + tableName + " GROUP BY A_STRING ORDER BY PC";
 
@@ -173,51 +170,51 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 }
 
 @Test
-   public void testPercentileDiscAsc() 

[2/4] phoenix git commit: PHOENIX-4205 Modify OutOfOrderMutationsIT to not use CURRENT_SCN

2017-09-12 Thread jamestaylor
PHOENIX-4205 Modify OutOfOrderMutationsIT to not use CURRENT_SCN


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/8b7aa524
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/8b7aa524
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/8b7aa524

Branch: refs/heads/4.x-HBase-0.98
Commit: 8b7aa5240512d131bc5aec4e0ac81d1478d510b1
Parents: 63a409a
Author: James Taylor 
Authored: Tue Sep 12 16:50:27 2017 -0700
Committer: James Taylor 
Committed: Tue Sep 12 17:06:50 2017 -0700

--
 .../phoenix/end2end/OutOfOrderMutationsIT.java  | 955 +++
 1 file changed, 346 insertions(+), 609 deletions(-)
--
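The rewritten test drops CURRENT_SCN connection properties in favor of the `EnvironmentEdge`/`EnvironmentEdgeManager` imports visible in the OutOfOrderMutationsIT hunk, i.e. an injectable clock the test can advance and rewind deterministically. A minimal sketch of that pattern; the `TimeEdge` interface and `inject` method here are illustrative stand-ins, not Phoenix's actual API:

```java
public class ClockSketch {
    // Illustrative stand-in for an EnvironmentEdge: a swappable time source.
    interface TimeEdge { long currentTime(); }

    private static TimeEdge edge = System::currentTimeMillis; // default: wall clock

    static void inject(TimeEdge e) { edge = e; }

    static long now() { return edge.currentTime(); }

    public static void main(String[] args) {
        // A test injects a manual clock and moves it explicitly, so mutations
        // land at chosen timestamps without a CURRENT_SCN connection property.
        long[] fake = {1000L};
        inject(() -> fake[0]);
        long t1 = now();   // 1000: e.g. the initial upsert
        fake[0] = 1040L;
        long t2 = now();   // 1040: the delete
        fake[0] = 1030L;
        long t3 = now();   // 1030: the out-of-order upsert
        System.out.println(t1 + " " + t2 + " " + t3); // prints 1000 1040 1030
    }
}
```

Production code that reads time through such an edge stays unchanged; only tests swap the clock, which is why the CURRENT_SCN plumbing in the removed lines below could be deleted wholesale.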


http://git-wip-us.apache.org/repos/asf/phoenix/blob/8b7aa524/phoenix-core/src/it/java/org/apache/phoenix/end2end/OutOfOrderMutationsIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/OutOfOrderMutationsIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/OutOfOrderMutationsIT.java
index e8adf6b..40b58f2 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/OutOfOrderMutationsIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/OutOfOrderMutationsIT.java
@@ -19,631 +19,368 @@ package org.apache.phoenix.end2end;
 
 import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertFalse;
-import static org.junit.Assert.assertNotEquals;
 import static org.junit.Assert.assertTrue;
 
-import java.io.IOException;
 import java.sql.Connection;
 import java.sql.DriverManager;
 import java.sql.PreparedStatement;
 import java.sql.ResultSet;
-import java.sql.SQLException;
 import java.sql.Timestamp;
 import java.util.Properties;
 
-import org.apache.hadoop.hbase.CellScanner;
-import org.apache.hadoop.hbase.client.HTableInterface;
-import org.apache.hadoop.hbase.client.Result;
-import org.apache.hadoop.hbase.client.ResultScanner;
-import org.apache.hadoop.hbase.client.Scan;
 import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.phoenix.jdbc.PhoenixConnection;
+import org.apache.phoenix.util.EnvironmentEdge;
+import org.apache.phoenix.util.EnvironmentEdgeManager;
 import org.apache.phoenix.util.IndexScrutiny;
-import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
 import org.apache.phoenix.util.TestUtil;
 import org.junit.Ignore;
 import org.junit.Test;
 
 public class OutOfOrderMutationsIT extends ParallelStatsDisabledIT {
-@Test
-public void testOutOfOrderDelete() throws Exception {
-String tableName = generateUniqueName();
-String indexName = generateUniqueName();
-Properties props = PropertiesUtil.deepCopy(TestUtil.TEST_PROPERTIES);
-long ts = 1000;
-props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-Connection conn = DriverManager.getConnection(getUrl(), props);
-conn.createStatement().execute("CREATE TABLE " + tableName + "(k1 CHAR(2) NOT NULL, k2 CHAR(2) NOT NULL, ts TIMESTAMP, CONSTRAINT pk PRIMARY KEY (k1,k2)) COLUMN_ENCODED_BYTES = 0");
-conn.close();
-
-ts = 1010;
-props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-conn = DriverManager.getConnection(getUrl(), props);
-conn.createStatement().execute("CREATE INDEX " + indexName + " ON " + tableName + "(k2,k1,ts)");
-conn.close();
-
-ts = 1020;
-props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-conn = DriverManager.getConnection(getUrl(), props);
-PreparedStatement stmt = conn.prepareStatement("UPSERT INTO " + tableName + " VALUES('aa','aa',?)");
-stmt.setTimestamp(1, new Timestamp(1000L));
-stmt.executeUpdate();
-conn.commit();
-conn.close();
-
-ts = 1040;
-props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-conn = DriverManager.getConnection(getUrl(), props);
-conn.createStatement().execute("DELETE FROM " + tableName + " WHERE k1='aa'");
-conn.commit();
-conn.close();
-
-ts = 1030;
-props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-conn = DriverManager.getConnection(getUrl(), props);
-stmt = conn.prepareStatement("UPSERT INTO " + tableName + " VALUES('aa','aa',?)");
-stmt.setTimestamp(1, new Timestamp(2000L));
-stmt.executeUpdate();
-conn.commit();
-conn.close();
-
-ts = 1050;
-props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-conn = DriverManager.getConnection(getUrl(), props);
-
-

[4/4] phoenix git commit: PHOENIX-4205 Modify OutOfOrderMutationsIT to not use CURRENT_SCN

2017-09-12 Thread jamestaylor
PHOENIX-4205 Modify OutOfOrderMutationsIT to not use CURRENT_SCN


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/33b12c78
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/33b12c78
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/33b12c78

Branch: refs/heads/master
Commit: 33b12c789e5c24cbf4d711006e4b8c41e9393a78
Parents: d9ac3f1
Author: James Taylor 
Authored: Tue Sep 12 16:50:27 2017 -0700
Committer: James Taylor 
Committed: Tue Sep 12 17:04:44 2017 -0700

--
 .../phoenix/end2end/OutOfOrderMutationsIT.java  | 955 +++
 1 file changed, 346 insertions(+), 609 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/33b12c78/phoenix-core/src/it/java/org/apache/phoenix/end2end/OutOfOrderMutationsIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/OutOfOrderMutationsIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/OutOfOrderMutationsIT.java
index e8adf6b..40b58f2 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/OutOfOrderMutationsIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/OutOfOrderMutationsIT.java
@@ -19,631 +19,368 @@ package org.apache.phoenix.end2end;
 
 import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertFalse;
-import static org.junit.Assert.assertNotEquals;
 import static org.junit.Assert.assertTrue;
 
-import java.io.IOException;
 import java.sql.Connection;
 import java.sql.DriverManager;
 import java.sql.PreparedStatement;
 import java.sql.ResultSet;
-import java.sql.SQLException;
 import java.sql.Timestamp;
 import java.util.Properties;
 
-import org.apache.hadoop.hbase.CellScanner;
-import org.apache.hadoop.hbase.client.HTableInterface;
-import org.apache.hadoop.hbase.client.Result;
-import org.apache.hadoop.hbase.client.ResultScanner;
-import org.apache.hadoop.hbase.client.Scan;
 import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.phoenix.jdbc.PhoenixConnection;
+import org.apache.phoenix.util.EnvironmentEdge;
+import org.apache.phoenix.util.EnvironmentEdgeManager;
 import org.apache.phoenix.util.IndexScrutiny;
-import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
 import org.apache.phoenix.util.TestUtil;
 import org.junit.Ignore;
 import org.junit.Test;
 
 public class OutOfOrderMutationsIT extends ParallelStatsDisabledIT {
-@Test
-public void testOutOfOrderDelete() throws Exception {
-String tableName = generateUniqueName();
-String indexName = generateUniqueName();
-Properties props = PropertiesUtil.deepCopy(TestUtil.TEST_PROPERTIES);
-long ts = 1000;
-props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-Connection conn = DriverManager.getConnection(getUrl(), props);
-conn.createStatement().execute("CREATE TABLE " + tableName + "(k1 CHAR(2) NOT NULL, k2 CHAR(2) NOT NULL, ts TIMESTAMP, CONSTRAINT pk PRIMARY KEY (k1,k2)) COLUMN_ENCODED_BYTES = 0");
-conn.close();
-
-ts = 1010;
-props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-conn = DriverManager.getConnection(getUrl(), props);
-conn.createStatement().execute("CREATE INDEX " + indexName + " ON " + tableName + "(k2,k1,ts)");
-conn.close();
-
-ts = 1020;
-props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-conn = DriverManager.getConnection(getUrl(), props);
-PreparedStatement stmt = conn.prepareStatement("UPSERT INTO " + tableName + " VALUES('aa','aa',?)");
-stmt.setTimestamp(1, new Timestamp(1000L));
-stmt.executeUpdate();
-conn.commit();
-conn.close();
-
-ts = 1040;
-props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-conn = DriverManager.getConnection(getUrl(), props);
-conn.createStatement().execute("DELETE FROM " + tableName + " WHERE k1='aa'");
-conn.commit();
-conn.close();
-
-ts = 1030;
-props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-conn = DriverManager.getConnection(getUrl(), props);
-stmt = conn.prepareStatement("UPSERT INTO " + tableName + " VALUES('aa','aa',?)");
-stmt.setTimestamp(1, new Timestamp(2000L));
-stmt.executeUpdate();
-conn.commit();
-conn.close();
-
-ts = 1050;
-props.setProperty(PhoenixRuntime.CURRENT_SCN_ATTRIB, Long.toString(ts));
-conn = DriverManager.getConnection(getUrl(), props);
-
-

[3/4] phoenix git commit: PHOENIX-4169 Explicitly cap timeout for index disable RPC on compaction (Vincent Poon)

2017-09-12 Thread jamestaylor
PHOENIX-4169 Explicitly cap timeout for index disable RPC on compaction (Vincent Poon)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/2e5986a7
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/2e5986a7
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/2e5986a7

Branch: refs/heads/master
Commit: 2e5986a76aa171a62b342a46cc984a00a3a36746
Parents: 33b12c7
Author: James Taylor 
Authored: Tue Sep 12 17:00:47 2017 -0700
Committer: James Taylor 
Committed: Tue Sep 12 17:04:44 2017 -0700

--
 .../UngroupedAggregateRegionObserver.java   | 29 
 .../org/apache/phoenix/hbase/index/Indexer.java | 14 +-
 .../org/apache/phoenix/query/QueryServices.java |  4 +++
 .../phoenix/query/QueryServicesOptions.java |  5 
 4 files changed, 46 insertions(+), 6 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/2e5986a7/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java
index a61f502..0773ebc 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/UngroupedAggregateRegionObserver.java
@@ -53,6 +53,7 @@ import org.apache.hadoop.hbase.HTableDescriptor;
 import org.apache.hadoop.hbase.KeyValue;
 import org.apache.hadoop.hbase.NamespaceDescriptor;
 import org.apache.hadoop.hbase.TableName;
+import org.apache.hadoop.hbase.client.CoprocessorHConnection;
 import org.apache.hadoop.hbase.client.Delete;
 import org.apache.hadoop.hbase.client.Durability;
 import org.apache.hadoop.hbase.client.Get;
@@ -67,6 +68,7 @@ import org.apache.hadoop.hbase.coprocessor.RegionCoprocessorEnvironment;
 import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
 import org.apache.hadoop.hbase.ipc.RpcControllerFactory;
 import org.apache.hadoop.hbase.ipc.controller.InterRegionServerIndexRpcControllerFactory;
+import org.apache.hadoop.hbase.regionserver.HRegionServer;
 import org.apache.hadoop.hbase.regionserver.InternalScanner;
 import org.apache.hadoop.hbase.regionserver.Region;
 import org.apache.hadoop.hbase.regionserver.RegionScanner;
@@ -98,6 +100,7 @@ import org.apache.phoenix.index.PhoenixIndexCodec;
 import org.apache.phoenix.jdbc.PhoenixDatabaseMetaData;
 import org.apache.phoenix.join.HashJoinInfo;
 import org.apache.phoenix.query.QueryConstants;
+import org.apache.phoenix.query.QueryServices;
 import org.apache.phoenix.query.QueryServicesOptions;
 import org.apache.phoenix.schema.ColumnFamilyNotFoundException;
 import org.apache.phoenix.schema.PColumn;
@@ -192,6 +195,7 @@ public class UngroupedAggregateRegionObserver extends BaseScannerRegionObserver
 private static final Logger logger = LoggerFactory.getLogger(UngroupedAggregateRegionObserver.class);
 private KeyValueBuilder kvBuilder;
 private Configuration upsertSelectConfig;
+private Configuration compactionConfig;
 
 @Override
 public void start(CoprocessorEnvironment e) throws IOException {
@@ -212,6 +216,15 @@ public class UngroupedAggregateRegionObserver extends BaseScannerRegionObserver
  */
 upsertSelectConfig.setClass(RpcControllerFactory.CUSTOM_CONTROLLER_CONF_KEY,
 InterRegionServerIndexRpcControllerFactory.class, RpcControllerFactory.class);
+
+compactionConfig = PropertiesUtil.cloneConfig(e.getConfiguration());
+// lower the number of rpc retries, so we don't hang the compaction
+compactionConfig.setInt(HConstants.HBASE_CLIENT_RETRIES_NUMBER,
+e.getConfiguration().getInt(QueryServices.METADATA_WRITE_RETRIES_NUMBER,
+QueryServicesOptions.DEFAULT_METADATA_WRITE_RETRIES_NUMBER));
+compactionConfig.setInt(HConstants.HBASE_CLIENT_PAUSE,
+e.getConfiguration().getInt(QueryServices.METADATA_WRITE_RETRY_PAUSE,
+QueryServicesOptions.DEFAULT_METADATA_WRITE_RETRY_PAUSE));
 }
 
 private void commitBatch(Region region, List<Mutation> mutations, long blockingMemstoreSize) throws IOException {
@@ -924,11 +937,16 @@ public class UngroupedAggregateRegionObserver extends BaseScannerRegionObserver
 public Void run() throws Exception {
 MutationCode mutationCode = null;
 long disableIndexTimestamp = 0;
-
-try (HTableInterface htable = e.getEnvironment().getTable(
-

[2/4] phoenix git commit: PHOENIX-4185 Rewrite tests to disable DDL and DML for PercentileIT.java and ProductMetricsIT.java (Ethan Wang)

2017-09-12 Thread jamestaylor
PHOENIX-4185 Rewrite tests to disable DDL and DML for PercentileIT.java and ProductMetricsIT.java (Ethan Wang)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/586cdab9
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/586cdab9
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/586cdab9

Branch: refs/heads/master
Commit: 586cdab9f82661a40fa12be881b6adfed034b09c
Parents: 2e5986a
Author: James Taylor 
Authored: Tue Sep 12 17:04:18 2017 -0700
Committer: James Taylor 
Committed: Tue Sep 12 17:04:44 2017 -0700

--
 .gitignore  |   2 +
 .../apache/phoenix/end2end/PercentileIT.java| 145 ++-
 .../phoenix/end2end/ProductMetricsIT.java   | 947 +--
 3 files changed, 512 insertions(+), 582 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/586cdab9/.gitignore
--
diff --git a/.gitignore b/.gitignore
index 803e8ea..33d40ea 100644
--- a/.gitignore
+++ b/.gitignore
@@ -18,6 +18,8 @@
 # intellij stuff
 .idea/
 *.iml
+*.ipr
+*.iws
 
 #maven stuffs
 target/

http://git-wip-us.apache.org/repos/asf/phoenix/blob/586cdab9/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java
index 2d0ead9..965fc2c 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/PercentileIT.java
@@ -46,7 +46,6 @@ import java.sql.Date;
 import java.sql.DriverManager;
 import java.sql.PreparedStatement;
 import java.sql.ResultSet;
-import java.sql.ResultSetMetaData;
 import java.sql.Statement;
 import java.sql.SQLException;
 import java.sql.Types;
@@ -54,7 +53,6 @@ import java.util.Properties;
 
 import org.apache.phoenix.query.QueryConstants;
 import org.apache.phoenix.util.DateUtil;
-import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
 import org.apache.phoenix.util.TestUtil;
 import org.junit.Test;
@@ -65,10 +63,9 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 @Test
 public void testPercentile() throws Exception {
 String tenantId = getOrganizationId();
-String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null, null);
+String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null);
 
 String query = "SELECT PERCENTILE_CONT(0.9) WITHIN GROUP (ORDER BY A_INTEGER ASC) FROM " + tableName;
-
 Properties props = PropertiesUtil.deepCopy(TEST_PROPERTIES);
 Connection conn = DriverManager.getConnection(getUrl(), props);
 try {
@@ -87,7 +84,7 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 @Test
 public void testPercentileDesc() throws Exception {
 String tenantId = getOrganizationId();
-String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null, null);
+String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null);
 
 String query = "SELECT PERCENTILE_CONT(0.9) WITHIN GROUP (ORDER BY A_INTEGER DESC) FROM " + tableName;
 
@@ -105,11 +102,11 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 conn.close();
 }
 }
-
+
 @Test
 public void testPercentileWithGroupby() throws Exception {
 String tenantId = getOrganizationId();
-String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null, null);
+String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null);
 
 String query = "SELECT A_STRING, PERCENTILE_CONT(0.9) WITHIN GROUP (ORDER BY A_INTEGER ASC) FROM " + tableName + " GROUP BY A_STRING";
 
@@ -142,7 +139,7 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 @Test
 public void testPercentileWithGroupbyAndOrderBy() throws Exception {
 String tenantId = getOrganizationId();
-String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null, null);
+String tableName = initATableValues(tenantId, null, getDefaultSplits(tenantId), null);
 
 String query = "SELECT A_STRING, PERCENTILE_CONT(0.9) WITHIN GROUP (ORDER BY A_INTEGER ASC) AS PC FROM " + tableName + " GROUP BY A_STRING ORDER BY PC";
 
@@ -173,51 +170,51 @@ public class PercentileIT extends ParallelStatsDisabledIT {
 }
 
 @Test
-   public void testPercentileDiscAsc() throws 

[1/4] phoenix git commit: PHOENIX-4185 Rewrite tests to disable DDL and DML for PercentileIT.java and ProductMetricsIT.java (Ethan Wang)

2017-09-12 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/master d9ac3f109 -> 586cdab9f


http://git-wip-us.apache.org/repos/asf/phoenix/blob/586cdab9/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java
index 87b7af6..969b585 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ProductMetricsIT.java
@@ -48,13 +48,15 @@ import org.apache.phoenix.util.DateUtil;
 import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
 import org.apache.phoenix.util.SchemaUtil;
+import org.apache.phoenix.util.TestUtil;
+import org.junit.After;
 import org.junit.Test;
 
 import com.google.common.collect.Lists;
 import com.google.common.collect.Ordering;
 
 
-public class ProductMetricsIT extends BaseClientManagedTimeIT {
+public class ProductMetricsIT extends ParallelStatsDisabledIT {
 private static final String PRODUCT_METRICS_NAME = "PRODUCT_METRICS";
 private static final String PRODUCT_METRICS_SCHEMA_NAME = "";
 private static final String DS1 = "1970-01-01 00:58:00";
@@ -76,57 +78,55 @@ public class ProductMetricsIT extends BaseClientManagedTimeIT {
 private static final String F3 = "C";
 private static final String R1 = "R1";
 private static final String R2 = "R2";
-
+
 private static byte[][] getSplits(String tenantId) {
-return new byte[][] { 
-ByteUtil.concat(Bytes.toBytes(tenantId), PDate.INSTANCE.toBytes(D3)),
-ByteUtil.concat(Bytes.toBytes(tenantId), PDate.INSTANCE.toBytes(D5)),
-};
+return new byte[][] {
+ByteUtil.concat(Bytes.toBytes(tenantId), PDate.INSTANCE.toBytes(D3)),
+ByteUtil.concat(Bytes.toBytes(tenantId), PDate.INSTANCE.toBytes(D5)),
+};
 }
-
+
 private static Date toDate(String dateString) {
 return DateUtil.parseDate(dateString);
 }
-
-private static void initTable(byte[][] splits, long ts) throws Exception {
-ensureTableCreated(getUrl(), PRODUCT_METRICS_NAME, PRODUCT_METRICS_NAME, splits, ts-2, null);
+
+private static void initTable(String tablename, byte[][] splits) throws Exception {
+ensureTableCreated(getUrl(), tablename, PRODUCT_METRICS_NAME, splits, null, null);
 }
 
-private static void assertNoRows(Connection conn) throws SQLException {
+private static void assertNoRows(String tablename, Connection conn) throws SQLException {
 Statement stmt = conn.createStatement();
-ResultSet rs = stmt.executeQuery("select 1 from PRODUCT_METRICS");
+ResultSet rs = stmt.executeQuery("select 1 from " + tablename);
 assertFalse(rs.next());
 }
-
-private static void initTableValues(String tenantId, byte[][] splits, long ts) throws Exception {
-initTable(splits, ts);
 
-String url = getUrl() + ";" + PhoenixRuntime.CURRENT_SCN_ATTRIB + "=" + ts; // Run query at timestamp 5
+private static void initTableValues(String tablename, String tenantId, byte[][] splits) throws Exception {
+initTable(tablename, splits);
 Properties props = PropertiesUtil.deepCopy(TEST_PROPERTIES);
-Connection conn = DriverManager.getConnection(url, props);
+Connection conn = DriverManager.getConnection(getUrl(), props);
 try {
-assertNoRows(conn);
-initTableValues(conn, tenantId);
+assertNoRows(tablename, conn);
+initTableValues(tablename, conn, tenantId);
 conn.commit();
 } finally {
 conn.close();
 }
 }
-
-protected static void initTableValues(Connection conn, String tenantId) throws Exception {
+
+protected static void initTableValues(String tablename, Connection conn, String tenantId) throws Exception {
 PreparedStatement stmt = conn.prepareStatement(
-"upsert into " +
-"PRODUCT_METRICS(" +
-"ORGANIZATION_ID, " +
-"\"DATE\", " +
-"FEATURE, " +
-"UNIQUE_USERS, " +
-"TRANSACTIONS, " +
-"CPU_UTILIZATION, " +
-"DB_UTILIZATION, " +
-"REGION, " +
-"IO_TIME)" +
-"VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)");
+"upsert into " + tablename +
+" (" +
+"ORGANIZATION_ID, " +
+"\"DATE\", " +
+"FEATURE, " +
+"UNIQUE_USERS, " +
+"TRANSACTIONS, " +
+"CPU_UTILIZATION, " +
+"   

Apache-Phoenix | Master | Build Successful

2017-09-12 Thread Apache Jenkins Server
Master branch build status Successful
Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/master

Last Successful Compiled Artifacts https://builds.apache.org/job/Phoenix-master/lastSuccessfulBuild/artifact/

Last Complete Test Report https://builds.apache.org/job/Phoenix-master/lastCompletedBuild/testReport/

Changes
[samarth] PHOENIX-4201 Remove usage of SCN from QueryDatabaseMetadataIT



Build times for the last couple of runs. Latest build time is the rightmost | Legend: blue = normal, red = test failure, gray = timeout


Apache-Phoenix | 4.x-HBase-1.1 | Build Successful

2017-09-12 Thread Apache Jenkins Server
4.x-HBase-1.1 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.1

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/lastCompletedBuild/testReport/

Changes
[samarth] PHOENIX-4201 Remove usage of SCN from QueryDatabaseMetadataIT



Build times for the last couple of runs. Latest build time is the rightmost | Legend: blue = normal, red = test failure, gray = timeout


[2/4] phoenix git commit: PHOENIX-4191 Categorize uncategorized integration tests

2017-09-12 Thread elserj
PHOENIX-4191 Categorize uncategorized integration tests

Uncategorized tests result in Maven not running them.


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/b1751c4d
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/b1751c4d
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/b1751c4d

Branch: refs/heads/4.x-HBase-1.2
Commit: b1751c4deb9855c8a9e46425b4969684f629c5d3
Parents: 8f2b2be
Author: Josh Elser 
Authored: Tue Sep 12 18:26:36 2017 -0400
Committer: Josh Elser 
Committed: Tue Sep 12 18:49:19 2017 -0400

--
 .../wal/ReadWriteKeyValuesWithCodecIT.java  | 184 --
 .../phoenix/end2end/IndexScrutinyToolIT.java|   2 +
 .../end2end/SystemTablePermissionsIT.java   |   2 +
 .../wal/ReadWriteKeyValuesWithCodecTest.java| 186 +++
 .../phoenix/hive/BaseHivePhoenixStoreIT.java|   3 +
 .../apache/phoenix/hive/HivePhoenixStoreIT.java |   3 +
 6 files changed, 196 insertions(+), 184 deletions(-)
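For context on why the rename matters: the commit categorizes the remaining ITs and renames the codec test (which needs no mini-cluster) to *Test so the unit-test runner picks it up. As a general illustration, Maven's Surefire plugin selects classes whose names match `*Test`-style patterns by default, while Failsafe selects `*IT`-style patterns; a class matching neither pattern is silently skipped. A simplified sketch of that suffix-based selection (the real plugins use fuller include lists, and Phoenix's build adds category grouping on top):

```java
public class SuffixSelection {
    // Simplified versions of the plugins' default include patterns.
    static boolean runBySurefire(String className) {
        return className.endsWith("Test");
    }

    static boolean runByFailsafe(String className) {
        return className.endsWith("IT");
    }

    public static void main(String[] args) {
        // After the rename, the codec test matches Surefire's pattern...
        System.out.println(runBySurefire("ReadWriteKeyValuesWithCodecTest")); // true
        // ...while the categorized integration tests match Failsafe's.
        System.out.println(runByFailsafe("SystemTablePermissionsIT")); // true
    }
}
```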
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/b1751c4d/phoenix-core/src/it/java/org/apache/hadoop/hbase/regionserver/wal/ReadWriteKeyValuesWithCodecIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/hadoop/hbase/regionserver/wal/ReadWriteKeyValuesWithCodecIT.java b/phoenix-core/src/it/java/org/apache/hadoop/hbase/regionserver/wal/ReadWriteKeyValuesWithCodecIT.java
deleted file mode 100644
index 39eb871..000
--- a/phoenix-core/src/it/java/org/apache/hadoop/hbase/regionserver/wal/ReadWriteKeyValuesWithCodecIT.java
+++ /dev/null
@@ -1,184 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.hadoop.hbase.regionserver.wal;
-
-import static org.junit.Assert.assertEquals;
-
-import java.io.IOException;
-import java.util.ArrayList;
-import java.util.List;
-
-import org.apache.hadoop.conf.Configuration;
-import org.apache.hadoop.fs.FSDataInputStream;
-import org.apache.hadoop.fs.FSDataOutputStream;
-import org.apache.hadoop.fs.FileSystem;
-import org.apache.hadoop.fs.Path;
-import org.apache.hadoop.hbase.Cell;
-import org.apache.hadoop.hbase.HBaseTestingUtility;
-import org.apache.hadoop.hbase.KeyValue;
-import org.apache.hadoop.hbase.KeyValueUtil;
-import org.apache.hadoop.hbase.client.Delete;
-import org.apache.hadoop.hbase.client.Mutation;
-import org.apache.hadoop.hbase.client.Put;
-import org.apache.hadoop.hbase.codec.Codec;
-import org.apache.hadoop.hbase.io.util.LRUDictionary;
-import org.apache.hadoop.hbase.util.Bytes;
-import org.apache.phoenix.hbase.index.IndexTestingUtils;
-import org.apache.phoenix.hbase.index.wal.IndexedKeyValue;
-import org.junit.BeforeClass;
-import org.junit.Test;
-
-/**
- * Simple test to read/write simple files via our custom {@link WALCellCodec} to ensure proper
- * encoding/decoding without going through a cluster.
- */
-public class ReadWriteKeyValuesWithCodecIT {
-
-  private static final HBaseTestingUtility UTIL = new HBaseTestingUtility();
-  private static final byte[] ROW = Bytes.toBytes("row");
-  private static final byte[] FAMILY = Bytes.toBytes("family");
-
-  @BeforeClass
-  public static void setupCodec() {
-Configuration conf = UTIL.getConfiguration();
-IndexTestingUtils.setupConfig(conf);
-conf.set(WALCellCodec.WAL_CELL_CODEC_CLASS_KEY, IndexedWALEditCodec.class.getName());
-  }
-
-  @Test
-  public void testWithoutCompression() throws Exception {
-// get the FS ready to read/write the edits
-Path testDir = UTIL.getDataTestDir("TestReadWriteCustomEdits_withoutCompression");
-Path testFile = new Path(testDir, "testfile");
-FileSystem fs = UTIL.getTestFileSystem();
-
-List edits = getEdits();
-writeReadAndVerify(null, fs, edits, testFile);
-  }
-
-  @Test
-  public void testWithCompression() throws Exception {
-// get the FS ready to read/write the edit
-Path testDir = UTIL.getDataTestDir("TestReadWriteCustomEdits_withCompression");
-Path testFile 

[3/4] phoenix git commit: PHOENIX-4191 Categorize uncategorized integration tests

2017-09-12 Thread elserj
PHOENIX-4191 Categorize uncategorized integration tests

Uncategorized tests result in Maven not running them.


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/ca8bd4a9
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/ca8bd4a9
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/ca8bd4a9

Branch: refs/heads/4.x-HBase-1.1
Commit: ca8bd4a9043b36868cba5ea4acd9ceaf3985a3c8
Parents: acb9e13
Author: Josh Elser 
Authored: Tue Sep 12 18:26:36 2017 -0400
Committer: Josh Elser 
Committed: Tue Sep 12 18:54:29 2017 -0400

--
 .../wal/ReadWriteKeyValuesWithCodecIT.java  | 184 --
 .../phoenix/end2end/IndexScrutinyToolIT.java|   2 +
 .../end2end/SystemTablePermissionsIT.java   |   2 +
 .../wal/ReadWriteKeyValuesWithCodecTest.java| 186 +++
 .../phoenix/hive/BaseHivePhoenixStoreIT.java|   3 +
 .../apache/phoenix/hive/HivePhoenixStoreIT.java |   3 +
 6 files changed, 196 insertions(+), 184 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/ca8bd4a9/phoenix-core/src/it/java/org/apache/hadoop/hbase/regionserver/wal/ReadWriteKeyValuesWithCodecIT.java
--

[4/4] phoenix git commit: PHOENIX-4191 Categorize uncategorized integration tests

2017-09-12 Thread elserj
PHOENIX-4191 Categorize uncategorized integration tests

Uncategorized tests result in Maven not running them.


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/7f38f7e7
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/7f38f7e7
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/7f38f7e7

Branch: refs/heads/4.x-HBase-0.98
Commit: 7f38f7e70ad420a253f0ce2709ebebccb4572062
Parents: ae60776
Author: Josh Elser 
Authored: Tue Sep 12 18:26:36 2017 -0400
Committer: Josh Elser 
Committed: Tue Sep 12 19:00:00 2017 -0400

--
 .../wal/ReadWriteKeyValuesWithCodecIT.java  | 184 --
 .../phoenix/end2end/IndexScrutinyToolIT.java|   2 +
 .../end2end/SystemTablePermissionsIT.java   |   2 +
 .../wal/ReadWriteKeyValuesWithCodecTest.java| 186 +++
 .../phoenix/hive/BaseHivePhoenixStoreIT.java|   3 +
 .../apache/phoenix/hive/HivePhoenixStoreIT.java |   3 +
 6 files changed, 196 insertions(+), 184 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/7f38f7e7/phoenix-core/src/it/java/org/apache/hadoop/hbase/regionserver/wal/ReadWriteKeyValuesWithCodecIT.java
--

[1/4] phoenix git commit: PHOENIX-4191 Categorize uncategorized integration tests

2017-09-12 Thread elserj
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-0.98 ae6077609 -> 7f38f7e70
  refs/heads/4.x-HBase-1.1 acb9e13af -> ca8bd4a90
  refs/heads/4.x-HBase-1.2 8f2b2be26 -> b1751c4de
  refs/heads/master b53de2041 -> d9ac3f109


PHOENIX-4191 Categorize uncategorized integration tests

Uncategorized tests result in Maven not running them.


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/d9ac3f10
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/d9ac3f10
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/d9ac3f10

Branch: refs/heads/master
Commit: d9ac3f109b86d70a3ec8f5e82175f03343be10d0
Parents: b53de20
Author: Josh Elser 
Authored: Tue Sep 12 18:26:36 2017 -0400
Committer: Josh Elser 
Committed: Tue Sep 12 18:34:32 2017 -0400

--
 .../wal/ReadWriteKeyValuesWithCodecIT.java  | 184 --
 .../phoenix/end2end/IndexScrutinyToolIT.java|   2 +
 .../end2end/SystemTablePermissionsIT.java   |   2 +
 .../wal/ReadWriteKeyValuesWithCodecTest.java| 186 +++
 .../phoenix/hive/BaseHivePhoenixStoreIT.java|   3 +
 .../apache/phoenix/hive/HivePhoenixStoreIT.java |   3 +
 6 files changed, 196 insertions(+), 184 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/d9ac3f10/phoenix-core/src/it/java/org/apache/hadoop/hbase/regionserver/wal/ReadWriteKeyValuesWithCodecIT.java
--

phoenix git commit: PHOENIX-4201 Addendum to clear cache in deleteMetadata

2017-09-12 Thread samarth
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.2 9554611b3 -> 8f2b2be26


PHOENIX-4201 Addendum to clear cache in deleteMetadata


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/8f2b2be2
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/8f2b2be2
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/8f2b2be2

Branch: refs/heads/4.x-HBase-1.2
Commit: 8f2b2be26c0a369a4ef1a87f3b3a92b8848867fa
Parents: 9554611
Author: Samarth Jain 
Authored: Tue Sep 12 15:01:25 2017 -0700
Committer: Samarth Jain 
Committed: Tue Sep 12 15:01:25 2017 -0700

--
 .../it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java | 1 +
 1 file changed, 1 insertion(+)
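The addendum's single added line clears the connection's server-side metadata cache after the SYSTEM."SEQUENCE" rows are deleted. A minimal cache-aside sketch (plain Java with hypothetical names, not Phoenix's cache implementation) of the stale-read hazard it closes:

```java
import java.util.HashMap;
import java.util.Map;

public class StaleCacheDemo {
    // Backing store and a cache in front of it (hypothetical names).
    static final Map<String, Long> store = new HashMap<>();
    static final Map<String, Long> cache = new HashMap<>();

    // Cache-aside read: consult the cache first, fall back to the store.
    static Long read(String key) {
        return cache.computeIfAbsent(key, store::get);
    }

    public static void main(String[] args) {
        store.put("SEQ1", 42L);
        read("SEQ1");              // warms the cache
        store.remove("SEQ1");      // like DELETE FROM SYSTEM."SEQUENCE"
        System.out.println(read("SEQ1")); // stale read: still 42
        cache.clear();             // the addendum's clearCache() step
        System.out.println(read("SEQ1")); // now null, matching the store
    }
}
```

Without the clear, a later read sees the deleted entry; clearing after the delete keeps the cache and the store consistent, which is what the added `clearCache()` call does for the test's shared metadata.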
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/8f2b2be2/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
index 0561843..f809e2c 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
@@ -101,6 +101,7 @@ public class QueryDatabaseMetaDataIT extends ParallelStatsDisabledIT {
 delete = "DELETE FROM \"SYSTEM\".\"SEQUENCE\"";
 conn.createStatement().executeUpdate(delete);
 conn.commit();
+conn.unwrap(PhoenixConnection.class).getQueryServices().clearCache();
 }
 }
 



phoenix git commit: PHOENIX-4201 Addendum to clear cache in deleteMetadata

2017-09-12 Thread samarth
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.1 5315998f1 -> acb9e13af


PHOENIX-4201 Addendum to clear cache in deleteMetadata


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/acb9e13a
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/acb9e13a
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/acb9e13a

Branch: refs/heads/4.x-HBase-1.1
Commit: acb9e13af4cd6f5bcfa9aa88916a4a518b794c69
Parents: 5315998
Author: Samarth Jain 
Authored: Tue Sep 12 15:01:01 2017 -0700
Committer: Samarth Jain 
Committed: Tue Sep 12 15:01:01 2017 -0700

--
 .../it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java | 1 +
 1 file changed, 1 insertion(+)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/acb9e13a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
--
 



phoenix git commit: PHOENIX-4201 Addendum to clear cache in deleteMetadata

2017-09-12 Thread samarth
Repository: phoenix
Updated Branches:
  refs/heads/master 790e8d4d2 -> b53de2041


PHOENIX-4201 Addendum to clear cache in deleteMetadata


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/b53de204
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/b53de204
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/b53de204

Branch: refs/heads/master
Commit: b53de204104bbd0262ed8fd7e0341f147698f1f1
Parents: 790e8d4
Author: Samarth Jain 
Authored: Tue Sep 12 15:00:45 2017 -0700
Committer: Samarth Jain 
Committed: Tue Sep 12 15:00:45 2017 -0700

--
 .../it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java | 1 +
 1 file changed, 1 insertion(+)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/b53de204/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
index 0561843..f809e2c 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
@@ -101,6 +101,7 @@ public class QueryDatabaseMetaDataIT extends ParallelStatsDisabledIT {
 delete = "DELETE FROM \"SYSTEM\".\"SEQUENCE\"";
 conn.createStatement().executeUpdate(delete);
 conn.commit();
+conn.unwrap(PhoenixConnection.class).getQueryServices().clearCache();
 }
 }
 



phoenix git commit: PHOENIX-4201 Addendum to clear cache in deleteMetadata

2017-09-12 Thread samarth
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-0.98 519701974 -> ae6077609


PHOENIX-4201 Addendum to clear cache in deleteMetadata


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/ae607760
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/ae607760
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/ae607760

Branch: refs/heads/4.x-HBase-0.98
Commit: ae607760996e69a77f794bb049883750e7e912cd
Parents: 5197019
Author: Samarth Jain 
Authored: Tue Sep 12 15:00:14 2017 -0700
Committer: Samarth Jain 
Committed: Tue Sep 12 15:00:14 2017 -0700

--
 .../it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java | 1 +
 1 file changed, 1 insertion(+)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/ae607760/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
index 0561843..f809e2c 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
@@ -101,6 +101,7 @@ public class QueryDatabaseMetaDataIT extends ParallelStatsDisabledIT {
 delete = "DELETE FROM \"SYSTEM\".\"SEQUENCE\"";
 conn.createStatement().executeUpdate(delete);
 conn.commit();
+conn.unwrap(PhoenixConnection.class).getQueryServices().clearCache();
 }
 }
 



Jenkins build is back to normal : Phoenix | Master #1790

2017-09-12 Thread Apache Jenkins Server
See 




Apache-Phoenix | 4.x-HBase-1.1 | Build Successful

2017-09-12 Thread Apache Jenkins Server
4.x-HBase-1.1 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.1

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/lastCompletedBuild/testReport/

Changes
[elserj] PHOENIX-4188 Disable inline-DTDs in Pherf XML records



Build times for last couple of runsLatest build time is the right most | Legend blue: normal, red: test failure, gray: timeout


[1/2] phoenix git commit: PHOENIX-4201 Remove usage of SCN from QueryDatabaseMetadataIT

2017-09-12 Thread samarth
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-0.98 4004ed17f -> 519701974


http://git-wip-us.apache.org/repos/asf/phoenix/blob/51970197/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
index 12c0bd3..0561843 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
@@ -22,14 +22,7 @@ import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.SYSTEM_CATALOG_TAB
 import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.SYSTEM_FUNCTION_TABLE;
 import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.TYPE_SEQUENCE;
 import static org.apache.phoenix.util.TestUtil.ATABLE_NAME;
-import static org.apache.phoenix.util.TestUtil.ATABLE_SCHEMA_NAME;
-import static org.apache.phoenix.util.TestUtil.BTABLE_NAME;
 import static org.apache.phoenix.util.TestUtil.CUSTOM_ENTITY_DATA_FULL_NAME;
-import static org.apache.phoenix.util.TestUtil.CUSTOM_ENTITY_DATA_NAME;
-import static org.apache.phoenix.util.TestUtil.CUSTOM_ENTITY_DATA_SCHEMA_NAME;
-import static org.apache.phoenix.util.TestUtil.GROUPBYTEST_NAME;
-import static org.apache.phoenix.util.TestUtil.MDTEST_NAME;
-import static org.apache.phoenix.util.TestUtil.MDTEST_SCHEMA_NAME;
 import static org.apache.phoenix.util.TestUtil.PTSDB_NAME;
 import static org.apache.phoenix.util.TestUtil.STABLE_NAME;
 import static org.apache.phoenix.util.TestUtil.TABLE_WITH_SALTING;
@@ -57,17 +50,13 @@ import org.apache.hadoop.hbase.HTableDescriptor;
 import org.apache.hadoop.hbase.client.HBaseAdmin;
 import org.apache.hadoop.hbase.client.HTableInterface;
 import org.apache.hadoop.hbase.client.Put;
-import org.apache.hadoop.hbase.client.Scan;
-import org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter;
 import org.apache.hadoop.hbase.io.encoding.DataBlockEncoding;
 import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver;
 import org.apache.phoenix.coprocessor.ServerCachingEndpointImpl;
 import org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver;
-import org.apache.phoenix.exception.SQLExceptionCode;
 import org.apache.phoenix.jdbc.PhoenixConnection;
 import org.apache.phoenix.jdbc.PhoenixDatabaseMetaData;
-import org.apache.phoenix.query.QueryServices;
 import org.apache.phoenix.schema.ColumnNotFoundException;
 import org.apache.phoenix.schema.PTable.ViewType;
 import org.apache.phoenix.schema.PTableType;
@@ -77,265 +66,289 @@ import org.apache.phoenix.schema.types.PChar;
 import org.apache.phoenix.schema.types.PDecimal;
 import org.apache.phoenix.schema.types.PInteger;
 import org.apache.phoenix.schema.types.PLong;
-import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
 import org.apache.phoenix.util.SchemaUtil;
 import org.apache.phoenix.util.StringUtil;
-import org.apache.phoenix.util.TestUtil;
+import org.junit.Before;
 import org.junit.Test;
 
 
-public class QueryDatabaseMetaDataIT extends BaseClientManagedTimeIT {
-
-private static void createMDTestTable(Connection conn, String tableName, String extraProps) throws SQLException {
-String ddl = "create table if not exists " + tableName +
-"   (id char(1) primary key,\n" +
-"a.col1 integer,\n" +
-"b.col2 bigint,\n" +
-"b.col3 decimal,\n" +
-"b.col4 decimal(5),\n" +
-"b.col5 decimal(6,3))\n" +
-"a." + HConstants.VERSIONS + "=" + 1 + "," + "a." + HColumnDescriptor.DATA_BLOCK_ENCODING + "='" + DataBlockEncoding.NONE + "'";
+public class QueryDatabaseMetaDataIT extends ParallelStatsDisabledIT {
+
+private static void createMDTestTable(Connection conn, String tableName, String extraProps)
+throws SQLException {
+String ddl =
+"create table if not exists " + tableName + "   (id char(1) primary key,\n"
++ "a.col1 integer,\n" + "b.col2 bigint,\n" + "    b.col3 decimal,\n"
++ "b.col4 decimal(5),\n" + "b.col5 decimal(6,3))\n" + "a."
++ HConstants.VERSIONS + "=" + 1 + "," + "a."
++ HColumnDescriptor.DATA_BLOCK_ENCODING + "='" + DataBlockEncoding.NONE
++ "'";
 if (extraProps != null && extraProps.length() > 0) {
 ddl += "," + extraProps;
 }
 conn.createStatement().execute(ddl);
 }

+@Before
+// We need to clean up phoenix metadata to ensure tests don't step on each other
+public void deleteMetadata() throws Exception {

[2/2] phoenix git commit: PHOENIX-4201 Remove usage of SCN from QueryDatabaseMetadataIT

2017-09-12 Thread samarth
PHOENIX-4201 Remove usage of SCN from QueryDatabaseMetadataIT


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/5315998f
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/5315998f
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/5315998f

Branch: refs/heads/4.x-HBase-1.1
Commit: 5315998f1413bb5ff5ba3d0c99fda0f3a0f410b8
Parents: 050a353
Author: Samarth Jain 
Authored: Tue Sep 12 13:44:45 2017 -0700
Committer: Samarth Jain 
Committed: Tue Sep 12 13:44:45 2017 -0700

--
 .../end2end/QueryDatabaseMetaDataIT.java| 1715 --
 1 file changed, 751 insertions(+), 964 deletions(-)
--




[1/2] phoenix git commit: PHOENIX-4201 Remove usage of SCN from QueryDatabaseMetadataIT

2017-09-12 Thread samarth
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.1 050a35382 -> 5315998f1


http://git-wip-us.apache.org/repos/asf/phoenix/blob/5315998f/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
index 12c0bd3..0561843 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
@@ -22,14 +22,7 @@ import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.SYSTEM_CATALOG_TAB
 import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.SYSTEM_FUNCTION_TABLE;
 import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.TYPE_SEQUENCE;
 import static org.apache.phoenix.util.TestUtil.ATABLE_NAME;
-import static org.apache.phoenix.util.TestUtil.ATABLE_SCHEMA_NAME;
-import static org.apache.phoenix.util.TestUtil.BTABLE_NAME;
 import static org.apache.phoenix.util.TestUtil.CUSTOM_ENTITY_DATA_FULL_NAME;
-import static org.apache.phoenix.util.TestUtil.CUSTOM_ENTITY_DATA_NAME;
-import static org.apache.phoenix.util.TestUtil.CUSTOM_ENTITY_DATA_SCHEMA_NAME;
-import static org.apache.phoenix.util.TestUtil.GROUPBYTEST_NAME;
-import static org.apache.phoenix.util.TestUtil.MDTEST_NAME;
-import static org.apache.phoenix.util.TestUtil.MDTEST_SCHEMA_NAME;
 import static org.apache.phoenix.util.TestUtil.PTSDB_NAME;
 import static org.apache.phoenix.util.TestUtil.STABLE_NAME;
 import static org.apache.phoenix.util.TestUtil.TABLE_WITH_SALTING;
@@ -57,17 +50,13 @@ import org.apache.hadoop.hbase.HTableDescriptor;
 import org.apache.hadoop.hbase.client.HBaseAdmin;
 import org.apache.hadoop.hbase.client.HTableInterface;
 import org.apache.hadoop.hbase.client.Put;
-import org.apache.hadoop.hbase.client.Scan;
-import org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter;
 import org.apache.hadoop.hbase.io.encoding.DataBlockEncoding;
 import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver;
 import org.apache.phoenix.coprocessor.ServerCachingEndpointImpl;
 import org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver;
-import org.apache.phoenix.exception.SQLExceptionCode;
 import org.apache.phoenix.jdbc.PhoenixConnection;
 import org.apache.phoenix.jdbc.PhoenixDatabaseMetaData;
-import org.apache.phoenix.query.QueryServices;
 import org.apache.phoenix.schema.ColumnNotFoundException;
 import org.apache.phoenix.schema.PTable.ViewType;
 import org.apache.phoenix.schema.PTableType;
@@ -77,265 +66,289 @@ import org.apache.phoenix.schema.types.PChar;
 import org.apache.phoenix.schema.types.PDecimal;
 import org.apache.phoenix.schema.types.PInteger;
 import org.apache.phoenix.schema.types.PLong;
-import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
 import org.apache.phoenix.util.SchemaUtil;
 import org.apache.phoenix.util.StringUtil;
-import org.apache.phoenix.util.TestUtil;
+import org.junit.Before;
 import org.junit.Test;
 
 
-public class QueryDatabaseMetaDataIT extends BaseClientManagedTimeIT {
-
-private static void createMDTestTable(Connection conn, String tableName, String extraProps) throws SQLException {
-String ddl = "create table if not exists " + tableName +
-"   (id char(1) primary key,\n" +
-"a.col1 integer,\n" +
-"b.col2 bigint,\n" +
-"b.col3 decimal,\n" +
-"b.col4 decimal(5),\n" +
-"b.col5 decimal(6,3))\n" +
-"a." + HConstants.VERSIONS + "=" + 1 + "," + "a." + HColumnDescriptor.DATA_BLOCK_ENCODING + "='" + DataBlockEncoding.NONE + "'";
+public class QueryDatabaseMetaDataIT extends ParallelStatsDisabledIT {
+
+private static void createMDTestTable(Connection conn, String tableName, String extraProps)
+throws SQLException {
+String ddl =
+"create table if not exists " + tableName + "   (id char(1) primary key,\n"
++ "a.col1 integer,\n" + "b.col2 bigint,\n" + "    b.col3 decimal,\n"
++ "b.col4 decimal(5),\n" + "b.col5 decimal(6,3))\n" + "a."
++ HConstants.VERSIONS + "=" + 1 + "," + "a."
++ HColumnDescriptor.DATA_BLOCK_ENCODING + "='" + DataBlockEncoding.NONE
++ "'";
 if (extraProps != null && extraProps.length() > 0) {
 ddl += "," + extraProps;
 }
 conn.createStatement().execute(ddl);
 }

+@Before
+// We need to clean up phoenix metadata to ensure tests don't step on each other
+public void deleteMetadata() throws Exception {
+ 

[2/2] phoenix git commit: PHOENIX-4201 Remove usage of SCN from QueryDatabaseMetadataIT

2017-09-12 Thread samarth
PHOENIX-4201 Remove usage of SCN from QueryDatabaseMetadataIT


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/9554611b
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/9554611b
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/9554611b

Branch: refs/heads/4.x-HBase-1.2
Commit: 9554611b3691aea517c00f0db303202d198546db
Parents: f8100ca
Author: Samarth Jain 
Authored: Tue Sep 12 13:44:23 2017 -0700
Committer: Samarth Jain 
Committed: Tue Sep 12 13:44:23 2017 -0700

--
 .../end2end/QueryDatabaseMetaDataIT.java| 1715 --
 1 file changed, 751 insertions(+), 964 deletions(-)
--




[2/2] phoenix git commit: PHOENIX-4201 Remove usage of SCN from QueryDatabaseMetadataIT

2017-09-12 Thread samarth
PHOENIX-4201 Remove usage of SCN from QueryDatabaseMetadataIT


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/790e8d4d
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/790e8d4d
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/790e8d4d

Branch: refs/heads/master
Commit: 790e8d4d2b84a0e234609b2d8420e3ff398f0dee
Parents: 4ee3505
Author: Samarth Jain 
Authored: Tue Sep 12 13:43:48 2017 -0700
Committer: Samarth Jain 
Committed: Tue Sep 12 13:43:48 2017 -0700

--
 .../end2end/QueryDatabaseMetaDataIT.java| 1715 --
 1 file changed, 751 insertions(+), 964 deletions(-)
--




[1/2] phoenix git commit: PHOENIX-4201 Remove usage of SCN from QueryDatabaseMetadataIT

2017-09-12 Thread samarth
Repository: phoenix
Updated Branches:
  refs/heads/master 4ee35057c -> 790e8d4d2


http://git-wip-us.apache.org/repos/asf/phoenix/blob/790e8d4d/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
index 12c0bd3..0561843 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryDatabaseMetaDataIT.java
@@ -22,14 +22,7 @@ import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.SYSTEM_CATALOG_TAB
 import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.SYSTEM_FUNCTION_TABLE;
 import static org.apache.phoenix.jdbc.PhoenixDatabaseMetaData.TYPE_SEQUENCE;
 import static org.apache.phoenix.util.TestUtil.ATABLE_NAME;
-import static org.apache.phoenix.util.TestUtil.ATABLE_SCHEMA_NAME;
-import static org.apache.phoenix.util.TestUtil.BTABLE_NAME;
 import static org.apache.phoenix.util.TestUtil.CUSTOM_ENTITY_DATA_FULL_NAME;
-import static org.apache.phoenix.util.TestUtil.CUSTOM_ENTITY_DATA_NAME;
-import static org.apache.phoenix.util.TestUtil.CUSTOM_ENTITY_DATA_SCHEMA_NAME;
-import static org.apache.phoenix.util.TestUtil.GROUPBYTEST_NAME;
-import static org.apache.phoenix.util.TestUtil.MDTEST_NAME;
-import static org.apache.phoenix.util.TestUtil.MDTEST_SCHEMA_NAME;
 import static org.apache.phoenix.util.TestUtil.PTSDB_NAME;
 import static org.apache.phoenix.util.TestUtil.STABLE_NAME;
 import static org.apache.phoenix.util.TestUtil.TABLE_WITH_SALTING;
@@ -57,17 +50,13 @@ import org.apache.hadoop.hbase.HTableDescriptor;
 import org.apache.hadoop.hbase.client.HBaseAdmin;
 import org.apache.hadoop.hbase.client.HTableInterface;
 import org.apache.hadoop.hbase.client.Put;
-import org.apache.hadoop.hbase.client.Scan;
-import org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter;
 import org.apache.hadoop.hbase.io.encoding.DataBlockEncoding;
 import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver;
 import org.apache.phoenix.coprocessor.ServerCachingEndpointImpl;
 import org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver;
-import org.apache.phoenix.exception.SQLExceptionCode;
 import org.apache.phoenix.jdbc.PhoenixConnection;
 import org.apache.phoenix.jdbc.PhoenixDatabaseMetaData;
-import org.apache.phoenix.query.QueryServices;
 import org.apache.phoenix.schema.ColumnNotFoundException;
 import org.apache.phoenix.schema.PTable.ViewType;
 import org.apache.phoenix.schema.PTableType;
@@ -77,265 +66,289 @@ import org.apache.phoenix.schema.types.PChar;
 import org.apache.phoenix.schema.types.PDecimal;
 import org.apache.phoenix.schema.types.PInteger;
 import org.apache.phoenix.schema.types.PLong;
-import org.apache.phoenix.util.PhoenixRuntime;
 import org.apache.phoenix.util.PropertiesUtil;
 import org.apache.phoenix.util.SchemaUtil;
 import org.apache.phoenix.util.StringUtil;
-import org.apache.phoenix.util.TestUtil;
+import org.junit.Before;
 import org.junit.Test;
 
 
-public class QueryDatabaseMetaDataIT extends BaseClientManagedTimeIT {
-
-private static void createMDTestTable(Connection conn, String tableName, String extraProps) throws SQLException {
-String ddl = "create table if not exists " + tableName +
-"   (id char(1) primary key,\n" +
-"a.col1 integer,\n" +
-"b.col2 bigint,\n" +
-"b.col3 decimal,\n" +
-"b.col4 decimal(5),\n" +
-"b.col5 decimal(6,3))\n" +
-"a." + HConstants.VERSIONS + "=" + 1 + "," + "a." + HColumnDescriptor.DATA_BLOCK_ENCODING + "='" + DataBlockEncoding.NONE + "'";
+public class QueryDatabaseMetaDataIT extends ParallelStatsDisabledIT {
+
+private static void createMDTestTable(Connection conn, String tableName, String extraProps)
+throws SQLException {
+String ddl =
+"create table if not exists " + tableName + "   (id char(1) primary key,\n"
++ "a.col1 integer,\n" + "b.col2 bigint,\n" + "    b.col3 decimal,\n"
++ "b.col4 decimal(5),\n" + "b.col5 decimal(6,3))\n" + "a."
++ HConstants.VERSIONS + "=" + 1 + "," + "a."
++ HColumnDescriptor.DATA_BLOCK_ENCODING + "='" + DataBlockEncoding.NONE
++ "'";
 if (extraProps != null && extraProps.length() > 0) {
 ddl += "," + extraProps;
 }
 conn.createStatement().execute(ddl);
 }

+@Before
+// We need to clean up phoenix metadata to ensure tests don't step on each other
+public void deleteMetadata() throws Exception {
+

[2/8] phoenix git commit: PHOENIX-4188 Disable inline-DTDs in Pherf XML records

2017-09-12 Thread elserj
PHOENIX-4188 Disable inline-DTDs in Pherf XML records


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/4ee35057
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/4ee35057
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/4ee35057

Branch: refs/heads/master
Commit: 4ee35057c6a63c347f959361338b517d4f5b38c4
Parents: 1d4025a
Author: Josh Elser 
Authored: Fri Sep 8 22:50:25 2017 -0400
Committer: Josh Elser 
Committed: Tue Sep 12 13:28:33 2017 -0400

--
 .../config/scenario/user_defined_scenario.xml   |   4 +-
 phoenix-pherf/pom.xml   |   4 +
 .../pherf/configuration/XMLConfigParser.java|  15 +-
 .../pherf/result/impl/XMLResultHandler.java |  17 +-
 .../phoenix/pherf/ConfigurationParserTest.java  |   5 +-
 .../phoenix/pherf/XMLConfigParserTest.java  |  53 ++
 .../pherf/result/impl/XMLResultHandlerTest.java |  53 ++
 .../resources/malicious_results_with_dtd.xml| 676 +++
 .../scenario/malicious_scenario_with_dtd.xml|  48 ++
 9 files changed, 863 insertions(+), 12 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/4ee35057/phoenix-pherf/config/scenario/user_defined_scenario.xml
--
diff --git a/phoenix-pherf/config/scenario/user_defined_scenario.xml b/phoenix-pherf/config/scenario/user_defined_scenario.xml
index e54d76a..6435e29 100644
--- a/phoenix-pherf/config/scenario/user_defined_scenario.xml
+++ b/phoenix-pherf/config/scenario/user_defined_scenario.xml
@@ -82,7 +82,7 @@
 2019-09-15 11:00:00.000
 
 
-2019-09-19 00:01:00
+2019-09-19 00:01:00.000
 
 
 2019-09-22 00:01:00.000
@@ -131,4 +131,4 @@
 
 
 
-
\ No newline at end of file
+

http://git-wip-us.apache.org/repos/asf/phoenix/blob/4ee35057/phoenix-pherf/pom.xml
--
diff --git a/phoenix-pherf/pom.xml b/phoenix-pherf/pom.xml
index 029ec6c..f65f026 100644
--- a/phoenix-pherf/pom.xml
+++ b/phoenix-pherf/pom.xml
@@ -219,6 +219,10 @@

com.googlecode.java-diff-utils:diffutils

org.apache.commons:commons-lang3

org.apache.commons:commons-math3
+   
commons-cli:commons-cli
+   
joda-time:joda-time
+   
org.apache.commons:commons-csv
+   
commons-lang:commons-lang




http://git-wip-us.apache.org/repos/asf/phoenix/blob/4ee35057/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java
--
diff --git a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java
index 93dc94c..f3ec12f 100644
--- a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java
+++ b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java
@@ -29,6 +29,10 @@ import javax.xml.bind.JAXBContext;
 import javax.xml.bind.JAXBException;
 import javax.xml.bind.Marshaller;
 import javax.xml.bind.Unmarshaller;
+import javax.xml.stream.XMLInputFactory;
+import javax.xml.stream.XMLStreamException;
+import javax.xml.stream.XMLStreamReader;
+import javax.xml.transform.stream.StreamSource;
 
 import org.apache.phoenix.pherf.PherfConstants;
 import org.apache.phoenix.pherf.exception.FileLoaderException;
@@ -108,16 +112,19 @@ public class XMLConfigParser {
  * @param file Name of File
 * @return {@link org.apache.phoenix.pherf.configuration.DataModel} Returns DataModel from
 * XML configuration
- * @throws JAXBException
  */
 // TODO Remove static calls
-public static DataModel readDataModel(Path file) throws JAXBException {
+public static DataModel readDataModel(Path file) throws JAXBException, XMLStreamException {
+XMLInputFactory xif = XMLInputFactory.newFactory();
+xif.setProperty(XMLInputFactory.IS_SUPPORTING_EXTERNAL_ENTITIES, false);
+
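The PHOENIX-4188 hunks all apply the same hardening: an `XMLInputFactory` is created with external-entity support switched off before any Pherf XML is read. The following is a minimal, self-contained sketch of that technique against the JDK's built-in StAX parser; the class name and the sample document are illustrative, not from the Phoenix source, and the `SUPPORT_DTD` setting is an extra belt-and-braces step that does not appear in the truncated hunk above.

```java
import java.io.StringReader;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamReader;

public class HardenedStaxDemo {
    public static void main(String[] args) {
        // An inline DTD declaring an external entity: the shape of
        // payload the PHOENIX-4188 change guards against.
        String malicious =
                "<!DOCTYPE foo [<!ENTITY xxe SYSTEM \"file:///etc/passwd\">]>"
                + "<foo>&xxe;</foo>";

        XMLInputFactory xif = XMLInputFactory.newFactory();
        // Refuse to resolve external entities (the setting from the patch)...
        xif.setProperty(XMLInputFactory.IS_SUPPORTING_EXTERNAL_ENTITIES, false);
        // ...and reject DTDs outright, so the DOCTYPE itself is an error
        // (an additional hardening step, not shown in the truncated hunk).
        xif.setProperty(XMLInputFactory.SUPPORT_DTD, false);

        try {
            XMLStreamReader reader =
                    xif.createXMLStreamReader(new StringReader(malicious));
            while (reader.hasNext()) {
                reader.next();
            }
            System.out.println("document parsed");
        } catch (XMLStreamException e) {
            // With SUPPORT_DTD=false the DOCTYPE declaration is rejected
            // before any entity can be expanded.
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

In the patched `readDataModel` the hardened factory produces the reader the rest of the method parses from (the remainder of that hunk is truncated in this archive), so the same settings presumably apply to Pherf's scenario and result files.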

[5/8] phoenix git commit: PHOENIX-4188 Disable inline-DTDs in Pherf XML records

2017-09-12 Thread elserj
http://git-wip-us.apache.org/repos/asf/phoenix/blob/050a3538/phoenix-pherf/src/test/resources/malicious_results_with_dtd.xml
--
diff --git a/phoenix-pherf/src/test/resources/malicious_results_with_dtd.xml 
b/phoenix-pherf/src/test/resources/malicious_results_with_dtd.xml
new file mode 100644
index 000..ab24c41
--- /dev/null
+++ b/phoenix-pherf/src/test/resources/malicious_results_with_dtd.xml
@@ -0,0 +1,676 @@
+[676-line XML test resource omitted: the mail archive stripped all XML markup from this hunk, leaving only scattered text values (e.g. RANDOM, VARCHAR, phoenix.schema.isNamespaceMappingEnabled, phoenix.queryserver.serialization, phoenix.queryserver.http.port) and long runs of blank "+" lines]

[4/8] phoenix git commit: PHOENIX-4188 Disable inline-DTDs in Pherf XML records

2017-09-12 Thread elserj
PHOENIX-4188 Disable inline-DTDs in Pherf XML records


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/f8100caa
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/f8100caa
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/f8100caa

Branch: refs/heads/4.x-HBase-1.2
Commit: f8100caa6b8c747a535fa2b7e138456dc17bc18c
Parents: 149eddc
Author: Josh Elser 
Authored: Fri Sep 8 22:50:25 2017 -0400
Committer: Josh Elser 
Committed: Tue Sep 12 13:40:06 2017 -0400

--
 .../config/scenario/user_defined_scenario.xml   |   4 +-
 phoenix-pherf/pom.xml   |   4 +
 .../pherf/configuration/XMLConfigParser.java|  15 +-
 .../pherf/result/impl/XMLResultHandler.java |  17 +-
 .../phoenix/pherf/ConfigurationParserTest.java  |   5 +-
 .../phoenix/pherf/XMLConfigParserTest.java  |  53 ++
 .../pherf/result/impl/XMLResultHandlerTest.java |  53 ++
 .../resources/malicious_results_with_dtd.xml| 676 +++
 .../scenario/malicious_scenario_with_dtd.xml|  48 ++
 9 files changed, 863 insertions(+), 12 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/f8100caa/phoenix-pherf/config/scenario/user_defined_scenario.xml
--
diff --git a/phoenix-pherf/config/scenario/user_defined_scenario.xml b/phoenix-pherf/config/scenario/user_defined_scenario.xml
index e54d76a..6435e29 100644
--- a/phoenix-pherf/config/scenario/user_defined_scenario.xml
+++ b/phoenix-pherf/config/scenario/user_defined_scenario.xml
@@ -82,7 +82,7 @@
 2019-09-15 11:00:00.000
 
 
-2019-09-19 00:01:00
+2019-09-19 00:01:00.000
 
 
 2019-09-22 00:01:00.000
@@ -131,4 +131,4 @@
 
 
 
-
\ No newline at end of file
+

http://git-wip-us.apache.org/repos/asf/phoenix/blob/f8100caa/phoenix-pherf/pom.xml
--
diff --git a/phoenix-pherf/pom.xml b/phoenix-pherf/pom.xml
index 3128785..9fc3541 100644
--- a/phoenix-pherf/pom.xml
+++ b/phoenix-pherf/pom.xml
@@ -219,6 +219,10 @@

com.googlecode.java-diff-utils:diffutils

org.apache.commons:commons-lang3

org.apache.commons:commons-math3
+   
commons-cli:commons-cli
+   
joda-time:joda-time
+   
org.apache.commons:commons-csv
+   
commons-lang:commons-lang




http://git-wip-us.apache.org/repos/asf/phoenix/blob/f8100caa/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java
--
diff --git a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java
index 93dc94c..f3ec12f 100644
--- a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java
+++ b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java
@@ -29,6 +29,10 @@ import javax.xml.bind.JAXBContext;
 import javax.xml.bind.JAXBException;
 import javax.xml.bind.Marshaller;
 import javax.xml.bind.Unmarshaller;
+import javax.xml.stream.XMLInputFactory;
+import javax.xml.stream.XMLStreamException;
+import javax.xml.stream.XMLStreamReader;
+import javax.xml.transform.stream.StreamSource;
 
 import org.apache.phoenix.pherf.PherfConstants;
 import org.apache.phoenix.pherf.exception.FileLoaderException;
@@ -108,16 +112,19 @@ public class XMLConfigParser {
  * @param file Name of File
 * @return {@link org.apache.phoenix.pherf.configuration.DataModel} Returns DataModel from
 * XML configuration
- * @throws JAXBException
  */
 // TODO Remove static calls
-public static DataModel readDataModel(Path file) throws JAXBException {
+public static DataModel readDataModel(Path file) throws JAXBException, XMLStreamException {
+XMLInputFactory xif = XMLInputFactory.newFactory();
+xif.setProperty(XMLInputFactory.IS_SUPPORTING_EXTERNAL_ENTITIES, false);
+

[6/8] phoenix git commit: PHOENIX-4188 Disable inline-DTDs in Pherf XML records

2017-09-12 Thread elserj
PHOENIX-4188 Disable inline-DTDs in Pherf XML records


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/050a3538
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/050a3538
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/050a3538

Branch: refs/heads/4.x-HBase-1.1
Commit: 050a35382fe6921e3163d4158d3306ec8964e49d
Parents: ef6e536
Author: Josh Elser 
Authored: Fri Sep 8 22:50:25 2017 -0400
Committer: Josh Elser 
Committed: Tue Sep 12 13:45:33 2017 -0400

--
 .../config/scenario/user_defined_scenario.xml   |   4 +-
 phoenix-pherf/pom.xml   |   4 +
 .../pherf/configuration/XMLConfigParser.java|  15 +-
 .../pherf/result/impl/XMLResultHandler.java |  17 +-
 .../phoenix/pherf/ConfigurationParserTest.java  |   5 +-
 .../phoenix/pherf/XMLConfigParserTest.java  |  53 ++
 .../pherf/result/impl/XMLResultHandlerTest.java |  53 ++
 .../resources/malicious_results_with_dtd.xml| 676 +++
 .../scenario/malicious_scenario_with_dtd.xml|  48 ++
 9 files changed, 863 insertions(+), 12 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/050a3538/phoenix-pherf/config/scenario/user_defined_scenario.xml
--
diff --git a/phoenix-pherf/config/scenario/user_defined_scenario.xml b/phoenix-pherf/config/scenario/user_defined_scenario.xml
index e54d76a..6435e29 100644
--- a/phoenix-pherf/config/scenario/user_defined_scenario.xml
+++ b/phoenix-pherf/config/scenario/user_defined_scenario.xml
@@ -82,7 +82,7 @@
 2019-09-15 11:00:00.000
 
 
-2019-09-19 00:01:00
+2019-09-19 00:01:00.000
 
 
 2019-09-22 00:01:00.000
@@ -131,4 +131,4 @@
 
 
 
-
\ No newline at end of file
+

http://git-wip-us.apache.org/repos/asf/phoenix/blob/050a3538/phoenix-pherf/pom.xml
--
diff --git a/phoenix-pherf/pom.xml b/phoenix-pherf/pom.xml
index 1c38768..4d0ed15 100644
--- a/phoenix-pherf/pom.xml
+++ b/phoenix-pherf/pom.xml
@@ -219,6 +219,10 @@

com.googlecode.java-diff-utils:diffutils

org.apache.commons:commons-lang3

org.apache.commons:commons-math3
+   
commons-cli:commons-cli
+   
joda-time:joda-time
+   
org.apache.commons:commons-csv
+   
commons-lang:commons-lang




http://git-wip-us.apache.org/repos/asf/phoenix/blob/050a3538/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java
--
diff --git a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java
index 93dc94c..f3ec12f 100644
--- a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java
+++ b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java
@@ -29,6 +29,10 @@ import javax.xml.bind.JAXBContext;
 import javax.xml.bind.JAXBException;
 import javax.xml.bind.Marshaller;
 import javax.xml.bind.Unmarshaller;
+import javax.xml.stream.XMLInputFactory;
+import javax.xml.stream.XMLStreamException;
+import javax.xml.stream.XMLStreamReader;
+import javax.xml.transform.stream.StreamSource;
 
 import org.apache.phoenix.pherf.PherfConstants;
 import org.apache.phoenix.pherf.exception.FileLoaderException;
@@ -108,16 +112,19 @@ public class XMLConfigParser {
  * @param file Name of File
  * @return {@link org.apache.phoenix.pherf.configuration.DataModel} Returns DataModel from
  * XML configuration
- * @throws JAXBException
  */
 // TODO Remove static calls
-public static DataModel readDataModel(Path file) throws JAXBException {
+public static DataModel readDataModel(Path file) throws JAXBException, XMLStreamException {
+XMLInputFactory xif = XMLInputFactory.newFactory();
+xif.setProperty(XMLInputFactory.IS_SUPPORTING_EXTERNAL_ENTITIES, false);
+

[8/8] phoenix git commit: PHOENIX-4188 Disable inline-DTDs in Pherf XML records

2017-09-12 Thread elserj
PHOENIX-4188 Disable inline-DTDs in Pherf XML records


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/4004ed17
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/4004ed17
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/4004ed17

Branch: refs/heads/4.x-HBase-0.98
Commit: 4004ed17fd67c93e7d67f8c365740e01632597be
Parents: 3aec050
Author: Josh Elser 
Authored: Fri Sep 8 22:50:25 2017 -0400
Committer: Josh Elser 
Committed: Tue Sep 12 13:50:53 2017 -0400

--
 .../config/scenario/user_defined_scenario.xml   |   4 +-
 phoenix-pherf/pom.xml   |   4 +
 .../pherf/configuration/XMLConfigParser.java|  15 +-
 .../pherf/result/impl/XMLResultHandler.java |  17 +-
 .../phoenix/pherf/ConfigurationParserTest.java  |   5 +-
 .../phoenix/pherf/XMLConfigParserTest.java  |  53 ++
 .../pherf/result/impl/XMLResultHandlerTest.java |  53 ++
 .../resources/malicious_results_with_dtd.xml| 676 +++
 .../scenario/malicious_scenario_with_dtd.xml|  48 ++
 9 files changed, 863 insertions(+), 12 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/4004ed17/phoenix-pherf/config/scenario/user_defined_scenario.xml
--
diff --git a/phoenix-pherf/config/scenario/user_defined_scenario.xml b/phoenix-pherf/config/scenario/user_defined_scenario.xml
index e54d76a..6435e29 100644
--- a/phoenix-pherf/config/scenario/user_defined_scenario.xml
+++ b/phoenix-pherf/config/scenario/user_defined_scenario.xml
@@ -82,7 +82,7 @@
 2019-09-15 11:00:00.000
 
 
-2019-09-19 00:01:00
+2019-09-19 00:01:00.000
 
 
 2019-09-22 00:01:00.000
@@ -131,4 +131,4 @@
 
 
 
-
\ No newline at end of file
+

http://git-wip-us.apache.org/repos/asf/phoenix/blob/4004ed17/phoenix-pherf/pom.xml
--
diff --git a/phoenix-pherf/pom.xml b/phoenix-pherf/pom.xml
index 2b4641d..cd31fbd 100644
--- a/phoenix-pherf/pom.xml
+++ b/phoenix-pherf/pom.xml
@@ -219,6 +219,10 @@
 com.googlecode.java-diff-utils:diffutils
 org.apache.commons:commons-lang3
 org.apache.commons:commons-math3
+commons-cli:commons-cli
+joda-time:joda-time
+org.apache.commons:commons-csv
+commons-lang:commons-lang




http://git-wip-us.apache.org/repos/asf/phoenix/blob/4004ed17/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java
--
diff --git a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java
index 93dc94c..f3ec12f 100644
--- a/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java
+++ b/phoenix-pherf/src/main/java/org/apache/phoenix/pherf/configuration/XMLConfigParser.java
@@ -29,6 +29,10 @@ import javax.xml.bind.JAXBContext;
 import javax.xml.bind.JAXBException;
 import javax.xml.bind.Marshaller;
 import javax.xml.bind.Unmarshaller;
+import javax.xml.stream.XMLInputFactory;
+import javax.xml.stream.XMLStreamException;
+import javax.xml.stream.XMLStreamReader;
+import javax.xml.transform.stream.StreamSource;
 
 import org.apache.phoenix.pherf.PherfConstants;
 import org.apache.phoenix.pherf.exception.FileLoaderException;
@@ -108,16 +112,19 @@ public class XMLConfigParser {
  * @param file Name of File
  * @return {@link org.apache.phoenix.pherf.configuration.DataModel} Returns DataModel from
  * XML configuration
- * @throws JAXBException
  */
 // TODO Remove static calls
-public static DataModel readDataModel(Path file) throws JAXBException {
+public static DataModel readDataModel(Path file) throws JAXBException, XMLStreamException {
+XMLInputFactory xif = XMLInputFactory.newFactory();
+xif.setProperty(XMLInputFactory.IS_SUPPORTING_EXTERNAL_ENTITIES, false);
+
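The truncated hunk above shows the core of the fix: building an `XMLInputFactory` with external-entity support disabled before handing the stream to JAXB. The following self-contained sketch demonstrates the same StAX hardening technique; the class name, the sample documents, and the additional `SUPPORT_DTD` setting (implied by the commit title "Disable inline-DTDs" but not visible in the truncated diff) are illustrative assumptions, not code from the commit:

```java
import java.io.StringReader;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamReader;

public class SecureStaxDemo {
    // Hypothetical helper: build a reader that refuses DTDs and external
    // entities, so downstream unmarshalling never sees XXE payloads.
    static XMLStreamReader secureReader(String xml) throws XMLStreamException {
        XMLInputFactory xif = XMLInputFactory.newFactory();
        // Do not resolve external entities (blocks classic XXE exfiltration).
        xif.setProperty(XMLInputFactory.IS_SUPPORTING_EXTERNAL_ENTITIES, false);
        // Reject inline DTDs entirely (blocks billion-laughs entity expansion).
        xif.setProperty(XMLInputFactory.SUPPORT_DTD, false);
        return xif.createXMLStreamReader(new StringReader(xml));
    }

    public static void main(String[] args) throws Exception {
        // A benign document still parses normally.
        XMLStreamReader ok = secureReader("<scenario><name>test</name></scenario>");
        while (ok.hasNext()) ok.next();
        System.out.println("benign: parsed");

        // A document carrying an inline DOCTYPE is rejected during parsing.
        String evil = "<!DOCTYPE foo [<!ENTITY x \"boom\">]><foo>&x;</foo>";
        try {
            XMLStreamReader r = secureReader(evil);
            while (r.hasNext()) r.next();
            System.out.println("malicious: parsed");
        } catch (XMLStreamException e) {
            System.out.println("malicious: rejected");
        }
    }
}
```

In the actual patch the hardened `XMLStreamReader` would then be wrapped in a `StreamSource` and passed to the JAXB `Unmarshaller`, which is consistent with the `StreamSource` import added by the hunk above.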

[7/8] phoenix git commit: PHOENIX-4188 Disable inline-DTDs in Pherf XML records

2017-09-12 Thread elserj
http://git-wip-us.apache.org/repos/asf/phoenix/blob/4004ed17/phoenix-pherf/src/test/resources/malicious_results_with_dtd.xml
--
diff --git a/phoenix-pherf/src/test/resources/malicious_results_with_dtd.xml b/phoenix-pherf/src/test/resources/malicious_results_with_dtd.xml
new file mode 100644
index 000..ab24c41
--- /dev/null
+++ b/phoenix-pherf/src/test/resources/malicious_results_with_dtd.xml
@@ -0,0 +1,676 @@
+[676-line file body not reproduced: the mailing-list archive stripped the XML element markup, leaving mostly blank added lines. The recoverable fragments show an inline DTD internal subset, datamodel column defaults (RANDOM generator, VARCHAR type, INT/LONG min-value sentinels), and query-server properties such as phoenix.schema.isNamespaceMappingEnabled=false, phoenix.queryserver.serialization=PROTOBUF, and phoenix.queryserver.http.port=8765.]

[3/8] phoenix git commit: PHOENIX-4188 Disable inline-DTDs in Pherf XML records

2017-09-12 Thread elserj
http://git-wip-us.apache.org/repos/asf/phoenix/blob/f8100caa/phoenix-pherf/src/test/resources/malicious_results_with_dtd.xml
--
diff --git a/phoenix-pherf/src/test/resources/malicious_results_with_dtd.xml b/phoenix-pherf/src/test/resources/malicious_results_with_dtd.xml
new file mode 100644
index 000..ab24c41
--- /dev/null
+++ b/phoenix-pherf/src/test/resources/malicious_results_with_dtd.xml
@@ -0,0 +1,676 @@
+[676-line file body not reproduced: the mailing-list archive stripped the XML element markup, leaving mostly blank added lines. The recoverable fragments show an inline DTD internal subset, datamodel column defaults (RANDOM generator, VARCHAR type, INT/LONG min-value sentinels), and query-server properties such as phoenix.schema.isNamespaceMappingEnabled=false, phoenix.queryserver.serialization=PROTOBUF, and phoenix.queryserver.http.port=8765.]

Build failed in Jenkins: Phoenix Compile Compatibility with HBase #404

2017-09-12 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on qnode3 (ubuntu) in workspace 

[Phoenix_Compile_Compat_wHBase] $ /bin/bash /tmp/jenkins6029817483214699247.sh
core file size  (blocks, -c) 0
data seg size   (kbytes, -d) unlimited
scheduling priority (-e) 0
file size   (blocks, -f) unlimited
pending signals (-i) 128341
max locked memory   (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files  (-n) 6
pipe size(512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority  (-r) 0
stack size  (kbytes, -s) 8192
cpu time   (seconds, -t) unlimited
max user processes  (-u) 10240
virtual memory  (kbytes, -v) unlimited
file locks  (-x) unlimited
core id : 0
core id : 1
core id : 2
core id : 3
core id : 4
core id : 5
core id : 6
core id : 7
physical id : 0
MemTotal:   32865152 kB
MemFree: 4473784 kB
Filesystem  Size  Used Avail Use% Mounted on
none 16G 0   16G   0% /dev
tmpfs   3.2G  342M  2.9G  11% /run
/dev/nbd046G   43G  1.4G  97% /
tmpfs16G 0   16G   0% /dev/shm
tmpfs   5.0M 0  5.0M   0% /run/lock
tmpfs16G 0   16G   0% /sys/fs/cgroup
/dev/sda1   235G  147G   77G  66% /home
tmpfs   3.2G 0  3.2G   0% /run/user/9997
apache-maven-2.2.1
apache-maven-3.0.4
apache-maven-3.0.5
apache-maven-3.2.1
apache-maven-3.2.5
apache-maven-3.3.3
apache-maven-3.3.9
apache-maven-3.5.0
latest
latest2
latest3


===
Verifying compile level compatibility with HBase 0.98 with Phoenix 
4.x-HBase-0.98
===

Cloning into 'hbase'...
Switched to a new branch '0.98'
Branch 0.98 set up to track remote branch 0.98 from origin.

main:
 [exec] 
~/jenkins-slave/workspace/Phoenix_Compile_Compat_wHBase/hbase/hbase-common 
~/jenkins-slave/workspace/Phoenix_Compile_Compat_wHBase/hbase/hbase-common
 [exec] 
~/jenkins-slave/workspace/Phoenix_Compile_Compat_wHBase/hbase/hbase-common

main:
[mkdir] Created dir: 

 [exec] tar: hadoop-snappy-nativelibs.tar: Cannot open: No such file or 
directory
 [exec] tar: Error is not recoverable: exiting now
 [exec] Result: 2

main:
[mkdir] Created dir: 

 [copy] Copying 20 files to 

[mkdir] Created dir: 

[mkdir] Created dir: 


main:
[mkdir] Created dir: 

 [copy] Copying 17 files to 

[mkdir] Created dir: 


main:
[mkdir] Created dir: 

 [copy] Copying 1 file to 

[mkdir] Created dir: 


HBase pom.xml:

Got HBase version as 0.98.25-SNAPSHOT
Cloning into 'phoenix'...
Switched to a new branch '4.x-HBase-0.98'
Branch 4.x-HBase-0.98 set up to track remote branch 4.x-HBase-0.98 from origin.
ANTLR Parser Generator  Version 3.5.2
Output file 

 does not exist: must build 

PhoenixSQL.g


===
Verifying compile level compatibility with HBase 

Build failed in Jenkins: Phoenix | Master #1789

2017-09-12 Thread Apache Jenkins Server
See 

--
[...truncated 97.07 KB...]
[INFO] Running org.apache.phoenix.end2end.index.DropMetadataIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.889 s 
- in org.apache.phoenix.end2end.index.DropMetadataIT
[INFO] Running org.apache.phoenix.end2end.index.IndexIT
[INFO] Running org.apache.phoenix.end2end.index.GlobalIndexOptimizationIT
[INFO] Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 106.696 
s - in org.apache.phoenix.end2end.UpsertSelectIT
[INFO] Running org.apache.phoenix.end2end.index.IndexMetadataIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 37.487 s 
- in org.apache.phoenix.end2end.index.GlobalIndexOptimizationIT
[INFO] Running org.apache.phoenix.end2end.index.MutableIndexIT
[INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 39.077 
s - in org.apache.phoenix.end2end.index.IndexMetadataIT
[INFO] Running org.apache.phoenix.end2end.index.MutableIndexSplitForwardScanIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 227.589 
s - in org.apache.phoenix.end2end.UpgradeIT
[INFO] Tests run: 112, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
707.275 s - in org.apache.phoenix.end2end.ScanQueryIT
[INFO] Running org.apache.phoenix.end2end.index.MutableIndexSplitReverseScanIT
[INFO] Running org.apache.phoenix.end2end.index.SaltedIndexIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.416 s 
- in org.apache.phoenix.end2end.index.SaltedIndexIT
[INFO] Running org.apache.phoenix.end2end.index.ViewIndexIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 48.849 s 
- in org.apache.phoenix.end2end.index.ViewIndexIT
[INFO] Running org.apache.phoenix.end2end.index.txn.MutableRollbackIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 315.096 
s - in org.apache.phoenix.end2end.index.DropColumnIT
[INFO] Running org.apache.phoenix.end2end.index.txn.RollbackIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 61.901 s 
- in org.apache.phoenix.end2end.index.txn.MutableRollbackIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.629 s 
- in org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.31 s - 
in org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Running org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.229 s 
- in org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Running org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.328 s 
- in org.apache.phoenix.end2end.index.txn.RollbackIT
[INFO] Running org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.094 s 
- in org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Running org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.78 s 
- in org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Running org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.6 s - 
in org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.407 s 
- in org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Running org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.989 s 
- in org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Running org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] Running org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Tests run: 67, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 423.645 
s - in org.apache.phoenix.end2end.index.IndexExpressionIT
[INFO] Running org.apache.phoenix.tx.TransactionIT
[INFO] Tests run: 102, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 985.94 
s - in org.apache.phoenix.end2end.SortMergeJoinIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 46.638 s 
- in org.apache.phoenix.tx.TransactionIT
[INFO] Running org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 84.97 s 
- in org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.168 s 
- in org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 64, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 446.749 
s - in