Build failed in Jenkins: Phoenix-4.x-HBase-1.3 #175

2018-07-30 Thread Apache Jenkins Server
See 


Changes:

[ankitsinghal59] PHOENIX-4825 Replace usage of HBase Base64 implementation with

--
[...truncated 149.90 KB...]
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 963.668 
s - in org.apache.phoenix.end2end.ChangePermissionsIT
[INFO] Running 
org.apache.phoenix.end2end.IndexToolForPartialBuildWithNamespaceEnabledIT
[INFO] Running org.apache.phoenix.end2end.IndexToolIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.771 s 
- in org.apache.phoenix.end2end.IndexToolForPartialBuildWithNamespaceEnabledIT
[INFO] Running org.apache.phoenix.end2end.LocalIndexSplitMergeIT
[INFO] Running org.apache.phoenix.end2end.MigrateSystemTablesToSystemNamespaceIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 470.55 
s - in org.apache.phoenix.end2end.IndexScrutinyToolIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 390.748 
s - in org.apache.phoenix.end2end.MigrateSystemTablesToSystemNamespaceIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 437.245 
s - in org.apache.phoenix.end2end.LocalIndexSplitMergeIT
[INFO] Running 
org.apache.phoenix.end2end.NonColumnEncodedImmutableNonTxStatsCollectorIT
[INFO] Running 
org.apache.phoenix.end2end.NonColumnEncodedImmutableTxStatsCollectorIT
[INFO] Running org.apache.phoenix.end2end.PartialResultServerConfigurationIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 53.932 s 
- in org.apache.phoenix.end2end.PartialResultServerConfigurationIT
[INFO] Running org.apache.phoenix.end2end.PhoenixDriverIT
[WARNING] Tests run: 26, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 
174.776 s - in 
org.apache.phoenix.end2end.NonColumnEncodedImmutableNonTxStatsCollectorIT
[WARNING] Tests run: 26, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 
174.236 s - in 
org.apache.phoenix.end2end.NonColumnEncodedImmutableTxStatsCollectorIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 89.279 s 
- in org.apache.phoenix.end2end.PhoenixDriverIT
[INFO] Running org.apache.phoenix.end2end.QueryTimeoutIT
[INFO] Running org.apache.phoenix.end2end.QueryWithLimitIT
[INFO] Running org.apache.phoenix.end2end.QueryLoggerIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.507 s 
- in org.apache.phoenix.end2end.QueryTimeoutIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.605 s 
- in org.apache.phoenix.end2end.QueryWithLimitIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 63.782 s 
- in org.apache.phoenix.end2end.QueryLoggerIT
[INFO] Running org.apache.phoenix.end2end.RegexBulkLoadToolIT
[INFO] Running org.apache.phoenix.end2end.RebuildIndexConnectionPropsIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.429 s 
- in org.apache.phoenix.end2end.RebuildIndexConnectionPropsIT
[INFO] Running org.apache.phoenix.end2end.RenewLeaseIT
[INFO] Running org.apache.phoenix.end2end.SequencePointInTimeIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.134 s 
- in org.apache.phoenix.end2end.SequencePointInTimeIT
[INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 96.144 
s - in org.apache.phoenix.end2end.RegexBulkLoadToolIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.614 s 
- in org.apache.phoenix.end2end.RenewLeaseIT
[INFO] Running org.apache.phoenix.end2end.StatsEnabledSplitSystemCatalogIT
[INFO] Running org.apache.phoenix.end2end.SpillableGroupByIT
[INFO] Running 
org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.976 s 
- in org.apache.phoenix.end2end.SpillableGroupByIT
[INFO] Running org.apache.phoenix.end2end.SystemCatalogCreationOnConnectionIT
[ERROR] Tests run: 26, Failures: 0, Errors: 18, Skipped: 4, Time elapsed: 
71.037 s <<< FAILURE! - in 
org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT
[ERROR] testRowCountAndByteCounts[mutable = true, transactional = true, 
isUserTableNamespaceMapped = false, columnEncoded = 
false](org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT)  
Time elapsed: 2.181 s  <<< ERROR!
java.lang.RuntimeException: org.apache.thrift.TException: Unable to discover 
transaction service.
Caused by: org.apache.thrift.TException: Unable to discover transaction service.

[ERROR] testSomeUpdateEmptyStats[mutable = true, transactional = true, 
isUserTableNamespaceMapped = false, columnEncoded = 
false](org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT)  
Time elapsed: 2.006 s  <<< ERROR!
java.lang.RuntimeException: org.apache.thrift.TException: Unable to discover 
transaction service.
Caused by: org.apache.thrift.TException: Unable to discover transaction service.

[ERROR] 

Build failed in Jenkins: Phoenix-4.x-HBase-1.2 #413

2018-07-30 Thread Apache Jenkins Server
See 

--
[...truncated 5.50 KB...]
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-queryserver-client:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter.
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-queryserver:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter.
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-pherf:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter.
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-spark:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter.
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-hive:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter.
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-client:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter. @ org.apache.phoenix:phoenix-client:[unknown-version], line 52, column 24
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-server:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter. @ org.apache.phoenix:phoenix-server:[unknown-version], line 50, column 24
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-assembly:pom:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter.
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-tracing-webapp:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter.
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-load-balancer:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter.
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix:pom:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter. @ line 467, column 24
[WARNING] 
[WARNING] It is highly recommended to fix these problems because they threaten 
the stability of your build.
[WARNING] 
[WARNING] For this reason, future Maven versions might no longer support 
building such malformed projects.
[WARNING] 
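The repeated effective-model warning concerns report plugins being handed to maven-site-plugin through its `reportPlugins` parameter. The fix Maven is asking for is to declare them in the POM's `<reporting>` section instead; a sketch of that shape (plugin and version chosen for illustration, not taken from the Phoenix POM):

```xml
<!-- Illustrative pom.xml fragment: declare report plugins under <reporting>
     instead of passing them to maven-site-plugin as reportPlugins. -->
<reporting>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-report-plugin</artifactId>
      <version>2.20</version>
    </plugin>
  </plugins>
</reporting>
```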
[INFO] 
[INFO] Reactor Build Order:
[INFO] 
[INFO] Apache Phoenix
[INFO] Phoenix Core
[INFO] Phoenix - Flume
[INFO] Phoenix - Kafka
[INFO] Phoenix - Pig
[INFO] Phoenix Query Server Client
[INFO] Phoenix Query Server
[INFO] Phoenix - Pherf
[INFO] Phoenix - Spark
[INFO] Phoenix - Hive
[INFO] Phoenix Client
[INFO] Phoenix Server
[INFO] Phoenix Assembly
[INFO] Phoenix - Tracing Web Application
[INFO] Phoenix Load Balancer
[INFO] 
[INFO] 
[INFO] Building Apache Phoenix 4.14.0-HBase-1.2
[INFO] 
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ phoenix ---
[INFO] Deleting 
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.13:check (validate) @ phoenix ---
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ phoenix ---
[INFO] 
[INFO] --- maven-source-plugin:2.2.1:jar-no-fork (attach-sources) @ phoenix ---
[INFO] 
[INFO] --- maven-jar-plugin:2.4:test-jar (default) @ phoenix ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] Building jar: 

Build failed in Jenkins: Phoenix-4.x-HBase-1.2 #412

2018-07-30 Thread Apache Jenkins Server
See 


Changes:

[ankitsinghal59] PHOENIX-4825 Replace usage of HBase Base64 implementation with

--
[...truncated 7.18 KB...]
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-queryserver:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter.
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-pherf:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter.
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-spark:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter.
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-hive:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter.
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-client:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter. @ org.apache.phoenix:phoenix-client:[unknown-version], line 52, column 24
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-server:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter. @ org.apache.phoenix:phoenix-server:[unknown-version], line 50, column 24
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-assembly:pom:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter.
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-tracing-webapp:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter.
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-load-balancer:jar:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter.
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix:pom:4.14.0-HBase-1.2
[WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter. @ line 467, column 24
[WARNING] 
[WARNING] It is highly recommended to fix these problems because they threaten 
the stability of your build.
[WARNING] 
[WARNING] For this reason, future Maven versions might no longer support 
building such malformed projects.
[WARNING] 
[INFO] 
[INFO] Reactor Build Order:
[INFO] 
[INFO] Apache Phoenix
[INFO] Phoenix Core
[INFO] Phoenix - Flume
[INFO] Phoenix - Kafka
[INFO] Phoenix - Pig
[INFO] Phoenix Query Server Client
[INFO] Phoenix Query Server
[INFO] Phoenix - Pherf
[INFO] Phoenix - Spark
[INFO] Phoenix - Hive
[INFO] Phoenix Client
[INFO] Phoenix Server
[INFO] Phoenix Assembly
[INFO] Phoenix - Tracing Web Application
[INFO] Phoenix Load Balancer
[INFO] 
[INFO] 
[INFO] Building Apache Phoenix 4.14.0-HBase-1.2
[INFO] 
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ phoenix ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.13:check (validate) @ phoenix ---
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ phoenix ---
[INFO] 
[INFO] --- maven-source-plugin:2.2.1:jar-no-fork (attach-sources) @ phoenix ---
[INFO] 
[INFO] --- maven-jar-plugin:2.4:test-jar (default) @ phoenix ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] Building jar: 

[INFO] 
[INFO] --- maven-site-plugin:3.2:attach-descriptor (attach-descriptor) @ 
phoenix ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.2:install (default-install) @ phoenix ---
[INFO] Installing 

phoenix git commit: PHOENIX-4825 Replace usage of HBase Base64 implementation with java.util.Base64

2018-07-30 Thread ankit
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.2 85b479b0e -> 8e2e99d96


PHOENIX-4825 Replace usage of HBase Base64 implementation with java.util.Base64


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/8e2e99d9
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/8e2e99d9
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/8e2e99d9

Branch: refs/heads/4.x-HBase-1.2
Commit: 8e2e99d965e532a007e021a6dd67b26fb2097d17
Parents: 85b479b0e
Author: Ankit Singhal 
Authored: Mon Jul 30 13:57:18 2018 -0700
Committer: Ankit Singhal 
Committed: Mon Jul 30 13:57:18 2018 -0700

--
 .../org/apache/phoenix/end2end/QueryMoreIT.java |  7 +++--
 .../phoenix/mapreduce/CsvBulkImportUtil.java|  8 --
 .../util/PhoenixConfigurationUtil.java  |  7 +++--
 .../apache/phoenix/schema/types/PVarbinary.java |  4 +--
 .../phoenix/util/csv/CsvUpsertExecutor.java |  4 +--
 .../phoenix/util/json/JsonUpsertExecutor.java   |  4 +--
 .../util/AbstractUpsertExecutorTest.java| 12 
 .../util/TenantIdByteConversionTest.java| 30 
 8 files changed, 50 insertions(+), 26 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/8e2e99d9/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
index 04272fa..528fe7f 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
@@ -31,12 +31,13 @@ import java.sql.ResultSet;
 import java.sql.SQLException;
 import java.sql.Statement;
 import java.util.ArrayList;
+import java.util.Base64;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
 import java.util.Properties;
 
-import org.apache.hadoop.hbase.util.Base64;
+import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.hadoop.hbase.util.Pair;
 import org.apache.phoenix.jdbc.PhoenixConnection;
 import org.apache.phoenix.query.QueryServices;
@@ -278,7 +279,7 @@ public class QueryMoreIT extends ParallelStatsDisabledIT {
 values[i] = rs.getObject(i + 1);
 }
 conn = getTenantSpecificConnection(tenantId);
-            pkIds.add(Base64.encodeBytes(PhoenixRuntime.encodeColumnValues(conn, tableOrViewName.toUpperCase(), values, columns)));
+            pkIds.add(Bytes.toString(Base64.getEncoder().encode(PhoenixRuntime.encodeColumnValues(conn, tableOrViewName.toUpperCase(), values, columns))));
 }
 return pkIds.toArray(new String[pkIds.size()]);
 }
@@ -296,7 +297,7 @@ public class QueryMoreIT extends ParallelStatsDisabledIT {
 PreparedStatement stmt = conn.prepareStatement(query);
 int bindCounter = 1;
 for (int i = 0; i < cursorIds.length; i++) {
-            Object[] pkParts = PhoenixRuntime.decodeColumnValues(conn, tableName.toUpperCase(), Base64.decode(cursorIds[i]), columns);
+            Object[] pkParts = PhoenixRuntime.decodeColumnValues(conn, tableName.toUpperCase(), Base64.getDecoder().decode(cursorIds[i]), columns);
 for (int j = 0; j < pkParts.length; j++) {
 stmt.setObject(bindCounter++, pkParts[j]);
 }

http://git-wip-us.apache.org/repos/asf/phoenix/blob/8e2e99d9/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
 
b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
index ff9ff72..bf5a538 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
@@ -17,9 +17,11 @@
  */
 package org.apache.phoenix.mapreduce;
 
+import java.util.Base64;
+
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.fs.Path;
-import org.apache.hadoop.hbase.util.Base64;
+import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil;
 import org.apache.phoenix.query.QueryConstants;
 import org.apache.phoenix.query.QueryServices;
@@ -68,7 +70,7 @@ public class CsvBulkImportUtil {
 
 @VisibleForTesting
 static void setChar(Configuration conf, String confKey, char charValue) {
-        conf.set(confKey, Base64.encodeBytes(Character.toString(charValue).getBytes()));
+        conf.set(confKey, Bytes.toString(Base64.getEncoder().encode(Character.toString(charValue).getBytes())));
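All three branch commits for PHOENIX-4825 make the same substitution: HBase's `org.apache.hadoop.hbase.util.Base64` (`encodeBytes`/`decode`) is replaced with the JDK's `java.util.Base64`. A minimal stand-alone sketch of the equivalent round trip (class name and sample input are illustrative; `new String(..., UTF_8)` stands in here for HBase's `Bytes.toString`, which is not on the classpath in this sketch):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Base64RoundTrip {

    // Stand-in for the removed HBase Base64.encodeBytes(byte[]):
    // java.util.Base64 encodes to bytes, which are then rendered as a String.
    static String encode(byte[] raw) {
        return new String(Base64.getEncoder().encode(raw), StandardCharsets.UTF_8);
    }

    // Stand-in for the removed HBase Base64.decode(String).
    static byte[] decode(String b64) {
        return Base64.getDecoder().decode(b64);
    }

    public static void main(String[] args) {
        byte[] original = "phoenix".getBytes(StandardCharsets.UTF_8);
        String encoded = encode(original);
        byte[] decoded = decode(encoded);
        System.out.println(encoded + " -> " + new String(decoded, StandardCharsets.UTF_8));
    }
}
```

Note that `Base64.getEncoder().encodeToString(bytes)` would be the more direct one-liner; the commits use the `encode(byte[])` plus `Bytes.toString` form, presumably to keep byte-to-string handling in the existing HBase `Bytes` utility.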
 

phoenix git commit: PHOENIX-4825 Replace usage of HBase Base64 implementation with java.util.Base64

2018-07-30 Thread ankit
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.4 3eecbe985 -> 22934e5af


PHOENIX-4825 Replace usage of HBase Base64 implementation with java.util.Base64


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/22934e5a
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/22934e5a
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/22934e5a

Branch: refs/heads/4.x-HBase-1.4
Commit: 22934e5af7af79580bf54feeb7667eccafaafc71
Parents: 3eecbe9
Author: Ankit Singhal 
Authored: Mon Jul 30 13:57:40 2018 -0700
Committer: Ankit Singhal 
Committed: Mon Jul 30 13:57:40 2018 -0700

--
 .../org/apache/phoenix/end2end/QueryMoreIT.java |  7 +++--
 .../phoenix/mapreduce/CsvBulkImportUtil.java|  8 --
 .../util/PhoenixConfigurationUtil.java  |  7 +++--
 .../apache/phoenix/schema/types/PVarbinary.java |  4 +--
 .../phoenix/util/csv/CsvUpsertExecutor.java |  4 +--
 .../phoenix/util/json/JsonUpsertExecutor.java   |  4 +--
 .../util/AbstractUpsertExecutorTest.java| 12 
 .../util/TenantIdByteConversionTest.java| 30 
 8 files changed, 50 insertions(+), 26 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/22934e5a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
index 04272fa..528fe7f 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
@@ -31,12 +31,13 @@ import java.sql.ResultSet;
 import java.sql.SQLException;
 import java.sql.Statement;
 import java.util.ArrayList;
+import java.util.Base64;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
 import java.util.Properties;
 
-import org.apache.hadoop.hbase.util.Base64;
+import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.hadoop.hbase.util.Pair;
 import org.apache.phoenix.jdbc.PhoenixConnection;
 import org.apache.phoenix.query.QueryServices;
@@ -278,7 +279,7 @@ public class QueryMoreIT extends ParallelStatsDisabledIT {
 values[i] = rs.getObject(i + 1);
 }
 conn = getTenantSpecificConnection(tenantId);
-            pkIds.add(Base64.encodeBytes(PhoenixRuntime.encodeColumnValues(conn, tableOrViewName.toUpperCase(), values, columns)));
+            pkIds.add(Bytes.toString(Base64.getEncoder().encode(PhoenixRuntime.encodeColumnValues(conn, tableOrViewName.toUpperCase(), values, columns))));
 }
 return pkIds.toArray(new String[pkIds.size()]);
 }
@@ -296,7 +297,7 @@ public class QueryMoreIT extends ParallelStatsDisabledIT {
 PreparedStatement stmt = conn.prepareStatement(query);
 int bindCounter = 1;
 for (int i = 0; i < cursorIds.length; i++) {
-            Object[] pkParts = PhoenixRuntime.decodeColumnValues(conn, tableName.toUpperCase(), Base64.decode(cursorIds[i]), columns);
+            Object[] pkParts = PhoenixRuntime.decodeColumnValues(conn, tableName.toUpperCase(), Base64.getDecoder().decode(cursorIds[i]), columns);
 for (int j = 0; j < pkParts.length; j++) {
 stmt.setObject(bindCounter++, pkParts[j]);
 }

http://git-wip-us.apache.org/repos/asf/phoenix/blob/22934e5a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
 
b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
index ff9ff72..bf5a538 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
@@ -17,9 +17,11 @@
  */
 package org.apache.phoenix.mapreduce;
 
+import java.util.Base64;
+
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.fs.Path;
-import org.apache.hadoop.hbase.util.Base64;
+import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil;
 import org.apache.phoenix.query.QueryConstants;
 import org.apache.phoenix.query.QueryServices;
@@ -68,7 +70,7 @@ public class CsvBulkImportUtil {
 
 @VisibleForTesting
 static void setChar(Configuration conf, String confKey, char charValue) {
-        conf.set(confKey, Base64.encodeBytes(Character.toString(charValue).getBytes()));
+        conf.set(confKey, Bytes.toString(Base64.getEncoder().encode(Character.toString(charValue).getBytes())));
 }
 

phoenix git commit: PHOENIX-4825 Replace usage of HBase Base64 implementation with java.util.Base64

2018-07-30 Thread ankit
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.3 bf6db8f4d -> 6f5926b6b


PHOENIX-4825 Replace usage of HBase Base64 implementation with java.util.Base64


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/6f5926b6
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/6f5926b6
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/6f5926b6

Branch: refs/heads/4.x-HBase-1.3
Commit: 6f5926b6b1f3d89b7283a5d030d6f46533dc0d39
Parents: bf6db8f
Author: Ankit Singhal 
Authored: Mon Jul 30 13:56:47 2018 -0700
Committer: Ankit Singhal 
Committed: Mon Jul 30 13:56:47 2018 -0700

--
 .../org/apache/phoenix/end2end/QueryMoreIT.java |  7 +++--
 .../phoenix/mapreduce/CsvBulkImportUtil.java|  8 --
 .../util/PhoenixConfigurationUtil.java  |  7 +++--
 .../apache/phoenix/schema/types/PVarbinary.java |  4 +--
 .../phoenix/util/csv/CsvUpsertExecutor.java |  4 +--
 .../phoenix/util/json/JsonUpsertExecutor.java   |  4 +--
 .../util/AbstractUpsertExecutorTest.java| 12 
 .../util/TenantIdByteConversionTest.java| 30 
 8 files changed, 50 insertions(+), 26 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/6f5926b6/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
index 04272fa..528fe7f 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
@@ -31,12 +31,13 @@ import java.sql.ResultSet;
 import java.sql.SQLException;
 import java.sql.Statement;
 import java.util.ArrayList;
+import java.util.Base64;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
 import java.util.Properties;
 
-import org.apache.hadoop.hbase.util.Base64;
+import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.hadoop.hbase.util.Pair;
 import org.apache.phoenix.jdbc.PhoenixConnection;
 import org.apache.phoenix.query.QueryServices;
@@ -278,7 +279,7 @@ public class QueryMoreIT extends ParallelStatsDisabledIT {
 values[i] = rs.getObject(i + 1);
 }
 conn = getTenantSpecificConnection(tenantId);
-            pkIds.add(Base64.encodeBytes(PhoenixRuntime.encodeColumnValues(conn, tableOrViewName.toUpperCase(), values, columns)));
+            pkIds.add(Bytes.toString(Base64.getEncoder().encode(PhoenixRuntime.encodeColumnValues(conn, tableOrViewName.toUpperCase(), values, columns))));
 }
 return pkIds.toArray(new String[pkIds.size()]);
 }
@@ -296,7 +297,7 @@ public class QueryMoreIT extends ParallelStatsDisabledIT {
 PreparedStatement stmt = conn.prepareStatement(query);
 int bindCounter = 1;
 for (int i = 0; i < cursorIds.length; i++) {
-            Object[] pkParts = PhoenixRuntime.decodeColumnValues(conn, tableName.toUpperCase(), Base64.decode(cursorIds[i]), columns);
+            Object[] pkParts = PhoenixRuntime.decodeColumnValues(conn, tableName.toUpperCase(), Base64.getDecoder().decode(cursorIds[i]), columns);
 for (int j = 0; j < pkParts.length; j++) {
 stmt.setObject(bindCounter++, pkParts[j]);
 }

http://git-wip-us.apache.org/repos/asf/phoenix/blob/6f5926b6/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
 
b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
index ff9ff72..bf5a538 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
@@ -17,9 +17,11 @@
  */
 package org.apache.phoenix.mapreduce;
 
+import java.util.Base64;
+
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.fs.Path;
-import org.apache.hadoop.hbase.util.Base64;
+import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil;
 import org.apache.phoenix.query.QueryConstants;
 import org.apache.phoenix.query.QueryServices;
@@ -68,7 +70,7 @@ public class CsvBulkImportUtil {
 
 @VisibleForTesting
 static void setChar(Configuration conf, String confKey, char charValue) {
-        conf.set(confKey, Base64.encodeBytes(Character.toString(charValue).getBytes()));
+        conf.set(confKey, Bytes.toString(Base64.getEncoder().encode(Character.toString(charValue).getBytes())));
 }
 

Build failed in Jenkins: Phoenix | Master #2073

2018-07-30 Thread Apache Jenkins Server
See 


Changes:

[ankitsinghal59] PHOENIX-4825 Replace usage of HBase Base64 implementation with

--
[...truncated 63.01 KB...]
[WARNING] Tests run: 7, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.924 
s - in org.apache.phoenix.jdbc.PhoenixDriverTest
[INFO] Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.191 s 
- in org.apache.phoenix.expression.RoundFloorCeilExpressionsTest
[INFO] Running org.apache.phoenix.cache.JodaTimezoneCacheTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.912 s 
- in org.apache.phoenix.jdbc.PhoenixResultSetMetadataTest
[INFO] Running org.apache.phoenix.cache.TenantCacheTest
[INFO] Running org.apache.phoenix.filter.DistinctPrefixFilterTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.012 s 
- in org.apache.phoenix.cache.TenantCacheTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.046 s 
- in org.apache.phoenix.cache.JodaTimezoneCacheTest
[INFO] Running org.apache.phoenix.filter.SkipScanFilterTest
[INFO] Running org.apache.phoenix.filter.SkipScanFilterIntersectTest
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.04 s - 
in org.apache.phoenix.filter.DistinctPrefixFilterTest
[INFO] Running org.apache.phoenix.filter.SkipScanBigFilterTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.214 s 
- in org.apache.phoenix.index.automated.MRJobSubmitterTest
[INFO] Tests run: 19, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.04 s 
- in org.apache.phoenix.filter.SkipScanFilterIntersectTest
[INFO] Running org.apache.phoenix.util.csv.CsvUpsertExecutorTest
[INFO] Running org.apache.phoenix.util.csv.StringToArrayConverterTest
[INFO] Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.104 s 
- in org.apache.phoenix.filter.SkipScanFilterTest
[INFO] Running org.apache.phoenix.util.PhoenixEncodeDecodeTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.362 s 
- in org.apache.phoenix.util.csv.StringToArrayConverterTest
[INFO] Running org.apache.phoenix.util.MetaDataUtilTest
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.035 s 
- in org.apache.phoenix.util.MetaDataUtilTest
[INFO] Running org.apache.phoenix.util.EquiDepthStreamHistogramTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.03 s - in org.apache.phoenix.query.KeyRangeMoreTest
[INFO] Running org.apache.phoenix.util.LogUtilTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.412 s - in org.apache.phoenix.util.PhoenixEncodeDecodeTest
[INFO] Running org.apache.phoenix.util.TenantIdByteConversionTest
[INFO] Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.03 s - in org.apache.phoenix.util.TenantIdByteConversionTest
[INFO] Running org.apache.phoenix.util.ByteUtilTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.034 s - in org.apache.phoenix.util.ByteUtilTest
[INFO] Running org.apache.phoenix.util.ScanUtilTest
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.606 s - in org.apache.phoenix.util.csv.CsvUpsertExecutorTest
[INFO] Running org.apache.phoenix.util.SequenceUtilTest
[INFO] Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.072 s - in org.apache.phoenix.util.ScanUtilTest
[INFO] Running org.apache.phoenix.util.PhoenixContextExecutorTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 s - in org.apache.phoenix.util.PhoenixContextExecutorTest
[INFO] Running org.apache.phoenix.util.QueryUtilTest
[INFO] Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.009 s - in org.apache.phoenix.util.QueryUtilTest
[INFO] Running org.apache.phoenix.util.json.JsonUpsertExecutorTest
[INFO] Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.023 s - in org.apache.phoenix.util.SequenceUtilTest
[INFO] Running org.apache.phoenix.util.PropertiesUtilTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.043 s - in org.apache.phoenix.util.PropertiesUtilTest
[INFO] Running org.apache.phoenix.util.QualifierEncodingSchemeTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.718 s - in org.apache.phoenix.filter.SkipScanBigFilterTest
[INFO] Running org.apache.phoenix.util.LikeExpressionTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.004 s - in org.apache.phoenix.util.LikeExpressionTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.013 s - in org.apache.phoenix.util.QualifierEncodingSchemeTest
[INFO] Running org.apache.phoenix.util.PrefixByteEncoderDecoderTest
[INFO] Running org.apache.phoenix.util.IndexUtilTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.005 s - in 

phoenix git commit: PHOENIX-4826 Changes to support HBase 2.0.1

2018-07-30 Thread ankit
Repository: phoenix
Updated Branches:
  refs/heads/master e26e0f29b -> a4f93eb45


PHOENIX-4826 Changes to support HBase 2.0.1


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/a4f93eb4
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/a4f93eb4
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/a4f93eb4

Branch: refs/heads/master
Commit: a4f93eb458c516206cc3ed25978fb025d752a2a7
Parents: e26e0f2
Author: Ankit Singhal 
Authored: Mon Jul 30 13:52:21 2018 -0700
Committer: Ankit Singhal 
Committed: Mon Jul 30 13:52:21 2018 -0700

--
 .../index/covered/data/DelegateComparator.java  | 83 
 .../hbase/index/covered/data/IndexMemStore.java |  6 +-
 .../index/covered/data/TestIndexMemStore.java   |  6 +-
 pom.xml |  2 +-
 4 files changed, 90 insertions(+), 7 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/a4f93eb4/phoenix-core/src/main/java/org/apache/phoenix/hbase/index/covered/data/DelegateComparator.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/hbase/index/covered/data/DelegateComparator.java b/phoenix-core/src/main/java/org/apache/phoenix/hbase/index/covered/data/DelegateComparator.java
new file mode 100644
index 000..478d98b
--- /dev/null
+++ b/phoenix-core/src/main/java/org/apache/phoenix/hbase/index/covered/data/DelegateComparator.java
@@ -0,0 +1,83 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.phoenix.hbase.index.covered.data;
+
+import java.util.Comparator;
+
+import org.apache.hadoop.hbase.Cell;
+import org.apache.hadoop.hbase.CellComparator;
+
+public class DelegateComparator implements CellComparator {
+
+private CellComparator delegate;
+
+public DelegateComparator(CellComparator delegate) {
+this.delegate=delegate;
+}
+
+@Override
+public int compare(Cell leftCell, Cell rightCell) {
+return delegate.compare(leftCell, rightCell);
+}
+
+@Override
+public int compareRows(Cell leftCell, Cell rightCell) {
+return delegate.compareRows(leftCell, rightCell);
+}
+
+@Override
+public int compareRows(Cell cell, byte[] bytes, int offset, int length) {
+return delegate.compareRows(cell, bytes, offset, length);
+}
+
+@Override
+public int compareWithoutRow(Cell leftCell, Cell rightCell) {
+return delegate.compareWithoutRow(leftCell, rightCell);
+}
+
+@Override
+public int compareFamilies(Cell leftCell, Cell rightCell) {
+return delegate.compareFamilies(leftCell, rightCell);
+}
+
+@Override
+public int compareQualifiers(Cell leftCell, Cell rightCell) {
+return delegate.compareQualifiers(leftCell, rightCell);
+}
+
+@Override
+public int compareTimestamps(Cell leftCell, Cell rightCell) {
+return delegate.compareTimestamps(leftCell, rightCell);
+}
+
+@Override
+public int compareTimestamps(long leftCellts, long rightCellts) {
+return delegate.compareTimestamps(leftCellts, rightCellts);
+}
+
+@Override
+public int compare(Cell leftCell, Cell rightCell, boolean ignoreSequenceid) {
+return delegate.compare(leftCell, rightCell, ignoreSequenceid);
+}
+
+@Override
+public Comparator<Cell> getSimpleComparator() {
+return delegate.getSimpleComparator();
+}
+
+}
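The class above is a pure delegation wrapper: every CellComparator method forwards to the wrapped instance, which lets Phoenix adapt whatever comparator HBase 2.0.1 hands it without subclassing HBase internals. Reduced to the JDK's Comparator for illustration (the class and variable names here are hypothetical, not Phoenix code), the pattern looks like this:

```java
import java.util.Comparator;

// Delegation wrapper: forward every call to the wrapped comparator, so a
// subclass can override only the methods whose behavior must change.
class DelegatingComparator<T> implements Comparator<T> {
    private final Comparator<T> delegate;

    DelegatingComparator(Comparator<T> delegate) {
        this.delegate = delegate;
    }

    @Override
    public int compare(T left, T right) {
        // Forward unchanged to the wrapped instance.
        return delegate.compare(left, right);
    }
}

public class DelegationDemo {
    public static void main(String[] args) {
        Comparator<String> wrapped =
                new DelegatingComparator<>(Comparator.naturalOrder());
        // Behaves exactly like the delegate it wraps:
        System.out.println(wrapped.compare("a", "b") < 0); // prints "true"
    }
}
```

The payoff of the pattern is that only the forwarding class touches the wrapped type's full surface; callers and subclasses stay insulated from interface growth between HBase versions.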

http://git-wip-us.apache.org/repos/asf/phoenix/blob/a4f93eb4/phoenix-core/src/main/java/org/apache/phoenix/hbase/index/covered/data/IndexMemStore.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/hbase/index/covered/data/IndexMemStore.java b/phoenix-core/src/main/java/org/apache/phoenix/hbase/index/covered/data/IndexMemStore.java
index 8247496..301d825 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/hbase/index/covered/data/IndexMemStore.java
+++ 

phoenix git commit: PHOENIX-4825 Replace usage of HBase Base64 implementation with java.util.Base64

2018-07-30 Thread ankit
Repository: phoenix
Updated Branches:
  refs/heads/master e65917eb2 -> e26e0f29b


PHOENIX-4825 Replace usage of HBase Base64 implementation with java.util.Base64


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/e26e0f29
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/e26e0f29
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/e26e0f29

Branch: refs/heads/master
Commit: e26e0f29b91dceaf3ca0a9fb76944803e707fbbc
Parents: e65917e
Author: Ankit Singhal 
Authored: Mon Jul 30 13:51:43 2018 -0700
Committer: Ankit Singhal 
Committed: Mon Jul 30 13:51:43 2018 -0700

--
 .../org/apache/phoenix/end2end/QueryMoreIT.java |  7 +++--
 .../phoenix/mapreduce/CsvBulkImportUtil.java|  8 --
 .../util/PhoenixConfigurationUtil.java  |  7 +++--
 .../apache/phoenix/schema/types/PVarbinary.java |  4 +--
 .../phoenix/util/csv/CsvUpsertExecutor.java |  4 +--
 .../phoenix/util/json/JsonUpsertExecutor.java   |  4 +--
 .../util/AbstractUpsertExecutorTest.java| 12 
 .../util/TenantIdByteConversionTest.java| 30 
 8 files changed, 50 insertions(+), 26 deletions(-)
--
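The substance of PHOENIX-4825 is visible in the per-file diffs below: each call into the removed HBase Base64 helper becomes a call through java.util.Base64, with Bytes.toString covering the byte[]-to-String step. A standalone sketch of that mapping (the class name and payload here are illustrative, not Phoenix code):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Base64Migration {
    // Old (removed in HBase 2.x): org.apache.hadoop.hbase.util.Base64.encodeBytes(bytes)
    // New (JDK 8+): java.util.Base64, whose encodeToString returns a String directly.
    public static String encode(byte[] bytes) {
        return Base64.getEncoder().encodeToString(bytes);
    }

    // Old: org.apache.hadoop.hbase.util.Base64.decode(s)
    public static byte[] decode(String s) {
        return Base64.getDecoder().decode(s);
    }

    public static void main(String[] args) {
        String enc = encode("phoenix".getBytes(StandardCharsets.UTF_8));
        System.out.println(enc);
        System.out.println(new String(decode(enc), StandardCharsets.UTF_8));
    }
}
```

Note that the patch itself writes `Bytes.toString(Base64.getEncoder().encode(...))` rather than `encodeToString(...)`; both yield the same String, since the encoder's output is plain ASCII.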


http://git-wip-us.apache.org/repos/asf/phoenix/blob/e26e0f29/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
index 04272fa..528fe7f 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
@@ -31,12 +31,13 @@ import java.sql.ResultSet;
 import java.sql.SQLException;
 import java.sql.Statement;
 import java.util.ArrayList;
+import java.util.Base64;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
 import java.util.Properties;
 
-import org.apache.hadoop.hbase.util.Base64;
+import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.hadoop.hbase.util.Pair;
 import org.apache.phoenix.jdbc.PhoenixConnection;
 import org.apache.phoenix.query.QueryServices;
@@ -278,7 +279,7 @@ public class QueryMoreIT extends ParallelStatsDisabledIT {
 values[i] = rs.getObject(i + 1);
 }
 conn = getTenantSpecificConnection(tenantId);
-pkIds.add(Base64.encodeBytes(PhoenixRuntime.encodeColumnValues(conn, tableOrViewName.toUpperCase(), values, columns)));
+pkIds.add(Bytes.toString(Base64.getEncoder().encode(PhoenixRuntime.encodeColumnValues(conn, tableOrViewName.toUpperCase(), values, columns))));
 }
 return pkIds.toArray(new String[pkIds.size()]);
 }
@@ -296,7 +297,7 @@ public class QueryMoreIT extends ParallelStatsDisabledIT {
 PreparedStatement stmt = conn.prepareStatement(query);
 int bindCounter = 1;
 for (int i = 0; i < cursorIds.length; i++) {
-Object[] pkParts = PhoenixRuntime.decodeColumnValues(conn, tableName.toUpperCase(), Base64.decode(cursorIds[i]), columns);
+Object[] pkParts = PhoenixRuntime.decodeColumnValues(conn, tableName.toUpperCase(), Base64.getDecoder().decode(cursorIds[i]), columns);
 for (int j = 0; j < pkParts.length; j++) {
 stmt.setObject(bindCounter++, pkParts[j]);
 }

http://git-wip-us.apache.org/repos/asf/phoenix/blob/e26e0f29/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
index ff9ff72..bf5a538 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
@@ -17,9 +17,11 @@
  */
 package org.apache.phoenix.mapreduce;
 
+import java.util.Base64;
+
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.fs.Path;
-import org.apache.hadoop.hbase.util.Base64;
+import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil;
 import org.apache.phoenix.query.QueryConstants;
 import org.apache.phoenix.query.QueryServices;
@@ -68,7 +70,7 @@ public class CsvBulkImportUtil {
 
 @VisibleForTesting
 static void setChar(Configuration conf, String confKey, char charValue) {
-conf.set(confKey, Base64.encodeBytes(Character.toString(charValue).getBytes()));
+conf.set(confKey, Bytes.toString(Base64.getEncoder().encode(Character.toString(charValue).getBytes())));
 }
 
 

[phoenix] Git Push Summary

2018-07-30 Thread elserj
Repository: phoenix
Updated Branches:
  refs/heads/5.x-HBase-2.0 [deleted] 14a6f54fc


Build failed in Jenkins: Phoenix Compile Compatibility with HBase #712

2018-07-30 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on H25 (ubuntu xenial) in workspace 

[Phoenix_Compile_Compat_wHBase] $ /bin/bash /tmp/jenkins114142822196625056.sh
core file size  (blocks, -c) 0
data seg size   (kbytes, -d) unlimited
scheduling priority (-e) 0
file size   (blocks, -f) unlimited
pending signals (-i) 386413
max locked memory   (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files  (-n) 6
pipe size(512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority  (-r) 0
stack size  (kbytes, -s) 8192
cpu time   (seconds, -t) unlimited
max user processes  (-u) 10240
virtual memory  (kbytes, -v) unlimited
file locks  (-x) unlimited
core id : 0
core id : 1
core id : 2
core id : 3
core id : 4
core id : 5
physical id : 0
physical id : 1
MemTotal:   98957736 kB
MemFree:47771112 kB
Filesystem  Size  Used Avail Use% Mounted on
udev 48G 0   48G   0% /dev
tmpfs   9.5G   90M  9.4G   1% /run
/dev/sda1   364G  294G   52G  85% /
tmpfs48G 0   48G   0% /dev/shm
tmpfs   5.0M 0  5.0M   0% /run/lock
tmpfs48G 0   48G   0% /sys/fs/cgroup
tmpfs   9.5G 0  9.5G   0% /run/user/910
apache-maven-2.2.1
apache-maven-3.0.4
apache-maven-3.0.5
apache-maven-3.2.1
apache-maven-3.2.5
apache-maven-3.3.3
apache-maven-3.3.9
apache-maven-3.5.0
apache-maven-3.5.2
apache-maven-3.5.4
latest
latest2
latest3


===
Verifying compile level compatibility with HBase 0.98 with Phoenix 4.x-HBase-0.98
===

Cloning into 'hbase'...
Switched to a new branch '0.98'
Branch 0.98 set up to track remote branch 0.98 from origin.
[ERROR] [ERROR] Some problems were encountered while processing the POMs:
[ERROR] Unresolveable build extension: Plugin org.apache.felix:maven-bundle-plugin:2.5.3 or one of its dependencies could not be resolved: The following artifacts could not be resolved: org.apache.maven:maven-core:jar:2.0.7, org.apache.maven:maven-settings:jar:2.0.7, org.apache.maven:maven-plugin-parameter-documenter:jar:2.0.7, org.apache.maven:maven-profile:jar:2.0.7, org.apache.maven:maven-model:jar:2.0.7, org.apache.maven:maven-artifact:jar:2.0.7, org.apache.maven:maven-repository-metadata:jar:2.0.7, org.apache.maven:maven-error-diagnostics:jar:2.0.7, org.apache.maven:maven-project:jar:2.0.7, org.apache.maven:maven-plugin-registry:jar:2.0.7, org.apache.maven:maven-plugin-api:jar:2.0.7, org.apache.maven:maven-plugin-descriptor:jar:2.0.7, org.apache.maven:maven-artifact-manager:jar:2.0.7, org.apache.maven:maven-monitor:jar:2.0.7: Could not transfer artifact org.apache.maven:maven-core:jar:2.0.7 from/to central (https://repo.maven.apache.org/maven2): Received fatal alert: protocol_version @ 
 @ 
[ERROR] The build could not read 1 project -> [Help 1]
[ERROR]   
[ERROR]   The project org.apache.hbase:hbase:0.98.25-SNAPSHOT ( has 1 error
[ERROR] Unresolveable build extension: Plugin org.apache.felix:maven-bundle-plugin:2.5.3 or one of its dependencies could not be resolved: The following artifacts could not be resolved: org.apache.maven:maven-core:jar:2.0.7, org.apache.maven:maven-settings:jar:2.0.7, org.apache.maven:maven-plugin-parameter-documenter:jar:2.0.7, org.apache.maven:maven-profile:jar:2.0.7, org.apache.maven:maven-model:jar:2.0.7, org.apache.maven:maven-artifact:jar:2.0.7, org.apache.maven:maven-repository-metadata:jar:2.0.7, org.apache.maven:maven-error-diagnostics:jar:2.0.7, org.apache.maven:maven-project:jar:2.0.7, org.apache.maven:maven-plugin-registry:jar:2.0.7, org.apache.maven:maven-plugin-api:jar:2.0.7, org.apache.maven:maven-plugin-descriptor:jar:2.0.7, org.apache.maven:maven-artifact-manager:jar:2.0.7, org.apache.maven:maven-monitor:jar:2.0.7: Could not transfer artifact org.apache.maven:maven-core:jar:2.0.7 from/to central (https://repo.maven.apache.org/maven2): Received fatal alert: protocol_version -> [Help 2]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1]
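The repeated "Received fatal alert: protocol_version" failure above is the usual symptom of an older JVM (JDK 7 defaults to TLS 1.0) talking to repo.maven.apache.org, which by mid-2018 required TLS 1.2. Assuming the build slave cannot simply move to JDK 8+, a commonly used workaround is to force the protocol through MAVEN_OPTS:

```shell
# Force TLS 1.2 for Maven's HTTPS connections on JDK 7, where TLS 1.0 is the default.
# (On JDK 8+ this is unnecessary; upgrading the JVM is the cleaner fix.)
export MAVEN_OPTS="-Dhttps.protocols=TLSv1.2"

# Any subsequent mvn invocation in this shell inherits the setting, e.g.:
# mvn -B clean install
```

This is a sketch of the standard workaround, not something taken from this build's configuration; the Jenkins job itself would need the variable set in its environment or shell step.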