Build failed in Jenkins: Phoenix | Master #1951

2018-03-02 Thread Apache Jenkins Server
See 


Changes:

[tdsilva] PHOENIX-4633 Handle the creation of SYSTEM.CATALOG correctly for users

--
[...truncated 109.88 KB...]
[INFO] Running org.apache.phoenix.end2end.join.SortMergeJoinNoIndexIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 498.804 
s - in org.apache.phoenix.end2end.index.LocalMutableTxIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SubqueryIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 367.446 
s - in org.apache.phoenix.end2end.join.HashJoinNoIndexIT
[INFO] Tests run: 72, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 837.393 
s - in org.apache.phoenix.end2end.index.MutableIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SubqueryUsingSortMergeJoinIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 375.566 
s - in org.apache.phoenix.end2end.join.SortMergeJoinNoIndexIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 625.784 
s - in org.apache.phoenix.end2end.join.HashJoinGlobalIndexIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.835 s 
- in org.apache.phoenix.end2end.salted.SaltedTableIT
[INFO] Running org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.737 s 
- in org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Running org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 405.533 
s - in org.apache.phoenix.end2end.join.SubqueryIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.431 s 
- in org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Running org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 55.232 s 
- in org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Running org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.452 s 
- in org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Running org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 43.506 s 
- in org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Running org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Running org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.485 s 
- in org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Running org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 80.292 s 
- in org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Running org.apache.phoenix.tx.TransactionIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.761 s 
- in org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Running org.apache.phoenix.tx.TxCheckpointIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 104.324 
s - in org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 651.055 
s - in org.apache.phoenix.end2end.join.SortMergeJoinGlobalIndexIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 101.105 
s - in org.apache.phoenix.tx.TransactionIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 838.427 
s - in org.apache.phoenix.end2end.join.HashJoinLocalIndexIT
[INFO] Running org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 293.535 
s - in org.apache.phoenix.end2end.join.SubqueryUsingSortMergeJoinIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.979 s 
- in org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 838.099 
s - in org.apache.phoenix.end2end.join.SortMergeJoinLocalIndexIT
[WARNING] Tests run: 52, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 
378.158 s - in org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 452.793 
s - in org.apache.phoenix.tx.TxCheckpointIT
[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Failures: 
[ERROR]   ConcurrentMutationsIT.testConcurrentDeletesAndUpsertValues:214 Expected to find PK in data table: (0,0)
[ERROR]   DefaultColumnValueIT.testDefaultIndexed:978
[ERROR]   RowValueConstructorIT.testRVCLastPkIsTable1stPkIndex:1584
[ERROR]   

Apache Phoenix - Timeout crawler - Build https://builds.apache.org/job/Phoenix-4.x-HBase-0.98/1826/

2018-03-02 Thread Apache Jenkins Server
[...truncated 50 lines...]

Build failed in Jenkins: Phoenix | 4.x-HBase-0.98 #1826

2018-03-02 Thread Apache Jenkins Server
See 


Changes:

[tdsilva] PHOENIX-4633 Handle the creation of SYSTEM.CATALOG correctly for users

--
[...truncated 110.62 KB...]
org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: org.apache.hadoop.hbase.DoNotRetryIOException: Failed 1 action: RegionOpeningException: 1 time, servers with issues: asf931.gq1.ygridcore.net,34354,1520038603157,
        at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:80)
        at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:62)
        at org.apache.phoenix.index.PhoenixTransactionalIndexer.postBatchMutateIndispensably(PhoenixTransactionalIndexer.java:240)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$37.call(RegionCoprocessorHost.java:1040)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$RegionOperation.call(RegionCoprocessorHost.java:1656)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperation(RegionCoprocessorHost.java:1733)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperation(RegionCoprocessorHost.java:1688)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.postBatchMutateIndispensably(RegionCoprocessorHost.java:1036)
        at org.apache.hadoop.hbase.regionserver.HRegion.doMiniBatchMutation(HRegion.java:2767)
        at org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:2359)
        at org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:2314)
        at org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:2318)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.doBatchOp(HRegionServer.java:4678)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.doNonAtomicRegionMutation(HRegionServer.java:3835)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.multi(HRegionServer.java:3680)
        at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32500)
        at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2195)
        at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
        at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
        at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: RegionOpeningException: 1 time, servers with issues: asf931.gq1.ygridcore.net,34354,1520038603157,
        at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:211)
        at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:195)
        at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:1082)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processBatchCallback(HConnectionManager.java:2479)
        at org.apache.hadoop.hbase.client.HTable.batchCallback(HTable.java:898)
        at org.apache.hadoop.hbase.client.HTable.batchCallback(HTable.java:913)
        at org.apache.hadoop.hbase.client.HTable.batch(HTable.java:888)
        at org.apache.phoenix.hbase.index.write.ParallelWriterIndexCommitter$1.call(ParallelWriterIndexCommitter.java:170)
        at org.apache.phoenix.hbase.index.write.ParallelWriterIndexCommitter$1.call(ParallelWriterIndexCommitter.java:133)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        ... 1 more
: 1 time, servers with issues: asf931.gq1.ygridcore.net,34354,1520038603157,
        at org.apache.phoenix.tx.ParameterizedTransactionIT.testNonTxToTxTable(ParameterizedTransactionIT.java:288)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: org.apache.hadoop.hbase.DoNotRetryIOException: Failed 1 action: RegionOpeningException: 1 time, servers with issues: asf931.gq1.ygridcore.net,34354,1520038603157,
        at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:80)
        at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:62)
        at org.apache.phoenix.index.PhoenixTransactionalIndexer.postBatchMutateIndispensably(PhoenixTransactionalIndexer.java:240)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$37.call(RegionCoprocessorHost.java:1040)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$RegionOperation.call(RegionCoprocessorHost.java:1656)
        at

Apache-Phoenix | 4.x-HBase-1.2 | Build Successful

2018-03-02 Thread Apache Jenkins Server
4.x-HBase-1.2 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.2

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.2/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.2/lastCompletedBuild/testReport/

Changes
[gjacoby] PHOENIX-4607 - Allow PhoenixInputFormat to use tenant-specific



Build times for last couple of runs. Latest build time is the rightmost. | Legend: blue: normal, red: test failure, gray: timeout


Build failed in Jenkins: Phoenix-4.x-HBase-1.3 #51

2018-03-02 Thread Apache Jenkins Server
See 


Changes:

[gjacoby] PHOENIX-4607 - Allow PhoenixInputFormat to use tenant-specific

--
[...truncated 104.17 KB...]
[INFO] Running org.apache.phoenix.end2end.index.SaltedIndexIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.279 s 
- in org.apache.phoenix.end2end.index.SaltedIndexIT
[INFO] Running org.apache.phoenix.end2end.index.ViewIndexIT
[INFO] Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 247.783 
s - in org.apache.phoenix.end2end.index.IndexWithTableSchemaChangeIT
[INFO] Running org.apache.phoenix.end2end.index.txn.MutableRollbackIT
[WARNING] Tests run: 12, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 
109.262 s - in org.apache.phoenix.end2end.index.ViewIndexIT
[INFO] Tests run: 37, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 374.639 
s - in org.apache.phoenix.end2end.index.IndexUsageIT
[INFO] Running org.apache.phoenix.end2end.index.txn.RollbackIT
[INFO] Running org.apache.phoenix.end2end.join.HashJoinCacheIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.439 s 
- in org.apache.phoenix.end2end.join.HashJoinCacheIT
[INFO] Running org.apache.phoenix.end2end.join.HashJoinGlobalIndexIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 112.246 
s - in org.apache.phoenix.end2end.index.txn.MutableRollbackIT
[INFO] Running org.apache.phoenix.end2end.join.HashJoinLocalIndexIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 85.1 s - 
in org.apache.phoenix.end2end.index.txn.RollbackIT
[INFO] Running org.apache.phoenix.end2end.join.HashJoinMoreIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 466.642 
s - in org.apache.phoenix.end2end.index.LocalImmutableNonTxIndexIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 487.716 
s - in org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT
[INFO] Running org.apache.phoenix.end2end.join.HashJoinNoIndexIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 101.688 
s - in org.apache.phoenix.end2end.join.HashJoinMoreIT
[INFO] Running org.apache.phoenix.end2end.join.SortMergeJoinLocalIndexIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 467.905 
s - in org.apache.phoenix.end2end.index.LocalMutableNonTxIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SortMergeJoinGlobalIndexIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 491.627 
s - in org.apache.phoenix.end2end.index.LocalMutableTxIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SortMergeJoinNoIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SubqueryIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 364.694 
s - in org.apache.phoenix.end2end.join.HashJoinNoIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SubqueryUsingSortMergeJoinIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 378.083 
s - in org.apache.phoenix.end2end.join.SortMergeJoinNoIndexIT
[INFO] Tests run: 72, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 838.488 
s - in org.apache.phoenix.end2end.index.MutableIndexIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 624.266 
s - in org.apache.phoenix.end2end.join.HashJoinGlobalIndexIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableIT
[INFO] Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 400.8 s 
- in org.apache.phoenix.end2end.join.SubqueryIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.386 s 
- in org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Running org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.81 s 
- in org.apache.phoenix.end2end.salted.SaltedTableIT
[INFO] Running org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.508 s 
- in org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Running org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Running org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.229 s 
- in org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Running org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.915 s 
- in org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Running org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 43.162 s 
- in org.apache.phoenix.rpc.UpdateCacheIT
[INFO] 

phoenix git commit: PHOENIX-4633 Handle the creation of SYSTEM.CATALOG correctly for users that don't have CREATE access on SYSTEM.CATALOG

2018-03-02 Thread tdsilva
Repository: phoenix
Updated Branches:
  refs/heads/master a35591c87 -> 2b7861536


PHOENIX-4633 Handle the creation of SYSTEM.CATALOG correctly for users that 
don't have CREATE access on SYSTEM.CATALOG


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/2b786153
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/2b786153
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/2b786153

Branch: refs/heads/master
Commit: 2b7861536909a51167bcd5006eb70f65f03d9536
Parents: a35591c
Author: Thomas D'Silva 
Authored: Wed Feb 28 14:32:27 2018 -0800
Committer: Thomas D'Silva 
Committed: Fri Mar 2 15:52:07 2018 -0800

--
 .../phoenix/query/ConnectionQueryServicesImpl.java  | 16 +++-
 1 file changed, 15 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/2b786153/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
index 6926c4e..6bff885 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
@@ -129,6 +129,7 @@ import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.hadoop.hbase.util.Pair;
 import org.apache.hadoop.hbase.util.VersionInfo;
 import org.apache.hadoop.hbase.zookeeper.ZKConfig;
+import org.apache.hadoop.ipc.RemoteException;
 import org.apache.phoenix.compile.MutationPlan;
 import org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver;
 import org.apache.phoenix.coprocessor.MetaDataEndpointImpl;
@@ -2449,7 +2450,20 @@ public class ConnectionQueryServicesImpl extends DelegateQueryServices implement
                         setUpgradeRequired();
                     }
                 } catch (PhoenixIOException e) {
-                    if (!Iterables.isEmpty(Iterables.filter(Throwables.getCausalChain(e), AccessDeniedException.class))) {
+                    boolean foundAccessDeniedException = false;
+                    // when running spark/map reduce jobs the ADE might be wrapped
+                    // in a RemoteException
+                    for (Throwable t : Throwables.getCausalChain(e)) {
+                        if (t instanceof AccessDeniedException
+                                || (t instanceof RemoteException
+                                        && ((RemoteException) t).getClassName()
+                                                .equals(AccessDeniedException.class
+                                                        .getName()))) {
+                            foundAccessDeniedException = true;
+                            break;
+                        }
+                    }
+                    if (foundAccessDeniedException) {
                         // Pass
                         logger.warn("Could not check for Phoenix SYSTEM tables, assuming they exist and are properly configured");
                         checkClientServerCompatibility(SchemaUtil.getPhysicalName(SYSTEM_CATALOG_NAME_BYTES, getProps()).getName());
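Note: the heart of the patch is that the catch block no longer looks only for a direct AccessDeniedException in the causal chain, because Spark and MapReduce clients receive the server-side exception wrapped in a Hadoop RemoteException, which records the original exception's class name rather than nesting the exception itself. A minimal standalone sketch of the same check is below; the helper class and method names are hypothetical and are not part of the commit.

    import org.apache.hadoop.hbase.security.AccessDeniedException;
    import org.apache.hadoop.ipc.RemoteException;

    import com.google.common.base.Throwables;

    // Illustration only, not the committed code: returns true if the causal chain of e
    // contains an AccessDeniedException, either directly or carried as a class name
    // inside a Hadoop RemoteException.
    public final class AccessDeniedChecker {
        public static boolean causedByAccessDenied(Throwable e) {
            for (Throwable t : Throwables.getCausalChain(e)) {
                if (t instanceof AccessDeniedException) {
                    return true;
                }
                if (t instanceof RemoteException
                        && AccessDeniedException.class.getName()
                                .equals(((RemoteException) t).getClassName())) {
                    return true;
                }
            }
            return false;
        }
    }

Comparing class names, instead of relying on instanceof alone, is what makes the check work across the RPC boundary, since the remote exception is not automatically re-instantiated on the client.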



phoenix git commit: PHOENIX-4633 Handle the creation of SYSTEM.CATALOG correctly for users that don't have CREATE access on SYSTEM.CATALOG

2018-03-02 Thread tdsilva
Repository: phoenix
Updated Branches:
  refs/heads/5.x-HBase-2.0 4f7949135 -> 81784909f


PHOENIX-4633 Handle the creation of SYSTEM.CATALOG correctly for users that 
don't have CREATE access on SYSTEM.CATALOG


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/81784909
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/81784909
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/81784909

Branch: refs/heads/5.x-HBase-2.0
Commit: 81784909f76335ae95e5e5a417c2722d283020de
Parents: 4f79491
Author: Thomas D'Silva 
Authored: Wed Feb 28 14:32:27 2018 -0800
Committer: Thomas D'Silva 
Committed: Fri Mar 2 15:51:49 2018 -0800

--
 .../phoenix/query/ConnectionQueryServicesImpl.java  | 16 +++-
 1 file changed, 15 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/81784909/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
index 07bef1b..44e7f33 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
@@ -135,6 +135,7 @@ import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.hadoop.hbase.util.Pair;
 import org.apache.hadoop.hbase.util.VersionInfo;
 import org.apache.hadoop.hbase.zookeeper.ZKConfig;
+import org.apache.hadoop.ipc.RemoteException;
 import org.apache.phoenix.compile.MutationPlan;
 import org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver;
 import org.apache.phoenix.coprocessor.MetaDataEndpointImpl;
@@ -2485,7 +2486,20 @@ public class ConnectionQueryServicesImpl extends DelegateQueryServices implement
                         setUpgradeRequired();
                     }
                 } catch (PhoenixIOException e) {
-                    if (!Iterables.isEmpty(Iterables.filter(Throwables.getCausalChain(e), AccessDeniedException.class))) {
+                    boolean foundAccessDeniedException = false;
+                    // when running spark/map reduce jobs the ADE might be wrapped
+                    // in a RemoteException
+                    for (Throwable t : Throwables.getCausalChain(e)) {
+                        if (t instanceof AccessDeniedException
+                                || (t instanceof RemoteException
+                                        && ((RemoteException) t).getClassName()
+                                                .equals(AccessDeniedException.class
+                                                        .getName()))) {
+                            foundAccessDeniedException = true;
+                            break;
+                        }
+                    }
+                    if (foundAccessDeniedException) {
                         // Pass
                         logger.warn("Could not check for Phoenix SYSTEM tables, assuming they exist and are properly configured");
                         checkClientServerCompatibility(SchemaUtil.getPhysicalName(SYSTEM_CATALOG_NAME_BYTES, getProps()).getName());



phoenix git commit: PHOENIX-4633 Handle the creation of SYSTEM.CATALOG correctly for users that don't have CREATE access on SYSTEM.CATALOG

2018-03-02 Thread tdsilva
Repository: phoenix
Updated Branches:
  refs/heads/4.x-cdh5.11.2 1da8e8608 -> ca5c9d03c


PHOENIX-4633 Handle the creation of SYSTEM.CATALOG correctly for users that 
don't have CREATE access on SYSTEM.CATALOG


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/ca5c9d03
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/ca5c9d03
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/ca5c9d03

Branch: refs/heads/4.x-cdh5.11.2
Commit: ca5c9d03ceb619f6b8fcb3f443907a0ccf0b9cd0
Parents: 1da8e86
Author: Thomas D'Silva 
Authored: Wed Feb 28 14:32:27 2018 -0800
Committer: Thomas D'Silva 
Committed: Fri Mar 2 15:51:33 2018 -0800

--
 .../phoenix/query/ConnectionQueryServicesImpl.java  | 16 +++-
 1 file changed, 15 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/ca5c9d03/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
index 6926c4e..6bff885 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
@@ -129,6 +129,7 @@ import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.hadoop.hbase.util.Pair;
 import org.apache.hadoop.hbase.util.VersionInfo;
 import org.apache.hadoop.hbase.zookeeper.ZKConfig;
+import org.apache.hadoop.ipc.RemoteException;
 import org.apache.phoenix.compile.MutationPlan;
 import org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver;
 import org.apache.phoenix.coprocessor.MetaDataEndpointImpl;
@@ -2449,7 +2450,20 @@ public class ConnectionQueryServicesImpl extends DelegateQueryServices implement
                         setUpgradeRequired();
                     }
                 } catch (PhoenixIOException e) {
-                    if (!Iterables.isEmpty(Iterables.filter(Throwables.getCausalChain(e), AccessDeniedException.class))) {
+                    boolean foundAccessDeniedException = false;
+                    // when running spark/map reduce jobs the ADE might be wrapped
+                    // in a RemoteException
+                    for (Throwable t : Throwables.getCausalChain(e)) {
+                        if (t instanceof AccessDeniedException
+                                || (t instanceof RemoteException
+                                        && ((RemoteException) t).getClassName()
+                                                .equals(AccessDeniedException.class
+                                                        .getName()))) {
+                            foundAccessDeniedException = true;
+                            break;
+                        }
+                    }
+                    if (foundAccessDeniedException) {
                         // Pass
                         logger.warn("Could not check for Phoenix SYSTEM tables, assuming they exist and are properly configured");
                         checkClientServerCompatibility(SchemaUtil.getPhysicalName(SYSTEM_CATALOG_NAME_BYTES, getProps()).getName());



phoenix git commit: PHOENIX-4633 Handle the creation of SYSTEM.CATALOG correctly for users that don't have CREATE access on SYSTEM.CATALOG

2018-03-02 Thread tdsilva
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.3 bd34ae79f -> 26f06354a


PHOENIX-4633 Handle the creation of SYSTEM.CATALOG correctly for users that 
don't have CREATE access on SYSTEM.CATALOG


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/26f06354
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/26f06354
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/26f06354

Branch: refs/heads/4.x-HBase-1.3
Commit: 26f06354a9a95b6a54bcc2975dc4914f44c79ce8
Parents: bd34ae7
Author: Thomas D'Silva 
Authored: Wed Feb 28 14:32:27 2018 -0800
Committer: Thomas D'Silva 
Committed: Fri Mar 2 15:51:10 2018 -0800

--
 .../phoenix/query/ConnectionQueryServicesImpl.java  | 16 +++-
 1 file changed, 15 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/26f06354/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
index 6926c4e..6bff885 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
@@ -129,6 +129,7 @@ import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.hadoop.hbase.util.Pair;
 import org.apache.hadoop.hbase.util.VersionInfo;
 import org.apache.hadoop.hbase.zookeeper.ZKConfig;
+import org.apache.hadoop.ipc.RemoteException;
 import org.apache.phoenix.compile.MutationPlan;
 import org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver;
 import org.apache.phoenix.coprocessor.MetaDataEndpointImpl;
@@ -2449,7 +2450,20 @@ public class ConnectionQueryServicesImpl extends DelegateQueryServices implement
                         setUpgradeRequired();
                     }
                 } catch (PhoenixIOException e) {
-                    if (!Iterables.isEmpty(Iterables.filter(Throwables.getCausalChain(e), AccessDeniedException.class))) {
+                    boolean foundAccessDeniedException = false;
+                    // when running spark/map reduce jobs the ADE might be wrapped
+                    // in a RemoteException
+                    for (Throwable t : Throwables.getCausalChain(e)) {
+                        if (t instanceof AccessDeniedException
+                                || (t instanceof RemoteException
+                                        && ((RemoteException) t).getClassName()
+                                                .equals(AccessDeniedException.class
+                                                        .getName()))) {
+                            foundAccessDeniedException = true;
+                            break;
+                        }
+                    }
+                    if (foundAccessDeniedException) {
                         // Pass
                         logger.warn("Could not check for Phoenix SYSTEM tables, assuming they exist and are properly configured");
                         checkClientServerCompatibility(SchemaUtil.getPhysicalName(SYSTEM_CATALOG_NAME_BYTES, getProps()).getName());



phoenix git commit: PHOENIX-4633 Handle the creation of SYSTEM.CATALOG correctly for users that don't have CREATE access on SYSTEM.CATALOG

2018-03-02 Thread tdsilva
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.2 8e0c3d1c0 -> 8f19583c5


PHOENIX-4633 Handle the creation of SYSTEM.CATALOG correctly for users that 
don't have CREATE access on SYSTEM.CATALOG


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/8f19583c
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/8f19583c
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/8f19583c

Branch: refs/heads/4.x-HBase-1.2
Commit: 8f19583c5a06fe468797ba49e763a9a5c3c6278b
Parents: 8e0c3d1
Author: Thomas D'Silva 
Authored: Wed Feb 28 14:32:27 2018 -0800
Committer: Thomas D'Silva 
Committed: Fri Mar 2 15:49:12 2018 -0800

--
 .../phoenix/query/ConnectionQueryServicesImpl.java  | 16 +++-
 1 file changed, 15 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/8f19583c/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
index 6926c4e..6bff885 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
@@ -129,6 +129,7 @@ import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.hadoop.hbase.util.Pair;
 import org.apache.hadoop.hbase.util.VersionInfo;
 import org.apache.hadoop.hbase.zookeeper.ZKConfig;
+import org.apache.hadoop.ipc.RemoteException;
 import org.apache.phoenix.compile.MutationPlan;
 import org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver;
 import org.apache.phoenix.coprocessor.MetaDataEndpointImpl;
@@ -2449,7 +2450,20 @@ public class ConnectionQueryServicesImpl extends DelegateQueryServices implement
                         setUpgradeRequired();
                     }
                 } catch (PhoenixIOException e) {
-                    if (!Iterables.isEmpty(Iterables.filter(Throwables.getCausalChain(e), AccessDeniedException.class))) {
+                    boolean foundAccessDeniedException = false;
+                    // when running spark/map reduce jobs the ADE might be wrapped
+                    // in a RemoteException
+                    for (Throwable t : Throwables.getCausalChain(e)) {
+                        if (t instanceof AccessDeniedException
+                                || (t instanceof RemoteException
+                                        && ((RemoteException) t).getClassName()
+                                                .equals(AccessDeniedException.class
+                                                        .getName()))) {
+                            foundAccessDeniedException = true;
+                            break;
+                        }
+                    }
+                    if (foundAccessDeniedException) {
                         // Pass
                         logger.warn("Could not check for Phoenix SYSTEM tables, assuming they exist and are properly configured");
                         checkClientServerCompatibility(SchemaUtil.getPhysicalName(SYSTEM_CATALOG_NAME_BYTES, getProps()).getName());



phoenix git commit: PHOENIX-4633 Handle the creation of SYSTEM.CATALOG correctly for users that don't have CREATE access on SYSTEM.CATALOG

2018-03-02 Thread tdsilva
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.1 93405cb52 -> c1fcc92ce


PHOENIX-4633 Handle the creation of SYSTEM.CATALOG correctly for users that 
don't have CREATE access on SYSTEM.CATALOG


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/c1fcc92c
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/c1fcc92c
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/c1fcc92c

Branch: refs/heads/4.x-HBase-1.1
Commit: c1fcc92cea9ee4136cc8a510d9d3a7aba506d228
Parents: 93405cb
Author: Thomas D'Silva 
Authored: Wed Feb 28 14:32:27 2018 -0800
Committer: Thomas D'Silva 
Committed: Fri Mar 2 15:48:59 2018 -0800

--
 .../phoenix/query/ConnectionQueryServicesImpl.java  | 16 +++-
 1 file changed, 15 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/c1fcc92c/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
index 517a275..ccc6f99 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
@@ -128,6 +128,7 @@ import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.hadoop.hbase.util.Pair;
 import org.apache.hadoop.hbase.util.VersionInfo;
 import org.apache.hadoop.hbase.zookeeper.ZKConfig;
+import org.apache.hadoop.ipc.RemoteException;
 import org.apache.phoenix.compile.MutationPlan;
 import org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver;
 import org.apache.phoenix.coprocessor.MetaDataEndpointImpl;
@@ -2445,7 +2446,20 @@ public class ConnectionQueryServicesImpl extends DelegateQueryServices implement
                         setUpgradeRequired();
                     }
                 } catch (PhoenixIOException e) {
-                    if (!Iterables.isEmpty(Iterables.filter(Throwables.getCausalChain(e), AccessDeniedException.class))) {
+                    boolean foundAccessDeniedException = false;
+                    // when running spark/map reduce jobs the ADE might be wrapped
+                    // in a RemoteException
+                    for (Throwable t : Throwables.getCausalChain(e)) {
+                        if (t instanceof AccessDeniedException
+                                || (t instanceof RemoteException
+                                        && ((RemoteException) t).getClassName()
+                                                .equals(AccessDeniedException.class
+                                                        .getName()))) {
+                            foundAccessDeniedException = true;
+                            break;
+                        }
+                    }
+                    if (foundAccessDeniedException) {
                         // Pass
                         logger.warn("Could not check for Phoenix SYSTEM tables, assuming they exist and are properly configured");
                         checkClientServerCompatibility(SchemaUtil.getPhysicalName(SYSTEM_CATALOG_NAME_BYTES, getProps()).getName());



phoenix git commit: PHOENIX-4633 Handle the creation of SYSTEM.CATALOG correctly for users that don't have CREATE access on SYSTEM.CATALOG

2018-03-02 Thread tdsilva
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-0.98 62a611690 -> 0f7879b17


PHOENIX-4633 Handle the creation of SYSTEM.CATALOG correctly for users that 
don't have CREATE access on SYSTEM.CATALOG


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/0f7879b1
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/0f7879b1
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/0f7879b1

Branch: refs/heads/4.x-HBase-0.98
Commit: 0f7879b17f25c110c145ce449da918a80308b18e
Parents: 62a6116
Author: Thomas D'Silva 
Authored: Wed Feb 28 14:32:27 2018 -0800
Committer: Thomas D'Silva 
Committed: Fri Mar 2 15:48:23 2018 -0800

--
 .../phoenix/query/ConnectionQueryServicesImpl.java  | 16 +++-
 1 file changed, 15 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/0f7879b1/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
index 01c862e..624544f 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
@@ -129,6 +129,7 @@ import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.hadoop.hbase.util.Pair;
 import org.apache.hadoop.hbase.util.VersionInfo;
 import org.apache.hadoop.hbase.zookeeper.ZKConfig;
+import org.apache.hadoop.ipc.RemoteException;
 import org.apache.phoenix.compile.MutationPlan;
 import org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver;
 import org.apache.phoenix.coprocessor.MetaDataEndpointImpl;
@@ -2451,7 +2452,20 @@ public class ConnectionQueryServicesImpl extends DelegateQueryServices implement
                         setUpgradeRequired();
                     }
                 } catch (PhoenixIOException e) {
-                    if (!Iterables.isEmpty(Iterables.filter(Throwables.getCausalChain(e), AccessDeniedException.class))) {
+                    boolean foundAccessDeniedException = false;
+                    // when running spark/map reduce jobs the ADE might be wrapped
+                    // in a RemoteException
+                    for (Throwable t : Throwables.getCausalChain(e)) {
+                        if (t instanceof AccessDeniedException
+                                || (t instanceof RemoteException
+                                        && ((RemoteException) t).getClassName()
+                                                .equals(AccessDeniedException.class
+                                                        .getName()))) {
+                            foundAccessDeniedException = true;
+                            break;
+                        }
+                    }
+                    if (foundAccessDeniedException) {
                         // Pass
                         logger.warn("Could not check for Phoenix SYSTEM tables, assuming they exist and are properly configured");
                         checkClientServerCompatibility(SchemaUtil.getPhysicalName(SYSTEM_CATALOG_NAME_BYTES, getProps()).getName());



Build failed in Jenkins: Phoenix | Master #1950

2018-03-02 Thread Apache Jenkins Server
See 


Changes:

[gjacoby] PHOENIX-4607 - Allow PhoenixInputFormat to use tenant-specific

--
[...truncated 110.10 KB...]
[INFO] Running org.apache.phoenix.end2end.join.SortMergeJoinNoIndexIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 485.594 
s - in org.apache.phoenix.end2end.index.LocalMutableTxIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SubqueryIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 357.36 
s - in org.apache.phoenix.end2end.join.HashJoinNoIndexIT
[INFO] Tests run: 72, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 795.648 
s - in org.apache.phoenix.end2end.index.MutableIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SubqueryUsingSortMergeJoinIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 365.313 
s - in org.apache.phoenix.end2end.join.SortMergeJoinNoIndexIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 588.653 
s - in org.apache.phoenix.end2end.join.HashJoinGlobalIndexIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.391 s 
- in org.apache.phoenix.end2end.salted.SaltedTableIT
[INFO] Running org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.202 s 
- in org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Running org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.32 s - 
in org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Running org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.2 s - 
in org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Running org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 50.262 s 
- in org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Running org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.395 s 
- in org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Running org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 395.651 
s - in org.apache.phoenix.end2end.join.SubqueryIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 42.293 s 
- in org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Running org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 73.453 s 
- in org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Running org.apache.phoenix.tx.TransactionIT
[INFO] Running org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.68 s 
- in org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Running org.apache.phoenix.tx.TxCheckpointIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 102.279 
s - in org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 618.019 
s - in org.apache.phoenix.end2end.join.SortMergeJoinGlobalIndexIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 90.339 s 
- in org.apache.phoenix.tx.TransactionIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 791.612 
s - in org.apache.phoenix.end2end.join.HashJoinLocalIndexIT
[INFO] Running org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 269.186 
s - in org.apache.phoenix.end2end.join.SubqueryUsingSortMergeJoinIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.373 s 
- in org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 778.973 
s - in org.apache.phoenix.end2end.join.SortMergeJoinLocalIndexIT
[WARNING] Tests run: 52, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 
319.755 s - in org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 424.355 
s - in org.apache.phoenix.tx.TxCheckpointIT
[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Failures: 
[ERROR]   ConcurrentMutationsIT.testConcurrentDeletesAndUpsertValues:214 Expected to find PK in data table: (0,0)
[ERROR]   DefaultColumnValueIT.testDefaultIndexed:978
[ERROR]   RowValueConstructorIT.testRVCLastPkIsTable1stPkIndex:1584
[ERROR]   

Apache Phoenix - Timeout crawler - Build https://builds.apache.org/job/Phoenix-master/1950/

2018-03-02 Thread Apache Jenkins Server
[...truncated 50 lines...]

Build failed in Jenkins: Phoenix | 4.x-HBase-0.98 #1825

2018-03-02 Thread Apache Jenkins Server
See 


Changes:

[gjacoby] PHOENIX-4607 - Allow PhoenixInputFormat to use tenant-specific

--
[...truncated 104.69 KB...]
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 42.501 s 
- in org.apache.phoenix.end2end.index.txn.RollbackIT
[INFO] Running org.apache.phoenix.end2end.join.HashJoinCacheIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.665 s 
- in org.apache.phoenix.end2end.join.HashJoinCacheIT
[INFO] Running org.apache.phoenix.end2end.join.HashJoinGlobalIndexIT
[WARNING] Tests run: 12, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 
50.726 s - in org.apache.phoenix.end2end.index.ViewIndexIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 56.181 s 
- in org.apache.phoenix.end2end.index.txn.MutableRollbackIT
[INFO] Running org.apache.phoenix.end2end.join.HashJoinMoreIT
[INFO] Running org.apache.phoenix.end2end.join.HashJoinLocalIndexIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 316.167 
s - in org.apache.phoenix.end2end.index.LocalMutableTxIndexIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 41.802 s 
- in org.apache.phoenix.end2end.join.HashJoinMoreIT
[INFO] Running org.apache.phoenix.end2end.join.SortMergeJoinGlobalIndexIT
[INFO] Running org.apache.phoenix.end2end.join.HashJoinNoIndexIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 146.989 
s - in org.apache.phoenix.end2end.join.HashJoinNoIndexIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 242.277 
s - in org.apache.phoenix.end2end.join.HashJoinGlobalIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SortMergeJoinLocalIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SortMergeJoinNoIndexIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 255.155 
s - in org.apache.phoenix.end2end.join.SortMergeJoinGlobalIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SubqueryIT
[INFO] Tests run: 72, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 561.02 
s - in org.apache.phoenix.end2end.index.MutableIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SubqueryUsingSortMergeJoinIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 148.964 
s - in org.apache.phoenix.end2end.join.SortMergeJoinNoIndexIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.344 s 
- in org.apache.phoenix.end2end.salted.SaltedTableIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.19 s 
- in org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.526 s 
- in org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Running org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.446 s 
- in org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Running org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 132.216 
s - in org.apache.phoenix.end2end.join.SubqueryUsingSortMergeJoinIT
[INFO] Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 177.502 
s - in org.apache.phoenix.end2end.join.SubqueryIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 495.295 
s - in org.apache.phoenix.end2end.join.HashJoinLocalIndexIT
[INFO] Running org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 59.203 s 
- in org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Running org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.163 s 
- in org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Running org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.278 s 
- in org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Running org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] Running org.apache.phoenix.trace.PhoenixTraceReaderIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.154 s 
- in org.apache.phoenix.trace.PhoenixTraceReaderIT
[INFO] Running org.apache.phoenix.tx.TransactionIT
[INFO] Running org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 39.741 s 
- in org.apache.phoenix.tx.TransactionIT
[INFO] Running org.apache.phoenix.tx.TxCheckpointIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 82.112 s 
- in 

Apache Phoenix - Timeout crawler - Build https://builds.apache.org/job/Phoenix-4.x-HBase-0.98/1825/

2018-03-02 Thread Apache Jenkins Server
[...truncated 50 lines...]

Jenkins build is back to normal : Phoenix-4.x-HBase-1.3 #50

2018-03-02 Thread Apache Jenkins Server
See 




Apache-Phoenix | 4.x-HBase-1.2 | Build Successful

2018-03-02 Thread Apache Jenkins Server
4.x-HBase-1.2 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.2

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.2/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.2/lastCompletedBuild/testReport/

Changes
[gjacoby] PHOENIX-4635 HBase Connection leak in



Build times for last couple of runs. Latest build time is the rightmost. | Legend: blue: normal, red: test failure, gray: timeout


phoenix git commit: PHOENIX-4607 - Allow PhoenixInputFormat to use tenant-specific connections

2018-03-02 Thread gjacoby
Repository: phoenix
Updated Branches:
  refs/heads/5.x-HBase-2.0 d5b3faf2e -> 4f7949135


PHOENIX-4607 - Allow PhoenixInputFormat to use tenant-specific connections


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/4f794913
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/4f794913
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/4f794913

Branch: refs/heads/5.x-HBase-2.0
Commit: 4f7949135bf406a7bf903dc96201a03750bbd771
Parents: d5b3faf
Author: Geoffrey 
Authored: Tue Feb 20 14:28:40 2018 -0800
Committer: Geoffrey Jacoby 
Committed: Fri Mar 2 13:04:56 2018 -0800

--
 .../org/apache/phoenix/end2end/MapReduceIT.java | 69 +++-
 .../phoenix/mapreduce/PhoenixInputFormat.java   | 41 +++-
 .../util/PhoenixConfigurationUtil.java  | 15 -
 .../mapreduce/util/PhoenixMapReduceUtil.java|  5 +-
 4 files changed, 92 insertions(+), 38 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/4f794913/phoenix-core/src/it/java/org/apache/phoenix/end2end/MapReduceIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/MapReduceIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/MapReduceIT.java
index 68d9c9c..fb24bb2 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/MapReduceIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/MapReduceIT.java
@@ -30,11 +30,13 @@ import 
org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil;
 import org.apache.phoenix.mapreduce.util.PhoenixMapReduceUtil;
 import org.apache.phoenix.schema.types.PDouble;
 import org.apache.phoenix.schema.types.PhoenixArray;
+import org.apache.phoenix.util.PhoenixRuntime;
 import org.junit.Before;
 import org.junit.Test;
 
 import java.io.IOException;
 import java.sql.*;
+import java.util.Properties;
 
 import static org.junit.Assert.*;
 
@@ -50,12 +52,19 @@ public class MapReduceIT extends ParallelStatsDisabledIT {
 " STOCK_NAME VARCHAR NOT NULL , RECORDING_YEAR  INTEGER NOT  NULL, 
 RECORDINGS_QUARTER " +
 " DOUBLE array[] CONSTRAINT pk PRIMARY KEY ( STOCK_NAME, 
RECORDING_YEAR ))";
 
+private static final String CREATE_STOCK_VIEW = "CREATE VIEW IF NOT EXISTS 
%s (v1 VARCHAR) AS "
++ " SELECT * FROM %s WHERE RECORDING_YEAR = 2008";
+
 private static final String MAX_RECORDING = "MAX_RECORDING";
 private  String CREATE_STOCK_STATS_TABLE =
 "CREATE TABLE IF NOT EXISTS %s(STOCK_NAME VARCHAR NOT NULL , "
 + " MAX_RECORDING DOUBLE CONSTRAINT pk PRIMARY KEY 
(STOCK_NAME ))";
+
+
 private String UPSERT = "UPSERT into %s values (?, ?, ?)";
 
+private String TENANT_ID = "1234567890";
+
 @Before
 public void setupTables() throws Exception {
 
@@ -63,22 +72,28 @@ public class MapReduceIT extends ParallelStatsDisabledIT {
 
 @Test
 public void testNoConditionsOnSelect() throws Exception {
-Connection conn = DriverManager.getConnection(getUrl());
-String stockTableName = generateUniqueName();
-String stockStatsTableName = generateUniqueName();
-conn.createStatement().execute(String.format(CREATE_STOCK_TABLE, 
stockTableName));
-conn.createStatement().execute(String.format(CREATE_STOCK_STATS_TABLE, 
stockStatsTableName));
-conn.commit();
-final Configuration conf = getUtility().getConfiguration();
-Job job = Job.getInstance(conf);
-PhoenixMapReduceUtil.setInput(job, StockWritable.class, 
stockTableName, null,
-STOCK_NAME, RECORDING_YEAR, "0." + RECORDINGS_QUARTER);
-testJob(job, stockTableName, stockStatsTableName, 91.04);
+try (Connection conn = DriverManager.getConnection(getUrl())) {
+createAndTestJob(conn, null, 91.04, null);
+}
 }
 
 @Test
 public void testConditionsOnSelect() throws Exception {
-Connection conn = DriverManager.getConnection(getUrl());
+try (Connection conn = DriverManager.getConnection(getUrl())) {
+createAndTestJob(conn, RECORDING_YEAR + "  < 2009", 81.04, null);
+}
+}
+
+@Test
+public void testWithTenantId() throws Exception {
+try (Connection conn = DriverManager.getConnection(getUrl())){
+//tenant view will perform the same filter as the select 
conditions do in testConditionsOnSelect
+createAndTestJob(conn, null, 81.04, TENANT_ID);
+}
+
+}
+
+private void createAndTestJob(Connection conn, String s, double v, String 
tenantId) throws SQLException, IOException, InterruptedException, 
ClassNotFoundException {
 String stockTableName = generateUniqueName();
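Note: the diff above is truncated by the archive. As the diffstat suggests, PHOENIX-4607 wires a tenant id from the MapReduce job configuration through PhoenixConfigurationUtil into the connection that PhoenixInputFormat opens, so a job can read tenant-specific data (the new testWithTenantId in MapReduceIT reads through a tenant view carrying an equivalent filter to testConditionsOnSelect). A minimal sketch of what a tenant-specific Phoenix connection looks like is below; the class, method, and jdbcUrl names are placeholders, not code from the patch.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.util.Properties;

    import org.apache.phoenix.util.PhoenixRuntime;

    // Illustration only: open a Phoenix connection scoped to one tenant by setting the
    // TenantId connection property (PhoenixRuntime.TENANT_ID_ATTRIB).
    public final class TenantConnectionExample {
        public static Connection openTenantConnection(String jdbcUrl, String tenantId)
                throws SQLException {
            Properties props = new Properties();
            props.setProperty(PhoenixRuntime.TENANT_ID_ATTRIB, tenantId);
            return DriverManager.getConnection(jdbcUrl, props);
        }
    }

With such a connection, queries resolve against the tenant's views, which is how the test exercises the same filtering without putting conditions on the SELECT itself.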
 

phoenix git commit: PHOENIX-4607 - Allow PhoenixInputFormat to use tenant-specific connections

2018-03-02 Thread gjacoby
Repository: phoenix
Updated Branches:
  refs/heads/master 02816fe95 -> a35591c87


PHOENIX-4607 - Allow PhoenixInputFormat to use tenant-specific connections


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/a35591c8
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/a35591c8
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/a35591c8

Branch: refs/heads/master
Commit: a35591c876ddb338cdb8269e67aed92ed6e5d485
Parents: 02816fe
Author: Geoffrey 
Authored: Tue Feb 20 14:28:40 2018 -0800
Committer: Geoffrey Jacoby 
Committed: Fri Mar 2 13:00:13 2018 -0800

--
 .../org/apache/phoenix/end2end/MapReduceIT.java | 69 +++-
 .../phoenix/mapreduce/PhoenixInputFormat.java   | 41 +++-
 .../util/PhoenixConfigurationUtil.java  | 15 -
 .../mapreduce/util/PhoenixMapReduceUtil.java|  5 +-
 4 files changed, 92 insertions(+), 38 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/a35591c8/phoenix-core/src/it/java/org/apache/phoenix/end2end/MapReduceIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/MapReduceIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/MapReduceIT.java
index 68d9c9c..fb24bb2 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/MapReduceIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/MapReduceIT.java
@@ -30,11 +30,13 @@ import 
org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil;
 import org.apache.phoenix.mapreduce.util.PhoenixMapReduceUtil;
 import org.apache.phoenix.schema.types.PDouble;
 import org.apache.phoenix.schema.types.PhoenixArray;
+import org.apache.phoenix.util.PhoenixRuntime;
 import org.junit.Before;
 import org.junit.Test;
 
 import java.io.IOException;
 import java.sql.*;
+import java.util.Properties;
 
 import static org.junit.Assert.*;
 
@@ -50,12 +52,19 @@ public class MapReduceIT extends ParallelStatsDisabledIT {
 " STOCK_NAME VARCHAR NOT NULL , RECORDING_YEAR  INTEGER NOT  NULL, 
 RECORDINGS_QUARTER " +
 " DOUBLE array[] CONSTRAINT pk PRIMARY KEY ( STOCK_NAME, 
RECORDING_YEAR ))";
 
+private static final String CREATE_STOCK_VIEW = "CREATE VIEW IF NOT EXISTS 
%s (v1 VARCHAR) AS "
++ " SELECT * FROM %s WHERE RECORDING_YEAR = 2008";
+
 private static final String MAX_RECORDING = "MAX_RECORDING";
 private  String CREATE_STOCK_STATS_TABLE =
 "CREATE TABLE IF NOT EXISTS %s(STOCK_NAME VARCHAR NOT NULL , "
 + " MAX_RECORDING DOUBLE CONSTRAINT pk PRIMARY KEY 
(STOCK_NAME ))";
+
+
 private String UPSERT = "UPSERT into %s values (?, ?, ?)";
 
+private String TENANT_ID = "1234567890";
+
 @Before
 public void setupTables() throws Exception {
 
@@ -63,22 +72,28 @@ public class MapReduceIT extends ParallelStatsDisabledIT {
 
 @Test
 public void testNoConditionsOnSelect() throws Exception {
-Connection conn = DriverManager.getConnection(getUrl());
-String stockTableName = generateUniqueName();
-String stockStatsTableName = generateUniqueName();
-conn.createStatement().execute(String.format(CREATE_STOCK_TABLE, 
stockTableName));
-conn.createStatement().execute(String.format(CREATE_STOCK_STATS_TABLE, 
stockStatsTableName));
-conn.commit();
-final Configuration conf = getUtility().getConfiguration();
-Job job = Job.getInstance(conf);
-PhoenixMapReduceUtil.setInput(job, StockWritable.class, 
stockTableName, null,
-STOCK_NAME, RECORDING_YEAR, "0." + RECORDINGS_QUARTER);
-testJob(job, stockTableName, stockStatsTableName, 91.04);
+try (Connection conn = DriverManager.getConnection(getUrl())) {
+createAndTestJob(conn, null, 91.04, null);
+}
 }
 
 @Test
 public void testConditionsOnSelect() throws Exception {
-Connection conn = DriverManager.getConnection(getUrl());
+try (Connection conn = DriverManager.getConnection(getUrl())) {
+createAndTestJob(conn, RECORDING_YEAR + "  < 2009", 81.04, null);
+}
+}
+
+@Test
+public void testWithTenantId() throws Exception {
+try (Connection conn = DriverManager.getConnection(getUrl())){
+//tenant view will perform the same filter as the select 
conditions do in testConditionsOnSelect
+createAndTestJob(conn, null, 81.04, TENANT_ID);
+}
+
+}
+
+private void createAndTestJob(Connection conn, String s, double v, String 
tenantId) throws SQLException, IOException, InterruptedException, 
ClassNotFoundException {
 String stockTableName = generateUniqueName();
 String 

phoenix git commit: PHOENIX-4607 - Allow PhoenixInputFormat to use tenant-specific connections

2018-03-02 Thread gjacoby
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.2 fb8997963 -> 8e0c3d1c0


PHOENIX-4607 - Allow PhoenixInputFormat to use tenant-specific connections


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/8e0c3d1c
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/8e0c3d1c
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/8e0c3d1c

Branch: refs/heads/4.x-HBase-1.2
Commit: 8e0c3d1c025dca08cabf53038a0d7f302f0679e9
Parents: fb89979
Author: Geoffrey 
Authored: Tue Feb 20 14:28:40 2018 -0800
Committer: Geoffrey Jacoby 
Committed: Fri Mar 2 12:59:49 2018 -0800

--
 .../org/apache/phoenix/end2end/MapReduceIT.java | 69 +++-
 .../phoenix/mapreduce/PhoenixInputFormat.java   | 41 +++-
 .../util/PhoenixConfigurationUtil.java  | 15 -
 .../mapreduce/util/PhoenixMapReduceUtil.java|  5 +-
 4 files changed, 92 insertions(+), 38 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/8e0c3d1c/phoenix-core/src/it/java/org/apache/phoenix/end2end/MapReduceIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/MapReduceIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/MapReduceIT.java
index 68d9c9c..fb24bb2 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/MapReduceIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/MapReduceIT.java
@@ -30,11 +30,13 @@ import org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil;
 import org.apache.phoenix.mapreduce.util.PhoenixMapReduceUtil;
 import org.apache.phoenix.schema.types.PDouble;
 import org.apache.phoenix.schema.types.PhoenixArray;
+import org.apache.phoenix.util.PhoenixRuntime;
 import org.junit.Before;
 import org.junit.Test;
 
 import java.io.IOException;
 import java.sql.*;
+import java.util.Properties;
 
 import static org.junit.Assert.*;
 
@@ -50,12 +52,19 @@ public class MapReduceIT extends ParallelStatsDisabledIT {
 " STOCK_NAME VARCHAR NOT NULL , RECORDING_YEAR  INTEGER NOT  NULL, RECORDINGS_QUARTER " +
 " DOUBLE array[] CONSTRAINT pk PRIMARY KEY ( STOCK_NAME, RECORDING_YEAR ))";
 
+private static final String CREATE_STOCK_VIEW = "CREATE VIEW IF NOT EXISTS %s (v1 VARCHAR) AS "
++ " SELECT * FROM %s WHERE RECORDING_YEAR = 2008";
+
 private static final String MAX_RECORDING = "MAX_RECORDING";
 private  String CREATE_STOCK_STATS_TABLE =
 "CREATE TABLE IF NOT EXISTS %s(STOCK_NAME VARCHAR NOT NULL , "
 + " MAX_RECORDING DOUBLE CONSTRAINT pk PRIMARY KEY (STOCK_NAME ))";
+
+
 private String UPSERT = "UPSERT into %s values (?, ?, ?)";
 
+private String TENANT_ID = "1234567890";
+
 @Before
 public void setupTables() throws Exception {
 
@@ -63,22 +72,28 @@ public class MapReduceIT extends ParallelStatsDisabledIT {
 
 @Test
 public void testNoConditionsOnSelect() throws Exception {
-Connection conn = DriverManager.getConnection(getUrl());
-String stockTableName = generateUniqueName();
-String stockStatsTableName = generateUniqueName();
-conn.createStatement().execute(String.format(CREATE_STOCK_TABLE, stockTableName));
-conn.createStatement().execute(String.format(CREATE_STOCK_STATS_TABLE, stockStatsTableName));
-conn.commit();
-final Configuration conf = getUtility().getConfiguration();
-Job job = Job.getInstance(conf);
-PhoenixMapReduceUtil.setInput(job, StockWritable.class, stockTableName, null,
-STOCK_NAME, RECORDING_YEAR, "0." + RECORDINGS_QUARTER);
-testJob(job, stockTableName, stockStatsTableName, 91.04);
+try (Connection conn = DriverManager.getConnection(getUrl())) {
+createAndTestJob(conn, null, 91.04, null);
+}
 }
 
 @Test
 public void testConditionsOnSelect() throws Exception {
-Connection conn = DriverManager.getConnection(getUrl());
+try (Connection conn = DriverManager.getConnection(getUrl())) {
+createAndTestJob(conn, RECORDING_YEAR + "  < 2009", 81.04, null);
+}
+}
+
+@Test
+public void testWithTenantId() throws Exception {
+try (Connection conn = DriverManager.getConnection(getUrl())){
+//tenant view will perform the same filter as the select conditions do in testConditionsOnSelect
+createAndTestJob(conn, null, 81.04, TENANT_ID);
+}
+
+}
+
+private void createAndTestJob(Connection conn, String s, double v, String tenantId) throws SQLException, IOException, InterruptedException, ClassNotFoundException {
 String stockTableName = generateUniqueName();
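For context, the driver-side input setup that the new createAndTestJob(...) helper wraps follows the call removed above. A minimal, self-contained sketch is below; the table name and StockWritable stand-in are illustrative, and PHOENIX-4607 additionally propagates a tenant id through the job configuration (via the PhoenixConfigurationUtil changes in this commit) so PhoenixInputFormat can open a tenant-specific connection:

import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.db.DBWritable;
import org.apache.phoenix.mapreduce.util.PhoenixMapReduceUtil;

public class StockJobDriverSketch {

    // Minimal stand-in for the test's StockWritable.
    public static class StockWritable implements DBWritable {
        @Override
        public void write(PreparedStatement ps) throws SQLException { /* not used for input */ }
        @Override
        public void readFields(ResultSet rs) throws SQLException { /* map selected columns here */ }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "stock-max-recording");
        // Same shape as the call removed from testNoConditionsOnSelect():
        // writable class, table name, optional WHERE conditions (null = none), then columns.
        // "0." is the default column family prefix for the array column.
        PhoenixMapReduceUtil.setInput(job, StockWritable.class, "STOCK", null,
                "STOCK_NAME", "RECORDING_YEAR", "0.RECORDINGS_QUARTER");
    }
}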
 

phoenix git commit: PHOENIX-4635 HBase Connection leak in org.apache.phoenix.hive.mapreduce.PhoenixInputFormat

2018-03-02 Thread gjacoby
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.3 7b5bd5da9 -> 814afcb93


PHOENIX-4635 HBase Connection leak in 
org.apache.phoenix.hive.mapreduce.PhoenixInputFormat

Signed-off-by: Geoffrey Jacoby 


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/814afcb9
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/814afcb9
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/814afcb9

Branch: refs/heads/4.x-HBase-1.3
Commit: 814afcb93fce729fcf49f050cf15cd56897eec8a
Parents: 7b5bd5d
Author: Yechao Chen 
Authored: Fri Mar 2 09:53:04 2018 +0800
Committer: Geoffrey Jacoby 
Committed: Fri Mar 2 11:00:50 2018 -0800

--
 .../org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/814afcb9/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
--
diff --git a/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java b/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
index f0a5dd6..b550e32 100644
--- a/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
+++ b/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
@@ -150,7 +150,7 @@ public class PhoenixInputFormat implements InputFormat
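The commit subject names the issue: an HBase Connection opened in the Hive PhoenixInputFormat was not closed. A minimal sketch of the pattern such a fix typically follows (not the committed change itself), scoping the Connection and RegionLocator to a try-with-resources block; the table name is illustrative:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.RegionLocator;

public class ClosedConnectionSketch {

    // Both Connection and RegionLocator are Closeable; try-with-resources guarantees
    // they are released even if the lookup throws, which is what plugs the leak.
    static byte[][] regionStartKeys(Configuration conf, String tableName) throws IOException {
        try (Connection connection = ConnectionFactory.createConnection(conf);
             RegionLocator locator = connection.getRegionLocator(TableName.valueOf(tableName))) {
            return locator.getStartKeys();
        }
    }

    public static void main(String[] args) throws IOException {
        byte[][] keys = regionStartKeys(HBaseConfiguration.create(), "SYSTEM.CATALOG");
        System.out.println("regions: " + keys.length);
    }
}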

phoenix git commit: PHOENIX-4635 HBase Connection leak in org.apache.phoenix.hive.mapreduce.PhoenixInputFormat

2018-03-02 Thread gjacoby
Repository: phoenix
Updated Branches:
  refs/heads/5.x-HBase-2.0 ba1fd85dc -> d5b3faf2e


PHOENIX-4635 HBase Connection leak in 
org.apache.phoenix.hive.mapreduce.PhoenixInputFormat

Signed-off-by: Geoffrey Jacoby 


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/d5b3faf2
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/d5b3faf2
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/d5b3faf2

Branch: refs/heads/5.x-HBase-2.0
Commit: d5b3faf2ee03fd2964a60031bd965686242fe4fb
Parents: ba1fd85
Author: Yechao Chen 
Authored: Fri Mar 2 10:26:36 2018 +0800
Committer: Geoffrey Jacoby 
Committed: Fri Mar 2 10:50:11 2018 -0800

--
 .../org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/d5b3faf2/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
--
diff --git a/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java b/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
index ff15972..b4f96ee 100644
--- a/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
+++ b/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
@@ -151,7 +151,7 @@ public class PhoenixInputFormat implements InputFormat

phoenix git commit: PHOENIX-4635 HBase Connection leak in org.apache.phoenix.hive.mapreduce.PhoenixInputFormat

2018-03-02 Thread gjacoby
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.2 bb0813199 -> fb8997963


PHOENIX-4635 HBase Connection leak in 
org.apache.phoenix.hive.mapreduce.PhoenixInputFormat

Signed-off-by: Geoffrey Jacoby 


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/fb899796
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/fb899796
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/fb899796

Branch: refs/heads/4.x-HBase-1.2
Commit: fb89979632d6aa1c8569ff67c00bcdd93ca7bc98
Parents: bb08131
Author: Yechao Chen 
Authored: Fri Mar 2 09:53:04 2018 +0800
Committer: Geoffrey Jacoby 
Committed: Fri Mar 2 10:48:53 2018 -0800

--
 .../org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/fb899796/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
--
diff --git a/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java b/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
index f0a5dd6..b550e32 100644
--- a/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
+++ b/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
@@ -150,7 +150,7 @@ public class PhoenixInputFormat implements InputFormat

phoenix git commit: PHOENIX-4635 HBase Connection leak in org.apache.phoenix.hive.mapreduce.PhoenixInputFormat

2018-03-02 Thread gjacoby
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-0.98 1f484245a -> f9cc29989


PHOENIX-4635 HBase Connection leak in 
org.apache.phoenix.hive.mapreduce.PhoenixInputFormat

Signed-off-by: Geoffrey Jacoby 


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/f9cc2998
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/f9cc2998
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/f9cc2998

Branch: refs/heads/4.x-HBase-0.98
Commit: f9cc29989b11c47f6b79b0c802d72456c924472e
Parents: 1f48424
Author: Yechao Chen 
Authored: Fri Mar 2 10:03:11 2018 +0800
Committer: Geoffrey Jacoby 
Committed: Fri Mar 2 10:44:04 2018 -0800

--
 .../org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/f9cc2998/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
--
diff --git a/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java b/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
index b70c00f..794fcc4 100644
--- a/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
+++ b/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
@@ -154,7 +154,7 @@ public class PhoenixInputFormat implements InputFormat
 scans : qplan.getScans()) {
@@ -205,6 +205,7 @@ public class PhoenixInputFormat implements InputFormat

phoenix git commit: PHOENIX-4635 HBase Connection leak in org.apache.phoenix.hive.mapreduce.PhoenixInputFormat

2018-03-02 Thread gjacoby
Repository: phoenix
Updated Branches:
  refs/heads/master 1a226ed3e -> 02816fe95


PHOENIX-4635 HBase Connection leak in 
org.apache.phoenix.hive.mapreduce.PhoenixInputFormat

Signed-off-by: Geoffrey 


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/02816fe9
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/02816fe9
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/02816fe9

Branch: refs/heads/master
Commit: 02816fe95adff974e54fd7ace8851ba099ba0963
Parents: 1a226ed
Author: Yechao Chen 
Authored: Fri Mar 2 09:53:04 2018 +0800
Committer: Geoffrey 
Committed: Fri Mar 2 10:12:15 2018 -0800

--
 .../org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/02816fe9/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
--
diff --git a/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java b/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
index f0a5dd6..b550e32 100644
--- a/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
+++ b/phoenix-hive/src/main/java/org/apache/phoenix/hive/mapreduce/PhoenixInputFormat.java
@@ -150,7 +150,7 @@ public class PhoenixInputFormat implements InputFormat

Build failed in Jenkins: Phoenix Compile Compatibility with HBase #564

2018-03-02 Thread Apache Jenkins Server
See 


--
[...truncated 39.70 KB...]
[ERROR] :[364,5] method does not override or implement a method from a supertype
[ERROR] :[370,5] method does not override or implement a method from a supertype
[ERROR] :[376,5] method does not override or implement a method from a supertype
[ERROR] :[382,5] method does not override or implement a method from a supertype
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.0:compile (default-compile) on project phoenix-core: Compilation failure: Compilation failure: 
[ERROR] :[34,39] cannot find symbol
[ERROR]   symbol:   class MetricRegistry
[ERROR]   location: package org.apache.hadoop.hbase.metrics
[ERROR] :[144,16] cannot find symbol
[ERROR]   symbol:   class MetricRegistry
[ERROR]   location: class org.apache.phoenix.coprocessor.PhoenixMetaDataCoprocessorHost.PhoenixMetaDataControllerEnvironment
[ERROR] :[24,35] cannot find symbol
[ERROR]   symbol:   class DelegatingHBaseRpcController
[ERROR]   location: package org.apache.hadoop.hbase.ipc
[ERROR] :[25,35] cannot find symbol
[ERROR]   symbol:   class HBaseRpcController
[ERROR]   location: package org.apache.hadoop.hbase.ipc
[ERROR] :[37,37] cannot find symbol
[ERROR]   symbol: class DelegatingHBaseRpcController
[ERROR] :[56,38] cannot find symbol
[ERROR]   symbol:   class HBaseRpcController
[ERROR]   location: class org.apache.hadoop.hbase.ipc.controller.MetadataRpcController
[ERROR] :[26,35] cannot find symbol
[ERROR]   symbol:   class HBaseRpcController
[ERROR]   location: package org.apache.hadoop.hbase.ipc
[ERROR] :[40,12] cannot find symbol
[ERROR]   symbol:   class HBaseRpcController
[ERROR]   location: class org.apache.hadoop.hbase.ipc.controller.InterRegionServerMetadataRpcControllerFactory
[ERROR] :[46,12] cannot find symbol
[ERROR]   symbol:   class HBaseRpcController
[ERROR]   location: class org.apache.hadoop.hbase.ipc.controller.InterRegionServerMetadataRpcControllerFactory
[ERROR] :[52,12] cannot find symbol
[ERROR]   symbol:   class HBaseRpcController
[ERROR]   location: class org.apache.hadoop.hbase.ipc.controller.InterRegionServerMetadataRpcControllerFactory
[ERROR] :[57,46] cannot