Jenkins build is back to normal : Phoenix | 4.0 | Hadoop1 #183

2014-06-08 Thread Apache Jenkins Server
See 



git commit: Kick build

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/4.0 9d5b0e6ab -> 84930e3e9


Kick build


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/84930e3e
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/84930e3e
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/84930e3e

Branch: refs/heads/4.0
Commit: 84930e3e97e889c5a713aaecfc211b6348897ad5
Parents: 9d5b0e6
Author: James Taylor 
Authored: Sun Jun 8 23:26:10 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 23:26:10 2014 -0700

--
 BUILDING | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/84930e3e/BUILDING
--
diff --git a/BUILDING b/BUILDING
index aac954c..74c6294 100644
--- a/BUILDING
+++ b/BUILDING
@@ -22,7 +22,7 @@
 # Building Apache Phoenix
 =
 
-Phoenix uses Maven (3.X) to build all its necessary resources. 
+Phoenix uses Maven (3.X) to build all its necessary resources.
 
 ## Building from source
 ===



Jenkins build is back to normal : Phoenix | 3.0 | Hadoop1 #117

2014-06-08 Thread Apache Jenkins Server
See 



Build failed in Jenkins: Phoenix | 4.0 | Hadoop1 #182

2014-06-08 Thread Apache Jenkins Server
See 

Changes:

[jtaylor] PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL

--
[...truncated 347 lines...]
Tests run: 136, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 80.737 sec - 
in org.apache.phoenix.end2end.CaseStatementIT
Running org.apache.phoenix.end2end.KeyOnlyIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.296 sec - in 
org.apache.phoenix.end2end.KeyOnlyIT
Running org.apache.phoenix.end2end.DefaultParallelIteratorsRegionSplitterIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.1 sec - in 
org.apache.phoenix.end2end.SkipRangeParallelIteratorRegionSplitterIT
Running org.apache.phoenix.end2end.SequenceIT
Tests run: 152, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 82.308 sec - 
in org.apache.phoenix.end2end.GroupByIT
Running org.apache.phoenix.end2end.ToNumberFunctionIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.371 sec - in 
org.apache.phoenix.end2end.DefaultParallelIteratorsRegionSplitterIT
Running org.apache.phoenix.end2end.QueryDatabaseMetaDataIT
Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.998 sec - in 
org.apache.phoenix.end2end.ToNumberFunctionIT
Running org.apache.phoenix.end2end.NotQueryIT
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.65 sec - in 
org.apache.phoenix.end2end.SequenceIT
Running org.apache.phoenix.end2end.VariableLengthPKIT
Tests run: 49, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.417 sec - 
in org.apache.phoenix.end2end.VariableLengthPKIT
Running org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.234 sec - in 
org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
Running org.apache.phoenix.end2end.salted.SaltedTableIT
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 44.877 sec - 
in org.apache.phoenix.end2end.QueryDatabaseMetaDataIT
Running org.apache.phoenix.end2end.RowValueConstructorIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.308 sec - in 
org.apache.phoenix.end2end.salted.SaltedTableIT
Running org.apache.phoenix.end2end.FunkyNamesIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.195 sec - in 
org.apache.phoenix.end2end.FunkyNamesIT
Running org.apache.phoenix.end2end.UpsertSelectIT
Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.595 sec - 
in org.apache.phoenix.end2end.RowValueConstructorIT
Running org.apache.phoenix.end2end.CastAndCoerceIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.601 sec - 
in org.apache.phoenix.end2end.UpsertSelectIT
Running org.apache.phoenix.end2end.ReadIsolationLevelIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.186 sec - in 
org.apache.phoenix.end2end.ReadIsolationLevelIT
Running org.apache.phoenix.end2end.MultiCfQueryExecIT
Tests run: 144, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 82.803 sec - 
in org.apache.phoenix.end2end.NotQueryIT
Running org.apache.phoenix.end2end.NativeHBaseTypesIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.889 sec - in 
org.apache.phoenix.end2end.NativeHBaseTypesIT
Running org.apache.phoenix.end2end.ColumnProjectionOptimizationIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.11 sec - in 
org.apache.phoenix.end2end.MultiCfQueryExecIT
Running org.apache.phoenix.end2end.PercentileIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.846 sec - in 
org.apache.phoenix.end2end.ColumnProjectionOptimizationIT
Running org.apache.phoenix.end2end.CreateTableIT
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.004 sec - 
in org.apache.phoenix.end2end.PercentileIT
Running org.apache.phoenix.end2end.OrderByIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 3.655 sec - in 
org.apache.phoenix.end2end.OrderByIT
Running org.apache.phoenix.end2end.ProductMetricsIT
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.786 sec - 
in org.apache.phoenix.end2end.CreateTableIT
Running org.apache.phoenix.end2end.DistinctCountIT
Tests run: 61, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 29.038 sec - 
in org.apache.phoenix.end2end.ProductMetricsIT
Running org.apache.phoenix.end2end.SpooledOrderByIT
Tests run: 124, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 79.237 sec - 
in org.apache.phoenix.end2end.CastAndCoerceIT
Running org.apache.phoenix.end2end.InMemoryOrderByIT
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.654 sec - in 
org.apache.phoenix.end2end.DistinctCountIT
Running org.apache.phoenix.end2end.TruncateFunctionIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 3.75 sec - in 
org.apache.phoenix.end2end.SpooledOrderByIT
Running org.apache.phoenix.end2end.TenantSpecificTablesDMLIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time el

git commit: PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/3.0 05f8bf056 -> e18893f9e


PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/e18893f9
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/e18893f9
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/e18893f9

Branch: refs/heads/3.0
Commit: e18893f9ef3ff44b323e948ccada84a02994b2ff
Parents: 05f8bf0
Author: James Taylor 
Authored: Sun Jun 8 22:51:05 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 22:52:24 2014 -0700

--
 .../it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java| 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/e18893f9/phoenix-core/src/it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java
index 0611c33..c2bf5c5 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java
@@ -126,7 +126,7 @@ public class ExecuteStatementsIT extends BaseHBaseManagedTimeIT {
 String query = "create table " + tableName +
 "(a_id integer not null, \n" + 
 "a_string char(10) not null, \n" +
-"b_string char(8) not null \n" + 
+"b_string char(8)\n" + 
 "CONSTRAINT my_pk PRIMARY KEY (a_id, a_string))";
 
 



git commit: PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/4.0 f3ed1a571 -> 9d5b0e6ab


PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/9d5b0e6a
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/9d5b0e6a
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/9d5b0e6a

Branch: refs/heads/4.0
Commit: 9d5b0e6aba76be8e5a75695b553e449697f6d0f8
Parents: f3ed1a5
Author: James Taylor 
Authored: Sun Jun 8 22:51:05 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 22:52:06 2014 -0700

--
 .../it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java| 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/9d5b0e6a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java
index 0611c33..c2bf5c5 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java
@@ -126,7 +126,7 @@ public class ExecuteStatementsIT extends BaseHBaseManagedTimeIT {
 String query = "create table " + tableName +
 "(a_id integer not null, \n" + 
 "a_string char(10) not null, \n" +
-"b_string char(8) not null \n" + 
+"b_string char(8)\n" + 
 "CONSTRAINT my_pk PRIMARY KEY (a_id, a_string))";
 
 



git commit: PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/master 455c1fdb8 -> 6098d7bd0


PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/6098d7bd
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/6098d7bd
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/6098d7bd

Branch: refs/heads/master
Commit: 6098d7bd07e1bfdf5f015b55d77e987c5a7cba04
Parents: 455c1fd
Author: James Taylor 
Authored: Sun Jun 8 22:51:05 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 22:51:05 2014 -0700

--
 .../it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java| 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/6098d7bd/phoenix-core/src/it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java
index 0611c33..c2bf5c5 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ExecuteStatementsIT.java
@@ -126,7 +126,7 @@ public class ExecuteStatementsIT extends BaseHBaseManagedTimeIT {
 String query = "create table " + tableName +
 "(a_id integer not null, \n" + 
 "a_string char(10) not null, \n" +
-"b_string char(8) not null \n" + 
+"b_string char(8)\n" + 
 "CONSTRAINT my_pk PRIMARY KEY (a_id, a_string))";
 
 



Build failed in Jenkins: Phoenix | 4.0 | Hadoop1 #181

2014-06-08 Thread Apache Jenkins Server
See 

Changes:

[jtaylor] PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL

--
[...truncated 355 lines...]
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.875 sec - in 
org.apache.phoenix.end2end.DistinctCountIT
Running org.apache.phoenix.end2end.QueryDatabaseMetaDataIT
Tests run: 144, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 82.133 sec - 
in org.apache.phoenix.end2end.NotQueryIT
Running org.apache.phoenix.end2end.StatsManagerIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.102 sec - in 
org.apache.phoenix.end2end.StatsManagerIT
Running org.apache.phoenix.end2end.TenantSpecificTablesDMLIT
Tests run: 216, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 116.996 sec - 
in org.apache.phoenix.end2end.ClientTimeArithmeticQueryIT
Running org.apache.phoenix.end2end.KeyOnlyIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.043 sec - in 
org.apache.phoenix.end2end.KeyOnlyIT
Running org.apache.phoenix.end2end.CompareDecimalToLongIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.953 sec - in 
org.apache.phoenix.end2end.CompareDecimalToLongIT
Running org.apache.phoenix.end2end.CaseStatementIT
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 35.335 sec - 
in org.apache.phoenix.end2end.TenantSpecificTablesDMLIT
Running org.apache.phoenix.end2end.DynamicColumnIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.954 sec - in 
org.apache.phoenix.end2end.DynamicColumnIT
Running org.apache.phoenix.end2end.TruncateFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.769 sec - in 
org.apache.phoenix.end2end.TruncateFunctionIT
Running org.apache.phoenix.end2end.ArrayIT
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.842 sec - 
in org.apache.phoenix.end2end.QueryDatabaseMetaDataIT
Running org.apache.phoenix.end2end.GroupByIT
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 42.982 sec - 
in org.apache.phoenix.end2end.ArrayIT
Running org.apache.phoenix.end2end.NativeHBaseTypesIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.076 sec - in 
org.apache.phoenix.end2end.NativeHBaseTypesIT
Running org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.305 sec - in 
org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
Running org.apache.phoenix.end2end.salted.SaltedTableIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.242 sec - in 
org.apache.phoenix.end2end.salted.SaltedTableIT
Running org.apache.phoenix.end2end.CastAndCoerceIT
Tests run: 136, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 121.32 sec - 
in org.apache.phoenix.end2end.CaseStatementIT
Running org.apache.phoenix.end2end.ReadIsolationLevelIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.172 sec - in 
org.apache.phoenix.end2end.ReadIsolationLevelIT
Running org.apache.phoenix.end2end.QueryIT
Tests run: 152, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 134.216 sec - 
in org.apache.phoenix.end2end.GroupByIT
Running org.apache.phoenix.end2end.ProductMetricsIT
Tests run: 61, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 53.087 sec - 
in org.apache.phoenix.end2end.ProductMetricsIT
Running org.apache.phoenix.end2end.SkipRangeParallelIteratorRegionSplitterIT
Tests run: 124, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 108.726 sec - 
in org.apache.phoenix.end2end.CastAndCoerceIT
Running org.apache.phoenix.end2end.InMemoryOrderByIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.455 sec - in 
org.apache.phoenix.end2end.SkipRangeParallelIteratorRegionSplitterIT
Running org.apache.phoenix.end2end.UpsertValuesIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 18.282 sec - in 
org.apache.phoenix.end2end.InMemoryOrderByIT
Running org.apache.phoenix.end2end.CreateTableIT
Tests run: 100, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 103.437 sec - 
in org.apache.phoenix.end2end.QueryIT
Running org.apache.phoenix.end2end.ToNumberFunctionIT
Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.28 sec - in 
org.apache.phoenix.end2end.ToNumberFunctionIT
Running org.apache.phoenix.end2end.ToCharFunctionIT
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.585 sec - in 
org.apache.phoenix.end2end.ToCharFunctionIT
Running org.apache.phoenix.end2end.DerivedTableIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.067 sec - 
in org.apache.phoenix.end2end.DerivedTableIT
Running org.apache.phoenix.end2end.DefaultParallelIteratorsRegionSplitterIT
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 57.399 sec - 
in org.apache.phoenix.end2end.UpsertValuesIT
Running org.apache.phoenix.end2end.SequenceIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time ela

Build failed in Jenkins: Phoenix | 3.0 | Hadoop1 #116

2014-06-08 Thread Apache Jenkins Server
See 

Changes:

[jtaylor] PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL

--
[...truncated 357 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.75 sec - in 
org.apache.phoenix.end2end.MD5FunctionIT
Running org.apache.phoenix.end2end.index.MutableIndexIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.224 sec - in 
org.apache.phoenix.end2end.TenantSpecificViewIndexIT
Running org.apache.phoenix.end2end.index.ImmutableIndexIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.845 sec - in 
org.apache.phoenix.end2end.index.ImmutableIndexIT
Running org.apache.phoenix.end2end.index.DropViewIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.48 sec - in 
org.apache.phoenix.end2end.index.DropViewIT
Running org.apache.phoenix.end2end.index.IndexMetadataIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 58.831 sec - in 
org.apache.phoenix.end2end.index.SaltedIndexIT
Running org.apache.phoenix.end2end.TimezoneOffsetFunctionIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.99 sec - in 
org.apache.phoenix.end2end.TimezoneOffsetFunctionIT
Running org.apache.phoenix.end2end.UpsertSelectAutoCommitIT
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 109.6 sec - in 
org.apache.phoenix.end2end.HashJoinIT
Running org.apache.phoenix.end2end.BinaryRowKeyIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.919 sec - in 
org.apache.phoenix.end2end.BinaryRowKeyIT
Running org.apache.phoenix.end2end.ReverseFunctionIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.984 sec - in 
org.apache.phoenix.end2end.UpsertSelectAutoCommitIT
Running org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.996 sec - in 
org.apache.phoenix.end2end.ReverseFunctionIT
Running org.apache.phoenix.end2end.SpillableGroupByIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 44.45 sec - in 
org.apache.phoenix.end2end.index.IndexMetadataIT
Running org.apache.phoenix.end2end.SkipScanQueryIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.867 sec - in 
org.apache.phoenix.end2end.SpillableGroupByIT
Running org.apache.phoenix.end2end.DecodeFunctionIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.147 sec - in 
org.apache.phoenix.end2end.DecodeFunctionIT
Running org.apache.phoenix.end2end.TenantSpecificViewIndexSaltedIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.583 sec - in 
org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
Running org.apache.phoenix.end2end.AutoCommitIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.248 sec - in 
org.apache.phoenix.end2end.AutoCommitIT
Running org.apache.phoenix.end2end.SaltedViewIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.671 sec - in 
org.apache.phoenix.end2end.SkipScanQueryIT
Running org.apache.phoenix.end2end.ServerExceptionIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.429 sec - in 
org.apache.phoenix.end2end.TenantSpecificViewIndexSaltedIT
Running org.apache.phoenix.end2end.ViewIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.815 sec - in 
org.apache.phoenix.end2end.ServerExceptionIT
Running org.apache.phoenix.end2end.CSVCommonsLoaderIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.937 sec - in 
org.apache.phoenix.end2end.SaltedViewIT
Running org.apache.phoenix.end2end.QueryPlanIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.912 sec - in 
org.apache.phoenix.end2end.ViewIT
Running org.apache.phoenix.end2end.AlterTableIT
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.755 sec - 
in org.apache.phoenix.end2end.CSVCommonsLoaderIT
Running org.apache.phoenix.end2end.ArithmeticQueryIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.137 sec - in 
org.apache.phoenix.end2end.QueryPlanIT
Running org.apache.phoenix.end2end.UpsertBigValuesIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.719 sec - in 
org.apache.phoenix.end2end.ArithmeticQueryIT
Running org.apache.phoenix.end2end.QueryExecWithoutSCNIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.824 sec - in 
org.apache.phoenix.end2end.UpsertBigValuesIT
Running org.apache.phoenix.end2end.DeleteIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.927 sec - in 
org.apache.phoenix.end2end.QueryExecWithoutSCNIT
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 206.669 sec - 
in org.apache.phoenix.end2end.index.MutableIndexIT
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 78.915 sec - 
in org.apache.phoenix.end2end.AlterTableIT
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 39.28 sec 

git commit: PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/4.0 4cab22ce0 -> f3ed1a571


PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/f3ed1a57
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/f3ed1a57
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/f3ed1a57

Branch: refs/heads/4.0
Commit: f3ed1a571482c5d11310ec92109e8e576fcb3e38
Parents: 4cab22c
Author: James Taylor 
Authored: Sun Jun 8 22:13:31 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 22:14:15 2014 -0700

--
 .../java/org/apache/phoenix/end2end/ToCharFunctionIT.java | 10 +-
 .../org/apache/phoenix/end2end/ToNumberFunctionIT.java|  8 
 2 files changed, 9 insertions(+), 9 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/f3ed1a57/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToCharFunctionIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToCharFunctionIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToCharFunctionIT.java
index 641ddc4..13d6bb7 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToCharFunctionIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToCharFunctionIT.java
@@ -67,11 +67,11 @@ public class ToCharFunctionIT extends BaseClientManagedTimeIT {
 
 public static final String TO_CHAR_TABLE_DDL = "create table " + 
TO_CHAR_TABLE_NAME +
 "(pk integer not null, \n" + 
-"col_date date not null, \n" +
-"col_time date not null, \n" +
-"col_timestamp timestamp not null, \n" +
-"col_integer integer not null, \n" + 
-"col_decimal decimal not null \n" + 
+"col_date date, \n" +
+"col_time date, \n" +
+"col_timestamp timestamp, \n" +
+"col_integer integer, \n" + 
+"col_decimal decimal\n" + 
 "CONSTRAINT my_pk PRIMARY KEY (pk))";
 
 @Before

http://git-wip-us.apache.org/repos/asf/phoenix/blob/f3ed1a57/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToNumberFunctionIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToNumberFunctionIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToNumberFunctionIT.java
index 0fbb23d..b087985 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToNumberFunctionIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToNumberFunctionIT.java
@@ -60,10 +60,10 @@ public class ToNumberFunctionIT extends BaseClientManagedTimeIT {
 public static final String TO_NUMBER_TABLE_DDL = "create table " + 
TO_NUMBER_TABLE_NAME +
 "(a_id integer not null, \n" + 
 "a_string char(4) not null, \n" +
-"b_string char(4) not null, \n" + 
-"a_date date not null, \n" + 
-"a_time date not null, \n" + 
-"a_timestamp timestamp not null \n" + 
+"b_string char(4), \n" + 
+"a_date date, \n" + 
+"a_time date, \n" + 
+"a_timestamp timestamp \n" + 
 "CONSTRAINT my_pk PRIMARY KEY (a_id, a_string))";
 
 private Date row1Date;



git commit: PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/master a56f78ba6 -> 455c1fdb8


PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/455c1fdb
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/455c1fdb
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/455c1fdb

Branch: refs/heads/master
Commit: 455c1fdb827cd8e6a8d4acd7e6e77ea800329190
Parents: a56f78b
Author: James Taylor 
Authored: Sun Jun 8 22:13:31 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 22:14:44 2014 -0700

--
 .../java/org/apache/phoenix/end2end/ToCharFunctionIT.java | 10 +-
 .../org/apache/phoenix/end2end/ToNumberFunctionIT.java|  8 
 2 files changed, 9 insertions(+), 9 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/455c1fdb/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToCharFunctionIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToCharFunctionIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToCharFunctionIT.java
index 641ddc4..13d6bb7 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToCharFunctionIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToCharFunctionIT.java
@@ -67,11 +67,11 @@ public class ToCharFunctionIT extends BaseClientManagedTimeIT {
 
 public static final String TO_CHAR_TABLE_DDL = "create table " + 
TO_CHAR_TABLE_NAME +
 "(pk integer not null, \n" + 
-"col_date date not null, \n" +
-"col_time date not null, \n" +
-"col_timestamp timestamp not null, \n" +
-"col_integer integer not null, \n" + 
-"col_decimal decimal not null \n" + 
+"col_date date, \n" +
+"col_time date, \n" +
+"col_timestamp timestamp, \n" +
+"col_integer integer, \n" + 
+"col_decimal decimal\n" + 
 "CONSTRAINT my_pk PRIMARY KEY (pk))";
 
 @Before

http://git-wip-us.apache.org/repos/asf/phoenix/blob/455c1fdb/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToNumberFunctionIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToNumberFunctionIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToNumberFunctionIT.java
index 0fbb23d..b087985 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToNumberFunctionIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToNumberFunctionIT.java
@@ -60,10 +60,10 @@ public class ToNumberFunctionIT extends BaseClientManagedTimeIT {
 public static final String TO_NUMBER_TABLE_DDL = "create table " + 
TO_NUMBER_TABLE_NAME +
 "(a_id integer not null, \n" + 
 "a_string char(4) not null, \n" +
-"b_string char(4) not null, \n" + 
-"a_date date not null, \n" + 
-"a_time date not null, \n" + 
-"a_timestamp timestamp not null \n" + 
+"b_string char(4), \n" + 
+"a_date date, \n" + 
+"a_time date, \n" + 
+"a_timestamp timestamp \n" + 
 "CONSTRAINT my_pk PRIMARY KEY (a_id, a_string))";
 
 private Date row1Date;



git commit: PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/3.0 4281e7cea -> 05f8bf056


PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/05f8bf05
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/05f8bf05
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/05f8bf05

Branch: refs/heads/3.0
Commit: 05f8bf056a8295d3634c5f8329208a7ee35e1a22
Parents: 4281e7c
Author: James Taylor 
Authored: Sun Jun 8 22:13:31 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 22:13:31 2014 -0700

--
 .../java/org/apache/phoenix/end2end/ToCharFunctionIT.java | 10 +-
 .../org/apache/phoenix/end2end/ToNumberFunctionIT.java|  8 
 2 files changed, 9 insertions(+), 9 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/05f8bf05/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToCharFunctionIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToCharFunctionIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToCharFunctionIT.java
index 641ddc4..13d6bb7 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToCharFunctionIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToCharFunctionIT.java
@@ -67,11 +67,11 @@ public class ToCharFunctionIT extends BaseClientManagedTimeIT {
 
 public static final String TO_CHAR_TABLE_DDL = "create table " + 
TO_CHAR_TABLE_NAME +
 "(pk integer not null, \n" + 
-"col_date date not null, \n" +
-"col_time date not null, \n" +
-"col_timestamp timestamp not null, \n" +
-"col_integer integer not null, \n" + 
-"col_decimal decimal not null \n" + 
+"col_date date, \n" +
+"col_time date, \n" +
+"col_timestamp timestamp, \n" +
+"col_integer integer, \n" + 
+"col_decimal decimal\n" + 
 "CONSTRAINT my_pk PRIMARY KEY (pk))";
 
 @Before

http://git-wip-us.apache.org/repos/asf/phoenix/blob/05f8bf05/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToNumberFunctionIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToNumberFunctionIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToNumberFunctionIT.java
index 0fbb23d..b087985 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToNumberFunctionIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/ToNumberFunctionIT.java
@@ -60,10 +60,10 @@ public class ToNumberFunctionIT extends BaseClientManagedTimeIT {
 public static final String TO_NUMBER_TABLE_DDL = "create table " + 
TO_NUMBER_TABLE_NAME +
 "(a_id integer not null, \n" + 
 "a_string char(4) not null, \n" +
-"b_string char(4) not null, \n" + 
-"a_date date not null, \n" + 
-"a_time date not null, \n" + 
-"a_timestamp timestamp not null \n" + 
+"b_string char(4), \n" + 
+"a_date date, \n" + 
+"a_time date, \n" + 
+"a_timestamp timestamp \n" + 
 "CONSTRAINT my_pk PRIMARY KEY (a_id, a_string))";
 
 private Date row1Date;



Build failed in Jenkins: Phoenix | 4.0 | Hadoop1 #180

2014-06-08 Thread Apache Jenkins Server
See 

Changes:

[jtaylor] PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL (Ravi)

--
[...truncated 833 lines...]
Tests run: 100, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 77.638 sec - 
in org.apache.phoenix.end2end.QueryIT
Running org.apache.phoenix.end2end.SkipRangeParallelIteratorRegionSplitterIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.134 sec - in 
org.apache.phoenix.end2end.TopNIT
Running org.apache.phoenix.end2end.UpsertValuesIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.042 sec - in 
org.apache.phoenix.end2end.SkipRangeParallelIteratorRegionSplitterIT
Running org.apache.phoenix.end2end.TenantSpecificTablesDMLIT
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.278 sec - 
in org.apache.phoenix.end2end.TenantSpecificTablesDMLIT
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 35.72 sec - in 
org.apache.phoenix.end2end.UpsertValuesIT
Running org.apache.phoenix.end2end.CustomEntityDataIT
Running org.apache.phoenix.end2end.DynamicUpsertIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.871 sec - in 
org.apache.phoenix.end2end.DynamicUpsertIT
Running org.apache.phoenix.end2end.ClientTimeArithmeticQueryIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.188 sec - in 
org.apache.phoenix.end2end.CustomEntityDataIT
Running org.apache.phoenix.end2end.MultiCfQueryExecIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.636 sec - in 
org.apache.phoenix.end2end.MultiCfQueryExecIT
Running org.apache.phoenix.end2end.InMemoryOrderByIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 5.852 sec - in 
org.apache.phoenix.end2end.InMemoryOrderByIT
Running org.apache.phoenix.end2end.ArrayIT
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.649 sec - 
in org.apache.phoenix.end2end.QueryDatabaseMetaDataIT
Running org.apache.phoenix.end2end.DerivedTableIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.714 sec - 
in org.apache.phoenix.end2end.DerivedTableIT
Running org.apache.phoenix.end2end.GroupByIT
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.856 sec - 
in org.apache.phoenix.end2end.ArrayIT
Running org.apache.phoenix.end2end.NotQueryIT
Tests run: 216, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 142.868 sec - 
in org.apache.phoenix.end2end.ClientTimeArithmeticQueryIT
Running org.apache.phoenix.end2end.KeyOnlyIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.032 sec - in 
org.apache.phoenix.end2end.KeyOnlyIT
Running org.apache.phoenix.end2end.DynamicColumnIT
Tests run: 152, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 104.581 sec - 
in org.apache.phoenix.end2end.GroupByIT
Running org.apache.phoenix.end2end.RowValueConstructorIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.667 sec - in 
org.apache.phoenix.end2end.DynamicColumnIT
Running org.apache.phoenix.end2end.SequenceIT
Tests run: 144, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 104.758 sec - 
in org.apache.phoenix.end2end.NotQueryIT
Running org.apache.phoenix.end2end.ColumnProjectionOptimizationIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.183 sec - in 
org.apache.phoenix.end2end.ColumnProjectionOptimizationIT
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.172 sec - 
in org.apache.phoenix.end2end.SequenceIT
Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 36.592 sec - 
in org.apache.phoenix.end2end.RowValueConstructorIT

Results :

Tests in error: 
  
ToCharFunctionIT.initTable:83->BaseTest.createTestTable:610->BaseTest.createTestTable:636
 ? SQL
  
ToCharFunctionIT.initTable:83->BaseTest.createTestTable:610->BaseTest.createTestTable:636
 ? SQL
  
ToCharFunctionIT.initTable:83->BaseTest.createTestTable:610->BaseTest.createTestTable:636
 ? SQL
  
ToCharFunctionIT.initTable:83->BaseTest.createTestTable:610->BaseTest.createTestTable:636
 ? SQL
  
ToCharFunctionIT.initTable:83->BaseTest.createTestTable:610->BaseTest.createTestTable:636
 ? SQL
  
ToCharFunctionIT.initTable:83->BaseTest.createTestTable:610->BaseTest.createTestTable:636
 ? SQL
  
ToCharFunctionIT.initTable:83->BaseTest.createTestTable:610->BaseTest.createTestTable:636
 ? SQL
  
ToCharFunctionIT.initTable:83->BaseTest.createTestTable:610->BaseTest.createTestTable:636
 ? SQL
  
ToCharFunctionIT.initTable:83->BaseTest.createTestTable:610->BaseTest.createTestTable:636
 ? SQL
  
ToCharFunctionIT.initTable:83->BaseTest.createTestTable:610->BaseTest.createTestTable:636
 ? SQL
  
ToNumberFunctionIT.initTable:93->BaseTest.createTestTable:610->BaseTest.createTestTable:636
 ? SQL
  
ToNumberFunctionIT.initTable:93->BaseTest.createTestTable:610->BaseTest.createTestTable:636
 ? SQL
  
ToNumberFunctionIT.initTable:93->BaseTest.createTestTable:610->BaseTest.createTestTable:636
 

Build failed in Jenkins: Phoenix | 3.0 | Hadoop1 #115

2014-06-08 Thread Apache Jenkins Server
See 

Changes:

[jtaylor] PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL (Ravi)

--
[...truncated 835 lines...]
at 
org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
at 
org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:236)
at 
org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:952)
at org.apache.phoenix.query.BaseTest.createTestTable(BaseTest.java:624)
at org.apache.phoenix.query.BaseTest.createTestTable(BaseTest.java:598)
at 
org.apache.phoenix.end2end.ToCharFunctionIT.initTable(ToCharFunctionIT.java:83)

testDecimalProjection(org.apache.phoenix.end2end.ToCharFunctionIT)  Time 
elapsed: 0.063 sec  <<< ERROR!
java.sql.SQLException: ERROR 517 (42895): Invalid not null constraint on non 
primary key column columnName=COL_DATE
at 
org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:310)
at 
org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:133)
at 
org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:948)
at 
org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
at 
org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
at 
org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:246)
at 
org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:237)
at 
org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
at 
org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:236)
at 
org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:952)
at org.apache.phoenix.query.BaseTest.createTestTable(BaseTest.java:624)
at org.apache.phoenix.query.BaseTest.createTestTable(BaseTest.java:598)
at 
org.apache.phoenix.end2end.ToCharFunctionIT.initTable(ToCharFunctionIT.java:83)

testDecimalFilter(org.apache.phoenix.end2end.ToCharFunctionIT)  Time elapsed: 
0.069 sec  <<< ERROR!
java.sql.SQLException: ERROR 517 (42895): Invalid not null constraint on non 
primary key column columnName=COL_DATE
at 
org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:310)
at 
org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:133)
at 
org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:948)
at 
org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
at 
org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
at 
org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:246)
at 
org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:237)
at 
org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
at 
org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:236)
at 
org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:952)
at org.apache.phoenix.query.BaseTest.createTestTable(BaseTest.java:624)
at org.apache.phoenix.query.BaseTest.createTestTable(BaseTest.java:598)
at 
org.apache.phoenix.end2end.ToCharFunctionIT.initTable(ToCharFunctionIT.java:83)

testDateFilter(org.apache.phoenix.end2end.ToCharFunctionIT)  Time elapsed: 
0.095 sec  <<< ERROR!
java.sql.SQLException: ERROR 517 (42895): Invalid not null constraint on non 
primary key column columnName=COL_DATE
at 
org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:310)
at 
org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:133)
at 
org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:948)
at 
org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
at 
org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
at 
org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:246)
at 
org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:237)
at 
org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
at 
org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:236)
at 
org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:952)
at org.apache.phoenix.query.BaseTest.createTestTable(BaseTest.java:624)
at org.apache.phoenix.query.BaseTest.createTestTable(BaseTest.java:598)
at 
org.apache.phoenix.end2end.ToCharFunctionIT.initTable(T
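
The errors above are the pre-existing test fixtures hitting the new check from PHOENIX-1028. If similar failures need to be recognized programmatically, the vendor error code and SQLState shown in the stack trace are the stable identifiers; a small sketch, with the constants copied from the SQLExceptionCode entry added by this change rather than referenced through the Phoenix API:

    import java.sql.SQLException;

    final class InvalidNotNullCheck {
        // Values from the INVALID_NOT_NULL_CONSTRAINT(517, "42895", ...) entry added by PHOENIX-1028.
        private static final int ERROR_CODE = 517;
        private static final String SQL_STATE = "42895";

        static boolean isInvalidNotNullConstraint(SQLException e) {
            return e.getErrorCode() == ERROR_CODE || SQL_STATE.equals(e.getSQLState());
        }
    }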

git commit: PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL (Ravi)

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/3.0 41ed8388a -> 4281e7cea


PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL (Ravi)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/4281e7ce
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/4281e7ce
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/4281e7ce

Branch: refs/heads/3.0
Commit: 4281e7cea3314632e68ed72e9605d9a6f73a1c65
Parents: 41ed838
Author: James Taylor 
Authored: Sun Jun 8 21:38:45 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 21:38:45 2014 -0700

--
 .../apache/phoenix/end2end/CreateTableIT.java   | 31 
 .../phoenix/exception/SQLExceptionCode.java |  1 +
 .../apache/phoenix/query/QueryConstants.java|  8 ++---
 .../apache/phoenix/schema/MetaDataClient.java   |  8 +
 .../phoenix/compile/QueryCompilerTest.java  |  2 +-
 .../phoenix/index/IndexMaintainerTest.java  |  4 +--
 6 files changed, 47 insertions(+), 7 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/4281e7ce/phoenix-core/src/it/java/org/apache/phoenix/end2end/CreateTableIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/CreateTableIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/CreateTableIT.java
index 96b4a8e..e28273e 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/CreateTableIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/CreateTableIT.java
@@ -32,6 +32,7 @@ import java.util.Properties;
 import org.apache.hadoop.hbase.HColumnDescriptor;
 import org.apache.hadoop.hbase.client.HBaseAdmin;
 import org.apache.hadoop.hbase.util.Bytes;
+import org.apache.phoenix.exception.SQLExceptionCode;
 import org.apache.phoenix.jdbc.PhoenixStatement;
 import org.apache.phoenix.query.KeyRange;
 import org.apache.phoenix.schema.TableAlreadyExistsException;
@@ -318,5 +319,35 @@ public class CreateTableIT extends BaseClientManagedTimeIT {
assertEquals("a", columnFamilies[0].getNameAsString());
assertEquals(1, columnFamilies[0].getTimeToLive());
 }
+
+
+/**
+ * Test to ensure that NOT NULL constraint isn't added to a non primary 
key column.
+ * @throws Exception
+ */
+@Test
+public void testNotNullConstraintForNonPKColumn() throws Exception {
+
+String ddl = "CREATE TABLE IF NOT EXISTS EVENT.APEX_LIMIT ( " +
+" ORGANIZATION_ID CHAR(15) NOT NULL, " +
+" EVENT_TIME DATE NOT NULL, USER_ID CHAR(15) NOT NULL, " +
+" ENTRY_POINT_ID CHAR(15) NOT NULL, ENTRY_POINT_TYPE CHAR(2) 
NOT NULL , " +
+" APEX_LIMIT_ID CHAR(15) NOT NULL,  USERNAME CHAR(80),  " +
+" NAMESPACE_PREFIX VARCHAR, ENTRY_POINT_NAME VARCHAR  NOT NULL 
, " +
+" EXECUTION_UNIT_NO VARCHAR, LIMIT_TYPE VARCHAR, " +
+" LIMIT_VALUE DOUBLE  " +
+" CONSTRAINT PK PRIMARY KEY (" + 
+" ORGANIZATION_ID, EVENT_TIME,USER_ID,ENTRY_POINT_ID, 
ENTRY_POINT_TYPE, APEX_LIMIT_ID " +
+" ) ) VERSIONS=1";
+
+Properties props = new Properties();
+Connection conn = DriverManager.getConnection(getUrl(), props);
+try {
+conn.createStatement().execute(ddl);
+fail(" Non pk column ENTRY_POINT_NAME has a NOT NULL constraint");
+} catch( SQLException sqle) {
+
assertEquals(SQLExceptionCode.INVALID_NOT_NULL_CONSTRAINT.getErrorCode(),sqle.getErrorCode());
+}
+   }
 
 }
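
The complement of the negative test above: under the new rule a column may carry NOT NULL only if it is part of the primary key, so the fix for DDL like this is either to drop the constraint or to move the column into the key. A sketch of a corrected variant of the test's DDL, with ENTRY_POINT_NAME no longer declared NOT NULL; the connection URL is an assumption and the snippet is illustrative, not part of the commit:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    public class ApexLimitDdlAccepted {
        public static void main(String[] args) throws SQLException {
            // Same DDL as the test above, except ENTRY_POINT_NAME has no NOT NULL,
            // so only primary key columns are declared NOT NULL and the statement should be accepted.
            String ddl = "CREATE TABLE IF NOT EXISTS EVENT.APEX_LIMIT ( " +
                    " ORGANIZATION_ID CHAR(15) NOT NULL, " +
                    " EVENT_TIME DATE NOT NULL, USER_ID CHAR(15) NOT NULL, " +
                    " ENTRY_POINT_ID CHAR(15) NOT NULL, ENTRY_POINT_TYPE CHAR(2) NOT NULL, " +
                    " APEX_LIMIT_ID CHAR(15) NOT NULL, USERNAME CHAR(80), " +
                    " NAMESPACE_PREFIX VARCHAR, ENTRY_POINT_NAME VARCHAR, " +
                    " EXECUTION_UNIT_NO VARCHAR, LIMIT_TYPE VARCHAR, " +
                    " LIMIT_VALUE DOUBLE " +
                    " CONSTRAINT PK PRIMARY KEY (" +
                    " ORGANIZATION_ID, EVENT_TIME, USER_ID, ENTRY_POINT_ID, ENTRY_POINT_TYPE, APEX_LIMIT_ID" +
                    " ) ) VERSIONS=1";
            // Assumption: a locally reachable Phoenix instance.
            try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost")) {
                conn.createStatement().execute(ddl);
            }
        }
    }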

http://git-wip-us.apache.org/repos/asf/phoenix/blob/4281e7ce/phoenix-core/src/main/java/org/apache/phoenix/exception/SQLExceptionCode.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/exception/SQLExceptionCode.java 
b/phoenix-core/src/main/java/org/apache/phoenix/exception/SQLExceptionCode.java
index d9e23f5..39b951d 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/exception/SQLExceptionCode.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/exception/SQLExceptionCode.java
@@ -138,6 +138,7 @@ public enum SQLExceptionCode {
 }),
 ORDER_BY_ARRAY_NOT_SUPPORTED(515, "42893", "ORDER BY of an array type is 
not allowed"),
 NON_EQUALITY_ARRAY_COMPARISON(516, "42894", "Array types may only be 
compared using = or !="),
+INVALID_NOT_NULL_CONSTRAINT(517, "42895", "Invalid not null constraint on 
non primary key column"),
 
 /** 
  * HBase and Phoenix specific implementation defined sub-classes.

http://git-wip-us.apache.org/repos/asf/phoenix/blob/4281e7ce/phoenix-core/src/main/java/org/apache/phoenix/query/QueryCons

git commit: PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL (Ravi)

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/master dbda9b702 -> a56f78ba6


PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL (Ravi)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/a56f78ba
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/a56f78ba
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/a56f78ba

Branch: refs/heads/master
Commit: a56f78ba68a257b91a42ba0d9b24fcfe1d204c2b
Parents: dbda9b7
Author: James Taylor 
Authored: Sun Jun 8 21:37:35 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 21:37:35 2014 -0700

--
 .../apache/phoenix/end2end/CreateTableIT.java   | 31 
 .../phoenix/exception/SQLExceptionCode.java |  1 +
 .../apache/phoenix/query/QueryConstants.java|  8 ++---
 .../apache/phoenix/schema/MetaDataClient.java   |  8 +
 .../phoenix/compile/QueryCompilerTest.java  |  2 +-
 .../phoenix/index/IndexMaintainerTest.java  |  4 +--
 6 files changed, 47 insertions(+), 7 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/a56f78ba/phoenix-core/src/it/java/org/apache/phoenix/end2end/CreateTableIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/CreateTableIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/CreateTableIT.java
index 96b4a8e..e28273e 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/CreateTableIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/CreateTableIT.java
@@ -32,6 +32,7 @@ import java.util.Properties;
 import org.apache.hadoop.hbase.HColumnDescriptor;
 import org.apache.hadoop.hbase.client.HBaseAdmin;
 import org.apache.hadoop.hbase.util.Bytes;
+import org.apache.phoenix.exception.SQLExceptionCode;
 import org.apache.phoenix.jdbc.PhoenixStatement;
 import org.apache.phoenix.query.KeyRange;
 import org.apache.phoenix.schema.TableAlreadyExistsException;
@@ -318,5 +319,35 @@ public class CreateTableIT extends BaseClientManagedTimeIT {
assertEquals("a", columnFamilies[0].getNameAsString());
assertEquals(1, columnFamilies[0].getTimeToLive());
 }
+
+
+/**
+ * Test to ensure that NOT NULL constraint isn't added to a non primary 
key column.
+ * @throws Exception
+ */
+@Test
+public void testNotNullConstraintForNonPKColumn() throws Exception {
+
+String ddl = "CREATE TABLE IF NOT EXISTS EVENT.APEX_LIMIT ( " +
+" ORGANIZATION_ID CHAR(15) NOT NULL, " +
+" EVENT_TIME DATE NOT NULL, USER_ID CHAR(15) NOT NULL, " +
+" ENTRY_POINT_ID CHAR(15) NOT NULL, ENTRY_POINT_TYPE CHAR(2) 
NOT NULL , " +
+" APEX_LIMIT_ID CHAR(15) NOT NULL,  USERNAME CHAR(80),  " +
+" NAMESPACE_PREFIX VARCHAR, ENTRY_POINT_NAME VARCHAR  NOT NULL 
, " +
+" EXECUTION_UNIT_NO VARCHAR, LIMIT_TYPE VARCHAR, " +
+" LIMIT_VALUE DOUBLE  " +
+" CONSTRAINT PK PRIMARY KEY (" + 
+" ORGANIZATION_ID, EVENT_TIME,USER_ID,ENTRY_POINT_ID, 
ENTRY_POINT_TYPE, APEX_LIMIT_ID " +
+" ) ) VERSIONS=1";
+
+Properties props = new Properties();
+Connection conn = DriverManager.getConnection(getUrl(), props);
+try {
+conn.createStatement().execute(ddl);
+fail(" Non pk column ENTRY_POINT_NAME has a NOT NULL constraint");
+} catch( SQLException sqle) {
+
assertEquals(SQLExceptionCode.INVALID_NOT_NULL_CONSTRAINT.getErrorCode(),sqle.getErrorCode());
+}
+   }
 
 }

http://git-wip-us.apache.org/repos/asf/phoenix/blob/a56f78ba/phoenix-core/src/main/java/org/apache/phoenix/exception/SQLExceptionCode.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/exception/SQLExceptionCode.java 
b/phoenix-core/src/main/java/org/apache/phoenix/exception/SQLExceptionCode.java
index d9e23f5..39b951d 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/exception/SQLExceptionCode.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/exception/SQLExceptionCode.java
@@ -138,6 +138,7 @@ public enum SQLExceptionCode {
 }),
 ORDER_BY_ARRAY_NOT_SUPPORTED(515, "42893", "ORDER BY of an array type is 
not allowed"),
 NON_EQUALITY_ARRAY_COMPARISON(516, "42894", "Array types may only be 
compared using = or !="),
+INVALID_NOT_NULL_CONSTRAINT(517, "42895", "Invalid not null constraint on 
non primary key column"),
 
 /** 
  * HBase and Phoenix specific implementation defined sub-classes.

http://git-wip-us.apache.org/repos/asf/phoenix/blob/a56f78ba/phoenix-core/src/main/java/org/apache/phoenix/query/Que

git commit: PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL (Ravi)

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/4.0 6982fde78 -> 4cab22ce0


PHOENIX-1028 Prevent declaration of non PK columns as NOT NULL (Ravi)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/4cab22ce
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/4cab22ce
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/4cab22ce

Branch: refs/heads/4.0
Commit: 4cab22ce05e4c35e92e8ffd0f21874dd23171fae
Parents: 6982fde
Author: James Taylor 
Authored: Sun Jun 8 21:36:45 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 21:36:45 2014 -0700

--
 .../apache/phoenix/end2end/CreateTableIT.java   | 31 
 .../phoenix/exception/SQLExceptionCode.java |  1 +
 .../apache/phoenix/query/QueryConstants.java|  8 ++---
 .../apache/phoenix/schema/MetaDataClient.java   |  8 +
 .../phoenix/compile/QueryCompilerTest.java  |  2 +-
 .../phoenix/index/IndexMaintainerTest.java  |  4 +--
 6 files changed, 47 insertions(+), 7 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/4cab22ce/phoenix-core/src/it/java/org/apache/phoenix/end2end/CreateTableIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/CreateTableIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/CreateTableIT.java
index 96b4a8e..e28273e 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/CreateTableIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/CreateTableIT.java
@@ -32,6 +32,7 @@ import java.util.Properties;
 import org.apache.hadoop.hbase.HColumnDescriptor;
 import org.apache.hadoop.hbase.client.HBaseAdmin;
 import org.apache.hadoop.hbase.util.Bytes;
+import org.apache.phoenix.exception.SQLExceptionCode;
 import org.apache.phoenix.jdbc.PhoenixStatement;
 import org.apache.phoenix.query.KeyRange;
 import org.apache.phoenix.schema.TableAlreadyExistsException;
@@ -318,5 +319,35 @@ public class CreateTableIT extends BaseClientManagedTimeIT {
assertEquals("a", columnFamilies[0].getNameAsString());
assertEquals(1, columnFamilies[0].getTimeToLive());
 }
+
+
+/**
+ * Test to ensure that NOT NULL constraint isn't added to a non primary 
key column.
+ * @throws Exception
+ */
+@Test
+public void testNotNullConstraintForNonPKColumn() throws Exception {
+
+String ddl = "CREATE TABLE IF NOT EXISTS EVENT.APEX_LIMIT ( " +
+" ORGANIZATION_ID CHAR(15) NOT NULL, " +
+" EVENT_TIME DATE NOT NULL, USER_ID CHAR(15) NOT NULL, " +
+" ENTRY_POINT_ID CHAR(15) NOT NULL, ENTRY_POINT_TYPE CHAR(2) 
NOT NULL , " +
+" APEX_LIMIT_ID CHAR(15) NOT NULL,  USERNAME CHAR(80),  " +
+" NAMESPACE_PREFIX VARCHAR, ENTRY_POINT_NAME VARCHAR  NOT NULL 
, " +
+" EXECUTION_UNIT_NO VARCHAR, LIMIT_TYPE VARCHAR, " +
+" LIMIT_VALUE DOUBLE  " +
+" CONSTRAINT PK PRIMARY KEY (" + 
+" ORGANIZATION_ID, EVENT_TIME,USER_ID,ENTRY_POINT_ID, 
ENTRY_POINT_TYPE, APEX_LIMIT_ID " +
+" ) ) VERSIONS=1";
+
+Properties props = new Properties();
+Connection conn = DriverManager.getConnection(getUrl(), props);
+try {
+conn.createStatement().execute(ddl);
+fail(" Non pk column ENTRY_POINT_NAME has a NOT NULL constraint");
+} catch( SQLException sqle) {
+
assertEquals(SQLExceptionCode.INVALID_NOT_NULL_CONSTRAINT.getErrorCode(),sqle.getErrorCode());
+}
+   }
 
 }

http://git-wip-us.apache.org/repos/asf/phoenix/blob/4cab22ce/phoenix-core/src/main/java/org/apache/phoenix/exception/SQLExceptionCode.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/exception/SQLExceptionCode.java 
b/phoenix-core/src/main/java/org/apache/phoenix/exception/SQLExceptionCode.java
index d9e23f5..39b951d 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/exception/SQLExceptionCode.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/exception/SQLExceptionCode.java
@@ -138,6 +138,7 @@ public enum SQLExceptionCode {
 }),
 ORDER_BY_ARRAY_NOT_SUPPORTED(515, "42893", "ORDER BY of an array type is 
not allowed"),
 NON_EQUALITY_ARRAY_COMPARISON(516, "42894", "Array types may only be 
compared using = or !="),
+INVALID_NOT_NULL_CONSTRAINT(517, "42895", "Invalid not null constraint on 
non primary key column"),
 
 /** 
  * HBase and Phoenix specific implementation defined sub-classes.

http://git-wip-us.apache.org/repos/asf/phoenix/blob/4cab22ce/phoenix-core/src/main/java/org/apache/phoenix/query/QueryCons

Apache-Phoenix | Master | Hadoop1 | Build Successful

2014-06-08 Thread Apache Jenkins Server
Master branch build status Successful
Source repository https://git-wip-us.apache.org/repos/asf/incubator-phoenix.git

Last Successful Compiled Artifacts https://builds.apache.org/job/Phoenix-master-hadoop1/lastSuccessfulBuild/artifact/

Last Complete Test Report https://builds.apache.org/job/Phoenix-master-hadoop1/lastCompletedBuild/testReport/

Changes
[jtaylor] Update readme files and remove disclaimer



Apache-Phoenix | 3.0 | Hadoop1 | Build Successful

2014-06-08 Thread Apache Jenkins Server
3.0 branch build status Successful
Source repository https://git-wip-us.apache.org/repos/asf/phoenix.git

Last Successful Compiled Artifacts https://builds.apache.org/job/Phoenix-3.0-hadoop1/lastSuccessfulBuild/artifact/

Last Complete Test Report https://builds.apache.org/job/Phoenix-3.0-hadoop1/lastCompletedBuild/testReport/

Changes
[jtaylor] Update readme files and remove disclaimer



Jenkins build is back to normal : Phoenix | 4.0 | Hadoop1 #179

2014-06-08 Thread Apache Jenkins Server
See 



git commit: Update readme files and remove disclaimer

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/master 17ab67ece -> dbda9b702


Update readme files and remove disclaimer

Conflicts:
README


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/dbda9b70
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/dbda9b70
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/dbda9b70

Branch: refs/heads/master
Commit: dbda9b702faec970f698c055c4648142e3c8c94c
Parents: 17ab67e
Author: James Taylor 
Authored: Sun Jun 8 15:02:21 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 15:05:39 2014 -0700

--
 DISCLAIMER | 8 
 README.md  | 4 ++--
 build.txt  | 2 +-
 3 files changed, 3 insertions(+), 11 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/dbda9b70/DISCLAIMER
--
diff --git a/DISCLAIMER b/DISCLAIMER
deleted file mode 100644
index 07eef20..000
--- a/DISCLAIMER
+++ /dev/null
@@ -1,8 +0,0 @@
-Apache Phoenix is an effort undergoing incubation at The Apache
-Software Foundation (ASF), sponsored by the Apache Incubator PMC.
-Incubation is required of all newly accepted projects until a further
-review indicates that the infrastructure, communications, and decision
-making process have stabilized in a manner consistent with other
-successful ASF projects. While incubation status is not necessarily a
-reflection of the completeness or stability of the code, it does
-indicate that the project has yet to be fully endorsed by the ASF.

http://git-wip-us.apache.org/repos/asf/phoenix/blob/dbda9b70/README.md
--
diff --git a/README.md b/README.md
index a45912d..c731bd5 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,5 @@
-![logo](http://phoenix.incubator.apache.org/images/logo.png)
+![logo](http://phoenix.apache.org/images/logo.png)
 
-[Apache Phoenix](http://phoenix.incubator.apache.org/) is a SQL skin 
over HBase delivered as a client-embedded JDBC driver targeting low latency 
queries over HBase data. Visit the Apache Phoenix Incubator website 
[here](http://phoenix.incubator.apache.org/).
+[Apache Phoenix](http://phoenix.apache.org/) is a SQL skin over HBase 
delivered as a client-embedded JDBC driver targeting low latency queries over 
HBase data. Visit the Apache Phoenix website 
[here](http://phoenix.apache.org/).
 
 Copyright ©2014 [Apache Software Foundation](http://www.apache.org/). All 
Rights Reserved. 

http://git-wip-us.apache.org/repos/asf/phoenix/blob/dbda9b70/build.txt
--
diff --git a/build.txt b/build.txt
index e60fe84..f69048e 100644
--- a/build.txt
+++ b/build.txt
@@ -62,5 +62,5 @@ Findbugs report is generated in /target/site
 
 ## Generate Apache Web Site
 ===
-checkout https://svn.apache.org/repos/asf/incubator/phoenix
+checkout https://svn.apache.org/repos/asf/phoenix
 $ build.sh



git commit: Update readme files and remove disclaimer

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/3.0 9729c171c -> 41ed8388a


Update readme files and remove disclaimer


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/41ed8388
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/41ed8388
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/41ed8388

Branch: refs/heads/3.0
Commit: 41ed8388ac89d05ab9571d14bdd36e9b3ee91185
Parents: 9729c17
Author: James Taylor 
Authored: Sun Jun 8 15:02:21 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 15:03:10 2014 -0700

--
 BUILDING   |  2 +-
 DISCLAIMER |  8 
 README | 10 +-
 README.md  |  4 ++--
 4 files changed, 8 insertions(+), 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/41ed8388/BUILDING
--
diff --git a/BUILDING b/BUILDING
index 645f805..aac954c 100644
--- a/BUILDING
+++ b/BUILDING
@@ -60,5 +60,5 @@ Findbugs report is generated in /target/site
 
 ## Generate Apache Web Site
 ===
-checkout https://svn.apache.org/repos/asf/incubator/phoenix
+checkout https://svn.apache.org/repos/asf/phoenix
 $ build.sh

http://git-wip-us.apache.org/repos/asf/phoenix/blob/41ed8388/DISCLAIMER
--
diff --git a/DISCLAIMER b/DISCLAIMER
deleted file mode 100644
index 07eef20..000
--- a/DISCLAIMER
+++ /dev/null
@@ -1,8 +0,0 @@
-Apache Phoenix is an effort undergoing incubation at The Apache
-Software Foundation (ASF), sponsored by the Apache Incubator PMC.
-Incubation is required of all newly accepted projects until a further
-review indicates that the infrastructure, communications, and decision
-making process have stabilized in a manner consistent with other
-successful ASF projects. While incubation status is not necessarily a
-reflection of the completeness or stability of the code, it does
-indicate that the project has yet to be fully endorsed by the ASF.

http://git-wip-us.apache.org/repos/asf/phoenix/blob/41ed8388/README
--
diff --git a/README b/README
index 7f1873e..f98519a 100644
--- a/README
+++ b/README
@@ -1,4 +1,4 @@
-Apache Phoenix [1] Incubator project is a SQL skin over HBase delivered as a 
client-embedded 
+Apache Phoenix [1] project is a SQL skin over HBase delivered as a 
client-embedded 
 JDBC driver targeting low latency queries over HBase data. Apache Phoenix 
takes your SQL query, 
 compiles it into a series of HBase scans, and orchestrates the running of 
those scans to produce 
 regular JDBC result sets.
@@ -13,11 +13,11 @@ Apache Phoenix is made available under the Apache License, 
version 2 [4]
 
 The Phoenix mailing lists and archives are listed here [5]
 
-1. http://phoenix.incubator.apache.org/
-2. http://phoenix.incubator.apache.org/source.html
-3. http://phoenix.incubator.apache.org/issues.html
+1. http://phoenix.apache.org/
+2. http://phoenix.apache.org/source.html
+3. http://phoenix.apache.org/issues.html
 4. http://www.apache.org/licenses/
-5. http://phoenix.incubator.apache.org/mailing_list.html
+5. http://phoenix.apache.org/mailing_list.html
 
 Upgrading from Phoenix 2.2.x to Apache Phoenix 3.0/4.0
 --

http://git-wip-us.apache.org/repos/asf/phoenix/blob/41ed8388/README.md
--
diff --git a/README.md b/README.md
index d08508d..41b562b 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,5 @@
-![logo](http://phoenix.incubator.apache.org/images/logo.png)
+![logo](http://phoenix.apache.org/images/logo.png)
 
-[Apache Phoenix](http://phoenix.incubator.apache.org/) is a SQL skin 
over HBase delivered as a client-embedded JDBC driver targeting low latency 
queries over HBase data. Visit the Apache Phoenix Incubator website 
[here](http://phoenix.incubator.apache.org/).
+[Apache Phoenix](http://phoenix.apache.org/) is a SQL skin over HBase 
delivered as a client-embedded JDBC driver targeting low latency queries over 
HBase data. Visit the Apache Phoenix website 
[here](http://phoenix.apache.org/).
 
 Copyright ©2014 [Apache Software Foundation](http://www.apache.org/). All 
Rights Reserved.



git commit: Update readme files and remove disclaimer

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/4.0 0dfa31c35 -> 6982fde78


Update readme files and remove disclaimer


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/6982fde7
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/6982fde7
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/6982fde7

Branch: refs/heads/4.0
Commit: 6982fde78a4ed989f01fdd92a42f420ed8afa80d
Parents: 0dfa31c
Author: James Taylor 
Authored: Sun Jun 8 15:02:21 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 15:02:21 2014 -0700

--
 BUILDING   |  2 +-
 DISCLAIMER |  8 
 README | 10 +-
 README.md  |  4 ++--
 4 files changed, 8 insertions(+), 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/6982fde7/BUILDING
--
diff --git a/BUILDING b/BUILDING
index 645f805..aac954c 100644
--- a/BUILDING
+++ b/BUILDING
@@ -60,5 +60,5 @@ Findbugs report is generated in /target/site
 
 ## Generate Apache Web Site
 ===
-checkout https://svn.apache.org/repos/asf/incubator/phoenix
+checkout https://svn.apache.org/repos/asf/phoenix
 $ build.sh

http://git-wip-us.apache.org/repos/asf/phoenix/blob/6982fde7/DISCLAIMER
--
diff --git a/DISCLAIMER b/DISCLAIMER
deleted file mode 100644
index 07eef20..000
--- a/DISCLAIMER
+++ /dev/null
@@ -1,8 +0,0 @@
-Apache Phoenix is an effort undergoing incubation at The Apache
-Software Foundation (ASF), sponsored by the Apache Incubator PMC.
-Incubation is required of all newly accepted projects until a further
-review indicates that the infrastructure, communications, and decision
-making process have stabilized in a manner consistent with other
-successful ASF projects. While incubation status is not necessarily a
-reflection of the completeness or stability of the code, it does
-indicate that the project has yet to be fully endorsed by the ASF.

http://git-wip-us.apache.org/repos/asf/phoenix/blob/6982fde7/README
--
diff --git a/README b/README
index 903e407..1e8fc54 100644
--- a/README
+++ b/README
@@ -1,4 +1,4 @@
-Apache Phoenix [1] Incubator project is a SQL skin over HBase delivered as a 
client-embedded 
+Apache Phoenix [1] project is a SQL skin over HBase delivered as a 
client-embedded 
 JDBC driver targeting low latency queries over HBase data. Apache Phoenix 
takes your SQL query, 
 compiles it into a series of HBase scans, and orchestrates the running of 
those scans to produce 
 regular JDBC result sets.
@@ -13,11 +13,11 @@ Apache Phoenix is made available under the Apache License, 
version 2 [4]
 
 The Phoenix mailing lists and archives are listed here [5]
 
-1. http://phoenix.incubator.apache.org/
-2. http://phoenix.incubator.apache.org/source.html
-3. http://phoenix.incubator.apache.org/issues.html
+1. http://phoenix.apache.org/
+2. http://phoenix.apache.org/source.html
+3. http://phoenix.apache.org/issues.html
 4. http://www.apache.org/licenses/
-5. http://phoenix.incubator.apache.org/mailing_list.html
+5. http://phoenix.apache.org/mailing_list.html
 
 Upgrading from Phoenix 2.2.x to Apache Phoenix 3.0/4.0
 --

http://git-wip-us.apache.org/repos/asf/phoenix/blob/6982fde7/README.md
--
diff --git a/README.md b/README.md
index d08508d..41b562b 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,5 @@
-![logo](http://phoenix.incubator.apache.org/images/logo.png)
+![logo](http://phoenix.apache.org/images/logo.png)
 
-[Apache Phoenix](http://phoenix.incubator.apache.org/) is a SQL skin 
over HBase delivered as a client-embedded JDBC driver targeting low latency 
queries over HBase data. Visit the Apache Phoenix Incubator website 
[here](http://phoenix.incubator.apache.org/).
+[Apache Phoenix](http://phoenix.apache.org/) is a SQL skin over HBase 
delivered as a client-embedded JDBC driver targeting low latency queries over 
HBase data. Visit the Apache Phoenix website 
[here](http://phoenix.apache.org/).
 
 Copyright ©2014 [Apache Software Foundation](http://www.apache.org/). All 
Rights Reserved.



Jenkins build is back to normal : Phoenix | 3.0 | Hadoop1 #113

2014-06-08 Thread Apache Jenkins Server
See 



Jenkins build is back to normal : Phoenix | Master | Hadoop1 #249

2014-06-08 Thread Apache Jenkins Server
See 



Build failed in Jenkins: Phoenix | 4.0 | Hadoop1 #178

2014-06-08 Thread Apache Jenkins Server
See 

Changes:

[jtaylor] PHOENIX-1034 Move validate/reserve of sequences into query compile

--
[...truncated 347 lines...]
Running org.apache.phoenix.end2end.SkipRangeParallelIteratorRegionSplitterIT
Running org.apache.phoenix.end2end.KeyOnlyIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.352 sec - in org.apache.phoenix.end2end.KeyOnlyIT
Running org.apache.phoenix.end2end.DefaultParallelIteratorsRegionSplitterIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.608 sec - in org.apache.phoenix.end2end.SkipRangeParallelIteratorRegionSplitterIT
Running org.apache.phoenix.end2end.SequenceIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.603 sec - in org.apache.phoenix.end2end.DefaultParallelIteratorsRegionSplitterIT
Running org.apache.phoenix.end2end.ToNumberFunctionIT
Tests run: 152, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 99.635 sec - in org.apache.phoenix.end2end.GroupByIT
Running org.apache.phoenix.end2end.QueryDatabaseMetaDataIT
Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.223 sec - in org.apache.phoenix.end2end.ToNumberFunctionIT
Running org.apache.phoenix.end2end.NotQueryIT
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.882 sec - in org.apache.phoenix.end2end.SequenceIT
Running org.apache.phoenix.end2end.VariableLengthPKIT
Tests run: 49, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.542 sec - in org.apache.phoenix.end2end.VariableLengthPKIT
Running org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 48.481 sec - in org.apache.phoenix.end2end.QueryDatabaseMetaDataIT
Running org.apache.phoenix.end2end.salted.SaltedTableIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.118 sec - in org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
Running org.apache.phoenix.end2end.RowValueConstructorIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.799 sec - in org.apache.phoenix.end2end.salted.SaltedTableIT
Running org.apache.phoenix.end2end.FunkyNamesIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.976 sec - in org.apache.phoenix.end2end.FunkyNamesIT
Running org.apache.phoenix.end2end.UpsertSelectIT
Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.503 sec - in org.apache.phoenix.end2end.RowValueConstructorIT
Running org.apache.phoenix.end2end.CastAndCoerceIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 36.866 sec - in org.apache.phoenix.end2end.UpsertSelectIT
Running org.apache.phoenix.end2end.ReadIsolationLevelIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.368 sec - in org.apache.phoenix.end2end.ReadIsolationLevelIT
Running org.apache.phoenix.end2end.MultiCfQueryExecIT
Tests run: 144, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 97.428 sec - in org.apache.phoenix.end2end.NotQueryIT
Running org.apache.phoenix.end2end.NativeHBaseTypesIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.6 sec - in org.apache.phoenix.end2end.NativeHBaseTypesIT
Running org.apache.phoenix.end2end.ColumnProjectionOptimizationIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.117 sec - in org.apache.phoenix.end2end.MultiCfQueryExecIT
Running org.apache.phoenix.end2end.PercentileIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.745 sec - in org.apache.phoenix.end2end.ColumnProjectionOptimizationIT
Running org.apache.phoenix.end2end.CreateTableIT
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.968 sec - in org.apache.phoenix.end2end.PercentileIT
Running org.apache.phoenix.end2end.OrderByIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 4.948 sec - in org.apache.phoenix.end2end.OrderByIT
Running org.apache.phoenix.end2end.ProductMetricsIT
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 38.987 sec - in org.apache.phoenix.end2end.CreateTableIT
Running org.apache.phoenix.end2end.DistinctCountIT
Tests run: 124, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 87.274 sec - in org.apache.phoenix.end2end.CastAndCoerceIT
Running org.apache.phoenix.end2end.SpooledOrderByIT
Tests run: 61, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.406 sec - in org.apache.phoenix.end2end.ProductMetricsIT
Running org.apache.phoenix.end2end.InMemoryOrderByIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 5.288 sec - in org.apache.phoenix.end2end.SpooledOrderByIT
Running org.apache.phoenix.end2end.TruncateFunctionIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 5.194 sec - in org.apache.phoenix.end2end.InMemoryOrderByIT
Running org.apache.phoenix.end2end.TenantSpecificTablesDMLIT
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.243 sec - in org.apache.phoeni

git commit: PHOENIX-1034 Move validate/reserve of sequences into query compile

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/master ea9232a31 -> 17ab67ece


PHOENIX-1034 Move validate/reserve of sequences into query compile


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/17ab67ec
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/17ab67ec
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/17ab67ec

Branch: refs/heads/master
Commit: 17ab67ecefd4467897519a5c0f67d09bbf023ff8
Parents: ea9232a
Author: James Taylor 
Authored: Sun Jun 8 14:02:35 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 14:13:08 2014 -0700

--
 .../DefaultParallelIteratorsRegionSplitterIT.java |  4 +++-
 ...SkipRangeParallelIteratorRegionSplitterIT.java |  4 +++-
 .../phoenix/compile/CreateIndexCompiler.java  |  2 +-
 .../phoenix/compile/CreateTableCompiler.java  |  2 +-
 .../org/apache/phoenix/compile/JoinCompiler.java  |  2 +-
 .../apache/phoenix/compile/PostDDLCompiler.java   |  3 ++-
 .../org/apache/phoenix/compile/QueryCompiler.java | 14 --
 .../apache/phoenix/compile/StatementContext.java  | 18 --
 .../apache/phoenix/compile/UpsertCompiler.java|  2 +-
 .../apache/phoenix/optimize/QueryOptimizer.java   |  5 +++--
 .../iterate/AggregateResultScannerTest.java   |  4 +++-
 11 files changed, 38 insertions(+), 22 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/17ab67ec/phoenix-core/src/it/java/org/apache/phoenix/end2end/DefaultParallelIteratorsRegionSplitterIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/DefaultParallelIteratorsRegionSplitterIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/DefaultParallelIteratorsRegionSplitterIT.java
index 152b955..3ebbc8b 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/DefaultParallelIteratorsRegionSplitterIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/DefaultParallelIteratorsRegionSplitterIT.java
@@ -32,6 +32,7 @@ import org.apache.hadoop.hbase.HConstants;
 import org.apache.hadoop.hbase.HRegionLocation;
 import org.apache.hadoop.hbase.client.Scan;
 import org.apache.hadoop.hbase.util.Bytes;
+import org.apache.phoenix.compile.SequenceManager;
 import org.apache.phoenix.compile.StatementContext;
 import org.apache.phoenix.iterate.DefaultParallelIteratorRegionSplitter;
 import org.apache.phoenix.jdbc.PhoenixConnection;
@@ -61,7 +62,8 @@ public class DefaultParallelIteratorsRegionSplitterIT extends 
BaseParallelIterat
 TableRef tableRef = getTableRef(conn, ts);
 PhoenixConnection pconn = conn.unwrap(PhoenixConnection.class);
 final List regions =  
pconn.getQueryServices().getAllTableRegions(tableRef.getTable().getPhysicalName().getBytes());
-StatementContext context = new StatementContext(new 
PhoenixStatement(pconn), null, scan);
+PhoenixStatement statement = new PhoenixStatement(pconn);
+StatementContext context = new StatementContext(statement, null, scan, 
new SequenceManager(statement));
 DefaultParallelIteratorRegionSplitter splitter = new 
DefaultParallelIteratorRegionSplitter(context, tableRef, 
HintNode.EMPTY_HINT_NODE) {
 @Override
 protected List getAllRegions() throws 
SQLException {

http://git-wip-us.apache.org/repos/asf/phoenix/blob/17ab67ec/phoenix-core/src/it/java/org/apache/phoenix/end2end/SkipRangeParallelIteratorRegionSplitterIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/SkipRangeParallelIteratorRegionSplitterIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/SkipRangeParallelIteratorRegionSplitterIT.java
index 20ce768..d4a40f0 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/SkipRangeParallelIteratorRegionSplitterIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/SkipRangeParallelIteratorRegionSplitterIT.java
@@ -37,6 +37,7 @@ import org.apache.hadoop.hbase.client.Scan;
 import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.phoenix.compile.ColumnResolver;
 import org.apache.phoenix.compile.ScanRanges;
+import org.apache.phoenix.compile.SequenceManager;
 import org.apache.phoenix.compile.StatementContext;
 import org.apache.phoenix.filter.SkipScanFilter;
 import org.apache.phoenix.iterate.SkipRangeParallelIteratorRegionSplitter;
@@ -356,7 +357,8 @@ public class SkipRangeParallelIteratorRegionSplitterIT 
extends BaseClientManaged
 
 };
 PhoenixConnection connection = DriverManager.getConnection(getUrl(), 
TEST_PROPERTIES).unwrap(PhoenixConnection.class);
-StatementContext context = new StatementContext(new 
PhoenixStatement(connection), resolver, scan);

git commit: PHOENIX-1034 Move validate/reserve of sequences into query compile

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/4.0 6e9d02c86 -> 0dfa31c35


PHOENIX-1034 Move validate/reserve of sequences into query compile


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/0dfa31c3
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/0dfa31c3
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/0dfa31c3

Branch: refs/heads/4.0
Commit: 0dfa31c35f595d66405c7121c9ce865b7d889805
Parents: 6e9d02c
Author: James Taylor 
Authored: Sun Jun 8 14:02:35 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 14:08:29 2014 -0700

--
 .../DefaultParallelIteratorsRegionSplitterIT.java |  4 +++-
 ...SkipRangeParallelIteratorRegionSplitterIT.java |  4 +++-
 .../phoenix/compile/CreateIndexCompiler.java  |  2 +-
 .../phoenix/compile/CreateTableCompiler.java  |  2 +-
 .../org/apache/phoenix/compile/JoinCompiler.java  |  2 +-
 .../apache/phoenix/compile/PostDDLCompiler.java   |  3 ++-
 .../org/apache/phoenix/compile/QueryCompiler.java | 14 --
 .../apache/phoenix/compile/StatementContext.java  | 18 --
 .../apache/phoenix/compile/UpsertCompiler.java|  2 +-
 .../apache/phoenix/optimize/QueryOptimizer.java   |  5 +++--
 .../iterate/AggregateResultScannerTest.java   |  4 +++-
 11 files changed, 38 insertions(+), 22 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/0dfa31c3/phoenix-core/src/it/java/org/apache/phoenix/end2end/DefaultParallelIteratorsRegionSplitterIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/DefaultParallelIteratorsRegionSplitterIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/DefaultParallelIteratorsRegionSplitterIT.java
index 152b955..3ebbc8b 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/DefaultParallelIteratorsRegionSplitterIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/DefaultParallelIteratorsRegionSplitterIT.java
@@ -32,6 +32,7 @@ import org.apache.hadoop.hbase.HConstants;
 import org.apache.hadoop.hbase.HRegionLocation;
 import org.apache.hadoop.hbase.client.Scan;
 import org.apache.hadoop.hbase.util.Bytes;
+import org.apache.phoenix.compile.SequenceManager;
 import org.apache.phoenix.compile.StatementContext;
 import org.apache.phoenix.iterate.DefaultParallelIteratorRegionSplitter;
 import org.apache.phoenix.jdbc.PhoenixConnection;
@@ -61,7 +62,8 @@ public class DefaultParallelIteratorsRegionSplitterIT extends 
BaseParallelIterat
 TableRef tableRef = getTableRef(conn, ts);
 PhoenixConnection pconn = conn.unwrap(PhoenixConnection.class);
 final List regions =  
pconn.getQueryServices().getAllTableRegions(tableRef.getTable().getPhysicalName().getBytes());
-StatementContext context = new StatementContext(new 
PhoenixStatement(pconn), null, scan);
+PhoenixStatement statement = new PhoenixStatement(pconn);
+StatementContext context = new StatementContext(statement, null, scan, 
new SequenceManager(statement));
 DefaultParallelIteratorRegionSplitter splitter = new 
DefaultParallelIteratorRegionSplitter(context, tableRef, 
HintNode.EMPTY_HINT_NODE) {
 @Override
 protected List getAllRegions() throws 
SQLException {

http://git-wip-us.apache.org/repos/asf/phoenix/blob/0dfa31c3/phoenix-core/src/it/java/org/apache/phoenix/end2end/SkipRangeParallelIteratorRegionSplitterIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/SkipRangeParallelIteratorRegionSplitterIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/SkipRangeParallelIteratorRegionSplitterIT.java
index 20ce768..d4a40f0 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/SkipRangeParallelIteratorRegionSplitterIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/SkipRangeParallelIteratorRegionSplitterIT.java
@@ -37,6 +37,7 @@ import org.apache.hadoop.hbase.client.Scan;
 import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.phoenix.compile.ColumnResolver;
 import org.apache.phoenix.compile.ScanRanges;
+import org.apache.phoenix.compile.SequenceManager;
 import org.apache.phoenix.compile.StatementContext;
 import org.apache.phoenix.filter.SkipScanFilter;
 import org.apache.phoenix.iterate.SkipRangeParallelIteratorRegionSplitter;
@@ -356,7 +357,8 @@ public class SkipRangeParallelIteratorRegionSplitterIT 
extends BaseClientManaged
 
 };
 PhoenixConnection connection = DriverManager.getConnection(getUrl(), 
TEST_PROPERTIES).unwrap(PhoenixConnection.class);
-StatementContext context = new StatementContext(new 
PhoenixStatement(connection), resolver, scan);
+

git commit: PHOENIX-1034 Move validate/reserve of sequences into query compile

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/3.0 c3067a754 -> 9729c171c


PHOENIX-1034 Move validate/reserve of sequences into query compile


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/9729c171
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/9729c171
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/9729c171

Branch: refs/heads/3.0
Commit: 9729c171cbee7a06b7aa5e6be6ca01c3fbfa0da8
Parents: c3067a7
Author: James Taylor 
Authored: Sun Jun 8 14:02:35 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 14:02:35 2014 -0700

--
 .../DefaultParallelIteratorsRegionSplitterIT.java |  4 +++-
 ...SkipRangeParallelIteratorRegionSplitterIT.java |  4 +++-
 .../phoenix/compile/CreateIndexCompiler.java  |  2 +-
 .../phoenix/compile/CreateTableCompiler.java  |  2 +-
 .../org/apache/phoenix/compile/JoinCompiler.java  |  2 +-
 .../apache/phoenix/compile/PostDDLCompiler.java   |  3 ++-
 .../org/apache/phoenix/compile/QueryCompiler.java | 14 --
 .../apache/phoenix/compile/StatementContext.java  | 18 --
 .../apache/phoenix/compile/UpsertCompiler.java|  2 +-
 .../apache/phoenix/optimize/QueryOptimizer.java   |  5 +++--
 .../iterate/AggregateResultScannerTest.java   |  4 +++-
 11 files changed, 38 insertions(+), 22 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/9729c171/phoenix-core/src/it/java/org/apache/phoenix/end2end/DefaultParallelIteratorsRegionSplitterIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/DefaultParallelIteratorsRegionSplitterIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/DefaultParallelIteratorsRegionSplitterIT.java
index 152b955..3ebbc8b 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/DefaultParallelIteratorsRegionSplitterIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/DefaultParallelIteratorsRegionSplitterIT.java
@@ -32,6 +32,7 @@ import org.apache.hadoop.hbase.HConstants;
 import org.apache.hadoop.hbase.HRegionLocation;
 import org.apache.hadoop.hbase.client.Scan;
 import org.apache.hadoop.hbase.util.Bytes;
+import org.apache.phoenix.compile.SequenceManager;
 import org.apache.phoenix.compile.StatementContext;
 import org.apache.phoenix.iterate.DefaultParallelIteratorRegionSplitter;
 import org.apache.phoenix.jdbc.PhoenixConnection;
@@ -61,7 +62,8 @@ public class DefaultParallelIteratorsRegionSplitterIT extends 
BaseParallelIterat
 TableRef tableRef = getTableRef(conn, ts);
 PhoenixConnection pconn = conn.unwrap(PhoenixConnection.class);
 final List regions =  
pconn.getQueryServices().getAllTableRegions(tableRef.getTable().getPhysicalName().getBytes());
-StatementContext context = new StatementContext(new 
PhoenixStatement(pconn), null, scan);
+PhoenixStatement statement = new PhoenixStatement(pconn);
+StatementContext context = new StatementContext(statement, null, scan, 
new SequenceManager(statement));
 DefaultParallelIteratorRegionSplitter splitter = new 
DefaultParallelIteratorRegionSplitter(context, tableRef, 
HintNode.EMPTY_HINT_NODE) {
 @Override
 protected List getAllRegions() throws 
SQLException {

http://git-wip-us.apache.org/repos/asf/phoenix/blob/9729c171/phoenix-core/src/it/java/org/apache/phoenix/end2end/SkipRangeParallelIteratorRegionSplitterIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/SkipRangeParallelIteratorRegionSplitterIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/SkipRangeParallelIteratorRegionSplitterIT.java
index 20ce768..d4a40f0 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/SkipRangeParallelIteratorRegionSplitterIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/SkipRangeParallelIteratorRegionSplitterIT.java
@@ -37,6 +37,7 @@ import org.apache.hadoop.hbase.client.Scan;
 import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.phoenix.compile.ColumnResolver;
 import org.apache.phoenix.compile.ScanRanges;
+import org.apache.phoenix.compile.SequenceManager;
 import org.apache.phoenix.compile.StatementContext;
 import org.apache.phoenix.filter.SkipScanFilter;
 import org.apache.phoenix.iterate.SkipRangeParallelIteratorRegionSplitter;
@@ -356,7 +357,8 @@ public class SkipRangeParallelIteratorRegionSplitterIT 
extends BaseClientManaged
 
 };
 PhoenixConnection connection = DriverManager.getConnection(getUrl(), 
TEST_PROPERTIES).unwrap(PhoenixConnection.class);
-StatementContext context = new StatementContext(new 
PhoenixStatement(connection), resolver, scan);
+

Build failed in Jenkins: Phoenix | 4.0 | Hadoop1 #177

2014-06-08 Thread Apache Jenkins Server
See 

Changes:

[jtaylor] PHOENIX-1034 Move validate/reserve of sequences into query compile

[jtaylor] PHOENIX-1034 Move validate/reserve of sequences into query compile

--
[...truncated 1355 lines...]
at 
org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:134)
at 
org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
at 
org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
at 
org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)

testPointInTimeSequence[CREATE INDEX ATABLE_IDX ON aTable (a_integer) INCLUDE ( A_STRING, B_STRING, A_DATE)](org.apache.phoenix.end2end.NotQueryIT)  Time elapsed: 0.662 sec  <<< ERROR!
java.lang.NullPointerException: null
at 
org.apache.phoenix.compile.SequenceManager$SequenceTuple.(SequenceManager.java:92)
at 
org.apache.phoenix.compile.SequenceManager.newSequenceTuple(SequenceManager.java:80)
at 
org.apache.phoenix.iterate.SequenceResultIterator.next(SequenceResultIterator.java:47)
at 
org.apache.phoenix.jdbc.PhoenixResultSet.next(PhoenixResultSet.java:732)
at 
org.apache.phoenix.end2end.QueryIT.testPointInTimeSequence(QueryIT.java:433)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at 
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at 
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runners.Suite.runChild(Suite.java:127)
at org.junit.runners.Suite.runChild(Suite.java:26)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at 
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at 
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runners.Suite.runChild(Suite.java:127)
at org.junit.runners.Suite.runChild(Suite.java:26)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runner.JUnitCore.run(JUnitCore.java:160)
at org.junit.runner.JUnitCore.run(JUnitCore.java:138)
at 
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:113)
at 
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeLazy(JUnitCoreWrapper.java:94)
at 
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:58)
at 
org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:134)
at 
org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
at 
org.apache.maven.surefire.booter.ForkedBoo
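
The failure above surfaces while a sequence value is being materialized during ResultSet iteration: PhoenixResultSet.next() -> SequenceResultIterator.next() -> SequenceManager.newSequenceTuple(), which hits a NullPointerException. A hedged sketch of the kind of client code that exercises that path is below; the JDBC URL, sequence name, and query are placeholders for illustration, not the failing test's actual objects:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SequenceReadSketch {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost");
             Statement stmt = conn.createStatement()) {
            stmt.execute("CREATE SEQUENCE IF NOT EXISTS my_seq");
            // Each rs.next() below runs through SequenceResultIterator.next(),
            // the frame where the stack traces report the NullPointerException.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT NEXT VALUE FOR my_seq FROM SYSTEM.\"SEQUENCE\" LIMIT 1")) {
                while (rs.next()) {
                    System.out.println(rs.getLong(1));
                }
            }
        }
    }
}
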

Build failed in Jenkins: Phoenix | 3.0 | Hadoop1 #112

2014-06-08 Thread Apache Jenkins Server
See 

Changes:

[jtaylor] PHOENIX-1034 Move validate/reserve of sequences into query compile

--
[...truncated 1357 lines...]
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at 
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at 
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runners.Suite.runChild(Suite.java:127)
at org.junit.runners.Suite.runChild(Suite.java:26)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at 
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at 
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runners.Suite.runChild(Suite.java:127)
at org.junit.runners.Suite.runChild(Suite.java:26)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runner.JUnitCore.run(JUnitCore.java:160)
at org.junit.runner.JUnitCore.run(JUnitCore.java:138)
at 
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:113)
at 
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeLazy(JUnitCoreWrapper.java:94)
at 
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:58)
at 
org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:134)
at 
org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
at 
org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
at 
org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)

testPointInTimeSequence[CREATE INDEX ATABLE_IDX ON aTable (a_integer) INCLUDE ( A_STRING, B_STRING, A_DATE)](org.apache.phoenix.end2end.CastAndCoerceIT)  Time elapsed: 0.515 sec  <<< ERROR!
java.lang.NullPointerException: null
at 
org.apache.phoenix.compile.SequenceManager$SequenceTuple.(SequenceManager.java:92)
at 
org.apache.phoenix.compile.SequenceManager.newSequenceTuple(SequenceManager.java:80)
at 
org.apache.phoenix.iterate.SequenceResultIterator.next(SequenceResultIterator.java:47)
at 
org.apache.phoenix.jdbc.PhoenixResultSet.next(PhoenixResultSet.java:732)
at 
org.apache.phoenix.end2end.QueryIT.testPointInTimeSequence(QueryIT.java:433)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at 
org.junit.internal.runners.statements.RunBe

Build failed in Jenkins: Phoenix | Master | Hadoop1 #248

2014-06-08 Thread Apache Jenkins Server
See 

Changes:

[jtaylor] PHOENIX-1034 Move validate/reserve of sequences into query compile

[jtaylor] PHOENIX-1034 Move validate/reserve of sequences into query compile

--
[...truncated 1366 lines...]
at 
org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
at 
org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
at 
org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)

testPointInTimeSequence[CREATE INDEX ATABLE_IDX ON aTable (a_integer) INCLUDE ( A_STRING, B_STRING, A_DATE)](org.apache.phoenix.end2end.NotQueryIT)  Time elapsed: 0.554 sec  <<< ERROR!
java.lang.NullPointerException: null
at 
org.apache.phoenix.compile.SequenceManager$SequenceTuple.(SequenceManager.java:92)
at 
org.apache.phoenix.compile.SequenceManager.newSequenceTuple(SequenceManager.java:80)
at 
org.apache.phoenix.iterate.SequenceResultIterator.next(SequenceResultIterator.java:47)
at 
org.apache.phoenix.jdbc.PhoenixResultSet.next(PhoenixResultSet.java:732)
at 
org.apache.phoenix.end2end.QueryIT.testPointInTimeSequence(QueryIT.java:433)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at 
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at 
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runners.Suite.runChild(Suite.java:127)
at org.junit.runners.Suite.runChild(Suite.java:26)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at 
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at 
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runners.Suite.runChild(Suite.java:127)
at org.junit.runners.Suite.runChild(Suite.java:26)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runner.JUnitCore.run(JUnitCore.java:160)
at org.junit.runner.JUnitCore.run(JUnitCore.java:138)
at 
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:113)
at 
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeLazy(JUnitCoreWrapper.java:94)
at 
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:58)
at 
org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:134)
at 
org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
at 
org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
at 
org.apache.maven.surefire.booter.ForkedB

git commit: PHOENIX-1034 Move validate/reserve of sequences into query compile

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/master 2ad434d6e -> ea9232a31


PHOENIX-1034 Move validate/reserve of sequences into query compile


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/ea9232a3
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/ea9232a3
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/ea9232a3

Branch: refs/heads/master
Commit: ea9232a31e0bed23c6e746e91d3bb3250dc71ae9
Parents: 2ad434d
Author: James Taylor 
Authored: Sun Jun 8 13:01:55 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 13:02:50 2014 -0700

--
 .../phoenix/coprocessor/SequenceRegionObserver.java |  2 +-
 .../apache/phoenix/jdbc/PhoenixPreparedStatement.java   |  4 ++--
 .../java/org/apache/phoenix/jdbc/PhoenixStatement.java  | 12 ++--
 .../phoenix/query/ConnectionQueryServicesImpl.java  |  2 +-
 .../main/java/org/apache/phoenix/schema/Sequence.java   |  4 ++--
 5 files changed, 12 insertions(+), 12 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/ea9232a3/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
 
b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
index a21a61a..97a9a47 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
@@ -129,7 +129,7 @@ public class SequenceRegionObserver extends 
BaseRegionObserver {
long value = 
PDataType.LONG.getCodec().decodeLong(cq.getValueArray(), cq.getValueOffset(), 
SortOrder.getDefault());
 get.addColumn(cf, CellUtil.cloneQualifier(cq));
-validateOnly &= 
(Sequence.ValueOp.VALIDATE_SEQUENCES.ordinal() == value);
+validateOnly &= 
(Sequence.ValueOp.VALIDATE_SEQUENCE.ordinal() == value);
 }
 }
 Result result = region.get(get);

http://git-wip-us.apache.org/repos/asf/phoenix/blob/ea9232a3/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixPreparedStatement.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixPreparedStatement.java
 
b/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixPreparedStatement.java
index d75eb28..7eea568 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixPreparedStatement.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixPreparedStatement.java
@@ -189,7 +189,7 @@ public class PhoenixPreparedStatement extends 
PhoenixStatement implements Prepar
 }
 try {
 // Just compile top level query without optimizing to get 
ResultSetMetaData
-QueryPlan plan = statement.compilePlan(this, 
Sequence.ValueOp.VALIDATE_SEQUENCES);
+QueryPlan plan = statement.compilePlan(this, 
Sequence.ValueOp.VALIDATE_SEQUENCE);
 return new PhoenixResultSetMetaData(this.getConnection(), 
plan.getProjector());
 } finally {
 int lastSetBit = 0;
@@ -212,7 +212,7 @@ public class PhoenixPreparedStatement extends 
PhoenixStatement implements Prepar
 }
 }
 try {
-StatementPlan plan = statement.compilePlan(this, 
Sequence.ValueOp.VALIDATE_SEQUENCES);
+StatementPlan plan = statement.compilePlan(this, 
Sequence.ValueOp.VALIDATE_SEQUENCE);
 return plan.getParameterMetaData();
 } finally {
 int lastSetBit = 0;

http://git-wip-us.apache.org/repos/asf/phoenix/blob/ea9232a3/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixStatement.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixStatement.java 
b/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixStatement.java
index 529a40a..d4c677b 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixStatement.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixStatement.java
@@ -195,7 +195,7 @@ public class PhoenixStatement implements Statement, 
SQLCloseable, org.apache.pho
 }
 
 protected QueryPlan optimizeQuery(CompilableStatement stmt) throws 
SQLException {
-QueryPlan plan = stmt.compilePlan(this, 
Sequence.ValueOp.RESERVE_SEQUENCES);
+QueryPlan plan = stmt.compilePlan(this, 
Sequence.ValueOp.RESERVE_SEQUENCE);
 return connection.ge
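
The rename above (VALIDATE_SEQUENCES/RESERVE_SEQUENCES to VALIDATE_SEQUENCE/RESERVE_SEQUENCE) matters because the chosen operation is shipped to SequenceRegionObserver as an enum ordinal and compared server-side. A self-contained illustration of that ordinal round-trip follows; it is not Phoenix source, and the demo class and enum body are assumptions -- only the constant names and the ordinal comparison come from the diff:

public class ValueOpSketch {
    enum ValueOp { VALIDATE_SEQUENCE, RESERVE_SEQUENCE }

    public static void main(String[] args) {
        // Client side: the selected op is encoded as its ordinal.
        long encoded = ValueOp.VALIDATE_SEQUENCE.ordinal();
        // Server side (cf. SequenceRegionObserver above): compare ordinals.
        boolean validateOnly = true;
        validateOnly &= (ValueOp.VALIDATE_SEQUENCE.ordinal() == encoded);
        System.out.println("validateOnly = " + validateOnly); // prints true
    }
}
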

git commit: PHOENIX-1034 Move validate/reserve of sequences into query compile

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/4.0 607a11b44 -> 6e9d02c86


PHOENIX-1034 Move validate/reserve of sequences into query compile


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/6e9d02c8
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/6e9d02c8
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/6e9d02c8

Branch: refs/heads/4.0
Commit: 6e9d02c861d94c4e85474119ee89f6e4bb05fa5f
Parents: 607a11b
Author: James Taylor 
Authored: Sun Jun 8 13:01:55 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 13:01:55 2014 -0700

--
 .../phoenix/coprocessor/SequenceRegionObserver.java |  2 +-
 .../apache/phoenix/jdbc/PhoenixPreparedStatement.java   |  4 ++--
 .../java/org/apache/phoenix/jdbc/PhoenixStatement.java  | 12 ++--
 .../phoenix/query/ConnectionQueryServicesImpl.java  |  2 +-
 .../main/java/org/apache/phoenix/schema/Sequence.java   |  4 ++--
 5 files changed, 12 insertions(+), 12 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/6e9d02c8/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
 
b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
index a21a61a..97a9a47 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
@@ -129,7 +129,7 @@ public class SequenceRegionObserver extends 
BaseRegionObserver {
long value = 
PDataType.LONG.getCodec().decodeLong(cq.getValueArray(), cq.getValueOffset(), 
SortOrder.getDefault());
 get.addColumn(cf, CellUtil.cloneQualifier(cq));
-validateOnly &= 
(Sequence.ValueOp.VALIDATE_SEQUENCES.ordinal() == value);
+validateOnly &= 
(Sequence.ValueOp.VALIDATE_SEQUENCE.ordinal() == value);
 }
 }
 Result result = region.get(get);

http://git-wip-us.apache.org/repos/asf/phoenix/blob/6e9d02c8/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixPreparedStatement.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixPreparedStatement.java
 
b/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixPreparedStatement.java
index d75eb28..7eea568 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixPreparedStatement.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixPreparedStatement.java
@@ -189,7 +189,7 @@ public class PhoenixPreparedStatement extends 
PhoenixStatement implements Prepar
 }
 try {
 // Just compile top level query without optimizing to get 
ResultSetMetaData
-QueryPlan plan = statement.compilePlan(this, 
Sequence.ValueOp.VALIDATE_SEQUENCES);
+QueryPlan plan = statement.compilePlan(this, 
Sequence.ValueOp.VALIDATE_SEQUENCE);
 return new PhoenixResultSetMetaData(this.getConnection(), 
plan.getProjector());
 } finally {
 int lastSetBit = 0;
@@ -212,7 +212,7 @@ public class PhoenixPreparedStatement extends 
PhoenixStatement implements Prepar
 }
 }
 try {
-StatementPlan plan = statement.compilePlan(this, 
Sequence.ValueOp.VALIDATE_SEQUENCES);
+StatementPlan plan = statement.compilePlan(this, 
Sequence.ValueOp.VALIDATE_SEQUENCE);
 return plan.getParameterMetaData();
 } finally {
 int lastSetBit = 0;

http://git-wip-us.apache.org/repos/asf/phoenix/blob/6e9d02c8/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixStatement.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixStatement.java 
b/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixStatement.java
index 529a40a..d4c677b 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixStatement.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixStatement.java
@@ -195,7 +195,7 @@ public class PhoenixStatement implements Statement, 
SQLCloseable, org.apache.pho
 }
 
 protected QueryPlan optimizeQuery(CompilableStatement stmt) throws 
SQLException {
-QueryPlan plan = stmt.compilePlan(this, 
Sequence.ValueOp.RESERVE_SEQUENCES);
+QueryPlan plan = stmt.compilePlan(this, 
Sequence.ValueOp.RESERVE_SEQUENCE);
 return connection.getQuery

git commit: PHOENIX-1034 Move validate/reserve of sequences into query compile

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/3.0 bf395def6 -> c3067a754


PHOENIX-1034 Move validate/reserve of sequences into query compile


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/c3067a75
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/c3067a75
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/c3067a75

Branch: refs/heads/3.0
Commit: c3067a754541e9d315960200f0136bf696fd2db5
Parents: bf395de
Author: James Taylor 
Authored: Sun Jun 8 12:47:53 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 12:59:33 2014 -0700

--
 .../apache/phoenix/compile/SequenceManager.java |  2 +-
 .../coprocessor/SequenceRegionObserver.java | 13 ++--
 .../phoenix/jdbc/PhoenixPreparedStatement.java  |  5 +-
 .../apache/phoenix/jdbc/PhoenixStatement.java   | 62 +++-
 .../apache/phoenix/parse/BindableStatement.java |  2 -
 .../apache/phoenix/parse/DeleteStatement.java   |  6 --
 .../apache/phoenix/parse/ExplainStatement.java  |  6 --
 .../apache/phoenix/parse/MutableStatement.java  |  6 --
 .../apache/phoenix/parse/SelectStatement.java   |  6 --
 .../phoenix/query/ConnectionQueryServices.java  |  2 +-
 .../query/ConnectionQueryServicesImpl.java  |  6 +-
 .../query/ConnectionlessQueryServicesImpl.java  |  4 +-
 .../query/DelegateConnectionQueryServices.java  |  2 +-
 .../org/apache/phoenix/schema/Sequence.java | 15 ++---
 .../phoenix/pig/hadoop/PhoenixInputFormat.java  |  5 --
 15 files changed, 59 insertions(+), 83 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/c3067a75/phoenix-core/src/main/java/org/apache/phoenix/compile/SequenceManager.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/compile/SequenceManager.java 
b/phoenix-core/src/main/java/org/apache/phoenix/compile/SequenceManager.java
index a5f37f8..8e71c3b 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/compile/SequenceManager.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/compile/SequenceManager.java
@@ -138,7 +138,7 @@ public class SequenceManager {
 return expression;
 }
 
-public void validateSequences(Sequence.Action action) throws SQLException {
+public void validateSequences(Sequence.ValueOp action) throws SQLException 
{
 if (sequenceMap == null || sequenceMap.isEmpty()) {
 return;
 }

http://git-wip-us.apache.org/repos/asf/phoenix/blob/c3067a75/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
 
b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
index 46834cf..875bb0c 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
@@ -67,7 +67,6 @@ import org.apache.phoenix.util.ServerUtil;
  * @since 3.0.0
  */
 public class SequenceRegionObserver extends BaseRegionObserver {
-public enum Op {CREATE_SEQUENCE, DROP_SEQUENCE, RETURN_SEQUENCE};
 public static final String OPERATION_ATTRIB = "SEQUENCE_OPERATION";
 public static final String MAX_TIMERANGE_ATTRIB = "MAX_TIMERANGE";
 public static final String CURRENT_VALUE_ATTRIB = "CURRENT_VALUE";
@@ -114,7 +113,7 @@ public class SequenceRegionObserver extends 
BaseRegionObserver {
 byte[] cf = entry.getKey();
 for (Map.Entry kvEntry : 
entry.getValue().entrySet()) {
 get.addColumn(cf, kvEntry.getKey());
-validateOnly &= (Sequence.Action.VALIDATE.ordinal() == 
kvEntry.getValue().intValue());
+validateOnly &= 
(Sequence.ValueOp.VALIDATE_SEQUENCE.ordinal() == kvEntry.getValue().intValue());
 }
 }
 Result result = region.get(get);
@@ -167,7 +166,7 @@ public class SequenceRegionObserver extends 
BaseRegionObserver {
 if (opBuf == null) {
 return null;
 }
-Op op = Op.values()[opBuf[0]];
+Sequence.MetaOp op = Sequence.MetaOp.values()[opBuf[0]];
 KeyValue keyValue = 
append.getFamilyMap().values().iterator().next().iterator().next();
 
 long clientTimestamp = HConstants.LATEST_TIMESTAMP;
@@ -175,7 +174,7 @@ public class SequenceRegionObserver extends 
BaseRegionObserver {
 long maxGetTimestamp = HConstants.LATEST_TIMESTAMP;
 boolean hadClientTimestamp;
 byte[] clientTimestampBuf = null;
-if (op == Op.RETURN_SEQUENCE) {
+if (op == Sequ
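
The refactor above removes the coprocessor-local Op enum and moves the operation kinds onto Sequence, separating sequence lifecycle operations (MetaOp) from per-value operations (ValueOp). A reconstruction for illustration only; the wrapper class is an assumption, while the constant names and the values()[opBuf[0]] decode come from the diff:

public final class SequenceOpsSketch {
    // Lifecycle operations, decoded server-side via MetaOp.values()[opBuf[0]].
    enum MetaOp { CREATE_SEQUENCE, DROP_SEQUENCE, RETURN_SEQUENCE }
    // Per-value operations, now selected when the query is compiled.
    enum ValueOp { VALIDATE_SEQUENCE, RESERVE_SEQUENCE }

    public static void main(String[] args) {
        byte[] opBuf = new byte[] { (byte) MetaOp.RETURN_SEQUENCE.ordinal() };
        MetaOp op = MetaOp.values()[opBuf[0]]; // same decode as SequenceRegionObserver
        System.out.println(op);                // RETURN_SEQUENCE
    }
}
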

git commit: PHOENIX-1034 Move validate/reserve of sequences into query compile

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/4.0 75db322e2 -> 607a11b44


PHOENIX-1034 Move validate/reserve of sequences into query compile


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/607a11b4
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/607a11b4
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/607a11b4

Branch: refs/heads/4.0
Commit: 607a11b44ac769e3eff0b52c7895a93fd78bad88
Parents: 75db322
Author: James Taylor 
Authored: Sun Jun 8 12:47:53 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 12:48:58 2014 -0700

--
 .../apache/phoenix/compile/SequenceManager.java |  2 +-
 .../coprocessor/SequenceRegionObserver.java | 13 ++--
 .../phoenix/jdbc/PhoenixPreparedStatement.java  |  5 +-
 .../apache/phoenix/jdbc/PhoenixStatement.java   | 62 +++-
 .../apache/phoenix/parse/BindableStatement.java |  2 -
 .../apache/phoenix/parse/DeleteStatement.java   |  6 --
 .../apache/phoenix/parse/ExplainStatement.java  |  6 --
 .../apache/phoenix/parse/MutableStatement.java  |  6 --
 .../apache/phoenix/parse/SelectStatement.java   |  6 --
 .../phoenix/query/ConnectionQueryServices.java  |  2 +-
 .../query/ConnectionQueryServicesImpl.java  |  6 +-
 .../query/ConnectionlessQueryServicesImpl.java  |  2 +-
 .../query/DelegateConnectionQueryServices.java  |  2 +-
 .../org/apache/phoenix/schema/Sequence.java | 15 ++---
 .../phoenix/pig/hadoop/PhoenixInputFormat.java  |  5 --
 15 files changed, 58 insertions(+), 82 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/607a11b4/phoenix-core/src/main/java/org/apache/phoenix/compile/SequenceManager.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/compile/SequenceManager.java 
b/phoenix-core/src/main/java/org/apache/phoenix/compile/SequenceManager.java
index a5f37f8..8e71c3b 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/compile/SequenceManager.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/compile/SequenceManager.java
@@ -138,7 +138,7 @@ public class SequenceManager {
 return expression;
 }
 
-public void validateSequences(Sequence.Action action) throws SQLException {
+public void validateSequences(Sequence.ValueOp action) throws SQLException 
{
 if (sequenceMap == null || sequenceMap.isEmpty()) {
 return;
 }

http://git-wip-us.apache.org/repos/asf/phoenix/blob/607a11b4/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
 
b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
index 94008fc..a21a61a 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
@@ -70,7 +70,6 @@ import com.google.common.collect.Lists;
  * @since 3.0.0
  */
 public class SequenceRegionObserver extends BaseRegionObserver {
-public enum Op {CREATE_SEQUENCE, DROP_SEQUENCE, RETURN_SEQUENCE};
 public static final String OPERATION_ATTRIB = "SEQUENCE_OPERATION";
 public static final String MAX_TIMERANGE_ATTRIB = "MAX_TIMERANGE";
 public static final String CURRENT_VALUE_ATTRIB = "CURRENT_VALUE";
@@ -130,7 +129,7 @@ public class SequenceRegionObserver extends 
BaseRegionObserver {
long value = 
PDataType.LONG.getCodec().decodeLong(cq.getValueArray(), cq.getValueOffset(), 
SortOrder.getDefault());
 get.addColumn(cf, CellUtil.cloneQualifier(cq));
-validateOnly &= (Sequence.Action.VALIDATE.ordinal() == 
value);
+validateOnly &= 
(Sequence.ValueOp.VALIDATE_SEQUENCES.ordinal() == value);
 }
 }
 Result result = region.get(get);
@@ -187,7 +186,7 @@ public class SequenceRegionObserver extends 
BaseRegionObserver {
 if (opBuf == null) {
 return null;
 }
-Op op = Op.values()[opBuf[0]];
+Sequence.MetaOp op = Sequence.MetaOp.values()[opBuf[0]];
 Cell keyValue = 
append.getFamilyCellMap().values().iterator().next().iterator().next();
 
 long clientTimestamp = HConstants.LATEST_TIMESTAMP;
@@ -195,7 +194,7 @@ public class SequenceRegionObserver extends 
BaseRegionObserver {
 long maxGetTimestamp = HConstants.LATEST_TIMESTAMP;
 boolean hadClientTimestamp;
 byte[] clientTimestampBuf = null;
-if (op == Op.RETURN_SEQUENCE) {
+if (op == Sequence.MetaOp.RETURN_SEQUENCE) {

git commit: PHOENIX-1034 Move validate/reserve of sequences into query compile

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/master 318921eae -> 2ad434d6e


PHOENIX-1034 Move validate/reserve of sequences into query compile


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/2ad434d6
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/2ad434d6
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/2ad434d6

Branch: refs/heads/master
Commit: 2ad434d6eac8e32670014191256236b4d181d19b
Parents: 318921e
Author: James Taylor 
Authored: Sun Jun 8 12:47:53 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 12:47:53 2014 -0700

--
 .../apache/phoenix/compile/SequenceManager.java |  2 +-
 .../coprocessor/SequenceRegionObserver.java | 13 ++--
 .../phoenix/jdbc/PhoenixPreparedStatement.java  |  5 +-
 .../apache/phoenix/jdbc/PhoenixStatement.java   | 62 +++-
 .../apache/phoenix/parse/BindableStatement.java |  2 -
 .../apache/phoenix/parse/DeleteStatement.java   |  6 --
 .../apache/phoenix/parse/ExplainStatement.java  |  6 --
 .../apache/phoenix/parse/MutableStatement.java  |  6 --
 .../apache/phoenix/parse/SelectStatement.java   |  6 --
 .../phoenix/query/ConnectionQueryServices.java  |  2 +-
 .../query/ConnectionQueryServicesImpl.java  |  6 +-
 .../query/ConnectionlessQueryServicesImpl.java  |  2 +-
 .../query/DelegateConnectionQueryServices.java  |  2 +-
 .../org/apache/phoenix/schema/Sequence.java | 15 ++---
 .../phoenix/pig/hadoop/PhoenixInputFormat.java  |  5 --
 15 files changed, 58 insertions(+), 82 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/2ad434d6/phoenix-core/src/main/java/org/apache/phoenix/compile/SequenceManager.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/compile/SequenceManager.java 
b/phoenix-core/src/main/java/org/apache/phoenix/compile/SequenceManager.java
index a5f37f8..8e71c3b 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/compile/SequenceManager.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/compile/SequenceManager.java
@@ -138,7 +138,7 @@ public class SequenceManager {
 return expression;
 }
 
-public void validateSequences(Sequence.Action action) throws SQLException {
+public void validateSequences(Sequence.ValueOp action) throws SQLException 
{
 if (sequenceMap == null || sequenceMap.isEmpty()) {
 return;
 }

http://git-wip-us.apache.org/repos/asf/phoenix/blob/2ad434d6/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
 
b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
index 94008fc..a21a61a 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/coprocessor/SequenceRegionObserver.java
@@ -70,7 +70,6 @@ import com.google.common.collect.Lists;
  * @since 3.0.0
  */
 public class SequenceRegionObserver extends BaseRegionObserver {
-public enum Op {CREATE_SEQUENCE, DROP_SEQUENCE, RETURN_SEQUENCE};
 public static final String OPERATION_ATTRIB = "SEQUENCE_OPERATION";
 public static final String MAX_TIMERANGE_ATTRIB = "MAX_TIMERANGE";
 public static final String CURRENT_VALUE_ATTRIB = "CURRENT_VALUE";
@@ -130,7 +129,7 @@ public class SequenceRegionObserver extends 
BaseRegionObserver {
long value = 
PDataType.LONG.getCodec().decodeLong(cq.getValueArray(), cq.getValueOffset(), 
SortOrder.getDefault());
 get.addColumn(cf, CellUtil.cloneQualifier(cq));
-validateOnly &= (Sequence.Action.VALIDATE.ordinal() == 
value);
+validateOnly &= 
(Sequence.ValueOp.VALIDATE_SEQUENCES.ordinal() == value);
 }
 }
 Result result = region.get(get);
@@ -187,7 +186,7 @@ public class SequenceRegionObserver extends 
BaseRegionObserver {
 if (opBuf == null) {
 return null;
 }
-Op op = Op.values()[opBuf[0]];
+Sequence.MetaOp op = Sequence.MetaOp.values()[opBuf[0]];
 Cell keyValue = 
append.getFamilyCellMap().values().iterator().next().iterator().next();
 
 long clientTimestamp = HConstants.LATEST_TIMESTAMP;
@@ -195,7 +194,7 @@ public class SequenceRegionObserver extends 
BaseRegionObserver {
 long maxGetTimestamp = HConstants.LATEST_TIMESTAMP;
 boolean hadClientTimestamp;
 byte[] clientTimestampBuf = null;
-if (op == Op.RETURN_SEQUENCE) {
+if (op == Sequence.MetaOp.RETURN_SEQUENCE) {
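
A note for readers tracing the PHOENIX-1034 diffs above: the coprocessor-local Op enum is removed and its job is split across two enums on org.apache.phoenix.schema.Sequence, MetaOp for metadata operations and ValueOp for value-level operations checked at query compile time. The following minimal, self-contained sketch mirrors the names visible in the diffs; it is an illustration rather than Phoenix source, and the RESERVE_SEQUENCES constant is an assumed counterpart to VALIDATE_SEQUENCES.

    public class SequenceOpsSketch {
        // Metadata operations, formerly SequenceRegionObserver.Op (the removed enum above)
        enum MetaOp { CREATE_SEQUENCE, DROP_SEQUENCE, RETURN_SEQUENCE }

        // Value-level operations; only VALIDATE_SEQUENCES appears in the diffs,
        // RESERVE_SEQUENCES is assumed here for illustration.
        enum ValueOp { VALIDATE_SEQUENCES, RESERVE_SEQUENCES }

        public static void main(String[] args) {
            // The coprocessor compares a serialized ordinal against VALIDATE_SEQUENCES,
            // mirroring: validateOnly &= (Sequence.ValueOp.VALIDATE_SEQUENCES.ordinal() == value);
            long value = ValueOp.VALIDATE_SEQUENCES.ordinal();
            boolean validateOnly = (ValueOp.VALIDATE_SEQUENCES.ordinal() == value);
            System.out.println("validateOnly=" + validateOnly
                    + ", metaOps=" + java.util.Arrays.toString(MetaOp.values()));
        }
    }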

Apache-Phoenix | Master | Hadoop1 | Build Successful

2014-06-08 Thread Apache Jenkins Server
Master branch build status Successful
Source repository https://git-wip-us.apache.org/repos/asf/incubator-phoenix.git

Last Successful Compiled Artifacts https://builds.apache.org/job/Phoenix-master-hadoop1/lastSuccessfulBuild/artifact/

Last Complete Test Report https://builds.apache.org/job/Phoenix-master-hadoop1/lastCompletedBuild/testReport/

Changes
[jtaylor] PHOENIX-1032 Don't compile alternate plans if query is a point lookup



Apache-Phoenix | 4.0 | Hadoop1 | Build Successful

2014-06-08 Thread Apache Jenkins Server
4.0 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf/incubator-phoenix.git

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.0-hadoop1/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.0-hadoop1/lastCompletedBuild/testReport/

Changes
[jtaylor] PHOENIX-1032 Don't compile alternate plans if query is a point lookup



Apache-Phoenix | 3.0 | Hadoop1 | Build Successful

2014-06-08 Thread Apache Jenkins Server
3.0 branch build status Successful
Source repository https://git-wip-us.apache.org/repos/asf/phoenix.git

Last Successful Compiled Artifacts https://builds.apache.org/job/Phoenix-3.0-hadoop1/lastSuccessfulBuild/artifact/

Last Complete Test Report https://builds.apache.org/job/Phoenix-3.0-hadoop1/lastCompletedBuild/testReport/

Changes
[jtaylor] PHOENIX-1033 Update HBase version to 0.94.19 to make connection to secure cluster easier

[jtaylor] PHOENIX-1032 Don't compile alternate plans if query is a point lookup



git commit: PHOENIX-1032 Don't compile alternate plans if query is a point lookup

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/master 156e6cc2d -> 318921eae


PHOENIX-1032 Don't compile alternate plans if query is a point lookup


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/318921ea
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/318921ea
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/318921ea

Branch: refs/heads/master
Commit: 318921eaef93dd5315c144b0f9c22dcf4febabc2
Parents: 156e6cc
Author: James Taylor 
Authored: Sun Jun 8 11:40:15 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 11:40:15 2014 -0700

--
 .../java/org/apache/phoenix/optimize/QueryOptimizer.java| 9 +
 1 file changed, 5 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/318921ea/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java 
b/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java
index 76276e4..53e6939 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java
@@ -76,11 +76,12 @@ public class QueryOptimizer {
 }
 
 public QueryPlan optimize(QueryPlan dataPlan, PhoenixStatement statement, 
List targetColumns, ParallelIteratorFactory 
parallelIteratorFactory) throws SQLException {
-// Get the statement as it's been normalized now
-// TODO: the recompile for the index tables could skip the normalize 
step
 SelectStatement select = (SelectStatement)dataPlan.getStatement();
-// TODO: consider not even compiling index plans if we have a point 
lookup
-if (!useIndexes || select.isJoin() || 
dataPlan.getContext().getResolver().getTables().size() > 1) {
+// Exit early if we have a point lookup as we can't get better than 
that
+if (!useIndexes 
+|| select.isJoin() 
+|| dataPlan.getContext().getResolver().getTables().size() > 1
+|| dataPlan.getContext().getScanRanges().isPointLookup()) {
 return dataPlan;
 }
 PTable dataTable = dataPlan.getTableRef().getTable();



Apache-Phoenix | 4.0 | Hadoop1 | Build Successful

2014-06-08 Thread Apache Jenkins Server
4.0 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf/incubator-phoenix.git

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.0-hadoop1/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.0-hadoop1/lastCompletedBuild/testReport/

Changes
[jtaylor] PHOENIX-19 Enhance JDBC connection of Phoenix to support connecting to a Secure HBase cluster (Anil Gupta)



Apache-Phoenix | Master | Hadoop1 | Build Successful

2014-06-08 Thread Apache Jenkins Server
Master branch build status Successful
Source repository https://git-wip-us.apache.org/repos/asf/incubator-phoenix.git

Last Successful Compiled Artifacts https://builds.apache.org/job/Phoenix-master-hadoop1/lastSuccessfulBuild/artifact/

Last Complete Test Report https://builds.apache.org/job/Phoenix-master-hadoop1/lastCompletedBuild/testReport/

Changes
[jtaylor] PHOENIX-19 Enhance JDBC connection of Phoenix to support connecting to a Secure HBase cluster (Anil Gupta)



Apache-Phoenix | 3.0 | Hadoop1 | Build Successful

2014-06-08 Thread Apache Jenkins Server
3.0 branch build status Successful
Source repository https://git-wip-us.apache.org/repos/asf/phoenix.git

Last Successful Compiled Artifacts https://builds.apache.org/job/Phoenix-3.0-hadoop1/lastSuccessfulBuild/artifact/

Last Complete Test Report https://builds.apache.org/job/Phoenix-3.0-hadoop1/lastCompletedBuild/testReport/

Changes
[jtaylor] PHOENIX-19 Enhance JDBC connection of Phoenix to support connecting to a Secure HBase cluster (Anil Gupta)



git commit: PHOENIX-1032 Don't compile alternate plans if query is a point lookup

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/4.0 40bc58a5a -> 75db322e2


PHOENIX-1032 Don't compile alternate plans if query is a point lookup


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/75db322e
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/75db322e
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/75db322e

Branch: refs/heads/4.0
Commit: 75db322e20b1a7928aa70e90b59d183cd2745c6a
Parents: 40bc58a
Author: James Taylor 
Authored: Sun Jun 8 11:36:12 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 11:36:12 2014 -0700

--
 .../java/org/apache/phoenix/optimize/QueryOptimizer.java| 9 +
 1 file changed, 5 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/75db322e/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java 
b/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java
index 76276e4..53e6939 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java
@@ -76,11 +76,12 @@ public class QueryOptimizer {
 }
 
 public QueryPlan optimize(QueryPlan dataPlan, PhoenixStatement statement, 
List targetColumns, ParallelIteratorFactory 
parallelIteratorFactory) throws SQLException {
-// Get the statement as it's been normalized now
-// TODO: the recompile for the index tables could skip the normalize 
step
 SelectStatement select = (SelectStatement)dataPlan.getStatement();
-// TODO: consider not even compiling index plans if we have a point 
lookup
-if (!useIndexes || select.isJoin() || 
dataPlan.getContext().getResolver().getTables().size() > 1) {
+// Exit early if we have a point lookup as we can't get better than 
that
+if (!useIndexes 
+|| select.isJoin() 
+|| dataPlan.getContext().getResolver().getTables().size() > 1
+|| dataPlan.getContext().getScanRanges().isPointLookup()) {
 return dataPlan;
 }
 PTable dataTable = dataPlan.getTableRef().getTable();



git commit: PHOENIX-1032 Don't compile alternate plans if query is a point lookup

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/3.0 b2041c2af -> bf395def6


PHOENIX-1032 Don't compile alternate plans if query is a point lookup


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/bf395def
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/bf395def
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/bf395def

Branch: refs/heads/3.0
Commit: bf395def638b12fb07c5843616d8d2d34428435b
Parents: b2041c2
Author: James Taylor 
Authored: Sun Jun 8 11:35:23 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 11:35:23 2014 -0700

--
 .../java/org/apache/phoenix/optimize/QueryOptimizer.java| 9 +
 1 file changed, 5 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/bf395def/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java 
b/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java
index 83d1395..0a01152 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java
@@ -76,11 +76,12 @@ public class QueryOptimizer {
 }
 
 public QueryPlan optimize(QueryPlan dataPlan, PhoenixStatement statement, 
List targetColumns, ParallelIteratorFactory 
parallelIteratorFactory) throws SQLException {
-// Get the statement as it's been normalized now
-// TODO: the recompile for the index tables could skip the normalize 
step
 SelectStatement select = (SelectStatement)dataPlan.getStatement();
-// TODO: consider not even compiling index plans if we have a point 
lookup
-if (!useIndexes || select.isJoin() || 
dataPlan.getContext().getResolver().getTables().size() > 1) {
+// Exit early if we have a point lookup as we can't get better than 
that
+if (!useIndexes 
+|| select.isJoin() 
+|| dataPlan.getContext().getResolver().getTables().size() > 1
+|| dataPlan.getContext().getScanRanges().isPointLookup()) {
 return dataPlan;
 }
 PTable dataTable = dataPlan.getTableRef().getTable();



git commit: PHOENIX-1033 Update HBase version to 0.94.19 to make connection to secure cluster easier

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/3.0 e799a0f36 -> b2041c2af


PHOENIX-1033 Update HBase version to 0.94.19 to make connection to secure 
cluster easier


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/b2041c2a
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/b2041c2a
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/b2041c2a

Branch: refs/heads/3.0
Commit: b2041c2afad89973ade53c7089491fe2693b0c54
Parents: e799a0f
Author: James Taylor 
Authored: Sun Jun 8 11:25:39 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 11:25:39 2014 -0700

--
 pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/b2041c2a/pom.xml
--
diff --git a/pom.xml b/pom.xml
index 0d37a55..2fe91f5 100644
--- a/pom.xml
+++ b/pom.xml
@@ -75,7 +75,7 @@
 2.0.4-alpha
 
 
-0.94.14
+0.94.19
 1.2
 1.0.4
 0.12.0



Apache-Phoenix | Master | Hadoop1 | Build Successful

2014-06-08 Thread Apache Jenkins Server
Master branch build status Successful
Source repository https://git-wip-us.apache.org/repos/asf/incubator-phoenix.git

Last Successful Compiled Artifacts https://builds.apache.org/job/Phoenix-master-hadoop1/lastSuccessfulBuild/artifact/

Last Complete Test Report https://builds.apache.org/job/Phoenix-master-hadoop1/lastCompletedBuild/testReport/

Changes
[jtaylor] PHOENIX-19 Enhance JDBC connection of Phoenix to support connecting to a Secure HBase cluster (Anil Gupta)



Apache-Phoenix | 4.0 | Hadoop1 | Build Successful

2014-06-08 Thread Apache Jenkins Server
4.0 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf/incubator-phoenix.git

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.0-hadoop1/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.0-hadoop1/lastCompletedBuild/testReport/

Changes
[jtaylor] PHOENIX-19 Enhance JDBC connection of Phoenix to support connecting to a Secure HBase cluster (Anil Gupta)



git commit: PHOENIX-19 Enhance JDBC connection of Phoenix to support connecting to a Secure HBase cluster (Anil Gupta)

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/master dddcffad2 -> 156e6cc2d


PHOENIX-19 Enhance JDBC connection of Phoenix to support connecting to a Secure 
HBase cluster (Anil Gupta)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/156e6cc2
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/156e6cc2
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/156e6cc2

Branch: refs/heads/master
Commit: 156e6cc2d9e0fd1882f806f50f50c2a0d8c7244a
Parents: dddcffa
Author: James Taylor 
Authored: Sun Jun 8 11:08:41 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 11:09:39 2014 -0700

--
 .../org/apache/phoenix/query/ConnectionQueryServicesImpl.java| 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/156e6cc2/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
 
b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
index acb2239..f16bdac 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
@@ -255,8 +255,8 @@ public class ConnectionQueryServicesImpl extends 
DelegateQueryServices implement
 private void openConnection() throws SQLException {
 try {
 // check if we need to authenticate with kerberos
-String clientKeytab = config.get(HBASE_CLIENT_KEYTAB);
-String clientPrincipal = config.get(HBASE_CLIENT_PRINCIPAL);
+String clientKeytab = this.getProps().get(HBASE_CLIENT_KEYTAB);
+String clientPrincipal = 
this.getProps().get(HBASE_CLIENT_PRINCIPAL);
 if (clientKeytab != null && clientPrincipal != null) {
 logger.info("Trying to connect to a secure cluster with 
keytab:" + clientKeytab);
 UserGroupInformation.setConfiguration(config);
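
A hedged client-side sketch of the change above: openConnection() now reads the keytab and principal from the connection properties, so a caller can pass them to DriverManager. The principal, keytab path, and quorum below are hypothetical, and the assumption is that the HBASE_CLIENT_KEYTAB / HBASE_CLIENT_PRINCIPAL constants referenced in the diff are the ones added to QueryServices in this commit's file list.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.util.Properties;

    import org.apache.phoenix.query.QueryServices;

    public class SecureConnectWithProps {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            // Assumed location of the constants; the values below are hypothetical.
            props.setProperty(QueryServices.HBASE_CLIENT_PRINCIPAL, "user@EXAMPLE.COM");
            props.setProperty(QueryServices.HBASE_CLIENT_KEYTAB, "/etc/security/keytabs/user.keytab");
            try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zk1.example.com", props)) {
                System.out.println("Connected after Kerberos login: " + !conn.isClosed());
            }
        }
    }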



git commit: PHOENIX-19 Enhance JDBC connection of Phoenix to support connecting to a Secure HBase cluster (Anil Gupta)

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/4.0 3ef7df14c -> 40bc58a5a


PHOENIX-19 Enhance JDBC connection of Phoenix to support connecting to a Secure 
HBase cluster (Anil Gupta)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/40bc58a5
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/40bc58a5
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/40bc58a5

Branch: refs/heads/4.0
Commit: 40bc58a5a1aa5eeebb17e3d33c37b70997aaecb1
Parents: 3ef7df1
Author: James Taylor 
Authored: Sun Jun 8 11:08:41 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 11:08:41 2014 -0700

--
 .../org/apache/phoenix/query/ConnectionQueryServicesImpl.java| 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/40bc58a5/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
 
b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
index 716ba3d..be118c5 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
@@ -255,8 +255,8 @@ public class ConnectionQueryServicesImpl extends 
DelegateQueryServices implement
 private void openConnection() throws SQLException {
 try {
 // check if we need to authenticate with kerberos
-String clientKeytab = config.get(HBASE_CLIENT_KEYTAB);
-String clientPrincipal = config.get(HBASE_CLIENT_PRINCIPAL);
+String clientKeytab = this.getProps().get(HBASE_CLIENT_KEYTAB);
+String clientPrincipal = 
this.getProps().get(HBASE_CLIENT_PRINCIPAL);
 if (clientKeytab != null && clientPrincipal != null) {
 logger.info("Trying to connect to a secure cluster with 
keytab:" + clientKeytab);
 UserGroupInformation.setConfiguration(config);



git commit: PHOENIX-19 Enhance JDBC connection of Phoenix to support connecting to a Secure HBase cluster (Anil Gupta)

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/3.0 1f895e204 -> e799a0f36


PHOENIX-19 Enhance JDBC connection of Phoenix to support connecting to a Secure 
HBase cluster (Anil Gupta)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/e799a0f3
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/e799a0f3
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/e799a0f3

Branch: refs/heads/3.0
Commit: e799a0f360025071b00345aafc043411c2934bdd
Parents: 1f895e2
Author: James Taylor 
Authored: Sun Jun 8 11:03:35 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 11:03:35 2014 -0700

--
 .../apache/phoenix/compile/DeleteCompiler.java  |   1 -
 .../phoenix/jdbc/PhoenixEmbeddedDriver.java | 100 ---
 .../query/ConnectionQueryServicesImpl.java  |  31 --
 .../org/apache/phoenix/query/QueryServices.java |   3 +
 .../org/apache/phoenix/util/PhoenixRuntime.java |  34 ---
 .../phoenix/jdbc/PhoenixEmbeddedDriverTest.java |   9 ++
 .../apache/phoenix/util/PhoenixRuntimeTest.java |   4 +-
 7 files changed, 141 insertions(+), 41 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/e799a0f3/phoenix-core/src/main/java/org/apache/phoenix/compile/DeleteCompiler.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/compile/DeleteCompiler.java 
b/phoenix-core/src/main/java/org/apache/phoenix/compile/DeleteCompiler.java
index c4a574a..1f0eef3 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/compile/DeleteCompiler.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/compile/DeleteCompiler.java
@@ -28,7 +28,6 @@ import java.util.Map;
 
 import org.apache.hadoop.hbase.KeyValue;
 import org.apache.hadoop.hbase.client.Scan;
-import org.apache.hadoop.hbase.filter.FilterList;
 import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
 import org.apache.phoenix.cache.ServerCacheClient.ServerCache;
 import org.apache.phoenix.compile.GroupByCompiler.GroupBy;

http://git-wip-us.apache.org/repos/asf/phoenix/blob/e799a0f3/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixEmbeddedDriver.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixEmbeddedDriver.java 
b/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixEmbeddedDriver.java
index 8cfe3c2..10c24b8 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixEmbeddedDriver.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixEmbeddedDriver.java
@@ -169,7 +169,7 @@ public abstract class PhoenixEmbeddedDriver implements 
Driver, org.apache.phoeni
 StringTokenizer tokenizer = new StringTokenizer(url == null ? "" : 
url.substring(PhoenixRuntime.JDBC_PROTOCOL.length()),DELIMITERS, true);
 int i = 0;
 boolean isMalformedUrl = false;
-String[] tokens = new String[3];
+String[] tokens = new String[5];
 String token = null;
 while (tokenizer.hasMoreTokens() && 
!(token=tokenizer.nextToken()).equals(TERMINATOR) && tokenizer.hasMoreTokens() 
&& i < tokens.length) {
 token = tokenizer.nextToken();
@@ -188,14 +188,41 @@ public abstract class PhoenixEmbeddedDriver implements 
Driver, org.apache.phoeni
 try {
 port = Integer.parseInt(tokens[1]);
 isMalformedUrl = port < 0;
+if(i == 4){
+   if(!tokens[2].endsWith(".keytab")){
+   isMalformedUrl = true;
+   }
+   tokens[4] = tokens[3];
+   tokens[3] = tokens[2];
+   tokens[2] = null;
+}
 } catch (NumberFormatException e) {
 // If we have 3 tokens, then the second one must be a 
port.
 // If we only have 2 tokens, the second one might be 
the root node:
 // Assume that is the case if we get a 
NumberFormatException
-if (! (isMalformedUrl = i == 3) ) {
+if (!tokens[1].startsWith("/")) {
+isMalformedUrl = true;
+}
+if (i == 2) {
+tokens[4] = null;
+tokens[3] = null;
+tokens[2] = tokens[1];
+tokens[1] = null;
+} else if (i == 3) {
+tokens[4] = tokens[2];
+tokens[3] = tokens[1];

git commit: PHOENIX-19 Enhance JDBC connection of Phoenix to support connecting to a Secure HBase cluster (Anil Gupta)

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/master 437bf6881 -> dddcffad2


PHOENIX-19 Enhance JDBC connection of Phoenix to support connecting to a Secure 
HBase cluster (Anil Gupta)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/dddcffad
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/dddcffad
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/dddcffad

Branch: refs/heads/master
Commit: dddcffad28c1a338e63dfa476619fec3f47db489
Parents: 437bf68
Author: James Taylor 
Authored: Sun Jun 8 10:45:38 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 10:52:34 2014 -0700

--
 .../phoenix/jdbc/PhoenixEmbeddedDriver.java | 100 ---
 .../query/ConnectionQueryServicesImpl.java  |  19 +++-
 .../org/apache/phoenix/query/QueryServices.java |   3 +
 .../phoenix/jdbc/PhoenixEmbeddedDriverTest.java |   9 ++
 4 files changed, 117 insertions(+), 14 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/dddcffad/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixEmbeddedDriver.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixEmbeddedDriver.java 
b/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixEmbeddedDriver.java
index 8cfe3c2..10c24b8 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixEmbeddedDriver.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixEmbeddedDriver.java
@@ -169,7 +169,7 @@ public abstract class PhoenixEmbeddedDriver implements 
Driver, org.apache.phoeni
 StringTokenizer tokenizer = new StringTokenizer(url == null ? "" : 
url.substring(PhoenixRuntime.JDBC_PROTOCOL.length()),DELIMITERS, true);
 int i = 0;
 boolean isMalformedUrl = false;
-String[] tokens = new String[3];
+String[] tokens = new String[5];
 String token = null;
 while (tokenizer.hasMoreTokens() && 
!(token=tokenizer.nextToken()).equals(TERMINATOR) && tokenizer.hasMoreTokens() 
&& i < tokens.length) {
 token = tokenizer.nextToken();
@@ -188,14 +188,41 @@ public abstract class PhoenixEmbeddedDriver implements 
Driver, org.apache.phoeni
 try {
 port = Integer.parseInt(tokens[1]);
 isMalformedUrl = port < 0;
+if(i == 4){
+   if(!tokens[2].endsWith(".keytab")){
+   isMalformedUrl = true;
+   }
+   tokens[4] = tokens[3];
+   tokens[3] = tokens[2];
+   tokens[2] = null;
+}
 } catch (NumberFormatException e) {
 // If we have 3 tokens, then the second one must be a 
port.
 // If we only have 2 tokens, the second one might be 
the root node:
 // Assume that is the case if we get a 
NumberFormatException
-if (! (isMalformedUrl = i == 3) ) {
+if (!tokens[1].startsWith("/")) {
+isMalformedUrl = true;
+}
+if (i == 2) {
+tokens[4] = null;
+tokens[3] = null;
+tokens[2] = tokens[1];
+tokens[1] = null;
+} else if (i == 3) {
+tokens[4] = tokens[2];
+tokens[3] = tokens[1];
+tokens[2] = null;
+tokens[1] = null;
+} else if (i == 4) {
+tokens[4] = tokens[3];
+tokens[3] = tokens[2];
+tokens[2] = tokens[1];
+tokens[1] = null;
+} else if (i == 5) {
+tokens[4] = tokens[3];
+tokens[3] = tokens[2];
 tokens[2] = tokens[1];
 }
-
 }
 }
 }
@@ -203,13 +230,15 @@ public abstract class PhoenixEmbeddedDriver implements 
Driver, org.apache.phoeni
 throw new 
SQLExceptionInfo.Builder(SQLExceptionCode.MALFORMED_CONNECTION_URL)
 .setMessage(url).build().buildException();
 }
-return new ConnectionInfo(tokens[0],port,tokens[2]);
+return new ConnectionInfo(tokens[0],port,tokens[2], tokens[3], 
tokens[4]);
 }
  

git commit: PHOENIX-19 Enhance JDBC connection of Phoenix to support connecting to a Secure HBase cluster (Anil Gupta)

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/4.0 2d9347e0b -> 3ef7df14c


PHOENIX-19 Enhance JDBC connection of Phoenix to support connecting to a Secure 
HBase cluster (Anil Gupta)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/3ef7df14
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/3ef7df14
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/3ef7df14

Branch: refs/heads/4.0
Commit: 3ef7df14ce125996ce77d1bf9e4e06abba93
Parents: 2d9347e
Author: James Taylor 
Authored: Sun Jun 8 10:45:38 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 10:45:38 2014 -0700

--
 .../phoenix/jdbc/PhoenixEmbeddedDriver.java | 100 ---
 .../query/ConnectionQueryServicesImpl.java  |  16 ++-
 .../org/apache/phoenix/query/QueryServices.java |   3 +
 .../phoenix/jdbc/PhoenixEmbeddedDriverTest.java |   9 ++
 4 files changed, 112 insertions(+), 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/3ef7df14/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixEmbeddedDriver.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixEmbeddedDriver.java 
b/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixEmbeddedDriver.java
index 8cfe3c2..10c24b8 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixEmbeddedDriver.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixEmbeddedDriver.java
@@ -169,7 +169,7 @@ public abstract class PhoenixEmbeddedDriver implements 
Driver, org.apache.phoeni
 StringTokenizer tokenizer = new StringTokenizer(url == null ? "" : 
url.substring(PhoenixRuntime.JDBC_PROTOCOL.length()),DELIMITERS, true);
 int i = 0;
 boolean isMalformedUrl = false;
-String[] tokens = new String[3];
+String[] tokens = new String[5];
 String token = null;
 while (tokenizer.hasMoreTokens() && 
!(token=tokenizer.nextToken()).equals(TERMINATOR) && tokenizer.hasMoreTokens() 
&& i < tokens.length) {
 token = tokenizer.nextToken();
@@ -188,14 +188,41 @@ public abstract class PhoenixEmbeddedDriver implements 
Driver, org.apache.phoeni
 try {
 port = Integer.parseInt(tokens[1]);
 isMalformedUrl = port < 0;
+if(i == 4){
+   if(!tokens[2].endsWith(".keytab")){
+   isMalformedUrl = true;
+   }
+   tokens[4] = tokens[3];
+   tokens[3] = tokens[2];
+   tokens[2] = null;
+}
 } catch (NumberFormatException e) {
 // If we have 3 tokens, then the second one must be a 
port.
 // If we only have 2 tokens, the second one might be 
the root node:
 // Assume that is the case if we get a 
NumberFormatException
-if (! (isMalformedUrl = i == 3) ) {
+if (!tokens[1].startsWith("/")) {
+isMalformedUrl = true;
+}
+if (i == 2) {
+tokens[4] = null;
+tokens[3] = null;
+tokens[2] = tokens[1];
+tokens[1] = null;
+} else if (i == 3) {
+tokens[4] = tokens[2];
+tokens[3] = tokens[1];
+tokens[2] = null;
+tokens[1] = null;
+} else if (i == 4) {
+tokens[4] = tokens[3];
+tokens[3] = tokens[2];
+tokens[2] = tokens[1];
+tokens[1] = null;
+} else if (i == 5) {
+tokens[4] = tokens[3];
+tokens[3] = tokens[2];
 tokens[2] = tokens[1];
 }
-
 }
 }
 }
@@ -203,13 +230,15 @@ public abstract class PhoenixEmbeddedDriver implements 
Driver, org.apache.phoeni
 throw new 
SQLExceptionInfo.Builder(SQLExceptionCode.MALFORMED_CONNECTION_URL)
 .setMessage(url).build().buildException();
 }
-return new ConnectionInfo(tokens[0],port,tokens[2]);
+return new ConnectionInfo(tokens[0],port,tokens[2], tokens[3], 
tokens[4]);
 }
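
Taken together with the ConnectionQueryServicesImpl change, the tokenizer rework above lets the JDBC URL itself carry the Kerberos principal and a keytab path (the keytab segment must end in ".keytab" per the parser). A hedged sketch with a hypothetical quorum, principal, and keytab; the ordering of the optional segments is inferred from the five-slot tokens array and the widened ConnectionInfo constructor.

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class SecureConnectWithUrl {
        public static void main(String[] args) throws Exception {
            // Assumed shape: jdbc:phoenix:<zk quorum>:<port>:<root node>:<principal>:<keytab file>
            String url = "jdbc:phoenix:zk1.example.com:2181:/hbase:"
                    + "user@EXAMPLE.COM:/etc/security/keytabs/user.keytab";
            try (Connection conn = DriverManager.getConnection(url)) {
                System.out.println("Connected: " + !conn.isClosed());
            }
        }
    }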
 

Apache-Phoenix | Master | Hadoop1 | Build Successful

2014-06-08 Thread Apache Jenkins Server
Master branch build status Successful
Source repository https://git-wip-us.apache.org/repos/asf/incubator-phoenix.git

Last Successful Compiled Artifacts https://builds.apache.org/job/Phoenix-master-hadoop1/lastSuccessfulBuild/artifact/

Last Complete Test Report https://builds.apache.org/job/Phoenix-master-hadoop1/lastCompletedBuild/testReport/

Changes
[jtaylor] PHOENIX-939 Generalize SELECT expressions for Pig Loader (Ravi)



Apache-Phoenix | 3.0 | Hadoop1 | Build Successful

2014-06-08 Thread Apache Jenkins Server
3.0 branch build status Successful
Source repository https://git-wip-us.apache.org/repos/asf/phoenix.git

Last Successful Compiled Artifacts https://builds.apache.org/job/Phoenix-3.0-hadoop1/lastSuccessfulBuild/artifact/

Last Complete Test Report https://builds.apache.org/job/Phoenix-3.0-hadoop1/lastCompletedBuild/testReport/

Changes
[jtaylor] PHOENIX-939 Generalize SELECT expressions for Pig Loader (Ravi)



Jenkins build is back to normal : Phoenix | 4.0 | Hadoop1 #173

2014-06-08 Thread Apache Jenkins Server
See 



git commit: PHOENIX-939 Generalize SELECT expressions for Pig Loader (Ravi)

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/master 3fd64258a -> 437bf6881


PHOENIX-939 Generalize SELECT expressions for Pig Loader (Ravi)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/437bf688
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/437bf688
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/437bf688

Branch: refs/heads/master
Commit: 437bf6881eea425c28f67f148f5be6fc3093fddc
Parents: 3fd6425
Author: James Taylor 
Authored: Sun Jun 8 00:33:20 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 00:41:27 2014 -0700

--
 .../phoenix/pig/PhoenixHBaseLoaderIT.java   | 84 +++
 .../apache/phoenix/pig/PhoenixHBaseLoader.java  |  3 +
 .../phoenix/pig/PhoenixPigConfiguration.java| 61 +++---
 .../phoenix/pig/hadoop/PhoenixInputFormat.java  |  5 ++
 .../phoenix/pig/hadoop/PhoenixRecordReader.java | 17 ++--
 .../phoenix/pig/util/PhoenixPigSchemaUtil.java  | 13 ++-
 .../pig/util/SqlQueryToColumnInfoFunction.java  | 85 
 .../util/SqlQueryToColumnInfoFunctionTest.java  | 75 +
 8 files changed, 327 insertions(+), 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/437bf688/phoenix-pig/src/it/java/org/apache/phoenix/pig/PhoenixHBaseLoaderIT.java
--
diff --git 
a/phoenix-pig/src/it/java/org/apache/phoenix/pig/PhoenixHBaseLoaderIT.java 
b/phoenix-pig/src/it/java/org/apache/phoenix/pig/PhoenixHBaseLoaderIT.java
index 28afb9a..9f118a6 100644
--- a/phoenix-pig/src/it/java/org/apache/phoenix/pig/PhoenixHBaseLoaderIT.java
+++ b/phoenix-pig/src/it/java/org/apache/phoenix/pig/PhoenixHBaseLoaderIT.java
@@ -422,6 +422,90 @@ public class PhoenixHBaseLoaderIT {
 assertEquals(0, rs.getInt("MIN_SAL"));
 assertEquals(270, rs.getInt("MAX_SAL"));
 }
+   
+   /**
+ * Test for Sequence
+ * @throws Exception
+ */
+@Test
+public void testDataForSQLQueryWithSequences() throws Exception {
+
+ //create the table
+ String ddl = "CREATE TABLE " + TABLE_FULL_NAME
++ " (ID INTEGER NOT NULL PRIMARY KEY, NAME VARCHAR, AGE 
INTEGER) ";
+
+conn.createStatement().execute(ddl);
+
+String sequenceDdl = "CREATE SEQUENCE my_sequence";
+
+conn.createStatement().execute(sequenceDdl);
+   
+//prepare data with 10 rows having age 25 and the other 30.
+final String dml = "UPSERT INTO " + TABLE_FULL_NAME + " VALUES(?,?,?)";
+PreparedStatement stmt = conn.prepareStatement(dml);
+int rows = 20;
+for(int i = 0 ; i < rows; i++) {
+stmt.setInt(1, i);
+stmt.setString(2, "a"+i);
+stmt.setInt(3, (i % 2 == 0) ? 25 : 30);
+stmt.execute();
+}
+conn.commit();
+
+//sql query load data and filter rows whose age is > 25
+final String sqlQuery = " SELECT NEXT VALUE FOR my_sequence AS 
my_seq,ID,NAME,AGE FROM " + TABLE_FULL_NAME + " WHERE AGE > 25";
+pigServer.registerQuery(String.format(
+"A = load 'hbase://query/%s' using 
org.apache.phoenix.pig.PhoenixHBaseLoader('%s');", sqlQuery,
+zkQuorum));
+
+
+Iterator<Tuple> iterator = pigServer.openIterator("A");
+int recordsRead = 0;
+while (iterator.hasNext()) {
+Tuple tuple = iterator.next();
+System.out.println(" the field value is "+tuple.get(1));
+recordsRead++;
+}
+assertEquals(rows/2, recordsRead);
+}
+   
+@Test
+public void testDataForSQLQueryWithFunctions() throws Exception {
+
+ //create the table
+ String ddl = "CREATE TABLE " + TABLE_FULL_NAME
++ " (ID INTEGER NOT NULL PRIMARY KEY, NAME VARCHAR) ";
+
+conn.createStatement().execute(ddl);
+
+final String dml = "UPSERT INTO " + TABLE_FULL_NAME + " VALUES(?,?)";
+PreparedStatement stmt = conn.prepareStatement(dml);
+int rows = 20;
+for(int i = 0 ; i < rows; i++) {
+stmt.setInt(1, i);
+stmt.setString(2, "a"+i);
+stmt.execute();
+}
+conn.commit();
+
+//sql query
+final String sqlQuery = " SELECT UPPER(NAME) AS n FROM " + 
TABLE_FULL_NAME + " ORDER BY ID" ;
+
+pigServer.registerQuery(String.format(
+"A = load 'hbase://query/%s' using 
org.apache.phoenix.pig.PhoenixHBaseLoader('%s');", sqlQuery,
+zkQuorum));
+
+
+Iterator<Tuple> iterator = pigServer.openIterator("A");
+int i = 0;
+while (iterator.hasNext()) 
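
The integration tests in this patch drive the generalized loader end to end; distilled to a minimal standalone sketch (hypothetical table, sequence, and quorum names, assumes a running Phoenix/HBase setup), the hbase://query form now accepts arbitrary SELECT expressions such as sequences and built-in functions.

    import java.util.Properties;

    import org.apache.pig.ExecType;
    import org.apache.pig.PigServer;

    public class PigLoaderQueryExample {
        public static void main(String[] args) throws Exception {
            String zkQuorum = "localhost";  // hypothetical ZooKeeper quorum
            PigServer pigServer = new PigServer(ExecType.LOCAL, new Properties());
            // Sequences (NEXT VALUE FOR) and functions (UPPER) are now legal in the query.
            String sqlQuery = "SELECT NEXT VALUE FOR my_sequence AS my_seq, ID, UPPER(NAME) AS N"
                    + " FROM my_table";
            pigServer.registerQuery(String.format(
                    "A = load 'hbase://query/%s' using org.apache.phoenix.pig.PhoenixHBaseLoader('%s');",
                    sqlQuery, zkQuorum));
            pigServer.dumpSchema("A");  // inspect the schema derived from the SELECT expressions
        }
    }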

git commit: PHOENIX-939 Generalize SELECT expressions for Pig Loader (Ravi)

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/4.0 c75eae1a9 -> 2d9347e0b


PHOENIX-939 Generalize SELECT expressions for Pig Loader (Ravi)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/2d9347e0
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/2d9347e0
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/2d9347e0

Branch: refs/heads/4.0
Commit: 2d9347e0ba022ebc27143348de590baf3cc1b234
Parents: c75eae1
Author: James Taylor 
Authored: Sun Jun 8 00:33:20 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 00:34:39 2014 -0700

--
 .../phoenix/pig/PhoenixHBaseLoaderIT.java   | 84 +++
 .../apache/phoenix/pig/PhoenixHBaseLoader.java  |  3 +
 .../phoenix/pig/PhoenixPigConfiguration.java| 61 +++---
 .../phoenix/pig/hadoop/PhoenixInputFormat.java  |  5 ++
 .../phoenix/pig/hadoop/PhoenixRecordReader.java | 17 ++--
 .../phoenix/pig/util/PhoenixPigSchemaUtil.java  | 13 ++-
 .../pig/util/SqlQueryToColumnInfoFunction.java  | 85 
 .../util/SqlQueryToColumnInfoFunctionTest.java  | 75 +
 8 files changed, 327 insertions(+), 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/2d9347e0/phoenix-pig/src/it/java/org/apache/phoenix/pig/PhoenixHBaseLoaderIT.java
--
diff --git 
a/phoenix-pig/src/it/java/org/apache/phoenix/pig/PhoenixHBaseLoaderIT.java 
b/phoenix-pig/src/it/java/org/apache/phoenix/pig/PhoenixHBaseLoaderIT.java
index 28afb9a..9f118a6 100644
--- a/phoenix-pig/src/it/java/org/apache/phoenix/pig/PhoenixHBaseLoaderIT.java
+++ b/phoenix-pig/src/it/java/org/apache/phoenix/pig/PhoenixHBaseLoaderIT.java
@@ -422,6 +422,90 @@ public class PhoenixHBaseLoaderIT {
 assertEquals(0, rs.getInt("MIN_SAL"));
 assertEquals(270, rs.getInt("MAX_SAL"));
 }
+   
+   /**
+ * Test for Sequence
+ * @throws Exception
+ */
+@Test
+public void testDataForSQLQueryWithSequences() throws Exception {
+
+ //create the table
+ String ddl = "CREATE TABLE " + TABLE_FULL_NAME
++ " (ID INTEGER NOT NULL PRIMARY KEY, NAME VARCHAR, AGE 
INTEGER) ";
+
+conn.createStatement().execute(ddl);
+
+String sequenceDdl = "CREATE SEQUENCE my_sequence";
+
+conn.createStatement().execute(sequenceDdl);
+   
+//prepare data with 10 rows having age 25 and the other 30.
+final String dml = "UPSERT INTO " + TABLE_FULL_NAME + " VALUES(?,?,?)";
+PreparedStatement stmt = conn.prepareStatement(dml);
+int rows = 20;
+for(int i = 0 ; i < rows; i++) {
+stmt.setInt(1, i);
+stmt.setString(2, "a"+i);
+stmt.setInt(3, (i % 2 == 0) ? 25 : 30);
+stmt.execute();
+}
+conn.commit();
+
+//sql query load data and filter rows whose age is > 25
+final String sqlQuery = " SELECT NEXT VALUE FOR my_sequence AS 
my_seq,ID,NAME,AGE FROM " + TABLE_FULL_NAME + " WHERE AGE > 25";
+pigServer.registerQuery(String.format(
+"A = load 'hbase://query/%s' using 
org.apache.phoenix.pig.PhoenixHBaseLoader('%s');", sqlQuery,
+zkQuorum));
+
+
+Iterator<Tuple> iterator = pigServer.openIterator("A");
+int recordsRead = 0;
+while (iterator.hasNext()) {
+Tuple tuple = iterator.next();
+System.out.println(" the field value is "+tuple.get(1));
+recordsRead++;
+}
+assertEquals(rows/2, recordsRead);
+}
+   
+@Test
+public void testDataForSQLQueryWithFunctions() throws Exception {
+
+ //create the table
+ String ddl = "CREATE TABLE " + TABLE_FULL_NAME
++ " (ID INTEGER NOT NULL PRIMARY KEY, NAME VARCHAR) ";
+
+conn.createStatement().execute(ddl);
+
+final String dml = "UPSERT INTO " + TABLE_FULL_NAME + " VALUES(?,?)";
+PreparedStatement stmt = conn.prepareStatement(dml);
+int rows = 20;
+for(int i = 0 ; i < rows; i++) {
+stmt.setInt(1, i);
+stmt.setString(2, "a"+i);
+stmt.execute();
+}
+conn.commit();
+
+//sql query
+final String sqlQuery = " SELECT UPPER(NAME) AS n FROM " + 
TABLE_FULL_NAME + " ORDER BY ID" ;
+
+pigServer.registerQuery(String.format(
+"A = load 'hbase://query/%s' using 
org.apache.phoenix.pig.PhoenixHBaseLoader('%s');", sqlQuery,
+zkQuorum));
+
+
+Iterator<Tuple> iterator = pigServer.openIterator("A");
+int i = 0;
+while (iterator.hasNext()) {
+   

git commit: PHOENIX-939 Generalize SELECT expressions for Pig Loader (Ravi)

2014-06-08 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/3.0 90a518e50 -> 1f895e204


PHOENIX-939 Generalize SELECT expressions for Pig Loader (Ravi)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/1f895e20
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/1f895e20
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/1f895e20

Branch: refs/heads/3.0
Commit: 1f895e204da77df6d927738790116fd09eaf6b88
Parents: 90a518e
Author: James Taylor 
Authored: Sun Jun 8 00:33:20 2014 -0700
Committer: James Taylor 
Committed: Sun Jun 8 00:33:20 2014 -0700

--
 .../phoenix/pig/PhoenixHBaseLoaderIT.java   | 84 +++
 .../apache/phoenix/pig/PhoenixHBaseLoader.java  |  3 +
 .../phoenix/pig/PhoenixPigConfiguration.java| 61 +++---
 .../phoenix/pig/hadoop/PhoenixInputFormat.java  |  5 ++
 .../phoenix/pig/hadoop/PhoenixRecordReader.java | 17 ++--
 .../phoenix/pig/util/PhoenixPigSchemaUtil.java  | 13 ++-
 .../pig/util/SqlQueryToColumnInfoFunction.java  | 85 
 .../util/SqlQueryToColumnInfoFunctionTest.java  | 75 +
 8 files changed, 327 insertions(+), 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/1f895e20/phoenix-pig/src/it/java/org/apache/phoenix/pig/PhoenixHBaseLoaderIT.java
--
diff --git 
a/phoenix-pig/src/it/java/org/apache/phoenix/pig/PhoenixHBaseLoaderIT.java 
b/phoenix-pig/src/it/java/org/apache/phoenix/pig/PhoenixHBaseLoaderIT.java
index 28afb9a..9f118a6 100644
--- a/phoenix-pig/src/it/java/org/apache/phoenix/pig/PhoenixHBaseLoaderIT.java
+++ b/phoenix-pig/src/it/java/org/apache/phoenix/pig/PhoenixHBaseLoaderIT.java
@@ -422,6 +422,90 @@ public class PhoenixHBaseLoaderIT {
 assertEquals(0, rs.getInt("MIN_SAL"));
 assertEquals(270, rs.getInt("MAX_SAL"));
 }
+   
+   /**
+ * Test for Sequence
+ * @throws Exception
+ */
+@Test
+public void testDataForSQLQueryWithSequences() throws Exception {
+
+ //create the table
+ String ddl = "CREATE TABLE " + TABLE_FULL_NAME
++ " (ID INTEGER NOT NULL PRIMARY KEY, NAME VARCHAR, AGE 
INTEGER) ";
+
+conn.createStatement().execute(ddl);
+
+String sequenceDdl = "CREATE SEQUENCE my_sequence";
+
+conn.createStatement().execute(sequenceDdl);
+   
+//prepare data with 10 rows having age 25 and the other 30.
+final String dml = "UPSERT INTO " + TABLE_FULL_NAME + " VALUES(?,?,?)";
+PreparedStatement stmt = conn.prepareStatement(dml);
+int rows = 20;
+for(int i = 0 ; i < rows; i++) {
+stmt.setInt(1, i);
+stmt.setString(2, "a"+i);
+stmt.setInt(3, (i % 2 == 0) ? 25 : 30);
+stmt.execute();
+}
+conn.commit();
+
+//sql query load data and filter rows whose age is > 25
+final String sqlQuery = " SELECT NEXT VALUE FOR my_sequence AS 
my_seq,ID,NAME,AGE FROM " + TABLE_FULL_NAME + " WHERE AGE > 25";
+pigServer.registerQuery(String.format(
+"A = load 'hbase://query/%s' using 
org.apache.phoenix.pig.PhoenixHBaseLoader('%s');", sqlQuery,
+zkQuorum));
+
+
+Iterator<Tuple> iterator = pigServer.openIterator("A");
+int recordsRead = 0;
+while (iterator.hasNext()) {
+Tuple tuple = iterator.next();
+System.out.println(" the field value is "+tuple.get(1));
+recordsRead++;
+}
+assertEquals(rows/2, recordsRead);
+}
+   
+@Test
+public void testDataForSQLQueryWithFunctions() throws Exception {
+
+ //create the table
+ String ddl = "CREATE TABLE " + TABLE_FULL_NAME
++ " (ID INTEGER NOT NULL PRIMARY KEY, NAME VARCHAR) ";
+
+conn.createStatement().execute(ddl);
+
+final String dml = "UPSERT INTO " + TABLE_FULL_NAME + " VALUES(?,?)";
+PreparedStatement stmt = conn.prepareStatement(dml);
+int rows = 20;
+for(int i = 0 ; i < rows; i++) {
+stmt.setInt(1, i);
+stmt.setString(2, "a"+i);
+stmt.execute();
+}
+conn.commit();
+
+//sql query
+final String sqlQuery = " SELECT UPPER(NAME) AS n FROM " + 
TABLE_FULL_NAME + " ORDER BY ID" ;
+
+pigServer.registerQuery(String.format(
+"A = load 'hbase://query/%s' using 
org.apache.phoenix.pig.PhoenixHBaseLoader('%s');", sqlQuery,
+zkQuorum));
+
+
+Iterator<Tuple> iterator = pigServer.openIterator("A");
+int i = 0;
+while (iterator.hasNext()) {
+