Apache-Phoenix | 4.x-HBase-1.1 | Build Successful
4.x-HBase-1.1 branch build status Successful Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.1 Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/lastSuccessfulBuild/artifact/ Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/lastCompletedBuild/testReport/ Changes Build times for last couple of runs. Latest build time is the right-most. | Legend: blue: normal, red: test failure, gray: timeout
phoenix git commit: PHOENIX-2036 PhoenixConfigurationUtil should provide a pre-normalize table name to PhoenixRuntime
Repository: phoenix Updated Branches: refs/heads/master 1e606d579 - 39c982f92 PHOENIX-2036 PhoenixConfigurationUtil should provide a pre-normalize table name to PhoenixRuntime Update phoenix-spark to follow the same normalization requirement. Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/39c982f9 Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/39c982f9 Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/39c982f9 Branch: refs/heads/master Commit: 39c982f923033b97c477464d0c4e27221421774d Parents: 1e606d5 Author: Josh Mahonin jmaho...@gmail.com Authored: Mon Jul 6 19:39:31 2015 -0400 Committer: Josh Mahonin jmaho...@apache.org Committed: Mon Jul 6 19:41:38 2015 -0400 -- phoenix-spark/src/it/resources/setup.sql| 4 +- .../apache/phoenix/spark/PhoenixSparkIT.scala | 58 .../org/apache/phoenix/spark/PhoenixRDD.scala | 24 +++- 3 files changed, 46 insertions(+), 40 deletions(-) -- http://git-wip-us.apache.org/repos/asf/phoenix/blob/39c982f9/phoenix-spark/src/it/resources/setup.sql -- diff --git a/phoenix-spark/src/it/resources/setup.sql b/phoenix-spark/src/it/resources/setup.sql index 40157a2..154a996 100644 --- a/phoenix-spark/src/it/resources/setup.sql +++ b/phoenix-spark/src/it/resources/setup.sql @@ -32,4 +32,6 @@ CREATE TABLE ARRAY_TEST_TABLE (ID BIGINT NOT NULL PRIMARY KEY, VCARRAY VARCHAR[] UPSERT INTO ARRAY_TEST_TABLE (ID, VCARRAY) VALUES (1, ARRAY['String1', 'String2', 'String3']) CREATE TABLE DATE_PREDICATE_TEST_TABLE (ID BIGINT NOT NULL, TIMESERIES_KEY TIMESTAMP NOT NULL CONSTRAINT pk PRIMARY KEY (ID, TIMESERIES_KEY)) UPSERT INTO DATE_PREDICATE_TEST_TABLE (ID, TIMESERIES_KEY) VALUES (1, CAST(CURRENT_TIME() AS TIMESTAMP)) -CREATE TABLE OUTPUT_TEST_TABLE (id BIGINT NOT NULL PRIMARY KEY, col1 VARCHAR, col2 INTEGER, col3 DATE) \ No newline at end of file +CREATE TABLE OUTPUT_TEST_TABLE (id BIGINT NOT NULL PRIMARY KEY, col1 VARCHAR, col2 INTEGER, col3 DATE) +CREATE TABLE 
CUSTOM_ENTITY.z02(id BIGINT NOT NULL PRIMARY KEY) +UPSERT INTO CUSTOM_ENTITY.z02 (id) VALUES(1) \ No newline at end of file http://git-wip-us.apache.org/repos/asf/phoenix/blob/39c982f9/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala -- diff --git a/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala b/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala index 5f256e6..e1c9df4 100644 --- a/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala +++ b/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala @@ -20,9 +20,9 @@ import org.apache.hadoop.conf.Configuration import org.apache.hadoop.hbase.{HBaseConfiguration, HConstants, HBaseTestingUtility} import org.apache.phoenix.end2end.BaseHBaseManagedTimeIT import org.apache.phoenix.query.BaseTest -import org.apache.phoenix.schema.ColumnNotFoundException +import org.apache.phoenix.schema.{TableNotFoundException, ColumnNotFoundException} import org.apache.phoenix.schema.types.PVarchar -import org.apache.phoenix.util.ColumnInfo +import org.apache.phoenix.util.{SchemaUtil, ColumnInfo} import org.apache.spark.sql.{SaveMode, execution, SQLContext} import org.apache.spark.sql.types.{LongType, DataType, StringType, StructField} import org.apache.spark.{SparkConf, SparkContext} @@ -96,23 +96,6 @@ class PhoenixSparkIT extends FunSuite with Matchers with BeforeAndAfterAll { PhoenixSparkITHelper.doTeardown } - def buildSql(table: String, columns: Seq[String], predicate: Option[String]): String = { -val query = SELECT %s FROM \%s\ format(columns.map(f = \ + f + \).mkString(, ), table) - -query + (predicate match { - case Some(p: String) = WHERE + p - case _ = -}) - } - - test(Can create valid SQL) { -val rdd = new PhoenixRDD(sc, MyTable, Array(Foo, Bar), - conf = hbaseConfiguration) - -rdd.buildSql(MyTable, Array(Foo, Bar), None) should - equal(SELECT \Foo\, \Bar\ FROM \MyTable\) - } - test(Can convert Phoenix schema) { val 
phoenixSchema = List( new ColumnInfo(varcharColumn, PVarchar.INSTANCE.getSqlType) @@ -154,7 +137,9 @@ class PhoenixSparkIT extends FunSuite with Matchers with BeforeAndAfterAll { val sqlContext = new SQLContext(sc) -val df1 = sqlContext.phoenixTableAsDataFrame(table3, Array(id, col1), +val df1 = sqlContext.phoenixTableAsDataFrame( + SchemaUtil.getEscapedArgument(table3), + Array(id, col1), zkUrl = Some(quorumAddress)) df1.registerTempTable(table3) @@ -191,10 +176,12 @@ class PhoenixSparkIT extends FunSuite with Matchers with
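The patch works because Phoenix normalizes identifiers: unquoted names are folded to upper case, while double-quoted names keep their case, which is why the test now wraps the case-sensitive table name in SchemaUtil.getEscapedArgument before handing it to phoenixTableAsDataFrame. As a hedged illustration of that rule only (a simplified sketch, not the actual SchemaUtil implementation):

```java
// Simplified sketch of Phoenix's identifier-normalization rule (an assumption
// about SchemaUtil's observable behavior, not its real code): unquoted
// identifiers are upper-cased, double-quoted identifiers preserve their case.
public class IdentifierSketch {

    // Analogous to SchemaUtil.getEscapedArgument: wrap a name in double
    // quotes so normalization leaves its case intact.
    static String escape(String name) {
        return "\"" + name + "\"";
    }

    // Analogous to SchemaUtil.normalizeIdentifier: strip surrounding quotes,
    // otherwise fold to upper case.
    static String normalize(String name) {
        if (name.length() >= 2 && name.startsWith("\"") && name.endsWith("\"")) {
            return name.substring(1, name.length() - 1);
        }
        return name.toUpperCase();
    }

    public static void main(String[] args) {
        // Without escaping, a lower-case table name is folded to upper case,
        // so a table created as CUSTOM_ENTITY."z02" would not be found.
        System.out.println(normalize("z02"));          // Z02
        // Escaping first preserves the case through normalization.
        System.out.println(normalize(escape("z02")));  // z02
    }
}
```

This mirrors why the integration test for the case-sensitive table ("z02" in the new setup.sql) only passes once the table name is escaped before reaching PhoenixRuntime.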
phoenix git commit: PHOENIX-2036 PhoenixConfigurationUtil should provide a pre-normalize table name to PhoenixRuntime
Repository: phoenix Updated Branches: refs/heads/4.4-HBase-1.0 b584ae982 - a7333c8a6 PHOENIX-2036 PhoenixConfigurationUtil should provide a pre-normalize table name to PhoenixRuntime Update phoenix-spark to follow the same normalization requirement. Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/a7333c8a Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/a7333c8a Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/a7333c8a Branch: refs/heads/4.4-HBase-1.0 Commit: a7333c8a68e7d59afd94f7b0298cf4dbb45ce13a Parents: b584ae9 Author: Josh Mahonin jmaho...@gmail.com Authored: Mon Jul 6 19:39:31 2015 -0400 Committer: Josh Mahonin jmaho...@apache.org Committed: Mon Jul 6 19:54:44 2015 -0400 -- phoenix-spark/src/it/resources/setup.sql| 4 +- .../apache/phoenix/spark/PhoenixSparkIT.scala | 58 .../org/apache/phoenix/spark/PhoenixRDD.scala | 24 +++- 3 files changed, 46 insertions(+), 40 deletions(-) -- http://git-wip-us.apache.org/repos/asf/phoenix/blob/a7333c8a/phoenix-spark/src/it/resources/setup.sql -- diff --git a/phoenix-spark/src/it/resources/setup.sql b/phoenix-spark/src/it/resources/setup.sql index 40157a2..154a996 100644 --- a/phoenix-spark/src/it/resources/setup.sql +++ b/phoenix-spark/src/it/resources/setup.sql @@ -32,4 +32,6 @@ CREATE TABLE ARRAY_TEST_TABLE (ID BIGINT NOT NULL PRIMARY KEY, VCARRAY VARCHAR[] UPSERT INTO ARRAY_TEST_TABLE (ID, VCARRAY) VALUES (1, ARRAY['String1', 'String2', 'String3']) CREATE TABLE DATE_PREDICATE_TEST_TABLE (ID BIGINT NOT NULL, TIMESERIES_KEY TIMESTAMP NOT NULL CONSTRAINT pk PRIMARY KEY (ID, TIMESERIES_KEY)) UPSERT INTO DATE_PREDICATE_TEST_TABLE (ID, TIMESERIES_KEY) VALUES (1, CAST(CURRENT_TIME() AS TIMESTAMP)) -CREATE TABLE OUTPUT_TEST_TABLE (id BIGINT NOT NULL PRIMARY KEY, col1 VARCHAR, col2 INTEGER, col3 DATE) \ No newline at end of file +CREATE TABLE OUTPUT_TEST_TABLE (id BIGINT NOT NULL PRIMARY KEY, col1 VARCHAR, col2 INTEGER, col3 DATE) 
+CREATE TABLE CUSTOM_ENTITY.z02(id BIGINT NOT NULL PRIMARY KEY) +UPSERT INTO CUSTOM_ENTITY.z02 (id) VALUES(1) \ No newline at end of file http://git-wip-us.apache.org/repos/asf/phoenix/blob/a7333c8a/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala -- diff --git a/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala b/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala index 5f256e6..e1c9df4 100644 --- a/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala +++ b/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala @@ -20,9 +20,9 @@ import org.apache.hadoop.conf.Configuration import org.apache.hadoop.hbase.{HBaseConfiguration, HConstants, HBaseTestingUtility} import org.apache.phoenix.end2end.BaseHBaseManagedTimeIT import org.apache.phoenix.query.BaseTest -import org.apache.phoenix.schema.ColumnNotFoundException +import org.apache.phoenix.schema.{TableNotFoundException, ColumnNotFoundException} import org.apache.phoenix.schema.types.PVarchar -import org.apache.phoenix.util.ColumnInfo +import org.apache.phoenix.util.{SchemaUtil, ColumnInfo} import org.apache.spark.sql.{SaveMode, execution, SQLContext} import org.apache.spark.sql.types.{LongType, DataType, StringType, StructField} import org.apache.spark.{SparkConf, SparkContext} @@ -96,23 +96,6 @@ class PhoenixSparkIT extends FunSuite with Matchers with BeforeAndAfterAll { PhoenixSparkITHelper.doTeardown } - def buildSql(table: String, columns: Seq[String], predicate: Option[String]): String = { -val query = SELECT %s FROM \%s\ format(columns.map(f = \ + f + \).mkString(, ), table) - -query + (predicate match { - case Some(p: String) = WHERE + p - case _ = -}) - } - - test(Can create valid SQL) { -val rdd = new PhoenixRDD(sc, MyTable, Array(Foo, Bar), - conf = hbaseConfiguration) - -rdd.buildSql(MyTable, Array(Foo, Bar), None) should - equal(SELECT \Foo\, \Bar\ FROM \MyTable\) - } - test(Can convert Phoenix schema) 
{ val phoenixSchema = List( new ColumnInfo(varcharColumn, PVarchar.INSTANCE.getSqlType) @@ -154,7 +137,9 @@ class PhoenixSparkIT extends FunSuite with Matchers with BeforeAndAfterAll { val sqlContext = new SQLContext(sc) -val df1 = sqlContext.phoenixTableAsDataFrame(table3, Array(id, col1), +val df1 = sqlContext.phoenixTableAsDataFrame( + SchemaUtil.getEscapedArgument(table3), + Array(id, col1), zkUrl = Some(quorumAddress)) df1.registerTempTable(table3) @@ -191,10 +176,12 @@ class PhoenixSparkIT extends FunSuite with Matchers with
[1/2] phoenix git commit: PHOENIX-2036 PhoenixConfigurationUtil should provide a pre-normalize table name to PhoenixRuntime
Repository: phoenix Updated Branches: refs/heads/4.4-HBase-1.1 35b8cc505 - 27838c48b PHOENIX-2036 PhoenixConfigurationUtil should provide a pre-normalize table name to PhoenixRuntime Update phoenix-spark to follow the same normalization requirement. Conflicts: phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/06c49c7c Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/06c49c7c Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/06c49c7c Branch: refs/heads/4.4-HBase-1.1 Commit: 06c49c7c4027eb9affa4ae2e4f8418711ee23113 Parents: 35b8cc5 Author: Josh Mahonin jmaho...@gmail.com Authored: Mon Jul 6 19:39:31 2015 -0400 Committer: Josh Mahonin jmaho...@apache.org Committed: Mon Jul 6 20:01:43 2015 -0400 -- phoenix-spark/src/it/resources/setup.sql| 4 +- .../apache/phoenix/spark/PhoenixSparkIT.scala | 79 ++-- .../org/apache/phoenix/spark/PhoenixRDD.scala | 24 ++ 3 files changed, 67 insertions(+), 40 deletions(-) -- http://git-wip-us.apache.org/repos/asf/phoenix/blob/06c49c7c/phoenix-spark/src/it/resources/setup.sql -- diff --git a/phoenix-spark/src/it/resources/setup.sql b/phoenix-spark/src/it/resources/setup.sql index 40157a2..154a996 100644 --- a/phoenix-spark/src/it/resources/setup.sql +++ b/phoenix-spark/src/it/resources/setup.sql @@ -32,4 +32,6 @@ CREATE TABLE ARRAY_TEST_TABLE (ID BIGINT NOT NULL PRIMARY KEY, VCARRAY VARCHAR[] UPSERT INTO ARRAY_TEST_TABLE (ID, VCARRAY) VALUES (1, ARRAY['String1', 'String2', 'String3']) CREATE TABLE DATE_PREDICATE_TEST_TABLE (ID BIGINT NOT NULL, TIMESERIES_KEY TIMESTAMP NOT NULL CONSTRAINT pk PRIMARY KEY (ID, TIMESERIES_KEY)) UPSERT INTO DATE_PREDICATE_TEST_TABLE (ID, TIMESERIES_KEY) VALUES (1, CAST(CURRENT_TIME() AS TIMESTAMP)) -CREATE TABLE OUTPUT_TEST_TABLE (id BIGINT NOT NULL PRIMARY KEY, col1 VARCHAR, col2 INTEGER, col3 DATE) \ No newline at end of file +CREATE TABLE 
OUTPUT_TEST_TABLE (id BIGINT NOT NULL PRIMARY KEY, col1 VARCHAR, col2 INTEGER, col3 DATE) +CREATE TABLE CUSTOM_ENTITY.z02(id BIGINT NOT NULL PRIMARY KEY) +UPSERT INTO CUSTOM_ENTITY.z02 (id) VALUES(1) \ No newline at end of file http://git-wip-us.apache.org/repos/asf/phoenix/blob/06c49c7c/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala -- diff --git a/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala b/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala index 42e8676..e1c9df4 100644 --- a/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala +++ b/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala @@ -20,9 +20,9 @@ import org.apache.hadoop.conf.Configuration import org.apache.hadoop.hbase.{HBaseConfiguration, HConstants, HBaseTestingUtility} import org.apache.phoenix.end2end.BaseHBaseManagedTimeIT import org.apache.phoenix.query.BaseTest -import org.apache.phoenix.schema.ColumnNotFoundException +import org.apache.phoenix.schema.{TableNotFoundException, ColumnNotFoundException} import org.apache.phoenix.schema.types.PVarchar -import org.apache.phoenix.util.ColumnInfo +import org.apache.phoenix.util.{SchemaUtil, ColumnInfo} import org.apache.spark.sql.{SaveMode, execution, SQLContext} import org.apache.spark.sql.types.{LongType, DataType, StringType, StructField} import org.apache.spark.{SparkConf, SparkContext} @@ -96,23 +96,6 @@ class PhoenixSparkIT extends FunSuite with Matchers with BeforeAndAfterAll { PhoenixSparkITHelper.doTeardown } - def buildSql(table: String, columns: Seq[String], predicate: Option[String]): String = { -val query = SELECT %s FROM \%s\ format(columns.map(f = \ + f + \).mkString(, ), table) - -query + (predicate match { - case Some(p: String) = WHERE + p - case _ = -}) - } - - test(Can create valid SQL) { -val rdd = new PhoenixRDD(sc, MyTable, Array(Foo, Bar), - conf = hbaseConfiguration) - -rdd.buildSql(MyTable, Array(Foo, Bar), None) 
should - equal(SELECT \Foo\, \Bar\ FROM \MyTable\) - } - test(Can convert Phoenix schema) { val phoenixSchema = List( new ColumnInfo(varcharColumn, PVarchar.INSTANCE.getSqlType) @@ -154,7 +137,9 @@ class PhoenixSparkIT extends FunSuite with Matchers with BeforeAndAfterAll { val sqlContext = new SQLContext(sc) -val df1 = sqlContext.phoenixTableAsDataFrame(table3, Array(id, col1), +val df1 = sqlContext.phoenixTableAsDataFrame( + SchemaUtil.getEscapedArgument(table3), + Array(id, col1), zkUrl = Some(quorumAddress))
phoenix git commit: PHOENIX-2036 PhoenixConfigurationUtil should provide a pre-normalize table name to PhoenixRuntime
Repository: phoenix Updated Branches: refs/heads/4.x-HBase-0.98 f3056ba67 - 517cbb78f PHOENIX-2036 PhoenixConfigurationUtil should provide a pre-normalize table name to PhoenixRuntime Update phoenix-spark to follow the same normalization requirement. Conflicts: phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/517cbb78 Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/517cbb78 Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/517cbb78 Branch: refs/heads/4.x-HBase-0.98 Commit: 517cbb78fb142b95657856dd07b763acea19ecfb Parents: f3056ba Author: Josh Mahonin jmaho...@gmail.com Authored: Mon Jul 6 19:39:31 2015 -0400 Committer: Josh Mahonin jmaho...@apache.org Committed: Mon Jul 6 20:08:29 2015 -0400 -- phoenix-spark/src/it/resources/setup.sql| 4 +- .../apache/phoenix/spark/PhoenixSparkIT.scala | 58 .../org/apache/phoenix/spark/PhoenixRDD.scala | 24 +++- 3 files changed, 46 insertions(+), 40 deletions(-) -- http://git-wip-us.apache.org/repos/asf/phoenix/blob/517cbb78/phoenix-spark/src/it/resources/setup.sql -- diff --git a/phoenix-spark/src/it/resources/setup.sql b/phoenix-spark/src/it/resources/setup.sql index 40157a2..154a996 100644 --- a/phoenix-spark/src/it/resources/setup.sql +++ b/phoenix-spark/src/it/resources/setup.sql @@ -32,4 +32,6 @@ CREATE TABLE ARRAY_TEST_TABLE (ID BIGINT NOT NULL PRIMARY KEY, VCARRAY VARCHAR[] UPSERT INTO ARRAY_TEST_TABLE (ID, VCARRAY) VALUES (1, ARRAY['String1', 'String2', 'String3']) CREATE TABLE DATE_PREDICATE_TEST_TABLE (ID BIGINT NOT NULL, TIMESERIES_KEY TIMESTAMP NOT NULL CONSTRAINT pk PRIMARY KEY (ID, TIMESERIES_KEY)) UPSERT INTO DATE_PREDICATE_TEST_TABLE (ID, TIMESERIES_KEY) VALUES (1, CAST(CURRENT_TIME() AS TIMESTAMP)) -CREATE TABLE OUTPUT_TEST_TABLE (id BIGINT NOT NULL PRIMARY KEY, col1 VARCHAR, col2 INTEGER, col3 DATE) \ No newline at end of file +CREATE TABLE 
OUTPUT_TEST_TABLE (id BIGINT NOT NULL PRIMARY KEY, col1 VARCHAR, col2 INTEGER, col3 DATE) +CREATE TABLE CUSTOM_ENTITY.z02(id BIGINT NOT NULL PRIMARY KEY) +UPSERT INTO CUSTOM_ENTITY.z02 (id) VALUES(1) \ No newline at end of file http://git-wip-us.apache.org/repos/asf/phoenix/blob/517cbb78/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala -- diff --git a/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala b/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala index fb7d869..2889464 100644 --- a/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala +++ b/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala @@ -20,9 +20,9 @@ import org.apache.hadoop.conf.Configuration import org.apache.hadoop.hbase.{HBaseConfiguration, HConstants, HBaseTestingUtility} import org.apache.phoenix.end2end.BaseHBaseManagedTimeIT import org.apache.phoenix.query.BaseTest -import org.apache.phoenix.schema.ColumnNotFoundException +import org.apache.phoenix.schema.{TableNotFoundException, ColumnNotFoundException} import org.apache.phoenix.schema.types.PVarchar -import org.apache.phoenix.util.ColumnInfo +import org.apache.phoenix.util.{SchemaUtil, ColumnInfo} import org.apache.spark.sql.{SaveMode, execution, SQLContext} import org.apache.spark.sql.types.{LongType, DataType, StringType, StructField} import org.apache.spark.{SparkConf, SparkContext} @@ -96,23 +96,6 @@ class PhoenixSparkIT extends FunSuite with Matchers with BeforeAndAfterAll { PhoenixSparkITHelper.doTeardown } - def buildSql(table: String, columns: Seq[String], predicate: Option[String]): String = { -val query = SELECT %s FROM \%s\ format(columns.map(f = \ + f + \).mkString(, ), table) - -query + (predicate match { - case Some(p: String) = WHERE + p - case _ = -}) - } - - test(Can create valid SQL) { -val rdd = new PhoenixRDD(sc, MyTable, Array(Foo, Bar), - conf = hbaseConfiguration) - -rdd.buildSql(MyTable, Array(Foo, Bar), None) 
should - equal(SELECT \Foo\, \Bar\ FROM \MyTable\) - } - test(Can convert Phoenix schema) { val phoenixSchema = List( new ColumnInfo(varcharColumn, PVarchar.INSTANCE.getSqlType) @@ -153,7 +136,9 @@ class PhoenixSparkIT extends FunSuite with Matchers with BeforeAndAfterAll { test(Can create schema RDD and execute query on case sensitive table (no config)) { val sqlContext = new SQLContext(sc) -val df1 = sqlContext.phoenixTableAsDataFrame(table3, Array(id, col1), +val df1 = sqlContext.phoenixTableAsDataFrame( +
[2/2] phoenix git commit: PHOENIX-1968: Should support saving arrays
PHOENIX 1968: Should support saving arrays Conflicts: phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/27838c48 Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/27838c48 Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/27838c48 Branch: refs/heads/4.4-HBase-1.1 Commit: 27838c48be2c5b98e5806f8f808f61a76d684277 Parents: 06c49c7 Author: ravimagham ravimag...@apache.org Authored: Thu Jun 11 11:50:21 2015 -0700 Committer: Josh Mahonin jmaho...@apache.org Committed: Mon Jul 6 20:02:32 2015 -0400 -- .../phoenix/spark/PhoenixRecordWritable.scala | 25 1 file changed, 20 insertions(+), 5 deletions(-) -- http://git-wip-us.apache.org/repos/asf/phoenix/blob/27838c48/phoenix-spark/src/main/scala/org/apache/phoenix/spark/PhoenixRecordWritable.scala -- diff --git a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/PhoenixRecordWritable.scala b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/PhoenixRecordWritable.scala index 67e0bd2..3977657 100644 --- a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/PhoenixRecordWritable.scala +++ b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/PhoenixRecordWritable.scala @@ -16,11 +16,12 @@ package org.apache.phoenix.spark import java.sql.{PreparedStatement, ResultSet} import org.apache.hadoop.mapreduce.lib.db.DBWritable import org.apache.phoenix.mapreduce.util.ColumnInfoToStringEncoderDecoder -import org.apache.phoenix.schema.types.{PDate, PhoenixArray} +import org.apache.phoenix.schema.types.{PDataType, PDate, PhoenixArray} import org.joda.time.DateTime import scala.collection.{immutable, mutable} import scala.collection.JavaConversions._ + class PhoenixRecordWritable(var encodedColumns: String) extends DBWritable { val upsertValues = mutable.ArrayBuffer[Any]() val resultMap = mutable.Map[String, AnyRef]() @@ -44,13 +45,27 @@ class PhoenixRecordWritable(var 
encodedColumns: String) extends DBWritable { upsertValues.zip(columns).zipWithIndex.foreach { case ((v, c), i) = { if (v != null) { + // Both Java and Joda dates used to work in 4.2.3, but now they must be java.sql.Date + // Can override any other types here as needed val (finalObj, finalType) = v match { -case dt: DateTime = (new java.sql.Date(dt.getMillis), PDate.INSTANCE.getSqlType) -case d: java.util.Date = (new java.sql.Date(d.getTime), PDate.INSTANCE.getSqlType) -case _ = (v, c.getSqlType) +case dt: DateTime = (new java.sql.Date(dt.getMillis), PDate.INSTANCE) +case d: java.util.Date = (new java.sql.Date(d.getTime), PDate.INSTANCE) +case _ = (v, c.getPDataType) + } + + // Save as array or object + finalObj match { +case obj: Array[AnyRef] = { + // Create a java.sql.Array, need to lookup the base sql type name + val sqlArray = statement.getConnection.createArrayOf( +PDataType.arrayBaseType(finalType).getSqlTypeName, +obj + ) + statement.setArray(i + 1, sqlArray) +} +case _ = statement.setObject(i + 1, finalObj) } - statement.setObject(i + 1, finalObj, finalType) } else { statement.setNull(i + 1, c.getSqlType) }
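The patch above changes the write path so that array values are bound through Statement.setArray (after looking up the array's base SQL type) instead of setObject, and coerces Joda and java.util dates to java.sql.Date first. A minimal, self-contained sketch of that dispatch logic follows; the class and method names are illustrative only, since the real code lives in PhoenixRecordWritable and uses Phoenix's PDataType:

```java
import java.util.Date;

// Hedged sketch of the value-binding dispatch added in PHOENIX-1968
// (illustrative names, not the actual PhoenixRecordWritable): dates are
// coerced to java.sql.Date, and arrays take the setArray path while all
// other values go through setObject.
public class BindingSketch {

    // Mirror of the date coercion in the diff: java.util.Date (and Joda
    // DateTime, omitted here to stay dependency-free) becomes java.sql.Date
    // before binding.
    static Object coerce(Object value) {
        if (value instanceof Date) {
            return new java.sql.Date(((Date) value).getTime());
        }
        return value;
    }

    // Arrays need a java.sql.Array built from the base SQL type name (via
    // Connection.createArrayOf in the real patch), so they cannot be bound
    // with setObject.
    static String chooseBinding(Object value) {
        return (value instanceof Object[]) ? "setArray" : "setObject";
    }

    public static void main(String[] args) {
        System.out.println(chooseBinding(new String[] {"String1", "String2"})); // setArray
        System.out.println(chooseBinding(Integer.valueOf(42)));                 // setObject
        System.out.println(coerce(new Date(0L)).getClass().getName());          // java.sql.Date
    }
}
```

The design point is the same as in the commit: setObject with a SQL type cannot express an ARRAY column, so the writer must detect array values and construct a java.sql.Array from the base element type before binding.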
phoenix git commit: PHOENIX-2036 PhoenixConfigurationUtil should provide a pre-normalize table name to PhoenixRuntime
Repository: phoenix Updated Branches: refs/heads/4.x-HBase-1.0 fe982b807 - b5cb4141c PHOENIX-2036 PhoenixConfigurationUtil should provide a pre-normalize table name to PhoenixRuntime Update phoenix-spark to follow the same normalization requirement. Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/b5cb4141 Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/b5cb4141 Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/b5cb4141 Branch: refs/heads/4.x-HBase-1.0 Commit: b5cb4141c80db384f48f01f8850c5ea97d0d8d7a Parents: fe982b8 Author: Josh Mahonin jmaho...@gmail.com Authored: Mon Jul 6 19:39:31 2015 -0400 Committer: Josh Mahonin jmaho...@apache.org Committed: Mon Jul 6 20:09:35 2015 -0400 -- phoenix-spark/src/it/resources/setup.sql| 4 +- .../apache/phoenix/spark/PhoenixSparkIT.scala | 58 .../org/apache/phoenix/spark/PhoenixRDD.scala | 24 +++- 3 files changed, 46 insertions(+), 40 deletions(-) -- http://git-wip-us.apache.org/repos/asf/phoenix/blob/b5cb4141/phoenix-spark/src/it/resources/setup.sql -- diff --git a/phoenix-spark/src/it/resources/setup.sql b/phoenix-spark/src/it/resources/setup.sql index 40157a2..154a996 100644 --- a/phoenix-spark/src/it/resources/setup.sql +++ b/phoenix-spark/src/it/resources/setup.sql @@ -32,4 +32,6 @@ CREATE TABLE ARRAY_TEST_TABLE (ID BIGINT NOT NULL PRIMARY KEY, VCARRAY VARCHAR[] UPSERT INTO ARRAY_TEST_TABLE (ID, VCARRAY) VALUES (1, ARRAY['String1', 'String2', 'String3']) CREATE TABLE DATE_PREDICATE_TEST_TABLE (ID BIGINT NOT NULL, TIMESERIES_KEY TIMESTAMP NOT NULL CONSTRAINT pk PRIMARY KEY (ID, TIMESERIES_KEY)) UPSERT INTO DATE_PREDICATE_TEST_TABLE (ID, TIMESERIES_KEY) VALUES (1, CAST(CURRENT_TIME() AS TIMESTAMP)) -CREATE TABLE OUTPUT_TEST_TABLE (id BIGINT NOT NULL PRIMARY KEY, col1 VARCHAR, col2 INTEGER, col3 DATE) \ No newline at end of file +CREATE TABLE OUTPUT_TEST_TABLE (id BIGINT NOT NULL PRIMARY KEY, col1 VARCHAR, col2 INTEGER, col3 DATE) 
+CREATE TABLE CUSTOM_ENTITY.z02(id BIGINT NOT NULL PRIMARY KEY) +UPSERT INTO CUSTOM_ENTITY.z02 (id) VALUES(1) \ No newline at end of file http://git-wip-us.apache.org/repos/asf/phoenix/blob/b5cb4141/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala -- diff --git a/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala b/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala index 5f256e6..e1c9df4 100644 --- a/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala +++ b/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala @@ -20,9 +20,9 @@ import org.apache.hadoop.conf.Configuration import org.apache.hadoop.hbase.{HBaseConfiguration, HConstants, HBaseTestingUtility} import org.apache.phoenix.end2end.BaseHBaseManagedTimeIT import org.apache.phoenix.query.BaseTest -import org.apache.phoenix.schema.ColumnNotFoundException +import org.apache.phoenix.schema.{TableNotFoundException, ColumnNotFoundException} import org.apache.phoenix.schema.types.PVarchar -import org.apache.phoenix.util.ColumnInfo +import org.apache.phoenix.util.{SchemaUtil, ColumnInfo} import org.apache.spark.sql.{SaveMode, execution, SQLContext} import org.apache.spark.sql.types.{LongType, DataType, StringType, StructField} import org.apache.spark.{SparkConf, SparkContext} @@ -96,23 +96,6 @@ class PhoenixSparkIT extends FunSuite with Matchers with BeforeAndAfterAll { PhoenixSparkITHelper.doTeardown } - def buildSql(table: String, columns: Seq[String], predicate: Option[String]): String = { -val query = SELECT %s FROM \%s\ format(columns.map(f = \ + f + \).mkString(, ), table) - -query + (predicate match { - case Some(p: String) = WHERE + p - case _ = -}) - } - - test(Can create valid SQL) { -val rdd = new PhoenixRDD(sc, MyTable, Array(Foo, Bar), - conf = hbaseConfiguration) - -rdd.buildSql(MyTable, Array(Foo, Bar), None) should - equal(SELECT \Foo\, \Bar\ FROM \MyTable\) - } - test(Can convert Phoenix schema) 
{ val phoenixSchema = List( new ColumnInfo(varcharColumn, PVarchar.INSTANCE.getSqlType) @@ -154,7 +137,9 @@ class PhoenixSparkIT extends FunSuite with Matchers with BeforeAndAfterAll { val sqlContext = new SQLContext(sc) -val df1 = sqlContext.phoenixTableAsDataFrame(table3, Array(id, col1), +val df1 = sqlContext.phoenixTableAsDataFrame( + SchemaUtil.getEscapedArgument(table3), + Array(id, col1), zkUrl = Some(quorumAddress)) df1.registerTempTable(table3) @@ -191,10 +176,12 @@ class PhoenixSparkIT extends FunSuite with Matchers with
Apache-Phoenix | 4.x-HBase-1.1 | Build Successful
4.x-HBase-1.1 branch build status Successful Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.1 Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/lastSuccessfulBuild/artifact/ Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/lastCompletedBuild/testReport/ Changes Build times for last couple of runsLatest build time is the right most | Legend blue: normal, red: test failure, gray: timeout
Build failed in Jenkins: Phoenix | Master #814
See https://builds.apache.org/job/Phoenix-master/814/changes

Changes:

[jmahonin] PHOENIX-2036 PhoenixConfigurationUtil should provide a pre-normalize table name to PhoenixRuntime

--
[...truncated 119849 lines...]
Running org.apache.phoenix.end2end.TimezoneOffsetFunctionIT
Running org.apache.phoenix.end2end.index.GlobalMutableIndexIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.2 sec - in org.apache.phoenix.end2end.TimezoneOffsetFunctionIT
Running org.apache.phoenix.end2end.index.GlobalIndexOptimizationIT
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 93.29 sec - in org.apache.phoenix.end2end.DeleteIT
Running org.apache.phoenix.end2end.index.ViewIndexIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 31.059 sec - in org.apache.phoenix.end2end.TenantSpecificViewIndexSaltedIT
Running org.apache.phoenix.end2end.index.DropViewIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33.397 sec - in org.apache.phoenix.end2end.EncodeFunctionIT
Running org.apache.phoenix.end2end.index.IndexMetadataIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.876 sec - in org.apache.phoenix.end2end.index.ViewIndexIT
Running org.apache.phoenix.end2end.index.SaltedIndexIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.831 sec - in org.apache.phoenix.end2end.index.DropViewIT
Running org.apache.phoenix.end2end.index.LocalMutableIndexIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 49.792 sec - in org.apache.phoenix.end2end.index.SaltedIndexIT
Running org.apache.phoenix.end2end.index.IndexExpressionIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 114.75 sec - in org.apache.phoenix.end2end.index.IndexMetadataIT
Running org.apache.phoenix.end2end.index.LocalIndexIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 135.577 sec - in org.apache.phoenix.end2end.index.GlobalIndexOptimizationIT
Running org.apache.phoenix.end2end.index.ImmutableIndexIT
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 74.015 sec - in org.apache.phoenix.end2end.index.ImmutableIndexIT
Running org.apache.phoenix.end2end.PowerFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.803 sec - in org.apache.phoenix.end2end.PowerFunctionEnd2EndIT
Running org.apache.phoenix.end2end.ServerExceptionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.333 sec - in org.apache.phoenix.end2end.ServerExceptionIT
Running org.apache.phoenix.end2end.LastValueFunctionIT
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 263.121 sec - in org.apache.phoenix.end2end.index.GlobalMutableIndexIT
Running org.apache.phoenix.end2end.DisableLocalIndexIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.158 sec - in org.apache.phoenix.end2end.DisableLocalIndexIT
Running org.apache.phoenix.end2end.ExpFunctionEnd2EndIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.253 sec - in org.apache.phoenix.end2end.LastValueFunctionIT
Running org.apache.phoenix.end2end.HashJoinIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.164 sec - in org.apache.phoenix.end2end.ExpFunctionEnd2EndIT
Running org.apache.phoenix.end2end.SortOrderFIT
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 266.027 sec - in org.apache.phoenix.end2end.index.LocalMutableIndexIT
Running org.apache.phoenix.end2end.CbrtFunctionEnd2EndIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.563 sec - in org.apache.phoenix.end2end.CbrtFunctionEnd2EndIT
Running org.apache.phoenix.end2end.LikeExpressionIT
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.008 sec - in org.apache.phoenix.end2end.SortOrderFIT
Running org.apache.phoenix.end2end.FirstValueFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.943 sec - in org.apache.phoenix.end2end.LikeExpressionIT
Running org.apache.phoenix.end2end.RoundFloorCeilFunctionsEnd2EndIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.726 sec - in org.apache.phoenix.end2end.FirstValueFunctionIT
Running org.apache.phoenix.end2end.CSVCommonsLoaderIT
Tests run: 58, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 294.843 sec - in org.apache.phoenix.end2end.index.IndexExpressionIT
Running org.apache.phoenix.end2end.DynamicFamilyIT
Tests run: 31, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 41.79 sec - in org.apache.phoenix.end2end.RoundFloorCeilFunctionsEnd2EndIT
Running org.apache.phoenix.end2end.AlterSessionIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.399 sec - in org.apache.phoenix.end2end.DynamicFamilyIT
Running org.apache.phoenix.end2end.EvaluationOfORIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.02 sec - in org.apache.phoenix.end2end.AlterSessionIT
Running org.apache.phoenix.end2end.ArrayConcatFunctionIT