Apache-Phoenix | 4.x-HBase-1.3 | Build Successful

2020-02-11 Thread Apache Jenkins Server
4.x-HBase-1.3 branch build status Successful

Source repository https://git-wip-us.apache.org/repos/asf?p=phoenix.git;a=shortlog;h=refs/heads/4.x-HBase-1.3

Compiled Artifacts https://builds.apache.org/job/Phoenix-4.x-HBase-1.3/lastSuccessfulBuild/artifact/

Test Report https://builds.apache.org/job/Phoenix-4.x-HBase-1.3/lastCompletedBuild/testReport/

Changes
[Rajeshbabu Chintaguntla] PHOENIX-5691 create index is failing when phoenix acls enabled and ranger is enabled



Build times for last couple of runs (latest build time is the rightmost) | Legend blue: normal, red: test failure, gray: timeout


Jenkins build is back to normal : Phoenix-4.x-HBase-1.4 #393

2020-02-11 Thread Apache Jenkins Server
See 




Build failed in Jenkins: Phoenix Compile Compatibility with HBase #1267

2020-02-11 Thread Apache Jenkins Server
See 


Changes:


--
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on H26 (ubuntu) in workspace 

[Phoenix_Compile_Compat_wHBase] $ /bin/bash /tmp/jenkins6799642269219136729.sh
core file size  (blocks, -c) 0
data seg size   (kbytes, -d) unlimited
scheduling priority (-e) 0
file size   (blocks, -f) unlimited
pending signals (-i) 386346
max locked memory   (kbytes, -l) 16384
max memory size (kbytes, -m) unlimited
open files  (-n) 6
pipe size(512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority  (-r) 0
stack size  (kbytes, -s) 8192
cpu time   (seconds, -t) unlimited
max user processes  (-u) 10240
virtual memory  (kbytes, -v) unlimited
file locks  (-x) unlimited
core id : 0
core id : 1
core id : 2
core id : 3
core id : 4
core id : 5
physical id : 0
physical id : 1
MemTotal:   98949656 kB
MemFree:16734460 kB
Filesystem  Size  Used Avail Use% Mounted on
udev 48G 0   48G   0% /dev
tmpfs   9.5G  1.6M  9.5G   1% /run
/dev/sda3   3.6T  406G  3.1T  12% /
tmpfs48G  4.0K   48G   1% /dev/shm
tmpfs   5.0M 0  5.0M   0% /run/lock
tmpfs48G 0   48G   0% /sys/fs/cgroup
/dev/loop0   60M   60M 0 100% /snap/snapcraft/3970
/dev/loop1   67M   67M 0 100% /snap/lxd/13253
/dev/loop2   55M   55M 0 100% /snap/core18/1668
/dev/loop3   90M   90M 0 100% /snap/core/8268
/dev/loop4   60M   60M 0 100% /snap/snapcraft/3943
/dev/loop5   55M   55M 0 100% /snap/core18/1650
/dev/loop7   67M   67M 0 100% /snap/lxd/13300
/dev/sda2   473M  109M  340M  25% /boot
tmpfs   9.5G 0  9.5G   0% /run/user/910
/dev/loop8   92M   92M 0 100% /snap/core/8592
apache-maven-2.2.1
apache-maven-3.0.5
apache-maven-3.1.1
apache-maven-3.2.5
apache-maven-3.3.9
apache-maven-3.5.2
apache-maven-3.5.4
apache-maven-3.6.0
apache-maven-3.6.2
apache-maven-3.6.3
latest
latest2
latest3


===
Verifying compile level compatibility with HBase 0.98 with Phoenix 4.x-HBase-0.98
===

Cloning into 'hbase'...
Switched to a new branch '0.98'
Branch '0.98' set up to track remote branch '0.98' from 'origin'.
[ERROR] Plugin org.codehaus.mojo:findbugs-maven-plugin:2.5.2 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.codehaus.mojo:findbugs-maven-plugin:jar:2.5.2: Could not transfer artifact org.codehaus.mojo:findbugs-maven-plugin:pom:2.5.2 from/to central (https://repo.maven.apache.org/maven2): Transfer failed for https://repo.maven.apache.org/maven2/org/codehaus/mojo/findbugs-maven-plugin/2.5.2/findbugs-maven-plugin-2.5.2.pom: Received fatal alert: protocol_version -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
Build step 'Execute shell' marked build as failure
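The "Received fatal alert: protocol_version" failure above is characteristic of an old JRE (JDK 7 and earlier) negotiating TLS 1.0/1.1 against Maven Central, which requires TLS 1.2+. A minimal probe (not part of the build, just a diagnostic sketch) shows which TLS versions the running JVM enables by default:

```java
import javax.net.ssl.SSLContext;
import java.util.Arrays;

// Prints the TLS protocol versions this JVM's default SSL engine enables.
// On JDK 7 the default client list tops out at TLSv1, which would reproduce
// the protocol_version alert seen in the Maven transfer failure above.
public class TlsDefaultsProbe {
    public static void main(String[] args) throws Exception {
        String[] enabled = SSLContext.getDefault().createSSLEngine().getEnabledProtocols();
        System.out.println(Arrays.toString(enabled));
    }
}
```

If TLSv1.2 is missing from the list, the usual workaround on such a JVM is to set `MAVEN_OPTS=-Dhttps.protocols=TLSv1.2`, or to build with JDK 8+ where TLS 1.2 is the default.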


[phoenix] branch master updated (9fb24b3 -> 14eacf4)

2020-02-11 Thread skadam
This is an automated email from the ASF dual-hosted git repository.

skadam pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/phoenix.git.


from 9fb24b3  PHOENIX-5691 create index is failing when phoenix acls enabled and ranger is enabled (Rajeshbabu)
 add 14eacf4  PHOENIX-5629 Phoenix Function to return HBase row timestamp

No new revisions were added by this update.

Summary of changes:
 .../end2end/RowTimestampStringFunctionIT.java  | 194 +
 .../apache/phoenix/expression/ExpressionType.java  |   3 +-
 ...nction.java => RowTimestampStringFunction.java} |  43 ++---
 3 files changed, 215 insertions(+), 25 deletions(-)
 create mode 100644 phoenix-core/src/it/java/org/apache/phoenix/end2end/RowTimestampStringFunctionIT.java
 copy phoenix-core/src/main/java/org/apache/phoenix/expression/function/{MathPIFunction.java => RowTimestampStringFunction.java} (59%)



[phoenix] branch 4.x-HBase-1.3 updated: PHOENIX-5629 Phoenix Function to return HBase row timestamp

2020-02-11 Thread skadam
This is an automated email from the ASF dual-hosted git repository.

skadam pushed a commit to branch 4.x-HBase-1.3
in repository https://gitbox.apache.org/repos/asf/phoenix.git


The following commit(s) were added to refs/heads/4.x-HBase-1.3 by this push:
 new 959a77f  PHOENIX-5629 Phoenix Function to return HBase row timestamp
959a77f is described below

commit 959a77fb1071a9001fd99412b643aa5a500795e6
Author: Tanuj Khurana 
AuthorDate: Wed Jan 22 16:09:37 2020 -0800

PHOENIX-5629 Phoenix Function to return HBase row timestamp

Signed-off-by: s.kadam 
---
 .../end2end/RowTimestampStringFunctionIT.java  | 194 +
 .../apache/phoenix/expression/ExpressionType.java  |   3 +-
 .../function/RowTimestampStringFunction.java   |  60 +++
 3 files changed, 256 insertions(+), 1 deletion(-)

diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowTimestampStringFunctionIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowTimestampStringFunctionIT.java
new file mode 100644
index 000..d725989
--- /dev/null
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowTimestampStringFunctionIT.java
@@ -0,0 +1,194 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.phoenix.end2end;
+
+import com.google.common.collect.Lists;
+import org.apache.hadoop.hbase.TableName;
+import org.apache.hadoop.hbase.client.ConnectionFactory;
+import org.apache.hadoop.hbase.client.Get;
+import org.apache.hadoop.hbase.client.Result;
+import org.apache.hadoop.hbase.client.ResultScanner;
+import org.apache.hadoop.hbase.client.Scan;
+import org.apache.hadoop.hbase.client.Table;
+import org.apache.hadoop.hbase.util.Bytes;
+import org.apache.phoenix.query.QueryConstants;
+import org.apache.phoenix.schema.PTable.ImmutableStorageScheme;
+import org.apache.phoenix.schema.PTable.QualifierEncodingScheme;
+import org.apache.phoenix.util.EncodedColumnsUtil;
+import org.apache.phoenix.util.EnvironmentEdgeManager;
+import org.junit.Test;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
+
+import java.sql.Connection;
+import java.sql.Date;
+import java.sql.DriverManager;
+import java.sql.PreparedStatement;
+import java.sql.ResultSet;
+import java.util.Collection;
+import java.util.List;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertFalse;
+
+@RunWith(Parameterized.class)
+public class RowTimestampStringFunctionIT extends ParallelStatsDisabledIT {
+
+    private final boolean encoded;
+    private final String tableDDLOptions;
+
+    public RowTimestampStringFunctionIT(QualifierEncodingScheme encoding,
+            ImmutableStorageScheme storage) {
+        StringBuilder optionBuilder = new StringBuilder();
+        optionBuilder.append(" COLUMN_ENCODED_BYTES = " + encoding.ordinal());
+        optionBuilder.append(",IMMUTABLE_STORAGE_SCHEME = " + storage.toString());
+        this.tableDDLOptions = optionBuilder.toString();
+        this.encoded = (encoding != QualifierEncodingScheme.NON_ENCODED_QUALIFIERS) ? true : false;
+    }
+
+    @Parameterized.Parameters(name = "encoding={0},storage={1}")
+    public static synchronized Collection<Object[]> data() {
+        List<Object[]> list = Lists.newArrayList();
+        for (QualifierEncodingScheme encoding : QualifierEncodingScheme.values()) {
+            for (ImmutableStorageScheme storage : ImmutableStorageScheme.values()) {
+                list.add(new Object[]{encoding, storage});
+            }
+        }
+        return list;
+    }
+
+    private void verifyHbaseAllRowsTimestamp(String tableName, ResultSet rs, int expectedRowCount)
+            throws Exception {
+
+        Scan scan = new Scan();
+        byte[] emptyKVQualifier = EncodedColumnsUtil.getEmptyKeyValueInfo(this.encoded).getFirst();
+        try (org.apache.hadoop.hbase.client.Connection hconn =
+                ConnectionFactory.createConnection(config)) {
+            Table table = hconn.getTable(TableName.valueOf(tableName));
+            ResultScanner resultScanner = table.getScanner(scan);
+            int rowCount = 0;
+            while (rs.next()) {
+                Result result = resultScanner.next();
+                long t
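The integration test quoted above (truncated by the archiver) uses JUnit's Parameterized runner to run every combination of qualifier encoding and storage scheme. The cross-product generation can be sketched in isolation; the enums below are hypothetical stand-ins for Phoenix's QualifierEncodingScheme and ImmutableStorageScheme:

```java
import java.util.ArrayList;
import java.util.List;

public class CrossProductParams {
    // Hypothetical stand-ins for the Phoenix enums used by the quoted test.
    enum Encoding { NON_ENCODED_QUALIFIERS, ONE_BYTE_QUALIFIERS, TWO_BYTE_QUALIFIERS }
    enum Storage { ONE_CELL_PER_COLUMN, SINGLE_CELL_ARRAY_WITH_OFFSETS }

    // Mirrors the @Parameterized.Parameters data() method above: one Object[]
    // per (encoding, storage) pair, so JUnit instantiates the test class once
    // per combination.
    static List<Object[]> data() {
        List<Object[]> list = new ArrayList<>();
        for (Encoding encoding : Encoding.values()) {
            for (Storage storage : Storage.values()) {
                list.add(new Object[]{encoding, storage});
            }
        }
        return list;
    }

    public static void main(String[] args) {
        System.out.println(data().size()); // 3 encodings x 2 storages -> prints 6
    }
}
```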

[phoenix] branch 4.x-HBase-1.4 updated: PHOENIX-5629 Phoenix Function to return HBase row timestamp

2020-02-11 Thread skadam
This is an automated email from the ASF dual-hosted git repository.

skadam pushed a commit to branch 4.x-HBase-1.4
in repository https://gitbox.apache.org/repos/asf/phoenix.git


The following commit(s) were added to refs/heads/4.x-HBase-1.4 by this push:
 new 72b839e  PHOENIX-5629 Phoenix Function to return HBase row timestamp
72b839e is described below

commit 72b839ebb5d3b16a08aa5d923a52485a3d6945fe
Author: Tanuj Khurana 
AuthorDate: Wed Jan 22 16:09:37 2020 -0800

PHOENIX-5629 Phoenix Function to return HBase row timestamp

Signed-off-by: s.kadam 
---
 .../end2end/RowTimestampStringFunctionIT.java  | 194 +
 .../apache/phoenix/expression/ExpressionType.java  |   3 +-
 .../function/RowTimestampStringFunction.java   |  60 +++
 3 files changed, 256 insertions(+), 1 deletion(-)

diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowTimestampStringFunctionIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowTimestampStringFunctionIT.java
new file mode 100644
index 000..d725989
--- /dev/null
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowTimestampStringFunctionIT.java
@@ -0,0 +1,194 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.phoenix.end2end;
+
+import com.google.common.collect.Lists;
+import org.apache.hadoop.hbase.TableName;
+import org.apache.hadoop.hbase.client.ConnectionFactory;
+import org.apache.hadoop.hbase.client.Get;
+import org.apache.hadoop.hbase.client.Result;
+import org.apache.hadoop.hbase.client.ResultScanner;
+import org.apache.hadoop.hbase.client.Scan;
+import org.apache.hadoop.hbase.client.Table;
+import org.apache.hadoop.hbase.util.Bytes;
+import org.apache.phoenix.query.QueryConstants;
+import org.apache.phoenix.schema.PTable.ImmutableStorageScheme;
+import org.apache.phoenix.schema.PTable.QualifierEncodingScheme;
+import org.apache.phoenix.util.EncodedColumnsUtil;
+import org.apache.phoenix.util.EnvironmentEdgeManager;
+import org.junit.Test;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
+
+import java.sql.Connection;
+import java.sql.Date;
+import java.sql.DriverManager;
+import java.sql.PreparedStatement;
+import java.sql.ResultSet;
+import java.util.Collection;
+import java.util.List;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertFalse;
+
+@RunWith(Parameterized.class)
+public class RowTimestampStringFunctionIT extends ParallelStatsDisabledIT {
+
+    private final boolean encoded;
+    private final String tableDDLOptions;
+
+    public RowTimestampStringFunctionIT(QualifierEncodingScheme encoding,
+            ImmutableStorageScheme storage) {
+        StringBuilder optionBuilder = new StringBuilder();
+        optionBuilder.append(" COLUMN_ENCODED_BYTES = " + encoding.ordinal());
+        optionBuilder.append(",IMMUTABLE_STORAGE_SCHEME = " + storage.toString());
+        this.tableDDLOptions = optionBuilder.toString();
+        this.encoded = (encoding != QualifierEncodingScheme.NON_ENCODED_QUALIFIERS) ? true : false;
+    }
+
+    @Parameterized.Parameters(name = "encoding={0},storage={1}")
+    public static synchronized Collection<Object[]> data() {
+        List<Object[]> list = Lists.newArrayList();
+        for (QualifierEncodingScheme encoding : QualifierEncodingScheme.values()) {
+            for (ImmutableStorageScheme storage : ImmutableStorageScheme.values()) {
+                list.add(new Object[]{encoding, storage});
+            }
+        }
+        return list;
+    }
+
+    private void verifyHbaseAllRowsTimestamp(String tableName, ResultSet rs, int expectedRowCount)
+            throws Exception {
+
+        Scan scan = new Scan();
+        byte[] emptyKVQualifier = EncodedColumnsUtil.getEmptyKeyValueInfo(this.encoded).getFirst();
+        try (org.apache.hadoop.hbase.client.Connection hconn =
+                ConnectionFactory.createConnection(config)) {
+            Table table = hconn.getTable(TableName.valueOf(tableName));
+            ResultScanner resultScanner = table.getScanner(scan);
+            int rowCount = 0;
+            while (rs.next()) {
+                Result result = resultScanner.next();
+                long t

[phoenix] branch 4.x-HBase-1.5 updated: PHOENIX-5629 Phoenix Function to return HBase row timestamp

2020-02-11 Thread skadam
This is an automated email from the ASF dual-hosted git repository.

skadam pushed a commit to branch 4.x-HBase-1.5
in repository https://gitbox.apache.org/repos/asf/phoenix.git


The following commit(s) were added to refs/heads/4.x-HBase-1.5 by this push:
 new d11e851  PHOENIX-5629 Phoenix Function to return HBase row timestamp
d11e851 is described below

commit d11e8511809c941bdcf28889167f01198190ddfd
Author: Tanuj Khurana 
AuthorDate: Wed Jan 22 16:09:37 2020 -0800

PHOENIX-5629 Phoenix Function to return HBase row timestamp

Signed-off-by: s.kadam 
---
 .../end2end/RowTimestampStringFunctionIT.java  | 194 +
 .../apache/phoenix/expression/ExpressionType.java  |   3 +-
 .../function/RowTimestampStringFunction.java   |  60 +++
 3 files changed, 256 insertions(+), 1 deletion(-)

diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowTimestampStringFunctionIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowTimestampStringFunctionIT.java
new file mode 100644
index 000..d725989
--- /dev/null
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/RowTimestampStringFunctionIT.java
@@ -0,0 +1,194 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.phoenix.end2end;
+
+import com.google.common.collect.Lists;
+import org.apache.hadoop.hbase.TableName;
+import org.apache.hadoop.hbase.client.ConnectionFactory;
+import org.apache.hadoop.hbase.client.Get;
+import org.apache.hadoop.hbase.client.Result;
+import org.apache.hadoop.hbase.client.ResultScanner;
+import org.apache.hadoop.hbase.client.Scan;
+import org.apache.hadoop.hbase.client.Table;
+import org.apache.hadoop.hbase.util.Bytes;
+import org.apache.phoenix.query.QueryConstants;
+import org.apache.phoenix.schema.PTable.ImmutableStorageScheme;
+import org.apache.phoenix.schema.PTable.QualifierEncodingScheme;
+import org.apache.phoenix.util.EncodedColumnsUtil;
+import org.apache.phoenix.util.EnvironmentEdgeManager;
+import org.junit.Test;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
+
+import java.sql.Connection;
+import java.sql.Date;
+import java.sql.DriverManager;
+import java.sql.PreparedStatement;
+import java.sql.ResultSet;
+import java.util.Collection;
+import java.util.List;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertFalse;
+
+@RunWith(Parameterized.class)
+public class RowTimestampStringFunctionIT extends ParallelStatsDisabledIT {
+
+    private final boolean encoded;
+    private final String tableDDLOptions;
+
+    public RowTimestampStringFunctionIT(QualifierEncodingScheme encoding,
+            ImmutableStorageScheme storage) {
+        StringBuilder optionBuilder = new StringBuilder();
+        optionBuilder.append(" COLUMN_ENCODED_BYTES = " + encoding.ordinal());
+        optionBuilder.append(",IMMUTABLE_STORAGE_SCHEME = " + storage.toString());
+        this.tableDDLOptions = optionBuilder.toString();
+        this.encoded = (encoding != QualifierEncodingScheme.NON_ENCODED_QUALIFIERS) ? true : false;
+    }
+
+    @Parameterized.Parameters(name = "encoding={0},storage={1}")
+    public static synchronized Collection<Object[]> data() {
+        List<Object[]> list = Lists.newArrayList();
+        for (QualifierEncodingScheme encoding : QualifierEncodingScheme.values()) {
+            for (ImmutableStorageScheme storage : ImmutableStorageScheme.values()) {
+                list.add(new Object[]{encoding, storage});
+            }
+        }
+        return list;
+    }
+
+    private void verifyHbaseAllRowsTimestamp(String tableName, ResultSet rs, int expectedRowCount)
+            throws Exception {
+
+        Scan scan = new Scan();
+        byte[] emptyKVQualifier = EncodedColumnsUtil.getEmptyKeyValueInfo(this.encoded).getFirst();
+        try (org.apache.hadoop.hbase.client.Connection hconn =
+                ConnectionFactory.createConnection(config)) {
+            Table table = hconn.getTable(TableName.valueOf(tableName));
+            ResultScanner resultScanner = table.getScanner(scan);
+            int rowCount = 0;
+            while (rs.next()) {
+                Result result = resultScanner.next();
+                long t

Apache Phoenix - Timeout crawler - Build https://builds.apache.org/job/Phoenix-master-matrix/1/

2020-02-11 Thread Apache Jenkins Server
[...truncated 21 lines...]
Looking at the log, list of test(s) that timed-out:

Build:
https://builds.apache.org/job/Phoenix-master-matrix/1/


Affected test class(es):
Set(['as SYSTEM'])


Build step 'Execute shell' marked build as failure
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any

Apache Phoenix - Timeout crawler - Build https://builds.apache.org/job/Phoenix-master-matrix/2/

2020-02-11 Thread Apache Jenkins Server
[...truncated 21 lines...]
Looking at the log, list of test(s) that timed-out:

Build:
https://builds.apache.org/job/Phoenix-master-matrix/2/


Affected test class(es):
Set(['as SYSTEM'])


Build step 'Execute shell' marked build as failure
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any

Build failed in Jenkins: Phoenix-4.x-HBase-1.5 #271

2020-02-11 Thread Apache Jenkins Server
See 


Changes:

[s.kadam] PHOENIX-5629 Phoenix Function to return HBase row timestamp


--
[...truncated 104.30 KB...]
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] 
[INFO] Results:
[INFO] 
[INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.22.0:integration-test (NeedTheirOwnClusterTests) @ phoenix-core ---
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] Running org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
[WARNING] Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.003 s - in org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
[INFO] Running org.apache.phoenix.end2end.ConnectionUtilIT
[INFO] Running org.apache.hadoop.hbase.regionserver.wal.WALRecoveryRegionPostOpenIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.134 s - in org.apache.hadoop.hbase.regionserver.wal.WALRecoveryRegionPostOpenIT
[INFO] Running org.apache.phoenix.end2end.CsvBulkLoadToolIT
[INFO] Running org.apache.phoenix.end2end.ConcurrentMutationsExtendedIT
[INFO] Running org.apache.phoenix.end2end.CountDistinctCompressionIT
[INFO] Running org.apache.phoenix.end2end.ContextClassloaderIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.52 s - in org.apache.phoenix.end2end.CountDistinctCompressionIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.498 s - in org.apache.phoenix.end2end.ContextClassloaderIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.614 s - in org.apache.phoenix.end2end.ConnectionUtilIT
[INFO] Running org.apache.phoenix.end2end.CostBasedDecisionIT
[INFO] Running org.apache.phoenix.end2end.DropSchemaIT
[INFO] Running org.apache.phoenix.end2end.FlappingLocalIndexIT
[INFO] Running org.apache.phoenix.end2end.IndexBuildTimestampIT
[INFO] Running org.apache.phoenix.end2end.IndexExtendedIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.821 s - in org.apache.phoenix.end2end.DropSchemaIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 122.791 s - in org.apache.phoenix.end2end.CsvBulkLoadToolIT
[INFO] Running org.apache.phoenix.end2end.IndexRebuildTaskIT
[INFO] Running org.apache.phoenix.end2end.IndexScrutinyToolForTenantIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 143.648 s - in org.apache.phoenix.end2end.IndexBuildTimestampIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 149.243 s - in org.apache.phoenix.end2end.FlappingLocalIndexIT
[INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 234.159 s - in org.apache.phoenix.end2end.ConcurrentMutationsExtendedIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 65.306 s - in org.apache.phoenix.end2end.IndexScrutinyToolForTenantIT
[INFO] Tests run: 48, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 188.577 s - in org.apache.phoenix.end2end.IndexExtendedIT
[INFO] Running org.apache.phoenix.end2end.IndexScrutinyToolIT
[INFO] Running org.apache.phoenix.end2end.IndexToolForPartialBuildIT
[INFO] Running org.apache.phoenix.end2end.IndexToolForPartialBuildWithNamespaceEnabledIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.37 s - in org.apache.phoenix.end2end.IndexToolForPartialBuildIT
[INFO] Running org.apache.phoenix.end2end.IndexToolIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 36.735 s - in org.apache.phoenix.end2end.IndexToolForPartialBuildWithNamespaceEnabledIT
[INFO] Running org.apache.phoenix.end2end.LocalIndexSplitMergeIT
[INFO] Running org.apache.phoenix.end2end.MigrateSystemTablesToSystemNamespaceIT
[INFO] Running org.apache.phoenix.end2end.MaxLookbackIT
[ERROR] Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 222.709 s <<< FAILURE! - in org.apache.phoenix.end2end.IndexRebuildTaskIT
[ERROR] testIndexRebuildTask(org.apache.phoenix.end2end.IndexRebuildTaskIT)  Time elapsed: 222.708 s  <<< FAILURE!
java.lang.AssertionError: Ran out of time waiting for task state to become COMPLETED
	at org.apache.phoenix.end2end.IndexRebuildTaskIT.waitForTaskState(IndexRebuildTaskIT.java:196)
	at org.apache.phoenix.end2end.IndexRebuildTaskIT.testIndexRebuildTask(IndexRebuildTaskIT.java:156)

[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 55.515 s - in org.apache.phoenix.end2end.MaxLookbackIT
[INFO] Running org.apache.phoenix.end2end.OrderByWithServerClientSpoolingDisabledIT
[INFO] Running org.apache.phoenix.end2end.OrderByWithServerMemoryLimit
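The IndexRebuildTaskIT failure above ("Ran out of time waiting for task state to become COMPLETED") comes from a bounded polling wait timing out. The general shape of such a wait helper, sketched here as a hypothetical stand-in rather than Phoenix's actual waitForTaskState, is:

```java
import java.util.function.Supplier;

public class PollUntil {
    // Polls cond every intervalMs until it returns true or timeoutMs elapses;
    // returns the final value of the condition.
    static boolean waitFor(Supplier<Boolean> cond, long timeoutMs, long intervalMs)
            throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
            if (cond.get()) {
                return true;
            }
            Thread.sleep(intervalMs);
        }
        return cond.get(); // one last check at the deadline
    }

    public static void main(String[] args) throws InterruptedException {
        long start = System.currentTimeMillis();
        // Condition becomes true after roughly 50 ms, well inside the 1 s budget.
        boolean ok = waitFor(() -> System.currentTimeMillis() - start > 50, 1000L, 10L);
        System.out.println(ok);
    }
}
```

A test that throws AssertionError when the deadline passes, as IndexRebuildTaskIT does, is simply asserting on the returned boolean.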