[ https://issues.apache.org/jira/browse/PHOENIX-6010?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17193433#comment-17193433 ]
Hadoop QA commented on PHOENIX-6010:
------------------------------------
{color:red}-1 overall{color}. Here are the results of testing the latest attachment
http://issues.apache.org/jira/secure/attachment/13011308/PHOENIX-6010.master.v8.patch
against master branch at commit .
ATTACHMENT ID: 13011308
{color:green}+1 @author{color}. The patch does not contain any @author tags.
{color:green}+1 tests included{color}. The patch appears to include 0 new or modified tests.
{color:red}-1 javac{color}. The applied patch generated 335 javac compiler warnings (more than the master's current 303 warnings).
{color:green}+1 release audit{color}. The applied patch does not increase the total number of release audit warnings.
{color:red}-1 lineLengths{color}. The patch introduces the following lines longer than 100:
+ return MoreObjects.firstNonNull(super.getMessage(), "") + " " + FAILED_MSG + "\n\t table: " + this.table + "\n\t edits: " + mutationsMsg
+ idxKey2.addAll(org.apache.phoenix.thirdparty.com.google.common.primitives.Bytes.asList(Bytes.toBytes(ROW_KEY)));
+ byte[] idxKeyBytes2 = org.apache.phoenix.thirdparty.com.google.common.primitives.Bytes.toArray(idxKey2);
+ idxKey.addAll(org.apache.phoenix.thirdparty.com.google.common.primitives.Bytes.asList(Bytes.toBytes(indexVal)));
+ idxKey.addAll(org.apache.phoenix.thirdparty.com.google.common.primitives.Bytes.asList(Bytes.toBytes(ROW_KEY)));
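The flagged lines run long because the relocated Guava helper is spelled out with its fully qualified name; the unqualified Bytes in the same expressions apparently resolves to org.apache.hadoop.hbase.util.Bytes, so the relocated class name cannot simply be imported. A minimal, self-contained sketch of the pattern, assuming static imports of the two relocated helper methods as one hypothetical way to keep the call sites short:
{code:java}
import static org.apache.phoenix.thirdparty.com.google.common.primitives.Bytes.asList;
import static org.apache.phoenix.thirdparty.com.google.common.primitives.Bytes.toArray;

import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.hbase.util.Bytes; // HBase serialization helpers

public class IndexKeySketch {
    public static void main(String[] args) {
        // Hypothetical values; the real test constants are not shown in this report.
        String rowKey = "row1";
        long indexVal = 42L;

        // Accumulate the serialized index value, then the row key.
        List<Byte> idxKey = new ArrayList<>();
        idxKey.addAll(asList(Bytes.toBytes(indexVal))); // relocated Guava Bytes.asList
        idxKey.addAll(asList(Bytes.toBytes(rowKey)));

        // Materialize the accumulated bytes as a single byte[].
        byte[] idxKeyBytes = toArray(idxKey); // relocated Guava Bytes.toArray
        System.out.println(idxKeyBytes.length); // 8 (long) + 4 (UTF-8 "row1") = 12
    }
}
{code}
Static imports dodge the class-name clash because they bind the method names asList and toArray rather than a second Bytes class.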
{color:green}+1 core tests{color}. The patch passed unit tests in .
{color:red}-1 core zombie tests{color}. There are 7 zombie test(s):
at org.apache.phoenix.end2end.AlterTableIT.testAddVarCols(AlterTableIT.java:417)
at org.apache.phoenix.end2end.DateTimeIT.testUnsignedTimeDateWithLiteral(DateTimeIT.java:667)
at org.apache.phoenix.end2end.DefaultColumnValueIT.testDefaultUpsertSelectPrimaryKey(DefaultColumnValueIT.java:644)
at org.apache.phoenix.end2end.DeleteIT.testPointDeleteWithMultipleImmutableIndexes(DeleteIT.java:781)
at org.apache.phoenix.end2end.DeleteIT.testPointDeleteWithMultipleImmutableIndexesAfterAlter(DeleteIT.java:764)
Test results:
https://ci-hadoop.apache.org/job/PreCommit-PHOENIX-Build/80//testReport/
Code Coverage results:
https://ci-hadoop.apache.org/job/PreCommit-PHOENIX-Build/80//artifact/phoenix-core/target/site/jacoco/index.html
Console output:
https://ci-hadoop.apache.org/job/PreCommit-PHOENIX-Build/80//console
This message is automatically generated.
> Create phoenix-thirdparty, and consume guava through it
> -------------------------------------------------------
>
> Key: PHOENIX-6010
> URL: https://issues.apache.org/jira/browse/PHOENIX-6010
> Project: Phoenix
> Issue Type: Improvement
> Components: core, omid, tephra
> Affects Versions: 5.1.0, 4.16.0
> Reporter: Istvan Toth
> Assignee: Istvan Toth
> Priority: Major
> Attachments: PHOENIX-6010.master.v1.patch,
> PHOENIX-6010.master.v2.patch, PHOENIX-6010.master.v3.patch,
> PHOENIX-6010.master.v4.patch, PHOENIX-6010.master.v5.patch,
> PHOENIX-6010.master.v6.patch, PHOENIX-6010.master.v7.patch,
> PHOENIX-6010.master.v8.patch
>
> Time Spent: 40m
> Remaining Estimate: 0h
>
> We have long-standing and well-documented problems with Guava, just like the rest of the Hadoop ecosystem.
> Adopt the solution used by HBase:
> * create a phoenix-thirdparty repo
> * create a pre-shaded phoenix-shaded-guava artifact in it
> * use the pre-shaded Guava in every Phoenix component (a consumption sketch follows the advantages list below)
> The advantages are well-known, but to name a few:
> * Phoenix will work with Hadoop 3.1.3+
> * One less CVE in our direct dependencies
> * No more conflicts with our consumers' Guava versions
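> A minimal sketch of the consumption change, assuming the relocation prefix
> org.apache.phoenix.thirdparty used in the v8 patch: only the import line
> changes, and call sites stay identical.
> {code:java}
> // Before: import com.google.common.base.Preconditions;
> import org.apache.phoenix.thirdparty.com.google.common.base.Preconditions;
>
> public class ShadedGuavaExample {
>     static int checkedPort(int port) {
>         // Same Guava API, now served from the relocated package, so it
>         // cannot conflict with whatever Guava version Hadoop or a
>         // downstream consumer puts on the classpath.
>         Preconditions.checkArgument(port > 0 && port < 65536, "bad port: %s", port);
>         return port;
>     }
> }
> {code}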
--
This message was sent by Atlassian Jira
(v8.3.4#803005)