[
https://issues.apache.org/jira/browse/PHOENIX-6010?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17199256#comment-17199256
]
Hadoop QA commented on PHOENIX-6010:
------------------------------------
{color:red}-1 overall{color}. Here are the results of testing the latest
attachment
http://issues.apache.org/jira/secure/attachment/13011824/PHOENIX-6010.master.10.patch
against master branch at commit .
ATTACHMENT ID: 13011824
{color:green}+1 @author{color}. The patch does not contain any @author
tags.
{color:green}+1 tests included{color}. The patch appears to include 0 new
or modified tests.
{color:red}-1 javac{color}. The applied patch generated 335 javac compiler
warnings (more than the master's current 303 warnings).
{color:green}+1 release audit{color}. The applied patch does not increase
the total number of release audit warnings.
{color:red}-1 lineLengths{color}. The patch introduces the following lines
longer than 100:
+ return MoreObjects.firstNonNull(super.getMessage(), "") + " " + FAILED_MSG + "\n\t table: " + this.table + "\n\t edits: " + mutationsMsg
+ idxKey2.addAll(org.apache.phoenix.thirdparty.com.google.common.primitives.Bytes.asList(Bytes.toBytes(ROW_KEY)));
+ byte[] idxKeyBytes2 = org.apache.phoenix.thirdparty.com.google.common.primitives.Bytes.toArray(idxKey2);
+ idxKey.addAll(org.apache.phoenix.thirdparty.com.google.common.primitives.Bytes.asList(Bytes.toBytes(indexVal)));
+ idxKey.addAll(org.apache.phoenix.thirdparty.com.google.common.primitives.Bytes.asList(Bytes.toBytes(ROW_KEY)));
+ idxKey.addAll(org.apache.phoenix.thirdparty.com.google.common.primitives.Bytes.asList(Bytes.toBytes(dataVal)));
+ idxKey.addAll(org.apache.phoenix.thirdparty.com.google.common.primitives.Bytes.asList(Bytes.toBytes(dataRowKey)));
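The flagged lines build composite index row keys by concatenating byte-array components through the relocated Guava `Bytes.asList`/`Bytes.toArray` helpers (fully qualified, hence the length). A rough plain-JDK sketch of the same concatenation, with illustrative class and variable names not taken from the patch:

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;

public class IndexKeySketch {
    // Concatenate byte-array key components into one index row key,
    // mirroring what the flagged test lines do with the relocated
    // Guava Bytes.asList/Bytes.toArray pair.
    static byte[] concat(byte[]... parts) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (byte[] part : parts) {
            out.write(part, 0, part.length);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) {
        byte[] indexVal = "v1".getBytes(StandardCharsets.UTF_8);
        byte[] rowKey = "rk".getBytes(StandardCharsets.UTF_8);
        byte[] idxKey = concat(indexVal, rowKey);
        // 2 + 2 component bytes -> 4-byte composite key "v1rk"
        System.out.println(new String(idxKey, StandardCharsets.UTF_8));
    }
}
```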
{color:green}+1 core tests{color}. The patch passed unit tests in .
Test results:
https://ci-hadoop.apache.org/job/PreCommit-PHOENIX-Build/111//testReport/
Code Coverage results:
https://ci-hadoop.apache.org/job/PreCommit-PHOENIX-Build/111//artifact/phoenix-core/target/site/jacoco/index.html
Console output:
https://ci-hadoop.apache.org/job/PreCommit-PHOENIX-Build/111//console
This message is automatically generated.
> Create phoenix-thirdparty, and consume guava through it
> -------------------------------------------------------
>
> Key: PHOENIX-6010
> URL: https://issues.apache.org/jira/browse/PHOENIX-6010
> Project: Phoenix
> Issue Type: Improvement
> Components: core, omid, tephra
> Affects Versions: 5.1.0, 4.16.0
> Reporter: Istvan Toth
> Assignee: Istvan Toth
> Priority: Major
> Attachments: PHOENIX-6010.master.10.patch,
> PHOENIX-6010.master.v1.patch, PHOENIX-6010.master.v2.patch,
> PHOENIX-6010.master.v3.patch, PHOENIX-6010.master.v4.patch,
> PHOENIX-6010.master.v5.patch, PHOENIX-6010.master.v6.patch,
> PHOENIX-6010.master.v7.patch, PHOENIX-6010.master.v8.patch,
> PHOENIX-6010.master.v9.patch
>
> Time Spent: 2h
> Remaining Estimate: 0h
>
> We have long-standing and well-documented problems with Guava, just like the
> rest of the Hadoop ecosystem.
> Adopt the solution used by HBase:
> * Create the phoenix-thirdparty repo
> * Create a pre-shaded phoenix-shaded-guava artifact in it
> * Use the pre-shaded Guava in every Phoenix component
> The advantages are well-known, but to name a few:
> * Phoenix will work with Hadoop 3.1.3+
> * One less CVE in our direct dependencies
> * No more conflicts with our consumers' Guava versions
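The adoption steps above can be sketched as a Maven dependency fragment. The coordinates follow the hbase-thirdparty convention and are assumptions for illustration, not taken from the patch; check phoenix-thirdparty itself for the real artifact and version:

```xml
<!-- Depend on the pre-shaded Guava instead of com.google.guava:guava.
     groupId/artifactId/version here are illustrative, not from the patch. -->
<dependency>
  <groupId>org.apache.phoenix.thirdparty</groupId>
  <artifactId>phoenix-shaded-guava</artifactId>
  <version>1.0.0</version>
</dependency>
```

Code then imports the relocated classes (e.g. `org.apache.phoenix.thirdparty.com.google.common.primitives.Bytes`) rather than `com.google.common.*`, so a consumer's own Guava version on the classpath can no longer conflict.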
--
This message was sent by Atlassian Jira
(v8.3.4#803005)