[jira] [Commented] (PHOENIX-6444) Extend Cell Tags to Delete object for Indexer coproc

2021-05-27 Thread Viraj Jasani (Jira)


[ 
https://issues.apache.org/jira/browse/PHOENIX-6444?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17352949#comment-17352949
 ] 

Viraj Jasani commented on PHOENIX-6444:
---

Backported to 5.1; will keep an eye on the Phoenix multibranch build (post-commit 
build) for 5.1.

> Extend Cell Tags to Delete object for Indexer coproc
> 
>
> Key: PHOENIX-6444
> URL: https://issues.apache.org/jira/browse/PHOENIX-6444
> Project: Phoenix
>  Issue Type: Improvement
>  Components: core
>Reporter: Rushabh Shah
>Assignee: Rushabh Shah
>Priority: Major
> Fix For: 4.17.0, 5.2.0, 5.1.2
>
>
> In PHOENIX-6213 we added support for adding a source-of-operation cell tag to 
> Delete markers. But we added the logic to create a TagRewriteCell and add it to 
> the Delete marker only in the IndexRegionObserver coproc. I missed adding the 
> same logic to the Indexer coproc. Thank you [~tkhurana] for finding this bug.
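
For context, the change described above amounts to rewriting each cell of a Delete 
marker so that it carries a source-of-operation tag. Below is a minimal, hedged 
sketch of that idea against HBase 2.x internals (ArrayBackedTag, PrivateCellUtil); 
the tag-type constant and helper method are hypothetical, PrivateCellUtil is an 
internal API whose signatures can vary between HBase versions, and this is not the 
actual Phoenix patch:

{code:java}
// Hedged sketch only: attach a source-of-operation tag to every cell of a Delete.
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import org.apache.hadoop.hbase.ArrayBackedTag;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.PrivateCellUtil;
import org.apache.hadoop.hbase.Tag;
import org.apache.hadoop.hbase.client.Delete;

public final class DeleteTagSketch {

    // Hypothetical tag-type byte; the real constant lives in Phoenix, not here.
    private static final byte SOURCE_OPERATION_TAG_TYPE = (byte) 50;

    /** Rewrites each cell of the Delete marker so it carries a source-of-operation tag. */
    static void addSourceOfOperationTag(Delete delete, byte[] sourceOfOperation) {
        Tag sourceTag = new ArrayBackedTag(SOURCE_OPERATION_TAG_TYPE, sourceOfOperation);
        for (Map.Entry<byte[], List<Cell>> entry : delete.getFamilyCellMap().entrySet()) {
            List<Cell> cells = entry.getValue();
            for (int i = 0; i < cells.size(); i++) {
                // Existing tags, if any, are ignored in this sketch for brevity.
                List<Tag> tags = new ArrayList<>();
                tags.add(sourceTag);
                // createCell wraps the original cell with the extra tags
                // (a TagRewriteCell internally).
                cells.set(i, PrivateCellUtil.createCell(cells.get(i), tags));
            }
        }
    }
}
{code}

Per the description, the actual fix is to apply this tagging logic in the Indexer 
coproc as well, not only in IndexRegionObserver.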



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (PHOENIX-6444) Extend Cell Tags to Delete object for Indexer coproc

2021-05-27 Thread Lars Hofhansl (Jira)


[ 
https://issues.apache.org/jira/browse/PHOENIX-6444?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17352930#comment-17352930
 ] 

Lars Hofhansl commented on PHOENIX-6444:


If the master change applies (mostly) cleanly, we can just cherry-pick the 
change into 5.1 without a separate PR.

> Extend Cell Tags to Delete object for Indexer coproc
> 
>
> Key: PHOENIX-6444
> URL: https://issues.apache.org/jira/browse/PHOENIX-6444
> Project: Phoenix
>  Issue Type: Improvement
>  Components: core
>Reporter: Rushabh Shah
>Assignee: Rushabh Shah
>Priority: Major
> Fix For: 4.17.0, 5.2.0
>
>
> In PHOENIX-6213 we added support for adding a source-of-operation cell tag to 
> Delete markers. But we added the logic to create a TagRewriteCell and add it to 
> the Delete marker only in the IndexRegionObserver coproc. I missed adding the 
> same logic to the Indexer coproc. Thank you [~tkhurana] for finding this bug.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (PHOENIX-6476) Index tool when verifying from index to data doesn't correctly split page into tasks

2021-05-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/PHOENIX-6476?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17352675#comment-17352675
 ] 

ASF GitHub Bot commented on PHOENIX-6476:
-

tkhurana commented on a change in pull request #1240:
URL: https://github.com/apache/phoenix/pull/1240#discussion_r640858365



##
File path: phoenix-core/src/main/java/org/apache/phoenix/coprocessor/IndexRepairRegionScanner.java
##
@@ -303,20 +306,47 @@ public Boolean call() throws Exception {
         return dataRowKeys;
     }
 
+    /**
+     * @param indexMutationMap actual index mutations for a page
+     * @param dataRowKeysSetList List of per-task data row keys
+     * @return For each set of data row keys, split the actual index mutation map into
+     * a per-task index mutation map and return the list of all index mutation maps.
+     */
+    private List<Map<byte[], List<Mutation>>> getPerTaskIndexMutationMap(
+            Map<byte[], List<Mutation>> indexMutationMap, List<Set<byte[]>> dataRowKeysSetList) {
+        List<Map<byte[], List<Mutation>>> mapList = Lists.newArrayListWithExpectedSize(dataRowKeysSetList.size());
+        for (int i = 0; i < dataRowKeysSetList.size(); ++i) {
+            Map<byte[], List<Mutation>> perTaskIndexMutationMap = new TreeMap<>(Bytes.BYTES_COMPARATOR);
+            mapList.add(perTaskIndexMutationMap);
+        }
+        for (Map.Entry<byte[], List<Mutation>> entry : indexMutationMap.entrySet()) {
+            byte[] indexRowKey = entry.getKey();
+            List<Mutation> actualMutationList = entry.getValue();
+            byte[] dataRowKey = indexMaintainer.buildDataRowKey(new ImmutableBytesWritable(indexRowKey), viewConstants);
+            for (int i = 0; i < dataRowKeysSetList.size(); ++i) {
+                if (dataRowKeysSetList.get(i).contains(dataRowKey)) {
+                    mapList.get(i).put(indexRowKey, actualMutationList);

Review comment:
   Yes, that would avoid unnecessary lookups.
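
The change being agreed to here is the one gokceni asks about further down in this 
digest: once the owning task's set is found, stop probing the remaining sets, since 
the per-task sets partition the page's data row keys. A minimal, hedged sketch of 
that inner loop, reusing the names from the hunk above (not the committed patch):

{code:java}
import java.util.List;
import java.util.Map;
import java.util.Set;

import org.apache.hadoop.hbase.client.Mutation;

final class PerTaskRoutingSketch {

    /** Routes one index row's mutations to the single task whose data-row-key set owns it. */
    static void route(byte[] dataRowKey, byte[] indexRowKey, List<Mutation> actualMutationList,
            List<Set<byte[]>> dataRowKeysSetList, List<Map<byte[], List<Mutation>>> mapList) {
        for (int i = 0; i < dataRowKeysSetList.size(); ++i) {
            if (dataRowKeysSetList.get(i).contains(dataRowKey)) {
                mapList.get(i).put(indexRowKey, actualMutationList);
                // The per-task sets partition the page's data row keys,
                // so no later set can match.
                break;
            }
        }
    }
}
{code}

Under that partitioning assumption the break only skips redundant lookups; it does 
not change which task receives the mutations.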




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Index tool when verifying from index to data doesn't correctly split page 
> into tasks
> 
>
> Key: PHOENIX-6476
> URL: https://issues.apache.org/jira/browse/PHOENIX-6476
> Project: Phoenix
>  Issue Type: Bug
>Affects Versions: 4.14.3, 4.16.0, 4.16.1
>Reporter: Tanuj Khurana
>Assignee: Tanuj Khurana
>Priority: Major
>
> When running the index tool with the index table as the source, it splits a page 
> into tasks when the page size is greater than the configured task size (default 
> 2048) and runs each task in parallel. Each task is assigned a set of data row 
> keys, but the index mutation map is not split according to the data row keys 
> assigned to a particular task. As a result, the tool reports incorrect results, 
> because the index mutation map is per page while the set of data row keys is 
> per task.
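
For a concrete picture of the intended behaviour: the page's data row keys are 
carved into per-task chunks of at most the configured task size, and verification 
is only correct when the index mutation map is split along the same boundaries 
(which is what the getPerTaskIndexMutationMap hunk earlier in this digest does). 
A minimal sketch of the partitioning step, using plain Guava and HBase utilities; 
the method name and constant are illustrative assumptions, not the IndexTool code:

{code:java}
import java.util.ArrayList;
import java.util.List;
import java.util.Set;
import java.util.TreeSet;

import com.google.common.collect.Lists;
import org.apache.hadoop.hbase.util.Bytes;

final class TaskPartitionSketch {

    // Default task size mentioned in the issue description (assumed here as a constant).
    static final int DEFAULT_TASK_SIZE = 2048;

    /** Carves the page's data row keys into per-task sets of at most taskSize keys each. */
    static List<Set<byte[]>> getPerTaskDataRowKeys(List<byte[]> pageDataRowKeys, int taskSize) {
        List<Set<byte[]>> perTaskSets = new ArrayList<>();
        for (List<byte[]> chunk : Lists.partition(pageDataRowKeys, taskSize)) {
            // byte[] has no value-based equals, so use HBase's byte-array comparator.
            Set<byte[]> taskKeys = new TreeSet<>(Bytes.BYTES_COMPARATOR);
            taskKeys.addAll(chunk);
            perTaskSets.add(taskKeys);
        }
        return perTaskSets;
    }
}
{code}

Each of these sets is what getPerTaskIndexMutationMap receives; the bug described 
above is that, before the fix, every task saw the whole page's index mutation map 
rather than its own slice.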



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [phoenix] tkhurana commented on a change in pull request #1240: PHOENIX-6476 Index tool when verifying from index to data doesn't correctly split page into tasks

2021-05-27 Thread GitBox


tkhurana commented on a change in pull request #1240:
URL: https://github.com/apache/phoenix/pull/1240#discussion_r640858365



##
File path: phoenix-core/src/main/java/org/apache/phoenix/coprocessor/IndexRepairRegionScanner.java
##
@@ -303,20 +306,47 @@ public Boolean call() throws Exception {
         return dataRowKeys;
     }
 
+    /**
+     * @param indexMutationMap actual index mutations for a page
+     * @param dataRowKeysSetList List of per-task data row keys
+     * @return For each set of data row keys, split the actual index mutation map into
+     * a per-task index mutation map and return the list of all index mutation maps.
+     */
+    private List<Map<byte[], List<Mutation>>> getPerTaskIndexMutationMap(
+            Map<byte[], List<Mutation>> indexMutationMap, List<Set<byte[]>> dataRowKeysSetList) {
+        List<Map<byte[], List<Mutation>>> mapList = Lists.newArrayListWithExpectedSize(dataRowKeysSetList.size());
+        for (int i = 0; i < dataRowKeysSetList.size(); ++i) {
+            Map<byte[], List<Mutation>> perTaskIndexMutationMap = new TreeMap<>(Bytes.BYTES_COMPARATOR);
+            mapList.add(perTaskIndexMutationMap);
+        }
+        for (Map.Entry<byte[], List<Mutation>> entry : indexMutationMap.entrySet()) {
+            byte[] indexRowKey = entry.getKey();
+            List<Mutation> actualMutationList = entry.getValue();
+            byte[] dataRowKey = indexMaintainer.buildDataRowKey(new ImmutableBytesWritable(indexRowKey), viewConstants);
+            for (int i = 0; i < dataRowKeysSetList.size(); ++i) {
+                if (dataRowKeysSetList.get(i).contains(dataRowKey)) {
+                    mapList.get(i).put(indexRowKey, actualMutationList);

Review comment:
   Yes, that would avoid unnecessary lookups.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (PHOENIX-6476) Index tool when verifying from index to data doesn't correctly split page into tasks

2021-05-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/PHOENIX-6476?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17352669#comment-17352669
 ] 

ASF GitHub Bot commented on PHOENIX-6476:
-

gokceni commented on a change in pull request #1240:
URL: https://github.com/apache/phoenix/pull/1240#discussion_r640841573



##
File path: phoenix-core/src/main/java/org/apache/phoenix/coprocessor/IndexRepairRegionScanner.java
##
@@ -303,20 +306,47 @@ public Boolean call() throws Exception {
         return dataRowKeys;
     }
 
+    /**
+     * @param indexMutationMap actual index mutations for a page
+     * @param dataRowKeysSetList List of per-task data row keys
+     * @return For each set of data row keys, split the actual index mutation map into
+     * a per-task index mutation map and return the list of all index mutation maps.
+     */
+    private List<Map<byte[], List<Mutation>>> getPerTaskIndexMutationMap(
+            Map<byte[], List<Mutation>> indexMutationMap, List<Set<byte[]>> dataRowKeysSetList) {
+        List<Map<byte[], List<Mutation>>> mapList = Lists.newArrayListWithExpectedSize(dataRowKeysSetList.size());
+        for (int i = 0; i < dataRowKeysSetList.size(); ++i) {
+            Map<byte[], List<Mutation>> perTaskIndexMutationMap = new TreeMap<>(Bytes.BYTES_COMPARATOR);
+            mapList.add(perTaskIndexMutationMap);
+        }
+        for (Map.Entry<byte[], List<Mutation>> entry : indexMutationMap.entrySet()) {
+            byte[] indexRowKey = entry.getKey();
+            List<Mutation> actualMutationList = entry.getValue();
+            byte[] dataRowKey = indexMaintainer.buildDataRowKey(new ImmutableBytesWritable(indexRowKey), viewConstants);
+            for (int i = 0; i < dataRowKeysSetList.size(); ++i) {
+                if (dataRowKeysSetList.get(i).contains(dataRowKey)) {
+                    mapList.get(i).put(indexRowKey, actualMutationList);

Review comment:
   don't you need to break?




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Index tool when verifying from index to data doesn't correctly split page 
> into tasks
> 
>
> Key: PHOENIX-6476
> URL: https://issues.apache.org/jira/browse/PHOENIX-6476
> Project: Phoenix
>  Issue Type: Bug
>Affects Versions: 4.14.3, 4.16.0, 4.16.1
>Reporter: Tanuj Khurana
>Assignee: Tanuj Khurana
>Priority: Major
>
> When running the index tool with the index table as the source, it splits a page 
> into tasks when the page size is greater than the configured task size (default 
> 2048) and runs each task in parallel. Each task is assigned a set of data row 
> keys, but the index mutation map is not split according to the data row keys 
> assigned to a particular task. As a result, the tool reports incorrect results, 
> because the index mutation map is per page while the set of data row keys is 
> per task.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [phoenix] gokceni commented on a change in pull request #1240: PHOENIX-6476 Index tool when verifying from index to data doesn't correctly split page into tasks

2021-05-27 Thread GitBox


gokceni commented on a change in pull request #1240:
URL: https://github.com/apache/phoenix/pull/1240#discussion_r640841573



##
File path: phoenix-core/src/main/java/org/apache/phoenix/coprocessor/IndexRepairRegionScanner.java
##
@@ -303,20 +306,47 @@ public Boolean call() throws Exception {
         return dataRowKeys;
     }
 
+    /**
+     * @param indexMutationMap actual index mutations for a page
+     * @param dataRowKeysSetList List of per-task data row keys
+     * @return For each set of data row keys, split the actual index mutation map into
+     * a per-task index mutation map and return the list of all index mutation maps.
+     */
+    private List<Map<byte[], List<Mutation>>> getPerTaskIndexMutationMap(
+            Map<byte[], List<Mutation>> indexMutationMap, List<Set<byte[]>> dataRowKeysSetList) {
+        List<Map<byte[], List<Mutation>>> mapList = Lists.newArrayListWithExpectedSize(dataRowKeysSetList.size());
+        for (int i = 0; i < dataRowKeysSetList.size(); ++i) {
+            Map<byte[], List<Mutation>> perTaskIndexMutationMap = new TreeMap<>(Bytes.BYTES_COMPARATOR);
+            mapList.add(perTaskIndexMutationMap);
+        }
+        for (Map.Entry<byte[], List<Mutation>> entry : indexMutationMap.entrySet()) {
+            byte[] indexRowKey = entry.getKey();
+            List<Mutation> actualMutationList = entry.getValue();
+            byte[] dataRowKey = indexMaintainer.buildDataRowKey(new ImmutableBytesWritable(indexRowKey), viewConstants);
+            for (int i = 0; i < dataRowKeysSetList.size(); ++i) {
+                if (dataRowKeysSetList.get(i).contains(dataRowKey)) {
+                    mapList.get(i).put(indexRowKey, actualMutationList);

Review comment:
   don't you need to break?




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [phoenix] stoty commented on pull request #1241: Remove duplicate entry of commons-io dependency in phoenix-pherf

2021-05-27 Thread GitBox


stoty commented on pull request #1241:
URL: https://github.com/apache/phoenix/pull/1241#issuecomment-849815891


   Committed to master and 5.1


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [phoenix] stoty closed pull request #1241: Remove duplicate entry of commons-io dependency in phoenix-pherf

2021-05-27 Thread GitBox


stoty closed pull request #1241:
URL: https://github.com/apache/phoenix/pull/1241


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (PHOENIX-6479) Duplicate commons-io dependency in phoenix-pherf

2021-05-27 Thread Istvan Toth (Jira)


[ 
https://issues.apache.org/jira/browse/PHOENIX-6479?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17352657#comment-17352657
 ] 

Istvan Toth commented on PHOENIX-6479:
--

The commits are 

https://github.com/apache/phoenix/commit/852adb132895ae887defe5832ef8e12705e1aee5
 

and 

https://github.com/apache/phoenix/commit/421ecb209909d8d31a893205f710edc0bb531c33

> Duplicate commons-io dependency in phoenix-pherf 
> -
>
> Key: PHOENIX-6479
> URL: https://issues.apache.org/jira/browse/PHOENIX-6479
> Project: Phoenix
>  Issue Type: Bug
>Affects Versions: 5.1.1
>Reporter: Istvan Toth
>Assignee: Martin Tzvetanov Grigorov
>Priority: Trivial
> Fix For: 5.2.0, 5.1.2
>
>
> The commons-io dependency is duplicated in the phoenix-pherf pom in master.
> This was reported and fixed by [~mgrigorov], and I committed the fix without a 
> JIRA reference by mistake.
> The commit message does not have a JIRA reference.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (PHOENIX-6378) Unbundle sqlline from phoenix-client

2021-05-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/PHOENIX-6378?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17352638#comment-17352638
 ] 

ASF GitHub Bot commented on PHOENIX-6378:
-

stoty commented on pull request #1239:
URL: https://github.com/apache/phoenix/pull/1239#issuecomment-849804462


   Could you update the ticket title and commit message?
   Something like:
   Unbundle sqlline from phoenix-client-embedded, and use it in sqlline.py


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Unbundle sqlline from phoenix-client
> 
>
> Key: PHOENIX-6378
> URL: https://issues.apache.org/jira/browse/PHOENIX-6378
> Project: Phoenix
>  Issue Type: Improvement
>  Components: core
>Affects Versions: 5.1.0, 4.16.0
>Reporter: Istvan Toth
>Assignee: Richárd Antal
>Priority: Major
>
> Ship sqlline separately, and adjust sqlline.py to load it.
> FYI [~dbwong]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [phoenix] stoty commented on pull request #1239: PHOENIX-6378 Unbundle sqlline from phoenix-client

2021-05-27 Thread GitBox


stoty commented on pull request #1239:
URL: https://github.com/apache/phoenix/pull/1239#issuecomment-849804462


   Could you update the ticket title and commit message?
   Something like:
   Unbundle sqlline from phoenix-client-embedded, and use it in sqlline.py


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [phoenix-queryserver] stoty commented on pull request #65: PHOENIX-6473 Add Hadoop JMXServlet as /jmx endpoint

2021-05-27 Thread GitBox


stoty commented on pull request #65:
URL: 
https://github.com/apache/phoenix-queryserver/pull/65#issuecomment-849680888


   I tried to figure out whether the JMX servlet is authenticated in Hadoop, but I 
got lost in the code.
   Did you check if you can access JMX without authentication in Hadoop?
   If yes, then it's fine; if not, then we should also require 
authentication here.
   
   Also, I think that a property to optionally disable the JMX server would be 
useful.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (PHOENIX-6378) Unbundle sqlline from phoenix-client

2021-05-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/PHOENIX-6378?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17352387#comment-17352387
 ] 

ASF GitHub Bot commented on PHOENIX-6378:
-

stoty commented on pull request #1239:
URL: https://github.com/apache/phoenix/pull/1239#issuecomment-849525495


   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | +0 :ok: |  reexec  |   0m 55s |  Docker mode activated.  |
   ||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  No case conflicting files 
found.  |
   | +1 :green_heart: |  @author  |   0m  0s |  The patch does not contain any 
@author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
   ||| _ master Compile Tests _ |
   | +0 :ok: |  mvndep  |   5m 14s |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  11m  7s |  master passed  |
   | +0 |  hbaserecompile  |  23m  6s |  HBase recompiled.  |
   | +1 :green_heart: |  compile  |   1m 32s |  master passed  |
   | +1 :green_heart: |  javadoc  |   3m  3s |  master passed  |
   ||| _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 20s |  Maven dependency ordering for patch  |
   | -1 :x: |  mvninstall  |   7m 54s |  root in the patch failed.  |
   | +0 |  hbaserecompile  |  14m 35s |  HBase recompiled.  |
   | -1 :x: |  compile  |   1m 29s |  root in the patch failed.  |
   | -1 :x: |  javac  |   1m 29s |  root in the patch failed.  |
   | -1 :x: |  pylint  |   0m  8s |  The patch generated 22 new + 115 unchanged 
- 10 fixed = 137 total (was 125)  |
   | +1 :green_heart: |  whitespace  |   0m  0s |  The patch has no whitespace 
issues.  |
   | +1 :green_heart: |  xml  |   0m 11s |  The patch has no ill-formed XML 
file.  |
   | -1 :x: |  javadoc  |   1m 21s |  root in the patch failed.  |
   | -1 :x: |  javadoc  |   0m 14s |  phoenix-assembly in the patch failed.  |
   ||| _ Other Tests _ |
   | -1 :x: |  unit  | 116m 13s |  root in the patch failed.  |
   | -1 :x: |  asflicense  |   3m  2s |  The patch generated 614 ASF License 
warnings.  |
   |  |   | 170m 48s |   |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/phoenix/pull/1239 |
   | Optional Tests | dupname asflicense pylint javac javadoc unit xml compile |
   | uname | Linux e2cfb2bde495 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev/phoenix-personality.sh |
   | git revision | master / 63cbb11 |
   | Default Java | Private Build-1.8.0_242-8u242-b08-0ubuntu3~16.04-b08 |
   | mvninstall | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/patch-mvninstall-root.txt
 |
   | compile | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/patch-compile-root.txt
 |
   | javac | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/patch-compile-root.txt
 |
   | pylint | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/diff-patch-pylint.txt
 |
   | javadoc | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/patch-javadoc-root.txt
 |
   | javadoc | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/patch-javadoc-phoenix-assembly.txt
 |
   | unit | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/patch-unit-root.txt
 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/testReport/
 |
   | asflicense | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/patch-asflicense-problems.txt
 |
   | Max. process+thread count | 15400 (vs. ulimit of 3) |
   | modules | C: phoenix-core phoenix-client-parent/phoenix-client 
phoenix-client-parent/phoenix-client-embedded . phoenix-assembly U: . |
   | Console output | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/console
 |
   | versions | git=2.7.4 maven=3.3.9 pylint=2.4.4 |
   | Powered by | Apache Yetus 

[GitHub] [phoenix] stoty commented on pull request #1239: PHOENIX-6378 Unbundle sqlline from phoenix-client

2021-05-27 Thread GitBox


stoty commented on pull request #1239:
URL: https://github.com/apache/phoenix/pull/1239#issuecomment-849525495


   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | +0 :ok: |  reexec  |   0m 55s |  Docker mode activated.  |
   ||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  No case conflicting files 
found.  |
   | +1 :green_heart: |  @author  |   0m  0s |  The patch does not contain any 
@author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
   ||| _ master Compile Tests _ |
   | +0 :ok: |  mvndep  |   5m 14s |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  11m  7s |  master passed  |
   | +0 |  hbaserecompile  |  23m  6s |  HBase recompiled.  |
   | +1 :green_heart: |  compile  |   1m 32s |  master passed  |
   | +1 :green_heart: |  javadoc  |   3m  3s |  master passed  |
   ||| _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 20s |  Maven dependency ordering for patch  |
   | -1 :x: |  mvninstall  |   7m 54s |  root in the patch failed.  |
   | +0 |  hbaserecompile  |  14m 35s |  HBase recompiled.  |
   | -1 :x: |  compile  |   1m 29s |  root in the patch failed.  |
   | -1 :x: |  javac  |   1m 29s |  root in the patch failed.  |
   | -1 :x: |  pylint  |   0m  8s |  The patch generated 22 new + 115 unchanged 
- 10 fixed = 137 total (was 125)  |
   | +1 :green_heart: |  whitespace  |   0m  0s |  The patch has no whitespace 
issues.  |
   | +1 :green_heart: |  xml  |   0m 11s |  The patch has no ill-formed XML 
file.  |
   | -1 :x: |  javadoc  |   1m 21s |  root in the patch failed.  |
   | -1 :x: |  javadoc  |   0m 14s |  phoenix-assembly in the patch failed.  |
   ||| _ Other Tests _ |
   | -1 :x: |  unit  | 116m 13s |  root in the patch failed.  |
   | -1 :x: |  asflicense  |   3m  2s |  The patch generated 614 ASF License 
warnings.  |
   |  |   | 170m 48s |   |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/phoenix/pull/1239 |
   | Optional Tests | dupname asflicense pylint javac javadoc unit xml compile |
   | uname | Linux e2cfb2bde495 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 
11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev/phoenix-personality.sh |
   | git revision | master / 63cbb11 |
   | Default Java | Private Build-1.8.0_242-8u242-b08-0ubuntu3~16.04-b08 |
   | mvninstall | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/patch-mvninstall-root.txt
 |
   | compile | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/patch-compile-root.txt
 |
   | javac | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/patch-compile-root.txt
 |
   | pylint | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/diff-patch-pylint.txt
 |
   | javadoc | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/patch-javadoc-root.txt
 |
   | javadoc | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/patch-javadoc-phoenix-assembly.txt
 |
   | unit | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/patch-unit-root.txt
 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/testReport/
 |
   | asflicense | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/artifact/yetus-general-check/output/patch-asflicense-problems.txt
 |
   | Max. process+thread count | 15400 (vs. ulimit of 3) |
   | modules | C: phoenix-core phoenix-client-parent/phoenix-client 
phoenix-client-parent/phoenix-client-embedded . phoenix-assembly U: . |
   | Console output | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-PreCommit-GitHub-PR/job/PR-1239/3/console
 |
   | versions | git=2.7.4 maven=3.3.9 pylint=2.4.4 |
   | Powered by | Apache Yetus 0.12.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the 

[jira] [Commented] (PHOENIX-6429) Add support for global connections and sequential data generators

2021-05-27 Thread Viraj Jasani (Jira)


[ 
https://issues.apache.org/jira/browse/PHOENIX-6429?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17352376#comment-17352376
 ] 

Viraj Jasani commented on PHOENIX-6429:
---

PHERF changes can be backported to 5.1 and 4.16, right?

> Add support for global connections and sequential data generators
> -
>
> Key: PHOENIX-6429
> URL: https://issues.apache.org/jira/browse/PHOENIX-6429
> Project: Phoenix
>  Issue Type: Sub-task
>Reporter: Jacob Isaac
>Assignee: Jacob Isaac
>Priority: Major
> Fix For: 4.17.0, 5.2.0
>
>
> We may at times want to upsert or query using global connections. 
> Also, add sequential data generators for data types beyond INTEGER and 
> VARCHAR.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (PHOENIX-6444) Extend Cell Tags to Delete object for Indexer coproc

2021-05-27 Thread Viraj Jasani (Jira)


[ 
https://issues.apache.org/jira/browse/PHOENIX-6444?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17352375#comment-17352375
 ] 

Viraj Jasani commented on PHOENIX-6444:
---

[~shahrs87] would you like to create a PR for 5.1 to get a QA run?

> Extend Cell Tags to Delete object for Indexer coproc
> 
>
> Key: PHOENIX-6444
> URL: https://issues.apache.org/jira/browse/PHOENIX-6444
> Project: Phoenix
>  Issue Type: Improvement
>  Components: core
>Reporter: Rushabh Shah
>Assignee: Rushabh Shah
>Priority: Major
> Fix For: 4.17.0, 5.2.0
>
>
> In PHOENIX-6213 we added support for adding a source-of-operation cell tag to 
> Delete markers. But we added the logic to create a TagRewriteCell and add it to 
> the Delete marker only in the IndexRegionObserver coproc. I missed adding the 
> same logic to the Indexer coproc. Thank you [~tkhurana] for finding this bug.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (PHOENIX-6444) Extend Cell Tags to Delete object for Indexer coproc

2021-05-27 Thread Viraj Jasani (Jira)


[ 
https://issues.apache.org/jira/browse/PHOENIX-6444?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17352373#comment-17352373
 ] 

Viraj Jasani commented on PHOENIX-6444:
---

Given that PHOENIX-6213 is already present in 5.1.0 and 4.16.0, I think it's 
good to include this addition in branches 5.1 and 4.16.

> Extend Cell Tags to Delete object for Indexer coproc
> 
>
> Key: PHOENIX-6444
> URL: https://issues.apache.org/jira/browse/PHOENIX-6444
> Project: Phoenix
>  Issue Type: Improvement
>  Components: core
>Reporter: Rushabh Shah
>Assignee: Rushabh Shah
>Priority: Major
> Fix For: 4.17.0, 5.2.0
>
>
> In PHOENIX-6213 we added support for adding a source-of-operation cell tag to 
> Delete markers. But we added the logic to create a TagRewriteCell and add it to 
> the Delete marker only in the IndexRegionObserver coproc. I missed adding the 
> same logic to the Indexer coproc. Thank you [~tkhurana] for finding this bug.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (PHOENIX-6271) Effective DDL generated by SchemaExtractionTool should maintain the order of PK and other columns

2021-05-27 Thread Viraj Jasani (Jira)


[ 
https://issues.apache.org/jira/browse/PHOENIX-6271?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17352369#comment-17352369
 ] 

Viraj Jasani commented on PHOENIX-6271:
---

+1 for 5.1 backport (4.16 too if required).

> Effective DDL generated by SchemaExtractionTool should maintain the order of 
> PK and other columns
> -
>
> Key: PHOENIX-6271
> URL: https://issues.apache.org/jira/browse/PHOENIX-6271
> Project: Phoenix
>  Issue Type: Improvement
>Reporter: Swaroopa Kadam
>Assignee: Swaroopa Kadam
>Priority: Minor
> Fix For: 4.17.0, 5.2.0
>
>
> SchemaExtractionTool is used to generate the effective DDL, which can then be 
> compared with the DDL on the cluster to perform schema monitoring. 
> This won't affect the monitoring part, but it would be good to have the PK order 
> in place so that the effective DDL can be used for creating the entity for the 
> first time in a new environment.
>  
>  
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (PHOENIX-6478) The build of phoenix-connectors fails due to missing dependency of pentaho-aggdesigner-algorithm 5.1.5-jhyde

2021-05-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/PHOENIX-6478?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17352355#comment-17352355
 ] 

ASF GitHub Bot commented on PHOENIX-6478:
-

stoty commented on pull request #53:
URL: https://github.com/apache/phoenix-connectors/pull/53#issuecomment-849493434


   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | +0 :ok: |  reexec  |   0m 33s |  Docker mode activated.  |
   ||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  No case conflicting files 
found.  |
   | +1 :green_heart: |  @author  |   0m  0s |  The patch does not contain any 
@author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
   ||| _ master Compile Tests _ |
   | -1 :x: |  mvninstall  |  31m 47s |  root in master failed.  |
   | -1 :x: |  compile  |   0m 12s |  phoenix4-hive in master failed.  |
   | -1 :x: |  javadoc  |   0m 12s |  phoenix4-hive in master failed.  |
   ||| _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  24m  6s |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 39s |  the patch passed  |
   | +1 :green_heart: |  javac  |   0m 39s |  the patch passed  |
   | +1 :green_heart: |  whitespace  |   0m  0s |  The patch has no whitespace 
issues.  |
   | +1 :green_heart: |  xml  |   0m  2s |  The patch has no ill-formed XML 
file.  |
   | -1 :x: |  javadoc  |   0m 19s |  phoenix-hive-base_phoenix4-hive generated 
3 new + 0 unchanged - 0 fixed = 3 total (was 0)  |
   ||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |  36m 21s |  phoenix4-hive in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   0m  9s |  The patch does not generate 
ASF License warnings.  |
   |  |   |  94m 44s |   |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-Connectors-PreCommit-GitHub-PR/job/PR-53/1/artifact/yetus-general-check/output/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/phoenix-connectors/pull/53 |
   | Optional Tests | dupname asflicense javac javadoc unit xml compile |
   | uname | Linux 95c1bdde3f62 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 
23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev/phoenix-connectors-personality.sh |
   | git revision | master / eaf5adf |
   | Default Java | Private Build-1.8.0_242-8u242-b08-0ubuntu3~16.04-b08 |
   | mvninstall | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-Connectors-PreCommit-GitHub-PR/job/PR-53/1/artifact/yetus-general-check/output/branch-mvninstall-root.txt
 |
   | compile | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-Connectors-PreCommit-GitHub-PR/job/PR-53/1/artifact/yetus-general-check/output/branch-compile-phoenix-hive-base_phoenix4-hive.txt
 |
   | javadoc | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-Connectors-PreCommit-GitHub-PR/job/PR-53/1/artifact/yetus-general-check/output/branch-javadoc-phoenix-hive-base_phoenix4-hive.txt
 |
   | javadoc | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-Connectors-PreCommit-GitHub-PR/job/PR-53/1/artifact/yetus-general-check/output/diff-javadoc-javadoc-phoenix-hive-base_phoenix4-hive.txt
 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-Connectors-PreCommit-GitHub-PR/job/PR-53/1/testReport/
 |
   | Max. process+thread count | 1282 (vs. ulimit of 3) |
   | modules | C: phoenix-hive-base/phoenix4-hive U: 
phoenix-hive-base/phoenix4-hive |
   | Console output | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-Connectors-PreCommit-GitHub-PR/job/PR-53/1/console
 |
   | versions | git=2.7.4 maven=3.3.9 |
   | Powered by | Apache Yetus 0.12.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> The build of phoenix-connectors fails due to missing dependency of  
> pentaho-aggdesigner-algorithm 5.1.5-jhyde
> -
>
> Key: PHOENIX-6478
> URL: https://issues.apache.org/jira/browse/PHOENIX-6478
> Project: Phoenix
>  Issue Type: Bug
>  Components: hive-connector
>Reporter: Martin Tzvetanov Grigorov
>Priority: Major
>
> The build of phoenix4-hive fails because clojars.org is no longer an active 
> Maven repository and org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde 
> could not be found in Maven Central.

[GitHub] [phoenix-connectors] stoty commented on pull request #53: PHOENIX-6478 Add Pentaho Nexus repo for pentaho-aggdesigner-algorithm

2021-05-27 Thread GitBox


stoty commented on pull request #53:
URL: https://github.com/apache/phoenix-connectors/pull/53#issuecomment-849493434


   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |::|--:|:|:|
   | +0 :ok: |  reexec  |   0m 33s |  Docker mode activated.  |
   ||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  No case conflicting files 
found.  |
   | +1 :green_heart: |  @author  |   0m  0s |  The patch does not contain any 
@author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  The patch doesn't appear to include 
any new or modified tests. Please justify why no new tests are needed for this 
patch. Also please list what manual steps were performed to verify this patch.  
|
   ||| _ master Compile Tests _ |
   | -1 :x: |  mvninstall  |  31m 47s |  root in master failed.  |
   | -1 :x: |  compile  |   0m 12s |  phoenix4-hive in master failed.  |
   | -1 :x: |  javadoc  |   0m 12s |  phoenix4-hive in master failed.  |
   ||| _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  24m  6s |  the patch passed  |
   | +1 :green_heart: |  compile  |   0m 39s |  the patch passed  |
   | +1 :green_heart: |  javac  |   0m 39s |  the patch passed  |
   | +1 :green_heart: |  whitespace  |   0m  0s |  The patch has no whitespace 
issues.  |
   | +1 :green_heart: |  xml  |   0m  2s |  The patch has no ill-formed XML 
file.  |
   | -1 :x: |  javadoc  |   0m 19s |  phoenix-hive-base_phoenix4-hive generated 
3 new + 0 unchanged - 0 fixed = 3 total (was 0)  |
   ||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |  36m 21s |  phoenix4-hive in the patch passed. 
 |
   | +1 :green_heart: |  asflicense  |   0m  9s |  The patch does not generate 
ASF License warnings.  |
   |  |   |  94m 44s |   |
   
   
   | Subsystem | Report/Notes |
   |--:|:-|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-Connectors-PreCommit-GitHub-PR/job/PR-53/1/artifact/yetus-general-check/output/Dockerfile
 |
   | GITHUB PR | https://github.com/apache/phoenix-connectors/pull/53 |
   | Optional Tests | dupname asflicense javac javadoc unit xml compile |
   | uname | Linux 95c1bdde3f62 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 
23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev/phoenix-connectors-personality.sh |
   | git revision | master / eaf5adf |
   | Default Java | Private Build-1.8.0_242-8u242-b08-0ubuntu3~16.04-b08 |
   | mvninstall | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-Connectors-PreCommit-GitHub-PR/job/PR-53/1/artifact/yetus-general-check/output/branch-mvninstall-root.txt
 |
   | compile | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-Connectors-PreCommit-GitHub-PR/job/PR-53/1/artifact/yetus-general-check/output/branch-compile-phoenix-hive-base_phoenix4-hive.txt
 |
   | javadoc | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-Connectors-PreCommit-GitHub-PR/job/PR-53/1/artifact/yetus-general-check/output/branch-javadoc-phoenix-hive-base_phoenix4-hive.txt
 |
   | javadoc | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-Connectors-PreCommit-GitHub-PR/job/PR-53/1/artifact/yetus-general-check/output/diff-javadoc-javadoc-phoenix-hive-base_phoenix4-hive.txt
 |
   |  Test Results | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-Connectors-PreCommit-GitHub-PR/job/PR-53/1/testReport/
 |
   | Max. process+thread count | 1282 (vs. ulimit of 3) |
   | modules | C: phoenix-hive-base/phoenix4-hive U: 
phoenix-hive-base/phoenix4-hive |
   | Console output | 
https://ci-hadoop.apache.org/job/Phoenix/job/Phoenix-Connectors-PreCommit-GitHub-PR/job/PR-53/1/console
 |
   | versions | git=2.7.4 maven=3.3.9 |
   | Powered by | Apache Yetus 0.12.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (PHOENIX-6478) The build of phoenix-connectors fails due to missing dependency of pentaho-aggdesigner-algorithm 5.1.5-jhyde

2021-05-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/PHOENIX-6478?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17352325#comment-17352325
 ] 

ASF GitHub Bot commented on PHOENIX-6478:
-

martin-g opened a new pull request #53:
URL: https://github.com/apache/phoenix-connectors/pull/53


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> The build of phoenix-connectors fails due to missing dependency of  
> pentaho-aggdesigner-algorithm 5.1.5-jhyde
> -
>
> Key: PHOENIX-6478
> URL: https://issues.apache.org/jira/browse/PHOENIX-6478
> Project: Phoenix
>  Issue Type: Bug
>  Components: hive-connector
>Reporter: Martin Tzvetanov Grigorov
>Priority: Major
>
> The build of phoenix4-hive fails because clojars.org is no longer an active 
> Maven repository and org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde 
> could not be found in Maven Central.
>  
> {code:java}
> [INFO] 
> [INFO] BUILD FAILURE
> [INFO] 
> 
> [INFO] Total time:  6.181 s
> [INFO] Finished at: 2021-05-27T08:54:09+01:00
> [INFO] 
> 
> [ERROR] Failed to execute goal on project phoenix4-hive: Could not resolve 
> dependencies for project org.apache.phoenix:phoenix4-hive:jar:6.0.0-SNAPSHOT: 
> Failure to find org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde in 
> https://repository.apache.org/content/repositories/releases/ was cached in 
> the local repository, resolution will not be reattempted until the update 
> interval of apache release has elapsed or updates are forced -> [Help 1]
>  {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [phoenix-connectors] martin-g opened a new pull request #53: PHOENIX-6478 Add Pentaho Nexus repo for pentaho-aggdesigner-algorithm

2021-05-27 Thread GitBox


martin-g opened a new pull request #53:
URL: https://github.com/apache/phoenix-connectors/pull/53


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (PHOENIX-6477) Build failure on Linux ARM64

2021-05-27 Thread Martin Tzvetanov Grigorov (Jira)


[ 
https://issues.apache.org/jira/browse/PHOENIX-6477?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17352295#comment-17352295
 ] 

Martin Tzvetanov Grigorov commented on PHOENIX-6477:


https://issues.apache.org/jira/browse/OMID-210

> Build failure on Linux ARM64
> 
>
> Key: PHOENIX-6477
> URL: https://issues.apache.org/jira/browse/PHOENIX-6477
> Project: Phoenix
>  Issue Type: Bug
>  Components: core
>Affects Versions: 5.1.1
>Reporter: Martin Tzvetanov Grigorov
>Assignee: Martin Tzvetanov Grigorov
>Priority: Major
>
> The build fails on the Linux ARM64 architecture because Protobuf-Java 2.5.0 does 
> not provide a protoc binary for aarch64:
>  
> {code:java}
> [ERROR] Failed to execute goal 
> org.xolstice.maven.plugins:protobuf-maven-plugin:0.6.1:compile 
> (compile-protoc) on project phoenix-core: Unable to resolve artifact: Missing:
> [ERROR] --
> [ERROR] 1) com.google.protobuf:protoc:exe:linux-aarch_64:2.5.0
> [ERROR] 
> [ERROR]   Try downloading the file manually from the project website.
> [ERROR] 
> [ERROR]   Then, install it using the command: 
> [ERROR]   mvn install:install-file -DgroupId=com.google.protobuf 
> -DartifactId=protoc -Dversion=2.5.0 -Dclassifier=linux-aarch_64 
> -Dpackaging=exe -Dfile=/path/to/file
> [ERROR] 
> [ERROR]   Alternatively, if you host your own repository you can deploy the 
> file there: 
> [ERROR]   mvn deploy:deploy-file -DgroupId=com.google.protobuf 
> -DartifactId=protoc -Dversion=2.5.0 -Dclassifier=linux-aarch_64 
> -Dpackaging=exe -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]
> [ERROR] 
> [ERROR]   Path to dependency: 
> [ERROR]   1) org.apache.phoenix:phoenix-core:jar:5.2.0-SNAPSHOT
> [ERROR]   2) com.google.protobuf:protoc:exe:linux-aarch_64:2.5.0
> [ERROR] 
> [ERROR] --
> [ERROR] 1 required artifact is missing.
> [ERROR] 
> [ERROR] for artifact: 
> [ERROR]   org.apache.phoenix:phoenix-core:jar:5.2.0-SNAPSHOT
> [ERROR] 
> [ERROR] from the specified remote repositories:
> [ERROR]   apache release 
> (https://repository.apache.org/content/repositories/releases/, releases=true, 
> snapshots=true),
> [ERROR]   apache.snapshots (https://repository.apache.org/snapshots, 
> releases=false, snapshots=true),
> [ERROR]   central (https://repo.maven.apache.org/maven2, releases=true, 
> snapshots=false)
> {code}
>  
> As discussed at 
> [https://lists.apache.org/thread.html/ra5405789376bdb9e16ffa014f1d0a098af34d4946e41ca09efc95a84%40%3Cdev.phoenix.apache.org%3E]
>  updating to Protobuf 3.5+ is not an option at the moment.
> To preserve backward compatibility Phoenix should continue using Protobuf 2.x.
> The only working solution I was able to find is to use 
> [https://github.com/os72/protoc-jar:2.6.1-build3] on Linux ARM64
>  
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)