[
https://issues.apache.org/jira/browse/HDDS-2071?focusedWorklogId=322704&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-322704
]
ASF GitHub Bot logged work on HDDS-2071:
----------------------------------------
Author: ASF GitHub Bot
Created on: 03/Oct/19 15:59
Start Date: 03/Oct/19 15:59
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on issue #1583: HDDS-2071. Support filters in ozone insight point
URL: https://github.com/apache/hadoop/pull/1583#issuecomment-538009113
:broken_heart: **-1 overall**
| Vote | Subsystem | Runtime | Comment |
|:----:|----------:|--------:|:--------|
| 0 | reexec | 464 | Docker mode activated. |
||| _ Prechecks _ |
| +1 | dupname | 0 | No case conflicting files found. |
| +1 | @author | 0 | The patch does not contain any @author tags. |
| +1 | test4tests | 0 | The patch appears to include 1 new or modified test files. |
||| _ trunk Compile Tests _ |
| 0 | mvndep | 70 | Maven dependency ordering for branch |
| -1 | mvninstall | 44 | hadoop-hdds in trunk failed. |
| -1 | mvninstall | 43 | hadoop-ozone in trunk failed. |
| -1 | compile | 18 | hadoop-hdds in trunk failed. |
| -1 | compile | 14 | hadoop-ozone in trunk failed. |
| +1 | checkstyle | 60 | trunk passed |
| +1 | mvnsite | 0 | trunk passed |
| +1 | shadedclient | 938 | branch has no errors when building and testing our client artifacts. |
| -1 | javadoc | 20 | hadoop-hdds in trunk failed. |
| -1 | javadoc | 16 | hadoop-ozone in trunk failed. |
| 0 | spotbugs | 1029 | Used deprecated FindBugs config; consider switching to SpotBugs. |
| -1 | findbugs | 33 | hadoop-hdds in trunk failed. |
| -1 | findbugs | 17 | hadoop-ozone in trunk failed. |
||| _ Patch Compile Tests _ |
| 0 | mvndep | 25 | Maven dependency ordering for patch |
| -1 | mvninstall | 31 | hadoop-hdds in the patch failed. |
| -1 | mvninstall | 35 | hadoop-ozone in the patch failed. |
| -1 | compile | 21 | hadoop-hdds in the patch failed. |
| -1 | compile | 16 | hadoop-ozone in the patch failed. |
| -1 | javac | 21 | hadoop-hdds in the patch failed. |
| -1 | javac | 16 | hadoop-ozone in the patch failed. |
| +1 | checkstyle | 52 | the patch passed |
| +1 | mvnsite | 0 | the patch passed |
| +1 | whitespace | 0 | The patch has no whitespace issues. |
| +1 | shadedclient | 797 | patch has no errors when building and testing our client artifacts. |
| -1 | javadoc | 20 | hadoop-hdds in the patch failed. |
| -1 | javadoc | 16 | hadoop-ozone in the patch failed. |
| -1 | findbugs | 28 | hadoop-hdds in the patch failed. |
| -1 | findbugs | 17 | hadoop-ozone in the patch failed. |
||| _ Other Tests _ |
| -1 | unit | 25 | hadoop-hdds in the patch failed. |
| -1 | unit | 22 | hadoop-ozone in the patch failed. |
| +1 | asflicense | 29 | The patch does not generate ASF License warnings. |
| | | 2991 | |
| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | Client=18.09.7 Server=18.09.7 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/1583 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux df91587a47f7 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | personality/hadoop.sh |
| git revision | trunk / 5a7483c |
| Default Java | 1.8.0_222 |
| mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/branch-mvninstall-hadoop-hdds.txt |
| mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/branch-mvninstall-hadoop-ozone.txt |
| compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/branch-compile-hadoop-hdds.txt |
| compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/branch-compile-hadoop-ozone.txt |
| javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/branch-javadoc-hadoop-hdds.txt |
| javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/branch-javadoc-hadoop-ozone.txt |
| findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/branch-findbugs-hadoop-hdds.txt |
| findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/branch-findbugs-hadoop-ozone.txt |
| mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/patch-mvninstall-hadoop-hdds.txt |
| mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/patch-mvninstall-hadoop-ozone.txt |
| compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/patch-compile-hadoop-hdds.txt |
| compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/patch-compile-hadoop-ozone.txt |
| javac | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/patch-compile-hadoop-hdds.txt |
| javac | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/patch-compile-hadoop-ozone.txt |
| javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/patch-javadoc-hadoop-hdds.txt |
| javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/patch-javadoc-hadoop-ozone.txt |
| findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/patch-findbugs-hadoop-hdds.txt |
| findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/patch-findbugs-hadoop-ozone.txt |
| unit | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/patch-unit-hadoop-hdds.txt |
| unit | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/artifact/out/patch-unit-hadoop-ozone.txt |
| Test Results | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/testReport/ |
| Max. process+thread count | 307 (vs. ulimit of 5500) |
| modules | C: hadoop-hdds/framework hadoop-hdds/server-scm hadoop-ozone/insight U: . |
| Console output | https://builds.apache.org/job/hadoop-multibranch/job/PR-1583/1/console |
| versions | git=2.7.4 maven=3.3.9 |
| Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
This message was automatically generated.
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 322704)
Time Spent: 0.5h (was: 20m)
> Support filters in ozone insight point
> --------------------------------------
>
> Key: HDDS-2071
> URL: https://issues.apache.org/jira/browse/HDDS-2071
> Project: Hadoop Distributed Data Store
> Issue Type: Sub-task
> Reporter: Marton Elek
> Assignee: Marton Elek
> Priority: Major
> Labels: pull-request-available
> Time Spent: 0.5h
> Remaining Estimate: 0h
>
> With Ozone insight we can print out all the logs / metrics of one specific
> component (e.g. scm.node-manager).
> It would be great to support additional filtering capabilities where the
> output is filtered based on specific keys.
> For example, to print out all of the logs related to one datanode or to
> one type of RPC request.
> The filter should be a key-value map (e.g. --filter
> datanode=sjdhfhf,rpc=createChunk) which can be defined on the ozone insight
> CLI.
> As we have no option to add additional tags to the logs (it may be supported
> by log4j2 but not by slf4j), the first implementation can use simple
> pattern matching.
> For example, SCMNodeManager.processNodeReport contains trace/debug logs
> which include a " [datanode={}]" part. This formatting convention can be
> used to print out only the related information.
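The pattern-matching idea described above can be sketched roughly as follows. This is a hypothetical illustration only, not the actual Ozone insight code: the class name, method, and the assumption that filterable values appear as "[key=value]" tags in the log line all come from the description, not from the real implementation.

```java
import java.util.Map;

// Sketch of key/value log filtering by pattern matching, following the
// "[key=value]" formatting convention mentioned in the issue description.
// All names here are hypothetical, not the real Ozone insight API.
public class LogFilterSketch {

  // Returns true if the log line contains every "[key=value]" tag from the
  // filter map (e.g. the map parsed from --filter datanode=...,rpc=...).
  static boolean matches(String logLine, Map<String, String> filters) {
    for (Map.Entry<String, String> e : filters.entrySet()) {
      String tag = "[" + e.getKey() + "=" + e.getValue() + "]";
      if (!logLine.contains(tag)) {
        return false;
      }
    }
    return true;
  }

  public static void main(String[] args) {
    Map<String, String> filters =
        Map.of("datanode", "dn-1", "rpc", "createChunk");
    String line1 = "Processing report [datanode=dn-1] [rpc=createChunk]";
    String line2 = "Processing report [datanode=dn-2] [rpc=createChunk]";
    System.out.println(matches(line1, filters)); // true
    System.out.println(matches(line2, filters)); // false
  }
}
```

A line is kept only when every filter tag is present, so multiple filters combine as a logical AND, which matches the comma-separated --filter syntax proposed above.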
--
This message was sent by Atlassian Jira
(v8.3.4#803005)