[ https://issues.apache.org/jira/browse/HDDS-2166?focusedWorklogId=318192&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-318192 ]

ASF GitHub Bot logged work on HDDS-2166:
----------------------------------------

                Author: ASF GitHub Bot
            Created on: 25/Sep/19 09:54
            Start Date: 25/Sep/19 09:54
    Worklog Time Spent: 10m 
      Work Description: hadoop-yetus commented on issue #1505: HDDS-2166. Some 
RPC metrics are missing from SCM prometheus endpoint
URL: https://github.com/apache/hadoop/pull/1505#issuecomment-534946357
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 39 | Docker mode activated. |
   ||| _ Prechecks _ |
   | +1 | dupname | 0 | No case conflicting files found. |
   | +1 | @author | 0 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test files. |
   ||| _ trunk Compile Tests _ |
   | -1 | mvninstall | 38 | hadoop-hdds in trunk failed. |
   | -1 | mvninstall | 27 | hadoop-ozone in trunk failed. |
   | -1 | compile | 21 | hadoop-hdds in trunk failed. |
   | -1 | compile | 12 | hadoop-ozone in trunk failed. |
   | +1 | checkstyle | 62 | trunk passed |
   | +1 | mvnsite | 0 | trunk passed |
   | +1 | shadedclient | 916 | branch has no errors when building and testing our client artifacts. |
   | -1 | javadoc | 17 | hadoop-hdds in trunk failed. |
   | -1 | javadoc | 15 | hadoop-ozone in trunk failed. |
   | 0 | spotbugs | 997 | Used deprecated FindBugs config; considering switching to SpotBugs. |
   | -1 | findbugs | 28 | hadoop-hdds in trunk failed. |
   | -1 | findbugs | 16 | hadoop-ozone in trunk failed. |
   ||| _ Patch Compile Tests _ |
   | -1 | mvninstall | 32 | hadoop-hdds in the patch failed. |
   | -1 | mvninstall | 27 | hadoop-ozone in the patch failed. |
   | -1 | compile | 22 | hadoop-hdds in the patch failed. |
   | -1 | compile | 15 | hadoop-ozone in the patch failed. |
   | -1 | javac | 22 | hadoop-hdds in the patch failed. |
   | -1 | javac | 15 | hadoop-ozone in the patch failed. |
   | +1 | checkstyle | 54 | the patch passed |
   | +1 | mvnsite | 0 | the patch passed |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | shadedclient | 772 | patch has no errors when building and testing our client artifacts. |
   | -1 | javadoc | 17 | hadoop-hdds in the patch failed. |
   | -1 | javadoc | 16 | hadoop-ozone in the patch failed. |
   | -1 | findbugs | 28 | hadoop-hdds in the patch failed. |
   | -1 | findbugs | 15 | hadoop-ozone in the patch failed. |
   ||| _ Other Tests _ |
   | -1 | unit | 26 | hadoop-hdds in the patch failed. |
   | -1 | unit | 20 | hadoop-ozone in the patch failed. |
   | +1 | asflicense | 30 | The patch does not generate ASF License warnings. |
   | | | 2376 | |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=19.03.1 Server=19.03.1 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/1505 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux 71aff0ab79f9 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / c724577 |
   | Default Java | 1.8.0_222 |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/branch-mvninstall-hadoop-hdds.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/branch-mvninstall-hadoop-ozone.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/branch-compile-hadoop-hdds.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/branch-compile-hadoop-ozone.txt |
   | javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/branch-javadoc-hadoop-hdds.txt |
   | javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/branch-javadoc-hadoop-ozone.txt |
   | findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/branch-findbugs-hadoop-hdds.txt |
   | findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/branch-findbugs-hadoop-ozone.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/patch-mvninstall-hadoop-hdds.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/patch-mvninstall-hadoop-ozone.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/patch-compile-hadoop-hdds.txt |
   | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/patch-compile-hadoop-ozone.txt |
   | javac | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/patch-compile-hadoop-hdds.txt |
   | javac | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/patch-compile-hadoop-ozone.txt |
   | javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/patch-javadoc-hadoop-hdds.txt |
   | javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/patch-javadoc-hadoop-ozone.txt |
   | findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/patch-findbugs-hadoop-hdds.txt |
   | findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/patch-findbugs-hadoop-ozone.txt |
   | unit | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/patch-unit-hadoop-hdds.txt |
   | unit | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/artifact/out/patch-unit-hadoop-ozone.txt |
   |  Test Results | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/testReport/ |
   | Max. process+thread count | 306 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdds/framework U: hadoop-hdds/framework |
   | Console output | https://builds.apache.org/job/hadoop-multibranch/job/PR-1505/3/console |
   | versions | git=2.7.4 maven=3.3.9 |
   | Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
-------------------

    Worklog Id:     (was: 318192)
    Time Spent: 50m  (was: 40m)

> Some RPC metrics are missing from SCM prometheus endpoint
> ---------------------------------------------------------
>
>                 Key: HDDS-2166
>                 URL: https://issues.apache.org/jira/browse/HDDS-2166
>             Project: Hadoop Distributed Data Store
>          Issue Type: Bug
>            Reporter: Elek, Marton
>            Assignee: Elek, Marton
>            Priority: Major
>              Labels: pull-request-available
>          Time Spent: 50m
>  Remaining Estimate: 0h
>
> In Hadoop metrics it's possible to register multiple metrics with the same
> name but with different tags. For example, each RpcServer has its own metrics
> instance in SCM.
> {code}
>     "name" : "Hadoop:service=StorageContainerManager,name=RpcActivityForPort9860",
>     "name" : "Hadoop:service=StorageContainerManager,name=RpcActivityForPort9863",
> {code}
> They are converted by PrometheusSink to a Prometheus metric line with the proper
> name and tags. For example:
> {code}
> rpc_rpc_queue_time60s_num_ops{port="9860",servername="StorageContainerLocationProtocolService",context="rpc",hostname="72736061cbc5"} 0
> {code}
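> The exact conversion is internal to PrometheusSink; the following is only a minimal sketch
> (hypothetical helper, not the real implementation) of how a camelCase metric name such as
> RpcQueueTime60sNumOps can become the snake_case Prometheus name shown above, with the
> metric record's tags emitted as labels:
> {code}
> // Hypothetical sketch, not the actual PrometheusSink code: split camelCase
> // boundaries with underscores and lower-case the result.
> public class PrometheusNameSketch {
> 
>   static String prometheusName(String recordName, String metricName) {
>     String combined = recordName + metricName;        // "Rpc" + "RpcQueueTime60sNumOps"
>     return combined
>         .replaceAll("([a-z0-9])([A-Z])", "$1_$2")     // RpcRpcQueue... -> Rpc_Rpc_Queue_...
>         .toLowerCase();                               // -> rpc_rpc_queue_time60s_num_ops
>   }
> 
>   public static void main(String[] args) {
>     System.out.println(prometheusName("Rpc", "RpcQueueTime60sNumOps"));
>   }
> }
> {code}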
> The PrometheusSink uses a Map to cache all the recent values, but
> unfortunately the key contains only the name (rpc_rpc_queue_time60s_num_ops
> in our example) and not the tags (port=...).
> For this reason, if there are multiple metrics with the same name, only the
> first one is displayed.
> As a result, in SCM only the metrics of the first RPC server are exported
> to the Prometheus endpoint.
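> The effect can be illustrated with a small stand-alone example (hypothetical names, not the
> real PrometheusSink cache): when the cache key is the metric name alone, the entry for port
> 9863 overwrites the entry for port 9860, while a key built from name plus tags keeps both lines.
> {code}
> // Hypothetical illustration of the caching problem, not the real sink code.
> import java.util.LinkedHashMap;
> import java.util.Map;
> 
> public class MetricCacheSketch {
>   public static void main(String[] args) {
>     String name = "rpc_rpc_queue_time60s_num_ops";
>     String tags9860 = "{port=\"9860\",context=\"rpc\"}";   // labels shortened for brevity
>     String tags9863 = "{port=\"9863\",context=\"rpc\"}";
> 
>     // Key = name only: the second put() replaces the first,
>     // so only one RPC server's metric survives in the export.
>     Map<String, String> byNameOnly = new LinkedHashMap<>();
>     byNameOnly.put(name, name + tags9860 + " 0");
>     byNameOnly.put(name, name + tags9863 + " 0");
> 
>     // Key = name + tags: both metric lines are kept.
>     Map<String, String> byNameAndTags = new LinkedHashMap<>();
>     byNameAndTags.put(name + tags9860, name + tags9860 + " 0");
>     byNameAndTags.put(name + tags9863, name + tags9863 + " 0");
> 
>     System.out.println("name-only entries:     " + byNameOnly.size());      // prints 1
>     System.out.println("name-and-tags entries: " + byNameAndTags.size());   // prints 2
>   }
> }
> {code}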



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: hdfs-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: hdfs-issues-h...@hadoop.apache.org
