[
https://issues.apache.org/jira/browse/FLINK-7935?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16310189#comment-16310189
]
Elias Levy commented on FLINK-7935:
-----------------------------------
It appears the metrics documentation has not been updated in the 1.5 snapshot,
but if I understand the changes correctly then, for instance, if I have a job
that processes a bounded but not predefined set of message types and I want to
publish a per-type metric counting the number of processed messages of each
type, I could do:
{code}
getRuntimeContext()
  .getMetricGroup()
  .addGroup("messages")
  .addGroup("type", messageType)
  .counter("count");
{code}
The DataDog reporter would then report a metric named {{messages.count}} with a
{{type:messageType}} tag. If that is correct, then it may be sufficient.
Does FLINK-7692 work as I described?
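For context, this is roughly how I would register such per-type counters in a
job. It is only a sketch: it assumes the key/value {{addGroup}} from FLINK-7692
is available, and the {{MessageEvent}} type with its {{getType()}} accessor is
made up for illustration.
{code}
import java.util.HashMap;
import java.util.Map;

import org.apache.flink.api.common.functions.RichFlatMapFunction;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.metrics.Counter;
import org.apache.flink.util.Collector;

// Sketch: MessageEvent is a hypothetical event type used only for illustration.
public class PerTypeMessageCounter extends RichFlatMapFunction<MessageEvent, MessageEvent> {

    // One counter per observed message type, registered lazily the first time
    // a type is seen by this subtask.
    private transient Map<String, Counter> countersByType;

    @Override
    public void open(Configuration parameters) {
        countersByType = new HashMap<>();
    }

    @Override
    public void flatMap(MessageEvent msg, Collector<MessageEvent> out) {
        countersByType
            .computeIfAbsent(msg.getType(), type ->
                getRuntimeContext()
                    .getMetricGroup()
                    .addGroup("messages")
                    .addGroup("type", type)   // key/value group, reported as a tag
                    .counter("count"))
            .inc();
        out.collect(msg);
    }
}
{code}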
> Metrics with user supplied scope variables
> ------------------------------------------
>
> Key: FLINK-7935
> URL: https://issues.apache.org/jira/browse/FLINK-7935
> Project: Flink
> Issue Type: Improvement
> Components: Metrics
> Affects Versions: 1.3.2
> Reporter: Elias Levy
>
> We use DataDog for metrics. DD and Flink differ somewhat in how they track
> metrics.
> Flink names and scopes metrics together, at least by default. For example, the
> default system scope for operator metrics is
> {{<host>.taskmanager.<tm_id>.<job_name>.<operator_name>.<subtask_index>}};
> the scope variables become part of the metric's full name.
> In DD the metric would instead be named something generic, e.g.
> {{taskmanager.job.operator}}, and individual series would be distinguished by
> their tag values, e.g. {{tm_id=foo}}, {{job_name=bar}}, {{operator_name=baz}}.
> Flink allows you to configure the format string for system scopes, so it is
> possible to set the operator scope format to {{taskmanager.job.operator}}.
> We do this for all scopes:
> {code}
> metrics.scope.jm: jobmanager
> metrics.scope.jm.job: jobmanager.job
> metrics.scope.tm: taskmanager
> metrics.scope.tm.job: taskmanager.job
> metrics.scope.task: taskmanager.job.task
> metrics.scope.operator: taskmanager.job.operator
> {code}
> This seems to work. The DataDog Flink metrics plugin submits all scope
> variables as tags, even if they are not used within the scope format, and
> internally this does not appear to lead to metrics conflicting with each other.
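> For reference, the reporter configuration assumed in these examples is roughly
> the following, using the {{DatadogHttpReporter}} from Flink's
> {{flink-metrics-datadog}} module (the API key is a placeholder; adjust for
> whichever DataDog plugin is actually in use):
> {code}
> metrics.reporters: dghttp
> metrics.reporter.dghttp.class: org.apache.flink.metrics.datadog.DatadogHttpReporter
> metrics.reporter.dghttp.apikey: <api-key>
> {code}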
> We would like to extend this to user defined metrics, so that variables/scopes
> can also be defined when adding a metric group or metric through the user API.
> In DD we would then have a single metric with a tag that takes many different
> values, rather than hundreds of distinct metrics, for the one value we want to
> measure across different event types.