Use conditions within the template. See the Go documentation for the basics
of the template language:

https://golang.org/pkg/text/template/
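
For example, to switch the subject prefix depending on whether the
notification is for firing or resolved alerts, something along these lines
(a sketch re-using the "[CUK]" prefix and label names from your config):

  subject: '{{ if eq .Status "firing" }}[CUK]{{ else }}[CUK][RESOLVED]{{ end }}{{ .CommonLabels.severity }} {{ .CommonLabels.instance }} {{ .CommonLabels.alertname }} | {{ .CommonAnnotations.description }}'

.Status is "firing" while at least one alert in the group is still firing,
and "resolved" once all of them have resolved.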

/MR

On Fri, May 14, 2021, 10:36 Sebastian Glock <[email protected]> wrote:

> OK, I see the problem, but how can I separate the email subjects for the
> alert and the alert-resolved notification? Can I somehow put two email
> templates, or two subject templates, into one receiver? How will
> Alertmanager know when to use the first template and when to use the
> second?
>
> On Fri, 14 May 2021 at 08:45, Sebastian Glock <[email protected]>
> wrote:
>
>> OK, so this one should look like this:
>>
>>   - alert: CPU load is more than 80%!
>>     expr: 100 - (1 - avg(irate(windows_cpu_time_total{mode="user"}[10m])) by (instance)) * 100 >= 40
>>     for: 10s
>>     labels:
>>       severity: "[High]"
>>     annotations:
>>       summary: "CPU load is more than 80%!"
>>       description: "{{ humanize $value }}%"
>>
>>
>> and alertmanager:
>>
>>   - receiver: alert-emailer-cpu-1h-resolved
>>     repeat_interval: 1m
>>     match:
>>       severity: "[High-CPU]"
>>
>>
>> - name: alert-emailer-cpu-1h-resolved
>>   email_configs:
>>   - to: '[email protected]'
>>     send_resolved: true
>>     from: ****
>>     smarthost: ****
>>     auth_username: '****'
>>     auth_password: ****
>>     auth_secret: ****
>>     auth_identity: ***
>>     html: '{{ template "email" .}}'
>>     headers:
>>       subject: "[CUK][RESOLVED]{{ .CommonLabels.severity }} {{ .CommonLabels.instance }} {{ .CommonLabels.alertname }} | {{ .CommonAnnotations.description }}"
>>
>> Am I right, or is something wrong? What if I want to configure send_resolved
>> for all alerts? I have many receivers matched on severity, and I only want
>> the resolved mail to have "[RESOLVED]" in the subject.
>>
>> I think there's something wrong because it's not working :(
>>
>> On Wed, 31 Mar 2021 at 22:31, Matthias Rampke <[email protected]>
>> wrote:
>>
>>> To Prometheus, "resolved" is not a separate first-class event.
>>>
>>> As long as an alert is "firing" in Prometheus, it sends a constant
>>> stream of "still firing" messages to Alertmanager. If there are multiple
>>> alerts firing, or multiple label combinations for one alert, there are
>>> more notifications. Alertmanager takes these streams of "still bad",
>>> deduplicates them, and groups them.
>>>
>>> This leads to an alert group with one or more alerts in it. At
>>> configurable intervals, Alertmanager will send a notification to the
>>> configured receiver(s). This is when the templates are evaluated. While a
>>> group exists, individual alerts can be added and removed; this often
>>> triggers additional notifications. Finally, when the last alert in a group
>>> is resolved, another notification is fired.
>>>
>>> It is important here that this notification uses the same receiver, and
>>> the same templates, as all the previous ones! The choice of Slack color is
>>> made based only on the state of each notification; you can see how it's
>>> done in the default value of the "color" setting:
>>> https://prometheus.io/docs/alerting/latest/configuration/#slack_config
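>>>
>>> For reference, at the time of writing the default value of that "color"
>>> setting is a template along these lines:
>>>
>>>   color: '{{ if eq .Status "firing" }}danger{{ else }}good{{ end }}'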
>>>
>>> You need only one receiver and set send_resolved: true on it. Then you
>>> can use templating to make these emails have different subjects depending
>>> on the contents of each notification.
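>>>
>>> A minimal sketch of such a receiver (the SMTP fields are elided as in your
>>> config, and the subject condition is only illustrative):
>>>
>>> - name: alert-emailer-cpu-1h
>>>   email_configs:
>>>   - to: '[email protected]'
>>>     send_resolved: true
>>>     html: '{{ template "email" . }}'
>>>     headers:
>>>       subject: '[CUK]{{ if eq .Status "resolved" }}[RESOLVED]{{ end }}{{ .CommonLabels.severity }} {{ .CommonLabels.alertname }}'
>>>
>>> With that in place, the separate "-resolved" receivers and their routes are
>>> no longer needed.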
>>>
>>> /MR
>>>
>>> On Wed, Mar 31, 2021, 09:19 Sebastian Glock <[email protected]>
>>> wrote:
>>>
>>>> I have something like this:
>>>>
>>>> global:
>>>> route:
>>>>   receiver: alert-emailer-cpu-30m
>>>>
>>>>   group_by: ['alertname', 'priority', 'instance']
>>>>   group_wait: 10s
>>>>   group_interval: 10s
>>>>   repeat_interval: 30m
>>>> #  resolve_timeout: 10s
>>>>   routes:
>>>>
>>>>
>>>> #rebooted
>>>>
>>>>
>>>>   - receiver: alert-emailer-reboot-30m
>>>>     repeat_interval: 30m
>>>>     match:
>>>>       severity: "[Rebooted]"
>>>>
>>>>
>>>> #CPU
>>>>
>>>>   - receiver: alert-emailer-cpu-30m
>>>>     repeat_interval: 1m
>>>>     match:
>>>>       severity: "[Disaster-CPU]"
>>>>
>>>>
>>>>   - receiver: alert-emailer-cpu-1h
>>>>     repeat_interval: 1m
>>>>     match:
>>>>       severity: "[High-CPU]"
>>>>
>>>>
>>>>   - receiver: alert-emailer-cpu-3h
>>>>     repeat_interval: 1m
>>>>     match:
>>>>       severity: "[Average-CPU]"
>>>>
>>>>
>>>> #resolved
>>>>
>>>>   - receiver: alert-emailer-cpu-30m-resolved
>>>>     repeat_interval: 1m
>>>>     match:
>>>>       severity: "[Disaster-CPU]"
>>>>
>>>>
>>>>   - receiver: alert-emailer-cpu-1h-resolved
>>>>     repeat_interval: 1m
>>>>     match:
>>>>       severity: "[High-CPU]"
>>>>
>>>>
>>>>   - receiver: alert-emailer-cpu-3h-resolved
>>>>     repeat_interval: 1m
>>>>     match:
>>>>       severity: "[Average-CPU]"
>>>>
>>>>
>>>> #cpu
>>>>
>>>> - name: alert-emailer-cpu-1h
>>>>   email_configs:
>>>>   - to: '[email protected]'
>>>>     send_resolved: false
>>>>     from: '******'
>>>>     smarthost: '******'
>>>>     auth_username: '******'
>>>>     auth_password: '******'
>>>>     auth_secret: '******'
>>>>     auth_identity: '******'
>>>>     html: '{{ template "email" .}}'
>>>>     headers:
>>>>       subject: "[CUK]{{ .CommonLabels.severity }} {{ .CommonLabels.instance }} {{ .CommonLabels.alertname }} | {{ .CommonAnnotations.description }}"
>>>>
>>>> - name: alert-emailer-cpu-1h-resolved
>>>>   email_configs:
>>>>   - to: '[email protected]'
>>>>     send_resolved: true
>>>>     from: '******'
>>>>     smarthost: '******'
>>>>     auth_username: '******'
>>>>     auth_password: '******'
>>>>     auth_secret: '******'
>>>>     auth_identity: '******'
>>>>     html: '{{ template "email" .}}'
>>>>     headers:
>>>>       subject: "[CUK][RESOLVED]{{ .CommonLabels.severity }} {{ .CommonLabels.instance }} {{ .CommonLabels.alertname }} | {{ .CommonAnnotations.description }}"
>>>>
>>>> - name: alert-emailer-cpu-30m
>>>>   email_configs:
>>>>   - to: '[email protected]'
>>>>     send_resolved: false
>>>>     from: '******'
>>>>     smarthost: '******'
>>>>     auth_username: '******'
>>>>     auth_password: '******'
>>>>     auth_secret: '******'
>>>>     auth_identity: '******'
>>>>     html: '{{ template "email" .}}'
>>>>     headers:
>>>>       subject: "[CUK]{{ .CommonLabels.severity }} {{ .CommonLabels.instance }} {{ .CommonLabels.alertname }} | {{ .CommonAnnotations.description }}"
>>>>
>>>> - name: alert-emailer-cpu-30m-resolved
>>>>   email_configs:
>>>>   - to: '[email protected]'
>>>>     send_resolved: true
>>>>     from: '******'
>>>>     smarthost: '******'
>>>>     auth_username: '******'
>>>>     auth_password: '******'
>>>>     auth_secret: '******'
>>>>     auth_identity: '******'
>>>>     html: '{{ template "email" .}}'
>>>>     headers:
>>>>       subject: "[CUK][RESOLVED]{{ .CommonLabels.severity }} {{ .CommonLabels.instance }} {{ .CommonLabels.alertname }} | {{ .CommonAnnotations.description }}"
>>>>
>>>>
>>>> @Matthias Rampke <[email protected]> so you think I should add the
>>>> .Alerts.Resolved check, {{ if gt (len .Alerts.Firing) 0 }}, to the alert?
>>>> Where does Prometheus hold the information that an alert is resolved? I
>>>> thought Slack would show that an alert has been resolved, but in Prometheus
>>>> the alert just turns green and shows nothing.
>>>>
>>>> On Tue, 30 Mar 2021 at 12:24, Matthias Rampke <[email protected]>
>>>> wrote:
>>>>
>>>>> I *think* you can have one email config, and toggle whether to show
>>>>> [RESOLVED] as part of the template. Looking at the default templates
>>>>> <https://github.com/prometheus/alertmanager/blob/master/template/default.tmpl>,
>>>>> it seems like the way to do that is to check for the length of
>>>>> .Alerts.Firing and .Alerts.Resolved: {{ if gt (len .Alerts.Firing) 0 }} 
>>>>> etc.
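>>>>>
>>>>> For instance, a custom subject template could be defined in a file loaded
>>>>> via "templates" in alertmanager.yml (a sketch; the template name here is
>>>>> made up):
>>>>>
>>>>>   {{ define "email.subject.custom" }}{{ if gt (len .Alerts.Firing) 0 }}[FIRING]{{ else }}[RESOLVED]{{ end }} {{ .CommonLabels.alertname }}{{ end }}
>>>>>
>>>>> and then referenced from the email config:
>>>>>
>>>>>   headers:
>>>>>     subject: '{{ template "email.subject.custom" . }}'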
>>>>>
>>>>> /MR
>>>>>
>>>>> On Tue, Mar 30, 2021 at 9:02 AM Stuart Clark <[email protected]>
>>>>> wrote:
>>>>>
>>>>>> On 29/03/2021 14:20, Sebastian Glock wrote:
>>>>>> > I tried to do something like this:
>>>>>> >
>>>>>> > ```
>>>>>> > - name: alert-emailer-cpu-1h
>>>>>> >   email_configs:
>>>>>> >   - to: '[email protected]'
>>>>>> >     send_resolved: false
>>>>>> >     from: '*****'
>>>>>> >     smarthost: '*****'
>>>>>> >     auth_username: '*****'
>>>>>> >     auth_password: '*****'
>>>>>> >     auth_secret: '*****'
>>>>>> >     auth_identity: '*****'
>>>>>> >     html: '{{ template "email" .}}'
>>>>>> >     headers:
>>>>>> >       subject: "[CUK]{{ .CommonLabels.severity }} {{ .CommonLabels.instance }} {{ .CommonLabels.alertname }} | {{ .CommonAnnotations.description }}"
>>>>>> >
>>>>>> > - name: alert-emailer-cpu-1h-resolved
>>>>>> >   email_configs:
>>>>>> >   - to: '[email protected]'
>>>>>> >     send_resolved: true
>>>>>> >     from: '*****'
>>>>>> >     smarthost: '*****'
>>>>>> >     auth_username: '*****'
>>>>>> >     auth_password: '*****'
>>>>>> >     auth_secret: '*****'
>>>>>> >     auth_identity: '*****'
>>>>>> >     html: '{{ template "email" .}}'
>>>>>> >     headers:
>>>>>> >       subject: "[CUK][RESOLVED]{{ .CommonLabels.severity }} {{ .CommonLabels.instance }} {{ .CommonLabels.alertname }} | {{ .CommonAnnotations.description }}"
>>>>>> > ```
>>>>>> >
>>>>>> > One receiver without send_resolved, the second with send_resolved: true.
>>>>>> > I thought I would get an email from "alert-emailer-cpu-1h-resolved" five
>>>>>> > minutes after the first alert (sent via "alert-emailer-cpu-1h") goes
>>>>>> > away. Is there any other way to solve that?
>>>>>> >
>>>>>> > Thanks for your help!
>>>>>>
>>>>>> Could you post the whole alertmanager config?
>>>>>>
>>>>>> --
>>>>>> Stuart Clark
>>>>>>
>>>>>
