On Fri, May 6, 2022 at 2:54 PM Sebastian Osztovits <
[email protected]> wrote:

> So I did a bit of research and created a template:
>
> {{ define "email.html" }}
> {{ range .Alerts }}<pre>
> ========start==========
> Alarm procedure: Prometheus_alert
> Alarm level: {{ .Labels.severity }}
> Alarm type: {{ .Labels.alertname }}
> Fault host: {{ .Labels.instance }}
> Alarm theme: {{ .Annotations.summary }}
> Alarm Detail: {{ .Annotations.description }}
> Trigger time: {{ .StartsAt }}
> ========end==========</pre>
> {{ end }}
> {{ end }}
>
> And referenced it in alertmanager.yml:
>
> global:
>
> templates:
>   - '/etc/prometheus/alertmanager/template/*.tmpl'
>
> route:
>   receiver: smtp-local
>   repeat_interval: 24h
>
> receivers:
>   - name: 'smtp-local'
>     email_configs:
>       - to: 'warnmessage@tld'
>         from: 'prometheusserver@tld'
>         html: '{{Template "email.to.html".}} '  # Specify the template to use; if not specified, the default template is loaded
>         require_tls: false
>         smarthost: server:25
>         send_resolved: true
>
> So any idea why the email I receive only contains the subject and is
> otherwise just empty?
> The log just says:
> May 06 12:36:25 echelon alertmanager[528319]: ts=2022-05-06T12:36:25.823Z
> caller=dispatch.go:352 level=error component=dispatcher msg="Notify for
> alerts failed" num_alerts=1 err="smtp-local/email[0]: notify retry canceled
> due to unrecoverable error after 1 attempts: execute html template:
> template: :1: function \"Td" num_alerts=1 err="smtp-local/email[0]: notify
> retry canceled due to unrecoverable error after 1 attempts: execute html
> template: template: :1: function \"Template\" not defined">
>

It's saying 'function \"Template\" not defined' because the function name is
incorrectly upper-cased in your config. Try "template" instead of
"Template". The "template" action is defined at the Go template level:
https://pkg.go.dev/text/template#hdr-Actions
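
For illustration, a corrected email_configs section might look like the sketch below (untested, reusing the addresses from your config above). Besides the lower-casing, note that the name passed to the template action has to match the name in your {{ define }}, which in the template you posted is "email.html", not "email.to.html":

```yaml
receivers:
  - name: 'smtp-local'
    email_configs:
      - to: 'warnmessage@tld'
        from: 'prometheusserver@tld'
        # lowercase "template"; the name must match the {{ define }} in the .tmpl file
        html: '{{ template "email.html" . }}'
        require_tls: false
        smarthost: server:25
        send_resolved: true
```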


> What is function "Td"? Why does it fail? How can I fix it?
>

I think that first log line is just cut off for some reason, maybe when
copying it from a terminal or something?


> Regards,
> Sebastian
>
>
> On Wednesday, May 4, 2022 at 12:48:52 PM UTC+2 [email protected]
> wrote:
>
>> Hi Sebastian,
>>
>> Ah yes, the issue is that different data is available in alerting rules on
>> the Prometheus side than in notification templates on the Alertmanager
>> side.
>>
>> * For alerting rules, see:
>> https://prometheus.io/docs/prometheus/latest/configuration/alerting_rules/#templating.
>> So you only get: $labels, $externalLabels, and $value as variables.
>>
>> * For Alertmanager notification templates, you get a whole list of alerts
>> with some more fields:
>> https://prometheus.io/docs/alerting/latest/notifications/#data
>>
>> So if you want to access the ".StartsAt" field of an alert, you would
>> have to access that from an Alertmanager notification template.
>>
>> As a very simplistic example, you could set the "html" field of an
>> "email_config" in alertmanager.yml to something like this to range over all
>> alerts in the notification and output the ".StartsAt" field for each of
>> them:
>>
>>   html: "{{ range .Alerts }}{{ .StartsAt }}<br/>{{ end }}"
>>
>> Of course in practice you probably also want to put in other alert info
>> (labels, annotations, etc.) in that loop. See the default for the "html"
>> field in
>> https://github.com/prometheus/alertmanager/blob/main/template/email.tmpl
>> as inspiration.
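>>
>> For instance, a slightly expanded (still untested) version of that loop,
>> also pulling in the alert name, severity, and summary annotation, could be:
>>
>> ```yaml
>> email_configs:
>>   - to: 'you@example.com'  # hypothetical address
>>     html: >-
>>       {{ range .Alerts }}
>>       <b>{{ .Labels.alertname }}</b> ({{ .Labels.severity }})<br/>
>>       {{ .Annotations.summary }}<br/>
>>       Started at: {{ .StartsAt }}<br/><br/>
>>       {{ end }}
>> ```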
>>
>> Cheers,
>> Julius
>>
>>
>>
>> On Wed, May 4, 2022 at 9:42 AM Sebastian Osztovits <
>> [email protected]> wrote:
>>
>>> Hi,
>>> As you have guessed I want to know the time when the alert first started
>>> firing.
>>> I simply tried adding it to my alerting rule as in the example:
>>>
>>>
>>>   - alert: InstanceDown
>>>     expr: up == 0
>>>     for: 5m
>>>     labels:
>>>       severity: critical
>>>     annotations:
>>>       timestamp: >
>>>         time:  {{ .StartsAt }}
>>>
>>> Which leaves me with the error:
>>> = <error expanding template: error executing template
>>> __alert_InstanceDown: template: __alert_InstanceDown:1:90: executing
>>> "__alert_InstanceDown" at <.StartsAt>: can't evaluate field StartsAt in
>>> type struct { Labels map[string]string; ExternalLabels map[string]string;
>>> Value float64 }>
>>>
>>> Regards,
>>> Sebastian
>>>
>>>
>>>
>>> On Tuesday, May 3, 2022 at 9:47:25 PM UTC+2 [email protected] wrote:
>>>
>>>> Hi Sebastian,
>>>>
>>>> What error are you getting and what's your config? The first answer
>>>> puts the current time of the evaluation of the alerting rule into the
>>>> "time" annotation (which wouldn't necessarily be the time the alert started
>>>> firing, but the last time it was still evaluated as firing). The second
>>>> Stack Overflow answer is maybe more what you want, accessing the
>>>> ".StartsAt" field of an alert directly from an Alertmanager notification
>>>> template, and thus telling you when the alert first started firing. If you
>>>> share what you already have (and which kind of timestamp you expect), maybe
>>>> we can help you make it work.
>>>>
>>>> Regards,
>>>> Julius
>>>>
>>>> On Tue, May 3, 2022 at 11:49 AM Sebastian Osztovits <
>>>> [email protected]> wrote:
>>>>
>>>>>
>>>>> Hello,
>>>>> I'm currently using Alertmanager 0.22.2 and Prometheus 2.18.1.
>>>>> I have tried to alter my alert as shown in this Stack Overflow post:
>>>>>
>>>>>
>>>>> https://stackoverflow.com/questions/41069912/populating-a-date-time-in-prometheus-alert-manager-e-mail
>>>>> yet I am met with an error message when doing so.
>>>>> Does anyone have an email template where they successfully added a
>>>>> timestamp?
>>>>> Thanks in advance.
>>>>> Best Regards,
>>>>> Sebastian
>>>>>
>>>>> --
>>>>> You received this message because you are subscribed to the Google
>>>>> Groups "Prometheus Users" group.
>>>>> To unsubscribe from this group and stop receiving emails from it, send
>>>>> an email to [email protected].
>>>>> To view this discussion on the web visit
>>>>> https://groups.google.com/d/msgid/prometheus-users/e03a03be-4d88-4ded-b969-f36a06c469f2n%40googlegroups.com
>>>>> <https://groups.google.com/d/msgid/prometheus-users/e03a03be-4d88-4ded-b969-f36a06c469f2n%40googlegroups.com?utm_medium=email&utm_source=footer>
>>>>> .
>>>>>
>>>>
>>>>
>>>> --
>>>> Julius Volz
>>>> PromLabs - promlabs.com
>>>>
>>
>>
>> --
>> Julius Volz
>> PromLabs - promlabs.com
>>


-- 
Julius Volz
PromLabs - promlabs.com
