Sorry for the delay. I updated my config based on your suggestion, and that
indeed fixed it. I am very grateful, thank you very much!

          slack_api_url: "${SECRET_ALERT_MANAGER_DISCORD_WEBHOOK}/slack" ## Discord
          # slack_api_url: "${SECRET_ALERT_MANAGER_SLACK_WEBHOOK}" ## Slack

I was just waiting for Prometheus to fire an alert to Discord after the
changes, which has just happened.
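
(In case it helps anyone else: you don't have to wait for a live alert to
verify the endpoint - you can POST a Slack-format test message to the /slack
URL yourself. A minimal sketch, with the webhook URL as a placeholder:)

import json
import urllib.request

# Placeholder: your Discord webhook URL with the /slack suffix appended
URL = "https://discord.com/api/webhooks/{webhook.id}/{webhook.token}/slack"

payload = {"text": "test message from Alertmanager setup"}
req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # expect a 2xx status if Discord accepted it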

Thanks again for your help!


*Best regards,*

*Fabrice Semti*

📧 <[email protected]> [email protected]   📧
 [email protected] <[email protected]>
🌐 fabricesemti.com <https://fabricesemti.com/>


Brian Candler <[email protected]> ezt írta (időpont: 2022. aug. 30., K,
15:51):

> And the URL you've configured definitely looks like
>     /webhooks/{webhook.id}/{webhook.token}/slack
> ?
>
> On Tuesday, 30 August 2022 at 15:22:03 UTC+1 Fabrice Semti wrote:
>
>> Thanks for your reply.
>>
>> I mean, if I replace SECRET_ALERT_MANAGER_DISCORD_WEBHOOK with
>> SECRET_ALERT_MANAGER_SLACK_WEBHOOK (so using an actual Slack webhook instead
>> of a Discord one, pointing to a freshly created Slack channel of mine), the
>> alerts are indeed coming in...
>>
>>
>> [image: image.png]
>> So indeed something must be amiss between Alertmanager and Discord... I
>> will do some reading and reach out to Discord support. I thought the
>> Discord webhook was by default interchangeable with the Slack one; at
>> least the helmreleases I have seen seemed to indicate this.
>> Will circle back here if that avenue is exhausted...
>>
>>
>> *Best regards,*
>>
>> *Fabrice Semti*
>>
>> 📧 [email protected]   📧 [email protected]
>> 🌐 fabricesemti.com <https://fabricesemti.com/>
>> Brian Candler <[email protected]> wrote (on Tue, 30 Aug 2022 at 14:15):
>>
>>> I think the part you've missed is that an *alertmanager* webhook message
>>> is completely different to a *discord* webhook message.  You need an
>>> application in between to translate - that is, it receives an alertmanager
>>> webhook message and sends it on to Discord with a different payload.
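>>>
>>> For illustration, a minimal sketch of such a translator in Python
>>> (hypothetical - standard library only; the Discord URL is a placeholder):
>>>
>>> import json
>>> import urllib.request
>>> from http.server import BaseHTTPRequestHandler, HTTPServer
>>>
>>> DISCORD_WEBHOOK = "https://discord.com/api/webhooks/ID/TOKEN"  # placeholder
>>>
>>> class Translator(BaseHTTPRequestHandler):
>>>     def do_POST(self):
>>>         # Read the Alertmanager webhook payload (JSON)
>>>         body = self.rfile.read(int(self.headers["Content-Length"]))
>>>         data = json.loads(body)
>>>         # Re-shape it into a native Discord payload: a plain "content" string
>>>         lines = [
>>>             "[{}] {}".format(a["status"].upper(),
>>>                              a["labels"].get("alertname", "?"))
>>>             for a in data.get("alerts", [])
>>>         ]
>>>         req = urllib.request.Request(
>>>             DISCORD_WEBHOOK,
>>>             data=json.dumps({"content": "\n".join(lines)}).encode(),
>>>             headers={"Content-Type": "application/json"},
>>>         )
>>>         urllib.request.urlopen(req)
>>>         self.send_response(200)
>>>         self.end_headers()
>>>
>>> # Point an alertmanager webhook_config at http://<this-host>:9094/
>>> HTTPServer(("", 9094), Translator).serve_forever()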
>>>
>>> If you google "alertmanager discord" then the first two hits are
>>> projects which do exactly this:
>>>
>>> https://github.com/benjojo/alertmanager-discord
>>> https://github.com/rogerrum/alertmanager-discord
>>>
>>> Now, you linked to this bit of helm chart:
>>>
>>> https://github.com/cbirkenbeul/k3s-gitops/blob/main/cluster/apps/monitoring/kube-prometheus-stack/helmrelease.yaml
>>>
>>> That doesn't use alertmanager webhooks at all.  It uses the *slack*
>>> sending capability of alertmanager.
>>>
>>> Does Discord accept Slack-formatted API messages?  If so, then it might
>>> work.  This suggests that it could:
>>>
>>> https://stackoverflow.com/questions/66076875/discord-slack-compatible-webhook
>>> But there's a separate API endpoint for accepting Slack-format requests:
>>>
>>> https://discord.com/developers/docs/resources/webhook#execute-slackcompatible-webhook
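>>>
>>> That is, the two endpoint shapes are (id and token are placeholders):
>>>
>>>   https://discord.com/api/webhooks/{webhook.id}/{webhook.token}        (native Discord payload)
>>>   https://discord.com/api/webhooks/{webhook.id}/{webhook.token}/slack  (Slack-compatible payload)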
>>>
>>> So either:
>>> 1. you're not POSTing to the correct endpoint (the Slack-compatible one)
>>> 2. something about alertmanager's request makes Discord unhappy (i.e.
>>> it's not as Slack-compatible as it claims to be)
>>>
>>> If you think it's case (2) then you can take it up with Discord support
>>> - although they probably will want to see the body of the HTTP request, and
>>> the body of the 400 response that comes back.
>>>
>>> On Tuesday, 30 August 2022 at 14:00:04 UTC+1 Fabrice Semti wrote:
>>>
>>>>
>>>>
>>>> *Best regards,*
>>>>
>>>> *Fabrice Semti*
>>>>
>>>> 📧 [email protected]   📧 [email protected]
>>>> 🌐 fabricesemti.com <https://fabricesemti.com/>
>>>>
>>>>
>>>> ---------- Forwarded message ---------
>>>> From: Fabrice Semti <[email protected]>
>>>> Date: Tue, 30 Aug 2022, 13:42
>>>> Subject: Re: [prometheus-users] Re: discord - "unrecoverable error"
>>>> To: Brian Candler <[email protected]>
>>>>
>>>>
>>>> Hi Brian,
>>>>
>>>> Thanks for the reply.
>>>>
>>>> You haven't said what you've configured as the receiver (target) of the
>>>>> webhook.
>>>>
>>>>
>>>> So I have created a webhook on my Discord channel. (The same webhook
>>>> was tested with the online tester.)
>>>>
>>>> [image: image.png]
>>>>
>>>> Therefore, if you're pointing the Prometheus alertmanager webhook
>>>>> receiver *directly* at a Discord API, it definitely won't work. You will
>>>>> need some middleware to translate the Alertmanager webhook payload into
>>>>> something the target API understands.  Google "alertmanager discord" for
>>>>> some options.
>>>>>
>>>>
>>>> I have based my config on this one
>>>> <https://github.com/cbirkenbeul/k3s-gitops/blob/main/cluster/apps/monitoring/kube-prometheus-stack/helmrelease.yaml>,
>>>> which seems to be in line with what I found for "alertmanager discord"...
>>>>
>>>> My config is:
>>>>
>>>>     alertmanager:
>>>>       config:
>>>>         global:
>>>>           slack_api_url: "${SECRET_ALERT_MANAGER_DISCORD_WEBHOOK}"
>>>>           resolve_timeout: 5m
>>>>         receivers:
>>>>           - name: "null"
>>>>           - name: "discord"
>>>>             slack_configs:
>>>>               - channel: "#prometheus"
>>>>                 icon_url: https://avatars3.githubusercontent.com/u/3380462
>>>>                 username: "Prometheus"
>>>>                 send_resolved: true
>>>>                 title: |-
>>>>                   [{{ .Status | toUpper }}{{ if eq .Status "firing" }}:{{ .Alerts.Firing | len }}{{ end }}] {{ if ne .CommonAnnotations.summary ""}}{{ .CommonAnnotations.summary }}{{ else }}{{ .CommonLabels.alertname }}{{ end }}
>>>>                 text: >-
>>>>                   {{ range .Alerts -}}
>>>>                     **Alert:** {{ .Annotations.title }}{{ if .Labels.severity }} - `{{ .Labels.severity }}`{{ end }}
>>>>                   **Description:** {{ if ne .Annotations.description ""}}{{ .Annotations.description }}{{else}}N/A{{ end }}
>>>>                   **Details:**
>>>>                     {{ range .Labels.SortedPairs }} • *{{ .Name }}:* `{{ .Value }}`
>>>>                     {{ end }}
>>>>                   {{ end }}
>>>>         route:
>>>>           group_by: ["alertname", "job"]
>>>>           group_wait: 30s
>>>>           group_interval: 5m
>>>>           repeat_interval: 6h
>>>>           receiver: "discord"
>>>>           routes:
>>>>             - receiver: "null"
>>>>               match:
>>>>                 alertname: Watchdog
>>>>             - receiver: "discord"
>>>>               match_re:
>>>>                 severity: critical|warning|error
>>>>               continue: true
>>>>         inhibit_rules:
>>>>           - source_match:
>>>>               severity: "critical"
>>>>             target_match:
>>>>               severity: "warning"
>>>>             equal: ["alertname", "namespace"]
>>>>
>>>>       ingress:
>>>>         enabled: true
>>>>         pathType: Prefix
>>>>         ingressClassName: "nginx"
>>>>         annotations:
>>>>           cert-manager.io/cluster-issuer: "letsencrypt-production"
>>>>           external-dns.alpha.kubernetes.io/target: "ipv4.${SECRET_DOMAIN}"
>>>>           external-dns/is-public: "true"
>>>>           hajimari.io/enable: "true"
>>>>           hajimari.io/icon: "fire-alert"
>>>>           # nginx.ingress.kubernetes.io/auth-url: "https://oauth.${SECRET_DOMAIN}/oauth2/auth"
>>>>           # nginx.ingress.kubernetes.io/auth-signin: "https://oauth.${SECRET_DOMAIN}/oauth2/start?rd=$scheme://$best_http_host$request_uri"
>>>>           # nginx.ingress.kubernetes.io/auth-response-headers: "x-auth-request-user, x-auth-request-email, x-auth-request-access-token"
>>>>         hosts:
>>>>           - &host "alert-manager.${SECRET_DOMAIN}"
>>>>         tls:
>>>>           - hosts:
>>>>               - *host
>>>>             secretName: "alert-manager-tls"
>>>>       alertmanagerSpec:
>>>>         storage:
>>>>           volumeClaimTemplate:
>>>>             spec:
>>>>               storageClassName: "hdd-nfs-storage"
>>>>               resources:
>>>>                 requests:
>>>>                   storage: 1Gi
>>>>
>>>>       resources:
>>>>         requests:
>>>>           cpu: 11m
>>>>           memory: 37M
>>>>
>>>>
>>>>
>>>> You can of course make one alertmanager receiver which does both, which
>>>>> is perhaps what you're already doing:
>>>>>
>>>>> receivers:
>>>>> - name: discord/slack
>>>>>   slack_configs:
>>>>>     - ...
>>>>>   webhook_configs:
>>>>>     - ...
>>>>>
>>>>>
>>>> Yes, pretty much that is what I am aiming for, and I only lightly
>>>> modified what I have seen in other repos (such as the one above, or this one
>>>> <https://github.com/North14/home-cluster/blob/125cc4938947e9ce07df6ba4bc7e3d6170590d41/cluster/apps/monitoring/kube-prometheus-stack/helm-release.yaml>
>>>> ), so I am not sure what is not right...
>>>> Any further clues would be much appreciated...
>>>>
>>>>
>>>>
>>>> *Best regards,*
>>>>
>>>> *Fabrice Semti*
>>>>
>>>> 📧 [email protected]   📧 [email protected]
>>>> 🌐 fabricesemti.com <https://fabricesemti.com/>
>>>>
>>>>
>>>> Brian Candler <[email protected]> wrote (on Tue, 30 Aug 2022 at 11:11):
>>>>
>>>>> You haven't said what you've configured as the receiver (target) of
>>>>> the webhook.
>>>>>
>>>>> Whatever it is, it is responding with a 400 error, which means that
>>>>> the request being sent is invalid.
>>>>>
>>>>> The format of the webhook payload sent by Alertmanager is *fixed*.
>>>>> For an example see:
>>>>>
>>>>> https://prometheus.io/docs/alerting/latest/configuration/#webhook_config
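>>>>>
>>>>> For reference, an abridged example of that payload (field names as
>>>>> documented at the link above; values here are illustrative):
>>>>>
>>>>> {
>>>>>   "version": "4",
>>>>>   "status": "firing",
>>>>>   "receiver": "discord",
>>>>>   "groupLabels": {"alertname": "Watchdog"},
>>>>>   "commonLabels": {"alertname": "Watchdog"},
>>>>>   "commonAnnotations": {},
>>>>>   "externalURL": "http://alertmanager.example:9093",
>>>>>   "alerts": [
>>>>>     {
>>>>>       "status": "firing",
>>>>>       "labels": {"alertname": "Watchdog"},
>>>>>       "annotations": {},
>>>>>       "startsAt": "2022-08-30T06:50:59Z",
>>>>>       "endsAt": "0001-01-01T00:00:00Z",
>>>>>       "generatorURL": "http://prometheus.example:9090/graph?..."
>>>>>     }
>>>>>   ]
>>>>> }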
>>>>>
>>>>> This will be different to what a "Discord webhook tester" sends.
>>>>>
>>>>> Therefore, if you're pointing the Prometheus alertmanager webhook
>>>>> receiver *directly* at a Discord API, it definitely won't work. You will
>>>>> need some middleware to translate the Alertmanager webhook payload into
>>>>> something the target API understands.  Google "alertmanager discord" for
>>>>> some options.
>>>>>
>>>>> As for sending to slack, you should use the Alertmanager built-in
>>>>> slack integration, not webhook:
>>>>> https://prometheus.io/docs/alerting/latest/configuration/#slack_config
>>>>>
>>>>> You can of course make one alertmanager receiver which does both,
>>>>> which is perhaps what you're already doing:
>>>>>
>>>>> receivers:
>>>>> - name: discord/slack
>>>>>   slack_configs:
>>>>>     - ...
>>>>>   webhook_configs:
>>>>>     - ...
>>>>>
>>>>> On Tuesday, 30 August 2022 at 08:03:42 UTC+1 [email protected] wrote:
>>>>>
>>>>>> Hi Everyone,
>>>>>>
>>>>>> Seeking some help here - I must add that I am very new to
>>>>>> Prometheus/Alertmanager.
>>>>>>
>>>>>> - I set up the kube-prometheus-stack as per:
>>>>>> https://github.com/fabricesemti80/home-stack/blob/42f23c6e62184b037908e84d52776a717889ff20/cluster/apps/monitoring/kube-prometheus-stack/helmrelease.yaml#L81
>>>>>>
>>>>>> - log errors are similar to this:
>>>>>>
>>>>>> ts=2022-08-30T06:50:59.456Z caller=dispatch.go:354 level=error component=dispatcher msg="Notify for alerts failed" num_alerts=3 err="discord/slack[0]: notify retry canceled due to unrecoverable error after 1 attempts: channel \"#prometheus\": unexpected status code 400: {\"attachments\": [\"0\"]}"
>>>>>>
>>>>>> - I used this site <https://disforge.com/tool/webhook-tester> to
>>>>>> test the webhook
>>>>>>
>>>>>> - the test - with the same parameters used in the manifest - was
>>>>>> successful (see attached screenshot), therefore I think my settings are
>>>>>> correct.
>>>>>>
>>>>>> Any suggestions on what I am missing, please?
>>>>>>
>>>>>> Thanks,
>>>>>> F.
>>>>>>