OK Stuart Clark, thanks for your help.

Success, I got the expected result.
The documentation does not make it clear which files the labels need to be added to.

I added the labels in each file:
site01.yml
- targets:
    - site01.localhost
  labels:
    group: slack01

site02.yml
- targets:
    - site02.localhost
  labels:
    group: slack02
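
Once Prometheus reloads those files, a quick check that the labels were applied is a 
query like this in the expression browser (assuming the blackbox job quoted below is 
what scrapes these targets):

probe_success{group="slack01"}

If the group label shows up on probe_success, it is also carried onto any alert fired 
from a rule over that series.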

In the alertmanager file I added:

alertmanager.yml

  routes:
    - receiver: slack01
      match:
        group: slack01

    - receiver: slack02
      match:
        group: slack02
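
For completeness, a minimal sketch of where those routes sit in the full routing tree 
(assuming receivers named slack01 and slack02 are defined under receivers:, and that 
slack01 is acceptable as the default):

route:
  receiver: slack01
  group_by: ['alertname', 'instance']
  routes:
    - receiver: slack01
      match:
        group: slack01
    - receiver: slack02
      match:
        group: slack02

With this, alerts carrying group=slack01 go to the slack01 receiver, group=slack02 go 
to slack02, and anything else falls back to the default receiver.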

On Friday, 20 November 2020 at 15:46:09 UTC-3, Stuart Clark wrote:

> On 20/11/2020 18:16, juan chaves wrote:
>
>
> I read the documentation, but I have no idea how to put this part together. 
> Would you have an example?
>
>
> Assuming YAML, your file is currently something like:
>
> targets:
>   - target1
>   - target2
>
> You'd just add a labels section as well
>
> labels:
>   label1: value
>   label2: value
>
> So, in total
>
> targets:
>   - target1
>   - target2
>
> labels:
>   label1: value
>   label2: value
>
>
> On Friday, 20 November 2020 at 14:02:12 UTC-3, Stuart Clark wrote:
>
>> On 20/11/2020 16:47, juan chaves wrote:
>>
>> I understand what you said.
>>
>> However, there is the blackbox_exporter with the files site01.yml and 
>> site02.yml in file_sd_configs, and I need each one to be directed to a different 
>> alert rule. Then it is easy to target the alert rules by matching them in the 
>> Alertmanager routes.
>>
>> You can set labels within the YAML or JSON files used by file_sd. See 
>> https://prometheus.io/docs/prometheus/latest/configuration/configuration/#file_sd_config
>>
>> You could add a label in each file with two different values (so value 1 
>> in file 1 and value 2 in file 2). You can then use that label in your 
>> alerting rules, which allows you to direct the alerts as desired in the 
>> Alertmanager routing, as mentioned.
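>>
>> For example, a minimal sketch of such a rule (the rule group and alert name here 
>> are placeholders of mine): a "group" label set in the file_sd file ends up on the 
>> scraped probe_success series and therefore on the alert, so the Alertmanager 
>> route can match on it.
>>
>> groups:
>>   - name: blackbox
>>     rules:
>>       - alert: SiteDown
>>         expr: probe_success == 0
>>         for: 5m
>>         annotations:
>>           description: '{{ $labels.instance }} has been unreachable for 5 minutes'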
>>
>>
>> On Friday, 20 November 2020 at 13:15:51 UTC-3, 
>> [email protected] wrote:
>>
>>> By adding labels to some alerts (which you can do in your alerting 
>>> rules), and by matching on those labels in the "routes 
>>> <https://prometheus.io/docs/alerting/latest/configuration/#route>" 
>>> section of your alertmanager config. 
>>>
>>> There are examples here 
>>> <https://prometheus.io/docs/alerting/latest/configuration/#example> and 
>>> in the archives of this group.  Also, you may find this interactive routing 
>>> tree editor/tester helpful:
>>> https://prometheus.io/webtools/alerting/routing-tree-editor/
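>>>
>>> If you prefer the command line, amtool can test routing too, printing which 
>>> receiver a given label set would be routed to (a sketch; exact flags may vary 
>>> between Alertmanager versions):
>>>
>>> amtool config routes test --config.file=alertmanager.yml group=slack02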
>>>
>>> On Friday, 20 November 2020 at 16:05:51 UTC [email protected] wrote:
>>>
>>>> Hello people,
>>>> this is my blackbox configuration; it has two files (lists of sites). 
>>>> Below is my configuration of alerts using Slack.
>>>>
>>>>
>>>> My question is: how can I use a different receiver for each list in the 
>>>> blackbox? E.g. sites01.yml uses Slack and
>>>> sites02.yml uses email.
>>>>
>>>>
>>>> ####config blackbox  
>>>> - job_name: 'blackbox'
>>>>   scrape_interval: 30s
>>>>   scrape_timeout: 20s
>>>>   metrics_path: /probe
>>>>   params:
>>>>     module: [http_2xx]  # Look for a HTTP 200 response.
>>>>   file_sd_configs:
>>>>     - files:
>>>>         - /etc/prometheus/sites01.yml
>>>>         - /etc/prometheus/sites02.yml
>>>>   relabel_configs:
>>>>     - source_labels: [__address__]
>>>>       target_label: __param_target
>>>>     - source_labels: [__param_target]
>>>>       target_label: instance
>>>>     - target_label: __address__
>>>>       replacement: "blackbox_exporter:9115"  # Blackbox exporter.
>>>>
>>>> ###config alertmanager
>>>> receivers:
>>>>   - name: 'slack'
>>>>     slack_configs:
>>>>     - api_url: 'https://hooks.slack.com/services/...'
>>>>       send_resolved: true
>>>>       username: 'jcm'
>>>>       channel: '#alert'
>>>>       title: '[{{ .Status | toUpper }}{{ if eq .Status "firing" }}:{{ .Alerts.Firing | len }}{{ end }}] Notificações de Status - {{if eq .Status "firing"}}Site(s) OffLine!{{else}}Sites Successful!{{end}} {{if eq .Status "firing"}}:fire:{{else}}:raised_hands:{{end}}'
>>>>       text: "{{ range .Alerts }}*Description:* {{.Annotations.description}}\n{{end}}"
>>>>
