Hi all, I'm working on a custom service discovery integration for an in-house database that holds lots of info about potential scrape targets. So far I've been using Prometheus's file-based service discovery, generating the target files with a Python script.
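For context, here's a minimal sketch of what such a generator can look like. `get_targets()` is a hypothetical stand-in for the database query, and the hostnames and label keys are purely illustrative; the one real detail worth copying is the atomic write (temp file plus `os.replace`), so Prometheus never reads a half-written file:

```python
import json
import os
import tempfile


def get_targets():
    # Hypothetical stand-in for the in-house database query.
    return [
        {"host": "db01.example.com:9100", "scheme": "https", "cred_group": "team-a"},
        {"host": "db02.example.com:9100", "scheme": "http", "cred_group": "team-b"},
    ]


def write_file_sd(path):
    """Write a file_sd JSON file atomically for Prometheus to pick up."""
    groups = [
        {
            "targets": [t["host"]],
            "labels": {"scheme": t["scheme"], "cred_group": t["cred_group"]},
        }
        for t in get_targets()
    ]
    # Write to a temp file in the same directory, then rename over the
    # destination: os.replace is atomic on POSIX, and Prometheus re-reads
    # the file when it changes.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".", suffix=".json")
    with os.fdopen(fd, "w") as f:
        json.dump(groups, f, indent=2)
    os.replace(tmp, path)
```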
The main issue I'm having is that many of these scrape targets require basic authentication with unique credentials, differ on http vs https, and some need specific relabel rules to achieve the desired outcome. Most of this information is set at the scrape_config level, but service discovery operates at the static_configs level, so only targets and labels can be discovered. Does anyone have a pattern for dynamically providing Prometheus with things like auth credentials, relabel rules etc. alongside the targets?

My only idea so far is to manage the entire Prometheus config file from this script and periodically regenerate it with fresh information from the database. I'd like to avoid that, though, because the file is already managed by an existing configuration management pipeline.

Cheers,
Ben
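One pattern that fits within these constraints (a sketch, with illustrative job names and file paths): since basic_auth can only live at the scrape_config level, group targets by credential set and give each group its own scrape_config with its own file_sd file, while per-target details that *can* be expressed as labels, like the scheme, are attached by the generator and promoted via relabeling onto the special `__scheme__` label:

```yaml
# Sketch: one scrape_config per credential group; names/paths are illustrative.
scrape_configs:
  - job_name: db-team-a
    basic_auth:
      username: team_a_user
      password_file: /etc/prometheus/secrets/team-a.pass
    file_sd_configs:
      - files:
          - /etc/prometheus/file_sd/team-a/*.json
    relabel_configs:
      # Let each target pick http or https via a 'scheme' label
      # emitted by the SD generator script.
      - source_labels: [scheme]
        regex: (https?)
        target_label: __scheme__
```

The trade-off is that adding a brand-new credential group still needs a config change, but the churn within each group (targets coming and going) stays entirely in the generated file_sd files, so the main config stays under the existing configuration management pipeline.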

