"If you build a powerful prometheus server then a total of 2 million 
timeseries is doable; beyond that you ought to look at sharding across 
multiple servers."

I think you forgot a zero.

[email protected] wrote on Tuesday, 15 September 2020 at 09:46:49 UTC+2:

> On Tuesday, 15 September 2020 06:11:48 UTC+1, Nick wrote:
>>
>> Keeping cardinality explosion in mind, what's a decent maximum number of 
>> exported metrics that can be considered performant for scraping and 
>> time-series processing?
>
>
> It depends on how much resource you're prepared to throw at it.  If you 
> build a powerful prometheus server then a total of 2 million timeseries is 
> doable; beyond that you ought to look at sharding across multiple servers.
>
>> As I mainly need the counter total, I can split the web analytics to 
>> reduce the number of possible label combinations, for example:
>>
>> { domain, page }
>> { domain, browser }
>>
>
> Yes, that's fine, but you still want to limit the number of values for each 
> label.  As Stuart said: in the case of browser you don't want the raw 
> User-Agent header, but pick between a small pre-determined set of values 
> like "Firefox", "Chrome", "IE", "Other".  In the case of "page" strip out 
> any query string, and ideally also limit to known pages or "Other".
>
> If you also want to record the raw User-Agent values for every request 
> then do that in a separate logging system (e.g. loki, elasticsearch, a SQL 
> database, or even just plain text files).
>
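The label-hygiene advice above (a fixed set of browser values, query
strings stripped from pages, unknown pages collapsed to "Other") could be
sketched roughly like this in Python. Function and constant names here are
illustrative, not anything from the thread or from a Prometheus library:

```python
from urllib.parse import urlsplit

# Small, pre-determined set of browser label values, as suggested above.
KNOWN_BROWSERS = ("Firefox", "Chrome", "IE")

def browser_label(user_agent: str) -> str:
    """Map a raw User-Agent header to one of a few fixed label values,
    so cardinality stays bounded regardless of what clients send."""
    ua = user_agent.lower()
    for name in KNOWN_BROWSERS:
        if name.lower() in ua:
            return name
    return "Other"

def page_label(url: str, known_pages: set) -> str:
    """Strip the query string, and collapse paths outside a known
    allow-list to 'Other' to keep the 'page' label bounded too."""
    path = urlsplit(url).path
    return path if path in known_pages else "Other"
```

With this, `browser_label` and `page_label` give you label values safe to
use on counters like `{ domain, page }` and `{ domain, browser }`, while
the raw User-Agent goes to a separate logging system as described.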

-- 
You received this message because you are subscribed to the Google Groups 
"Prometheus Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/prometheus-users/ad68efaa-63c9-46f6-8a8a-7852ad1373d0n%40googlegroups.com.