Re: Best way to emit custom metrics to Prometheus in spark structured streaming

2020-11-04 Thread meetwes
So I tried it again in standalone mode (spark-shell) and the df.observe() functionality works. I tried sum, count, conditional aggregations using 'when', etc., and all of them work in spark-shell. But with spark-on-k8s in cluster mode, only lit() works as the aggregation column. No other
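
For reference, a minimal sketch of the kind of observe() call being described (Spark 3.0+); the input columns value and status and the metric names are assumptions for illustration, not from the original post:

import org.apache.spark.sql.functions._

// Register named metrics on the streaming dataframe; they are computed
// per micro-batch and surfaced through StreamingQueryProgress.
val observed = df.observe(
  "custom_metrics",
  count(lit(1)).as("rows"),                  // plain count (the lit()-only case)
  sum(col("value")).as("totalValue"),        // sum aggregation
  sum(when(col("status") === "dropped", 1)   // conditional aggregation via when()
    .otherwise(0)).as("droppedRecords")
)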

Re: Best way to emit custom metrics to Prometheus in spark structured streaming

2020-11-04 Thread meetwes
Hi, thanks for the reply. I tried it out today but I am unable to get it to work in cluster mode: the aggregation result is always 0. It works fine in standalone mode with spark-shell, but with Spark on Kubernetes in cluster mode it doesn't.
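
For context, the values registered with observe() are delivered to the driver through a StreamingQueryListener. A minimal sketch, assuming a metric named "custom_metrics" with a "droppedRecords" column (names are illustrative):

import org.apache.spark.sql.streaming.StreamingQueryListener
import org.apache.spark.sql.streaming.StreamingQueryListener._

spark.streams.addListener(new StreamingQueryListener {
  override def onQueryStarted(e: QueryStartedEvent): Unit = ()
  override def onQueryTerminated(e: QueryTerminatedEvent): Unit = ()
  override def onQueryProgress(e: QueryProgressEvent): Unit = {
    // observedMetrics is a java.util.Map[String, Row], one entry per observe() name.
    Option(e.progress.observedMetrics.get("custom_metrics")).foreach { row =>
      val dropped = row.getAs[Long]("droppedRecords")
      println(s"droppedRecords=$dropped") // replace with a Prometheus gauge update
    }
  }
})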

Best way to emit custom metrics to Prometheus in spark structured streaming

2020-11-02 Thread meetwes
Hi, I am looking for the right approach to emit custom metrics from a Spark Structured Streaming job. *Actual Scenario:* I have an aggregated dataframe, let's say with (id, key, value) columns. One of the KPIs could be 'droppedRecords', and the corresponding value column holds the number of dropped
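
One possible shape for this, as a rough sketch only: pull the KPI row out of each micro-batch with foreachBatch and set a Prometheus gauge on the driver. It assumes the (id, key, value) dataframe above (aggDf), a LongType value column, and the Prometheus Java simpleclient on the classpath; all names are hypothetical.

import io.prometheus.client.Gauge
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col

// Hypothetical gauge, registered once per JVM on the driver.
val droppedGauge: Gauge = Gauge.build()
  .name("dropped_records")
  .help("Number of dropped records per micro-batch")
  .register()

// Extract the droppedRecords KPI from each micro-batch and set the gauge.
val pushMetrics: (DataFrame, Long) => Unit = (batch, batchId) => {
  batch.filter(col("key") === "droppedRecords")
    .select(col("value"))
    .collect() // tiny result set: at most one row per KPI
    .foreach(r => droppedGauge.set(r.getLong(0).toDouble)) // assumes LongType value
}

val query = aggDf.writeStream
  .foreachBatch(pushMetrics)
  .outputMode("update")
  .start()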