Hi,
I managed to accomplish this by using side outputs [1] and stream union [2].
I used this approach in past projects where we were using exclusively the
Java DataStream API to create an alerting stream.

Every operator was able to produce a side output stream in case of an
exception/alert. Later we unioned all side output streams into one stream
and sank it to Kafka or any other sink.

If you would like to use HTTP as an async sink, you can look at the
flink-http-connector [3], which has sink support, so you can sink into an
HTTP REST service.

In my project we had a POJO representing an alert event. We wrapped every
operator's logic in a try/catch block; in the catch block we created a new
Alert event instance and sent it on that operator's side output. We did
this for every operator, then unioned all side output streams and sank
them to a Kafka topic.
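The try/catch part of the pattern is independent of Flink itself, so here
is a dependency-free sketch of the idea (all names are made up; in a real
job the `alerts` list would be the operator's side output, and the
unioning would be done by Flink's `union`):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

public class AlertPatternSketch {
    // Minimal stand-in for the Alert POJO described above; fields are illustrative.
    record Alert(String operatorName, String message) {}

    // Wraps a processing step: successes go to the returned list,
    // failures become Alert records in `alerts` instead of failing the job.
    static <I, O> List<O> guarded(String name, List<I> input,
                                  Function<I, O> step, List<Alert> alerts) {
        List<O> out = new ArrayList<>();
        for (I element : input) {
            try {
                out.add(step.apply(element));
            } catch (Exception e) {
                alerts.add(new Alert(name, e.getMessage()));
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // Plays the role of the unioned side output streams.
        List<Alert> alerts = new ArrayList<>();
        List<Integer> parsed =
                guarded("parser", List.of("1", "x", "3"), Integer::parseInt, alerts);
        System.out.println(parsed);        // the main output: successfully parsed values
        System.out.println(alerts.size()); // one alert, for the unparsable "x"
    }
}
```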

[1]
https://nightlies.apache.org/flink/flink-docs-master/docs/dev/datastream/side_output/
[2]
https://nightlies.apache.org/flink/flink-docs-master/docs/dev/datastream/operators/overview/#union
[3] https://github.com/getindata/flink-http-connector

Regards,
Krzysztof Chmielewski


On Wed, Dec 28, 2022 at 18:43 Pratham Kumar P via user <user@flink.apache.org>
wrote:

> Hello,
>
>
>
> We have a query regarding our use case.
>
>
>
> *Use Case:*
>
> We need to send descriptive alerts from Flink operators triggered based on
> certain criteria.
>
>
>
> *Queries:*
>
>    1. Is there any mechanism provided by Flink, which can be used to send
>    these alerts?
>    2. We considered using the Async operators and sending the alerts via an
>    async REST HTTP call. If the answer to query 1 is No, then is this the
>    recommended way, considering there is a possibility of back-pressure in
>    case of high load?
>
>
>
> Thanks,
>
> Pratham
>
