Flink listener for Task failures and restarts

2023-11-28 Thread Vignesh Ramesh
Hi Team, When a Flink job fails, we use an implementation of the JobListener interface to get a handle on onJobSubmitted (job submitted) and onJobExecuted (failure, success). But if my Flink application has a restart strategy enabled, Flink automatically restarts the specific tasks (which fail). How do we get an
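For context, a minimal sketch of the JobListener hook the question refers to (the interface and registration are Flink's public API; the surrounding pipeline is a placeholder):

import org.apache.flink.api.common.JobExecutionResult;
import org.apache.flink.core.execution.JobClient;
import org.apache.flink.core.execution.JobListener;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class JobListenerSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // The listener sees whole-job submission and termination only;
        // per-task restarts performed by the restart strategy do not surface here.
        env.registerJobListener(new JobListener() {
            @Override
            public void onJobSubmitted(JobClient jobClient, Throwable throwable) {
                // invoked once when the job is submitted (throwable is non-null if submission failed)
            }

            @Override
            public void onJobExecuted(JobExecutionResult result, Throwable throwable) {
                // invoked once when the job finishes or fails terminally
            }
        });

        env.fromElements(1, 2, 3).print();  // placeholder pipeline
        env.execute("job-listener-sketch");
    }
}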

Re: Evenly distribute task slots across task-manager

2021-03-24 Thread Vignesh Ramesh
the parallelism set to 16? Or do you observe the described behavior also on a job level? I'm adding Chesnay to the thread as he might have more insights on this topic. Best, Matthias -- On Mon, Mar 22, 2021 at 6:31 PM Vignesh Ra

Evenly distribute task slots across task-manager

2021-03-22 Thread Vignesh Ramesh
Hello everyone, can someone help me with a solution? I have a Flink job (2 task managers) with a job parallelism of 64 and 64 task slots. I have the parallelism for one of the operators set to 16. The slots for this operator (parallelism 16) are not getting evenly distributed across the two task managers. It
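One knob commonly pointed to for this (assuming Flink 1.10 or later; the config key is a standard option, the surrounding job is a placeholder) is cluster.evenly-spread-out-slots:

import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class EvenSlotSpreadSketch {
    public static void main(String[] args) throws Exception {
        // On a real cluster this is set in flink-conf.yaml:
        //   cluster.evenly-spread-out-slots: true
        // For a quick local experiment it can be passed programmatically:
        Configuration conf = new Configuration();
        conf.setString("cluster.evenly-spread-out-slots", "true");

        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.createLocalEnvironment(64, conf);

        env.fromElements("a", "b", "c")
           .map(String::toUpperCase).setParallelism(16)  // the lower-parallelism operator in question
           .print();
        env.execute("even-slot-spread-sketch");
    }
}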

Flink Elasticsearch success handler

2021-02-11 Thread Vignesh Ramesh
I use the Flink Elasticsearch sink to bulk-insert records into ES. I want to perform an operation after a record is successfully synced to Elasticsearch. There is a failureHandler by which we can retry failures. Is there a successHandler in the Flink Elasticsearch sink? *Note*: I couldn't do the
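For reference, a sketch of the failure-handler hook the message mentions, using the flink-connector-elasticsearch7 builder (host, index name, and retry logic are placeholders; the builder exposes setFailureHandler, and the question is whether a symmetric success hook exists):

import org.apache.flink.api.common.functions.RuntimeContext;
import org.apache.flink.streaming.connectors.elasticsearch.ActionRequestFailureHandler;
import org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkFunction;
import org.apache.flink.streaming.connectors.elasticsearch.RequestIndexer;
import org.apache.flink.streaming.connectors.elasticsearch7.ElasticsearchSink;
import org.apache.http.HttpHost;
import org.elasticsearch.action.ActionRequest;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.Requests;

import java.util.Collections;
import java.util.List;

public class EsSinkSketch {
    public static ElasticsearchSink<String> buildSink() {
        List<HttpHost> hosts = Collections.singletonList(new HttpHost("localhost", 9200, "http"));

        ElasticsearchSink.Builder<String> builder = new ElasticsearchSink.Builder<>(
                hosts,
                new ElasticsearchSinkFunction<String>() {
                    @Override
                    public void process(String element, RuntimeContext ctx, RequestIndexer indexer) {
                        indexer.add(Requests.indexRequest()
                                .index("my-index")  // placeholder index name
                                .source(Collections.singletonMap("value", element)));
                    }
                });

        // Retry hook for failed bulk actions; there is no per-record success hook on this builder.
        builder.setFailureHandler(new ActionRequestFailureHandler() {
            @Override
            public void onFailure(ActionRequest action, Throwable failure, int restStatusCode,
                                  RequestIndexer indexer) throws Throwable {
                if (action instanceof IndexRequest) {
                    indexer.add((IndexRequest) action);  // naive retry: re-queue the failed request
                } else {
                    throw failure;  // give up on anything we cannot re-queue
                }
            }
        });

        builder.setBulkFlushMaxActions(100);  // flush a bulk request every 100 actions
        return builder.build();
    }
}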

Flink kafka - Message Prioritization

2020-10-29 Thread Vignesh Ramesh
Hi, I have a Flink pipeline which reads from a Kafka topic, does a map operation (builds an Elasticsearch model), and sinks it to Elasticsearch. *Pipeline-1:* Flink-Kafka-Connector-Consumer(topic1) (parallelism 8) -> Map (parallelism 8) -> Flink-Es-connector-Sink(es1) (parallelism 8). Now I want
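A sketch of the described pipeline shape (topic, bootstrap servers, the map body, and the EsSinkSketch.buildSink() helper from the earlier sketch are placeholders; FlinkKafkaConsumer matches the connector available at the time):

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

import java.util.Properties;

public class KafkaToEsPipeline {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "pipeline-1");

        // Pipeline-1: Kafka consumer (topic1) -> Map -> Elasticsearch sink, each with parallelism 8
        env.addSource(new FlinkKafkaConsumer<>("topic1", new SimpleStringSchema(), props))
           .setParallelism(8)
           .map(value -> value.trim())               // placeholder for "build the Elasticsearch model"
           .setParallelism(8)
           .addSink(EsSinkSketch.buildSink())        // hypothetical helper that builds the ES sink
           .setParallelism(8);

        env.execute("pipeline-1");
    }
}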

[QUERY] Multiple elastic search sinks for a single flink pipeline

2020-10-14 Thread Vignesh Ramesh
My requirement is to send the data to a different ES sink (based on the data). Ex: if the data contains particular info, send it to sink1; otherwise send it to sink2, etc. (basically, route it dynamically to one sink or another based on the data). I also want to set the parallelism separately for ES sink1, ES sink2,
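One common way to sketch this is to split the stream with predicates and attach one ES sink per branch, each with its own parallelism (the predicate, parallelism values, and the EsSinkSketch.buildSink() helper are placeholders):

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class MultiEsSinkRouting {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<String> records = env.fromElements("a:1", "b:2");  // placeholder source

        // Route by a simple predicate; in practice this would inspect the built ES model.
        DataStream<String> forSink1 = records.filter(r -> r.startsWith("a"));
        DataStream<String> forSink2 = records.filter(r -> !r.startsWith("a"));

        // Each branch gets its own sink instance and its own parallelism.
        forSink1.addSink(EsSinkSketch.buildSink()).setParallelism(4).name("es-sink-1");
        forSink2.addSink(EsSinkSketch.buildSink()).setParallelism(8).name("es-sink-2");

        env.execute("multi-es-sink-routing");
    }
}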