Hi Teena,

What happens if you replace the second sink with a non-Elasticsearch sink? Do you get the same result? Is the data actually read from KafkaTopic2?

We should determine which system is the bottleneck.
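For example (a hypothetical fragment, with `stream2` standing in for whatever DataStream feeds the second sink in your job), a quick way to isolate the sink would be:

```java
// Hypothetical diagnostic: temporarily replace the second Elasticsearch sink
// with Flink's built-in print sink. If records from KafkaTopic2 now show up on
// stdout, the consumer side is fine and the Elasticsearch sink is the suspect;
// if nothing prints, the problem is upstream of the sink.
stream2.map(value -> "record2: " + value)
       .print();   // instead of .addSink(elasticsearchSink2)
```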

Regards,
Timo


On 1/18/18 at 9:53 AM, Teena Kappen // BPRISE wrote:

Hi,

I am running Flink 1.4 on a single node. My job has two Kafka consumers reading from separate topics. After fetching the data, the job writes it to two separate Elasticsearch sinks. The process looks like this:

KafkaTopic1 -> Kafkaconsumer1 -> create output record -> Elasticsearchsink1

KafkaTopic2 -> Kafkaconsumer2 -> create output record -> Elasticsearchsink2

Both streams and their processing are completely unrelated. The first sink works as expected and writes output for all input records. The second sink writes to Elasticsearch only once; after that, it stops writing even as more data is fed into Kafka. Sometimes it does not write even once. We tested this in two other jobs, and all of them show the same issue.

I have attached sample code that I created to illustrate the issue. We are using Elasticsearch version 5.6.4, and hence the dependency used is ‘flink-connector-elasticsearch5_2.11’.
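For reference, a minimal sketch of how such a job might be wired, assuming Flink 1.4 with the `flink-connector-kafka-0.11` and `flink-connector-elasticsearch5_2.11` connectors. Topic names, index names, host addresses, and the record-building logic are all placeholders, not the attached code:

```java
// Sketch only: requires a running Kafka broker and Elasticsearch 5.x cluster,
// plus the Flink Kafka 0.11 and Elasticsearch 5 connector dependencies.
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkFunction;
import org.apache.flink.streaming.connectors.elasticsearch5.ElasticsearchSink;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;
import org.elasticsearch.client.Requests;

public class TwoPipelinesJob {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
            StreamExecutionEnvironment.getExecutionEnvironment();

        Properties kafkaProps = new Properties();
        kafkaProps.setProperty("bootstrap.servers", "localhost:9092");
        kafkaProps.setProperty("group.id", "test-group");

        // Pipeline 1: KafkaTopic1 -> create output record -> Elasticsearchsink1
        DataStream<String> stream1 = env.addSource(
            new FlinkKafkaConsumer011<>("KafkaTopic1",
                new SimpleStringSchema(), kafkaProps));
        stream1.map(value -> "record1: " + value)
               .addSink(buildEsSink("index1"));

        // Pipeline 2: KafkaTopic2 -> create output record -> Elasticsearchsink2
        // (the pipeline that stops writing in the reported scenario)
        DataStream<String> stream2 = env.addSource(
            new FlinkKafkaConsumer011<>("KafkaTopic2",
                new SimpleStringSchema(), kafkaProps));
        stream2.map(value -> "record2: " + value)
               .addSink(buildEsSink("index2"));

        env.execute("two-independent-pipelines");
    }

    private static ElasticsearchSink<String> buildEsSink(String index)
            throws Exception {
        Map<String, String> config = new HashMap<>();
        config.put("cluster.name", "elasticsearch");
        // Flush after every element so writes become visible immediately
        // while debugging; tune this up for production throughput.
        config.put("bulk.flush.max.actions", "1");

        return new ElasticsearchSink<>(
            config,
            Arrays.asList(new InetSocketAddress(
                InetAddress.getByName("127.0.0.1"), 9300)),
            (ElasticsearchSinkFunction<String>) (element, ctx, indexer) ->
                indexer.add(Requests.indexRequest()
                    .index(index)
                    .type("record")
                    .source("{\"value\":\"" + element + "\"}")));
    }
}
```

Note that `bulk.flush.max.actions` matters here: with the default bulk-flush settings, a low-volume stream can sit in the sink's buffer and appear to "stop writing" until the buffer fills.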

Regards,

Teena

