Hello there,

Was wondering if anybody has a suggestion for how to filter messages from a
committable Kafka source without losing the committable offsets that
need to be committed after the messages have been sent to a sink.

Currently I'm reading from Kafka, filtering, and writing the result to another
Kafka topic, committing the offsets in batches.

A simplified version of my code:

val done =
  Consumer.committableSource(consumerSettings, Subscriptions.topics("topic1"))
    .filter(_.record.value == "something")
    .map { msg =>
      ProducerMessage.Message(
        new ProducerRecord[Array[Byte], String]("topic2", msg.record.value),
        msg.committableOffset)
    }
    .via(Producer.flow(producerSettings))
    .map(_.message.passThrough)
    .batch(max = 20, first => CommittableOffsetBatch.empty.updated(first)) {
      (batch, elem) => batch.updated(elem)
    }
    .mapAsync(3)(_.commitScaladsl())
    .runWith(Sink.ignore)

When I filter elements from the stream, I lose the committable
information.

Does anybody have a suggestion for how to avoid that and still commit the
offsets of the filtered-out messages in the batches after the others have
been written?
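To illustrate the problem without a broker, here is a plain-collections sketch (the `Msg` type and offset values are hypothetical stand-ins for `CommittableMessage` and its offsets, not the real Alpakka types): a plain `filter` discards the non-matching messages together with their offsets, whereas splitting the stream with `partition` would let the offsets from both branches still reach the commit batch.

```scala
// Hypothetical stand-in for a CommittableMessage: a value plus its offset.
case class Msg(value: String, offset: Long)

val msgs = List(Msg("something", 0L), Msg("other", 1L), Msg("something", 2L))

// Naive filter: offset 1 is dropped and can never be committed.
val kept = msgs.filter(_.value == "something")

// Split instead of filter: produce only the matches, but gather the
// offsets from BOTH branches so every offset can still be committed.
val (matches, rest) = msgs.partition(_.value == "something")
val toProduce       = matches.map(_.value)
val commitUpTo      = (matches ++ rest).map(_.offset).max
```

This only models the idea with lists; the open question is how to express the same split-and-recombine inside the Akka Streams graph.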

Kind Regards,
Nicolae N.

-- 
>>>>>>>>>>      Read the docs: http://akka.io/docs/
>>>>>>>>>>      Check the FAQ: http://doc.akka.io/docs/akka/current/additional/faq.html
>>>>>>>>>>      Search the archives: https://groups.google.com/group/akka-user
--- 
You received this message because you are subscribed to the Google Groups "Akka User List" group.
To unsubscribe from this group and stop receiving emails from it, send an email to akka-user+unsubscr...@googlegroups.com.
To post to this group, send email to akka-user@googlegroups.com.
Visit this group at https://groups.google.com/group/akka-user.
For more options, visit https://groups.google.com/d/optout.