I need to write a simple application that reads rows from a Postgres table, 
indexes them in Elasticsearch and, on success, records in Postgres that the 
data has been indexed. Rows range in size from about 1 KB to 100 MB. I'm 
trying to solve this with akka-stream:

  FlowFrom(messagePublisher)
    .mapFuture(elasticService.indexMessage)
    .withSink(messagesSink)
    .run



Question: What is the best way to throttle the rows retrieved from the 
database so that the total size of the rows currently in flight does not 
exceed a certain limit?
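To make the constraint concrete, one way to express it is a byte-weighted permit scheme: acquire as many permits as the row's size before it enters the pipeline, and release them once indexing completes. Below is a minimal sketch of that idea in plain Scala, independent of akka-stream; `Row`, `sizeInBytes`, and the `index` function are hypothetical placeholders, not real APIs:

```scala
import java.util.concurrent.Semaphore
import scala.concurrent.{ExecutionContext, Future}

// Hypothetical row type; sizeInBytes stands in for the real row size.
case class Row(id: Long, payload: Array[Byte]) {
  def sizeInBytes: Int = payload.length
}

// Byte-weighted gate: at most maxInFlightBytes worth of rows may be
// in flight at once. acquire() blocks the producer, which is what
// provides the backpressure toward the database reads.
class ByteBudget(maxInFlightBytes: Int) {
  private val permits = new Semaphore(maxInFlightBytes)

  def throttled[T](row: Row)(index: Row => Future[T])
                  (implicit ec: ExecutionContext): Future[T] = {
    permits.acquire(row.sizeInBytes)   // blocks while the budget is exhausted
    val result = index(row)
    result.onComplete(_ => permits.release(row.sizeInBytes))
    result
  }
}
```

In the stream above, something like this could wrap `elasticService.indexMessage` inside the `mapFuture` stage; since `Semaphore.acquire` blocks the calling thread, it would need to run on a dispatcher whose thread pool you are willing to block.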

-- 
>>>>>>>>>>      Read the docs: http://akka.io/docs/
>>>>>>>>>>      Check the FAQ: 
>>>>>>>>>> http://doc.akka.io/docs/akka/current/additional/faq.html
>>>>>>>>>>      Search the archives: https://groups.google.com/group/akka-user
--- 
You received this message because you are subscribed to the Google Groups "Akka 
User List" group.
