Hi,

I'm currently investigating whether it's possible in Spark Streaming to send
acks back to RabbitMQ after a message has gone through the processing
pipeline. The difficulty is that the Receiver is the one that holds the
RabbitMQ channel open for receiving messages, but for reliability reasons we
don't want to ack messages as soon as they're received; we want to defer the
ack until they have been completely processed and persisted.

So the question is: how can the Receiver tell that a message has made it
through the pipeline and is safe to ack?
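For what it's worth, here is a minimal, framework-independent sketch of the
deferred-ack pattern I have in mind. Everything here is hypothetical (the
StubChannel, DeferredAcker, and callback names are made up for illustration,
not Spark or RabbitMQ APIs): the receiver records delivery tags when messages
enter the pipeline, and only acks on the channel once some feedback path from
the output stage reports completion.

```python
# Hypothetical sketch of deferred acknowledgments; not actual Spark/RabbitMQ code.
# A real implementation would ack on the RabbitMQ channel held by the custom
# Receiver, driven by some feedback mechanism from the job's output stage.
import threading


class StubChannel:
    """Stands in for a RabbitMQ channel; records which tags were acked."""
    def __init__(self):
        self.acked = []

    def basic_ack(self, delivery_tag):
        self.acked.append(delivery_tag)


class DeferredAcker:
    """Tracks messages handed to the pipeline and acks them only after
    downstream processing reports that they were persisted."""
    def __init__(self, channel):
        self._channel = channel
        self._pending = set()
        self._lock = threading.Lock()

    def on_delivery(self, delivery_tag):
        # Called when the receiver pushes a message into the pipeline.
        # Deliberately does NOT ack yet.
        with self._lock:
            self._pending.add(delivery_tag)

    def on_processed(self, delivery_tag):
        # Called once the message has been fully processed and persisted
        # (e.g. via a callback or message sent back to the receiver).
        with self._lock:
            if delivery_tag in self._pending:
                self._pending.discard(delivery_tag)
                self._channel.basic_ack(delivery_tag)


channel = StubChannel()
acker = DeferredAcker(channel)
for tag in (1, 2, 3):
    acker.on_delivery(tag)
acker.on_processed(2)          # only message 2 has finished the pipeline
print(channel.acked)           # [2]
print(sorted(acker._pending))  # [1, 3] still awaiting completion
```

The open part, of course, is the feedback path itself: something in the output
stage has to call back to the receiver's process, since RabbitMQ requires the
ack to go out on the same channel the message was delivered on.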



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Ack-RabbitMQ-messages-after-processing-through-Spark-Streaming-tp15348.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

