I am currently working on an architecture for a big data streaming and batch 
processing platform. I plan to use Apache Kafka as the distributed messaging 
system to ingest data from streaming sources and pass it on to Apache Flink 
for stream processing. I would also like to use Flink's batch processing 
capabilities for the batch data.
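
To make the streaming leg concrete, here is roughly what I have in mind (a 
minimal sketch only, assuming a recent Flink release with the KafkaSource 
connector, 1.14 or later; the broker address, topic, group id, and the map 
step are placeholders):

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class StreamingFromKafka {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        // Broker address, topic, and group id are placeholders.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("kafka:9092")
                .setTopics("events")
                .setGroupId("flink-stream-processor")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-events")
           // Stand-in for the real per-event transformation.
           .map(String::toUpperCase)
           .print();

        env.execute("streaming-from-kafka");
    }
}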

Does it make sense to push the batch data through Kafka on a periodic basis 
as a source for Flink's batch processing (is that even possible?), or should 
I simply write the batch data to a data store and have Flink read and 
process it from there?
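
For the second option, this is the kind of job I am picturing (again only a 
sketch, assuming Flink 1.15+ where batch execution runs on the unified 
DataStream API over a bounded FileSource; the directory path and the map 
step are placeholders):

import org.apache.flink.api.common.RuntimeExecutionMode;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.connector.file.src.FileSource;
import org.apache.flink.connector.file.src.reader.TextLineInputFormat;
import org.apache.flink.core.fs.Path;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class BatchFromStore {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();
        // Bounded input plus BATCH mode gives batch-style scheduling
        // and blocking shuffles instead of continuous streaming.
        env.setRuntimeMode(RuntimeExecutionMode.BATCH);

        // Placeholder path; FileSource treats the directory as a
        // bounded (finite) source.
        FileSource<String> source = FileSource
                .forRecordStreamFormat(new TextLineInputFormat(),
                                       new Path("/data/batch-drop/"))
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "batch-files")
           // Stand-in for the real batch transformation.
           .map(String::toUpperCase)
           .print();

        env.execute("batch-from-store");
    }
}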
