No, we try not to fork :-) But it would be nice, as you say. It can be done with 
a small intermediary app that just reads from a Kafka topic and sends events to 
a localhost EventServer, which would allow events to be custom-extracted from, 
say, log files (typical contents of Kafka). We’ve done this in non-PIO projects.

The intermediary app should use Spark Streaming. I may have a snippet of code 
around if you need it, but it just saves to micro-batch files. You’d have to use 
the PIO Java SDK to send the events to the EventServer. A relatively simple thing.
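
As a rough illustration, here is a minimal sketch of that intermediary app. It 
assumes the spark-streaming-kafka-0-10 integration and posts to the EventServer’s 
REST endpoint (POST /events.json?accessKey=...) rather than going through the 
Java SDK client. The topic name, access key, and the assumption that each Kafka 
record already carries a PIO-style event JSON are placeholders; real log lines 
would need to be parsed into events first.

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import java.net.{HttpURLConnection, URL}

object KafkaToEventServer {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("kafka-to-eventserver")
    val ssc  = new StreamingContext(conf, Seconds(10)) // micro-batch interval

    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "localhost:9092",
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "pio-import",
      "auto.offset.reset"  -> "latest"
    )

    // "events" is a hypothetical topic name; use whatever topic carries your logs
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](Seq("events"), kafkaParams))

    stream.foreachRDD { rdd =>
      rdd.foreachPartition { records =>
        // Runs on the executors, so the EventServer must be reachable from them
        // (localhost only works for a single-node setup).
        def postEvent(json: String): Unit = {
          val url  = new URL("http://localhost:7070/events.json?accessKey=YOUR_ACCESS_KEY")
          val conn = url.openConnection().asInstanceOf[HttpURLConnection]
          conn.setRequestMethod("POST")
          conn.setRequestProperty("Content-Type", "application/json")
          conn.setDoOutput(true)
          val out = conn.getOutputStream
          out.write(json.getBytes("UTF-8"))
          out.close()
          conn.getResponseCode // 201 Created on success
          conn.disconnect()
        }

        records.foreach { record =>
          // Assumes each record value is already a PIO-style event JSON document;
          // in practice you would extract/convert the raw log line here.
          postEvent(record.value())
        }
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}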

Donald, what did you have in mind for deeper integration? I guess we could cut 
out the intermediary app and build Kafka awareness into a new EventServer 
endpoint where the raw topic input is stored in the EventStore. This would 
force any log filtering onto the Kafka source.


On Jul 5, 2017, at 10:20 AM, Donald Szeto <[email protected]> wrote:

Hi Thomas,

Supporting Kafka is definitely interesting and desirable. Are you looking to 
sink your Kafka messages to the event store for batch processing, or to do 
stream processing directly from Kafka? The latter would require more work 
because Apache PIO does not yet support streaming properly.

Folks from ActionML might have a flavor of PIO that works with Kafka.

Regards,
Donald

On Tue, Jul 4, 2017 at 8:34 AM, Thomas POCREAU <[email protected]> wrote:
Hi,

Thanks a lot for this awesome project.

I have a question regarding Kafka and its possible integration as an Event 
Store.
Do you have any plans on this matter?
Are you aware of anyone working on a similar subject?

Regards,
Thomas.


