Hello Thomas, Kafka is not a store; it is fundamentally a streaming log with a limited retention window.
To work with PIO today, a Kafka consumer would need to be implemented to take messages and transform/insert them into persistent PIO event storage, such as Postgres or HBase (a rough sketch of such a consumer is below the quoted thread). As Donald asked, are you trying to build an engine with Spark Streaming? I'm curious how to do this too: both the PIO `predict` runtime for Spark Streaming and the [Kafka] event stream consumer that pushes events into PIO.

*Mars ( <> .. <> )

> On Jul 5, 2017, at 10:20, Donald Szeto <[email protected]> wrote:
>
> Hi Thomas,
>
> Supporting Kafka is definitely interesting and desirable. Are you looking to
> sink your Kafka messages into the event store for batch processing, or to
> process the stream directly from Kafka? The latter would require more work
> because Apache PIO does not yet support streaming properly.
>
> Folks from ActionML might have a flavor of PIO that works with Kafka.
>
> Regards,
> Donald
>
> On Tue, Jul 4, 2017 at 8:34 AM, Thomas POCREAU <[email protected]>
> wrote:
> Hi,
>
> Thanks a lot for this awesome project.
>
> I have a question regarding Kafka and its possible integration as an Event
> Store.
> Do you have any plan on this matter?
> Are you aware of anyone working on a similar subject?
>
> Regards,
> Thomas.
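Here is a minimal sketch of the kind of consumer I mean, not a definitive implementation. It assumes the kafka-python and requests libraries, a PIO EventServer reachable at localhost:7070, and an app access key from `pio app list`; the topic name, group id, and the JSON field names of the Kafka messages are all hypothetical and would need to match your own schema.

```python
# Sketch: consume JSON messages from a Kafka topic and push them into
# PredictionIO's EventServer over its REST API (POST /events.json).
import json

import requests
from kafka import KafkaConsumer

EVENTSERVER_URL = "http://localhost:7070/events.json"  # PIO event server endpoint
ACCESS_KEY = "YOUR_PIO_APP_ACCESS_KEY"                 # from `pio app list`
TOPIC = "pio-events"                                   # hypothetical topic name

consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="pio-event-sink",
)

for message in consumer:
    payload = message.value
    # Translate the Kafka message into a PIO event document.
    # Assumed message shape: {"event": "view", "user": "u1", "item": "i1"}
    event = {
        "event": payload["event"],
        "entityType": "user",
        "entityId": payload["user"],
        "targetEntityType": "item",
        "targetEntityId": payload["item"],
    }
    resp = requests.post(
        EVENTSERVER_URL,
        params={"accessKey": ACCESS_KEY},
        json=event,
        timeout=10,
    )
    resp.raise_for_status()  # 201 Created means the event was persisted
```

The same POST could be done through one of the official PIO SDKs (e.g. the Python SDK's EventClient) instead of raw HTTP; the point is just that something has to drain the topic and land each message in PIO's event store before the retention window expires.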
