Hello Esa,
all of the steps that you described can be performed with Spark. I don't
know about CEP, but Spark Streaming should be enough.
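To make that concrete, here is a minimal sketch of the three-step matching logic from your mail, written as a plain state machine over a merged, time-ordered event sequence. Everything here (the `Event` record, `match_traces`, the sample data) is made up for illustration; in a real Spark job the same per-pattern state could be kept inside a stateful operation such as `mapGroupsWithState` in Structured Streaming.

```python
# Hypothetical sketch of the three-step pattern from the mail below.
# All names and sample data are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Event:
    stream: str   # "A", "B" or "C"
    keys: dict    # e.g. {"K1": "X"} -- keys K1..K4 with values

def match_traces(events, x="X"):
    """Scan a merged, time-ordered event sequence and return completed
    traces (A-event, B-event, C-event) for the pattern:
      1) A with K1 == "X"                -> remember it as a (gives A(K1))
      2) B with K2 == A(K1)              -> remember it as b (gives B(K3))
      3) C with K1 == A(K1), K2 == B(K3) -> trace complete, back to step 1
    """
    traces = []
    a = b = None   # the "global data" carried between steps
    step = 1
    for ev in events:
        if step == 1 and ev.stream == "A" and ev.keys.get("K1") == x:
            a, step = ev, 2
        elif step == 2 and ev.stream == "B" and ev.keys.get("K2") == a.keys["K1"]:
            b, step = ev, 3
        elif (step == 3 and ev.stream == "C"
              and ev.keys.get("K1") == a.keys["K1"]
              and ev.keys.get("K2") == b.keys["K3"]):
            traces.append((a, b, ev))
            a = b = None
            step = 1   # pattern found, restart at step 1
    return traces

# Tiny fabricated input containing exactly one complete trace.
events = [
    Event("A", {"K1": "X"}),
    Event("B", {"K2": "X", "K3": "7"}),
    Event("C", {"K1": "X", "K2": "7"}),
]
print(len(match_traces(events)))  # -> 1
```

The point of the sketch is only that the dependence between the streams reduces to a small amount of state (the remembered A and B events plus a step counter), which is exactly the kind of state a streaming engine can keep per key.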

Best,

Matteo

On 18 May 2018 at 09:20, Esa Heikkinen <esa.heikki...@student.tut.fi> wrote:

> Hi
>
>
>
> I have attached a fictive example (PDF file) of processing event
> traces from data streams (or batch data). I hope the picture in the
> attachment is clear and understandable.
>
>
>
> I would be very interested in how best to solve this with Spark, or
> whether it is possible at all. If it is possible, can it be solved, for
> example, by CEP?
>
>
>
> A little explanation: the data processing reads three different,
> parallel streams (or batches): A, B and C. Each of them has events
> (records) with different keys and values (like K1-K4).
>
>
>
> I want to find all event traces that have certain dependences or
> patterns between streams (or batches). Finding a pattern takes three
> steps:
>
> 1) Search for an event with value "X" in key K1 in stream A; if it is
> found, store it in global data for later use and continue to the next step
>
> 2) Search for an event with value A(K1) in key K2 in stream B; if it
> is found, store it in global data for later use and continue to the next step
>
> 3) Search for an event with value A(K1) in key K1 and value B(K3) in
> key K2 in stream C; if it is found, continue to the next step (back to step 1)
>
>
>
> If that is not possible with Spark, do you have any idea of tools that
> can solve this?
>
>
>
> Best, Esa
>
>
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
