ASF GitHub Bot commented on FLINK-4520:
Github user haoch commented on the issue:
@StephanEwen thanks for the comments. I think it's fine to keep this either in
the core or as a separate project, but my concern is that it may be better for
community development to centralize qualified libraries in one place. As an
alternative solution to the test-stability and dead-code concerns, would it be
possible to create another code repository, say "flink-library"?
**BTW: here are the answers to your questions one by one:**
> How complete is the implementation?
Siddhi is a feature-rich CEP engine with its own community, and is perhaps
the only open source CEP solution compatible with the Apache License. The
`flink-siddhi` library is mainly focused on bringing Siddhi's capabilities to
Flink users seamlessly by:
- Integrating the Siddhi CEP runtime with the Flink operator lifecycle
- Mapping schemas and DataStream sources
- Providing state management and fault tolerance
So I think it would be extremely lightweight but useful, and the current
implementation should be almost complete.
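To make the state-management and fault-tolerance point concrete, here is a minimal, dependency-free sketch (all class and method names are hypothetical, not the actual flink-siddhi or Siddhi API): a matcher whose internal state can be snapshotted to bytes and restored, which is the shape of contract a Siddhi runtime snapshot would need to satisfy inside a Flink checkpoint.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Hypothetical stand-in for a CEP runtime whose state must survive failures.
class CountingMatcher implements Serializable {
    private int matches;                        // internal state to checkpoint

    void onEvent(int id) { if (id == 2) matches++; }
    int matches() { return matches; }

    // Snapshot the state to bytes, as a Flink checkpoint would store it.
    byte[] snapshot() throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(this);
        }
        return bos.toByteArray();
    }

    // Restore a matcher from a snapshot after a simulated failure.
    static CountingMatcher restore(byte[] bytes) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return (CountingMatcher) ois.readObject();
        }
    }
}

public class SnapshotDemo {
    public static void main(String[] args) throws Exception {
        CountingMatcher m = new CountingMatcher();
        m.onEvent(2);
        m.onEvent(3);
        byte[] checkpoint = m.snapshot();       // checkpoint taken here
        m.onEvent(2);                           // progress after the checkpoint...
        CountingMatcher recovered = CountingMatcher.restore(checkpoint); // ...is discarded on recovery
        System.out.println(recovered.matches()); // prints 1
    }
}
```

The real integration would hand such a snapshot to Flink's checkpointing mechanism instead of keeping it in a local byte array, but the snapshot/restore round trip is the essential part.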
> Would you be up for maintaining this code?
Sure. First of all, I am personally very willing to keep contributing to the
Flink project in any way. We also use Siddhi heavily with distributed
streaming systems in production, and are currently considering supporting
Flink as well, given its better state management and window support. So I
will continuously maintain the code if it is merged; if not, I will maintain it at
https://github.com/haoch/flink-siddhi as well to make sure it stays workable.
> Are you building this as an experiment, or building a production use case
based on Siddhi on Flink?
We use Siddhi a lot in production streaming environments, currently
supporting Storm and Spark Streaming, and are also considering extending to Flink.
> Integrate Siddhi as a lightweight CEP Library
> Key: FLINK-4520
> URL: https://issues.apache.org/jira/browse/FLINK-4520
> Project: Flink
> Issue Type: New Feature
> Components: CEP
> Affects Versions: 1.2.0
> Reporter: Hao Chen
> Assignee: Hao Chen
> Labels: cep, library, patch-available
> Fix For: 1.2.0
> h1. flink-siddhi proposal
> h2. Abstraction
> Siddhi CEP is a lightweight and easy-to-use open source Complex Event
> Processing (CEP) engine released as a Java library under the `Apache Software
> License v2.0`. Siddhi CEP processes events generated by various event
> sources, analyses them, and notifies about matching complex events
> according to user-specified queries.
> It would be very helpful for Flink users (especially streaming application
> developers) to have a library that runs Siddhi CEP queries directly in
> Flink streaming applications.
> * http://wso2.com/products/complex-event-processor/
> * https://github.com/wso2/siddhi
> h2. Features
> * Integrate Siddhi CEP as a stream operator (i.e.
> `TupleStreamSiddhiOperator`), supporting rich CEP features like:
> ** Filter
> ** Join
> ** Aggregation
> ** Group by
> ** Having
> ** Window
> ** Conditions and Expressions
> ** Pattern processing
> ** Sequence processing
> ** Event Tables
> * Provide an easy-to-use Siddhi CEP API that integrates with the Flink
> DataStream API (See `SiddhiCEP` and `SiddhiStream`)
> * Register a Flink DataStream, associating its native type information with a
> Siddhi Stream Schema, supporting POJO, Tuple, primitive types, etc.
> * Connect single or multiple Flink DataStreams with a Siddhi CEP
> Execution Plan
> * Return the output stream as a DataStream with its type intelligently
> inferred from the Siddhi Stream Schema
> * Integrate Siddhi runtime state management with Flink state (See
> * Support Siddhi plugin management to extend CEP functions. (See
> h2. Test Cases
> * org.apache.flink.contrib.siddhi.SiddhiCEPITCase:
> h2. Example
> StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
> SiddhiCEP cep = SiddhiCEP.getSiddhiEnvironment(env);
> cep.registerStream("inputStream1", input1, "id", "name", "price");
> cep.registerStream("inputStream2", input2, "id", "name", "price");
> DataStream<Tuple5<Integer,String,Integer,String,Double>> output = cep
>     .from("inputStream1").union("inputStream2")
>     .cql(
>         "from every s1 = inputStream1[id == 2] "
>         + " -> s2 = inputStream2[id == 3] "
>         + "select s1.id as id_1, s1.name as name_1, s2.id as id_2, "
>         + "s2.name as name_2, custom:plus(s1.price,s2.price) as price "
>         + "insert into outputStream"
>     )
>     .returns("outputStream");
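To illustrate what the quoted pattern query computes, here is a minimal, dependency-free Java sketch of the followed-by (`->`) semantics: match an event with `id == 2` that is later followed by one with `id == 3`. All names here are hypothetical illustrations, not the flink-siddhi or Siddhi API.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Minimal sketch of "followed-by" pattern matching over a single event list.
public class FollowedByDemo {
    static class Event {
        final int id;
        final String name;
        Event(int id, String name) { this.id = id; this.name = name; }
    }

    static List<String> matchFollowedBy(List<Event> stream) {
        List<String> out = new ArrayList<>();
        Event pending = null;                          // last unmatched id == 2 event
        for (Event e : stream) {
            if (e.id == 2) {
                pending = e;                           // "every s1 = inputStream1[id == 2]"
            } else if (e.id == 3 && pending != null) {
                out.add(pending.name + "->" + e.name); // "-> s2 = inputStream2[id == 3]"
                pending = null;                        // one s2 per matched s1
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<Event> events = Arrays.asList(
            new Event(1, "a"), new Event(2, "b"),
            new Event(3, "c"), new Event(3, "d"), new Event(2, "e"));
        System.out.println(matchFollowedBy(events)); // prints [b->c]
    }
}
```

The real operator additionally handles two separate input streams, Siddhi windows, and the `select`/`custom:plus` projection, but the pairing logic above is the core of the sequence match.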
This message was sent by Atlassian JIRA