Thank you for the answer. I know that solution, but I don't want to stream
the rules all the time.
In my case the rules live in Redis and are loaded when Flink starts up.

I want to broadcast changes only when they occur.

Thanks.
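Tony's CoFlatMapFunction suggestion (quoted below) can be sketched, framework-free, roughly as follows. This is only an illustration of the logic, not Flink API code: the `Rule` and `Record` types, their fields, and the enrichment format are all hypothetical, and the two methods merely mirror what `flatMap1`/`flatMap2` of a real `CoFlatMapFunction` would do. The rules map could be seeded from Redis at startup and then updated only when a broadcast rule event arrives.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class RuleEnricherSketch {
    // Hypothetical rule type; the real one would come from Redis.
    static class Rule {
        final String id; final String expr;
        Rule(String id, String expr) { this.id = id; this.expr = expr; }
    }
    // Hypothetical record type read from the main stream.
    static class Record {
        final String key; final double value;
        Record(String key, double value) { this.key = key; this.value = value; }
    }

    // Per-subtask rules cache; every subtask sees every update because the
    // rule stream is broadcast before connecting the two streams.
    private final Map<String, Rule> rules = new HashMap<>();

    // Mirrors flatMap1: a broadcast rule update arrived, so replace the old
    // version in the local cache and emit nothing downstream.
    void flatMap1(Rule rule) {
        rules.put(rule.id, rule);
    }

    // Mirrors flatMap2: attach the current rules to the record and emit the
    // enriched results, which would then be keyed and processed downstream.
    List<String> flatMap2(Record rec) {
        List<String> out = new ArrayList<>();
        for (Rule r : rules.values()) {
            out.add(r.id + "->" + rec.key);
        }
        return out;
    }

    public static void main(String[] args) {
        RuleEnricherSketch fn = new RuleEnricherSketch();
        fn.flatMap1(new Rule("r1", "value > 10"));
        System.out.println(fn.flatMap2(new Record("acct-7", 42.0))); // prints [r1->acct-7]
    }
}
```

Because the cache is plain per-subtask state, a rule update costs nothing on the record path: records only ever read the current map, and the map changes exactly when an update event arrives.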

On Nov 9, 2017 7:51 AM, "Tony Wei" <tony19920...@gmail.com> wrote:

> Hi Sadok,
>
> Since you want to broadcast the rule stream to all subtasks, it seems that a
> KeyedStream is not necessary.
> How about using a broadcast partitioner: connect the two streams, attach the
> rule to each record (or apply the rule to it directly), and then do the keyed
> operation afterwards?
> If you need to key the stream first and then apply the rules, it should work
> by changing the order.
>
> The code might look something like this; you can update the rules' state in
> the CoFlatMapFunction.
>
> DataStream<Rule> rules = ...;
> DataStream<Record> records = ...;
> DataStream<Tuple2<Rule, Record>> recordWithRule =
>     rules.broadcast().connect(records).flatMap(...);
> recordWithRule.keyBy(...).process(...);
>
> Hope this will make sense to you.
>
> Best Regards,
> Tony Wei
>
> 2017-11-09 6:25 GMT+08:00 Ladhari Sadok <laadhari.sa...@gmail.com>:
>
>> Hello,
>>
>> I'm working on a rules engine project with Flink 1.3. In this project I
>> want to update some keyed operator state when an external event occurs.
>>
>> I have a DataStream of updates (from Kafka). I want to broadcast the data
>> contained in this stream to all keyed operators so I can change the state
>> in all of them.
>>
>> It is like this use case:
>> Image: https://data-artisans.com/wp-content/uploads/2017/10/streaming-in-definitions.png
>> Full article: https://data-artisans.com/blog/real-time-fraud-detection-ing-bank-apache-flink
>>
>> I found this in the DataSet API but not in the DataStream API!
>>
>> https://ci.apache.org/projects/flink/flink-docs-release-1.3/dev/batch/index.html#broadcast-variables
>>
>> Can someone explain to me how to solve this problem?
>>
>> Thanks a lot.
>>
>> Flinkly regards,
>> Sadok
>>
>
>
