Let me make sure I understand your case. You have a stream of models and a
stream of data. To process the data, you need a way to access your model
from the subsequent stream operations (map, filter, flatMap, ...).
I'm not sure whether Operator State is a good choice in your case, but I
think you can also live without it.

val modelStream = ... // get the model stream
val dataStream = ...  // get the data stream
modelStream.broadcast.connect(dataStream).flatMap(...)

Then you can keep the latest model in a RichCoFlatMapFunction, not
necessarily as Operator State, although Operator State may be a good
choice here too.
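
For illustration, here is a minimal sketch of that pattern. The Model and
Event case classes, the ApplyLatestModel name, and the sampling rule are
made up for the example; the latest broadcast model is simply kept in a
field of the RichCoFlatMapFunction:

import org.apache.flink.streaming.api.functions.co.RichCoFlatMapFunction
import org.apache.flink.util.Collector

// Hypothetical types, just for the sketch.
case class Model(samplingRate: Int)
case class Event(id: Long, value: Double)

class ApplyLatestModel extends RichCoFlatMapFunction[Model, Event, Event] {

  // Latest broadcast model; could also be kept as Operator State
  // if it has to survive failures.
  @transient private var currentModel: Model = _
  @transient private var counter: Long = 0L

  // Input 1: model updates, broadcast to every parallel instance.
  override def flatMap1(model: Model, out: Collector[Event]): Unit = {
    currentModel = model
  }

  // Input 2: the data stream; drop events until a model has arrived,
  // then emit only every n-th event as a toy downsampling rule.
  override def flatMap2(event: Event, out: Collector[Event]): Unit = {
    if (currentModel != null) {
      counter += 1
      if (counter % currentModel.samplingRate == 0) {
        out.collect(event)
      }
    }
  }
}

Wiring it up (assuming modelStream: DataStream[Model],
dataStream: DataStream[Event], and the usual
import org.apache.flink.streaming.api.scala._):

val downsampled = modelStream.broadcast
  .connect(dataStream)
  .flatMap(new ApplyLatestModel)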

Does that make sense to you?

Anwar

On Fri, Nov 6, 2015 at 10:21 AM, Welly Tambunan <if05...@gmail.com> wrote:

> Hi All,
>
> We have high-density data that requires downsampling. However, the
> downsampling model is very flexible, depending on the client device and
> user interaction, so it would be wasteful to precompute and store it in a db.
>
> So we want to use Apache Flink to do the downsampling and cache the result
> for subsequent queries.
>
> We are considering using Flink Operator State for that.
>
> Is that the right approach for a memory cache? Or would it be preferable
> to use a memory cache like Redis, etc.?
>
> Any comments will be appreciated.
>
>
> Cheers
> --
> Welly Tambunan
> Triplelands
>
> http://weltam.wordpress.com
> http://www.triplelands.com <http://www.triplelands.com/blog/>
>
