Hi,

I have set up a remote stateful function in Python, deployed on an AWS
EC2 box. I interact with it from a separate StateFun deployment (on
another EC2 instance) running two flink-statefun Docker containers, one
in the master role and one in the worker role. The ingress and egress
points for this function are Kafka topics.

I also have a separate Java application using Flink, deployed on a
Ververica cluster. From this application I communicate with the
stateful function by adding a sink and a source pointing at the
ingress and egress topics above.

I have a couple of questions about this setup.

1. Is there a better way to communicate with the function from the
Flink application?
2. Is there any way I can use the existing deployed application to
maintain the state of my remote function, so that I can discard the
StateFun master/worker elements?
3. Failing that, do I just need to create a new Flink application,
translate the module.yml that is passed to the existing master/worker
into the equivalent Java, add the dependencies, and deploy that jar?
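
For concreteness, the kind of module.yml I mean is shaped roughly like
this (a sketch only; every id, type, topic, and address below is a
placeholder, not my real configuration):

```yaml
# Sketch of a StateFun remote-module definition with Kafka
# ingress/egress. All names and addresses are placeholders.
version: "1.0"
module:
  meta:
    type: remote
  spec:
    functions:
      - function:
          meta:
            kind: http
            type: example/my-function        # placeholder function type
          spec:
            endpoint: http://python-host:8000/statefun  # remote Python worker
            states:
              - my-state
    ingresses:
      - ingress:
          meta:
            type: statefun.kafka.io/routable-protobuf-ingress
            id: example/ingress
          spec:
            address: kafka-broker:9092       # placeholder broker address
            consumerGroupId: my-group
            topics:
              - topic: input-topic
                typeUrl: com.googleapis/example.InputMessage
                targets:
                  - example/my-function
    egresses:
      - egress:
          meta:
            type: statefun.kafka.io/generic-egress
            id: example/egress
          spec:
            address: kafka-broker:9092
            deliverySemantic:
              type: at-least-once
```

The Flink job's source and sink are wired to input-topic and to the
topic written by the egress, respectively.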

I hope that makes sense.
Kindest Regards,

Barry
