Hi,

we are currently looking at replacing our sources and sinks with their 
counterparts in the 'new' data source/sink API (mainly Kafka). What holds us 
back is that we are not sure how to test the pipeline with mocked 
sources/sinks. Up to now, we have roughly followed the testing docs [0] and 
created a simple SinkFunction as well as a ParallelSourceFunction, with which 
we could get data in and out at our leisure. They could easily be plugged into 
the pipeline for the tests (see the sketch below). With the new API, however, 
such an approach seems far too cumbersome, as there is now a lot of overhead 
in writing a source or sink of your own.
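
For context, this is roughly what our current test helpers look like with the 
old interfaces, following the pattern from the testing docs [0]. Class and 
field names are just illustrative:

    import org.apache.flink.streaming.api.functions.sink.SinkFunction;
    import org.apache.flink.streaming.api.functions.source.ParallelSourceFunction;

    import java.util.List;
    import java.util.concurrent.CopyOnWriteArrayList;

    // Bounded test source: emits a fixed list of elements, then finishes.
    class FixedElementsSource implements ParallelSourceFunction<String> {

        private final List<String> elements;
        private volatile boolean running = true;

        FixedElementsSource(List<String> elements) {
            this.elements = elements;
        }

        @Override
        public void run(SourceContext<String> ctx) {
            for (String element : elements) {
                if (!running) {
                    return;
                }
                synchronized (ctx.getCheckpointLock()) {
                    ctx.collect(element);
                }
            }
        }

        @Override
        public void cancel() {
            running = false;
        }
    }

    // Test sink: collects everything into a static list the test can assert on.
    class CollectingSink implements SinkFunction<String> {

        static final List<String> VALUES = new CopyOnWriteArrayList<>();

        @Override
        public void invoke(String value, Context context) {
            VALUES.add(value);
        }
    }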

I would love to know what the intended or recommended way is here. I know 
that I can still use the old API, but that a) feels wrong, and b) requires us 
to expose the DataStream, which is not necessary in the current setup.

I would appreciate any ideas or even examples on this.

Thank you in advance.

Best
  David


[0]: 
https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/dev/datastream/testing/#testing-flink-jobs
