Re: Pyflink data stream API to Table API conversion with multiple sinks.

2021-10-08 Thread Dian Fu
Hi Kamil,

I guess `statement_set.execute()` should be enough. You could also check whether the job graph is as expected via one of the following ways:
- Call `print(statement_set.explain())`
- Check the Flink web UI to see the job graph of the running job

For your problems, could you double check wh…
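A minimal sketch of the check suggested above, assuming a `StreamTableEnvironment` named `t_env`, a table `result_table`, and two registered sink tables `avro_sink` and `parquet_sink` (all names are illustrative, not from the thread):

```python
# Verify the job graph of a statement set before executing it.
# Assumes t_env, result_table, avro_sink and parquet_sink already exist;
# the names here are placeholders.
statement_set = t_env.create_statement_set()
statement_set.add_insert("avro_sink", result_table)
statement_set.add_insert("parquet_sink", result_table)

# explain() returns the combined plan for both inserts, so you can check
# that a single job graph covers both the Avro and the Parquet writes.
print(statement_set.explain())

# A single execute() call submits one job containing both insert statements.
statement_set.execute()
```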

Pyflink data stream API to Table API conversion with multiple sinks.

2021-10-08 Thread Kamil ty
Hello,

In my PyFlink job I have the following flow:
1. Use the Table API to get messages from Kafka
2. Convert the table to a data stream
3. Convert the data stream back to the Table API
4. Use a statement set to write the data to two filesystem sinks (avro and parquet)

I'm able to run the job and everything…
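A minimal sketch of this four-step flow, assuming the Kafka source table and the two filesystem sink tables have already been registered via DDL; the table names (`kafka_source`, `avro_sink`, `parquet_sink`) are assumptions for illustration, and the conversion methods are those of recent PyFlink releases:

```python
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.table import StreamTableEnvironment

env = StreamExecutionEnvironment.get_execution_environment()
t_env = StreamTableEnvironment.create(env)

# 1. Use the Table API to read messages from Kafka (table assumed registered).
source_table = t_env.from_path("kafka_source")

# 2. Convert the table to a data stream (e.g. to apply DataStream-only logic).
ds = t_env.to_data_stream(source_table)

# 3. Convert the data stream back to the Table API.
result_table = t_env.from_data_stream(ds)

# 4. Write the data to both filesystem sinks (Avro and Parquet) with one
#    statement set, so both writes run in a single job.
statement_set = t_env.create_statement_set()
statement_set.add_insert("avro_sink", result_table)
statement_set.add_insert("parquet_sink", result_table)
statement_set.execute()
```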