Sure. 

String query =
    "SELECT table1.*, table2.* " +
    "FROM table1 INNER JOIN table2 ON table1.Id = table2.refId";

// Register the join result once as a temporary view
tableEnv.createTemporaryView("myView", tableEnv.sqlQuery(query));

String newQuery1 = "SELECT * FROM myView WHERE myField1 > 10";

String newQuery2 = "SELECT * FROM myView WHERE myField2 > 1999";

Then I take the two queries, transform them into data streams, and from there output each one to its own sink, roughly as sketched below.
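A minimal sketch of that last step, assuming a StreamTableEnvironment named tableEnv created on top of a StreamExecutionEnvironment env (print() just stands in for the real sinks):

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.types.Row;

// Run each filter query against the shared view and convert the result to a DataStream
DataStream<Row> stream1 = tableEnv.toDataStream(tableEnv.sqlQuery(newQuery1));
DataStream<Row> stream2 = tableEnv.toDataStream(tableEnv.sqlQuery(newQuery2));

// One sink per stream; print() is a placeholder for the actual sinks
stream1.print("sink1");
stream2.print("sink2");

env.execute("view-fanout-job");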

Med venlig hilsen / Best regards
Lasse Nedergaard


On 2 Mar 2023, at 12.32, Shammon FY <zjur...@gmail.com> wrote:


Hi

Can you provide more information about the job, such as the SQL? It may help to find the answer.

Best,
Shammon


On Thu, Mar 2, 2023 at 6:02 PM Lasse Nedergaard <lassenedergaardfl...@gmail.com> wrote:
Hi

I’m working with a simple pipeline that reads from two Kafka topics, one standard and one compacted, in SQL. It then creates a temporary view that joins the two tables, also in SQL. On top of the view I create two queries, transform the results to data streams, and forward each one to its own sink.
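For context, the two sources are declared roughly like this (the table names, fields, topics and connector options below are only illustrative, not the real job definition); the compacted topic is read with the upsert-kafka connector:

// Illustrative source definitions only; the real schemas and options differ
tableEnv.executeSql(
    "CREATE TABLE table1 ("
  + "  Id BIGINT,"
  + "  myField1 INT"
  + ") WITH ("
  + "  'connector' = 'kafka',"
  + "  'topic' = 'standard-topic',"
  + "  'properties.bootstrap.servers' = 'localhost:9092',"
  + "  'scan.startup.mode' = 'earliest-offset',"
  + "  'format' = 'json')");

// The compacted topic is modelled as a changelog table with a primary key
tableEnv.executeSql(
    "CREATE TABLE table2 ("
  + "  refId BIGINT,"
  + "  myField2 INT,"
  + "  PRIMARY KEY (refId) NOT ENFORCED"
  + ") WITH ("
  + "  'connector' = 'upsert-kafka',"
  + "  'topic' = 'compacted-topic',"
  + "  'properties.bootstrap.servers' = 'localhost:9092',"
  + "  'key.format' = 'json',"
  + "  'value.format' = 'json')");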

Please look at the DAG.


It reads from Kafka twice and does the join twice instead of once, and I don’t understand why. My best guess is that it is caused by the transformation to data streams.
I’m testing on Flink 1.16.1 on a mini cluster.
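If it helps, the plans can be printed just before executing, along these lines (a minimal sketch; tableEnv, env and newQuery1 as in the snippets above):

// Optimized plan of one of the queries (shows the scans and the join)
System.out.println(tableEnv.sqlQuery(newQuery1).explain());

// JSON plan of the full job about to be submitted, including both sink pipelines
System.out.println(env.getExecutionPlan());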

Any suggestions or explanations are much appreciated.

Med venlig hilsen / Best regards
Lasse Nedergaard
