Hi Vino & Hequn,
        I am now using the Table/SQL API. If I import the MySQL table as a 
stream and then convert it into a table, it seems that this can also serve as a 
workaround for joining a stream with a batch table. May I ask how this differs 
from the UDTF method? Does this implementation have any defects?
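To make the comparison concrete, here is a minimal, Flink-free sketch of the per-record lookup that the flatmap-UDF / UDTF approach performs against the dimension table. The class, keys, and values are hypothetical and only for illustration; in a real job the in-memory map would be a MySQL query or a cache in front of it.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Hypothetical sketch: for each fact record, look up the dimension table
// by key and emit the enriched record (the core of the UDTF approach).
public class DimLookupSketch {

    // Dimension table keyed by id (e.g. loaded or queried from MySQL).
    private final Map<Long, String> dim = new HashMap<>();

    public DimLookupSketch() {
        dim.put(1L, "alice");
        dim.put(2L, "bob");
    }

    // Per-record enrichment; returns empty on a miss (inner-join semantics).
    public Optional<String> enrich(long userId, long amount) {
        String name = dim.get(userId);
        if (name == null) {
            return Optional.empty();
        }
        return Optional.of(name + ":" + amount);
    }

    public static void main(String[] args) {
        DimLookupSketch s = new DimLookupSketch();
        System.out.println(s.enrich(1L, 100L).orElse("none")); // alice:100
        System.out.println(s.enrich(3L, 50L).orElse("none"));  // none
    }
}
```

By contrast, importing the MySQL table as a stream turns the join into a regular stream-stream join, which generally keeps state for both inputs instead of doing a point lookup per fact record. (This is my understanding, not something stated in the thread.)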
        
Best
Henry

> On Sep 22, 2018, at 10:28 AM, Hequn Cheng <chenghe...@gmail.com> wrote:
> 
> Hi
> 
> +1 for vino's answer. 
> Also, this kind of join will be supported in FLINK-9712 
> <https://issues.apache.org/jira/browse/FLINK-9712>. You can find more 
> details in the JIRA issue.
> 
> Best, Hequn
> 
> On Fri, Sep 21, 2018 at 4:51 PM vino yang <yanghua1...@gmail.com 
> <mailto:yanghua1...@gmail.com>> wrote:
> Hi Henry,
> 
> There are three ways I can think of:
> 
> 1) use the DataStream API and implement a flatmap UDF that looks up the 
> dimension table;
> 2) use the Table/SQL API and implement a UDTF that looks up the dimension 
> table;
> 3) customize the Table/SQL join API/statement's implementation (and change 
> the physical plan).
> 
> Thanks, vino.
> 
> 徐涛 <happydexu...@gmail.com <mailto:happydexu...@gmail.com>> wrote on Fri, 
> Sep 21, 2018 at 4:43 PM:
> Hi All,
>         Sometimes a “dimension table” needs to be joined with the “fact 
> table” if the data are not joined before being sent to Kafka.
>         So if the data are joined in Flink, does the “dimension table” have 
> to be imported as a stream, or are there other ways to achieve this?
>         Thanks a lot!
> 
> Best
> Henry
