+1, thanks for the proposal.

I guess this is a long-awaited change. It could vastly increase the
functionality of the SQL Client, as it will become possible to use complex
extensions such as those provided by Apache Bahir [1].

Best Regards,
Dom.

[1]
https://github.com/apache/bahir-flink

Sat, 3 Nov 2018 at 17:17 Rong Rong <walter...@gmail.com> wrote:

> +1. Thanks for putting the proposal together Shuyi.
>
> DDL has been brought up a couple of times previously [1,2]. Utilizing
> DDL will definitely be a great extension to the current Flink SQL to
> systematically support some of the previously requested features such as
> [3]. It will also be beneficial to keep the document closely aligned
> with the previous discussion on the unified SQL connector API [4].
>
> I also left a few comments on the doc. Looking forward to the alignment
> with the other couple of efforts and contributing to them!
>
> Best,
> Rong
>
> [1]
> http://mail-archives.apache.org/mod_mbox/flink-dev/201805.mbox/%3CCAMZk55ZTJA7MkCK1Qu4gLPu1P9neqCfHZtTcgLfrFjfO4Xv5YQ%40mail.gmail.com%3E
> [2]
> http://mail-archives.apache.org/mod_mbox/flink-dev/201810.mbox/%3CDC070534-0782-4AFD-8A85-8A82B384B8F7%40gmail.com%3E
> [3] https://issues.apache.org/jira/browse/FLINK-8003
> [4]
> http://mail-archives.apache.org/mod_mbox/flink-dev/201810.mbox/%3c6676cb66-6f31-23e1-eff5-2e9c19f88...@apache.org%3E
>
>
> On Fri, Nov 2, 2018 at 10:22 AM Bowen Li <bowenl...@gmail.com> wrote:
>
> > Thanks Shuyi!
> >
> > I left some comments there. I think the designs of SQL DDL and Flink-Hive
> > integration/external catalog enhancements will work closely with each
> > other. Hope we are well aligned on the directions of the two designs, and
> > I look forward to working with you guys on both!
> >
> > Bowen
> >
> >
> > On Thu, Nov 1, 2018 at 10:57 PM Shuyi Chen <suez1...@gmail.com> wrote:
> >
> > > Hi everyone,
> > >
> > > SQL DDL support has been a long-time ask from the community. Currently,
> > > Flink SQL supports only DML (e.g. SELECT and INSERT statements). In its
> > > current form, Flink SQL users still need to define/create table sources
> > > and sinks programmatically in Java/Scala. Also, without DDL support, the
> > > current SQL Client implementation does not allow dynamic creation of
> > > tables, types, or functions with SQL, which adds friction to its
> > > adoption.
> > >
> > > I drafted a design doc [1] with a few other community members that
> > > proposes the design and implementation for adding DDL support in Flink.
> > > The initial design considers DDL for tables, views, types, libraries,
> > > and functions. It would be great to get feedback on the design from the
> > > community, and to align with the latest efforts on the unified SQL
> > > connector API [2] and Flink-Hive integration [3].
> > >
> > > Any feedback is highly appreciated.
> > >
> > > Thanks
> > > Shuyi Chen
> > >
> > > [1]
> > > https://docs.google.com/document/d/1TTP-GCC8wSsibJaSUyFZ_5NBAHYEB1FVmPpP7RgDGBA/edit?usp=sharing
> > > [2]
> > > https://docs.google.com/document/d/1Yaxp1UJUFW-peGLt8EIidwKIZEWrrA-pznWLuvaH39Y/edit?usp=sharing
> > > [3]
> > > https://docs.google.com/document/d/1SkppRD_rE3uOKSN-LuZCqn4f7dz0zW5aa6T_hBZq5_o/edit?usp=sharing
> > > --
> > > "So you have to trust that the dots will somehow connect in your
> > > future."
