Hi,
I think that once the Table API / SQL API evolves enough, it should be
possible to express a Flink program as just a SQL query plus source/sink
definitions.
Hopefully, in the future. :-)
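
For illustration, here is a minimal sketch of what such an "SQL-only"
program could look like. Everything in it is hypothetical from today's
perspective: the executeSql()/DDL entry points are an assumption about how
the Table API could evolve, and the 'datagen'/'print' connectors and all
class, table, and field names are invented for the example.

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class SqlOnlyJob {
    public static void main(String[] args) {
        // No DataStream/DataSet code: the whole program is SQL.
        TableEnvironment tEnv =
            TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Source definition as SQL DDL ('datagen' stands in for any
        // streaming source, e.g. a Kafka topic).
        tEnv.executeSql(
            "CREATE TABLE clicks (user_name STRING, url STRING) "
                + "WITH ('connector' = 'datagen')");

        // Sink definition ('print' writes results to stdout).
        tEnv.executeSql(
            "CREATE TABLE click_counts (user_name STRING, cnt BIGINT) "
                + "WITH ('connector' = 'print')");

        // The actual "program": one INSERT ... SELECT statement.
        tEnv.executeSql(
            "INSERT INTO click_counts "
                + "SELECT user_name, COUNT(url) FROM clicks GROUP BY user_name");
    }
}

With sources and sinks defined this way, changing the data flow means
editing a SQL file rather than recompiling a Java/Scala job, which is
exactly the property Alex asked about.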

Cheers,
Aljoscha

On Fri, 22 Apr 2016 at 23:10 Fabian Hueske <fhue...@gmail.com> wrote:

> Hi Alex,
>
> welcome to the Flink community!
> Right now, there is no way to specify a Flink program without writing code
> (Java, Scala, or Python (beta)).
>
> In principle, it is possible to put such functionality on top of the
> DataStream or DataSet APIs.
> This has been done before for other programming APIs (Flink's own
> libraries: the Table API, Gelly, and FlinkML; and external projects: Apache
> Beam / Google Cloud Dataflow, Mahout, Cascading, ...). However, all of these
> are again programming APIs, some specialized for certain use cases.
>
> Specifying Flink programs via config files (or graphically) would require a
> data model, a DataStream/DataSet program generator, and probably a code
> generation component.
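>
> As a toy illustration (the config format, operator catalog, and class
> names below are invented for this sketch; nothing like this ships with
> Flink), consider a driver that reads a plain-text pipeline description
> such as
>
>   source socket localhost 9999
>   op uppercase
>   op nonempty
>   sink print
>
> and assembles a DataStream program from it:
>
> import org.apache.flink.api.common.typeinfo.Types;
> import org.apache.flink.streaming.api.datastream.DataStream;
> import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
>
> import java.nio.file.Files;
> import java.nio.file.Paths;
>
> public class ConfigDrivenJob {
>     public static void main(String[] args) throws Exception {
>         StreamExecutionEnvironment env =
>             StreamExecutionEnvironment.getExecutionEnvironment();
>
>         DataStream<String> stream = null;
>         for (String line : Files.readAllLines(Paths.get(args[0]))) {
>             String[] p = line.trim().split("\\s+");
>             switch (p[0]) {
>                 case "source": // only a socket source in this sketch
>                     stream = env.socketTextStream(p[1], Integer.parseInt(p[2]));
>                     break;
>                 case "op":     // named operators from a fixed catalog
>                     stream = applyOp(stream, p[1]);
>                     break;
>                 case "sink":   // only a print sink in this sketch
>                     stream.print();
>                     break;
>             }
>         }
>         env.execute("config-driven job");
>     }
>
>     private static DataStream<String> applyOp(DataStream<String> in, String name) {
>         switch (name) {
>             case "uppercase":
>                 return in.map(s -> s.toUpperCase()).returns(Types.STRING);
>             case "nonempty":
>                 return in.filter(s -> !s.isEmpty());
>             default:
>                 throw new IllegalArgumentException("unknown operator: " + name);
>         }
>     }
> }
>
> A real version of this would need exactly the pieces above, generalized: a
> data model richer than plain strings, a catalog of operators and their
> types, and probably code generation for user-defined expressions.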
>
> Best, Fabian
>
> 2016-04-22 18:41 GMT+02:00 Alexander Smirnov <alexander.smirn...@gmail.com>:
>
>> Hi guys!
>>
>> I’m new to Flink, and actually to this mailing list as well :) This is my
>> first message.
>> I’m still reading the documentation, and I would say Flink is an amazing
>> system!! Thanks to everybody who participated in the development!
>>
>> The information I didn’t find in the documentation is whether it is
>> possible to describe a data(stream) transformation without any code
>> (Java/Scala).
>> I mean, is it possible to describe data source functions, all of the
>> operators, the connections between them, and the sinks in a plain-text
>> configuration file, and then feed that file to Flink?
>> That way it would be possible to change the data flow without
>> recompilation/redeployment.
>>
>> Is there similar functionality in Flink? Maybe some third-party plugin?
>>
>> Thank you,
>> Alex
>
>
>
