What Michael said. 

My intention when I created the server module was that people would use it as a
starting point for their DDL extensions. Copy-paste what you need.

See http://calcite.apache.org/docs/adapter.html#server
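For reference, a minimal sketch of using the server module's DDL-aware parser from plain parser code (an illustration, not part of the original message; it assumes the calcite-server artifact is on the classpath, where the linked commit added `SqlDdlParserImpl`):

```java
import org.apache.calcite.sql.SqlNode;
import org.apache.calcite.sql.parser.SqlParser;
import org.apache.calcite.sql.parser.ddl.SqlDdlParserImpl;

public class DdlParseDemo {
  public static void main(String[] args) throws Exception {
    // Core Calcite's parser rejects CREATE TABLE; swapping in the
    // server module's parser factory enables the DDL grammar.
    SqlParser.Config config = SqlParser.configBuilder()
        .setParserFactory(SqlDdlParserImpl.FACTORY)
        .build();
    SqlParser parser = SqlParser.create(
        "CREATE TABLE t (i INTEGER, s VARCHAR(20))", config);
    SqlNode node = parser.parseStmt();
    System.out.println(node.getKind());
  }
}
```

A custom grammar (e.g. a PARAMS clause) would follow the same pattern: fork the server module's Parser.jj template, add the new productions, and point `setParserFactory` at the generated parser.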

Julian

> On Jan 6, 2018, at 12:16, Michael Mior <[email protected]> wrote:
> 
> The core of Calcite doesn't contain any DDL but the recently-added server
> module has preliminary support for CREATE TABLE. See the commit which added
> this here (
> https://github.com/apache/calcite/commit/238b3225a2309a1a72bd1383b57982feaa2068e8).
> This likely won't do everything you need, but should be a good starting
> point.
> 
> --
> Michael Mior
> [email protected]
> 
> 2018-01-05 21:33 GMT-05:00 DONG, Weike <[email protected]>:
> 
>> Hello everyone,
>> 
>> I am currently working on a project to extend the functionality of SQL to
>> support more stream-computing features on Apache Flink, which uses Calcite
>> for its SQL processing.
>> 
>> Currently the features provided by Calcite are not enough for my project
>> and I would like to know if there is a way to add custom grammar like
>> 
>> CREATE TABLE my_table (
>>    id   bigint,
>>    user varchar(20)
>> ) PARAMS (
>>    connector 'kafka',
>>    topic     'my_topic'
>> )
>> 
>> which uses something like PARAMS to define how to receive data from a
>> Kafka connector and treat it as a dynamic table serving as a data source
>> to Flink.
>> 
>> Also, I would like to add a "CREATE STREAM" statement like the one in
>> Amazon Kinesis
>> <https://docs.aws.amazon.com/kinesisanalytics/latest/sqlref/sql-reference-create-stream.html>,
>> even though I know that this might be a tough task.
>> 
>> Since there is so little information about this on the Internet, I would
>> greatly appreciate it if any of you could provide some hints or
>> anything useful.
>> 
>> Thank you : )
>> 
>> 
>> Sincerely,
>> Weike
>> 
