Hi Shengnan,
You can look at the test cases in FlinkDDLDataTypeTest [1] for a quick
overview of what a DDL column type looks like.
[1]
https://github.com/apache/flink/blob/a194b37d9b99a47174de9108a937f821816d61f5/flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/Flin
Hi Shengnan,
Yes, Flink 1.9 supports deriving the schema from nested JSON. You should
declare the ROW type with its nested schema explicitly. I tested a similar
DDL against the 1.9.0 RC2 and it worked well.
CREATE TABLE kafka_json_source (
rowtime VARCHAR,
user_name VARCHAR,
event ROW
) WITH (
'connector.typ
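To make the truncated snippet above concrete, here is a sketch of what a complete DDL with an explicitly declared nested ROW type could look like. The nested field names (event_type, event_time), topic name, and the connector/format property values are illustrative assumptions, not taken from the original message:

```sql
CREATE TABLE kafka_json_source (
  rowtime VARCHAR,
  user_name VARCHAR,
  -- the nested schema is declared explicitly on the ROW type
  event ROW<event_type VARCHAR, event_time VARCHAR>
) WITH (
  'connector.type' = 'kafka',          -- assumed property values below
  'connector.version' = '0.11',
  'connector.topic' = 'user_events',
  'update-mode' = 'append',
  'format.type' = 'json',
  'format.derive-schema' = 'true'
);
```

With 'format.derive-schema' = 'true', the JSON format derives its expected structure from the table schema, so the nested JSON object maps onto the ROW fields by name.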
Hi guys,
I am trying the DDL feature on the release-1.9 branch. I am stuck creating a
table from Kafka with nested JSON format. Is it possible to specify a ROW type
for a column so that the nested JSON schema can be derived?
String sql = "create table kafka_stream(\n" +
" a varchar, \n" +
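Since the snippet above is cut off, here is a hedged, self-contained sketch of assembling such a DDL string in Java. The column names and the ROW fields are illustrative assumptions; in Flink 1.9 the resulting string would then be handed to something like tableEnv.sqlUpdate(sql), which is not exercised here:

```java
public class DdlStringDemo {

    // Assemble the CREATE TABLE statement as a plain Java string,
    // mirroring the style of the truncated snippet above.
    // Column names and nested ROW fields are hypothetical.
    static String buildDdl() {
        return "create table kafka_stream(\n"
                + "  a varchar,\n"
                + "  b varchar,\n"
                + "  event row<event_type varchar, event_time varchar>\n"
                + ")";
    }

    public static void main(String[] args) {
        String sql = buildDdl();
        // In Flink 1.9 this string would be passed to tableEnv.sqlUpdate(sql).
        System.out.println(sql);
    }
}
```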