Do you mean you want to first expand the array inside the JSON into multiple rows, and then generate the watermark based on that expanded data?
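If that is the idea, a rough sketch of what it could look like in 1.11 follows. This is untested against your topic, and the view name `push_events` plus the `datas[1].hiido_time` trick (driving the watermark from the first array element, which assumes all elements in one message carry near-identical timestamps) are my assumptions, not a verified recipe:

```sql
-- Source table: no computed column inside the nested ROW type.
-- The event-time column and the watermark are declared at the top
-- level, derived from the first array element (Flink arrays are
-- 1-indexed). Assumption: elements within one Kafka message share
-- (almost) the same hiido_time.
create table hiido_push_sdk_mq (
    datas ARRAY<ROW<`from` STRING, hdid STRING, event STRING, hiido_time BIGINT>>,
    ts AS CAST(FROM_UNIXTIME(datas[1].hiido_time) AS TIMESTAMP(3)),
    WATERMARK FOR ts AS ts - INTERVAL '5' MINUTE
) with (
    'connector' = 'kafka',
    'topic' = 'hiido_pushsdk_event',
    -- remaining connector options as in your original DDL
    'properties.group.id' = 'push_click_sql_version_consumer',
    'scan.startup.mode' = 'latest-offset',
    'format.type' = 'json'
);

-- Flatten the array: one output row per array element, carrying the
-- top-level rowtime attribute along.
create view push_events as
select t.`from`, t.hdid, t.event, t.hiido_time, m.ts
from hiido_push_sdk_mq m
cross join unnest(m.datas) as t (`from`, hdid, event, hiido_time);
```

Downstream queries would then read from `push_events` and can use `ts` for windowing.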
claylin <[email protected]> 于2020年7月17日周五 下午8:37写道: > 那我这种内嵌式的数据结构是不能在sql里面解析了,数组每行转成表中的一列,还有watermark,只能在外部处理成单条记录然后用flink处理了吗 > > > > > ------------------ 原始邮件 ------------------ > 发件人: > "user-zh" > < > [email protected]>; > 发送时间: 2020年7月17日(星期五) 晚上8:33 > 收件人: "user-zh"<[email protected]>; > > 主题: Re: sql 内嵌josn数组解析报 类型转换报错 > > > > 计算列只能写在最外层,不能在嵌套类型里面有计算列。 > > claylin <[email protected]> 于2020年7月17日周五 下午8:28写道: > > > hi all我这边有个嵌套的json数组,报类型转换错误(ts AS CAST(FROM_UNIXTIME(hiido_time) AS > > TIMESTAMP(3)),这里报错),是不是不能这么写 > > create table hiido_push_sdk_mq ( > > datas&nbsp; &nbsp;ARRAY<ROW<`from` string,hdid string,event > > string,hiido_time bigint,ts AS CAST(FROM_UNIXTIME(hiido_time) AS > > TIMESTAMP(3)),WATERMARK FOR ts AS ts - INTERVAL '5' > MINUTE&gt;&gt; > > ) with ( > > 'connector' = 'kafka', > > 'topic' = 'hiido_pushsdk_event', > > 'properties.bootstrap.servers' = 'kafkafs002-core001.yy.com:8103, > > kafkafs002-core002.yy.com:8103,kafkafs002-core003.yy.com:8103', > > 'properties.group.id' = 'push_click_sql_version_consumer', > > 'scan.startup.mode' = 'latest-offset', > > 'format.type' = 'json'); > > > > > > > > > > 错误如下: > > [ERROR] 2020-07-17 20:17:50,640(562284338) --&gt; > [http-nio-8080-exec-10] > > > com.yy.push.flink.sql.gateway.sql.parse.SqlCommandParser.parseBySqlParser(SqlCommandParser.java:77): > > parseBySqlParser, parse: > > com.yy.push.flink.sql.gateway.context.JobContext$1@5d5f32d1, stmt: > create > > table hiido_push_sdk_mq (&nbsp; &nbsp; datas&nbsp; > &nbsp;ARRAY<ROW<`from` > > string,hdid string,event string,hiido_time bigint,ts AS > > CAST(FROM_UNIXTIME(hiido_time) AS TIMESTAMP(3)),WATERMARK FOR ts AS > ts - > > INTERVAL '5' MINUTE&gt;&gt;) with ('connector' = > 'kafka','topic' = > > 'hiido_pushsdk_event','properties.bootstrap.servers' = ' > > kafkafs002-core001.yy.com:8103,kafkafs002-core002.yy.com:8103, > > kafkafs002-core003.yy.com:8103','properties.group.id' = > > 'push_click_sql_version_consumer','scan.startup.mode' = > > 
'latest-offset','format.type' = 'json'), error info: SQL parse failed. > > Encountered "AS" at line 1, column 115. > > Was expecting one of: > > &nbsp; &nbsp; "ROW" ... > > &nbsp; &nbsp; <BRACKET_QUOTED_IDENTIFIER&gt; ... > > &nbsp; &nbsp; <QUOTED_IDENTIFIER&gt; ... > > &nbsp; &nbsp; <BACK_QUOTED_IDENTIFIER&gt; ... > > &nbsp; &nbsp; <IDENTIFIER&gt; ... > > &nbsp; &nbsp; <UNICODE_QUOTED_IDENTIFIER&gt; ... > > &nbsp; &nbsp; "STRING" ... > > &nbsp; &nbsp; "BYTES" ... > > &nbsp; &nbsp; "ARRAY" ... > > &nbsp; &nbsp; "MULTISET" ... > > &nbsp; &nbsp; "RAW" ... > > &nbsp; &nbsp; "BOOLEAN" ... > > &nbsp; &nbsp; "INTEGER" ... > > &nbsp; &nbsp; "INT" ... > > &nbsp; &nbsp; "TINYINT" ... > > &nbsp; &nbsp; "SMALLINT" ... > > &nbsp; &nbsp; "BIGINT" ... > > &nbsp; &nbsp; "REAL" ... > > &nbsp; &nbsp; "DOUBLE" ... > > &nbsp; &nbsp; "FLOAT" ... > > &nbsp; &nbsp; "BINARY" ... > > &nbsp; &nbsp; "VARBINARY" ... > > &nbsp; &nbsp; "DECIMAL" ... > > &nbsp; &nbsp; "DEC" ... > > &nbsp; &nbsp; "NUMERIC" ... > > &nbsp; &nbsp; "ANY" ... > > &nbsp; &nbsp; "CHARACTER" ... > > &nbsp; &nbsp; "CHAR" ... > > &nbsp; &nbsp; "VARCHAR" ... > > &nbsp; &nbsp; "DATE" ... > > &nbsp; &nbsp; "TIME" ... > > &nbsp; &nbsp; "TIMESTAMP" ... > > > > -- > > Best, > Benchao Li -- Best, Benchao Li
