I tried this locally and could not reproduce your problem. Which Hive version are you using?
On Fri, Jul 30, 2021 at 3:00 PM Asahi Lee <[email protected]> wrote:

> CREATE TABLE `cosldatacenter.ods_emp_maindata_iadc_paramvalue`(
>   `paramvalue_id` string COMMENT '',
>   `platform_id` string COMMENT '',
>   `equipment_id` string COMMENT '',
>   `param_id` string COMMENT '',
>   `param_value` string COMMENT '',
>   `remark` string COMMENT '',
>   `create_time` string COMMENT '',
>   `creator` string COMMENT '',
>   `update_time` string COMMENT '',
>   `update_person` string COMMENT '',
>   `record_flag` double COMMENT '',
>   `subject_id` string COMMENT '',
>   `output_unit` string COMMENT '',
>   `show_seq` double COMMENT '')
> COMMENT ''
> ROW FORMAT SERDE
>   'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
> WITH SERDEPROPERTIES (
>   'field.delim'=',',
>   'serialization.format'=',')
> STORED AS INPUTFORMAT
>   'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
> OUTPUTFORMAT
>   'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
> LOCATION
>   'hdfs://hadoop02:8020/user/hive/warehouse/cosldatacenter.db/ods_emp_maindata_iadc_paramvalue'
> TBLPROPERTIES (
>   'COLUMN_STATS_ACCURATE'='false',
>   'last_modified_by'='root',
>   'last_modified_time'='1621834335',
>   'numFiles'='0',
>   'numRows'='-1',
>   'rawDataSize'='-1',
>   'totalSize'='0',
>   'transient_lastDdlTime'='1621834335')
>
> CREATE TABLE `cosldatacenter.ods_emp_md_large_equip`(
>   `large_equip_id` string COMMENT '',
>   `equip_name` string COMMENT '',
>   `equip_type` string COMMENT '',
>   `equip_function` string COMMENT '',
>   `equip_board` string COMMENT '',
>   `ship_yard` string COMMENT '',
>   `manufacturer_date` string COMMENT '',
>   `enqueue_date` string COMMENT '',
>   `dockrepair_date` string COMMENT '',
>   `scrap_date` string COMMENT '',
>   `enqueue_mode` string COMMENT '',
>   `work_for_org` string COMMENT '',
>   `work_in_org` string COMMENT '',
>   `old_age` string COMMENT '',
>   `create_time` date COMMENT '',
>   `creator` string COMMENT '',
>   `update_time` date COMMENT '',
>   `update_person` string COMMENT '',
>   `record_flag` double COMMENT '',
>   `data_timestamp` string COMMENT '',
>   `work_unit_id` string COMMENT '',
>   `work_status` string COMMENT '',
>   `work_location` string COMMENT '',
>   `work_area` string COMMENT '',
>   `equip_code` string COMMENT '',
>   `shi_main_power` double COMMENT '',
>   `shi_total_len` double COMMENT '',
>   `shi_type_width` double COMMENT '',
>   `shi_type_depth` double COMMENT '',
>   `shi_design_draft` double COMMENT '',
>   `shi_total_tonnage` double COMMENT '',
>   `shi_load_tonnage` double COMMENT '',
>   `remark` string COMMENT '',
>   `unit_classification1` string COMMENT '',
>   `unit_classification2` string COMMENT '')
> COMMENT ''
> ROW FORMAT SERDE
>   'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
> WITH SERDEPROPERTIES (
>   'field.delim'=',',
>   'serialization.format'=',')
> STORED AS INPUTFORMAT
>   'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
> OUTPUTFORMAT
>   'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
> LOCATION
>   'hdfs://hadoop02:8020/user/hive/warehouse/cosldatacenter.db/ods_emp_md_large_equip'
> TBLPROPERTIES (
>   'COLUMN_STATS_ACCURATE'='false',
>   'last_modified_by'='root',
>   'last_modified_time'='1621834338',
>   'numFiles'='0',
>   'numRows'='-1',
>   'rawDataSize'='-1',
>   'totalSize'='0',
>   'transient_lastDdlTime'='1621834338')
>
> CREATE TABLE `cosldatacenter.ods_emp_maindata_iadc_paramdef`(
>   `param_id` string COMMENT '',
>   `iadc_id` string COMMENT '',
>   `param_code` string COMMENT '',
>   `param_en` string COMMENT '',
>   `param_cn` string COMMENT '',
>   `output_standard` string COMMENT '',
>   `output_unit` string COMMENT '',
>   `param_type` string COMMENT '',
>   `param_value` string COMMENT '',
>   `remark` string COMMENT '',
>   `create_time` string COMMENT '',
>   `creator` string COMMENT '',
>   `update_time` string COMMENT '',
>   `update_person` string COMMENT '',
>   `record_flag` double COMMENT '')
> COMMENT ''
> ROW FORMAT SERDE
>   'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
> WITH SERDEPROPERTIES (
>   'field.delim'=',',
>   'serialization.format'=',')
> STORED AS INPUTFORMAT
>   'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
> OUTPUTFORMAT
>   'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
> LOCATION
>   'hdfs://hadoop02:8020/user/hive/warehouse/cosldatacenter.db/ods_emp_maindata_iadc_paramdef'
> TBLPROPERTIES (
>   'COLUMN_STATS_ACCURATE'='false',
>   'last_modified_by'='root',
>   'last_modified_time'='1621834335',
>   'numFiles'='0',
>   'numRows'='-1',
>   'rawDataSize'='-1',
>   'totalSize'='0',
>   'transient_lastDdlTime'='1621834335')
>
> CREATE TABLE `cosldatacenter.dw_riginfoparam`(
>   `large_equip_id` string,
>   `equip_code` string,
>   `equip_name` string,
>   `enqueue_date` string,
>   `shi_total_len` double,
>   `shi_type_width` double,
>   `shi_type_depth` double,
>   `moonpool` string,
>   `maxwindvelocity` string,
>   `maxwaveheight` string,
>   `airgap` string,
>   `maxopewaterdepth` string,
>   `drilldepthcap` string,
>   `drillvl` string,
>   `drillwater` string,
>   `potablewater` string)
> ROW FORMAT SERDE
>   'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
> WITH SERDEPROPERTIES (
>   'field.delim'=',',
>   'serialization.format'=',')
> STORED AS INPUTFORMAT
>   'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
> OUTPUTFORMAT
>   'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
> LOCATION
>   'hdfs://hadoop02:8020/user/hive/warehouse/cosldatacenter.db/dw_riginfoparam'
> TBLPROPERTIES (
>   'COLUMN_STATS_ACCURATE'='false',
>   'numFiles'='1',
>   'numRows'='-1',
>   'rawDataSize'='-1',
>   'totalSize'='1564',
>   'transient_lastDdlTime'='1627353556')
>
> ------------------ Original message ------------------
> From: "user-zh" <[email protected]>
> Sent: Friday, July 30, 2021, 11:18 AM
> To: "user-zh" <[email protected]>
> Subject: Re: flink 1.13.1: Hive SQL parse error when using the Hive dialect
>
> Hi,
>
> Could you send the DDL of the tables used in your INSERT statement? The output of SHOW CREATE TABLE is enough.
>
> On Thu, Jul 29, 2021 at 9:03 PM Asahi Lee <[email protected]> wrote:
>
> > Hi!
> > I verified it; the ELSE is not the problem. The SQL below raises the same error, Invalid table alias or column reference 'u', and there is no name 'u' anywhere in my SQL!
> > CREATE CATALOG `tempo_df_hive_default_catalog` WITH (
> >     'type' = 'hive',
> >     'default-database' = 'default'
> > );
> > USE CATALOG tempo_df_hive_default_catalog;
> > CREATE TABLE IF NOT EXISTS `default`.`tempo_blackhole_table` (
> >     f0 INT
> > );
> > insert into cosldatacenter.dw_riginfoparam
> > select
> >   c.LARGE_EQUIP_ID,
> >   c.EQUIP_CODE,
> >   c.EQUIP_NAME,
> >   c.ENQUEUE_DATE,
> >   c.SHI_TOTAL_LEN,
> >   c.SHI_TYPE_WIDTH,
> >   c.SHI_TYPE_DEPTH,
> >   case when b.param_cn = '月池尺寸' then a.param_value else null end as Moonpool,
> >   case when b.param_cn = '最大风速' then a.param_value else null end as MaxWindvelocity,
> >   case when b.param_cn = '最大波浪高度' then a.param_value else null end as MaxWaveheight,
> >   case when b.param_cn = '气隙' then a.param_value else null end as Airgap,
> >   case when b.param_cn = '设计最大作业水深' then a.param_value else null end as MaxOpeWaterdepth,
> >   case when b.param_cn = '额定钻井深度' then a.param_value else null end as DrilldepthCap,
> >   case when b.param_cn = '钻井可变载荷' then a.param_value else null end as DrillVL,
> >   case when b.param_cn = '钻井水' then a.param_value else null end as DrillWater,
> >   case when b.param_cn = '生活水' then a.param_value else null end as PotableWater
> > from cosldatacenter.ods_emp_maindata_iadc_paramvalue a
> > inner join cosldatacenter.ods_emp_maindata_iadc_paramdef b on a.param_id = b.param_id
> > inner join cosldatacenter.ods_emp_md_large_equip c on a.SUBJECT_ID = c.LARGE_EQUIP_ID;
> > INSERT INTO `default`.`tempo_blackhole_table` SELECT 1;
> >
> > org.apache.hadoop.hive.ql.parse.SemanticException: Line 2:178 Invalid
> > table alias or column reference 'u': (possible column names are:
> > a.paramvalue_id, a.platform_id, a.equipment_id, a.param_id, a.param_value,
> > a.remark, a.create_time, a.creator, a.update_time, a.update_person,
> > a.record_flag, a.subject_id, a.output_unit, a.show_seq, b.param_id,
> > b.iadc_id, b.param_code, b.param_en, b.param_cn, b.output_standard,
> > b.output_unit, b.param_type, b.param_value, b.remark, b.create_time,
> > b.creator, b.update_time, b.update_person, b.record_flag, c.large_equip_id,
> > c.equip_name, c.equip_type, c.equip_function, c.equip_board, c.ship_yard,
> > c.manufacturer_date, c.enqueue_date, c.dockrepair_date, c.scrap_date,
> > c.enqueue_mode, c.work_for_org, c.work_in_org, c.old_age, c.create_time,
> > c.creator, c.update_time, c.update_person, c.record_flag, c.data_timestamp,
> > c.work_unit_id, c.work_status, c.work_location, c.work_area, c.equip_code,
> > c.shi_main_power, c.shi_total_len, c.shi_type_width, c.shi_type_depth,
> > c.shi_design_draft, c.shi_total_tonnage, c.shi_load_tonnage, c.remark,
> > c.unit_classification1, c.unit_classification2)
> >
> > ------------------ Original message ------------------
> > From: "user-zh" <[email protected]>
> > Sent: Thursday, July 29, 2021, 3:32 PM
> > To: "user-zh" <[email protected]>
> > Subject: Re: flink 1.13.1: Hive SQL parse error when using the Hive dialect
> >
> > This looks like a SQL syntax error. Where is the ELSE in this?
> >
> > Best,
> > Leonard
> >
> > > On Jul 27, 2021, at 20:04, Asahi Lee <[email protected]> wrote:
> > >
> > > CASE
> > >   WHEN mipd.`param_cn` = '月池尺寸' THEN
> > >     mipv.`param_value`
> > > END AS `Moonpool`
>
> --
> Best regards!
> Rui Li

--
Best regards!
Rui Li
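For anyone trying to reproduce this later: the failing statement combines Hive-dialect CASE projections with a three-way inner join. A minimal standalone script of the same shape, using invented placeholder names rather than the schema above, and assuming a Hive catalog is already registered with the Hive dialect enabled, might look like:

```
-- Minimal sketch of the failing shape; all table and column names
-- here are hypothetical placeholders, not the reporter's schema.
CREATE TABLE IF NOT EXISTS demo_vals  (subj_id string, def_id string, v string);
CREATE TABLE IF NOT EXISTS demo_defs  (def_id string, name string);
CREATE TABLE IF NOT EXISTS demo_equip (subj_id string, code string);
CREATE TABLE IF NOT EXISTS demo_out   (code string, moonpool string);

-- Same pattern as the thread: CASE WHEN projections over a 3-way join.
INSERT INTO demo_out
SELECT
  c.code,
  CASE WHEN b.name = 'moonpool' THEN a.v ELSE NULL END AS moonpool
FROM demo_vals a
INNER JOIN demo_defs  b ON a.def_id = b.def_id
INNER JOIN demo_equip c ON a.subj_id = c.subj_id;
```

Whether this minimal form triggers the same SemanticException would depend on the Flink and Hive versions involved, which is exactly what the thread is trying to narrow down.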
