[ 
https://issues.apache.org/jira/browse/FLINK-23567?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17391346#comment-17391346
 ] 

wuyang edited comment on FLINK-23567 at 8/2/21, 6:00 AM:
---------------------------------------------------------

[~lirui] As in the screenshot: I removed this filter, repackaged the jar, and it 
succeeded.

CREATE CATALOG hive_catalog WITH (
 'type' = 'hive',
 'default-database' = 'tmp',
 'hive-conf-dir' = '/etc/hive/conf'
 );
 -- set the HiveCatalog as the current catalog of the session
 USE CATALOG hive_catalog;

-- the Hive dialect must be set first, otherwise creating the Hive table fails
 SET table.sql-dialect=hive; 
 drop table if exists tmp.flink_sql_sink_hive_hi;
 CREATE TABLE `tmp.flink_sql_sink_hive_hi`(
 `log_timestamp` BIGINT COMMENT 'event time',
 `name` STRING COMMENT 'name',
 `age` STRING COMMENT 'age',
 `sex` STRING COMMENT 'sex',
 `hometown` STRING COMMENT 'hometown',
 `work` STRING COMMENT 'job'
 ) PARTITIONED BY (`dt` STRING COMMENT 'day', `hh` STRING COMMENT 'hour') STORED AS 
PARQUET LOCATION 
'hdfs://ddNS/user/hive/warehouse/tmp.db/flink_sql_sink_hive_hi' 
 TBLPROPERTIES (
 --'partition.time-extractor.timestamp-pattern'='$dt $hh', -- Hive partition time extractor
 'sink.partition-commit.trigger'='partition-time', -- commit is triggered by partition time
 'sink.partition-commit.delay'='1 h', -- commit delay
 'sink.partition-commit.policy.kind'='metastore,success-file' -- commit policy
 );

!image-2021-08-02-14-00-26-096.png!
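For anyone reproducing this: the root cause in the stack trace below is simply that the org.json classes are absent from the classpath at runtime. A minimal, illustrative probe (plain Java, not Flink code; only the class name org.json.JSONException comes from this report) to check what the JVM can actually load:

```java
// Sketch: probe whether a class is loadable from the current classpath.
// Mirrors the ClassNotFoundException in the stack trace below.
public class ClasspathProbe {
    static boolean isPresent(String className) {
        try {
            // initialize=false: we only want to know whether the class resolves
            Class.forName(className, false, ClasspathProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("java.lang.String present: "
                + isPresent("java.lang.String"));
        System.out.println("org.json.JSONException present: "
                + isPresent("org.json.JSONException"));
    }
}
```

If the probe reports false for org.json.JSONException even with the connector jar on the classpath, the shaded jar does not bundle those classes, and the Hive-dialect DDL path fails exactly as shown in the stack trace.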


was (Author: wuyang09):
[~lirui]  Like the screenshot, but I removed this filter and repacked it and it 
succeeded.

CREATE CATALOG hive_catalog WITH (
 'type' = 'hive',
 'default-database' = 'tmp',
 'hive-conf-dir' = '/etc/hive/conf'
);
-- set the HiveCatalog as the current catalog of the session
USE CATALOG hive_catalog;

-- the Hive dialect must be set first, otherwise creating the Hive table fails
SET table.sql-dialect=hive; 
drop table if exists tmp.flink_sql_sink_hive_hi;
CREATE TABLE `tmp.flink_sql_sink_hive_hi`(
`log_timestamp` BIGINT COMMENT 'event time',
`name` STRING COMMENT 'name',
`age` STRING COMMENT 'age',
`sex` STRING COMMENT 'sex',
`hometown` STRING COMMENT 'hometown',
`work` STRING COMMENT 'job'
) PARTITIONED BY (`dt` STRING COMMENT 'day', `hh` STRING COMMENT 'hour') STORED AS 
PARQUET LOCATION 
'hdfs://ddNS/user/hive/warehouse/tmp.db/flink_sql_sink_hive_hi' 
TBLPROPERTIES (
 --'partition.time-extractor.timestamp-pattern'='$dt $hh', -- Hive partition time extractor
 'sink.partition-commit.trigger'='partition-time', -- commit is triggered by partition time
 'sink.partition-commit.delay'='1 h', -- commit delay
 'sink.partition-commit.policy.kind'='metastore,success-file' -- commit policy
);

!image-2021-08-02-13-55-26-467.png!

> Hive 1.1.0 failed to write using flink sql 1.13.1 because the JSON class was 
> not found
> --------------------------------------------------------------------------------------
>
>                 Key: FLINK-23567
>                 URL: https://issues.apache.org/jira/browse/FLINK-23567
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / Hive
>    Affects Versions: 1.13.1
>            Reporter: wuyang
>            Priority: Blocker
>             Fix For: 1.13.1
>
>         Attachments: image-2021-07-31-10-39-52-126.png, 
> image-2021-07-31-10-40-07-070.png, image-2021-08-02-13-55-26-467.png, 
> image-2021-08-02-14-00-26-096.png
>
>
> *First: after I added flink-sql-connector-hive-3.1.2_2.11-1.13.1.jar under the 
> lib directory, the following error was reported when submitting the Flink SQL 
> job:*
> java.lang.NoClassDefFoundError: org/json/JSONException
>  at 
> org.apache.flink.table.planner.delegation.hive.parse.HiveParserDDLSemanticAnalyzer.analyzeCreateTable(HiveParserDDLSemanticAnalyzer.java:646)
>  at 
> org.apache.flink.table.planner.delegation.hive.parse.HiveParserDDLSemanticAnalyzer.analyzeInternal(HiveParserDDLSemanticAnalyzer.java:373)
>  at 
> org.apache.flink.table.planner.delegation.hive.HiveParser.processCmd(HiveParser.java:235)
>  at 
> org.apache.flink.table.planner.delegation.hive.HiveParser.parse(HiveParser.java:217)
>  at 
> org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:724)
>  at 
> me.ddmc.bigdata.sqlsubmit.helper.SqlSubmitHelper.callSql(SqlSubmitHelper.java:201)
>  at 
> me.ddmc.bigdata.sqlsubmit.helper.SqlSubmitHelper.callCommand(SqlSubmitHelper.java:182)
>  at 
> me.ddmc.bigdata.sqlsubmit.helper.SqlSubmitHelper.run(SqlSubmitHelper.java:124)
>  at me.ddmc.bigdata.sqlsubmit.SqlSubmit.main(SqlSubmit.java:34)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498)
>  at 
> org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)
> at 
> org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)
>  at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114)
>  at 
> org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:812)
>  at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:246)
>  at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1054)
>  at 
> org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1132)
>  at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:422)
>  at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
>  at 
> org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>  at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1132)
>  Caused by: java.lang.ClassNotFoundException: org.json.JSONException
>  at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>  at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
>  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
>  at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
>  ... 25 more
>  
> *Second: after investigation, I found that this exclusion is added in the POM 
> of the flink-sql-connector-hive-1.2.2 module, but the other Hive connectors do 
> not have it.*
> !image-2021-07-31-10-40-07-070.png!
> *But I don't understand the remark there. Is this a problem?*
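For context on the "exclude" in the screenshot: Apache projects may not bundle org.json, because the JSON license is Category X under ASF licensing policy, which is presumably what the POM remark refers to. The "filter" being removed is a maven-shade-plugin exclusion of the kind sketched below; this is an assumption about its shape, and the exact artifact coordinates and comments in the real flink-sql-connector-hive POM may differ:

```xml
<!-- Sketch of a shade-plugin filter that keeps org/json/** out of the shaded
     jar. Removing such a filter (and repackaging) lets the classes back in,
     which matches what the reporter did above. Coordinates are illustrative. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <filters>
      <filter>
        <artifact>*:*</artifact>
        <excludes>
          <exclude>org/json/**</exclude>
        </excludes>
      </filter>
    </filters>
  </configuration>
</plugin>
```

Note that privately repackaging the jar with org.json included works as a local fix, but an Apache release cannot ship it that way; the upstream fix has to avoid the org.json dependency instead.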



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
