After adding debug logging, I confirmed that the dialect was set to hive when executing the DDL statement that creates the Hive table. The current error, judging by the stack trace, is thrown while executing the DML INSERT INTO statement: when resolving the Hive table it complains that no connector is configured.
Table options are:
'is_generic'='false'
'partition.time-extractor.timestamp-pattern'='$dt $hr'
'sink.partition-commit.delay'='0S'
'sink.partition-commit.policy.kind'='metastore,success-file'
'sink.partition-commit.trigger'='partition-time'
    at org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:164)
    at org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:344)
    at org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:204)
    at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$1.apply(PlannerBase.scala:163)
    at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$1.apply(PlannerBase.scala:163)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.Iterator$class.foreach(Iterator.scala:891)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
    at scala.collection.AbstractTraversable.map(Traversable.scala:104)
    at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:163)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1270)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:701)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:789)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:691)
    at com.cgws.ccp.flink.sql.submit.SqlSubmit.callInsertInto(SqlSubmit.java:242)
    at com.cgws.ccp.flink.sql.submit.SqlSubmit.callCommand(SqlSubmit.java:201)
    at com.cgws.ccp.flink.sql.submit.SqlSubmit.run(SqlSubmit.java:126)
    at com.cgws.ccp.flink.sql.submit.SqlSubmit.main(SqlSubmit.java:84)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288)
    ... 11 more
Caused by: org.apache.flink.table.api.ValidationException: Table options do not contain an option key 'connector' for discovering a connector.
    at org.apache.flink.table.factories.FactoryUtil.getDynamicTableFactory(FactoryUtil.java:321)
    at org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:157)
    ... 37 more


If I set the Hive dialect when executing the DML statement, then the Kafka table's DDL is not valid Hive syntax — how should this be handled?
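For what it's worth, the pattern in the Flink 1.12 Hive docs is to switch the dialect per statement within one script: use the Hive dialect only for the Hive DDL, then switch back to the default dialect for the Kafka DDL and the INSERT (the dialect only affects how each statement is parsed, not how the tables interoperate). A rough sketch along those lines — table names, columns, and connector options here are placeholders, assuming the submitter tool honors SET commands like the SQL client does:

```sql
-- Hive dialect only while creating the Hive table
SET table.sql-dialect=hive;
CREATE TABLE hive_sink (
  user_id STRING,
  order_amount DOUBLE
) PARTITIONED BY (dt STRING, hr STRING) STORED AS parquet TBLPROPERTIES (
  'partition.time-extractor.timestamp-pattern'='$dt $hr',
  'sink.partition-commit.trigger'='partition-time',
  'sink.partition-commit.delay'='0S',
  'sink.partition-commit.policy.kind'='metastore,success-file'
);

-- back to the default dialect for the Kafka source and the INSERT
SET table.sql-dialect=default;
CREATE TABLE kafka_source (
  user_id STRING,
  order_amount DOUBLE,
  log_ts TIMESTAMP(3),
  WATERMARK FOR log_ts AS log_ts - INTERVAL '5' SECOND
) WITH (
  'connector' = 'kafka',
  'topic' = 'orders',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'json'
);

INSERT INTO hive_sink
SELECT user_id, order_amount,
       DATE_FORMAT(log_ts, 'yyyy-MM-dd'), DATE_FORMAT(log_ts, 'HH')
FROM kafka_source;
```

If the custom SqlSubmit tool uses the Table API directly, the equivalent switch is `tableEnv.getConfig().setSqlDialect(SqlDialect.HIVE)` before the Hive DDL and `setSqlDialect(SqlDialect.DEFAULT)` before the rest.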

On 2021-02-22 17:12:55, "eriendeng" <eriend...@tencent.com> wrote:
>It looks like you did not set the dialect to hive, so execution took the else branch. The default
>dialect requires a 'connector' option to be specified. See the Kafka-to-Hive example in the docs:
>https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/table/connectors/hive/hive_read_write.html#writing
>
>
>
