Hi, 
I gave it a try: flink-connector-kafka-3.2.0-1.19.jar needs to be replaced with flink-sql-connector-kafka-3.2.0-1.19.jar.

The download link is under the "SQL Client" column in the docs [1], and that jar does bundle OffsetResetStrategy.

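If you want to confirm that before retrying, here is a minimal sketch (the class name CheckKafkaClass is just illustrative) that checks whether the class can be loaded from the jar:

// CheckKafkaClass.java: a minimal sketch that verifies the Kafka client
// classes are bundled in the connector jar. Compile with javac, then run
// (Unix classpath separator):
//   java -cp flink-sql-connector-kafka-3.2.0-1.19.jar:. CheckKafkaClass
public class CheckKafkaClass {
    public static void main(String[] args) throws Exception {
        // Throws ClassNotFoundException (the same error the SQL client
        // reported) if the class is not on the classpath.
        Class.forName("org.apache.kafka.clients.consumer.OffsetResetStrategy");
        System.out.println("OffsetResetStrategy found");
    }
}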
Could you try again with that jar?

[1] 
https://nightlies.apache.org/flink/flink-docs-release-1.19/docs/connectors/table/kafka/#dependencies

--

    Best!
    Xuyang


On 2024-07-17 14:22:30, "冯奇" <ha.fen...@aisino.com> wrote:
>Flink 1.19, Hive 3.1.2
>I created the table with the new connector options:
>CREATE TABLE mykafka (name String, age Int) WITH (
> 'connector' = 'kafka',
> 'topic' = 'test',
> 'properties.bootstrap.servers' = 'localhost:9092',
> 'properties.group.id' = 'testGroup',
> 'scan.startup.mode' = 'earliest-offset',
> 'format' = 'csv'
>);
>For Kafka I put flink-connector-kafka-3.2.0-1.19.jar and flink-connector-base-1.19.0.jar into lib.
>Flink SQL> select * from mykafka;
>[ERROR] Could not execute SQL statement.
>Reason: java.lang.ClassNotFoundException:
>org.apache.kafka.clients.consumer.OffsetResetStrategy
>------------------------------------------------------------------
>From: Feng Jin <jinfeng1...@gmail.com>
>Sent: Tuesday, July 16, 2024 19:30
>To: "user-zh" <user-zh@flink.apache.org>
>Subject: Re: Re: Problem with using the Hive catalog
>The example above seems to use the options of the old Kafka connector.
>Please refer to the docs and use the new options:
>https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/hive/hive_catalog/#step-4-create-a-kafka-table-with-flink-sql-ddl
>You also need to put the Kafka connector [1] into the lib directory.
>[1]
>https://nightlies.apache.org/flink/flink-docs-release-1.19/docs/connectors/table/kafka/
>Best,
>Feng
>On Tue, Jul 16, 2024 at 2:11 PM Xuyang <xyzhong...@163.com> wrote:
>> Under the lib directory you need to put flink-sql-connector-hive-3.1.3; that jar is the one meant for SQL jobs.
>>
>> --
>>
>> Best!
>> Xuyang
>>
>> On 2024-07-16 13:40:23, "冯奇" <ha.fen...@aisino.com> wrote:
>> >I checked the docs and those jars are all in place. There is also a separately downloaded dependency jar, flink-sql-connector-hive-3.1.3; I'm not sure whether to use that one or the ones below:
>> >// Flink's Hive connector
>> >flink-connector-hive_2.12-1.19.1.jar
>> >// Hive dependencies
>> >hive-exec-3.1.0.jar
>> >// libfb303 is not packed into hive-exec in some versions, need to add it separately
>> >libfb303-0.9.3.jar
>> >// add antlr-runtime if you need to use hive dialect
>> >antlr-runtime-3.5.2.jar
>> >Jars under lib:
>> >antlr-runtime-3.5.2.jar
>> >flink-cdc-dist-3.0.0.jar
>> >flink-cdc-pipeline-connector-doris-3.1.0.jar
>> >flink-cdc-pipeline-connector-mysql-3.1.0.jar
>> >flink-cep-1.19.0.jar
>> >flink-connector-files-1.19.0.jar
>> >flink-connector-hive_2.12-1.19.0.jar
>> >flink-connector-jdbc-3.1.2-1.18.jar
>> >flink-connector-kafka-3.1.0-1.18.jar
>> >flink-csv-1.19.0.jar
>> >flink-dist-1.19.0.jar
>> >flink-json-1.19.0.jar
>> >flink-scala_2.12-1.19.0.jar
>> >flink-table-api-java-1.19.0.jar
>> >flink-table-api-java-uber-1.19.0.jar
>> >flink-table-common-1.19.0.jar
>> >flink-table-planner-loader-1.19.0.jar
>> >flink-table-runtime-1.19.0.jar
>> >hive-exec-3.1.2.jar
>> >libfb303-0.9.3.jar
>> >log4j-1.2-api-2.17.1.jar
>> >log4j-api-2.17.1.jar
>> >log4j-core-2.17.1.jar
>> >log4j-slf4j-impl-2.17.1.jar
>> >mysql-connector-java-8.0.28.jar
>> >paimon-flink-1.19-0.9-20240628.002224-23.jar
>> >------------------------------------------------------------------
>> >From: Xuyang <xyzhong...@163.com>
>> >Sent: Tuesday, July 16, 2024 11:43
>> >To: "user-zh" <user-zh@flink.apache.org>
>> >Subject: Re: Problem with using the Hive catalog
>> >Hi, could you check whether the Hive SQL connector dependency [1] has been put into the lib directory or added via ADD JAR?
>> >[1]
>> https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/hive/overview/
>> >
>> >--
>> > Best!
>> > Xuyang
>> >At 2024-07-15 17:09:45, "冯奇" <ha.fen...@aisino.com> wrote:
>> >>Flink SQL> USE CATALOG myhive;
>> >>Flink SQL> CREATE TABLE mykafka (name String, age Int) WITH (
>> >> 'connector.type' = 'kafka',
>> >> 'connector.version' = 'universal',
>> >> 'connector.topic' = 'hive_sink',
>> >> 'connector.properties.bootstrap.servers' = '10.0.15.242:9092',
>> >> 'format.type' = 'csv',
>> >> 'update-mode' = 'append'
>> >>);
>> >>It reports the following error:
>> >>[ERROR] Could not execute SQL statement. Reason:
>> >>org.apache.flink.table.factories.NoMatchingTableFactoryException:
>> >>Could not find a suitable table factory for
>> >>'org.apache.flink.table.factories.TableSourceFactory' in the classpath.
>> >>Reason: Required context properties mismatch.
>> >>The matching candidates:
>> >>org.apache.flink.table.sources.CsvAppendTableSourceFactory
>> >>Mismatched properties:
>> >>'connector.type' expects 'filesystem', but is 'kafka'
>> >>The following properties are requested:
>> >>connector.properties.bootstrap.servers=10.0.15.242:9092
>> >>connector.topic=hive_sink
>> >>connector.type=kafka
>> >>connector.version=universal
>> >>format.type=csv
>> >>schema.0.data-type=VARCHAR(2147483647)
>> >>schema.0.name=name
>> >>schema.1.data-type=INT
>> >>schema.1.name=age
>> >>update-mode=append
>> >>The following factories have been considered:
>> >>org.apache.flink.table.sources.CsvBatchTableSourceFactory
>> >>org.apache.flink.table.sources.CsvAppendTableSourceFactory
>>
