804e opened a new issue, #90:
URL: https://github.com/apache/doris-kafka-connector/issues/90
doris-kafka-connector 25.0.0
doris 3.0.3
kafka connect 2.8.0
connect-standalone.properties configuration:
```
# These are defaults. This file just demonstrates how to override some settings.
bootstrap.servers=10.0.0.116:9092,10.0.0.117:9092,10.0.0.31:9092
# The converters specify the format of data in Kafka and how to translate it into Connect data. Every Connect user will
# need to configure these based on the format they want their data in when loaded from or stored into Kafka
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Converter-specific settings can be passed in by prefixing the Converter's setting with the converter we want to apply
# it to
key.converter.schemas.enable=true
value.converter.schemas.enable=true
offset.storage.file.filename=/tmp/connect.offsets
# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000
# Set to a list of filesystem paths separated by commas (,) to enable class loading isolation for plugins
# (connectors, converters, transformations). The list should consist of top level directories that include
# any combination of:
# a) directories immediately containing jars with plugins and their dependencies
# b) uber-jars with plugins and their dependencies
# c) directories immediately containing the package directory structure of classes of plugins and their dependencies
# Note: symlinks will be followed to discover dependencies or plugins.
# Examples:
#plugin.path=/usr/local/share/java,/usr/local/share/kafka/plugins,/opt/connectors,
plugin.path=/opt/kafka_connect_2.13-2.8.0/plugins
# It is recommended to raise Kafka's max.poll.interval.ms to 30 minutes or more (the default is 5 minutes)
# to avoid Stream Load consumption timing out and the consumer being kicked out of the consumer group
max.poll.interval.ms=1800000
consumer.max.poll.interval.ms=1800000
```
doris-connector-sink.properties configuration:
```
name=doris3-sink
connector.class=org.apache.doris.kafka.connector.DorisSinkConnector
topics=topic_test
doris.topic2table.map=topic_test:test_kafka_tbl
doris.urls=10.0.0.141
doris.http.port=8030
doris.query.port=9030
doris.user=root
doris.password=password
doris.database=jfdb_log
buffer.count.records=10000
buffer.flush.time=120
buffer.size.bytes=5000000
enable.combine.flush=true
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
sink.properties.Expect=100-continue
```
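For reference, the table can also be exercised with a direct Stream Load call using the connection settings above, which helps tell whether the `Expect: 100-continue` header is accepted outside of Kafka Connect. This is a minimal sketch: the data file name `one_row.json` and the label are hypothetical, and `--location-trusted` keeps the credentials on the FE-to-BE redirect.
```
# Minimal direct Stream Load against the FE from the sink config above.
# curl sends "Expect: 100-continue" by default for uploads; the header
# below just makes that explicit. -T issues a PUT with the file body.
curl --location-trusted -u root:password \
  -H "Expect:100-continue" \
  -H "format: json" \
  -H "label: manual_test_$(date +%s)" \
  -T one_row.json \
  http://10.0.0.141:8030/api/jfdb_log/test_kafka_tbl/_stream_load
```
If this direct call succeeds but the connector still fails, the problem is more likely in the path the connector's requests take (for example a proxy stripping the header) than in the Doris endpoint itself.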
Connect reports an error on startup:
```
[2025-12-09 10:03:25,828] INFO kafka doris sink task closed with [topic_test-1, topic_test-0, topic_test-2] (org.apache.doris.kafka.connector.DorisSinkTask:91)
[2025-12-09 10:03:25,828] ERROR WorkerSinkTask{id=doris3-sink-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:184)
org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
	at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:609)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:329)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:232)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:201)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:182)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:231)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.doris.kafka.connector.exception.DorisException: org.apache.doris.kafka.connector.exception.StreamLoadException: failed to stream load data with label: topic_test__KC_-1__KC_jfdb_log_test_kafka_tbl__KC_0__KC_940
	at org.apache.doris.kafka.connector.writer.load.AsyncDorisStreamLoad.checkFlushException(AsyncDorisStreamLoad.java:140)
	at org.apache.doris.kafka.connector.writer.load.AsyncDorisStreamLoad.checkException(AsyncDorisStreamLoad.java:156)
	at org.apache.doris.kafka.connector.writer.AsyncStreamLoadWriter.checkFlushException(AsyncStreamLoadWriter.java:121)
	at org.apache.doris.kafka.connector.writer.AsyncStreamLoadWriter.insert(AsyncStreamLoadWriter.java:71)
	at org.apache.doris.kafka.connector.service.DorisCombinedSinkService.insert(DorisCombinedSinkService.java:110)
	at org.apache.doris.kafka.connector.service.DorisCombinedSinkService.insert(DorisCombinedSinkService.java:98)
	at org.apache.doris.kafka.connector.DorisSinkTask.put(DorisSinkTask.java:103)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
	... 10 more
Caused by: org.apache.doris.kafka.connector.exception.StreamLoadException: failed to stream load data with label: topic_test__KC_-1__KC_jfdb_log_test_kafka_tbl__KC_0__KC_940
	at org.apache.doris.kafka.connector.writer.load.AsyncDorisStreamLoad$LoadAsyncExecutor.load(AsyncDorisStreamLoad.java:247)
	at org.apache.doris.kafka.connector.writer.load.AsyncDorisStreamLoad$LoadAsyncExecutor.run(AsyncDorisStreamLoad.java:175)
	... 3 more
Caused by: org.apache.doris.kafka.connector.exception.StreamLoadException: response error : {"status":"FAILED","msg":"There is no 100-continue header"}
	at org.apache.doris.kafka.connector.writer.load.AsyncDorisStreamLoad$LoadAsyncExecutor.load(AsyncDorisStreamLoad.java:223)
	... 4 more
```
From the connector code it looks like the relevant header is already being set, and even after adding `sink.properties.Expect=100-continue` to the connector configuration I still get this error. Could you please take a look at what is going on?
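One way to confirm whether the header actually leaves the Connect host is to capture the outgoing HTTP request. This is a sketch, assuming tcpdump is available on that host and the FE/BE traffic is plain HTTP; if a proxy or load balancer sits in between, the same capture on the BE side shows whether the header survives the hop.
```
# Print HTTP traffic to the Doris FE as ASCII (-A), full packets (-s 0),
# line-buffered (-l) so the grep for the Expect header shows up live.
tcpdump -l -A -s 0 -i any host 10.0.0.141 and port 8030 | grep -i "expect"
```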