medivh511 commented on issue #2701:
URL: https://github.com/apache/paimon/issues/2701#issuecomment-2133798454
> @zhangjun0x01 I remember this feature is already supported. Is there any problem now?
In your commit description you wrote:
> When Debezium's data is written into Kafka, the primary key will be automatically stored in the key. When Paimon parses Kafka messages, the data in the key will be attached to the `pkNames` field in the value. There are some demos in the unit tests.
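For concreteness, here is a minimal sketch of what I understand that description to mean (the table and column names are hypothetical, not taken from the commit): if the Kafka record key carries the primary-key column, the key's field names end up in the value as `pkNames`:

```json
{
  "key":   { "ID": "1001" },
  "value": {
    "before": null,
    "after":  { "ID": "1001", "NAME": "foo" },
    "op": "c",
    "pkNames": [ "ID" ]
  }
}
```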
If Debezium's key is used as the primary key (assuming the value follows the format of your debezium-data-1.txt), what is the expected format of the key?
My Oracle CDC to Kafka connector config is as follows:

```json
{
  "name": "test03",
  "config": {
    "connector.class": "io.debezium.connector.oracle.OracleConnector",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "topic.prefix": "test",
    "database.hostname": "...",
    "database.port": "1521",
    "database.user": "...",
    "database.password": "...",
    "database.dbname": "L2DB",
    "table.include.list": "FLINKUSER.ACT_DL",
    "schema.include.list": "FLINKUSER",
    "schema.history.internal.kafka.topic": "schema-changes.l2db",
    "snapshot.mode": "initial",
    "log.mining.strategy": "online_catalog",
    "database.history.store.only.captured.tables.ddl": "true",
    "database.tablename.case.insensitive": "false",
    "log.mining.continuous.mine": "true",
    "decimal.handling.mode": "string",
    "schema.history.internal.kafka.bootstrap.servers": "172.15.89.142:9092,172.15.89.181:9092,172.15.89.182:9092",
    "value.converter.schemas.enable": "true"
  }
}
```
Note the setting `"key.converter": "org.apache.kafka.connect.storage.StringConverter"`.
Can keys in this format be parsed, or is there a standard format that Paimon expects?
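For comparison, a sketch of what each converter would put in the record key for a hypothetical primary-key column `ID` (the column name is an assumption, not taken from my table): `JsonConverter` with schemas disabled serializes the key as structured JSON, while `StringConverter` emits the Connect Struct's `toString()` output, which is not valid JSON and so cannot be decomposed into named fields:

```text
JsonConverter key   : {"ID":"1001"}
StringConverter key : Struct{ID=1001}
```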