This is an automated email from the ASF dual-hosted git repository.

xiangfu pushed a commit to branch kafka2.0_doc
in repository https://gitbox.apache.org/repos/asf/incubator-pinot.git
commit 24bed398b104492ebe58c402195a4fc067ca995e
Author: Xiang Fu <fx19880...@gmail.com>
AuthorDate: Sat Aug 3 16:22:18 2019 +0800

    Adding kafka 2.0 doc for using simple consumer
---
 docs/pluggable_streams.rst | 42 +++++++++++++++++++++++++++++++++++++++++-
 1 file changed, 41 insertions(+), 1 deletion(-)

diff --git a/docs/pluggable_streams.rst b/docs/pluggable_streams.rst
index ff27cde..8a90f59 100644
--- a/docs/pluggable_streams.rst
+++ b/docs/pluggable_streams.rst
@@ -177,10 +177,50 @@ Below is a sample `streamConfigs` used to create a realtime table with Kafka Str
     "stream.kafka.hlc.bootstrap.server": "localhost:19092"
   }
 
+Below is a sample table config used to create a realtime table with the Kafka Partition (Low) level consumer:
+
+.. code-block:: none
+
+  {
+    "tableName": "meetupRsvp",
+    "tableType": "REALTIME",
+    "segmentsConfig": {
+      "timeColumnName": "mtime",
+      "timeType": "MILLISECONDS",
+      "segmentPushType": "APPEND",
+      "segmentAssignmentStrategy": "BalanceNumSegmentAssignmentStrategy",
+      "schemaName": "meetupRsvp",
+      "replication": "1",
+      "replicasPerPartition": "1"
+    },
+    "tenants": {},
+    "tableIndexConfig": {
+      "loadMode": "MMAP",
+      "streamConfigs": {
+        "streamType": "kafka",
+        "stream.kafka.consumer.type": "simple",
+        "stream.kafka.topic.name": "meetupRSVPEvents",
+        "stream.kafka.decoder.class.name": "org.apache.pinot.core.realtime.impl.kafka.KafkaJSONMessageDecoder",
+        "stream.kafka.consumer.factory.class.name": "org.apache.pinot.core.realtime.impl.kafka2.KafkaConsumerFactory",
+        "stream.kafka.zk.broker.url": "localhost:2191/kafka",
+        "stream.kafka.broker.list": "localhost:19092"
+      }
+    },
+    "metadata": {
+      "customConfigs": {}
+    }
+  }
+
+Please note that:
+
+1. Config ``replicasPerPartition`` under ``segmentsConfig`` is required to specify table replication.
+2. Config ``stream.kafka.consumer.type`` should be specified as ``simple`` to use the partition level consumer.
+3. Configs ``stream.kafka.zk.broker.url`` and ``stream.kafka.broker.list`` are required under ``tableIndexConfig.streamConfigs`` to provide Kafka-related information.
+
 Upgrade from Kafka 0.9 connector to Kafka 2.x connector
 -------------------------------------------------------
 
-* Update table config:
+* Update table config for both high level and low level consumer:
+
+  Update config ``stream.kafka.consumer.factory.class.name`` from ``org.apache.pinot.core.realtime.impl.kafka.KafkaConsumerFactory`` to ``org.apache.pinot.core.realtime.impl.kafka2.KafkaConsumerFactory``.
 
 * If using Stream(High) level consumer:

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@pinot.apache.org
For additional commands, e-mail: commits-h...@pinot.apache.org
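The checklist the doc change adds (required ``replicasPerPartition``, ``simple`` consumer type, ZK/broker configs, and the ``kafka2`` factory class for the upgrade) can be sketched as a small validation script. This is a hypothetical helper for illustration, not part of Pinot; the config below abridges the sample table config in the diff to the fields being checked.

```python
import json

# Abridged copy of the sample table config from the diff above.
TABLE_CONFIG = json.loads("""
{
  "tableName": "meetupRsvp",
  "tableType": "REALTIME",
  "segmentsConfig": {"replicasPerPartition": "1"},
  "tableIndexConfig": {
    "streamConfigs": {
      "streamType": "kafka",
      "stream.kafka.consumer.type": "simple",
      "stream.kafka.consumer.factory.class.name":
        "org.apache.pinot.core.realtime.impl.kafka2.KafkaConsumerFactory",
      "stream.kafka.zk.broker.url": "localhost:2191/kafka",
      "stream.kafka.broker.list": "localhost:19092"
    }
  }
}
""")

def check_low_level_config(table_config):
    """Return a list of problems, per the notes in the doc change."""
    problems = []
    # Note 1: replicasPerPartition under segmentsConfig is required.
    if "replicasPerPartition" not in table_config.get("segmentsConfig", {}):
        problems.append("missing segmentsConfig.replicasPerPartition")
    streams = table_config.get("tableIndexConfig", {}).get("streamConfigs", {})
    # Note 2: consumer type must be "simple" for partition (low) level consumption.
    if streams.get("stream.kafka.consumer.type") != "simple":
        problems.append("stream.kafka.consumer.type should be 'simple'")
    # Note 3: both zk.broker.url and broker.list are required.
    for key in ("stream.kafka.zk.broker.url", "stream.kafka.broker.list"):
        if key not in streams:
            problems.append("missing " + key)
    # Upgrade note: factory class must point at the kafka2 package, not kafka (0.9).
    factory = streams.get("stream.kafka.consumer.factory.class.name", "")
    if ".kafka2." not in factory:
        problems.append("factory class still points at the Kafka 0.9 connector")
    return problems

print(check_low_level_config(TABLE_CONFIG))  # prints [] for the sample config
```

Running the same check against a config that still names the old ``...impl.kafka.KafkaConsumerFactory`` would flag the upgrade step described above.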