[
https://issues.apache.org/jira/browse/FLINK-10599?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16663686#comment-16663686
]
ASF GitHub Bot commented on FLINK-10599:
----------------------------------------
pnowojski commented on a change in pull request #6889:
[FLINK-10599][Documentation] Provide documentation for the modern kafka
connector
URL: https://github.com/apache/flink/pull/6889#discussion_r228153790
##########
File path: docs/dev/connectors/kafka.md
##########
@@ -100,6 +108,30 @@ Note that the streaming connectors are currently not part of the binary distribution
* Follow the instructions from [Kafka's
quickstart](https://kafka.apache.org/documentation.html#quickstart) to download
the code and launch a server (launching a Zookeeper and a Kafka server is
required every time before starting the application).
* If the Kafka and Zookeeper servers are running on a remote machine, then the
`advertised.host.name` setting in the `config/server.properties` file must be
set to the machine's IP address.
+## Modern Kafka Connector
+
+Starting with Flink 1.7, there is a new Kafka connector that does not track a
specific Kafka major version. Rather, it tracks the latest version of Kafka at
the time of the Flink release.
+
+If your Kafka broker version is 1.0.0 or newer, you should use this Kafka
connector. If you use an older version of Kafka (0.11, 0.10, 0.9, or 0.8), you
should use the connector corresponding to the broker version.
+
+### Compatibility
+
+The modern Kafka connector is compatible with older and newer Kafka brokers
through the compatibility guarantees of the Kafka client API and broker. The
modern Kafka client is compatible with broker versions 0.10.0 or later,
depending on the features used. For details on Kafka compatibility, please
refer to the [Kafka
documentation](https://kafka.apache.org/protocol.html#protocol_compatibility).
+
+### Usage
+
+To use the modern Kafka connector, add a dependency to it:
+
+{% highlight xml %}
+<dependency>
+ <groupId>org.apache.flink</groupId>
+ <artifactId>flink-connector-kafka{{ site.scala_version_suffix }}</artifactId>
+ <version>{{ site.version }}</version>
+</dependency>
+{% endhighlight %}
+
+Then instantiate the new source (`FlinkKafkaConsumer`) and sink
(`FlinkKafkaProducer`). The API is backward compatible with the older
Kafka connectors.
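
For illustration only (this block is not part of the PR diff), a minimal sketch of how the new classes could be wired together; the class name, topic names, and connection properties below are placeholder assumptions:

{% highlight java %}
// Hypothetical sketch: read from one Kafka topic and write to another
// using the version-agnostic ("modern") connector classes.
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;

public class ModernKafkaExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Placeholder connection settings.
        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "localhost:9092");
        properties.setProperty("group.id", "test");

        // Source: note the class name carries no Kafka version suffix.
        DataStream<String> stream = env.addSource(
            new FlinkKafkaConsumer<>("input-topic", new SimpleStringSchema(), properties));

        // Sink: writes the consumed records to another topic.
        stream.addSink(
            new FlinkKafkaProducer<>("output-topic", new SimpleStringSchema(), properties));

        env.execute("Modern Kafka connector example");
    }
}
{% endhighlight %}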
+
## Kafka Consumer
Flink's Kafka consumer is called `FlinkKafkaConsumer08` (or `09` for Kafka
0.9.0.x versions, etc.). It provides access to one or more Kafka topics.
Review comment:
Should this line also be adjusted? I think we should also mention here that
the "modern" connector breaks the naming convention.
The same applies to the similar sentence later for the producer:
```
Flink's Kafka Producer is called FlinkKafkaProducer011 (or 010 for Kafka
0.10.0.x versions, etc.). It allows writing a stream of records to one or more
Kafka topics.
```
Also, in the `Kafka Producers and Fault Tolerance` section, rename the title
`Kafka 0.11` to `Kafka 0.11 and newer`, and adapt the content of that section
(the name `FlinkKafkaProducer011` is used there frequently and with the modern
connector it should also be adjusted).
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
> Provide documentation for the modern kafka connector
> ----------------------------------------------------
>
> Key: FLINK-10599
> URL: https://issues.apache.org/jira/browse/FLINK-10599
> Project: Flink
> Issue Type: Sub-task
> Components: Documentation, Kafka Connector
> Reporter: vinoyang
> Assignee: vinoyang
> Priority: Major
> Labels: pull-request-available
> Fix For: 1.7.0
>
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)