GitHub user koeninger commented on a diff in the pull request:
https://github.com/apache/spark/pull/22703#discussion_r224899199
--- Diff: docs/streaming-kafka-0-10-integration.md ---
@@ -3,7 +3,11 @@ layout: global
title: Spark Streaming + Kafka Integration Guide (Kafka broker version 0.10.0 or higher)
---
-The Spark Streaming integration for Kafka 0.10 is similar in design to the 0.8 [Direct Stream approach](streaming-kafka-0-8-integration.html#approach-2-direct-approach-no-receivers). It provides simple parallelism, 1:1 correspondence between Kafka partitions and Spark partitions, and access to offsets and metadata. However, because the newer integration uses the [new Kafka consumer API](http://kafka.apache.org/documentation.html#newconsumerapi) instead of the simple API, there are notable differences in usage. This version of the integration is marked as experimental, so the API is potentially subject to change.
+The Spark Streaming integration for Kafka 0.10 provides simple parallelism, 1:1 correspondence between Kafka
+partitions and Spark partitions, and access to offsets and metadata. However, because the newer integration uses
+the [new Kafka consumer API](https://kafka.apache.org/documentation.html#newconsumerapi) instead of the simple API,
+there are notable differences in usage. This version of the integration is marked as experimental, so the API is
--- End diff ---
Do we want to leave the new integration marked as experimental if it is now
the only available one?
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]