[
https://issues.apache.org/jira/browse/SPARK-25075?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17350073#comment-17350073
]
Dongjoon Hyun commented on SPARK-25075:
---------------------------------------
This sounds like two separate issues.
- For the deprecation policy of Apache Kafka 3.0, it sounds reasonable to me
(as a user).
- For the relation with Apache Spark 3.2 and Apache Kafka 3.0, let's see after
Apache Kafka 3.0 is released.
You know, I'm a big fan of Kafka, and I usually catch up with new Apache Kafka
releases to bring the latest client-side bug fixes to Apache Spark.
However, the following is the current as-is status that the Apache Spark
community is trying to stabilize:
- The latest Apache Spark release (3.1.1) shipped with Apache Kafka 2.6.0.
The Apache Spark 3.1.2 RC1 vote starts in a few days.
- The Apache Spark master branch has been using Apache Kafka 2.8.
> Build and test Spark against Scala 2.13
> ---------------------------------------
>
> Key: SPARK-25075
> URL: https://issues.apache.org/jira/browse/SPARK-25075
> Project: Spark
> Issue Type: Umbrella
> Components: Build, MLlib, Project Infra, Spark Core, SQL
> Affects Versions: 3.0.0
> Reporter: Guillaume Massé
> Priority: Major
>
> This umbrella JIRA tracks the requirements for building and testing Spark
> against the current Scala 2.13 milestone.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)