[ https://issues.apache.org/jira/browse/SPARK-25075?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17350084#comment-17350084 ]
Ismael Juma commented on SPARK-25075:
-------------------------------------
Yes, makes sense. When I mentioned Kafka 3.0 and Spark 3.2, I meant from the
point of view of users that want to use both together. I understand that Spark
itself will go with Kafka 2.8 for Spark 3.2 and evaluate the upgrade to Kafka
3.0 once it's available (expected sometime in July/August).
> Build and test Spark against Scala 2.13
> ---------------------------------------
>
> Key: SPARK-25075
> URL: https://issues.apache.org/jira/browse/SPARK-25075
> Project: Spark
> Issue Type: Umbrella
> Components: Build, MLlib, Project Infra, Spark Core, SQL
> Affects Versions: 3.0.0
> Reporter: Guillaume Massé
> Priority: Major
>
> This umbrella JIRA tracks the requirements for building and testing Spark
> against the current Scala 2.13 milestone.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)