Thank you all for your replies.

1. Thank you, Jia, for those JIRAs.

2. "Scala 2.13 for Spark 4.0" sounds great. I'll initiate a new thread for that.
  - "I wonder if it’s safer to do it in Spark 4 (which I believe will be discussed soon)."
  - "I would make it the default at 4.0, myself."
  - "Shall we initiate a new discussion thread for Scala 2.13 by default?"

3. Thanks. Did you try the pre-built distributions, Mich? (A quick way to verify them is sketched after the links below.)
  - "I spent a day compiling Spark 3.4.0 code against Scala 2.13.8 with maven"

  - https://downloads.apache.org/spark/spark-3.2.4/spark-3.2.4-bin-hadoop3.2-scala2.13.tgz
  - https://downloads.apache.org/spark/spark-3.3.2/spark-3.3.2-bin-hadoop3-scala2.13.tgz
  - https://downloads.apache.org/spark/spark-3.4.0/spark-3.4.0-bin-hadoop3-scala2.13.tgz
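
(As a sanity check, the -scala2.13 tarballs above can be verified without
rebuilding anything. The sketch below goes through PySpark's internal
_jvm gateway, so treat it as a rough interactive check rather than a
stable API; spark-submit --version should report the same "Using Scala
version ..." line in its banner.)

    from pyspark.sql import SparkSession

    # Rough check: print the Scala version this distribution was built
    # against; a -scala2.13 tarball should report something like 2.13.8.
    # _jvm is PySpark's internal py4j gateway, not a public API.
    spark = (SparkSession.builder
             .master("local[1]")
             .appName("scala-version-check")
             .getOrCreate())
    print(spark.sparkContext._jvm.scala.util.Properties.versionNumberString())
    spark.stop()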

4. Good suggestion, Bjorn. Instead of replacing 3.9, we had better add daily Python 3.11 jobs (like the existing Java 17 ones), because Apache Spark 3.4 already added Python 3.11 support via SPARK-41454. (A minimal smoke-test sketch follows the quotes below.)
- "First, we are currently conducting tests with Python versions 3.8 and 3.9."
- "Should we consider replacing 3.9 with 3.11?"

5. For Guava, I'm also tracking the ongoing discussion.

Thanks,
Dongjoon.
