On Tue, Aug 27, 2019 at 12:01 PM DB Tsai <d_t...@apple.com.invalid> wrote:
> Hello everyone,
>
> Thank you all for working on supporting JDK 11 in Apache Spark 3.0 as a
> community.
>
> Java 8 has already reached end of life for commercial users, and many
> companies are moving to Java 11. The release date for Apache Spark 3.0 is
> still not set, and there are many API incompatibility issues when
> upgrading from Spark 2.x. As a result, asking users to move to Spark 3.0
> just to use JDK 11 is not realistic.
>
> Should we backport PRs for JDK 11 and cut a release in 2.x to support
> JDK 11?
>
> Should we cut a new Apache Spark 2.5, since the patches involve some
> dependency changes that are not desirable in a minor release?

I think an LTS 2.5 branch with JDK 11 would make sense given the state of
Java 8. I think backporting to 2.4 would not be ideal.

> Thanks,
>
> DB Tsai | Siri Open Source Technologies [not a contribution] | Apple, Inc
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org

--
Twitter: https://twitter.com/holdenkarau
Books (Learning Spark, High Performance Spark, etc.): https://amzn.to/2MaRAG9
YouTube Live Streams: https://www.youtube.com/user/holdenkarau