Thanks Ayush for driving this! Good to know that Hive is getting ready for a newer JDK. In my opinion, if we have more community energy to put into it, we can support both JDK-11 and JDK-17, like Spark [1]. If we have to make a choice between JDK-11 and JDK-17, I would choose the newer JDK-17; meanwhile, we should maintain compatibility with JDK-8, as it is still widely used on most big data platforms.
Thanks,
Butao Zhang

[1] https://issues.apache.org/jira/browse/SPARK-33772

---- Replied Message ----
| From    | Ayush Saxena <ayush...@gmail.com> |
| Date    | 5/31/2023 18:39 |
| To      | dev <dev@hive.apache.org> |
| Subject | Move to JDK-11 |

Hi Everyone,

I want to pull in the attention of folks towards moving to JDK-11 compile-time support in Hive. There was a ticket in the past [1] which talks about it, and if I decoded it right, it was blocked because the Hadoop version used by Hive didn't have JDK-11 runtime support. But with [2] in, we have upgraded the Hadoop version, so that problem is sorted out. I couldn't see any unresolved tickets in the blocked state either.

I quickly tried*

mvn clean install -DskipTests -Piceberg -Pitests -Dmaven.javadoc.skip=true

and, no surprise, it failed with some weird exceptions towards the end. But I think that should be solvable.

So, questions:
- What do folks think about this? Should we put in some effort towards JDK-11?
- Should we support both JDK-11 & JDK-8?
- Ditch JDK-11 and directly shoot for JDK-17?

Let me know your thoughts. In case anyone has experience in this area and has tried something in this context, feel free to share, or maybe if someone has a potential action plan.

-Ayush

[1] https://issues.apache.org/jira/browse/HIVE-22415
[2] https://issues.apache.org/jira/browse/HIVE-24484

* changed the maven.compiler.source & maven.compiler.target to 11
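
For anyone who wants to reproduce the experiment behind the asterisk above, a minimal sketch of the change, assuming the standard maven-compiler-plugin properties in the root pom.xml (the exact layout of Hive's pom may differ):

    <properties>
      <!-- bump the compile level from 8 to 11 for the experiment -->
      <maven.compiler.source>11</maven.compiler.source>
      <maven.compiler.target>11</maven.compiler.target>
    </properties>

If the pom reads the plugin's source/target from these standard properties (rather than hard-coding them), the same thing should also be achievable without editing the pom, by appending -Dmaven.compiler.source=11 -Dmaven.compiler.target=11 to the mvn command quoted above.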