I've encountered several issues with JDK 11, which prompted me to remove it in the PR mentioned by Kevin:
1. We're stuck on ORC 1.9.x, which had a CVE [1] and a low release cadence.
2. The upcoming Spark 4.1 can no longer target JDK 11 [2].
3. The upgrade to datafusion-comet 0.11.0 failed [3], even though it targets JDK 11.

Hence, I also support dropping Java 11, so we don't need workarounds here and there.

> We will still have 3 LTS releases (17, 21, 25) after dropping Java 11.

I don't think we can add JDK 25 until Spark, Flink, and other dependencies support it.

> What does that make the minimum supported Spark version?

That will be Spark 3.4, or Spark 3.5 if we drop 3.4 in 1.11 as well.

[1] https://github.com/apache/iceberg/issues/14391
[2] https://github.com/apache/iceberg/pull/14155/commits/53bc376e5bf71a8f802c28186de943aff01d27bc#diff-5392a130b5f4f17e365379befee19dd4105817da777df9b8699b5e5704ce4d68R54
[3] https://github.com/apache/iceberg/pull/14591

Regards,
Manu

On Fri, Nov 21, 2025 at 5:00 AM Kevin Liu <[email protected]> wrote:

> Thanks for starting the convo, JB.
>
> I'm in favor of dropping Java 11 support.
> I see Manu has started a draft PR to remove Java 11 [1]. This gives a good
> overview of the current places where Java 11 is used.
>
> Depending on the scope of the work, I think we can also target the next
> Iceberg release (1.11).
>
> Best,
> Kevin Liu
>
> [1] https://github.com/apache/iceberg/pull/14400/files
>
> On Thu, Nov 20, 2025 at 12:28 PM Steve Loughran <[email protected]>
> wrote:
>
>> JDK 25 is fairly traumatic security-API-wise; not of direct relevance to
>> Iceberg, AFAIK.
>>
>> With a minimum of Java 17, what does that make the minimum supported
>> Spark version (i.e., what version of Spark supports Java 17)?
>>
>> On Thu, 20 Nov 2025 at 06:51, Eduard Tudenhöfner <
>> [email protected]> wrote:
>>
>>> I would also be in favor of moving to JDK 17, but we need to check what
>>> the implications are.
>>>
>>> On Thu, Nov 20, 2025 at 5:36 AM Steven Wu <[email protected]> wrote:
>>>
>>>> Yeah, the Flink benchmark shouldn't be a blocker, as the 1.20 module
>>>> itself can be built and run with Java 17.
>>>>
>>>> I am in favor of dropping Java 11 support. We can probably also add
>>>> Java 25 to the CI build after dropping Java 11, as JDK 25 (LTS) was
>>>> released in September 2025. We will still have 3 LTS releases (17, 21,
>>>> 25) after dropping Java 11.
>>>>
>>>> I tend to be a bit more aggressive in dropping old versions. Let's see
>>>> what others think.
>>>>
>>>> On Wed, Nov 19, 2025 at 10:52 AM Jean-Baptiste Onofré <[email protected]>
>>>> wrote:
>>>>
>>>>> Hi everyone,
>>>>>
>>>>> I worked on the Gradle 9.x upgrade for Iceberg. Gradle 9.2.x requires
>>>>> JDK 17 at minimum.
>>>>>
>>>>> I did a quick pass over the Iceberg modules, and all of them support
>>>>> JDK 17.
>>>>>
>>>>> There is a known issue with JDK 17 in the Flink 1.20 module for a
>>>>> specific benchmark. The comment in
>>>>> flink/v1.20/flink/src/jmh/java/org/apache/iceberg/flink/sink/shuffle/StatisticsRecordSerializerBenchmark.java
>>>>> says: "This benchmark in 1.20 only works with Java 11, probably due
>>>>> to usage of ArraysAsListSerializer in FlinkChillPackageRegistrar.
>>>>> Flink 2.0 and above switched to
>>>>> DefaultSerializers#ArraysAsListSerializer in Kryo 5.6. Using Java 17
>>>>> would result in the following error..." This affects only that JMH
>>>>> benchmark, not the entire Flink 1.20 module. The module can still be
>>>>> built and run with JDK 17; the benchmark has a runtime issue due to
>>>>> Java module access restrictions.
>>>>> I think we can live with that while waiting to remove Flink 1.20 in
>>>>> the future.
>>>>>
>>>>> Given this, I would like to start a discussion on making JDK 17 the
>>>>> minimum in Iceberg.
>>>>>
>>>>> Thoughts?
>>>>>
>>>>> NB: if we have a consensus, I would be happy to start an
>>>>> update/cleanup PR and prepare the next "major" release with JDK 17 as
>>>>> the minimum.
>>>>>
>>>>> Regards
>>>>> JB
>>>>
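
For context on the failure mode JB quotes: Chill's ArraysAsListSerializer
reflectively reads the private backing-array field of
java.util.Arrays$ArrayList, and starting with JDK 17 the module system
enforces strong encapsulation of java.base, so that reflective access is
rejected at runtime. The following is a minimal, self-contained sketch of
the same access pattern; the demo class is illustrative only, not code from
the Iceberg or Chill trees.

    import java.lang.reflect.Field;
    import java.lang.reflect.InaccessibleObjectException;
    import java.util.Arrays;
    import java.util.List;

    public class ArraysAsListAccessDemo {
        public static void main(String[] args) throws Exception {
            // Arrays.asList returns the private class java.util.Arrays$ArrayList,
            // whose backing array Chill's ArraysAsListSerializer reads via reflection.
            List<Integer> list = Arrays.asList(1, 2, 3);
            Field backingArray = list.getClass().getDeclaredField("a");
            try {
                // JDK 11: succeeds (with an illegal-access warning).
                // JDK 17: throws InaccessibleObjectException, because java.base
                // no longer opens java.util to the unnamed module by default.
                backingArray.setAccessible(true);
                System.out.println("Reflective access OK: "
                        + backingArray.get(list).getClass());
            } catch (InaccessibleObjectException e) {
                System.out.println("Blocked by the module system: " + e.getMessage());
            }
        }
    }

Running this on JDK 11 prints the success branch; on JDK 17 it hits the
catch block unless the JVM is started with
--add-opens java.base/java.util=ALL-UNNAMED, which is the usual workaround
for serializers that reflect into JDK-internal collections.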
