andygrove commented on PR #2514:
URL: https://github.com/apache/datafusion-comet/pull/2514#issuecomment-3361591859
> The jars for Spark 3.x will work with JDK 11.
>
> The pom file already specifies
>
> ```
> <java.version>11</java.version>
> <maven.compiler.source>${java.version}</maven.compiler.source>
> <maven.compiler.target>${java.version}</maven.compiler.target>
> ```
The `java.version` gets overridden by the JDK profiles though:
```xml
<profile>
  <id>jdk17</id>
  <activation>
    <jdk>17</jdk>
  </activation>
  <properties>
    <java.version>17</java.version>
    <maven.compiler.source>${java.version}</maven.compiler.source>
    <maven.compiler.target>${java.version}</maven.compiler.target>
  </properties>
</profile>
```
So if I build with JDK 17, I get JDK 17 class files. If I then try to run with
JDK 11, I get:
```
java.lang.UnsupportedClassVersionError:
org/apache/spark/sql/comet/execution/shuffle/CometBypassMergeSortShuffleWriter
has been compiled by a more recent version of the Java Runtime (class file
version 61.0), this version of the Java Runtime only recognizes class file
versions up to 55.0
```
I think we need to override `maven.compiler.target` for each Spark version,
setting it to the minimum JDK version that Spark version supports.
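As a rough sketch of what that could look like, assuming Spark-version profiles exist in the pom (the profile ids below are hypothetical, not taken from the actual build):

```xml
<!-- Hypothetical sketch: pin the compiler target per Spark profile
     so the built jars run on that Spark version's minimum JDK. -->
<profile>
  <id>spark-3.x</id>
  <properties>
    <!-- Spark 3.x runs on JDK 11, so target 11 even when building with JDK 17 -->
    <java.version>11</java.version>
    <maven.compiler.source>${java.version}</maven.compiler.source>
    <maven.compiler.target>${java.version}</maven.compiler.target>
  </properties>
</profile>
<profile>
  <id>spark-4.x</id>
  <properties>
    <!-- Spark 4.x requires JDK 17+, so targeting 17 is fine here -->
    <java.version>17</java.version>
    <maven.compiler.source>${java.version}</maven.compiler.source>
    <maven.compiler.target>${java.version}</maven.compiler.target>
  </properties>
</profile>
```

Using `<maven.compiler.release>` instead of source/target would go one step further: javac's `--release` flag also rejects accidental use of newer-JDK APIs at compile time, whereas `-target 11` under JDK 17 only controls the class file version.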
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]