Github user stanzhai commented on the issue:
https://github.com/apache/spark/pull/21663
@jerryshao My Spark application is built with JDK 10, but the
standalone cluster manager runs on JDK 8, which cannot run classes compiled for JDK 10.
Java 7 support was removed in Spark 2.2.
I've verified that messages serialized by JDK 10 executors can be read
by a JDK 8 worker.
Aside from that, I think we should make the spark.executorEnv.JAVA_HOME
configuration actually take effect, and leave the responsibility for using it
correctly to the user.
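For context, this is roughly how a user would point executors at a different
JDK through that setting (a sketch only; the master URL, JDK path, and jar
name below are placeholders, not values from this PR):

    spark-submit \
      --master spark://master-host:7077 \
      --conf spark.executorEnv.JAVA_HOME=/opt/jdk-10 \
      my-app.jar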