Re: Spark on Yarn with Java 17

2023-12-10 Thread Jason Xu
m/LucaCanali/Miscellaneous/blob/master/Spark_Notes/Spark_Set_Java_Home_Howto.md Best, Luca From: Dongjoon Hyun Sent: Saturday, December 9, 2023 09:39 To: Jason Xu Cc: dev@spark.apache.org Subject: Re: Spark on Yarn with Jav

RE: Spark on Yarn with Java 17

2023-12-09 Thread Luca Canali
/Spark_Set_Java_Home_Howto.md Best, Luca From: Dongjoon Hyun Sent: Saturday, December 9, 2023 09:39 To: Jason Xu Cc: dev@spark.apache.org Subject: Re: Spark on Yarn with Java 17 Please try Apache Spark 3.3+ (SPARK-33772) with Java 17 on your cluster simply, Jason. I believe you can set up for your Spark
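Luca's linked note covers pointing a Spark application at a JDK other than the cluster default. As a minimal sketch (the Java 17 install path and application class below are placeholders, not details from the thread), the per-application JAVA_HOME can be set with the standard `spark.yarn.appMasterEnv.*` and `spark.executorEnv.*` properties:

```shell
# Sketch: submit a Spark 3.3+ job that runs under Java 17 on a YARN cluster
# whose daemons still run Java 8. The JDK path is an example placeholder and
# must point at a Java 17 install present on every NodeManager host.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.appMasterEnv.JAVA_HOME=/usr/lib/jvm/java-17-openjdk \
  --conf spark.executorEnv.JAVA_HOME=/usr/lib/jvm/java-17-openjdk \
  --class com.example.MyApp \
  myapp.jar
```

With this setup only the application master and executor JVMs launch under Java 17; the Hadoop daemons (NameNode/DataNode/ResourceManager/NodeManager) continue running on Java 8, as Dongjoon describes below.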

Re: Spark on Yarn with Java 17

2023-12-09 Thread Dongjoon Hyun
Please try Apache Spark 3.3+ (SPARK-33772) with Java 17 on your cluster simply, Jason. I believe you can set up for your Spark 3.3+ jobs to run with Java 17 while your cluster(DataNode/NameNode/ResourceManager/NodeManager) is still sitting on Java 8. Dongjoon. On Fri, Dec 8, 2023 at 11:12 PM

Re: Spark on Yarn with Java 17

2023-12-08 Thread Jason Xu
Dongjoon, thank you for the fast response! Apache Spark 4.0.0 depends on only Apache Hadoop client library. To better understand your answer, does that mean a Spark application built with Java 17 can successfully run on a Hadoop cluster on version 3.3 and Java 8 runtime? On Fri, Dec 8, 2023 at

Re: Spark on Yarn with Java 17

2023-12-08 Thread Dongjoon Hyun
Hi, Jason. Apache Spark 4.0.0 depends on only Apache Hadoop client library. You can track all `Apache Spark 4` activities including Hadoop dependency here. https://issues.apache.org/jira/browse/SPARK-44111 (Prepare Apache Spark 4.0.0) According to the release history, the original suggested

Spark on Yarn with Java 17

2023-12-08 Thread Jason Xu
Hi Spark devs, According to the Spark 3.5 release notes, Spark 4 will no longer support Java 8 and 11 (link ). My company is using Spark on Yarn with Java 8 now. When considering a future upgrade to Spark 4, one issue