Generally speaking, to avoid potential issues, the versions used at compile 
time and at runtime should be the same (the Scala version is the most 
important), but thanks to Spark's backward compatibility, the minor versions 
can differ.
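
For what it's worth, here is a minimal sketch of how the pieces usually fit 
together. The image tag, master URL, and job endpoint below are assumptions 
for illustration, not something verified against your cluster; the main point 
is that the job server version should track the Beam SDK version:

    # Start the job server at the same version as the Beam SDK (2.35.0 here),
    # pointed at an assumed Spark master (shell command shown as a comment):
    #
    #   docker run --net=host apache/beam_spark_job_server:2.35.0 \
    #       --spark-master-url=spark://localhost:7077

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=PortableRunner",
        "--job_endpoint=localhost:8099",  # default job server endpoint
        "--environment_type=LOOPBACK",    # simplest choice for local testing
    ])

    # Trivial pipeline just to exercise the submission path.
    with beam.Pipeline(options=options) as pipeline:
        (pipeline
         | beam.Create(["check", "your", "versions"])
         | beam.Map(print))

Keeping the job server tag in lockstep with the SDK release avoids the 
compile-time/runtime mismatch described above; the Spark cluster's minor 
version can then differ, per Spark's backward compatibility.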

> On 5 Jan 2022, at 07:50, Zheng Ni <nizheng1...@gmail.com> wrote:
> 
> Hi Beam Community,
>  
> Greetings.
>  
> I am interested in submitting a Spark job through the portable runner. I have a 
> question about the compatibility between the spark_job_server and the Spark cluster.
>  
> Let’s say I am going to use beam_spark_job_server version 2.35.0. How 
> could I know which Spark cluster version is compatible with it? Or could it 
> work with any version of the Spark cluster?
>  
> Regards,
> Zheng 
> 
