Dear Community,

 

From what I understand, Spark uses a variation of Semantic Versioning [1],
but this information is not enough for me to determine whether versions are
compatible with each other.

 

For example, if my cluster is running Spark 2.3.1, can I develop against API
additions introduced in Spark 2.4 (higher-order functions, for example)? What
about the other way around?
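
For concreteness, here is a minimal sketch of the kind of 2.4-only call I
have in mind (transform was added as a SQL higher-order function in 2.4; the
object and app names below are just placeholders):

    import org.apache.spark.sql.SparkSession

    object HigherOrderCheck {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("higher-order-functions-check")
          .getOrCreate()

        // transform(array, lambda) maps the lambda over each array element.
        // On a Spark 2.3.1 cluster I would expect this to fail with an
        // "undefined function" error, since transform does not exist there.
        spark.sql("SELECT transform(array(1, 2, 3), x -> x + 1) AS plus_one").show()

        spark.stop()
      }
    }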

 

Typically, I assume that a job written for Spark 1.x will fail on Spark 2.x,
but that is also something I would like to have confirmed.

 

Thank you for your help!

 

[1] https://spark.apache.org/versioning-policy.html 
