yes, backwards compatibility is important to keep in mind. In the past, we mentioned the *minimum* Spark/Hadoop versions SystemML/SystemDS required, while still being able to run with more recent versions.

So let's separate two different aspects here: (1) the minimum version compatible with our source code, and (2) the dependencies we reference for tests and build. Although Spark and Hadoop are generally very good in terms of backwards compatibility inside major versions, it would be good to keep (1) and (2) in sync to avoid hidden incompatibilities.
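To make (2) explicit, one common approach in a Maven build is to centralize the dependency versions as properties so they can be bumped in a single place. A minimal sketch (property names are illustrative, not necessarily the ones used in the SystemDS pom.xml):

```xml
<!-- Hypothetical pom.xml fragment: centralize Spark/Hadoop versions -->
<properties>
  <spark.version>3.3.1</spark.version>
  <hadoop.version>3.3.4</hadoop.version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
    <scope>provided</scope>
  </dependency>
</dependencies>
```

With this layout, the minimum supported versions documented for (1) and the versions built and tested against for (2) can be compared and updated together in one diff.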

Are there specific reasons for this update, e.g., API changes or security warnings?

Regards,
Matthias

On 12/7/2022 1:06 PM, arnab phani wrote:
Can these upgrades harm the backward compatibility of the next SystemDS
release?
If so, then we either need to make a major release or delay the upgrades
till the next major release.

Regards,
Arnab..

On Wed, Dec 7, 2022 at 1:01 PM <[email protected]> wrote:

Hi all,

Just to let everyone know:

I am considering updating the Spark and Hadoop versions of our system.
Would this interfere with any ongoing work?

Spark 3.2.0 -> 3.3.1
Hadoop 3.3.3 -> 3.3.4

best regards
Sebastian
