Hi all,

While working on the dependency cleanup, I noticed that we are stuck with some 
pretty old Spark versions in the spark_2.11 module.
The reason is that the Scala 2.11 line has been end-of-life for quite some 
time. Spark now generally targets Scala 2.12 and, since version 3.2, also 
2.13.

We have one reported CVE in the scala_2.11 module that we can't get rid of, 
as there will be no release with a fix.

I therefore propose we drop the Scala 2.11 Spark plugin and possibly add a 
Scala 2.13 version instead.

What do you folks think?

Chris
