I agree with all of Marcelo's points. The last time we discussed this was
when Spark 2.2 was new, and it was decided that it was probably too soon,
but that was a while ago now. I've been in support of deprecating and
removing support for older versions of Java/Scala/Spark for a while, and I
believe it will allow us to clean up and unify large portions of our code.

Alex Bozarth
Software Engineer
Center for Open-Source Data & AI Technologies

E-mail: ajboz...@us.ibm.com
GitHub: github.com/ajbozarth

505 Howard Street
San Francisco, CA 94105
United States

From:   Marcelo Vanzin <van...@cloudera.com.INVALID>
To:     dev@livy.incubator.apache.org
Date:   09/13/2018 03:10 PM
Subject:        [DISCUSS] Getting rid of old stuff



Hey all,

I'd like to gauge people's reaction to some proposals regarding what
is supported in Livy.

#1: Java versions

I propose dropping support for Java 7. Even Java 8 is already EOL,
although it's pretty obvious nobody is getting rid of it anytime soon.
But I don't see a good reason to keep supporting J7. Even testing it is
a nuisance, since most people only have JDK 8 around...

#2: Spark versions

I think we should drop 1.6. At least. This would clean up some code
that currently uses reflection, and fix some parts of the API (like
the JobContext method to retrieve a "SparkSession" instance).
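To make the reflection point concrete, here's a minimal, self-contained sketch of the pattern that supporting Spark 1.6 forces. Note this is not Livy's actual code: `FakeSparkSession` and `sessionViaReflection` are stand-ins, since `SparkSession` only exists from Spark 2.0 on and so can't be referenced directly while 1.6 is on the classpath. Dropping 1.6 would let the API return the real type instead of a generic looked up at runtime.

```java
// Stand-in for a class that may be absent at runtime (with Spark 1.6 on
// the classpath, org.apache.spark.sql.SparkSession does not exist).
class FakeSparkSession {
    String version() { return "2.3.1"; }
}

public class ReflectionSketch {
    // While Spark 1.6 must be supported, code cannot name the class
    // directly; it has to look it up by name and return an unchecked
    // generic, so type errors only surface at runtime.
    @SuppressWarnings("unchecked")
    static <E> E sessionViaReflection(String className) throws Exception {
        return (E) Class.forName(className).getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws Exception {
        FakeSparkSession s = sessionViaReflection("FakeSparkSession");
        System.out.println(s.version());
    }
}
```

With 1.6 gone, the generic-plus-`Class.forName` dance collapses to an ordinary typed method, which is exactly the kind of cleanup being proposed.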

Changing the API is, well, not backwards compatible, but I think it's
better to do that sort of cleanup sooner rather than later, before the
project is more mature.

Separately, we could consider dropping 2.0 and 2.1 as well. There was
talk on the Spark list about making 2.1 EOL - I don't remember a final
verdict, but I don't imagine there will be a lot of new deployments of
2.1 going forward, since the only reason to use it is if you're stuck
with J7.

#3: Scala versions

If we decide to only support Spark 2.2+, then the decision is easy.
But even if we only drop 1.6, we should still consider dropping Scala
2.10 support: Spark does not ship official builds with 2.10 support in
the 2.x line, and 2.10 support was removed altogether in 2.2.

We shouldn't remove support for multiple versions of Scala, though,
since 2.12 will be beta in Spark 2.4, and there will be a 2.13 at some
point.

Thoughts?


--
Marcelo
