Older versions of Spark do run on J8, so we could technically still support them.

But it doesn't seem like there's a desire to support older Spark
anyway, so great! I'll file a few bugs tomorrow and clean that up.
That'll make my patch for LIVY-503 simpler, too.
On Thu, Sep 13, 2018 at 6:44 PM Saisai Shao <[email protected]> wrote:
>
> +1,
>
> To support J8, Spark 2.2+ might be the only option. I'm not sure if those
> vendors will continue to support Spark 2.2-, but IMHO for the community
> release, I think moving forward would be a better choice.
>
> Thanks
> Saisai
>
>
> Jeff Zhang <[email protected]> wrote on Fri, Sep 14, 2018 at 9:39 AM:
>
> > +1
> >
> >
> > Alex Bozarth <[email protected]> wrote on Fri, Sep 14, 2018 at 6:26 AM:
> >
> > > I agree with all of Marcelo's points. The last time we discussed this
> > > was when Spark 2.2 was new, and it was decided that it was probably too
> > > soon, but that was a while ago now. I've been in support of deprecating
> > > and removing support for older versions of Java/Scala/Spark for a while,
> > > and I believe it will allow us to clean up and unify large portions of
> > > our code.
> > >
> > >
> > > *Alex Bozarth*
> > > Software Engineer
> > > Center for Open-Source Data & AI Technologies
> > > ------------------------------
> > > *E-mail:* [email protected]
> > > *GitHub:* github.com/ajbozarth <https://github.com/ajbozarth>
> > >
> > >
> > > 505 Howard Street
> > > San Francisco, CA 94105
> > > United States
> > >
> > >
> > >
> > >
> > > From: Marcelo Vanzin <[email protected]>
> > > To: [email protected]
> > > Date: 09/13/2018 03:10 PM
> > > Subject: [DISCUSS] Getting rid of old stuff
> > > ------------------------------
> > >
> > >
> > >
> > >
> > > Hey all,
> > >
> > > I'd like to gauge people's reaction to some proposals regarding what
> > > is supported in Livy.
> > >
> > > #1: Java versions
> > >
> > > I propose dropping support for Java 7. Even J8 is approaching EOL,
> > > although it's pretty obvious nobody is getting rid of it anytime soon.
> > > But I don't see a good reason to keep supporting J7. Even testing it is
> > > a nuisance, since most people only have JDK 8 around...
> > >
> > > #2: Spark versions
> > >
> > > I think we should drop 1.6. At least. This would clean up some code
> > > that currently uses reflection, and fix some parts of the API (like
> > > the JobContext method to retrieve a "SparkSession" instance).
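
[Editor's note: a hypothetical illustration, not Livy's actual code, of the cleanup Marcelo describes. Code that must compile against Spark 1.6, where SparkSession does not exist, has to resolve classes and methods by name at runtime; a Spark 2.x-only codebase can make the typed call directly. `java.lang.String` stands in here for the reflectively accessed class.]

```java
import java.lang.reflect.Method;

public class ReflectionDemo {
    // "Old" style: class and method resolved by name at runtime, so the
    // code compiles even when the target class is absent at build time.
    static int lengthViaReflection(String s) {
        try {
            Class<?> cls = Class.forName("java.lang.String");
            Method m = cls.getMethod("length");
            return (int) m.invoke(s); // Integer result, unboxed by the cast
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException(e);
        }
    }

    // "New" style: a direct, compile-time-checked call, possible once the
    // minimum supported version guarantees the class exists.
    static int lengthDirect(String s) {
        return s.length();
    }

    public static void main(String[] args) {
        System.out.println(lengthViaReflection("Livy")); // prints 4
        System.out.println(lengthDirect("Livy"));        // prints 4
    }
}
```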
> > >
> > > Changing the API is, well, not backwards compatible, but I think it's
> > > better to do that sort of cleanup sooner rather than later, before the
> > > project is more mature.
> > >
> > > Separately, we could consider dropping 2.0 and 2.1 as well. There was
> > > talk on the Spark list about making 2.1 EOL - I don't remember a final
> > > verdict, but I don't imagine there will be many new deployments of
> > > 2.1 going forward, since the only reason to use it is if you're stuck
> > > with J7.
> > >
> > > #3: Scala versions
> > >
> > > If we decide to only support Spark 2.2+, then the decision is easy.
> > > But if we drop 1.6, we should consider dropping Scala 2.10 support.
> > > Spark does not ship official builds with 2.10 support in the 2.x line,
> > > and it was dropped altogether in 2.2.
> > >
> > > We shouldn't remove support for multiple versions of Scala, though,
> > > since 2.12 will be beta in Spark 2.4, and there will be a 2.13 at some
> > > point.
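
[Editor's note: for readers unfamiliar with what "supporting multiple versions of Scala" entails, a generic cross-building sketch, not necessarily Livy's actual build configuration. Each listed Scala version produces a separately published, binary-incompatible artifact, which is why dropping or adding a version is a real maintenance decision.]

```scala
// Generic sbt sketch: build the same module against several Scala versions.
// Artifacts get a version suffix, e.g. mymodule_2.11 and mymodule_2.12.
crossScalaVersions := Seq("2.11.12", "2.12.7")
// Prefixing a task with "+" (e.g. `+compile`, `+test`) runs it once per
// listed Scala version.
```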
> > >
> > > Thoughts?
> > >
> > >
> > > --
> > > Marcelo
> > >
> > >
> > >
> > >
> > >
> >



-- 
Marcelo
