I don't think so. Spark brings the Scala dependency, and I don't think
the installed scalac matters in this respect.
Darcy, there was an open question about whether this enables you to
back out the workaround you created for 2.12.6. I tried removing it
and it failed again, so I left it in as still needed.
SGTM then. Is there anything we need to do to pick up the 2.12.7 upgrade,
like updating the Jenkins config?
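For context on what picking up the upgrade usually involves: in a Maven-based build like Spark's, the Scala patch version lives in a POM property, so the bump itself is typically a one-line property change. The fragment below is a hypothetical sketch; the exact property names and where they live in Spark's build files are an assumption, not a verbatim copy:

```xml
<!-- Hypothetical sketch of the relevant pom.xml fragment; the property
     names here are an assumption, not an exact copy of Spark's POM. -->
<properties>
  <scala.version>2.12.7</scala.version>
  <scala.binary.version>2.12</scala.binary.version>
</properties>
```

CI config (e.g. Jenkins) would only need touching if it pins the Scala version somewhere outside the POM.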
On Tue, Oct 2, 2018 at 10:53 AM Sean Owen wrote:
I tested both ways, and it actually works fine. It calls into question
whether there's really a fix we need with 2.12.7, but, I hear two
informed opinions (Darcy and the scala release notes) that it was
relevant. As we have no prior 2.12 support, I guess my feeling was
indeed to get this update in.
:43 PM
To: Sean Owen
Cc: sad...@zoho.com; Spark dev list
Subject: Re: On Scala 2.12.7
My major concern is how it will affect end-users if Spark 2.4 is built with
a Scala version prior to 2.12.7. Generally I'm hesitant to upgrade the Scala
version when we are very close to a release, and the Scala 2.12 build of
Spark 2.4 is beta anyway.
On Sat, Sep 29, 2018 at 6:46 AM Sean Owen wrote: