We certainly can't be the only project downstream of Spark that includes Scala-versioned 
artifacts in our release.  Our Python library on PyPI depends on 
pyspark, our Bioconda recipe depends on the pyspark Conda recipe, and our 
Homebrew formula depends on the apache-spark Homebrew formula.
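For anyone unfamiliar with why this matters: Scala libraries encode the Scala binary version as a suffix on the Maven artifact id (e.g. spark-core_2.11 vs. spark-core_2.12), and artifacts built for one suffix are binary-incompatible with a Spark distribution built for the other. A minimal illustrative sketch (the helper function below is hypothetical, not part of any project):

```python
def scala_binary_version(artifact_id: str) -> str:
    """Extract the Scala binary version suffix from a Maven artifact id,
    e.g. 'spark-core_2.11' -> '2.11'."""
    name, sep, suffix = artifact_id.rpartition("_")
    if not sep:
        raise ValueError(f"no Scala suffix in {artifact_id!r}")
    return suffix

# A library published against Scala 2.11 cannot be loaded by a Spark
# distribution built against Scala 2.12, and vice versa -- which is why
# silently changing the default in a patch release breaks downstream builds.
assert scala_binary_version("spark-core_2.11") == "2.11"
assert scala_binary_version("spark-sql_2.12") == "2.12"
```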

Using Scala 2.12 in the binary distribution for Spark 2.4.2 was unintentional 
and never voted on.  There was a successful vote to default to Scala 2.12 in 
Spark version 3.0.

   michael


> On Apr 26, 2019, at 9:52 AM, Sean Owen <[email protected]> wrote:
> 
> To be clear, what's the nature of the problem there... just Pyspark apps that 
> are using a Scala-based library? Trying to make sure we understand what is 
> and isn't a problem here.
> 
> On Fri, Apr 26, 2019 at 9:44 AM Michael Heuer <[email protected]> wrote:
> This will also cause problems in Conda builds that depend on pyspark
> 
> https://anaconda.org/conda-forge/pyspark
> 
> and Homebrew builds that depend on apache-spark, as that also uses the binary 
> distribution.
> 
> https://formulae.brew.sh/formula/apache-spark#default
> 
> +1 (non-binding) to cutting a 2.4.3 release immediately.
> 
>    michael
> 
> 
>> On Apr 26, 2019, at 2:05 AM, Reynold Xin <[email protected]> wrote:
>> 
>> I do feel it'd be better not to switch the default Scala version in a minor 
>> release. I don't know how many downstream projects this impacts. Dotnet is a 
>> good data point. Has anybody else hit this issue?
>> 
>> 
>> On Thu, Apr 25, 2019 at 11:36 PM, Terry Kim <[email protected]> wrote:
>> Very much interested in hearing what you folks decide. We currently have a 
>> couple of users asking questions at https://github.com/dotnet/spark/issues.
>> 
>> Thanks, 
>> Terry
>> 
>> -- 
>> Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: [email protected]
> 
