Hello Pavel, my two cents: it looks like the information on https://spark.apache.org/docs/2.4.5/ is not fully correct.
The download page reads:

    Download Apache Spark™

    1. Choose a Spark release: 3.0.0-preview2 (Dec 23 2019) / 2.4.5 (Feb 05 2020)
    2. Choose a package type: Pre-built for Apache Hadoop 2.7 / Pre-built for Apache Hadoop 3.2 and later / Pre-built with user-provided Apache Hadoop / Source Code
    3. Download Spark: spark-3.0.0-preview2-bin-hadoop2.7.tgz <https://www.apache.org/dyn/closer.lua/spark/spark-3.0.0-preview2/spark-3.0.0-preview2-bin-hadoop2.7.tgz>
    4. Verify this release using the 3.0.0-preview2 signatures <https://downloads.apache.org/spark/spark-3.0.0-preview2/spark-3.0.0-preview2-bin-hadoop2.7.tgz.asc>, checksums <https://downloads.apache.org/spark/spark-3.0.0-preview2/spark-3.0.0-preview2-bin-hadoop2.7.tgz.sha512> and project release KEYS <https://www.apache.org/dist/spark/KEYS>.

    Note that Spark is pre-built with Scala 2.11 except version 2.4.2, which is pre-built with Scala 2.12.

So the available binaries are built against Scala 2.11, not 2.12. Scala 2.12 was only the default in Spark 2.4.2, so the documentation was probably updated for the 2.4.2 release and never rolled back.

By the way, I agree it would be nice to have Scala 2.12 support if there are no blockers.

Kind regards,
Denis

On Thu, 21 May 2020 at 12:35, Pavel Martynov <[email protected]> wrote:

> Hi, folks!
>
> It looks like the latest Spark release, 2.4.5, has completely dropped
> Scala 2.11 support. See https://spark.apache.org/docs/2.4.5/: "For the
> Scala API, Spark 2.4.5 uses Scala 2.12. You will need to use a compatible
> Scala version (2.12.x)."
>
> Could you please build and publish to the Maven repo a Scala 2.12 build
> of kudu-spark2?
>
> Thanks!
>
> --
> with best regards, Pavel Martynov

--
With best regards,
Denis Bolshakov
e-mail: [email protected]
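If a Scala 2.12 build of kudu-spark2 were published as Pavel requests, it would presumably follow the usual `_<scala-version>` artifact-suffix convention. A hypothetical Maven dependency, for illustration only (the `kudu-spark2_2.12` coordinates and the version number are assumptions, not a published artifact at the time of this thread):

```xml
<!-- Hypothetical coordinates: assumes a Scala 2.12 cross-build of
     kudu-spark2 gets published under the standard _2.12 suffix. -->
<dependency>
  <groupId>org.apache.kudu</groupId>
  <artifactId>kudu-spark2_2.12</artifactId>
  <version>1.12.0</version>
</dependency>
```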
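For anyone who wants to double-check Denis's point, the Scala version a pre-built Spark distribution actually ships can be read off the bundled scala-library jar in its jars/ directory. A minimal sketch (the directory layout below is simulated so the snippet stands alone; on a real spark-2.4.5-bin-hadoop2.7 download the same grep runs against the actual jars/ directory, and the 2.11.12 version shown is what the 2.4.5 binaries are believed to bundle):

```shell
# Simulate the layout of an unpacked pre-built Spark 2.4.5 download,
# so this sketch is self-contained. On a real download, skip these two
# lines and run the grep against the unpacked directory directly.
mkdir -p spark-2.4.5-bin-hadoop2.7/jars
touch spark-2.4.5-bin-hadoop2.7/jars/scala-library-2.11.12.jar

# The scala-library jar name reveals the Scala version the build targets.
ls spark-2.4.5-bin-hadoop2.7/jars | grep scala-library
```

If this prints a `scala-library-2.11.x.jar`, the binary was built for Scala 2.11 regardless of what the docs page claims.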
