Hey all,

Working on releasing 0.13.1 with multiple spark/scala combos.

Afaik, there is no 'standard' way to publish against multiple Spark versions
(but I may be wrong, I don't claim expertise here).

One approach is to simply release binaries only for:
Spark-1.6 + Scala 2.10
Spark-2.1 + Scala 2.11

OR

We could do what dl4j does and encode the Spark version in the artifact version:

org.apache.mahout:mahout-spark_2.10:0.13.1_spark_1
org.apache.mahout:mahout-spark_2.11:0.13.1_spark_1

org.apache.mahout:mahout-spark_2.10:0.13.1_spark_2
org.apache.mahout:mahout-spark_2.11:0.13.1_spark_2

OR

some other option I'm not aware of.
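For what it's worth, with the dl4j-style scheme a downstream user's pom would look
something like this (just a sketch, using the coordinates from the example above):

```xml
<!-- Pull in the Scala 2.11 build of mahout-spark, compiled against Spark 2.x -->
<dependency>
  <groupId>org.apache.mahout</groupId>
  <artifactId>mahout-spark_2.11</artifactId>
  <version>0.13.1_spark_2</version>
</dependency>
```

Swapping Spark lines is then just a version-string change, while the Scala binary
version stays in the artifactId per the usual _2.10/_2.11 convention.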
