[ https://issues.apache.org/jira/browse/PIO-30?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15930383#comment-15930383 ]
ASF GitHub Bot commented on PIO-30:
-----------------------------------

Github user shimamoto commented on the issue:

    https://github.com/apache/incubator-predictionio/pull/364

    @chanlee514 I looked over the branch :) I would like to confirm a couple of things.
    - For which versions are you going to make prebuilt packages?
    - Do you mean that PredictionIO also supports versions that require users to build Spark themselves (e.g. Scala 2.10 & Spark 2.x)? This is just my personal opinion, but my impression is that 2.10 is intended for Spark 1 users and 2.11 is intended for Spark 2 users.
    - ES5 has an `elasticsearch-spark-13` dependency. We need to use `elasticsearch-spark-20` for Spark 2.

> Cross build for different versions of Scala and Spark
> -----------------------------------------------------
>
>                 Key: PIO-30
>                 URL: https://issues.apache.org/jira/browse/PIO-30
>             Project: PredictionIO
>          Issue Type: Improvement
>            Reporter: Marcin Ziemiński
>            Assignee: Chan
>             Fix For: 0.11.0
>
>
> The present version of Scala is 2.10 and Spark is 1.4, which is quite old.
> With Spark 2.0.0 come many performance improvements and features that people
> will definitely want to add to their templates. I am also aware that the past
> cannot be ignored, and simply dropping 1.x might not be an option for other
> users.
> I propose setting up a cross build in sbt: one build with Scala 2.10 and Spark
> 1.6, and a separate one with Scala 2.11 and Spark 2.0. Most of the files,
> including the API, will be shared between versions. The problematic ones will
> be divided between additional source directories: src/main/scala-2.10/ and
> src/main/scala-2.11/. The dockerized tests should also take the two versions
> into consideration.

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
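The cross build proposed in the issue, together with the Spark-line-specific Elasticsearch connector raised in the comment, could be sketched as an sbt build fragment along the following lines. This is a hypothetical build.sbt, not the actual PredictionIO build definition; the concrete version numbers (Spark 1.6.3 / 2.0.2, ES 5.1.1) are assumptions for illustration.

```scala
// Hypothetical build.sbt sketch of the proposed cross build.
// Pairing follows the discussion: Scala 2.10 -> Spark 1.x, Scala 2.11 -> Spark 2.x.

name := "predictionio-crossbuild-sketch"  // illustrative project name

// Build once per Scala version.
crossScalaVersions := Seq("2.10.6", "2.11.8")
scalaVersion := crossScalaVersions.value.head

// Pick the Spark line from the Scala binary version (versions are assumed).
val sparkVersion = Def.setting {
  scalaBinaryVersion.value match {
    case "2.10" => "1.6.3"
    case _      => "2.0.2"
  }
}

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion.value % "provided"

// The Elasticsearch connector differs per Spark line, as noted in the comment:
// elasticsearch-spark-13 for Spark 1.x, elasticsearch-spark-20 for Spark 2.x.
libraryDependencies += {
  val artifact =
    if (scalaBinaryVersion.value == "2.10") "elasticsearch-spark-13"
    else "elasticsearch-spark-20"
  "org.elasticsearch" %% artifact % "5.1.1"  // ES 5 version is an assumption
}

// Version-specific sources: sbt 0.13.8+ automatically compiles
// src/main/scala-2.10/ or src/main/scala-2.11/ alongside the shared
// src/main/scala/, matching the directory split proposed in the issue.
```

With such a setup, `sbt +compile` and `sbt +test` would run each task once per entry in `crossScalaVersions`, which is also how the dockerized tests could cover both version pairs.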