[ https://issues.apache.org/jira/browse/PIO-30?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15898324#comment-15898324 ]

ASF GitHub Bot commented on PIO-30:
-----------------------------------

Github user dszeto commented on the issue:

    https://github.com/apache/incubator-predictionio/pull/345
  
    @shimamoto There isn't really a policy right now, but we should be on the 
right path if we eventually add support for different combinations like you 
said (Spark 2.x + Scala 2.10). What would be your suggestion based on this?
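
For illustration, one way to support arbitrary Scala/Spark combinations is to make the Spark version a build-time parameter, orthogonal to the sbt cross-build over Scala versions. A minimal sketch, assuming a hypothetical spark.version system property and illustrative version numbers (not the actual PredictionIO build):

    // build.sbt -- select the Spark version at build time, e.g.
    //   sbt -Dspark.version=2.1.0 ++2.10.6 compile
    val sparkVersion = sys.props.getOrElse("spark.version", "1.6.3")

    // Cross-build over the two Scala lines.
    crossScalaVersions := Seq("2.10.6", "2.11.8")

    // Spark is "provided" because templates run inside a Spark cluster.
    libraryDependencies +=
      "org.apache.spark" %% "spark-core" % sparkVersion % "provided"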


> Cross build for different versions of Scala and Spark
> -----------------------------------------------------
>
>                 Key: PIO-30
>                 URL: https://issues.apache.org/jira/browse/PIO-30
>             Project: PredictionIO
>          Issue Type: Improvement
>            Reporter: Marcin Ziemiƫski
>            Assignee: Chan
>             Fix For: 0.11.0
>
>
> The present versions of Scala (2.10) and Spark (1.4) are quite old. Spark 
> 2.0.0 brings many performance improvements and features that people will 
> definitely want to use in their templates. I am also aware that the past 
> cannot be ignored, and simply dropping 1.x might not be an option for other 
> users. 
> I propose setting up a cross-build in sbt: one build with Scala 2.10 and 
> Spark 1.6, and a separate one with Scala 2.11 and Spark 2.0. Most of the 
> files, including the API, will be identical between versions. The 
> problematic ones will be divided between additional source directories: 
> src/main/scala-2.10/ and src/main/scala-2.11/. The dockerized tests should 
> also take the two versions into consideration.
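
As a rough sketch of the proposed setup (version numbers are illustrative, not from the issue), the cross-build could pick the Spark dependency based on the Scala binary version, while sbt (0.13.8+) automatically compiles the version-specific directories src/main/scala-2.10/ and src/main/scala-2.11/ alongside src/main/scala/:

    // build.sbt -- cross-build over Scala 2.10/2.11 with a matching Spark.
    crossScalaVersions := Seq("2.10.6", "2.11.8")

    // Pair Scala 2.10 with Spark 1.6 and Scala 2.11 with Spark 2.0.
    libraryDependencies += (CrossVersion.partialVersion(scalaVersion.value) match {
      case Some((2, 10)) => "org.apache.spark" %% "spark-core" % "1.6.3" % "provided"
      case _             => "org.apache.spark" %% "spark-core" % "2.0.2" % "provided"
    })

Running "sbt +test" would then exercise both combinations, which the dockerized tests could drive in turn.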



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
