[ https://issues.apache.org/jira/browse/PIO-30?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15906774#comment-15906774 ]

ASF GitHub Bot commented on PIO-30:
-----------------------------------

Github user shimamoto commented on the issue:

    https://github.com/apache/incubator-predictionio/pull/345
  
    Just for your information.
    I tried my hand at removing unnecessary dependencies and setting
`crossScalaVersions` as well.
    
https://github.com/shimamoto/incubator-predictionio/commit/ad0cace0507a951a160e29b3e1e7d9deb58aba22
    
    In the tools project, run `+assembly`; this creates the jars for all the Scala versions.
    
    This is just temporary; I still have things to do. Sooner or later, I will
open a PR.
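
For reference, the `crossScalaVersions` / `+assembly` setup mentioned in the
comment above could look roughly like the following build.sbt fragment. This is
a minimal sketch, not the linked commit; the project name, Spark artifact, and
version numbers are illustrative assumptions.

    // build.sbt -- minimal cross-build sketch (illustrative versions, not the actual PIO build)
    name := "pio-crossbuild-sketch"  // hypothetical project name

    scalaVersion := "2.11.8"
    // Listing both binary versions lets the "+" prefix (e.g. `+assembly`) run a task once per version.
    crossScalaVersions := Seq("2.10.6", "2.11.8")

    libraryDependencies += {
      // Assumption: Spark 1.6.x for the Scala 2.10 build, Spark 2.0.x for the 2.11 build.
      val sparkVersion = scalaBinaryVersion.value match {
        case "2.10" => "1.6.3"
        case _      => "2.0.2"
      }
      "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
    }

With a setup like this, `sbt +assembly` (assuming the sbt-assembly plugin is
enabled) builds one assembly jar per entry in `crossScalaVersions`, which
matches the `+assembly` step described in the comment.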


> Cross build for different versions of scala and spark
> -----------------------------------------------------
>
>                 Key: PIO-30
>                 URL: https://issues.apache.org/jira/browse/PIO-30
>             Project: PredictionIO
>          Issue Type: Improvement
>            Reporter: Marcin ZiemiƄski
>            Assignee: Chan
>             Fix For: 0.11.0
>
>
> The project currently builds with Scala 2.10 and Spark 1.4, which is quite old. 
> Spark 2.0.0 brings many performance improvements and features that people will 
> definitely want to use in their templates. I am also aware that the past cannot 
> be ignored and that simply dropping 1.x might not be an option for other users. 
> I propose setting up a cross-build in sbt: one build with Scala 2.10 and Spark 
> 1.6 and a separate one with Scala 2.11 and Spark 2.0. Most of the files, 
> including the API, will be shared between versions. The problematic ones will 
> be split between additional source directories: src/main/scala-2.10/ and 
> src/main/scala-2.11/. The dockerized tests should also take the two versions 
> into consideration.
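
To illustrate the proposed layout, the version-specific source directories could
be wired up along the lines below. This is only a sketch under the assumptions
in the description above; newer sbt releases already add
src/main/scala-<binary version> directories to the compile path by default when
cross-building, so the explicit setting is mostly for clarity.

    // build.sbt fragment -- sketch of per-version source directories (illustrative only)
    unmanagedSourceDirectories in Compile += {
      val base = (sourceDirectory in Compile).value
      scalaBinaryVersion.value match {
        case "2.10" => base / "scala-2.10"   // code compatible with Spark 1.6 / Scala 2.10
        case _      => base / "scala-2.11"   // code compatible with Spark 2.0 / Scala 2.11
      }
    }

Shared code would stay in src/main/scala/, and only the files whose APIs differ
between Spark 1.6 and Spark 2.0 would be duplicated in the version-specific
directories.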



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
