[ https://issues.apache.org/jira/browse/PIO-30?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15449714#comment-15449714 ]

Pat Ferrel edited comment on PIO-30 at 8/30/16 6:16 PM:
--------------------------------------------------------

How does Spark do their multiple version support? They have releases on 1.x and 
2.x because they are not source compatible, and I think we are in the same 
situation, at least for code that calls the new libs (here I only have 
experience with ES).

They even maintain versions of the docs site for 1.x and 2.x

We might as well ask these questions now even if we don't want to do it that 
way.




was (Author: pferrel):
How does Spark do their multiple version support? They even maintain versions 
of the docs site for past versions. We might as well ask these questions now.

> Cross build for different versions of scala and spark
> -----------------------------------------------------
>
>                 Key: PIO-30
>                 URL: https://issues.apache.org/jira/browse/PIO-30
>             Project: PredictionIO
>          Issue Type: Improvement
>            Reporter: Marcin ZiemiƄski
>
> The present version of Scala is 2.10 and Spark is 1.4, which is quite old. 
> With Spark 2.0.0 come many performance improvements and features that people 
> will definitely want to add to their templates. I am also aware that the past 
> cannot be ignored and that simply dropping 1.x might not be an option for 
> other users.
> I propose setting up a cross-build in sbt to build with Scala 2.10 and Spark 
> 1.6, and a separate one for Scala 2.11 and Spark 2.0. Most of the files, 
> including the API, will be the same between versions. The problematic ones 
> will be divided between additional source directories: src/main/scala-2.10/ 
> and src/main/scala-2.11/. The dockerized tests should also take the two 
> versions into consideration.
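The cross-build layout proposed above could be sketched in sbt roughly as follows. This is a minimal illustration, not PredictionIO's actual build definition: the Spark artifact versions and the explicit source-directory wiring are assumptions (recent sbt releases already add `src/main/scala-<binary version>/` directories automatically for cross builds).

```scala
// build.sbt — hypothetical sketch of a Scala 2.10 / 2.11 cross-build
// pairing Spark 1.6 with 2.10 and Spark 2.0 with 2.11.

crossScalaVersions := Seq("2.10.6", "2.11.8")

// Select the Spark line from the Scala binary version being built.
libraryDependencies += {
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, 10)) => "org.apache.spark" %% "spark-core" % "1.6.3" % "provided"
    case _             => "org.apache.spark" %% "spark-core" % "2.0.0" % "provided"
  }
}

// Version-specific sources live next to the shared src/main/scala/:
// src/main/scala-2.10/ for the 1.x code paths, src/main/scala-2.11/
// for the 2.x ones. Shown explicitly here for clarity.
unmanagedSourceDirectories in Compile += {
  val base = (sourceDirectory in Compile).value
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, 10)) => base / "scala-2.10"
    case _             => base / "scala-2.11"
  }
}
```

Running `sbt +compile` or `sbt +test` would then build both pairings in turn, which is also how the dockerized tests could cover the two versions.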



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
