[
https://issues.apache.org/jira/browse/SPARK-2071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14026195#comment-14026195
]
Patrick Wendell edited comment on SPARK-2071 at 6/10/14 7:16 AM:
-----------------------------------------------------------------
I just meant that we could introduce a new project in the build that exists only
for the purpose of downloading the older artifacts. So the script could
do `sbt/sbt oldDeps/compile`. You could build the dependencies of that project
from the `previousArtifact` settings of each of the other projects.
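For illustration only, here is a rough sketch of what such a dummy project could look like, written in build.sbt form rather than the way the Spark build is actually organized; the module list and the 1.0.0 version are placeholders standing in for whatever the `previousArtifact` settings resolve to:
{code}
// Illustrative sketch: a project that exists only so that resolving its
// dependencies downloads the previously released Spark artifacts.
lazy val oldDeps = project.in(file("dev/old-deps"))
  .settings(
    scalaVersion := "2.10.4",
    // Placeholder coordinates; the real list would be assembled from the
    // previousArtifact settings of each of the other projects.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core"      % "1.0.0",
      "org.apache.spark" %% "spark-streaming" % "1.0.0",
      "org.apache.spark" %% "spark-sql"       % "1.0.0"
    )
  )
{code}
With something like this in place, `sbt/sbt oldDeps/compile` (or `oldDeps/update`) would force sbt to resolve and pull the old artifacts down into its cache.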
was (Author: pwendell):
I just meant that we could introduce a new project in the build that exists only
for the purpose of downloading the older artifacts. So the script could
do `sbt/sbt oldDeps/retrieveManaged`.
> Package private classes that are deleted from an older version of Spark
> trigger errors
> --------------------------------------------------------------------------------------
>
> Key: SPARK-2071
> URL: https://issues.apache.org/jira/browse/SPARK-2071
> Project: Spark
> Issue Type: Sub-task
> Components: Build
> Reporter: Patrick Wendell
> Assignee: Prashant Sharma
> Fix For: 1.1.0
>
>
> We should figure out how to fix this. One idea is to run the MIMA exclude
> generator with sbt itself (rather than ./spark-class) so it can run against
> the older versions of Spark and make sure to exclude classes that are marked
> as package private in that version as well.
--
This message was sent by Atlassian JIRA
(v6.2#6252)