Reposting to the list...
Thanks for all the feedback, everyone; I have a clearer picture of the
reasoning and implications now.
Koert, according to your post in this thread
http://apache-spark-developers-list.1001551.n3.nabble.com/Master-build-fails-tt14895.html#a15023,
it is apparently very easy
See previous discussion:
http://search-hadoop.com/m/q3RTtPnPnzwOhBr
FYI
On Thu, Nov 5, 2015 at 4:30 PM, Stephen Boesch wrote:
> Yes. The current dev/change-scala-version.sh mutates (/pollutes) the build
> environment by updating the pom.xml in each of the subprojects. If you were
> able to come up with a structure that avoids that approach it would be an
> improvement.
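
For contrast, sbt has cross-building built in, so a structure like the
following avoids touching any pom.xml. This is only a minimal sketch of a
plain build.sbt, not Spark's actual build, and the version numbers are
illustrative:

    // build.sbt -- minimal cross-building sketch (illustrative versions)
    scalaVersion := "2.10.5"
    crossScalaVersions := Seq("2.10.5", "2.11.7")

Running `sbt +package` then builds against every listed Scala version, and
sbt appends the _2.10/_2.11 suffix to artifact names automatically, which is
the part the Maven build has to handle by rewriting the poms.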
2015-11-05 15:38 GMT-08:00 Jakob Odersky:
Maven isn't 'legacy', nor is it supported just for the benefit of third
parties. SBT had some behaviors/problems that Maven didn't, relative to what
Spark needs. SBT is a development-time alternative only, and is partly
generated from the Maven build.
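
On the "partly generated" point: the sbt build can derive its project graph
from the Maven POMs instead of duplicating them. Here is a minimal sketch
using the sbt-pom-reader plugin (the coordinates, version, and object name
below are assumptions for illustration, not Spark's exact setup):

    // project/plugins.sbt -- assumed plugin coordinates and version
    addSbtPlugin("com.typesafe.sbt" % "sbt-pom-reader" % "1.0.0-M2")

    // project/ExampleBuild.scala -- extending PomBuild makes sbt read
    // modules, dependencies, and versions from the existing pom.xml files
    import com.typesafe.sbt.pom.PomBuild

    object ExampleBuild extends PomBuild

This keeps Maven as the single source of truth while still giving developers
sbt's incremental compilation and REPL.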
On Fri, Nov 6, 2015 at 1:48 AM, Koert Kuipers wrote:
People who do upstream builds of Spark (think Bigtop and the Hadoop distros)
are used to legacy systems like Maven, so Maven is the default build. I
don't think that will change.
Any improvements to the sbt build are of course welcome (it is still used
by many developers), but I would not do anything
There was a lot of discussion that preceded our arriving at this statement
in the Spark documentation: "Maven is the official build tool recommended
for packaging Spark, and is the build of reference."
https://spark.apache.org/docs/latest/building-spark.html#building-with-sbt
I'm not aware of
Hey Jakob,
The builds in Spark are largely maintained by me, Sean, and Michael
Armbrust (for SBT). For historical reasons, Spark supports both a Maven and
an SBT build. Maven is the build of reference for packaging Spark and is
used by many downstream packagers and to build all Spark releases. SBT