It's not included; it is downloaded on demand.
That said, I think the fact that we can download the jar is a huge feature
of SBT: no installation needed, and you can build the project as long as you
have a JVM.
On Fri, Nov 6, 2015 at 4:49 PM, Jakob Odersky wrote:
> > Can you clarify which sbt jar (by path) ?
> Can you clarify which sbt jar (by path) ?
Any of them.
Sbt is a build tool, and I don't understand why it is included in a source
repository. It would be like including make in a project.
On 6 November 2015 at 16:43, Ted Yu wrote:
> bq. include an sbt jar in the source repo
>
> Can you clarify
bq. include an sbt jar in the source repo
Can you clarify which sbt jar (by path) ?
I tried 'git log' on the following files but didn't see commit history:
./build/sbt-launch-0.13.7.jar
./build/zinc-0.3.5.3/lib/sbt-interface.jar
./sbt/sbt-launch-0.13.2.jar
./sbt/sbt-launch-0.13.5.jar
On Fri, No
[Reposting to the list again, I really should double-check that
reply-to-all button]
In the meantime, as a light Friday-afternoon patch, I was thinking about
splitting the ~600-line single-file sbt build into something more manageable,
like the Akka build (without changing any dependencies or setting
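Roughly along these lines (only a sketch, not Spark's actual layout; the
module names and versions below are just illustrative): common settings
factored out once, a small definition per sub-project, and a root project
aggregating them.

lazy val commonSettings = Seq(
  organization := "org.apache.spark",
  version      := "1.6.0-SNAPSHOT",  // illustrative
  scalaVersion := "2.10.4"           // illustrative
)

lazy val core = (project in file("core"))
  .settings(commonSettings: _*)

lazy val sql = (project in file("sql/core"))
  .dependsOn(core)
  .settings(commonSettings: _*)

lazy val root = (project in file("."))
  .aggregate(core, sql)
  .settings(commonSettings: _*)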
Oh OK, I think I got it... I hope! Since the app runs with the Spark
assembly jar on its classpath, the exact versions as resolved by Spark's
build process are actually on the app's classpath.
Sorry, didn't mean to pollute this thread with my own dependency resolution
confusion.
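(For illustration only, a sketch of the usual pattern rather than anything
from Spark's own build: an application typically marks spark-core as
"provided", because the assembly on the runtime classpath already supplies
it, with whatever transitive versions Spark's build resolved.)

name := "my-spark-app"       // illustrative
scalaVersion := "2.10.4"     // illustrative
libraryDependencies +=
  "org.apache.spark" %% "spark-core" % "1.5.1" % "provided"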
On Fri, Nov 6, 2015 a
I think there are a few minor differences in the dependency graph that
arise from this. For a given user, the probability that it affects them is
low: it needs to conflict with a library the user's application is using.
However, the probability that it affects *some* users is very high, and we
do see small change
On Fri, Nov 6, 2015 at 3:04 PM, Koert Kuipers wrote:
> if i understand it correctly it would cause compatibility breaks for
> applications on top of spark, because those applications use the exact same
> current resolution logic (so basically they are maven apps), and the change
> would make them
That's interesting...
If I understand it correctly, it would cause compatibility breaks for
applications on top of Spark, because those applications use the exact same
current resolution logic (so basically they are Maven apps), and the change
would make them inconsistent?
by that logic all existin
I think we'd have to standardize on Maven-style resolution, or I'd at least
like to see that path explored first. The issue is that if we switch the
standard now, it could cause compatibility breaks for applications on top
of Spark.
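As a rough illustration (not Spark's actual configuration, and the
coordinates are made up for the example): the sbt side can be pinned to
whatever versions Maven resolves, so that Ivy's default "latest revision
wins" conflict handling doesn't pick different versions than Maven's
"nearest wins" strategy.

dependencyOverrides += "com.google.guava" % "guava" % "14.0.1"
dependencyOverrides += "commons-lang" % "commons-lang" % "2.6"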
On Fri, Nov 6, 2015 at 2:28 PM, Jakob Odersky wrote:
> Reposting to
Reposting to the list...
Thanks for all the feedback, everyone. I get a clearer picture of the
reasoning and implications now.
Koert, according to your post in this thread
http://apache-spark-developers-list.1001551.n3.nabble.com/Master-build-fails-tt14895.html#a15023,
it is apparently very easy t
Hey Jakob,
The builds in Spark are largely maintained by me, Sean, and Michael
Armbrust (for SBT). For historical reasons, Spark supports both a Maven and
SBT build. Maven is the build of reference for packaging Spark and is used
by many downstream packagers and to build all Spark releases. SBT is
Maven isn't 'legacy', nor is it supported merely for the benefit of third
parties. SBT had some behaviors/problems, relative to what Spark needs, that
Maven didn't. SBT is a development-time alternative only, and is partly
generated from the Maven build.
On Fri, Nov 6, 2015 at 1:48 AM, Koert Kuipers wrote:
> Peop
People who do upstream builds of Spark (think Bigtop and Hadoop distros)
are used to legacy systems like Maven, so Maven is the default build. I
don't think it will change.
Any improvements for the sbt build are of course welcome (it is still used
by many developers), but I would not do anything t
There was a lot of discussion that preceded our arriving at this statement
in the Spark documentation: "Maven is the official build tool recommended
for packaging Spark, and is the build of reference."
https://spark.apache.org/docs/latest/building-spark.html#building-with-sbt
I'm not aware of anyt
See previous discussion:
http://search-hadoop.com/m/q3RTtPnPnzwOhBr
FYI
On Thu, Nov 5, 2015 at 4:30 PM, Stephen Boesch wrote:
> Yes. The current dev/change-scala-version.sh mutates (/pollutes) the build
> environment by updating the pom.xml in each of the subprojects. If you were
> able to come
Yes. The current dev/change-scala-version.sh mutates (pollutes) the build
environment by updating the pom.xml in each of the subprojects. If you were
able to come up with a structure that avoids that approach, it would be an
improvement.
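For example, something along these lines (just a sketch; the Scala versions
are only illustrative) would make switching Scala versions a per-invocation
choice in sbt instead of a rewrite of every pom.xml:

crossScalaVersions := Seq("2.10.4", "2.11.7")
scalaVersion := crossScalaVersions.value.head

Then "sbt +compile" builds against every listed version, and
"sbt ++2.11.7 compile" picks one for that run, with nothing mutated on disk.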
2015-11-05 15:38 GMT-08:00 Jakob Odersky:
> Hi everyone,
>
Hi everyone,
in the process of learning Spark, I wanted to get an overview of the
interaction between all of its sub-projects. I therefore decided to have a
look at the build setup and its dependency management.
Since I am a lot more comfortable using sbt than Maven, I decided to try to
port the mav