Cool. It sounds like focusing on sbt-pom-reader would be a good thing for
you guys then.
There are a few... fun... issues around Maven parent projects still floating
around in sbt-pom-reader that appear to stem from fundamental Ivy-versus-Maven
animosity.
In any case, while I'm generally
I like the pom-reader approach as well — in particular, that it lets you add
extra stuff in your SBT build after loading the dependencies from the POM.
Profiles would be the one missing piece for passing options through.
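Roughly the kind of layering this enables, sketched under the assumption that
sbt-pom-reader exposes a PomBuild trait in com.typesafe.sbt.pom (names and
signatures here are from memory, so treat them as assumptions, not the real
Spark build):

  // project/ExampleBuild.scala -- a rough sketch, not Spark's actual build file.
  // Assumes sbt 0.13 and that sbt-pom-reader's PomBuild builds the project graph
  // and dependency list from the pom.xml files.
  import sbt._
  import sbt.Keys._
  import com.typesafe.sbt.pom.PomBuild

  object ExampleBuild extends PomBuild {
    // Dependencies and module structure come from the POMs; extra sbt-only
    // settings are then layered on top of every generated project.
    override def projectDefinitions(baseDirectory: File): Seq[Project] =
      super.projectDefinitions(baseDirectory).map { project =>
        project.settings(
          scalacOptions += "-deprecation",      // example of "extra stuff" added in sbt
          testOptions += Tests.Argument("-oDF") // purely illustrative
        )
      }
  }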
Matei
On Mar 14, 2014, at 10:03 AM, Patrick Wendell
I think Kevin's point is somewhat different: there's no question that Sbt can
be integrated into the Maven ecosystem - mostly the repositories and artifact
management, of course.
However, Sbt is a niche build tool and is unlikely to be widely supported by
engineering teams or IT organizations. Sbt
we have a corporate maven repository in-house and of course we also use
maven central. sbt can handle retrieving from and publishing to maven
repositories just fine. we have maven, ant/ivy and sbt projects depending
on each other's artifacts. not sure i see the issue there.
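For the record, a minimal sketch of that setup in build.sbt; the repository
URL and credentials path below are hypothetical placeholders:

  // build.sbt -- a minimal sketch of resolving from and publishing to a Maven repo.
  resolvers += "corp-maven" at "https://repo.example.com/maven/releases"

  publishMavenStyle := true  // emit a POM so Maven/Ivy consumers can depend on the artifact

  publishTo := Some("corp-maven-releases" at "https://repo.example.com/maven/releases")

  credentials += Credentials(Path.userHome / ".sbt" / ".credentials")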
On Tue, Mar 11, 2014 at
Asm is such a mess. And their suggested solution, that everyone should
shade it, sounds pretty awful to me (it's not uncommon to have asm shaded 15
times in a single project). But I guess you are right that shading is the
only way to deal with it at this point...
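For readers following along, relocating asm with sbt-assembly's shade rules
looks roughly like this, assuming a plugin version that supports them; the
target package name is an arbitrary example:

  // build.sbt -- a rough sketch, not a recommendation.
  assemblyShadeRules in assembly := Seq(
    // Rewrite every org.objectweb.asm class into a project-private package so it
    // cannot clash with other copies of asm on the classpath.
    ShadeRule.rename("org.objectweb.asm.**" -> "myproject.shaded.asm.@1").inAll
  )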
On Mar 11, 2014 5:35 PM, Kevin Markey
On Tue, Feb 25, 2014 at 03:20PM, Evan Chan wrote:
The correct way to exclude dependencies in SBT is actually to declare
a dependency as provided. I'm not familiar with Maven or its
Yes, I believe this would be equivalent to the maven exclusion of an
artifact's transitive deps.
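A minimal sketch of the two approaches being compared, with illustrative
coordinates and versions:

  // build.sbt -- a minimal sketch; versions are illustrative.

  // "provided" scope: on the compile classpath, but left out of the assembly
  // because the runtime environment supplies it.
  libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating" % "provided"

  // Explicitly excluding a transitive dependency, the closer analogue of Maven's <exclusions>.
  libraryDependencies += ("org.apache.hadoop" % "hadoop-client" % "2.2.0")
    .exclude("org.slf4j", "slf4j-log4j12")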
Cos
With all due respect, Patrick - this approach is asking for trouble.
Proactively ;)
Cos
On Tue, Feb 25, 2014 at 04:09PM, Patrick Wendell wrote:
What I mean is this. AFAIK the shade plug-in is primarily designed
for creating uber jars which contain Spark and all dependencies. But
since Spark
Hey,
Thanks everyone for chiming in on this. I wanted to summarize these
issues a bit, particularly wrt the constituents involved - does this
seem accurate?
= Spark Users =
In general those linking against Spark should be totally unaffected by
the build choice. Spark will continue to publish
A couple of comments:
1) Whether the Spark POM is produced by SBT or Maven shouldn't matter for
those who just need to link against published artifacts, but right now SBT
and Maven do not produce equivalent POMs for Spark, I think.
2) Incremental builds using Maven are trivially more difficult
We maintain an in-house Spark build using sbt. We have no problem using sbt
assembly. We did add a few exclude statements for transitive dependencies.
The main enemies of assemblies are jars that include stuff they shouldn't
(kryo comes to mind, I think they include logback?), new versions of jars
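A sketch of the kind of exclude statements and version pinning being
described; whether kryo really pulls in logback is left open above, so treat
the coordinates as placeholders:

  // build.sbt -- a sketch; coordinates and versions are placeholders.

  // Strip a logging backend that a dependency allegedly drags into the assembly.
  libraryDependencies += ("com.esotericsoftware.kryo" % "kryo" % "2.21")
    .exclude("ch.qos.logback", "logback-classic")

  // Pin a transitive dependency to one version so the assembly does not pick up
  // whichever newer version happens to win conflict resolution.
  dependencyOverrides += "com.google.guava" % "guava" % "14.0.1"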
@mridul - As far as I know both Maven and Sbt use fairly similar
processes for building the assembly/uber jar. We actually used to
package Spark with sbt and there were no specific issues we
encountered; AFAIK sbt respects versioning of transitive
dependencies correctly. Do you have a specific
@patrick - It seems like my point about being able to inherit the root pom
was addressed and there's a way to handle this.
The larger point I meant to make is that Maven is by far the most common
build tool in projects that are likely to share contributors with Spark. I
personally know 10 people
i don't buy the argument that we should use it because it's the most common.
if all we did was use what is most common, then we should switch to
java, svn and maven
On Wed, Feb 26, 2014 at 1:38 PM, Mark Grover grover.markgro...@gmail.com wrote:
Hi Patrick,
And, to pile on what Sandy said.
Mark,
No, I haven't tried this myself yet :-p Also, I would expect that
sbt-pom-reader does not do assemblies at all, since that is handled by a
separate SBT plugin, so we would still need code to pull in sbt-assembly.
There is also the tricky question of how to include the assembly stuff
into sbt-pom-reader
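In case it helps, wiring the two plugins side by side in project/plugins.sbt
would presumably look something like the following; the coordinates and
versions are guesses from memory, not verified:

  // project/plugins.sbt -- a sketch; check each plugin's docs for the real coordinates.
  addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
  addSbtPlugin("com.typesafe.sbt" % "sbt-pom-reader" % "1.0.0")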
Yes, but the POM generated in that fashion is only sufficient for linking
with Spark, not for building Spark or serving as a basis from which to
build a customized Spark with Maven. So, starting from SparkBuild.scala
and generating a POM with make-pom, those who wish to build a customized
Spark
Can't Maven POMs include other ones? So what if we remove the
artifact specs from the main pom, have them generated by sbt make-pom,
and include the generated file in the main pom.xml? I guess I'm just
trying to figure out how much this would help (it seems at least it
would remove the issue of
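For reference, the sbt side of that idea is the built-in makePom task; a
sketch of shaping its output (the pomExtra content below is an arbitrary
example):

  // build.sbt -- a sketch. `sbt makePom` writes the generated POM under
  // target/scala-<binary version>/<artifact>-<version>.pom; pomExtra appends
  // raw XML to it.
  pomExtra :=
    <licenses>
      <license>
        <name>Apache License, Version 2.0</name>
        <url>http://www.apache.org/licenses/LICENSE-2.0.html</url>
      </license>
    </licenses>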
Actually you can control exactly how sbt assembly merges or resolves conflicts.
I believe the default settings, however, lead to an ordering which cannot be
controlled.
I do wish for a smarter fat jar plugin.
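A sketch of what that control looks like with sbt-assembly 0.11.x-era keys;
the specific merge rules are just typical examples, not Spark's:

  // build.sbt -- a rough sketch.
  import sbtassembly.Plugin._
  import AssemblyKeys._

  assemblySettings

  mergeStrategy in assembly <<= (mergeStrategy in assembly) { old =>
    {
      case PathList("META-INF", xs @ _*) => MergeStrategy.discard // throw away META-INF entries (manifests, signature files)
      case "reference.conf"              => MergeStrategy.concat  // merge Typesafe config files instead of picking one
      case x                             => old(x)                // defer to the plugin's defaults otherwise
    }
  }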
-Evan
To be free is not merely to cast off one's chains, but to live in a way that