Maintaining two builds is indeed a pain, since it's an ongoing chore to keep
them in sync. For example, I am already seeing that the two do not quite
declare the same dependencies (see recent patch).

I think publishing artifacts to Maven Central should be considered a
hard requirement if it isn't already one from the ASF -- and it may
already be. Certainly most people out there would be shocked to learn
that Spark is not in the repo at all. And that requires at least
maintaining a pom that declares the structure of the project.
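
Such a pom need not drive the build at all; a minimal sketch might look
like the following (the coordinates, version, and module names here are
purely illustrative, not Spark's actual ones):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <!-- illustrative coordinates only -->
  <groupId>org.example.spark</groupId>
  <artifactId>spark-parent</artifactId>
  <version>0.9.0-SNAPSHOT</version>
  <!-- packaging "pom" means this artifact only aggregates modules
       and declares shared metadata; it produces no jar itself -->
  <packaging>pom</packaging>
  <modules>
    <module>core</module>
    <module>examples</module>
  </modules>
</project>
```

Even a skeleton like this is enough for downstream tools to resolve the
project's structure and dependencies from the repository.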

This does not necessarily mean using Maven to build, but it is a reason
that removing the pom would make the project a lot harder for people to
consume.

Maven has its pros and cons, but there are plenty of people lurking
around who know it quite well. Certainly it's easier for the Hadoop
people to understand and work with. On the other hand, its Scala
support comes only via a plugin, which is weaker. sbt seems like a
fairly new, basic, ad hoc tool. Is there an advantage to it, other
than being Scala-native (which is itself an advantage)?

--
Sean Owen | Director, Data Science | London


On Fri, Feb 21, 2014 at 4:03 AM, Patrick Wendell <pwend...@gmail.com> wrote:
> Hey All,
>
> It's very high overhead having two build systems in Spark. Before
> getting into a long discussion about the merits of sbt vs maven, I
> wanted to pose a simple question to the dev list:
>
> Is there anyone who feels that dropping either sbt or maven would have
> a major consequence for them?
>
> And I say "major consequence" meaning something becomes completely
> impossible now and can't be worked around. This is different from an
> "inconvenience", i.e., something which can be worked around but will
> require some investment.
>
> I'm posing the question in this way because, if there are features in
> either build system that are absolutely unavailable in the other,
> then we'll have to maintain both for the time being. I'm merely trying
> to see whether this is the case...
>
> - Patrick
