On 09/20/2014 03:17 PM, James wrote:
> 
> OK, that's behind me now...
> 
> So the build fails, so I figure I'll just build it manually, then
> finish the ebuild. So I went to:
> 
> 
> /var/tmp/portage/sys-cluster/spark-1.1.0/work/spark-1.1.0
> and no configure scripts....
> 
> The README.md has this:
> 
> Spark is built on Scala 2.10. To build Spark and its example programs, run:
> 
>     ./sbt/sbt assembly
> 
> I did and it looks like the manual compile worked:
> 
> [info] Packaging
> /var/tmp/portage/sys-cluster/spark-1.1.0/work/spark-1.1.0/
> examples/target/scala-2.10/spark-examples-1.1.0-hadoop1.0.4.jar
> ...
> [info] Done packaging.
> [success] Total time: 786 s, completed Sep 20, 2014 3:04:22 PM
> 
> So I need to add commands to the ebuild to launch
> " ./sbt/sbt assembly"
> 
> I've been all over the man 5 ebuild and the devmanual. So naturally
> I've seen what to do, but missed it.
> 

Short answer: just put the command in the ebuild (which is nothing but a
fancy bash script). It'll run. For example,

src_compile() {
        ./sbt/sbt assembly || die "assembly build failed"
        # ...the rest of the build commands go here...
}
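For context, that phase function would sit inside an otherwise ordinary ebuild. Here is a rough sketch of what the whole file might look like; the SRC_URI, KEYWORDS, and install location are illustrative guesses, not tested values (the jar path is the one from your build output, with the version parameterized):

```shell
# Hypothetical sys-cluster/spark-1.1.0.ebuild skeleton -- untested,
# metadata values are placeholders for illustration only.
EAPI=5

DESCRIPTION="Lightning-fast cluster computing framework"
HOMEPAGE="https://spark.apache.org/"
SRC_URI="mirror://apache/spark/spark-${PV}/${P}.tgz"

LICENSE="Apache-2.0"
SLOT="0"
KEYWORDS="~amd64"

src_compile() {
        # Note: sbt downloads Scala and all jar dependencies itself,
        # which requires network access -- normally forbidden during
        # the build phases, so this is a real problem to solve later.
        ./sbt/sbt assembly || die "assembly build failed"
}

src_install() {
        insinto /usr/share/${PN}/lib
        # Path taken from the build output above; adjust as needed.
        doins examples/target/scala-2.10/spark-examples-${PV}-hadoop1.0.4.jar \
                || die "install failed"
}
```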

The long answer is that we usually have eclasses for different build
systems. The way eclass inheritance works, the eclasses override certain
phases of the ebuild. So for example, the haskell-cabal eclass knows
that it should run `runghc Setup.hs configure` or something like that
instead of ./configure (like we would run by default).
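To make the mechanism concrete, here is a sketch of what a hypothetical sbt eclass could look like. The name `scala-sbt.eclass` and the `SBT_TARGET` variable are invented for illustration; no such eclass exists in the tree as far as I know:

```shell
# scala-sbt.eclass (hypothetical) -- provides a default src_compile
# for packages built with sbt.

# @ECLASS-VARIABLE: SBT_TARGET
# @DESCRIPTION: sbt task to run during src_compile (default: assembly)
: ${SBT_TARGET:=assembly}

scala-sbt_src_compile() {
        ./sbt/sbt "${SBT_TARGET}" || die "sbt ${SBT_TARGET} failed"
}

# Replace the default src_compile phase in any ebuild that inherits
# this eclass with the function defined above.
EXPORT_FUNCTIONS src_compile
```

An ebuild would then just say `inherit scala-sbt` and get that src_compile for free, the same way haskell-cabal consumers do.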

Unfortunately, I don't see any other ebuilds for Scala packages in the
tree, so you're probably the first person to need such an eclass. Rather
than hold yourself up writing one, you're probably better off calling
the build commands directly in your ebuild and waiting for somebody
else to write the eclass.

Someone may be working on Scala in an overlay; the java herd takes
care of dev-lang/scala, so perhaps you can ask in #gentoo-java.
