Hi Pramod,

If you are using sbt as your build, you only need to run sbt assembly once,
and after that you can use sbt ~compile for incremental rebuilds. Also export
SPARK_PREPEND_CLASSES=1 in your shell, on all nodes.
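
Roughly, the workflow from the Spark source root looks like this (just a
sketch; the exact sbt invocation, e.g. the bundled build/sbt wrapper, may
differ in your setup):

    # Build the assembly jar once (repeat only when dependencies change)
    sbt assembly

    # Prepend freshly compiled classes to Spark's classpath; export this in
    # the shell on the driver and on every node
    export SPARK_PREPEND_CLASSES=1

    # Recompile continuously as you edit source files
    sbt ~compile
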
Maybe you can try this out?

Thanks,

Prashant Sharma



On Fri, May 1, 2015 at 2:16 PM, Pramod Biligiri <pramodbilig...@gmail.com>
wrote:

> Hi,
> I'm making some small changes to the Spark codebase and trying it out on a
> cluster. I was wondering if there's a faster way to build than running the
> package target each time.
> Currently I'm using: mvn -DskipTests package
>
> All the nodes have the same filesystem mounted at the same mount point.
>
> Pramod
>
