On Wed, Sep 10, 2014 at 3:48 PM, Steve Lewis <lordjoe2...@gmail.com> wrote:
> In modern projects there are a bazillion dependencies. When I use Hadoop,
> I just put them in a lib directory in the jar. If I have a project that
> depends on 50 jars, I need a way to deliver them to Spark. Maybe wordcount
> can be written without dependencies, but real projects need to deliver
> their dependencies to the cluster.

A solution that may not be super-pretty but works (and is used by
Spark itself) is to use the maven-shade-plugin to package all
dependencies into your app jar. See assembly/pom.xml in the Spark repo
for an example; it's worth following because with this approach you
have to manually handle certain files that would otherwise conflict
when the jars are merged.
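
For illustration, a minimal shade-plugin configuration could look
something like the sketch below. The plugin version and the exact
transformer list are assumptions on my part; check assembly/pom.xml
for what Spark actually configures.

  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <!-- Version is an assumption; use whatever your build standardizes on. -->
    <version>2.3</version>
    <executions>
      <execution>
        <phase>package</phase>
        <goals>
          <goal>shade</goal>
        </goals>
        <configuration>
          <filters>
            <filter>
              <!-- Strip signature files from signed dependency jars;
                   stale signatures make the merged jar fail to load. -->
              <artifact>*:*</artifact>
              <excludes>
                <exclude>META-INF/*.SF</exclude>
                <exclude>META-INF/*.DSA</exclude>
                <exclude>META-INF/*.RSA</exclude>
              </excludes>
            </filter>
          </filters>
          <transformers>
            <!-- Merge META-INF/services files instead of letting one
                 dependency's copy overwrite another's. -->
            <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
            <!-- Concatenate reference.conf files (Typesafe Config),
                 which would otherwise clobber each other. -->
            <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
              <resource>reference.conf</resource>
            </transformer>
          </transformers>
        </configuration>
      </execution>
    </executions>
  </plugin>

With that in place, "mvn package" produces one jar containing your
classes plus all their dependencies, which you can then hand to
spark-submit as a single file.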

-- 
Marcelo
