Hi,

You can use sbt-assembly to create an uber jar. You should mark the Spark
libraries as "provided" in your SBT build so they are compiled against but
not bundled into the jar (the cluster already supplies them at runtime).
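A minimal sketch of what that looks like (plugin version, project name and
class name below are illustrative, not from your build):

project/plugins.sbt:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

build.sbt:

name := "spark-streaming-app"   // hypothetical project name

version := "1.0"

scalaVersion := "2.10.6"        // match the _2.10 artifacts you are using

libraryDependencies ++= Seq(
  // spark-core and spark-streaming are already on every node of the
  // cluster, so mark them "provided": compiled against, but kept out
  // of the uber jar
  "org.apache.spark" %% "spark-core"            % "1.6.1" % "provided",
  "org.apache.spark" %% "spark-streaming"       % "1.6.1" % "provided",
  // the Kafka integration is NOT shipped with Spark, so leave it in the
  // default (compile) scope and it gets bundled into the uber jar
  "org.apache.spark" %% "spark-streaming-kafka" % "1.6.1"
)

Then "sbt assembly" builds target/scala-2.10/spark-streaming-app-assembly-1.0.jar,
and you can submit that single jar from any node without --jars:

${SPARK_HOME}/bin/spark-submit \
    --class com.example.YourStreamingApp \
    target/scala-2.10/spark-streaming-app-assembly-1.0.jar

Depending on your other dependencies, you may also need an
assemblyMergeStrategy setting to resolve duplicate files during assembly.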
HTH,
Marco

PS: Apologies if by any chance I'm telling you something you already know.
On 4 Apr 2016 2:36 pm, "Mich Talebzadeh" <mich.talebza...@gmail.com> wrote:

> Hi,
>
>
> When one builds a project for Spark, in this case Spark Streaming, with SBT,
> I add the dependencies as usual:
>
> libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.6.1"
> libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.6.1"
>
> However, when I submit it through spark-submit, I need to pass the jar
> containing KafkaUtils, the same way I do in spark-shell:
>
> ${SPARK_HOME}/bin/spark-submit \
>     --jars /home/hduser/jars/spark-streaming-kafka-assembly_2.10-1.6.1.jar \
>     .....
>
> Now, if I want to distribute this as an all-in-one package so that it can be
> run from any node, I have been told that I need to create an uber-jar. I have
> not done this before, so I assume an uber-jar will be totally self-contained,
> with all the classes etc.
>
> Can someone elaborate on this please?
>
> Thanks
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn:
> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
