You need an uber (fat) jar file.

Have you actually set up the dependencies and the project sub-directory structure for the build?

Check this:

http://stackoverflow.com/questions/28459333/how-to-build-an-uber-jar-fat-jar-using-sbt-within-intellij-idea

Of the three answers there, the top one is the relevant one.
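
In short, sbt package only packages your own classes; to bundle the
dependencies as well you need a plugin such as sbt-assembly, which is what
that answer uses. A minimal sketch, assuming sbt-assembly 0.14.x (adjust the
plugin version to match your sbt release):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

// build.sbt -- marking the Spark artifacts as "provided" keeps them out of
// the fat jar, since spark-submit already puts them on the classpath
libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.6.1" % "provided",
    "org.apache.spark" %% "spark-sql" % "1.6.1" % "provided"
)

Then run sbt assembly instead of sbt package and pass the fat jar it creates
under target/scala-2.10/ to spark-submit.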

I started reading the official SBT tutorial:
http://www.scala-sbt.org/0.13/tutorial/

HTH

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 20 July 2016 at 09:54, Sachin Mittal <sjmit...@gmail.com> wrote:

> Hi,
> I am following the example under
> https://spark.apache.org/docs/latest/quick-start.html
> for a standalone Scala application.
>
> I added all my dependencies via build.sbt (one dependency is under the lib
> folder).
>
> When I run sbt package I see the jar created under
> target/scala-2.10/
>
> So compilation seems to be working fine. However, when I inspect that jar,
> it only contains my Scala class.
> Unlike a standalone jar built for a Java application, which contains all
> the dependencies inside it, here all the dependencies are missing.
>
> So, as expected, when I run the application via spark-submit I get a
> NoClassDefFoundError.
>
> Here is my build.sbt
>
> name := "Test Advice Project"
> version := "1.0"
> scalaVersion := "2.10.6"
> libraryDependencies ++= Seq(
>     "org.apache.spark" %% "spark-core" % "1.6.1",
>     "org.apache.spark" %% "spark-sql" % "1.6.1"
> )
>
> Can anyone please guide me as to what is going wrong and why sbt package
> is not including the classes from the dependency jars in the new jar?
>
> Thanks
> Sachin
>
>
> On Tue, Jul 19, 2016 at 8:23 PM, Andrew Ehrlich <and...@aehrlich.com>
> wrote:
>
>> Yes, spark-core will depend on Hadoop and several other jars.  Here’s the
>> list of dependencies:
>> https://github.com/apache/spark/blob/master/core/pom.xml#L35
>>
>> Whether you need spark-sql depends on whether you will use the DataFrame
>> API. Without spark-sql, you will just have the RDD API.
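>>
>> As a rough illustration (assuming the usual sc and sqlContext handles from
>> the Spark shell or your own application):
>>
>> val lines = sc.textFile("data.txt")        // RDD API: spark-core is enough
>> val df = sqlContext.read.json("data.json") // DataFrame API: needs spark-sql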
>>
>> On Jul 19, 2016, at 7:09 AM, Sachin Mittal <sjmit...@gmail.com> wrote:
>>
>>
>> Hi,
>> Can someone please guide me on which jars I need to place in the lib
>> folder of the project to build a standalone Scala application via sbt?
>>
>> Note that I need to provide static dependencies and cannot download the
>> jars using libraryDependencies, so I need to provide all the jars upfront.
>>
>> So far I found that we need:
>> spark-core_<version>.jar
>>
>> Do we also need
>> spark-sql_<version>.jar
>> and
>> hadoop-core-<version>.jar
>>
>> Is there any jar on the Spark side that I may be missing? I found that
>> spark-core needs the hadoop-core classes, and if I don't add them sbt
>> gives me this error:
>> [error] bad symbolic reference. A signature in SparkContext.class refers
>> to term hadoop
>> [error] in package org.apache which is not available.
>>
>> So I was just confused about the library dependency part of building an
>> application via sbt. Any inputs here would be helpful.
>>
>> Thanks
>> Sachin
>>
