Thanks, all!

I figured it out. I had thought "sbt package" was enough; running "sbt assembly" fixed it.
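For anyone else who hits this, a minimal sketch of the build steps, assuming the Spark 0.9.0 source tree layout shown in the error message (paths may differ for other versions):

```shell
# From the Spark source root. "sbt package" only builds the per-module
# jars; spark-shell looks for the single assembly jar, built with:
sbt/sbt assembly

# The assembly jar lands under (for Scala 2.10):
ls assembly/target/scala-2.10/

# After that, spark-shell should start normally:
./bin/spark-shell
```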




2014-03-17 21:46 GMT-04:00 Debasish Das <debasish.da...@gmail.com>:

> You need the Spark assembly jar to run spark-shell. Please run "sbt
> assembly" to generate the jar.
> On Mar 17, 2014 2:11 PM, "Yexi Jiang" <yexiji...@gmail.com> wrote:
>
>> Hi,
>>
>> I am a beginner of Spark.
>> Currently I am trying to install spark on my laptop.
>>
>> I followed the tutorial at
>> http://spark.apache.org/screencasts/1-first-steps-with-spark.html (The
>> only difference is that I installed scala-2.10.1 instead of 2.9.2).
>>
>> I packaged Spark successfully with "sbt package" and configured
>> spark-env.sh according to the tutorial.
>>
>> Now when I execute spark-shell, I get the following error:
>>
>> Failed to find Spark assembly in
>> "PATH/spark-0.9.0-incubating/assembly/target/scala-2.10/"
>> You need to build Spark with 'sbt/sbt assembly' before running this
>> program.
>>
>> Could anyone tell me what the problem is?
>>
>> Thank you very much!
>>
>> Regards,
>> Yexi
>>


-- 
------
Yexi Jiang,
ECS 251,  yjian...@cs.fiu.edu
School of Computer and Information Science,
Florida International University
Homepage: http://users.cis.fiu.edu/~yjian004/
