I ran into a similar error last week when building Spark 0.8.0 from its
source.  I had upgraded from 0.7.3 by using git to check out the 0.8.0 tag.
 In my case, the problem was that the ./assembly/target/scala-2.9.3 directory
contained two different versions of the Spark assembly jar, and I guess the
older version was being picked up in the scripts' classpaths.  Doing a
clean build via `sbt/sbt clean assembly` removed the outdated jars and
solved the problem for me.
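
To illustrate, here is a rough sketch of how you can spot the symptom — more
than one assembly jar sitting in the target directory.  The paths below are
simulated for the example (your real target directory would be something like
assembly/target/scala-2.9.3 inside the Spark checkout); the actual fix is
running `sbt/sbt clean assembly` from the Spark root.

```shell
# Simulate a target dir left over from an upgrade: both the old and the
# new assembly jar are present (jar names here are hypothetical).
target=$(mktemp -d)
touch "$target/spark-assembly-0.7.3.jar" "$target/spark-assembly-0.8.0.jar"

# More than one assembly jar is the red flag -- the scripts may put the
# stale one on the classpath.
ls "$target"/spark-assembly-*.jar | wc -l   # -> 2

rm -rf "$target"

# The remedy (run from the Spark source root; not executed here):
#   sbt/sbt clean assembly
```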

- Josh


On Thu, Oct 10, 2013 at 2:14 AM, vinayak navale <[email protected]> wrote:

> Hi,
>
> I am trying to install the latest version of Spark, i.e. 0.8.0.
>
> When I run this, I get the following error.
>
> # ./start-master.sh
> starting org.apache.spark.deploy.master.Master, logging to
> /opt/spark/spark/bin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-nymong04.adnear.net.out
> failed to launch org.apache.spark.deploy.master.Master:
>
> Error: Could not find or load main class
> org.apache.spark.deploy.master.Master
> full log in
> /opt/spark/spark/bin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-nymong04.adnear.net.ou
>
> Any help will be appreciated.
>
> Thanks,
> Vinayak.
>
>
>
