So follow the instruction in the error message and remove the extra spark-assembly jar from <spark>/assembly/target. Alternatively, remove all of <spark>/assembly/target and run `./sbt/sbt assembly/assembly`, or run `./sbt/sbt clean` before redoing `./sbt/sbt assembly`. In any case, you have an extra assembly jar left over from a prior build that was not cleaned before the new assembly was built.
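Concretely, something along these lines should do it (a rough sketch, run from the top of your Spark directory; the exact subdirectory and jar names under assembly/target can vary with your Scala version and Hadoop profile):

    # see which assembly jars are lying around; there should be only one
    find assembly/target -name 'spark-assembly*.jar'

    # option 1: delete the stale jar by hand (keep the one you just built)

    # option 2: wipe the old build output and rebuild the assembly
    rm -rf assembly/target
    ./sbt/sbt assembly/assembly

    # option 3: clean the whole build, then reassemble
    ./sbt/sbt clean
    ./sbt/sbt assembly

After that, ./start-master.sh should find exactly one assembly jar and start normally.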
On Sun, Jan 5, 2014 at 7:37 PM, danoomistmatiste <[email protected]> wrote:

> Hi, I have installed and built spark-0.8.1-incubating-bin-cdh4 with
> sbt/sbt assembly. I am running this with scala 2.9.3. When i try to start
> spark master (./start-master.sh), I get this error message.
>
> failed to launch org.apache.spark.deploy.master.Master:
> spark-assembly_2.9.3-0.8.1-incubating-hadoop2.0.0-mr1-cdh4.2.0.jar
> Please remove all but one jar.
> full log in
> /Users/hadoop/spark-0.8.1-incubating-bin-cdh4/bin/../logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-localhost.out
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Cannot-start-spark-master-tp301.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
