Mark, thank you for your prompt response. I followed the instructions, removed target, and rebuilt Spark (sbt assembly). Now I am able to start the master. The instructions say, however, that it is supposed to print the port and URL with which the slave can be started, but I am not able to see that. What is the default port on which the slave is started, and what is the command for it? Also, how many daemons are needed for a standalone Spark instance running on Mac OS X?
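For reference, here is a sketch of how a worker (slave) is typically attached to a standalone master in the Spark 0.8.x line. The master URL `spark://localhost:7077` below is illustrative: 7077 is the default master port, and the master's web UI (which also shows the URL) defaults to http://localhost:8080. Script locations can vary slightly between releases, so treat the paths as assumptions.

```shell
# Start the standalone master; its log (and the web UI on port 8080)
# reports the master URL, e.g. spark://localhost:7077.
./bin/start-master.sh

# Attach a worker to the master, passing the master URL from the log.
# In 0.8.x a worker can be launched directly via the spark-class helper.
./spark-class org.apache.spark.deploy.worker.Worker spark://localhost:7077
```

For a minimal standalone cluster on a single machine (including Mac OS X), two daemons are enough: one Master and one Worker.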
On Sunday, January 5, 2014 8:13 PM, Mark Hamstra <[email protected]> wrote:

So follow the instructions and remove the extra spark-assembly jar from <spark>/assembly/target. Or remove all of <spark>/assembly/target and do `./sbt/sbt assembly/assembly`, or do `./sbt/sbt clean` before redoing `./sbt/sbt assembly`. In any case, you've got an extra assembly jar left over from a prior build that you did not clean before building the new assembly.

On Sun, Jan 5, 2014 at 7:37 PM, danoomistmatiste <[email protected]> wrote:

> Hi, I have installed and built spark-0.8.1-incubating-bin-cdh4 with sbt/sbt assembly. I am running this with Scala 2.9.3. When I try to start the Spark master (./start-master.sh), I get this error message:
>
> failed to launch org.apache.spark.deploy.master.Master:
>   spark-assembly_2.9.3-0.8.1-incubating-hadoop2.0.0-mr1-cdh4.2.0.jar
>   Please remove all but one jar.
> Full log in
> /Users/hadoop/spark-0.8.1-incubating-bin-cdh4/bin/../logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-localhost.out
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Cannot-start-spark-master-tp301.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
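Mark's advice amounts to either of the following, run from the Spark source root (directory layout per the 0.8.x sbt build):

```shell
# Option 1: delete the stale assembly output, then rebuild only the assembly
rm -rf assembly/target
./sbt/sbt assembly/assembly

# Option 2: clean the whole build before reassembling
./sbt/sbt clean
./sbt/sbt assembly
```

Either way, the point is that `sbt assembly` does not delete the assembly jar from a previous build, so two jars end up under assembly/target and the launch scripts refuse to pick one.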
