Hi Steve,

Your Spark master is not running if you have not started it. On Windows, the
distribution is missing some scripts and/or the correct installation instructions.
I was able to start the master with

C:\> spark-class.cmd org.apache.spark.deploy.master.Master
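With default settings, the master binds its service port to 7077 and its web UI to 8080; both can also be set explicitly on the command line (a sketch, not something from the original thread — the values shown are just the defaults):

```shell
:: Start the standalone master with explicit ports
:: (7077 is the default service port, 8080 the default web UI port).
C:\> spark-class.cmd org.apache.spark.deploy.master.Master --port 7077 --webui-port 8080
```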

Then, opening the master's web UI in a browser (localhost:8080 by default), you get 
the “spark url”, which uses port 7077. This Spark URL has to be used to start the 
Spark worker:


C:\>spark-class.cmd org.apache.spark.deploy.worker.Worker spark://D-113052037.wipro.com:7077

I also ran the master and worker on different Windows machines.
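To make the URL shape concrete: the driver must be given either a local-mode string such as local[*], or a standalone URL of the form spark://<host>:<port> — never a mix of the two (spark://local[*]:7707 combines both and cannot work). A small, hypothetical plain-Java helper (not a Spark API) that illustrates the rule:

```java
// Hypothetical illustration (not part of Spark): builds a standalone
// master URL and rejects mixed forms such as spark://local[*]:7707.
public class MasterUrl {
    static final int DEFAULT_PORT = 7077; // standalone master default

    static String standalone(String host, int port) {
        // local[...] is a separate deployment mode, not a host name.
        if (host.startsWith("local")) {
            throw new IllegalArgumentException(
                "use plain \"" + host + "\" as the master, without spark://");
        }
        return "spark://" + host + ":" + port;
    }

    public static void main(String[] args) {
        // Prints spark://D-113052037.wipro.com:7077
        System.out.println(standalone("D-113052037.wipro.com", DEFAULT_PORT));
    }
}
```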

Also, I don’t think spark-shell is an alternative to starting the Spark 
master.
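That said, spark-shell can connect to an already-running master as a client via the --master option (a sketch, assuming the .cmd wrapper passes arguments through; the host is the example from above):

```shell
:: Connect the shell to an existing standalone master (this does not start one).
C:\> spark-shell.cmd --master spark://D-113052037.wipro.com:7077
```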

The Spark instructions and most of the scripts are geared towards Unix systems.

Regards,
Prajod

From: Steve Lewis [mailto:lordjoe2...@gmail.com]
Sent: Thursday, August 21, 2014 10:08 PM
To: user@spark.apache.org
Subject: I am struggling to run Spark Examples on my local machine

I downloaded the binaries for spark-1.0.2-hadoop1 and unpacked them on my Windows 8 
box.
I can execute spark-shell.cmd and get a command window 
which does the proper things.
I open a browser to http://localhost:4040 and a window comes up describing the 
spark master.

Then, using IntelliJ, I create a project with JavaWordCount from the Spark 
distribution and add


When I run the job with -Dspark.master=spark://local[*]:7707 (I have tried 
MANY other strings), the job fails with a failure to connect to the Spark master.

So my questions are:
1) Do I have a Spark master running? How can I tell? Doesn't the web page say 
it is running?
2) How do I find the port on which the master is running, and test that it is 
accepting jobs?
3) Are there other steps I need to take before I can run a simple Spark sample?

14/08/21 09:27:08 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
14/08/21 09:27:23 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
...

14/08/21 09:28:08 ERROR cluster.SparkDeploySchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
14/08/21 09:28:08 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool
14/08/21 09:28:08 INFO scheduler.TaskSchedulerImpl: Cancelling stage 1
14/08/21 09:28:08 INFO scheduler.DAGScheduler: Failed to run collect at JavaWordCount.java:68
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: All masters are unresponsive! Giving up.




