What do the master logs show?

Best Regards,
Sonal
Founder, Nube Technologies <http://www.nubetech.co/>
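A minimal sketch of how one might check the standalone master's logs and connectivity (the log path and the `hadoopm0` host/ports are assumptions based on a default Spark standalone install; adjust for your setup):

```shell
# On the master host (hadoopm0): tail the standalone master log.
# The standalone start scripts write logs under $SPARK_HOME/logs by default.
tail -n 100 "$SPARK_HOME"/logs/spark-*-org.apache.spark.deploy.master.Master-*.out

# From the client machine: verify the master RPC port is reachable.
nc -z -v hadoopm0 7077

# Compare the spark:// URL shown at the top of the Master UI (port 8080)
# with the one passed to spark-shell -- they must match exactly,
# including the hostname form (hostname vs. IP address).
curl -s http://hadoopm0:8080 | grep -o 'spark://[^<"]*'
```

On Spark 1.x, "All masters are unresponsive" often means the URL the client uses does not match the address the master actually bound to (see SPARK_MASTER_IP in conf/spark-env.sh), since the address check is strict.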
Check out Reifier at Spark Summit 2015 <https://spark-summit.org/2015/events/real-time-fuzzy-matching-with-spark-and-elastic-search/>
<http://in.linkedin.com/in/sonalgoyal>

On Mon, Aug 3, 2015 at 7:46 AM, Angel Angel <areyouange...@gmail.com> wrote:

> Hello Sir,
>
> I have installed Spark.
>
> The local spark-shell works fine, but whenever I try the master
> configuration I get some errors.
>
> When I run this command:
>
> MASTER=spark://hadoopm0:7077 spark-shell
>
> I get errors like:
>
> 15/07/27 21:17:26 INFO AppClient$ClientActor: Connecting to master
> spark://hadoopm0:7077...
>
> 15/07/27 21:17:46 ERROR SparkDeploySchedulerBackend: Application has been
> killed. Reason: All masters are unresponsive! Giving up.
>
> 15/07/27 21:17:46 WARN SparkDeploySchedulerBackend: Application ID is not
> initialized yet.
>
> 15/07/27 21:17:46 ERROR TaskSchedulerImpl: Exiting due to error from
> cluster scheduler: All masters are unresponsive! Giving up.
>
> I have also attached a screenshot of the Master UI.
>
> I have also tested with the telnet command; it shows that hadoopm0 is
> connected.
>
> Can you please give me some references or documentation, or tell me how
> to solve this issue?
>
> Thanks in advance.
>
> Thanking you,
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>