Hi Pillis,

I ran into the same problem here. Could you share more specifically how you 
solved the issue?
I added an entry in /etc/hosts, but it doesn't help.

From: Pillis W [mailto:pillis.w...@gmail.com]
Sent: Sunday, February 09, 2014 4:49 AM
To: u...@spark.incubator.apache.org
Subject: Re: Akka Connection refused - standalone cluster using spark-0.9.0

I fixed my issue - two IP addresses had the same hostname.
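For anyone hitting this later: Pillis's fix suggests checking whether any hostname is mapped to more than one IP address in /etc/hosts. A minimal sketch of that check (plain standard hosts-file parsing, nothing Spark-specific; the sample entries are made up):

```python
from collections import defaultdict

def duplicate_hostnames(hosts_text):
    """Return hostnames that are mapped to more than one IP address."""
    ips_by_name = defaultdict(set)
    for line in hosts_text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blank lines
        if not line:
            continue
        parts = line.split()
        ip, names = parts[0], parts[1:]
        for name in names:
            ips_by_name[name].add(ip)
    return {name: sorted(ips) for name, ips in ips_by_name.items() if len(ips) > 1}

# Example: 'worker1' resolves to two different addresses -- the situation
# Pillis describes, which confuses Akka's remote addressing.
sample = """\
127.0.0.1   localhost
10.0.0.5    worker1
10.0.0.6    worker1
10.0.0.7    master
"""
print(duplicate_hostnames(sample))  # {'worker1': ['10.0.0.5', '10.0.0.6']}
```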
Regards



On Fri, Feb 7, 2014 at 12:59 PM, Soumya Simanta 
<soumya.sima...@gmail.com> wrote:
I see similar logs but only when I try to run a standalone Scala program. The 
whole setup works just fine if I'm using the spark-shell/REPL.



On Fri, Feb 7, 2014 at 3:05 PM, mohankreddy 
<mre...@beanatomics.com> wrote:
Here's more information. I have the master up, but when I try to bring the
workers up I get the following error.

log4j:WARN No appenders could be found for logger
(akka.event.slf4j.Slf4jLogger).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
more info.
14/02/07 15:01:17 INFO Worker: Using Spark's default log4j profile:
org/apache/spark/log4j-defaults.properties
14/02/07 15:01:17 INFO Worker: Starting Spark worker yyyyyyy:58020 with 16
cores, 67.0 GB RAM
14/02/07 15:01:17 INFO Worker: Spark home: /opt/spark
14/02/07 15:01:17 INFO WorkerWebUI: Started Worker web UI at
http://yyyyyyyyy:8081
14/02/07 15:01:17 INFO Worker: Connecting to master spark://xxxxx/:7077...
14/02/07 15:01:17 INFO RemoteActorRefProvider$RemoteDeadLetterActorRef:
Message [org.apache.spark.deploy.DeployMessages$RegisterWorker] from
Actor[akka://sparkWorker/user/Worker#2037095035] to
Actor[akka://sparkWorker/deadLetters] was not delivered. [1] dead letters
encountered. This logging can be turned off or adjusted with configuration
settings 'akka.log-dead-letters' and
'akka.log-dead-letters-during-shutdown'.
14/02/07 15:01:37 INFO Worker: Connecting to master spark://xxxxx/:7077...
14/02/07 15:01:37 INFO RemoteActorRefProvider$RemoteDeadLetterActorRef:
Message [org.apache.spark.deploy.DeployMessages$RegisterWorker] from
Actor[akka://sparkWorker/user/Worker#2037095035] to
Actor[akka://sparkWorker/deadLetters] was not delivered. [2] dead letters
encountered. This logging can be turned off or adjusted with configuration
settings 'akka.log-dead-letters' and
'akka.log-dead-letters-during-shutdown'.
14/02/07 15:01:57 INFO Worker: Connecting to master spark://xxxx/:7077...
14/02/07 15:01:57 INFO RemoteActorRefProvider$RemoteDeadLetterActorRef:
Message [org.apache.spark.deploy.DeployMessages$RegisterWorker] from
Actor[akka://sparkWorker/user/Worker#2037095035] to
Actor[akka://sparkWorker/deadLetters] was not delivered. [3] dead letters
encountered. This logging can be turned off or adjusted with configuration
settings 'akka.log-dead-letters' and
'akka.log-dead-letters-during-shutdown'.
14/02/07 15:02:17 ERROR Worker: All masters are unresponsive! Giving up.



PS: I masked the IPs
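The repeated dead-letter messages in the log above mean the worker's RegisterWorker message never reached the master, i.e. no connection to the master's port was ever established. Before digging into Akka settings, it can help to confirm plain TCP reachability of the exact host name used in the spark:// URL (port 7077 is the standalone master's default). A rough diagnostic sketch; the host names here are placeholders:

```python
import socket

def master_reachable(host, port=7077, timeout=3.0):
    """True if a plain TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Use the exact host string from the spark:// URL. If this returns False,
# the problem is network/DNS level (firewall, wrong hostname, /etc/hosts
# mismatch), not Spark configuration.
```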



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Akka-Connection-refused-standalone-cluster-using-spark-0-9-0-tp1297p1311.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.