I have run into a similar problem. My solution is: always guarantee that there is only one IP address per hostname.
For example, in /etc/hosts, you shouldn't let this happen:

127.0.1.1    base
168.144.8.8  base

On Fri, Nov 1, 2013 at 9:00 PM, Chen Jingci <[email protected]> wrote:

> Can you try to use the IP address instead of the name 'base'? I also
> experienced the same problem before; it worked after I changed it to the IP.
>
> Thanks,
> Chen Jingci
> ------------------------------
> From: Thorsten Bergler <[email protected]>
> Sent: 2/11/2013 2:03
> To: [email protected]
> Subject: Spark with HDFS: ERROR Worker: Connection to master failed! Shutting down.
>
> Hello,
>
> I am new to Spark and took my first steps with it today.
>
> Right now I am having trouble with the error:
>
> ERROR Worker: Connection to master failed! Shutting down.
>
> So far, I have found out the following:
>
> The standalone version of Spark (without Hadoop HDFS and YARN) works
> perfectly. The master starts, and the worker also starts and gets
> registered with the master.
>
> Log file of the master:
>
> Spark Command: /usr/lib/jvm/jdk//bin/java -cp
> :/home/thorsten/spark/conf:/home/thorsten/spark/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop1.0.4.jar
> -Djava.library.path= -Xms512m -Xmx512m
> org.apache.spark.deploy.master.Master --ip base --port 7077 --webui-port 8080
> ========================================
>
> 13/11/01 18:51:14 INFO Slf4jEventHandler: Slf4jEventHandler started
> 13/11/01 18:51:14 INFO Master: Starting Spark master at spark://base:7077
> 13/11/01 18:51:14 INFO MasterWebUI: Started Master web UI at http://base:8080
> 13/11/01 18:51:17 INFO Master: Registering worker base:32942 with 1 cores, 512.0 MB RAM
>
> Because I want to try out Spark with HDFS too, I also compiled it that way:
>
> SPARK_HADOOP_VERSION=2.2.0 sbt/sbt assembly
>
> But with this compiled version, the registration of workers with the
> master no longer works.
> Here is the log file of the slave:
>
> Spark Command: java -cp
> :/home/thorsten/spark-hdfs/conf:/home/thorsten/spark-hdfs/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.2.0.jar
> -Djava.library.path= -Xms512m -Xmx512m
> org.apache.spark.deploy.worker.Worker spark://base:7077 --webui-port 8081
> ========================================
>
> 13/11/01 18:49:57 INFO Slf4jEventHandler: Slf4jEventHandler started
> 13/11/01 18:49:57 INFO Worker: Starting Spark worker base:37950 with 1 cores, 512.0 MB RAM
> 13/11/01 18:49:57 INFO Worker: Spark home: /home/thorsten/spark-hdfs
> 13/11/01 18:49:57 INFO WorkerWebUI: Started Worker web UI at http://base:8081
> 13/11/01 18:49:57 INFO Worker: Connecting to master spark://base:7077
> 13/11/01 18:49:58 ERROR Worker: Connection to master failed! Shutting down.
>
> The log file of the master:
>
> Spark Command: /usr/lib/jvm/jdk//bin/java -cp
> :/home/thorsten/spark-hdfs/conf:/home/thorsten/spark-hdfs/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.2.0.jar
> -Djava.library.path= -Xms512m -Xmx512m
> org.apache.spark.deploy.master.Master --ip base --port 7077 --webui-port 8080
> ========================================
>
> 13/11/01 18:49:55 INFO Slf4jEventHandler: Slf4jEventHandler started
> 13/11/01 18:49:55 INFO Master: Starting Spark master at spark://base:7077
> 13/11/01 18:49:55 INFO MasterWebUI: Started Master web UI at http://base:8080
>
> I hope somebody can help me solve this problem.
>
> Thanks
> Thorsten

--
Dachuan Huang
Cellphone: 614-390-7234
2015 Neil Avenue
Ohio State University
Columbus, Ohio
U.S.A.
43210
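For anyone hitting this, a quick way to spot the duplicate-hostname misconfiguration described above is to scan /etc/hosts for hostnames mapped to more than one address. This is a minimal sketch, not part of Spark; the helper function and the sample content are illustrative:

```python
# Sketch: flag hostnames that appear under more than one IP address in
# /etc/hosts-style text. The sample mirrors the bad example in this thread.
from collections import defaultdict

def duplicate_hostnames(hosts_text):
    """Return {hostname: sorted list of IPs} for names with multiple addresses."""
    mapping = defaultdict(set)
    for line in hosts_text.splitlines():
        line = line.split('#', 1)[0].strip()   # drop comments and blanks
        if not line:
            continue
        parts = line.split()
        ip, names = parts[0], parts[1:]
        for name in names:
            mapping[name].add(ip)
    return {n: sorted(ips) for n, ips in mapping.items() if len(ips) > 1}

bad = "127.0.1.1 base\n168.144.8.8 base\n"
print(duplicate_hostnames(bad))   # {'base': ['127.0.1.1', '168.144.8.8']}
```

If this reports your master's hostname, remove or comment out the loopback entry so the hostname resolves to the single address the workers actually connect to.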
