I finally got it working. Main points:

- I had to add the hadoop-client dependency to the build to avoid a strange EOFException.
- I had to set SPARK_MASTER_IP in conf/start-master.sh to hostname -f
instead of hostname, since Akka does not seem to work properly with bare
host names or IPs; it requires fully qualified domain names.
- I also set SPARK_MASTER_IP in conf/spark-env.sh to hostname -f so that
the workers can reach the master.
- Be sure that conf/slaves also contains fully qualified domain names.
- It seems that both the master and the workers need network access back to
the driver client; since I was behind a VPN, this caused me a lot of trouble
and took some time to track down.
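For the dependency point above, a build.sbt fragment is a minimal sketch of what I mean; the version numbers here are only placeholders, and you should match them to your actual Spark and Hadoop versions:

```scala
// build.sbt -- sketch only; the versions below are assumptions, not from my setup.
// hadoop-client should match the Hadoop version your Spark build was compiled
// against, otherwise you can run into the EOFException mentioned above.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.0.0",      // placeholder version
  "org.apache.hadoop" % "hadoop-client" % "2.2.0"    // match your cluster's Hadoop
)
```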
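The FQDN-related points can be sketched as a conf/spark-env.sh fragment; the worker host names shown in the comment are placeholders, not my actual machines:

```shell
# conf/spark-env.sh -- sketch; hostname -f resolves the fully qualified
# domain name, which Akka needs for master/worker communication.
export SPARK_MASTER_IP=$(hostname -f)

# conf/slaves must likewise list fully qualified domain names, one per line,
# for example (placeholders):
#   worker1.example.com
#   worker2.example.com
```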

After making these changes, everything worked like a charm!



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/A-Standalone-App-in-Scala-Standalone-mode-issues-tp6493p6514.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
