Hi all,
I ran start-master.sh to start a standalone Spark master at
spark://192.168.1.164:7077. Then I connected with the command below, and it
worked fine:

    ./bin/spark-shell --master spark://192.168.1.164:7077
The console printed the expected messages, and the Spark context was
initialised correctly.
However, when I set the master programmatically in my application instead of
through spark-shell, the association with the remote system fails.
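By "setting the master programmatically" I mean roughly the following sketch
(the object name and app name here are placeholders, not my exact code):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch: point SparkConf at the same standalone master
// that worked from spark-shell. "ConnectTest" is a placeholder name.
object ConnectTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("spark://192.168.1.164:7077")
      .setAppName("ConnectTest")
    val sc = new SparkContext(conf)
    // ... job code would go here ...
    sc.stop()
  }
}
```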
I debugged it, and the remote actor can be fetched in the
tryRegisterAllMasters() method of AppClient:

    def tryRegisterAllMasters() {
      for (masterAkkaUrl <- masterAkkaUrls) {
        logInfo("Connecting to master " + masterAkkaUrl + "...")
        val actor = ...
      }
    }

However, the registration operation never finishes, and the connection
attempt then fails.
I don't know what the reason is. Does anyone know the answer?
Regards,
Yi
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Why-association-with-remote-system-has-failed-when-set-master-in-Spark-programmatically-tp22911