I have the same problem, i.e. an exception with the same call stack when I start
either pyspark or spark-shell. I am using spark-1.3.0-bin-hadoop2.4 on Ubuntu
14.10.
bin/pyspark

I get a bunch of INFO messages, then the ActorInitializationException.
The shell still starts, and I can do this:
>>> rd = sc.parallelize([1,2])
>>> rd.first()
This call does not return.
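
In case it helps anyone reproduce this outside the REPL, here is roughly the same
check as a standalone script (the file name, app name, and local[2] master are just
my choices, nothing prescribed):

# repro.py -- minimal PySpark job, same operations as in the shell above
from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("actor-repro").setMaster("local[2]")
sc = SparkContext(conf=conf)

rd = sc.parallelize([1, 2])
print(rd.first())   # in the shell this is the call that never returns

sc.stop()

I would expect bin/spark-submit repro.py to hit the same hang, assuming the problem
is not specific to the interactive shell.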
Also, if I start a master and then try to connect the shell to it, the shell
fails to connect, complaining about the master URL.
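
For reference, this is roughly what I did (the hostname below is just a placeholder;
the actual master URL is whatever the master's web UI on port 8080 reports):

# start the standalone master
sbin/start-master.sh

# then try to attach a shell to it (same result with spark-shell)
bin/pyspark --master spark://<hostname>:7077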

The same tarball works fine on Windows.

Maybe some Linux versions are not supported?
Thank you
Dima




