For the last few days I've been trying to deploy a Scala job to a
standalone cluster (master + 4 workers), without much success, although it
works perfectly when launched from the spark shell, that is, the Scala REPL
(pretty strange, since this suggests my cluster config is actually
correct).

In order to test it with a simpler example, I decided to deploy this
example
<https://spark.apache.org/docs/0.9.0/quick-start.html#a-standalone-app-in-scala>
in standalone mode (master + 1 worker, same machine). Please have a look at
this gist <https://gist.github.com/JordiAranda/4ee54f84dc92f02ecb8c> for
the cluster setup. I can't get rid of the EOFException.

So I must definitely be missing something. Why does it work when setting
the master config property to "local[x]" or when launching from the REPL,
but not when setting the master config property to a spark:// URL?
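To be concrete, the only difference between the working and the failing
runs is the master string (host name and paths below are placeholders):

// Works: everything runs inside a single driver JVM
val scLocal = new SparkContext("local[4]", "Simple App")

// Fails with the EOFException: tasks are serialized and shipped to remote
// executor processes, which also need access to the application jar
val scCluster = new SparkContext("spark://my-master:7077", "Simple App",
  "/usr/local/spark", List("target/scala-2.10/simple-project_2.10-1.0.jar"))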

PS: Please note I am using the latest release (0.9.1), prebuilt for Hadoop 2.

Thanks,


