Can you try it with just:
spark-submit --master spark://master:6066 --class SimpleApp target/simple-project-1.0.jar

and see if it works?
An even better idea would be to spawn a spark-shell (*MASTER=spark://master:6066
bin/spark-shell*) and try out a simple *sc.parallelize(1 to 1000).collect*.
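For reference, a SimpleApp that matches the submit command above could look like the following minimal sketch. It runs the same parallelize-and-collect sanity check as the spark-shell suggestion; the app name and the printed message are illustrative assumptions, and the master URL is deliberately left to the *--master* flag rather than hard-coded (it is not tested here since it needs a running Spark cluster):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch of a SimpleApp matching the spark-submit line above.
// The master URL comes from the --master flag, so it is not set here.
object SimpleApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    // Same sanity check as the spark-shell suggestion:
    val n = sc.parallelize(1 to 1000).collect().length
    println(s"Collected $n elements")
    sc.stop()
  }
}
```

If this collects all 1000 elements, the master/worker wiring is fine and the problem lies elsewhere (e.g. in how the jar or its dependencies are distributed).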
Hi,
we are trying to set up Apache Spark on a Raspberry Pi cluster for educational
use.
Spark is installed in a Docker container and all necessary ports are exposed.
After we start the master and workers, all workers are listed as alive in the
master web UI (http://master:8080).