I have a Spark cluster set up on the cloud with a master and a few workers. I
do my Spark app development on my local machine. What is the recommended way
of running the app on the cluster? I can think of a few cases:

- Run the app locally (say, within Eclipse) but set the master URL to that of
the master node in the cluster. But what is the role of my local JVM then?

- Package my app as a jar, upload it to the master node in the cluster, and
run it using the ./bin/spark-class org.apache.spark.deploy.Client launch
script. This does not feel like an efficient workflow.
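For context, a common alternative to the Client launch script is spark-submit (available since Spark 1.0), which can ship a locally built jar to a standalone master without logging into the cluster. A minimal sketch, where the class name, master host, and jar path are placeholders, not values from this post:

```shell
# Submit a locally built jar to a standalone Spark master.
# com.example.MyApp, master-host, and target/my-app.jar are placeholders.
./bin/spark-submit \
  --class com.example.MyApp \
  --master spark://master-host:7077 \
  --deploy-mode cluster \
  target/my-app.jar
```

With `--deploy-mode client` (the default), the submitting JVM runs the driver, which answers the first case: the local JVM hosts the SparkContext and schedules tasks on the cluster. With `cluster`, the driver runs on one of the workers instead.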

Is there an efficient way to do this?
