Hi Zeppelin users!

I am working with Zeppelin pointing to a Spark standalone cluster. I am
trying to figure out a way to make Zeppelin run the Spark driver outside
of the client process that submits the application.

According to the documentation (
http://spark.apache.org/docs/2.1.1/spark-standalone.html):

*For standalone clusters, Spark currently supports two deploy modes.
In client mode, the driver is launched in the same process as the client
that submits the application. In cluster mode, however, the driver is
launched from one of the Worker processes inside the cluster, and the
client process exits as soon as it fulfills its responsibility of
submitting the application without waiting for the application to finish.*

The problem is that, even when I point the master at the standalone
cluster and set the deploy mode to cluster, the driver still runs on the
Zeppelin machine (according to the Spark UI's executors page). These are
the properties I am setting for the Spark interpreter:

master: spark://<master-name>:7077
spark.submit.deployMode: cluster
spark.executor.memory: 16g
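
As a sanity check, I can confirm that cluster deploy mode works against
this master when I bypass Zeppelin entirely and use spark-submit from the
same machine (the master URL and memory value below are just placeholders
matching the settings above):

    $SPARK_HOME/bin/spark-submit \
      --master spark://<master-name>:7077 \
      --deploy-mode cluster \
      --executor-memory 16g \
      --class org.apache.spark.examples.SparkPi \
      $SPARK_HOME/examples/jars/spark-examples_2.11-2.1.1.jar 100

If that SparkPi driver lands on a worker, the cluster side should be fine
and the question is how Zeppelin launches the interpreter. One thing I
considered (I have not confirmed it changes where the driver runs) is
passing the flag through conf/zeppelin-env.sh instead of the interpreter
properties:

    export SPARK_SUBMIT_OPTIONS="--deploy-mode cluster"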

Any ideas would be appreciated.

Thank you

Details:
Spark version: 2.1.1
Zeppelin version: 0.8.0 (built from source as of September 2017)
