Hello all - I had a basic question on the modes in which spark-shell can be
run.

When I run the following command, does Spark run in local mode, i.e. outside
of YARN and using only the local cores (since the '--master' option is
missing)?

./bin/spark-shell --driver-memory 512m --executor-memory 512m
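One way to check which master the shell actually picked up is to inspect the SparkContext (`sc`) that spark-shell creates for you. A small sketch, assuming a working spark-shell session (the exact local string, e.g. "local[*]", depends on your Spark version and defaults):

```scala
// Inside spark-shell, `sc` is the SparkContext the shell has already created.
// Print the master URL the shell is actually using:
println(sc.master)
// With no --master flag this is typically a local master such as "local[*]";
// with --master yarn-client it reports the YARN master instead.
```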

Similarly, when I run the following:

1) ./bin/spark-shell --master yarn-client --driver-memory 512m
--executor-memory 512m

   - Spark runs in client mode, with resources managed by YARN.

2) ./bin/spark-shell --master yarn-cluster --driver-memory 512m
--executor-memory 512m

    - Spark runs in cluster mode, with resources managed by YARN.
