Yes, the application was overwriting it: I need to pass the master as an argument to the application, otherwise it gets set to local.
Thanks for the quick reply! Also, yes, now the appTrackingUrl is set properly as well; before it just said "unassigned". Thanks!

Arun

On Tue, Aug 19, 2014 at 5:47 PM, Marcelo Vanzin <van...@cloudera.com> wrote:
> On Tue, Aug 19, 2014 at 2:34 PM, Arun Ahuja <aahuj...@gmail.com> wrote:
> > /opt/cloudera/parcels/CDH/bin/spark-submit \
> > --master yarn \
> > --deploy-mode client \
>
> This should be enough.
>
> > But when I view the job 4040 page, SparkUI, there is a single executor (just
> > the driver node) and I see the following in environment
> >
> > spark.master - local[24]
>
> Hmmm. Are you sure the app itself is not overwriting "spark.master"
> before creating the SparkContext? That's the only explanation I can
> think of.
>
> > Also, when I run with yarn-cluster, how can I access the SparkUI page?
>
> You can click on the link in the RM application list. The address is
> also printed to the AM logs, which are also available through the RM
> web ui. Finally, the link is printed to the output of the launcher
> process (look for "appTrackingUrl").
>
> --
> Marcelo
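For anyone hitting the same issue: the symptom above (spark.master showing local[24] despite --master yarn) happens when the application hardcodes the master in its SparkConf, which takes precedence over spark-submit's flag. A minimal Scala sketch of the bug and the fix (the app name and argument handling are illustrative, not from the thread):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object MyApp {
  def main(args: Array[String]): Unit = {
    // Problematic: a hardcoded master silently overrides
    // the --master yarn passed to spark-submit.
    //   val conf = new SparkConf()
    //     .setAppName("MyApp")
    //     .setMaster("local[24]")

    // Preferred: leave the master unset so spark-submit controls it,
    // or accept it as a command-line argument for local runs.
    val conf = new SparkConf().setAppName("MyApp")
    args.headOption.foreach(master => conf.setMaster(master))

    val sc = new SparkContext(conf)
    // ... job logic ...
    sc.stop()
  }
}
```

With the hardcoded setMaster removed, `spark-submit --master yarn --deploy-mode client ...` behaves as expected, and the environment page on port 4040 should report spark.master as yarn.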