Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/8385#discussion_r37719398
  
    --- Diff: docs/running-on-yarn.md ---
    @@ -21,32 +21,51 @@ There are two deploy modes that can be used to launch Spark applications on YARN
     Unlike in Spark standalone and Mesos mode, in which the master's address is specified in the `--master` parameter, in YARN mode the ResourceManager's address is picked up from the Hadoop configuration. Thus, the `--master` parameter is `yarn-client` or `yarn-cluster`.
     To launch a Spark application in `yarn-cluster` mode:
     
    -   `$ ./bin/spark-submit --class path.to.your.Class --master yarn-cluster [options] <app jar> [app options]`
    -    
    +   `$ ./bin/spark-submit --class path.to.your.Class --master yarn --deploy-mode yarn-client/yarn-cluster [options] <app jar> [app options]`
    +   
     For example:
     
         $ ./bin/spark-submit --class org.apache.spark.examples.SparkPi \
    -        --master yarn-cluster \
    +        --master yarn \
    +        --deploy-mode cluster
             --num-executors 3 \
             --driver-memory 4g \
             --executor-memory 2g \
             --executor-cores 1 \
             --queue thequeue \
             lib/spark-examples*.jar \
    -        10
    +        
    +`--deploy-mode` can be either client or cluster.
     
    -The above starts a YARN client program which starts the default Application Master. Then SparkPi will be run as a child thread of Application Master. The client will periodically poll the Application Master for status updates and display them in the console. The client will exit once your application has finished running.  Refer to the "Debugging your Application" section below for how to see driver and executor logs.
    +The above example starts a YARN client program which starts the default Application Master. Then SparkPi will be run as a child thread of Application Master. The client will periodically poll the Application Master for status updates and display them in the console. The client will exit once your application has finished running.  Refer to the "Debugging your Application" section below for how to see driver and executor logs.
     
    -To launch a Spark application in `yarn-client` mode, do the same, but replace `yarn-cluster` with `yarn-client`.  To run spark-shell:
    +To launch a Spark application in `yarn-client` mode, do the same, but replace `yarn-cluster` with `yarn-client` in the --deploy-mode.  To run spark-shell:
    --- End diff --
    
    This is wrong; the `--deploy-mode` values aren't `yarn-client` and `yarn-cluster`, they're `client` and `cluster`.
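    
    For reference, `--deploy-mode` only accepts `client` or `cluster` when used with `--master yarn`, so a corrected form of the example command would presumably read something like this (a sketch reusing the options already shown in the diff):
    
        # the old yarn-client mode becomes --master yarn --deploy-mode client
        $ ./bin/spark-submit --class org.apache.spark.examples.SparkPi \
            --master yarn \
            --deploy-mode client \
            --num-executors 3 \
            --driver-memory 4g \
            --executor-memory 2g \
            --executor-cores 1 \
            --queue thequeue \
            lib/spark-examples*.jar \
            10
    
    The same command with `--deploy-mode cluster` covers the old `yarn-cluster` case.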

