felixcheung commented on a change in pull request #24191: [SPARK-27261][Deploy] 
Documented passing multiple configurations details while submitting spark 
applications
URL: https://github.com/apache/spark/pull/24191#discussion_r268454743
 
 

 ##########
 File path: docs/submitting-applications.md
 ##########
 @@ -44,7 +44,7 @@ Some of the commonly used options are:
 * `--class`: The entry point for your application (e.g. 
`org.apache.spark.examples.SparkPi`)
 * `--master`: The [master URL](#master-urls) for the cluster (e.g. 
`spark://23.195.26.187:7077`)
 * `--deploy-mode`: Whether to deploy your driver on the worker nodes 
(`cluster`) or locally as an external client (`client`) (default: `client`) <b> 
&#8224; </b>
-* `--conf`: Arbitrary Spark configuration property in key=value format. For 
values that contain spaces wrap "key=value" in quotes (as shown).
+* `--conf`: Arbitrary Spark configuration property in key=value format. For 
values that contain spaces wrap "key=value" in quotes (as shown). Multiple 
configurations should be passed as separate arguments (e.g. --conf <key>=<value> 
--conf <key>=<value>).
 
 Review comment:
   also put `--conf <key>=<value> --conf <key>=<value>` in backticks 
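
   For reference, a minimal `spark-submit` invocation with repeated `--conf` flags might look like the sketch below; the application jar path, configuration keys, and values are illustrative examples, not taken from the PR.

   ```bash
   # Each configuration property gets its own --conf flag; repeat the flag
   # for every key=value pair, and quote pairs whose values contain spaces.
   ./bin/spark-submit \
     --class org.apache.spark.examples.SparkPi \
     --master spark://23.195.26.187:7077 \
     --deploy-mode cluster \
     --conf spark.executor.memory=4g \
     --conf "spark.driver.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" \
     examples/jars/spark-examples_2.11-2.4.0.jar \
     1000
   ```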

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services

