This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
     new cde5032  [MINOR][DOCS] Fix a typo for a configuration property of resources allocation
cde5032 is described below

commit cde50326ca8b357406abe2596ef8a724bb10ad0c
Author: Kousuke Saruta <saru...@oss.nttdata.com>
AuthorDate: Tue Jun 30 09:28:54 2020 -0700

    [MINOR][DOCS] Fix a typo for a configuration property of resources allocation
    
    ### What changes were proposed in this pull request?
    
    This PR fixes a typo for a configuration property in `spark-standalone.md`:
    `spark.driver.resourcesfile` should be `spark.driver.resourcesFile`.
    I looked for similar typos, but this is the only one.
    
    ### Why are the changes needed?
    
    The property name is wrong.
    
    ### Does this PR introduce _any_ user-facing change?
    
    Yes. The property name is corrected.
    
    ### How was this patch tested?
    
    I confirmed that the spelling of the property name matches the property name defined in o.a.s.internal.config.package.scala.
    
    Closes #28958 from sarutak/fix-resource-typo.
    
    Authored-by: Kousuke Saruta <saru...@oss.nttdata.com>
    Signed-off-by: Dongjoon Hyun <dongj...@apache.org>
    (cherry picked from commit 5176707ac3a451158e5705bfb9a070de2d6c9cab)
    Signed-off-by: Dongjoon Hyun <dongj...@apache.org>
---
 docs/spark-standalone.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index 1e6f8c5..566f081 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -359,7 +359,7 @@ Spark Standalone has 2 parts, the first is configuring the resources for the Wor
 
 The user must configure the Workers to have a set of resources available so that it can assign them out to Executors. The <code>spark.worker.resource.{resourceName}.amount</code> is used to control the amount of each resource the worker has allocated. The user must also specify either <code>spark.worker.resourcesFile</code> or <code>spark.worker.resource.{resourceName}.discoveryScript</code> to specify how the Worker discovers the resources its assigned. See the descriptions above for ea [...]
 
-The second part is running an application on Spark Standalone. The only special case from the standard Spark resource configs is when you are running the Driver in client mode. For a Driver in client mode, the user can specify the resources it uses via <code>spark.driver.resourcesfile</code> or <code>spark.driver.resource.{resourceName}.discoveryScript</code>. If the Driver is running on the same host as other Drivers, please make sure the resources file or discovery script only returns  [...]
+The second part is running an application on Spark Standalone. The only special case from the standard Spark resource configs is when you are running the Driver in client mode. For a Driver in client mode, the user can specify the resources it uses via <code>spark.driver.resourcesFile</code> or <code>spark.driver.resource.{resourceName}.discoveryScript</code>. If the Driver is running on the same host as other Drivers, please make sure the resources file or discovery script only returns  [...]
 
 Note, the user does not need to specify a discovery script when submitting an application as the Worker will start each Executor with the resources it allocates to it.
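
For readers following the corrected docs, below is a minimal Scala sketch of where the corrected spark.driver.resourcesFile property would be set for a client-mode Driver on Spark Standalone. The master URL, resources file path, resource name (gpu), and amounts are illustrative assumptions, not values taken from this commit; only the property names come from the documentation.

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    // Hypothetical client-mode driver configuration on Spark Standalone.
    // The file path and amounts are placeholders; the property names
    // (including the corrected spark.driver.resourcesFile) are from the docs.
    val conf = new SparkConf()
      .setMaster("spark://master-host:7077")  // assumed standalone master URL
      .setAppName("gpu-client-mode-sketch")
      .set("spark.driver.resourcesFile", "/path/to/driverResources.json")
      .set("spark.driver.resource.gpu.amount", "1")
      .set("spark.executor.resource.gpu.amount", "1")
      .set("spark.task.resource.gpu.amount", "1")

    val spark = SparkSession.builder().config(conf).getOrCreate()

As the corrected paragraph notes, spark.driver.resource.{resourceName}.discoveryScript can be used instead of the resources file to tell the Driver how to discover its resources.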
 


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
