jiangxb1987 commented on a change in pull request #27768: [SPARK-31018][CORE] Deprecate support of multiple workers on the same host in Standalone
URL: https://github.com/apache/spark/pull/27768#discussion_r388499387
 
 

 ##########
 File path: docs/hardware-provisioning.md
 ##########
 @@ -66,7 +66,8 @@ Finally, note that the Java VM does not always behave well with more than 200 Gi
 purchase machines with more RAM than this, you can run _multiple worker JVMs per node_. In
 Spark's [standalone mode](spark-standalone.html), you can set the number of workers per node
 with the `SPARK_WORKER_INSTANCES` variable in `conf/spark-env.sh`, and the number of cores
-per worker with `SPARK_WORKER_CORES`.
+per worker with `SPARK_WORKER_CORES`. But please note that support of multiple workers on the
 
 Review comment:
   We should rewrite this paragraph to suggest launching multiple executors instead of mentioning multiple workers.
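
   A minimal sketch of what the rewritten guidance could show instead, assuming a
   standalone cluster at spark://master:7077 whose workers each have 32 cores (the
   master URL, core/memory sizes, and myapp.py are placeholders):

       # Cap each executor at 8 cores; memory permitting, the standalone master
       # can then schedule up to 4 executors of this application on a single
       # 32-core worker, instead of one oversized executor JVM per node.
       ./bin/spark-submit \
         --master spark://master:7077 \
         --executor-cores 8 \
         --executor-memory 40g \
         myapp.py

   Setting `spark.executor.cores` (or `--executor-cores`) below the worker's core
   count lets one worker host several smaller executor JVMs, which covers the main
   use case for `SPARK_WORKER_INSTANCES` without running multiple workers per host.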
