[ https://issues.apache.org/jira/browse/SPARK-11841?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15018428#comment-15018428 ]
Xiangyu Li commented on SPARK-11841:
------------------------------------
Thank you very much!
> None of start-all.sh, start-master.sh or start-slaves.sh takes -m, -c or -d
> configuration options as per the document
> ---------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-11841
> URL: https://issues.apache.org/jira/browse/SPARK-11841
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.5.2
> Reporter: Xiangyu Li
>
> I was trying to set up Spark Standalone Mode following the tutorial at
> http://spark.apache.org/docs/latest/spark-standalone.html.
> The tutorial says that we can pass "-c CORES" to the worker to set the total
> number of CPU cores allowed, but none of start-all.sh, start-master.sh or
> start-slaves.sh accepts those options as arguments.
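> For illustration, this is the kind of invocation the tutorial's option table
> suggests; the core and memory values here are just examples:
>
>   ${SPARK_HOME}/sbin/start-slaves.sh -c 4 -m 2g   # options are silently skipped
>   ${SPARK_HOME}/sbin/start-master.sh -c 4         # -c is not an accepted option
>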
> start-all.sh and start-slaves.sh simply skip the options, while
> start-master.sh only accepts -h, -i, -p and --properties-file, according to
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/master/MasterArguments.scala
> So the only way I can limit the number of cores for an application at the
> moment is to set SPARK_WORKER_CORES in ${SPARK_HOME}/conf/spark-env.sh and
> then run start-all.sh.
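> A minimal sketch of this workaround, assuming a desired limit of 4 cores per
> worker:
>
>   # ${SPARK_HOME}/conf/spark-env.sh
>   export SPARK_WORKER_CORES=4
>
>   # then (re)start the standalone cluster
>   ${SPARK_HOME}/sbin/start-all.sh
>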
> I think this is an error in either the documentation or the scripts.