[ https://issues.apache.org/jira/browse/SPARK-11555?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14993775#comment-14993775 ]

Thomas Graves commented on SPARK-11555:
---------------------------------------

Note that it is also broken with --num-executors when using the Client
interface directly. Both go through the same parsing logic:

case ("--num-workers" | "--num-executors") :: IntParam(value) :: tail =>
  if (args(0) == "--num-workers") {
    println("--num-workers is deprecated. Use --num-executors instead.")
  }
  numExecutors = value
  args = tail
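For context, here is a minimal, self-contained sketch of the extractor-based
parse loop that a case clause like the one above lives in. IntParam mirrors
Spark's org.apache.spark.util.IntParam; the loop structure and the catch-all
case are illustrative, not the exact ClientArguments code:

// Hypothetical sketch (Scala script style), assuming an IntParam-like extractor.
object IntParam {
  def unapply(s: String): Option[Int] =
    try Some(s.toInt) catch { case _: NumberFormatException => None }
}

var numExecutors = 2 // default; the reported bug is that this default wins

def parse(args: List[String]): Unit = args match {
  case ("--num-workers" | "--num-executors") :: IntParam(value) :: tail =>
    numExecutors = value
    parse(tail)
  case _ :: tail => parse(tail) // this sketch skips options it does not know
  case Nil =>
}

parse(List("--num-workers", "4", "--queue", "default"))
// after parsing, numExecutors holds 4 instead of the default 2

Because both flag spellings share the one case clause, a regression in this
clause affects --num-workers and --num-executors alike.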

> spark on yarn spark-class --num-workers doesn't work
> ----------------------------------------------------
>
>                 Key: SPARK-11555
>                 URL: https://issues.apache.org/jira/browse/SPARK-11555
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.5.2
>            Reporter: Thomas Graves
>
> Using the old spark-class interface, the --num-workers parameter is
> ignored and the default number of executors (2) is always used.
> bin/spark-class org.apache.spark.deploy.yarn.Client --jar 
> lib/spark-examples-1.5.2.0-hadoop2.6.0.16.1506060127.jar --class 
> org.apache.spark.examples.SparkPi --num-workers 4 --worker-memory 2g 
> --master-memory 1g --worker-cores 1 --queue default



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
