[ https://issues.apache.org/jira/browse/SPARK-5005?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14324646#comment-14324646 ]

Sean Owen commented on SPARK-5005:
----------------------------------

I'm running exactly this on CDH 5.3 (Spark 1.2.0), and it succeeds:

{code}
spark-submit --class org.apache.spark.examples.JavaSparkPi --master yarn-client \
  --num-executors 2 --executor-cores 1 --executor-memory 1g \
  /opt/cloudera/parcels/CDH/lib/spark/lib/spark-examples-1.2.0-cdh5.3.0-hadoop2.5.0-cdh5.3.0.jar \
  3
{code}

I'm inclined to close this, since your error shows something fairly strange is 
happening: something is submitting these args directly to the YARN application 
master, and the Spark code doesn't do that. Did you modify anything? Anything 
unusual in your Spark env variable conf? Those are the likely culprits.
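For what it's worth, an "Unknown/unsupported param" rejection like the one quoted below usually means the YARN application master is running an older Spark assembly than the client that submitted the job. A hedged way to check, assuming a standard Spark 1.x layout (the paths and property name here are illustrative, not taken from this report):

{code}
# Client-side Spark version
bin/spark-submit --version

# An old assembly pinned via spark.yarn.jar (or the SPARK_JAR env var) can
# make a 1.2 client hand its args to a 1.1-era ApplicationMaster, which
# then rejects them with exactly this usage dump.
grep -i "spark.yarn.jar" conf/spark-defaults.conf
env | grep -i "^SPARK"
{code}

If a stale assembly turns up, pointing spark.yarn.jar at the 1.2.0 assembly (or unsetting it so the client ships its own) is the usual fix.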

> Failed to start spark-shell when using  yarn-client mode with the Spark1.2.0
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-5005
>                 URL: https://issues.apache.org/jira/browse/SPARK-5005
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, Spark Shell, YARN
>    Affects Versions: 1.2.0
>         Environment: Spark 1.2.0
> Hadoop 2.2.0
>            Reporter: yangping wu
>            Priority: Minor
>   Original Estimate: 8h
>  Remaining Estimate: 8h
>
> I am using Spark 1.2.0, but when I start spark-shell in yarn-client 
> mode ({code}MASTER=yarn-client bin/spark-shell{code}), it fails with the 
> error message:
> {code}
> Unknown/unsupported param List(--executor-memory, 1024m, --executor-cores, 8, --num-executors, 2)
> Usage: org.apache.spark.deploy.yarn.ApplicationMaster [options]
> Options:
>   --jar JAR_PATH       Path to your application's JAR file (required)
>   --class CLASS_NAME   Name of your application's main class (required)
>   --args ARGS          Arguments to be passed to your application's main class.
>                        Multiple invocations are possible, each will be passed in order.
>   --num-executors NUM    Number of executors to start (Default: 2)
>   --executor-cores NUM   Number of cores for the executors (Default: 1)
>   --executor-memory MEM  Memory per executor (e.g. 1000M, 2G) (Default: 1G)
> {code}
> But when I use Spark 1.1.0 and start spark-shell the same way 
> ({code}MASTER=yarn-client bin/spark-shell{code}), it works.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
