Felix Cheung created SPARK-12534:
------------------------------------
Summary: Document missing command line options to Spark properties mapping
Key: SPARK-12534
URL: https://issues.apache.org/jira/browse/SPARK-12534
Project: Spark
Issue Type: Bug
Components: Deploy, Documentation, YARN
Affects Versions: 1.5.2
Reporter: Felix Cheung
Priority: Minor
Several Spark properties that are equivalent to spark-submit command line
options are missing from the documentation.
{quote}
The equivalent for spark-submit --num-executors should be
spark.executor.instances.
What about when it is used in a SparkConf?
http://spark.apache.org/docs/latest/running-on-yarn.html
Could you try setting that with sparkR.init()?
_____________________________
From: Franc Carter <[email protected]>
Sent: Friday, December 25, 2015 9:23 PM
Subject: number of executors in sparkR.init()
To: <[email protected]>
Hi,
I'm having trouble working out how to get the number of executors set when
using sparkR.init().
If I start sparkR with
sparkR --master yarn --num-executors 6
then I get 6 executors
However, if I start sparkR with
sparkR
followed by
sc <- sparkR.init(master="yarn-client",
sparkEnvir=list(spark.num.executors='6'))
then I only get 2 executors.
Can anyone point me in the direction of what I might be doing wrong? I need to
initialise it this way so that RStudio can hook into SparkR.
thanks
--
Franc
{quote}
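For reference, a sketch of the corrected initialization (assuming Spark 1.5.x
SparkR on a YARN cluster; spark.executor.instances is the documented property
behind --num-executors, per the running-on-yarn page):

{code}
# Sketch: pass spark.executor.instances (not spark.num.executors, which is
# not a real property) via sparkEnvir so the setting reaches the YARN backend.
library(SparkR)
sc <- sparkR.init(master = "yarn-client",
                  sparkEnvir = list(spark.executor.instances = "6"))
{code}

With the nonexistent spark.num.executors the setting is silently ignored and
YARN falls back to its default of 2 executors, which matches the behaviour
reported above.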
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)