The equivalent of spark-submit's --num-executors is spark.executor.instances when used in SparkConf: http://spark.apache.org/docs/latest/running-on-yarn.html Could you try setting that with sparkR.init()?
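A minimal sketch of what that might look like (assuming a yarn-client setup; spark.executor.instances is the YARN property from the docs linked above, passed through sparkEnvir):

```r
library(SparkR)

# Pass spark.executor.instances (rather than spark.num.executors)
# through sparkEnvir when initialising SparkR from R/RStudio.
sc <- sparkR.init(
  master = "yarn-client",
  sparkEnvir = list(spark.executor.instances = "6")
)
```

This should be equivalent to launching with `sparkR --master yarn --num-executors 6`, since --num-executors is just the command-line alias for that property.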
_____________________________
From: Franc Carter <franc.car...@gmail.com>
Sent: Friday, December 25, 2015 9:23 PM
Subject: number of executors in sparkR.init()
To: <user@spark.apache.org>

Hi,

I'm having trouble working out how to get the number of executors set when using sparkR.init().

If I start sparkR with

sparkR --master yarn --num-executors 6

then I get 6 executors.

However, if I start sparkR with sparkR followed by

sc <- sparkR.init(master="yarn-client", sparkEnvir=list(spark.num.executors='6'))

then I only get 2 executors.

Can anyone point me in the direction of what I might be doing wrong? I need to initialise this way so that RStudio can hook in to SparkR.

thanks

--
Franc