Hi,

How do I set the number of executors and tasks in a Spark Streaming job on Mesos? I have the following settings, but my job still shows 11 active tasks and 11 executors. Any idea why this is happening?
sparkConf.set("spark.mesos.coarse", "true")
sparkConf.set("spark.cores.max", "128")
sparkConf.set("spark.default.parallelism", "100")
//sparkConf.set("spark.locality.wait", "0")
sparkConf.set("spark.executor.memory", "32g")
sparkConf.set("spark.streaming.unpersist", "true")
sparkConf.set("spark.shuffle.io.numConnectionsPerPeer", "1")
sparkConf.set("spark.rdd.compress", "true")
sparkConf.set("spark.shuffle.memoryFraction", ".6")
sparkConf.set("spark.storage.memoryFraction", ".2")
sparkConf.set("spark.shuffle.spill", "true")
sparkConf.set("spark.shuffle.spill.compress", "true")
sparkConf.set("spark.streaming.receiver.writeAheadLog.enable", "true")
sparkConf.set("spark.streaming.blockInterval", "400")