I'm currently running Spark Streaming on a single machine. The plan is to
eventually run it on a cluster, but for the next couple of months it will
probably stay on one machine.

I've done some digging and can't find any clear indication of whether it's
better to run Spark as a single-node standalone cluster or just in local
mode. As far as I can tell, the only real difference is that it's difficult
to configure the executor memory in local mode.
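
To make sure I'm describing the same two setups everyone else means, here's
a rough sketch in Scala. The app name, master URL, thread count, and memory
sizes are placeholders, not my actual config, and the comments reflect my
understanding, which may be wrong:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Local mode: everything runs in one JVM, so (as far as I can tell) the
// heap has to be sized via --driver-memory / spark.driver.memory at launch
// time; spark.executor.memory appears to have no effect here.
val localConf = new SparkConf()
  .setMaster("local[4]")                 // 4 worker threads, one JVM
  .setAppName("StreamingJob")            // placeholder app name

// Single-node standalone cluster: a master and a worker started on the
// same box, with the app submitted against the master. Executor memory is
// a normal per-executor setting here.
val clusterConf = new SparkConf()
  .setMaster("spark://localhost:7077")   // placeholder master URL
  .setAppName("StreamingJob")
  .set("spark.executor.memory", "4g")    // placeholder size

// Either conf feeds the same streaming entry point:
val ssc = new StreamingContext(clusterConf, Seconds(1))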

I have been having problems with Spark crashing in local mode so far, which
has led me to do this research. I'll be migrating from Spark 1.0.2 to 1.1.0
in the next day or so to see if that helps.

Does anyone have experience with this?


