Hi guys, I would like to run some algorithms on a single machine, using the local[k] option. However, I can't seem to set the amount of memory available; it always defaults to about 700 MB. I've tried setting spark.executor.memory before creating the SparkContext.
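For reference, the attempt described above presumably looks something like this (a minimal sketch in Scala; the app name, core count, and memory value are placeholders, and the SparkConf-based style is assumed):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical reconstruction of the attempt: set spark.executor.memory
// on the conf before the SparkContext is created.
val conf = new SparkConf()
  .setMaster("local[4]")          // local[k]: k worker threads in one JVM
  .setAppName("MemoryTest")       // placeholder app name
  .set("spark.executor.memory", "4g") // the setting that appears to be ignored

val sc = new SparkContext(conf)
```

Note that in local[k] mode everything runs inside the single driver JVM, so a setting applied after that JVM has already started may not be able to grow its heap.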
Is there any other way to increase the amount of memory when using the local[k] option? Thanks!

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Set-memory-when-using-local-k-tp1945.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.