When I use a smaller number of partitions (like 6), it seems that all the
tasks get assigned to the same machine, because that machine has more than 6
cores, and then it runs out of memory.
How can I keep the partition count low and still use all the machines at the
same time?
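
For reference, here is a rough sketch of what I am doing (the input path and
names are just placeholders, not my actual job):

import org.apache.spark.{SparkConf, SparkContext}

object FewPartitionsExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("twitter-dataset")
    val sc = new SparkContext(conf)

    // Repartitioning to 6 partitions produces only 6 tasks per stage.
    // Because a single worker has more than 6 cores, the scheduler can
    // place all 6 tasks on that one machine, which then runs out of memory.
    val data = sc.textFile("hdfs:///data/twitter-25GB").repartition(6)

    println(data.count())
    sc.stop()
  }
}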



