I want each task to take only a limited part of my 20 cores in the
cluster: I have important tasks that I want to get 10 cores, and I have
small tasks that I want to run with only 1 or 2 cores.
... are registered and have sufficient resources.
Can't I tell Spark to use only part of my cores for a specific task? I
need it if I want to run many tasks in parallel.
thanks, nizan
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-standalone-cluster-resource-management-tp23444p23445.html
Sent from the Apache Spark User List mailing list archive at Nabble.com
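
For reference, a standalone cluster lets each application cap the total
number of cores it claims via the spark.cores.max setting. The sketch
below is only an illustration: it assumes a standalone master at
spark://master:7077 and caps an "important" job at 10 of the 20 cores,
leaving the rest for other applications running in parallel.

    import org.apache.spark.{SparkConf, SparkContext}

    object ImportantJob {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("important-job")
          .setMaster("spark://master:7077") // assumed standalone master URL
          // Total cores this application may claim across the cluster.
          // With 20 cores available, "10" leaves 10 free for other apps.
          .set("spark.cores.max", "10")

        val sc = new SparkContext(conf)
        // ... the actual job goes here ...
        sc.stop()
      }
    }

A small job can be capped the same way, for example by submitting it with
spark-submit --total-executor-cores 2 (the command-line equivalent of
spark.cores.max on a standalone cluster), so several applications can run
in parallel and share the 20 cores.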