Probably something like 8 is best on this kind of machine. What operations are 
you doing though? It's possible that something else is a contention point at 48 
threads, e.g. a common one we've seen is the Linux file system.
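For reference, the parallelism in local mode is set via the master URL. A minimal sketch (app name and the choice of 8 threads are illustrative, not a recommendation for every workload):

    // Scala: request 8 worker threads in local mode
    val conf = new SparkConf().setMaster("local[8]").setAppName("MyApp")
    val sc = new SparkContext(conf)

or equivalently on the command line:

    spark-submit --master local[8] ...

It's usually worth benchmarking a few values (e.g. local[4], local[8], local[16]) on your actual job rather than defaulting to the physical core count.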

Matei

On Jul 13, 2014, at 4:03 PM, lokesh.gidra <lokesh.gi...@gmail.com> wrote:

> Hello,
> 
> What would be an ideal core count for running a Spark job in local mode to
> get the best CPU utilization? I have a 48-core machine, but the
> performance of local[48] is poor compared to local[10].
> 
> 
> Lokesh
> 
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Ideal-core-count-within-a-single-JVM-tp9566.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.